
Python Artisan Engineer

Date Posted —

Type of Work:
Full Time
Salary:
1500 USD
Hours per Week:
45

Job Description

Overview

Step into the role of Python Artisan Engineer, where craftsmanship meets innovation. Your mission will be to expertly shape our social media platform, seamlessly integrating API development, data processing, and AI engineering. As an artisan, you will weave technical mastery together with forward-thinking solutions. Join our cadre of specialists in setting the gold standard for the next era of digital products.

Role Description

As a Python Artisan Engineer, you'll dive deep into our platform's API layer while mastering its core data processing elements. The role requires collaboration with diverse technical and non-technical teams, including product strategists, QA experts, data specialists, and tech architects.

Key Responsibilities

– API Development: Craft and optimize our API layer, leveraging technologies like FastAPI, Pydantic, Kafka, and Redis.
– Core Data Processing Layer: Oversee and enhance the core data processing operations using tools like Apache Airflow and Kafka Sensors. This includes handling vast data scales, partitioning, and fine-tuning Kafka configurations to efficiently consume high-volume, high-throughput data.
– AI Engineering: Engage with leading LLMs, including OpenAI models and Llama, alongside vector databases. Design and deploy state-of-the-art solutions for both generative and descriptive AI. Seamlessly integrate AI copilot tooling into the development lifecycle, harnessing its code generation and solution design capabilities.
– Quality Assurance: Commit to our high standards by ensuring every piece of code goes through rigorous automated testing – spanning unit, integration, and end-to-end tests.
– Data Management: Handle vast data scales with proficiency in crafting efficient queries and optimizing existing ones. Ensure robust ETL pipelines using Apache Airflow and efficiently extract, transform, and load data from sources such as MongoDB.
– Dashboard and Visualization: Utilize your data science skills to pull data and create dashboards and visualizations using relevant tools. Transform data into actionable insights and desired formats.
– Collaboration: Immerse yourself in our extreme programming approach that promotes pair programming, alignment with QA for testing paradigms, and synergies with tech architects for superior solutions.

Essential Qualifications

– Minimum 4 years of backend development experience with Python.
– Proficiency in integrating with data layers such as MongoDB and Kafka, including handling CRUD operations, schema migrations, and CDD and ETL data-pipeline interaction.
– Expertise in prominent Python web & ORM frameworks.
– Comprehensive understanding of the Software Development Life Cycle, emphasizing continuous integration and deployment.
– Experience with processing high-volume, high-throughput data from Kafka, including knowledge of partitioning and configuration fine-tuning.
– Familiarity with distributed computing in Python, utilizing Apache Airflow for distributed tasks and data processing.
– Ability to create and manage data dashboards and visualizations using tools like Pandas.
– Steadfast commitment to upholding data privacy and security standards.

Preferred Qualifications

– Experience with containerization, specifically Docker, and optimizing its performance.
– Hands-on experience in GitOps, primarily in steering automation code pipelines.
– Familiarity with Kubernetes and ArgoCD for streamlined deployments and application configurations.
– Solid background working in cloud environments, especially AWS.

Our Promise

By joining us, you're not just becoming part of a team; you're entering a community. With our dedication to collaboration, continuous learning, and quality, you're poised to thrive in an environment that simultaneously challenges and nurtures you.

APPLY FOR THIS JOB:

Company: Seller Candy
Name: Barath Kumar
Email:
