Job Description

Python Developer (AWS | Kafka | SQL | Data Engineering)

22-12-2025 13:54:11

Job_302910

8 - 12 years

  • Pune, Maharashtra, India (PUN)

Job Title: Python Developer (AWS | Kafka | SQL | Data Engineering)

Experience: 7–10 years

We are looking for a proactive and highly skilled Python Developer with solid experience in AWS, Kafka, and SQL databases, and a strong understanding of data engineering practices. The ideal candidate will be responsible for designing, building, and optimizing scalable data pipelines, both batch and real-time, using modern cloud and streaming technologies.


Key Responsibilities

  • Design, build, and maintain scalable, reliable data pipelines and ETL workflows in AWS for batch and streaming use cases.
  • Develop event-driven workflows using AWS EventBridge for automation and orchestration.
  • Write clean, efficient, and maintainable Python code using libraries such as Boto3 and Pandas.
  • Develop and deploy containerized applications using Docker and orchestrate them via EKS.
  • Build and manage real-time streaming pipelines using Kafka, AWS Kinesis, and AWS Step Functions.
  • Work with relational databases such as MySQL and PostgreSQL, ensuring performance and reliability.
  • Leverage big data tools—Apache Spark, AWS Glue, Athena—to develop and manage data lakes and lakehouse solutions.
  • Refactor and optimize Python code for better readability, performance, and testability.
  • Create automated tests with unittest / pytest to ensure pipeline quality.
  • Monitor and troubleshoot data pipelines with CloudWatch and other AWS monitoring tools.
  • Manage secure access and resource policies using AWS IAM.
  • Build and manage REST APIs and data endpoints using AWS API Gateway.
  • Collaborate with DevOps teams to set up and maintain CI/CD pipelines.
  • Use GitHub for version control, code reviews, and collaborative development.

Required Skills & Experience

  • Strong hands-on experience with Python, including Boto3 and Pandas.
  • Proficiency in core AWS services: EventBridge, IAM, API Gateway, Glue, Athena, Step Functions, S3, EC2, etc.
  • Solid experience working with Kafka, Docker, and EKS for container orchestration.
  • Strong understanding of SQL and experience with MySQL / PostgreSQL.
  • Proven ability to build and maintain batch and real-time pipelines.
  • Experience with Apache Spark, AWS Glue, and Athena for big data processing.
  • Strong foundation in Python code optimization and automated testing.
  • Hands-on experience in pipeline monitoring and troubleshooting using CloudWatch.
  • Familiarity with GitHub and collaborative code development.

Good to Have

  • Experience building APIs with Flask.
  • Knowledge of AWS services like SNS, SQS, Timestream, and tools such as Apache Airflow.
  • Exposure to AWS Kinesis streaming.
  • Experience with CI/CD automation tools.
  • Understanding of AWS Redshift, ElastiCache (Redis), AWS DMS.