
Job Description


Senior Data Engineer - Airflow DAGs

08-08-2025 15:14:09

Job_302768

5 - 8 years

  • Chennai, Tamil Nadu, India (CHN)

Our Culture
At Neurealm, we believe in the power of performance, creativity, and collaboration. Our team members, whom we call "Neuronauts," thrive in an environment driven by innovation, trust, and continuous learning. We encourage everyone to challenge boundaries and explore the edge of what’s possible.

Join us and be part of a purpose-led, AI-driven future. At Neurealm, your ideas matter, your work has impact, and your career can reach new heights.

We look forward to the possibility of welcoming you into our world — now as Neurealm — where human ingenuity and technology come together to shape what’s next!

WORK TIMINGS: 2 PM - 11 PM IST (3 hours of overlap with the EST time zone)

Description:

We seek a dedicated and detail-oriented Senior Developer to join our dynamic team.
The successful candidate will handle repetitive technical tasks, such as loading Healthy Planet MS SQL files into a data warehouse, monitoring Airflow DAGs, managing alerts, and rerunning failed processes.
Additionally, the role requires monitoring various daily and weekly jobs, which may include generating revenue cycle reports and delivering data to external vendors.
The ideal candidate will have robust experience with MS/Oracle SQL, Python, Epic Health Systems, and other relevant technologies.

Overview:

As a Senior Developer, you will play a vital role in improving the operation of our data load and management processes.
Your primary responsibilities will be to ensure the accuracy and timeliness of data loads, maintain the health of data pipelines, and verify that all scheduled jobs complete successfully.
You will collaborate with cross-functional teams to identify and resolve issues, improve processes, and maintain a high standard of data integrity.

Responsibilities:

·      Manage and perform Healthy Planet file loads into a data warehouse.

·      Monitor Airflow DAGs for successful completion, manage alerts, and rerun failed tasks as necessary.

·      Monitor and oversee other daily and weekly jobs, including FGP cash reports and external reports.

·      Collaborate with the data engineering team to streamline data processing workflows.

·      Develop automation scripts in SQL and Python to reduce manual intervention in repetitive tasks.

·      Ensure all data-related tasks are performed accurately and on time.

·      Investigate and resolve data discrepancies and processing issues.

·      Prepare and maintain documentation for processes and workflows.

·      Conduct periodic data audits to ensure data integrity and compliance with defined standards.

 

Skillset Requirements:

·      MS/Oracle SQL

·      Python

·      Data warehousing and ETL processes

·      Monitoring tools such as Apache Airflow

·      Data quality and integrity assurance

·      Strong analytical and problem-solving abilities

·      Excellent written and verbal communication

Additional Skillset:

·      Familiarity with monitoring and managing Apache Airflow DAGs.

Experience:

·      Minimum of 5 years’ experience in a similar role, with a focus on data management and process automation.

·      Proven track record of successfully managing complex data processes and meeting deadlines.

Education:

Bachelor’s degree in Computer Science, Information Technology, Data Science, or a related field.