25-03-2026 16:30:24
Job_303888
Experience: 2-4 years
Job Location: Preferably Bangalore; other locations considered
Need By: Immediate
Required Qualifications & Skills:
Mandatory: Java, Spring Boot, Hadoop, SQL
1. Technical Expertise:
Backend: Java (6+), Spring Boot, Microservices, REST APIs, Kafka, PostgreSQL, Hive.
CI/CD & DevOps: Jenkins, ArgoCD, Helm, Kubernetes, Docker, Terraform (preferred).
Cloud Platforms: Azure, GCP (certifications are a plus).
Big Data: Hadoop, Hive, Spark, NiFi (a strong plus).
Agile Tools: Jira, Confluence, Git.
2. Experience:
2-4 years of hands-on software development experience in enterprise applications.
Proven track record as a technical lead or technical anchor delivering complex projects.
Strong debugging, problem-solving, and performance-tuning skills.
3. Soft Skills:
Excellent communication skills for cross-functional collaboration.
Ability to think critically and provide innovative solutions to technical challenges.
Job Description
Hands-on Technical Expertise & Problem-Solving
· Provide deep technical expertise in backend technologies (Java, Spring Boot, Hadoop, Kafka, PostgreSQL, Hive).
· Collaborate with architects and senior engineers to define and implement scalable architectures.
Software Development & System Design
· Design efficient database schemas and optimize queries for PostgreSQL and Hive.
· Develop real-time data streaming solutions using Kafka and integrate them into enterprise applications.
· Implement distributed computing solutions for handling large-scale data processing (Spark/NiFi preferred).
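The query-optimization duty above can be sketched conceptually. This is a minimal illustration of index-driven query planning, using SQLite (Python stdlib) as a stand-in for PostgreSQL/Hive; the `orders` table and `customer_id` column are hypothetical, not taken from this posting:

```python
import sqlite3

# In-memory SQLite database as a stand-in for PostgreSQL/Hive.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"

# Without an index on the filter column, the planner must scan the table.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# Adding an index on the filter column lets the planner do an index search.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(plan_before[-1][-1])  # full scan of orders
print(plan_after[-1][-1])   # search using idx_orders_customer
```

The same principle (index the columns your WHERE clauses filter on, then verify with the planner's EXPLAIN output) carries over to PostgreSQL and, with partitioning/bucketing in place of indexes, to Hive.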
DevOps, CI/CD, and Cloud Deployment
· Design and maintain robust CI/CD pipelines using Jenkins, ArgoCD, and Helm to streamline build, test, and deployment workflows.
· Automate infrastructure provisioning and deployments across Azure and GCP cloud platforms.
· Implement monitoring, logging, and alerting solutions to ensure system reliability.
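The monitoring/alerting responsibility above can be sketched with Python's stdlib logging: a custom handler that captures ERROR-and-above records for forwarding to an alert channel. The service name and the list-based "channel" are hypothetical; a real system would forward to a pager or chat webhook:

```python
import logging

class AlertingHandler(logging.Handler):
    """Collects ERROR-and-above records so they can be forwarded to an
    alerting channel. The target here is just a list for illustration."""

    def __init__(self):
        # Handler-level filtering: records below ERROR are ignored.
        super().__init__(level=logging.ERROR)
        self.alerts = []

    def emit(self, record):
        self.alerts.append(self.format(record))

logger = logging.getLogger("orders-service")  # hypothetical service name
logger.setLevel(logging.INFO)
alerter = AlertingHandler()
alerter.setFormatter(logging.Formatter("%(levelname)s %(name)s: %(message)s"))
logger.addHandler(alerter)

logger.info("request handled in 42 ms")            # informational only
logger.error("upstream Kafka broker unreachable")  # captured as an alert

print(alerter.alerts)
```

The design choice here (alerting as a log handler rather than scattered ad-hoc calls) keeps instrumentation in one place, which is the same idea behind wiring application logs into tools like Prometheus Alertmanager or ELK-based alerting.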
Performance Optimization & Scalability
· Conduct code reviews, performance tuning, and debugging to ensure optimal system efficiency.
· Identify and resolve bottlenecks in databases, APIs, and frontend components to meet performance SLAs.
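One common class of bottleneck named above (a hot path doing repeated linear scans) can be shown in a few lines. This is a hypothetical example, assuming an ID-blocklist check on a request path; swapping a list for a set changes membership tests from O(n) to amortized O(1):

```python
import time

# Hypothetical hot path: checking incoming IDs against a blocklist.
blocklist = list(range(20_000))
ids_to_check = list(range(19_500, 20_500))  # half present, half absent

# O(n) membership test per lookup.
start = time.perf_counter()
hits_list = sum(1 for i in ids_to_check if i in blocklist)
list_elapsed = time.perf_counter() - start

# One-time conversion, then O(1) membership test per lookup.
blockset = set(blocklist)
start = time.perf_counter()
hits_set = sum(1 for i in ids_to_check if i in blockset)
set_elapsed = time.perf_counter() - start

print(f"list: {list_elapsed:.4f}s  set: {set_elapsed:.4f}s  "
      f"same result: {hits_list == hits_set}")
```

Measuring first (here with `time.perf_counter`; in production with a profiler) and only then changing the data structure is the workflow this bullet asks for.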
Big Data Engineering & Analytics
· Develop data pipelines and batch/real-time processing solutions leveraging Hadoop, Kafka, Hive, Spark, and NiFi.
· Optimize data ingestion, transformation, and query performance to handle large-scale data workloads efficiently.
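The batch-pipeline duty above follows the map → shuffle → reduce shape that Hadoop and Spark distribute across a cluster. A minimal single-process sketch of that shape (a word count over a hypothetical batch of records, not any framework's actual API):

```python
from collections import defaultdict

def map_phase(records):
    # Emit (key, 1) pairs for each word, like a Hadoop mapper.
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    # Group values by key, like the framework's shuffle/sort step.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Aggregate each key's values, like a Hadoop reducer.
    return {key: sum(values) for key, values in grouped.items()}

batch = ["Kafka feeds Hive", "Hive feeds Spark", "Spark writes results"]
counts = reduce_phase(shuffle_phase(map_phase(batch)))
print(counts)
```

In a real Hadoop or Spark job only the map and reduce logic is written by the engineer; the shuffle, partitioning, and distribution across nodes are handled by the framework.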
Agile Execution & Technical Excellence
· Work in an Agile environment, actively participating in sprint planning, stand-ups, and retrospectives.
· Ensure projects are delivered on time with high-quality standards, following industry best practices.
Technical Documentation & Best Practices
· Maintain high-quality technical documentation, including architectural decisions, best practices, and coding standards.
· Advocate for clean code, design patterns, and modern software engineering principles.