Software Engineer III - AWS Data ETL

JPMorganChase

Software Engineering

Mumbai, Maharashtra, India

Posted on May 15, 2026

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.

As a Software Engineer III at JPMorganChase within Consumer and Community Banking, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities

  • Own end‑to‑end delivery for data warehousing and multi‑terabyte to petabyte‑scale migration initiatives, ensuring reliability, performance, security, and cost efficiency.
  • Provide technical leadership across Python, Spark, Snowflake, and AWS, while driving strong Agile practice, CI/CD automation, and operational excellence.
  • Foster a culture of ownership, collaboration, psychological safety, and continuous learning.
  • Own multi‑workstream roadmaps; plan releases, define milestones, track progress, and remove blockers to hit scope, schedule, and quality targets.
  • Create detailed estimates and work breakdown structures, manage dependencies, risks, and stakeholder expectations.
  • Define and evolve data platform architecture for batch and streaming use cases (event‑driven, microservices, warehouse patterns).
  • Manage data migration (on‑prem to cloud), data profiling, reconciliation, lineage, quality (e.g., deduplication, schema validation), and backfills.
  • Define SLAs; implement monitoring/alerting; lead incident response, root cause analysis, and continuous improvement.
  • Implement LLM and agentic models for business data insight analysis and reporting.

Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 3+ years applied experience
  • Hands-on practical experience in system design, application development, testing, operational stability, and CI/CD processes.
  • Hands-on practical experience developing Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions using key components such as Spark Core and Spark Streaming.
  • Proficiency in coding in one or more programming languages, such as Java or Python.
  • Experience working on data warehouse platforms.
  • Cloud implementation experience with AWS, including:
    • AWS Data Services: Proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Catalog, Athena, and Airflow or Lambda + Step Functions + EventBridge
    • Data De/Serialization: Expertise in at least two of the following formats: Parquet, Iceberg, Avro, JSON
    • AWS Data Security: Good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager
    • Exposure to Terraform
  • In-depth knowledge of the financial services industry and its IT systems. Practical cloud-native experience, preferably with AWS.

Preferred qualifications, capabilities, and skills

  • Experience with Snowflake and Databricks.
  • Experience with Gen AI.
