Software Engineer III, PySpark, Databricks
JPMorganChase
We have an exciting and rewarding opportunity for you to take your data engineering career to the next level.
As a Senior Data Engineer - Databricks/PySpark/AWS at JPMorgan Chase within the Corporate Sector - Global Finance team, you will be a key member of an agile team, responsible for building and delivering AWS-enabled data products that are secure, stable, and scalable. In this role, you will develop data infrastructure, tool integrations, and retrieval systems that enable, interpret, and act on enterprise data in support of the firm’s business goals. You will work alongside senior engineers, grow your expertise in agentic AI data engineering, and contribute to a culture of engineering excellence.
Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years applied experience
- 5+ years of practical experience with Spark, SQL, Databricks, and the AWS cloud ecosystem
- Expertise in Apache NiFi, Lakehouse/Delta Lake architectures, system design, application development, testing, and operational stability
- Strong programming skills in PySpark and Spark SQL
- Proficient in orchestration using Airflow
- In-depth knowledge of big data and data warehousing concepts
- Experience with CI/CD processes
- Solid understanding of agile methodologies, including DevOps practices, application resiliency, and security
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
Preferred qualifications, capabilities, and skills
- Familiarity with Snowflake, Terraform, and LLMs
- Exposure to AWS cloud technologies such as Glue, S3, SQS, SNS, and Lambda
- AWS certifications such as Solutions Architect Associate, Developer Associate, or Data Analytics Specialty, or a Databricks certification