Senior Lead Software Engineer - Databricks & Python

JPMorganChase

Software Engineering

Glasgow, UK

Posted on Apr 28, 2026

Join an agile team delivering high-quality technology products.

As a Senior Lead Software Engineer (Python/Java, Spring, API, AWS) at JPMorganChase in Global Finance Technology, you’ll build secure, scalable solutions, write production code and tests, review code, and pair program. In Regulatory Reporting, you’ll develop and optimize Python/PySpark big data pipelines across asset classes.

Job responsibilities

  • Lead by example: contribute production‑quality code daily, write tests, perform code reviews, and pair program.
  • Design and deliver secure, high‑quality microservices and web applications with Java/Spring Boot and React; own deep debugging, root‑cause analysis, and performance tuning for high-availability services.
  • Own end‑to‑end build‑and‑run: design, implement, test, deploy, and operate services (“you build it, you run it”).
  • Design, develop, and maintain robust big data pipelines using Python and PySpark on the Databricks platform on AWS, optimizing complex queries and data processing workflows for efficient performance at scale.
  • Provide technical leadership and act as SME for microservices, CI/CD, observability, performance engineering, and data modelling.
  • Champion SDLC practices: test automation, CI/CD, security‑by‑default, quality gates, and disciplined change management.
  • Embed non‑functional requirements (security, scalability, reliability, observability, cost) into designs; implement logging, metrics, tracing, SLOs, error budgets, alerting, and runbooks.
  • Foster an inclusive team culture; mentor engineers and drive continuous improvement and craftsmanship.
  • Participate in the full Software Development Life Cycle (SDLC), including requirements gathering, design, development, testing, deployment, and maintenance, working with our partners, Product Owners, and end users to support their business use cases.
  • Implement data quality checks, monitoring, and alerting mechanisms to ensure data accuracy and pipeline reliability; act as both the Production Support and SRE function as part of daily responsibilities.
  • Hands‑on use of AI coding assistants to accelerate delivery (e.g., Claude Code/Claude CoWork, IDE Copilot): prompt design, code/test generation, refactoring, and documentation synthesis; ability to validate outputs for correctness, security, performance, and licensing compliance.
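To make the data-quality responsibility above concrete, here is a minimal sketch in plain Python. In practice these rules would run as PySpark column expressions over a Databricks table; the rule names, fields, and threshold below are all hypothetical, purely for illustration.

```python
# Minimal sketch of row-level data-quality checks with an alerting
# threshold. Pure Python for illustration; a production pipeline would
# express these rules in PySpark on Databricks. All names (rules,
# fields, threshold) are hypothetical.
from typing import Callable

# Each rule maps a record to True (pass) or False (fail).
RULES: dict[str, Callable[[dict], bool]] = {
    "notional_non_negative": lambda r: r.get("notional", 0) >= 0,
    "trade_id_present": lambda r: bool(r.get("trade_id")),
    "currency_is_iso": lambda r: isinstance(r.get("currency"), str)
    and len(r["currency"]) == 3,
}


def run_checks(records: list[dict], fail_threshold: float = 0.01) -> dict:
    """Evaluate every rule over every record; mark a rule as breached
    when its failure rate exceeds fail_threshold (triggering an alert)."""
    total = len(records)
    failures = {name: 0 for name in RULES}
    for record in records:
        for name, rule in RULES.items():
            if not rule(record):
                failures[name] += 1
    breached = {
        name: count
        for name, count in failures.items()
        if total and count / total > fail_threshold
    }
    return {"total": total, "failures": failures, "breached": breached}
```

A scheduled job could call `run_checks` after each pipeline stage and page the on-call engineer whenever `breached` is non-empty, which is one plausible shape for the Production Support/SRE duties described above.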

Required qualifications, capabilities, and skills

  • Strong hands-on experience in data engineering or related roles.
  • Strong proficiency in Python and PySpark for large-scale data processing.
  • Advanced proficiency in Java and Spring Boot; strong fundamentals, design patterns, and secure coding.
  • Full‑stack delivery with React (component/state management) and secure RESTful API design.
  • Demonstrated experience with AWS, Databricks, and the Apache Spark ecosystem.
  • Reliability and performance engineering: concurrency, thread management, caching, and resiliency patterns (circuit breakers, retries, backoff), with cost awareness.
  • Proven track record shipping and operating production systems; comfortable troubleshooting in Kubernetes, CI/CD, and cloud environments.
  • Relational and NoSQL databases: schema design, performance tuning, and secure data access.
  • Experience with AWS cloud services (S3, ECS, SNS/SQS, Lambda, etc.).
  • Strong analytical skills with the ability to investigate data issues, identify root causes, and implement solutions.
  • Experience with the complete SDLC, Jules/Jenkins, Spinnaker, Sonar, and Agile methodologies.
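As an illustration of the resiliency patterns named above, here is a minimal retry-with-exponential-backoff sketch in plain Python. The function and parameter names are illustrative only, not part of any JPMorganChase codebase, and a real service would typically pair this with a circuit breaker.

```python
# Sketch of retry with exponential backoff and full jitter, one of the
# resiliency patterns listed in the requirements. Names and defaults
# are hypothetical.
import random
import time


def call_with_backoff(fn, max_attempts=4, base_delay=0.5, max_delay=8.0,
                      sleep=time.sleep):
    """Retry a flaky call, doubling the delay cap after each failure.

    Jitter (a random fraction of the cap) spreads retries from many
    clients so they do not hammer a recovering dependency in lockstep.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # retry budget exhausted: surface the failure
            # Exponential backoff with full jitter, capped at max_delay.
            delay = min(max_delay, base_delay * (2 ** (attempt - 1)))
            sleep(random.uniform(0, delay))
```

The injectable `sleep` parameter is a deliberate design choice: it keeps the backoff logic unit-testable without real waits, which fits the test-automation emphasis elsewhere in this posting.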

Preferred qualifications, capabilities, and skills

  • Understanding of regulatory finance/external reporting (workflows, aggregations, reconciliations, controls).
  • Cloud certifications; proven cloud‑native delivery on AWS.
  • Experience with large‑scale distributed systems, event‑driven architectures, and messaging/streaming patterns.
  • Observability/SRE depth: telemetry pipelines, alerting strategies, incident response, post‑mortems, and continuous improvement.
  • Experience with data orchestration tools (Airflow, Step Functions, etc.).
  • Understanding of the financial services industry and regulatory requirements.
  • Databricks or AWS certifications.
  • Automated testing frameworks, e.g. Playwright, Cucumber, Gherkin.
  • Experience with data formats such as Parquet, JSON, CSV, Avro, and Delta Lake.


J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world’s most prominent corporations, governments, wealthy individuals and institutional investors. Our “first-class business in a first-class way” approach to serving clients drives everything we do. We strive to build trusted, long-term partnerships to help our clients achieve their business objectives.
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we’re setting our businesses, clients, customers and employees up for success.
Promote secure, scalable finance tech, build data pipelines and microservices, boost reliability, and mentor engineers for high impact.