Senior Data Engineer, AVP
Deutsche Bank
Location: Pune, India
Role Description
Our Technology, Data and Innovation (TDI) strategy is focused on strengthening engineering expertise, introducing an agile delivery model, and modernising the bank's IT infrastructure through long-term investment and the adoption of cloud computing.
You will be working in the Transaction Monitoring and Data Controls team designing, implementing, and operationalising Java components.
What we’ll offer you
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy
- Best in class leave policy
- Gender-neutral parental leave
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalisation insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above
Your key responsibilities
- Design, build, and maintain scalable and reliable PySpark/DBT/BigQuery data pipelines, predominantly on Google Cloud Platform (GCP), to process high-volume transaction data for regulatory and internal compliance monitoring.
- Implement robust data quality frameworks and monitoring solutions to ensure the accuracy, completeness, and timeliness of data within our critical transaction monitoring systems.
- Contribute to DevOps capabilities to ensure maximum automation of our applications.
- Collaborate across TDI areas such as Cloud Platform, Security, Data, and Risk & Compliance to create optimal solutions for the business, increasing re-use, establishing best practices, and sharing knowledge.
Your skills and experience
- Expert hands-on data engineering experience using at least one of:
  - Java/Scala/Kotlin with a toolset such as Apache Spark, Dataflow/Apache Beam, or Apache Flink
  - Python with a toolset such as PySpark or Dataflow/Apache Beam
  - SQL-based development using DBT
- Professional experience with at least one data warehousing technology (ideally Google BigQuery), including knowledge of partitioning, clustering, and cost/performance optimization strategies.
- Hands-on experience writing and maintaining DevOps pipelines in at least one CI/CD tool such as TeamCity, Jenkins, or GitHub Actions.
- Experience contributing to software design and architecture, including meeting non-functional requirements (e.g., reliability, scalability, observability, testability) and understanding relevant architecture styles and their trade-offs, e.g., data warehouse, ETL, ELT, monolith, batch, and incremental loading vs. stateless processing.
- Experience navigating and engineering within a secure enterprise hybrid cloud environment in a large, regulated, and complex technology landscape.
- Experience working with a globally distributed team, with remote interaction across locations, time zones, and diverse cultures; excellent verbal and written communication skills.
How we’ll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
About us and our teams
Please visit our company website for further information:
https://www.db.com/company/company.html
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.