Data Engineer

Societe Generale

Software Engineering, Data Science

Montreal, QC, Canada

Posted on Apr 17, 2026

Responsibilities

ABOUT THE JOB:
Global Banking Technology & Operations (GBTO) Canada delivers day-to-day services to Société Générale's corporate and investment banking units and their clients to accelerate their transformation. GBTO differentiates itself from competitors through the pace of its agile transformation: spreading a technology and data culture, shortening decision-making, and adopting a truly industrial approach, leveraging teams that are either transversal or aligned to the various sub-business units.

The Counterparty Credit Risk (CCR) team belongs to the XRM (Cross Risk Metrics) department, which supports and develops tools primarily for the risk department (RISQ). You will join the CCR AMER feature team, which is in charge of several applications, including post-origination portfolio monitoring tools shared by the first and second lines of defense (1LOD and 2LOD). These tools are used to digitize key forms and decisions, automate manual processes and permanent controls, create a clean audit trail, generate real-time reporting, and improve data quality.

We rely heavily on the Agile framework (Scrum):

  • To build a solid and trusting relationship with our functional partners through continuous and transparent communication.
  • To maintain a strong team dynamic focused on delivering value to users and maintaining the application with a long-term view.

Autonomous in your development work, you will always have the team's support to keep improving both functionally and technically. You will also be encouraged to share your point of view and your approaches to software craftsmanship. Continuous improvement is key for us!

  • Create and maintain clean audit trails and generate real-time reporting.
  • Improve data quality and reliability across applications.
  • Work within an Agile Scrum framework to ensure continuous and transparent communication with functional partners.
  • Contribute to a strong team dynamic focused on long-term value delivery and application sustainability.
  • Design, build, and maintain scalable and robust data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
  • Participate in data remodeling initiatives supporting the migration from on-premises private cloud environments to Microsoft Azure.
  • Develop, test, and maintain databases, large-scale processing systems, and data warehouses optimized for cloud deployment.
  • Collaborate with risk analysts and other stakeholders to gather data requirements and deliver high-quality datasets for analytics and machine learning.
  • Optimize data delivery by redesigning infrastructure for scalability, reliability, and efficiency.
  • Implement and enforce data security, governance, and compliance standards in line with internal policies and regulatory requirements.
  • Monitor and troubleshoot data systems to ensure data integrity, availability, and performance.
  • Automate repetitive data tasks and workflows to improve operational efficiency.
  • Stay up to date with industry trends, tools, and cloud technologies to continuously improve data engineering practices.
  • Document data engineering processes, data flows, and system configurations.
  • Manage release processes, including release notes, deployment scripts, contingency plans, and rollback procedures.
  • Participate in L3 production support and incident management, including root cause analysis and resolution.