
Senior Data Engineer - Latency Critical Trading Technology

Millennium Management

IT, Data Science
Miami, FL, USA
Posted 6+ months ago

We are building a world-class packet capture data platform that will power the next generation of our systematic portfolio engines. The Latency Critical Trading Group is looking for a Senior Data Engineer to join our growing team, which consists of low-latency Linux engineers, network engineers, datacenter engineers, and C++ engineers responsible for building our low-latency stack. This is an opportunity for individuals who are passionate about quantitative investing: the role builds knowledge and skills in its four key areas, namely data, statistics, technology, and financial markets.

Desirable Candidates

  • Ph.D. or master's degree in computer science, mathematics, statistics, or another field requiring quantitative analysis

  • 10+ years of financial industry experience

  • Experience working with systematic investing teams

  • Experience working on a trading desk

  • Experience managing global teams of 10+ engineers

  • Programming expertise in Python and C++

  • Experience with statistical toolkits such as MATLAB, R, or Python/scikit-learn

  • Hands-on experience building mathematical and numeric tools for efficient computation

  • SQL programming skills

  • Strong problem-solving skills

  • Effective communication skills

Job Responsibilities

  • Leadership

    • Interact with portfolio managers and quantitative analysts to understand their use cases

    • Partner with a global user base to organize data requirements; effectively communicate data quality characteristics

    • Architect and design our framework for low latency data engineering

    • Serve as a thought leader, showcasing best practices and modern research techniques

    • Provide analytical expertise to portfolio management teams to assist in understanding large market-data sets

    • Provide guidance and training to the internal data quality team to build their expertise and research capabilities

  • Building a Data Platform

    • Building technology tools to acquire and tag datasets

    • Building data analysis tools on top of our captured data

    • Improving data visualization tools and capabilities

    • Consolidating several PCAPs into a single PCAPNG with metadata representing capture location and time across capture regions (sketched after this list)

  • Microstructure SME

    • Building exchange microstructure expertise and helping educate clients as needed

    • Analyzing datasets to generate key descriptive statistics

  • Data Evaluation

    • Assessing the quality of all PCAPs, both live and historical

    • Extending feed parsing and identifying other types of gaps

  • Data Structuring

    • Inventorying all gaps (sketched after this list)

    • Documenting history of exchange microstructure behavior changes

    • Understanding and documenting session times, including holiday schedules

    • Understanding and documenting message timestamp rules for each exchange

    • Documenting history of session schedule changes and protocols

  • Data Monitoring

    • Comparing day-on-day changes in latency, data rate, bursts, etc. (sketched after this list)

  • Data Cleaning

    • Closing gaps via multiple techniques (other internal captures, vendor data, exchange data, etc.)

    • Cleaning historical data

  • Technical

    • Assessing PTP (Precision Time Protocol) quality within a single source and across multiple sources (sketched after this list)

  • Research

    • Researching potential alpha sources and presenting findings to portfolio managers and quantitative analysts

  • Other

    • Engaging with vendors and brokers to understand the characteristics of datasets

    • Utilizing and maintaining world-class data processing and transformation techniques
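
To make the PCAP-consolidation responsibility concrete, here is a minimal sketch of merging several classic PCAPs into one PCAPNG, giving each source capture its own interface block whose description option carries the capture location. It assumes microsecond little-endian PCAP inputs; the file names and location labels ("NY4", "LD4") are illustrative, not actual tooling.

```python
import struct

def read_pcap(path):
    """Yield (ts_sec, ts_usec, data) records from a classic little-endian PCAP."""
    with open(path, "rb") as f:
        magic, = struct.unpack("<I", f.read(4))
        assert magic == 0xA1B2C3D4, "sketch assumes a microsecond little-endian pcap"
        f.read(20)  # skip the rest of the 24-byte global header
        while hdr := f.read(16):
            ts_sec, ts_usec, incl_len, _orig_len = struct.unpack("<IIII", hdr)
            yield ts_sec, ts_usec, f.read(incl_len)

def option(code, value: bytes) -> bytes:
    """Encode one PCAPNG option, padded to a 4-byte boundary."""
    pad = (-len(value)) % 4
    return struct.pack("<HH", code, len(value)) + value + b"\x00" * pad

def block(block_type, body: bytes) -> bytes:
    """Wrap a block body with its type and leading/trailing total length."""
    total = 12 + len(body)
    return struct.pack("<II", block_type, total) + body + struct.pack("<I", total)

def write_pcapng(out_path, sources):
    """sources: list of (pcap_path, location_label) pairs."""
    with open(out_path, "wb") as out:
        # Section Header Block: byte-order magic, version 1.0, unknown section length.
        shb = struct.pack("<IHHq", 0x1A2B3C4D, 1, 0, -1) + option(0, b"")
        out.write(block(0x0A0D0D0A, shb))
        packets = []
        for if_id, (path, location) in enumerate(sources):
            # One Interface Description Block per source capture; the
            # if_description option (code 3) carries the capture location.
            idb = struct.pack("<HHI", 1, 0, 0)  # linktype=EN10MB, snaplen unlimited
            idb += option(3, location.encode()) + option(0, b"")
            out.write(block(0x00000001, idb))
            for ts_sec, ts_usec, data in read_pcap(path):
                packets.append((ts_sec * 1_000_000 + ts_usec, if_id, data))
        # Enhanced Packet Blocks, merged in timestamp order across regions
        # (microsecond timestamps, the PCAPNG default resolution).
        for ts, if_id, data in sorted(packets):
            pad = (-len(data)) % 4
            epb = struct.pack("<IIIII", if_id, ts >> 32, ts & 0xFFFFFFFF,
                              len(data), len(data))
            epb += data + b"\x00" * pad + option(0, b"")
            out.write(block(0x00000006, epb))

# Hypothetical inputs: one capture per region, tagged by datacenter.
write_pcapng("merged.pcapng",
             [("ny4_feed.pcap", "NY4"), ("ld4_feed.pcap", "LD4")])
```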
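
For the gap assessment and inventory items, a minimal sketch of a sequence-number gap scanner: it walks per-channel sequence numbers from a decoded feed and records every hole. The (channel, seq) record shape is a hypothetical simplification; real feed handlers decode exchange-specific headers first.

```python
from collections import defaultdict

def inventory_gaps(records):
    """records: iterable of (channel, seq); returns {channel: [(first_missed, last_missed), ...]}."""
    last_seq = {}
    gaps = defaultdict(list)
    for channel, seq in records:
        prev = last_seq.get(channel)
        if prev is not None and seq > prev + 1:
            # Missed sequence numbers prev+1 .. seq-1 on this channel.
            gaps[channel].append((prev + 1, seq - 1))
        # max() so late retransmissions don't rewind the high-water mark.
        last_seq[channel] = max(seq, last_seq.get(channel, seq))
    return dict(gaps)

feed = [("A", 1), ("A", 2), ("A", 5), ("B", 10), ("B", 11), ("A", 6)]
print(inventory_gaps(feed))  # {'A': [(3, 4)]}
```

The resulting inventory is the raw material for the gap-closing step above: each (channel, range) entry can be matched against other internal captures, vendor data, or exchange data.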
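
For day-on-day monitoring, a minimal sketch: summarize one day's packet timestamps (message rate, inter-arrival percentiles, burst count) and diff the summary against the previous day's. Timestamps are epoch nanoseconds; the 100 µs burst window and 20% alert threshold are illustrative choices, not house policy.

```python
import numpy as np

def summarize(ts_ns: np.ndarray, burst_window_ns: int = 100_000) -> dict:
    """Descriptive statistics for one day's packet arrival timestamps."""
    gaps = np.diff(np.sort(ts_ns))
    return {
        "msgs_per_sec": len(ts_ns) / ((ts_ns.max() - ts_ns.min()) / 1e9),
        "gap_p50_ns": float(np.percentile(gaps, 50)),
        "gap_p99_ns": float(np.percentile(gaps, 99)),
        "bursts": int((gaps < burst_window_ns).sum()),
    }

def day_on_day(today: dict, yesterday: dict, threshold: float = 0.20) -> dict:
    """Return (yesterday, today) for any metric that moved more than `threshold`."""
    return {k: (yesterday[k], today[k]) for k in today
            if yesterday[k] and abs(today[k] / yesterday[k] - 1) > threshold}

# Synthetic two-day example: day 2 runs roughly 25% hotter than day 1.
rng = np.random.default_rng(0)
day1 = np.cumsum(rng.exponential(1e6, 100_000)).astype(np.int64)
day2 = np.cumsum(rng.exponential(8e5, 100_000)).astype(np.int64)
print(day_on_day(summarize(day2), summarize(day1)))
```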
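
And for the PTP quality check, one possible approach, sketched under simplifying assumptions: the same multicast packet captured at two PTP-disciplined capture points should show a stable timestamp offset, so match packets by payload hash and study the delta distribution. The (payload, ns-timestamp) input shape is an assumption for illustration.

```python
import statistics
from hashlib import blake2b

def offsets_between_captures(cap_a, cap_b):
    """cap_a/cap_b: iterables of (payload_bytes, ts_ns) for the same feed."""
    seen_a = {blake2b(p, digest_size=16).digest(): ts for p, ts in cap_a}
    # Timestamp delta for every packet observed at both capture points.
    return [ts_b - seen_a[h] for p, ts_b in cap_b
            if (h := blake2b(p, digest_size=16).digest()) in seen_a]

# Synthetic example: capture B sees every other packet, 250 ns later.
cap_a = [(b"pkt%03d" % i, 1_000 * i) for i in range(1000)]
cap_b = [(b"pkt%03d" % i, 1_000 * i + 250) for i in range(0, 1000, 2)]
deltas = offsets_between_captures(cap_a, cap_b)
# A drifting median or widening spread suggests a clock-sync problem at one site.
print(statistics.median(deltas), statistics.pstdev(deltas))
```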