Data Warehouse Architect (contract)
Xcel Energy
IT
Denver, CO, USA
USD 81-108 / hour
Posted on Mar 14, 2026
Data Architect – Operational Technology to Cloud (Databricks)
Role Summary
The Data Architect – Operational Technology to Cloud will lead the architecture and design of secure, scalable data pipelines that move operational technology (OT) data from on-premises industrial systems into the enterprise Databricks Lakehouse platform.
This role focuses on enabling governed analytics and AI-ready data products by integrating historian systems, sensor telemetry, and industrial asset data into cloud-based data platforms. The architect will collaborate with business leaders, platform engineers, and security teams to modernize legacy data movement pathways and create efficient, secure methods for delivering operational data across the enterprise.
The position plays a critical role in improving how high-volume industrial data is ingested, secured, and exposed for dashboards, analytics, and AI use cases.
Key Responsibilities
Architecture & Solution Design
- Lead the end-to-end architecture for ingesting and integrating operational technology data into the Databricks Lakehouse.
- Evaluate and design modern data movement patterns for historian systems, asset telemetry, and industrial data sources.
- Architect scalable pipelines using batch, streaming, and CDC ingestion patterns.
- Design data layers including landing, curated, and serving layers within the lakehouse architecture.
- Integrate data from OSI PI, AVEVA systems, SCADA platforms, and industrial telemetry sources.
- Design methods for ingesting high-volume asset sensor data with low latency and high reliability.
- Translate OT asset context and hierarchies into analytics-ready data models.
- Define networking and security controls for moving data from secure on-prem environments into AWS-hosted Databricks workspaces.
- Architect solutions using VPCs, subnets, private connectivity, IAM policies, and encryption.
- Ensure adherence to enterprise security requirements and governance standards.
- Implement data governance standards, including:
  - Unity Catalog-based access controls
  - Data classification and stewardship
  - Metadata management and lineage
- Establish standards for data quality, observability, and reliability across pipelines.
- Partner with security, infrastructure, analytics, and operations teams to align on architecture decisions.
- Create architecture documentation, reference patterns, and implementation playbooks.
- Support engineering teams during solution delivery and architecture reviews.
- Ensure operational data is structured and governed for AI, machine learning, and advanced analytics use cases.
- Support development of AI-ready data products and feature engineering datasets.
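One responsibility above is translating OT asset context and hierarchies into analytics-ready data models. As a minimal, hedged sketch (stdlib Python only; the asset paths and record fields are invented for illustration, not from any real historian), flattening PI-style slash-delimited asset paths into parent/child records for a curated table might look like:

```python
# Illustrative sketch only: flatten historian-style asset paths
# (e.g. "Plant/Unit/Sensor") into (asset, parent, depth) records
# suitable for loading into a curated lakehouse table.

def flatten_hierarchy(paths):
    """Turn slash-delimited asset paths into analytics-ready rows."""
    records = []
    seen = set()
    for path in paths:
        parts = path.strip("/").split("/")
        for depth, name in enumerate(parts):
            full = "/".join(parts[: depth + 1])
            if full in seen:
                continue  # emit each node in the hierarchy once
            seen.add(full)
            parent = "/".join(parts[:depth]) if depth else None
            records.append({"asset": full, "parent": parent, "depth": depth})
    return records

if __name__ == "__main__":
    rows = flatten_hierarchy([
        "PlantA/Boiler1/TempSensor01",
        "PlantA/Boiler1/PressureSensor02",
    ])
    for row in rows:
        print(row)
```

In a real pipeline this shaping step would typically run in Spark against the historian export rather than in plain Python; the point is only the shape of the transformation.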
This role supports enterprise initiatives focused on modernizing operational data pipelines across the organization.
Currently, OT data from industrial systems is available but often moves through high-latency, costly, or legacy pathways. The architect will evaluate existing infrastructure and design new secure data channels that improve performance, scalability, and cost efficiency.
The goal is to make operational data more accessible for:
- AI and advanced analytics
- Enterprise dashboards
- Operational decision-making
- Cross-department data products
Experience
Required Qualifications
- 6–10 years of experience in data architecture, data engineering, or industrial data platforms
- Experience working in operational technology (OT), utilities, energy, or industrial environments
- Strong experience with:
  - Databricks Lakehouse architecture
  - Delta Lake
  - Spark / PySpark
  - Python
- Experience designing data ingestion pipelines (ETL / ELT / CDC)
- Familiarity with time-series and asset-centric data modeling
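Time-series and asset-centric modeling often comes down to windowed aggregation of raw telemetry. As a hedged, stdlib-only sketch (sensor IDs, timestamps, and values are invented for illustration), downsampling sensor readings into fixed one-minute buckets might look like:

```python
from collections import defaultdict

# Illustrative sketch only: bucket raw (sensor, timestamp, value) telemetry
# into fixed-width window averages -- a typical time-series curation step
# before the data lands in a serving layer.

def downsample(readings, bucket_seconds=60):
    """readings: iterable of (sensor_id, epoch_seconds, value).
    Returns {(sensor_id, bucket_start_epoch): mean value}."""
    buckets = defaultdict(list)
    for sensor, ts, value in readings:
        bucket = int(ts // bucket_seconds) * bucket_seconds
        buckets[(sensor, bucket)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

if __name__ == "__main__":
    raw = [("temp01", 0, 10.0), ("temp01", 30, 20.0), ("temp01", 65, 40.0)]
    print(downsample(raw))
    # first minute averages 15.0; the second minute holds only 40.0
```

At production scale the same windowing would be expressed in Spark Structured Streaming; the pure-Python version just shows the aggregation logic.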
- Experience integrating data from systems such as:
  - OSI PI Historian
  - AVEVA platforms
  - SCADA systems
  - Asset sensor telemetry
- Experience designing secure data architectures in AWS
- Knowledge of:
  - VPC networking
  - IAM
  - Encryption
  - Private connectivity
  - Cloud security controls
- Experience with:
  - Unity Catalog
  - Metadata management
  - Data lineage
  - Data classification
  - Master data governance
- Ability to communicate with both technical and business stakeholders
- Experience working across multiple architecture and engineering teams
Preferred Qualifications
- Experience integrating AVEVA PI Asset Framework with cloud data platforms
- OT/SCADA background with OSI PI experience
- Experience building real-time or near-real-time OT data pipelines
- Databricks or AWS certifications
- Experience working in regulated utility or energy environments
- Familiarity with IoT device data and industrial telemetry streams
Key Competencies
- Architecture leadership
- Security-by-design mindset
- Systems thinking
- Cross-team collaboration
- Technical coaching and mentorship
- Clear technical documentation and communication
Success in This Role Will Be Measured By
- Delivery of secure OT data products within Databricks
- Reduction in latency and cost of operational data pipelines
- Faster onboarding of new OT data sources
- Adoption of standard architecture patterns across teams
- Positive reviews from enterprise architecture and security boards
Interview Process
- Round 1: Technical interview with architecture and data platform team members
- Round 2: Panel interview with cross-functional stakeholders (3 team members)
The ideal candidate combines strong data platform engineering skills with operational technology experience. They understand how industrial assets generate telemetry data and can design secure, modern pipelines that bring that data into cloud analytics platforms.
This role requires someone who can balance technical depth with collaboration, helping multiple teams modernize legacy data infrastructure while maintaining strict governance and security standards.
Pay Rate Range
USD 81-108 per hour