Google Cloud Data Engineer

Guidehouse

Data Science
McLean, VA, USA
Posted on Dec 18, 2025

Job Family: Data Science Consulting
Travel Required: Up to 10%
Clearance Required: Ability to Obtain Public Trust

What You Will Do:

Guidehouse is seeking an experienced Data Engineer to join our Technology AI and Data practice within the Defense & Security segment. This individual will have a strong data engineering background and be a hands-on technical contributor, responsible for designing, implementing, and maintaining scalable, cloud-native data pipelines that power interactive dashboards and enable federal clients to achieve mission outcomes, operational efficiency, and digital transformation. This is an exciting opportunity for someone who thrives at the intersection of data engineering, Google Cloud technologies, and public sector modernization. The Data Engineer will collaborate with cross-functional teams and client stakeholders to modernize legacy environments, implement scalable BigQuery-centric data pipelines using Dataform and Python, and support advanced analytics initiatives for our federal client in the insurance space.

Client Leadership & Delivery

  • Collaborate with government clients to understand enterprise data architecture, ingestion, transformation, and reporting requirements within a Google Cloud Platform (GCP) environment.

  • Communicate technical designs, tradeoffs, and delivery timelines clearly to both technical and non-technical audiences.

  • Lead the development of extract-transform-load (ETL) and extract-load-transform (ELT) pipelines using Cloud Composer (GCP-hosted Airflow), Dataform, and BigQuery to support our analytical data warehouse powering downstream Looker dashboards (a minimal pipeline sketch follows this list).

  • Adhere to high-quality delivery standards and promote measurable outcomes across data migration and visualization efforts.
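
As a concrete illustration of the pipeline-development bullet above, here is a minimal Cloud Composer (Airflow) DAG sketch that loads raw files from Cloud Storage into a BigQuery staging table and then runs a transformation query. This is a sketch only: the bucket, project, dataset, and table names are hypothetical, and the Dataform step is simplified to a plain BigQuery query task.

    # Minimal Cloud Composer (Airflow) DAG sketch: load raw data from Cloud
    # Storage into BigQuery, then transform it for downstream Looker dashboards.
    # Bucket, project, dataset, and table names are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="claims_elt_pipeline",  # hypothetical pipeline name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Land raw files from Cloud Storage into a staging table.
        load_raw = GCSToBigQueryOperator(
            task_id="load_raw_claims",
            bucket="example-raw-bucket",
            source_objects=["claims/*.csv"],
            destination_project_dataset_table="example-project.staging.claims",
            write_disposition="WRITE_TRUNCATE",
            skip_leading_rows=1,
        )

        # Transform staging data into the analytical table that feeds Looker.
        transform = BigQueryInsertJobOperator(
            task_id="transform_claims",
            configuration={
                "query": {
                    "query": """
                        CREATE OR REPLACE TABLE `example-project.analytics.claims` AS
                        SELECT claim_id, state, claim_amount, DATE(filed_at) AS filed_date
                        FROM `example-project.staging.claims`
                    """,
                    "useLegacySql": False,
                }
            },
        )

        load_raw >> transform

In practice, the transformation step would typically invoke a version-controlled Dataform release rather than inline SQL.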

Solution Development & Innovation

  • Design, develop, and maintain scalable ETL/ELT pipelines using SQL (BigQuery), Dataform (SQLX), Cloud Storage, and Python (Cloud Composer/Airflow, Cloud Functions).

  • Apply modern ELT/ETL and analytics engineering practices using BigQuery and Dataform to enable version-controlled, testable, and maintainable data transformations.

  • Leverage tools such as GitLab and GitHub to manage version control, merge requests, and promotion pipelines.

  • Optimize data pipelines and warehouse performance for large-scale analytical workloads, including partitioning, clustering, incremental processing, and cost optimization to enable downstream BI utilizing Looker (see the sketch after this list).

  • Validate compliance with federal data governance, security, and performance standards.

  • Design and document enterprise data models, metadata strategies, data lineage frameworks, and other relevant documentation, as needed.

  • Align data from multiple discrete datasets into a cohesive, interoperable architecture, identifying opportunities for linkages between datasets, normalization, and field standardization.

  • Assist with cleanup of existing data and models, including use of ETL.
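
To make the partitioning, clustering, and incremental-processing bullet above concrete, here is an illustrative sketch using the google-cloud-bigquery client. The project, dataset, and column names are hypothetical; the pattern is what matters: partition and cluster the target table to limit bytes scanned, then MERGE only recent rows instead of rebuilding the table on every run.

    # Illustrative sketch: create a partitioned, clustered BigQuery table and
    # apply an incremental MERGE so only new or changed rows are processed.
    # Project, dataset, and column names are hypothetical; the "analytics" and
    # "staging" datasets are assumed to exist.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Partitioning by date and clustering by frequently filtered columns limits
    # the bytes scanned per query, which drives both performance and cost.
    client.query(
        """
        CREATE TABLE IF NOT EXISTS analytics.claims
        (
          claim_id STRING,
          state STRING,
          claim_amount NUMERIC,
          filed_date DATE
        )
        PARTITION BY filed_date
        CLUSTER BY state, claim_id
        """
    ).result()

    # Incremental upsert: merge only the most recent staging rows instead of
    # truncating and reloading the whole table.
    client.query(
        """
        MERGE analytics.claims AS target
        USING (
          SELECT claim_id, state, claim_amount, DATE(filed_at) AS filed_date
          FROM staging.claims
          WHERE DATE(filed_at) >= DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY)
        ) AS source
        ON target.claim_id = source.claim_id
        WHEN MATCHED THEN
          UPDATE SET state = source.state,
                     claim_amount = source.claim_amount,
                     filed_date = source.filed_date
        WHEN NOT MATCHED THEN
          INSERT (claim_id, state, claim_amount, filed_date)
          VALUES (source.claim_id, source.state, source.claim_amount, source.filed_date)
        """
    ).result()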

Practice & Team Leadership

  • Work closely with data architects, data scientists, data analysts, and cloud engineers to deliver integrated solutions.

  • Collaborate across Scaled Agile Framework (SAFe) teams and participate in Agile ceremonies including standups, retrospectives, and Program Increment (PI) planning.

  • Manage tasks and consistently document progress and outcomes using Confluence and Jira.

  • Support documentation, testing, and deployment of data products.

  • Mentor junior team members and contribute to reusable frameworks and accelerators.

  • Contribute to thought leadership, business development, and best practice development across the AI & Data team.


What You Will Need:

  • US Citizenship and the ability to obtain and maintain a federal Public Trust clearance. Individuals with an active Public Trust clearance are preferred.

  • Bachelor’s degree in computer science, engineering, mathematics, statistics, or a related technical field.

  • One (1) to five (5) years of experience in data engineering within cloud environments.

  • Strong proficiency in SQL for data modeling and data quality tests, and in Python for pipeline design. Comfortable with the Linux command line for git, deploying code to the cloud, and interacting with files in cloud storage (a data quality test sketch follows this list).

  • Experience with orchestration tools such as Cloud Composer (Airflow), Luigi, Prefect, or Dagster.

  • Experience with modern analytics engineering toolsets (dbt, Dataform, Databricks, etc.) and familiarity with best practices and methodologies behind the tools (data lineage, dependency graphs, tags, data quality tests, etc.).

  • Proven experience with business intelligence tools and cloud platforms (AWS, Azure, GCP).

  • Familiarity with CI/CD practices, principles, and tools such as GitLab, including separation of environments and idempotency as they relate to data pipelines.

  • Experience building ETL/ELT pipelines and integrating data sources into reporting platforms.

  • Familiarity with data governance, metadata, and compliance frameworks.

  • Excellent communication, facilitation, and stakeholder engagement skills.
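
As an illustration of the SQL data quality testing and idempotency points above, the sketch below runs assertion-style checks similar in spirit to Dataform or dbt assertions: each query returns offending rows, so zero rows means the test passes. Table, column, and project names are hypothetical.

    # Illustrative sketch of SQL-based data quality tests driven from Python.
    # Each test query selects rows that violate an expectation; an empty
    # result means the test passes. All names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    QUALITY_TESTS = {
        "claim_id_not_null": """
            SELECT claim_id FROM analytics.claims WHERE claim_id IS NULL
        """,
        "claim_id_unique": """
            SELECT claim_id FROM analytics.claims
            GROUP BY claim_id HAVING COUNT(*) > 1
        """,
        "claim_amount_non_negative": """
            SELECT claim_id FROM analytics.claims WHERE claim_amount < 0
        """,
    }

    failures = []
    for name, sql in QUALITY_TESTS.items():
        rows = list(client.query(sql).result(max_results=10))
        if rows:
            failures.append(f"{name}: {len(rows)}+ offending rows")

    # Failing loudly lets a CI/CD job (e.g., in GitLab) block promotion to the
    # next environment; re-running the checks is idempotent because they only
    # read from the warehouse.
    if failures:
        raise RuntimeError("Data quality tests failed: " + "; ".join(failures))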


What Would Be Nice To Have:

  • Master’s degree in computer science, engineering, mathematics, statistics, or a related technical field.

  • Experience in data engineering within cloud environments.

  • Familiarity with machine learning (ML) and optimal data configurations to support model workloads.

  • Familiarity with Agile project management methodologies and Atlassian toolsets (Confluence, Jira).

  • Understanding of data quality and data pipeline test procedures and best practices.

  • Experience working within matrixed teams of data engineers, BI developers, and QA engineers.

  • Experience implementing cloud data governance and data management tools such as Dataplex.

  • Experience with serverless architectures including cloud functions.

  • Experience with JavaScript to support development in Dataform.

  • Experience using LLM-based coding assistants such as Gemini Code Assist to automate and streamline software development tasks.

  • Relevant GCP certifications such as GCP Professional Data Engineer, GCP Professional Cloud Architect, and GCP Professional ML Engineer.

  • Experience working with public sector clients.

  • Familiarity with federal contracting and procurement processes.


What We Offer:

Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.

Benefits include:

  • Medical, Rx, Dental & Vision Insurance

  • Personal and Family Sick Time & Company Paid Holidays

  • Position may be eligible for a discretionary variable incentive bonus

  • Parental Leave and Adoption Assistance

  • 401(k) Retirement Plan

  • Basic Life & Supplemental Life

  • Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts

  • Short-Term & Long-Term Disability

  • Student Loan PayDown

  • Tuition Reimbursement, Personal Development & Learning Opportunities

  • Skills Development & Certifications

  • Employee Referral Program

  • Corporate Sponsored Events & Community Outreach

  • Emergency Back-Up Childcare Program

  • Mobility Stipend

About Guidehouse

Guidehouse is an Equal Opportunity Employer and will consider all qualified applicants without regard to protected veteran status, disability, or any other basis protected by law, ordinance, or regulation.

Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance including the Fair Chance Ordinance of Los Angeles and San Francisco.

If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at 1-571-633-1711 or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation.

All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or guidehouse@myworkday.com. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process.

If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse’s Ethics Hotline. If you want to check the validity of correspondence you have received, please contact recruiting@guidehouse.com. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant’s dealings with unauthorized third parties.

Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.