Data Engineer
<p>I’m building a world-class team to power our next generation of data products. We’re looking for a Senior Data Engineer who knows AWS inside and out—someone who can <strong>design secure, scalable data pipelines</strong>, <strong>own ETL/ELT workflows</strong>, <strong>engineer cloud data infrastructure</strong>, and <strong>deliver dimensional and semantic models</strong> that our analysts, data scientists, and applications can trust.</p><p>You’ll work closely with product, security, platform engineering, and analytics to move our architecture toward a <strong>real-time, governed, cost-aware</strong>, and <strong>highly automated</strong> data ecosystem.</p><p><strong>What You’ll Do</strong></p><ul><li><strong>Design &amp; build end-to-end pipelines</strong> on AWS (batch and streaming) using services like <strong>Glue, EMR, Lambda, Step Functions, Kinesis, MSK</strong>, and <strong>Fargate</strong>.</li><li><strong>Develop robust ETL/ELT</strong> (PySpark, Spark SQL, SQL, Python) for structured, semi-structured, and unstructured data at scale.</li><li><strong>Own data storage &amp; processing layers</strong>: <strong>S3 (Lake/Lakehouse), Redshift (or Snowflake on AWS), DynamoDB</strong>, and <strong>Athena</strong>, with strong partitioning, compaction, and performance tuning.</li><li><strong>Implement data models</strong> (3NF, dimensional/star, Data Vault, Lakehouse medallion) for analytics and operational workloads.</li><li><strong>Engineer secure infrastructure-as-code</strong> with <strong>Terraform</strong> (or <strong>CDK</strong>) across multi-account setups; implement CI/CD via <strong>GitHub Actions</strong> or <strong>AWS CodeBuild/CodePipeline</strong>.</li><li><strong>Harden security &amp; governance</strong>: use <strong>IAM</strong>, <strong>Lake Formation</strong>, <strong>KMS</strong>, <strong>Secrets Manager</strong>, <strong>VPC/PrivateLink</strong>, the <strong>Glue Catalog</strong>, and fine-grained access controls. Partner with SecOps on compliance (e.g., <strong>SOC 2</strong>, <strong>FedRAMP</strong>, or <strong>HIPAA</strong>, depending on the dataset).</li><li><strong>Observability &amp; reliability</strong>: build monitoring with <strong>CloudWatch</strong> and <strong>OpenTelemetry</strong>, add data quality checks (e.g., <strong>Great Expectations</strong>, <strong>Deequ</strong>), and implement SLOs and alerts.</li><li><strong>Champion best practices</strong>: code reviews, testing (unit/integration), documentation, runbooks, and blameless postmortems.</li><li><strong>Mentor</strong> mid-level engineers and collaborate on architectural decisions, standards, and technical roadmaps.</li></ul><p><br></p>
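<p>The data quality checks named above (Great Expectations, Deequ) ultimately reduce to column-level assertions over incoming records. As a rough illustration only—not any particular library's API, and with hypothetical rule and field names—such checks can be sketched in plain Python:</p>

```python
# Minimal data-quality check in the spirit of Great Expectations/Deequ:
# each rule is a named predicate over a row; violations are collected, not raised,
# so a pipeline can quarantine bad records or fire an alert on the failure count.

def check_rows(rows, rules):
    """Return a list of (row_index, rule_name) for every failed rule."""
    failures = []
    for i, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                failures.append((i, name))
    return failures

# Hypothetical rules for an example orders feed.
rules = {
    "order_id_not_null": lambda r: r.get("order_id") is not None,
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}

rows = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": None, "amount": -5.0},
]

failures = check_rows(rows, rules)
# failures == [(1, "order_id_not_null"), (1, "amount_non_negative")]
```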
<p><strong>Core Tech You’ll Use</strong></p><ul><li><strong>AWS</strong>: S3, Glue, EMR, Lambda, Step Functions, Kinesis/MSK, Redshift, Lake Formation, IAM, KMS, Secrets Manager, CloudWatch</li><li><strong>Data</strong>: PySpark/Spark, SQL, Python, dbt (nice to have), Kafka, Airflow</li><li><strong>IaC/CI/CD</strong>: Terraform (or CDK), GitHub Actions / CodePipeline</li><li><strong>Quality/Governance</strong>: Great Expectations/Deequ, Glue Catalog, data contracts, Delta Lake/Apache Hudi/Apache Iceberg (a plus)</li></ul><p><strong>What Makes You a Great Fit</strong></p><ul><li>5–10+ years in data engineering, with at least <strong>3 years deep on AWS</strong>.</li><li>Hands-on with <strong>Spark/PySpark</strong>, <strong>distributed ETL</strong>, and <strong>data modeling</strong> for analytics (dimensional) and/or operational use.</li><li>Strong command of <strong>SQL performance tuning</strong>, partitioning, file formats (<strong>Parquet/ORC/Avro</strong>), and cost/performance trade-offs.</li><li>Proven <strong>security-first mindset</strong>: least privilege, secrets rotation, encryption at rest and in transit, network boundaries.</li><li>Experience building <strong>production-grade</strong> data platforms with IaC and CI/CD.</li><li>Excellent communicator who can translate business goals into data architecture.</li></ul><p><strong>Nice-to-Haves</strong></p><ul><li>Experience with <strong>Redshift RA3</strong> tuning, <strong>Delta Lake/Hudi/Iceberg</strong>, <strong>Data Mesh</strong> patterns, or <strong>dbt</strong>.</li><li>Knowledge of <strong>ML feature stores</strong> (SageMaker Feature Store or equivalent).</li><li>Background in <strong>regulated environments</strong> (public sector, healthcare, financial services).</li><li>Active <strong>Public Trust</strong> clearance, or the ability to obtain one (preferred but not required).</li></ul><p><br></p>
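<p>The partitioning called out above typically means Hive-style layouts, which Athena and the Glue Catalog rely on: partition keys are encoded as <code>key=value</code> segments in the S3 prefix. A small illustrative helper (the bucket and table names are hypothetical) shows the convention:</p>

```python
def partition_prefix(base, **partitions):
    """Build a Hive-style partitioned S3 prefix, e.g. .../year=2026/month=03/day=10/."""
    segments = [f"{key}={value}" for key, value in partitions.items()]
    return base.rstrip("/") + "/" + "/".join(segments) + "/"

# Hypothetical lakehouse layout: one prefix per day of an events table,
# so Athena can prune partitions via WHERE year = '2026' AND month = '03'.
prefix = partition_prefix("s3://example-lake/events", year="2026", month="03", day="10")
# prefix == "s3://example-lake/events/year=2026/month=03/day=10/"
```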
<h3 class="rh-display-3--rich-text">Technology Doesn't Change the World, People Do.<sup>®</sup></h3> <p>Robert Half is the world’s first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.</p> <p>Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. <a href="https://www.roberthalf.com/us/en/mobile-app" target="_blank">Download the Robert Half app</a> and get 1-tap apply, notifications of AI-matched jobs, and much more.</p> <p>All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit <a href="https://roberthalf.gobenefits.net/" target="_blank">roberthalf.gobenefits.net</a> for more information.</p> <p>© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking “Apply Now,” you’re agreeing to Robert Half’s <a href="https://www.roberthalf.com/us/en/terms">Terms of Use</a> and <a href="https://www.roberthalf.com/us/en/privacy">Privacy Notice</a>.</p>
  • Washington, DC
  • onsite
  • Temporary
  • 55 - 65 USD / Hourly
  • 2026-03-10T00:00:00Z
