Data Engineer
We are looking for a skilled Data Engineer to join our team on a long-term contract basis in Las Vegas, Nevada. In this role, you will leverage your expertise in Google Cloud Platform to design and optimize scalable data solutions that power analytics, machine learning, and business intelligence initiatives. This position offers an exciting opportunity to collaborate with cross-functional teams and build high-performance data infrastructure.<br><br>Responsibilities:<br>• Design and implement scalable data pipelines and workflows using Google Cloud Platform services such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Dataproc.<br>• Develop and maintain ETL processes to ensure data quality, reliability, and optimal performance.<br>• Collaborate with data scientists, business intelligence teams, and stakeholders to support advanced analytics and machine learning projects.<br>• Establish and enforce data governance protocols, quality checks, and monitoring frameworks.<br>• Optimize query performance and storage solutions within BigQuery and other Google Cloud services.<br>• Ensure solutions adhere to best practices for scalability, security, and cost-efficiency.<br>• Partner with DevOps and cloud infrastructure teams to manage deployment using tools like Terraform and Deployment Manager.<br>• Troubleshoot data pipeline issues and conduct thorough root cause analysis to ensure system reliability.<br>• Implement modern orchestration frameworks such as Airflow to streamline workflows and automation.
• Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent professional experience.<br>• Minimum of 5 years of experience in data engineering or a related role.<br>• Proficient in Google Cloud Platform services, including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Cloud Composer.<br>• Strong programming skills in Python for data transformation and automation.<br>• Experience with ETL tools and modern orchestration frameworks such as Airflow or Cloud Composer.<br>• Solid understanding of data modeling, partitioning, and query optimization techniques.<br>• Familiarity with CI/CD practices, Git, and infrastructure-as-code tools like Terraform.<br>• Demonstrated ability to solve complex problems and work effectively in a collaborative, agile environment.
<h3 class="rh-display-3--rich-text">Technology Doesn't Change the World, People Do.<sup>®</sup></h3>
<p>Robert Half is the world’s first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.</p>
<p>Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. <a href="https://www.roberthalf.com/us/en/mobile-app" target="_blank">Download the Robert Half app</a> and get 1-tap apply, notifications of AI-matched jobs, and much more.</p>
<p>All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit <a href="https://roberthalf.gobenefits.net/" target="_blank">roberthalf.gobenefits.net</a> for more information.</p>
<p>© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking “Apply Now,” you’re agreeing to <a href="https://www.roberthalf.com/us/en/terms">Robert Half’s Terms of Use</a>.</p>
- Las Vegas, NV
- onsite
- Temporary
- 55.00 - 70.00 USD / Hourly
- 2025-09-29T16:38:43Z