Data Engineer
<p>Our transportation client is seeking a <strong>Data Engineer</strong> to support large‑scale logistics operations by building reliable, scalable, and cloud‑based data pipelines. This role is hands‑on, focused on delivering high‑quality data flows that improve shipment visibility, operational efficiency, and real‑time analytics across the supply chain.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design, build, and maintain <strong>ETL/ELT pipelines</strong> that process high‑volume operational and logistics data</li><li>Develop transformation logic and automation using <strong>Python</strong>, <strong>SQL</strong>, and Azure-native tooling</li><li>Implement and orchestrate workflows in <strong>Azure Data Factory</strong>, <strong>Synapse</strong>, and <strong>Databricks</strong></li><li>Optimize data lake and warehouse performance, including tuning queries, pipelines, and storage layers</li><li>Monitor pipeline health and proactively troubleshoot failures, bottlenecks, and data quality issues</li><li>Contribute to data modeling efforts to support analytics, reporting, and downstream applications</li><li>Collaborate with BI, product, supply chain, and application teams to align pipelines with business needs</li><li>Maintain strong documentation around workflows, standards, and operational procedures</li><li>Support governance initiatives related to <strong>data quality</strong>, lineage, cataloging, and access policies</li><li>Follow best practices for security, compliance, and cloud resource management</li></ul><p><br></p><p><br></p>
<p><br></p><p><strong>Requirements</strong></p><ul><li><strong>3+ years</strong> of experience in data engineering or data pipeline development</li><li>Strong proficiency in <strong>SQL</strong> (complex joins, window functions, performance tuning)</li><li>Hands-on experience using <strong>Python</strong> for data transformations, automation, and ETL scripting</li><li>Experience with cloud data ecosystems, ideally <strong>Azure</strong> (Data Factory, Synapse, Databricks)</li><li>Understanding of <strong>data lakehouse</strong> design, warehousing concepts, and distributed processing</li><li>Familiarity with <strong>CI/CD pipelines</strong>, Git‑based version control, and modern DevOps practices</li><li>Experience working with <strong>large datasets</strong> in streaming or batch processing environments</li><li>Ability to analyze pipeline issues, diagnose root causes, and improve system reliability</li><li>Strong communication skills and comfort partnering with cross-functional teams</li><li>Bonus: experience with logistics, transportation, or supply chain data flows</li></ul><p><br></p>
<h3 class="rh-display-3--rich-text">Technology Doesn't Change the World, People Do.<sup>®</sup></h3>
<p>Robert Half is the world’s first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.</p>
<p>Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. <a href="https://www.roberthalf.com/us/en/mobile-app" target="_blank">Download the Robert Half app</a> and get 1-tap apply, notifications of AI-matched jobs, and much more.</p>
<p>All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit <a href="https://roberthalf.gobenefits.net/" target="_blank">roberthalf.gobenefits.net</a> for more information.</p>
<p>© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking “Apply Now,” you’re agreeing to Robert Half’s <a href="https://www.roberthalf.com/us/en/terms">Terms of Use</a> and <a href="https://www.roberthalf.com/us/en/privacy">Privacy Notice</a>.</p>
- Memphis, TN
- onsite
- Temporary
-
58 - 68 USD / Hourly
- 2026-03-16T00:00:00Z