Lead Data Engineer
We are looking for a highly skilled Lead Data Engineer to join our team. This role requires extensive experience with Databricks and a solid understanding of the insurance industry. You will play a pivotal role in designing and implementing scalable data architectures that support advanced analytics, reporting, and AI/BI/ML initiatives across insurance functions.

Responsibilities:
• Develop and implement enterprise-level data architectures using Databricks Lakehouse, Delta Lake, and Azure cloud services.
• Create and maintain secure, scalable data pipelines to process structured and unstructured data from multiple sources (see the illustrative sketch after this list).
• Define data integration, modeling, and governance frameworks tailored to insurance data domains such as policy, claims, underwriting, and billing.
• Collaborate with actuarial, underwriting, and BI teams to design semantic layers and analytics-ready datasets.
• Optimize Databricks cluster performance and workflows for cost efficiency and reliability.
• Establish and enforce data security protocols aligned with industry regulations and compliance standards.
• Implement data quality, lineage, and metadata management frameworks using tools such as Unity Catalog or Collibra.
• Mentor teams on best practices for data modeling, Databricks optimization, and cloud architecture.
• Review and validate solutions from divisional teams and vendors for scalability, resilience, and cost-effectiveness.
• Evaluate emerging technologies and develop adoption blueprints to enhance data capabilities.
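As a purely illustrative sketch of the kind of pipeline work described above (not this employer's actual codebase), the snippet below ingests raw claims records into a Delta table on Azure with PySpark. The storage path, schema, and table names are all hypothetical.

```python
# Illustrative sketch only: a minimal batch ingestion of raw insurance
# claims into a Delta table. Paths and table names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("claims-ingest-sketch")
    .getOrCreate()
)

# Read raw claims landed as JSON (hypothetical ADLS Gen2 path).
raw_claims = spark.read.json(
    "abfss://landing@example.dfs.core.windows.net/claims/"
)

# Light standardization: a typed amount column plus an ingestion
# timestamp that downstream lineage and audits can rely on.
clean_claims = (
    raw_claims
    .withColumn("claim_amount", F.col("claim_amount").cast("decimal(12,2)"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Append into a governed Delta table (hypothetical three-level name).
(
    clean_claims.write
    .format("delta")
    .mode("append")
    .saveAsTable("insurance.bronze.claims")
)
```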
Requirements:
• Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
• At least 7 years of experience in data architecture or engineering, including 4 years working with Databricks.
• Expertise in insurance data models, including policy, claims, underwriting, and accounting domains.
• Proficiency in Python, PySpark, ETL processes, and Databricks workflows.
• Strong knowledge of Azure Databricks and modern data lakehouse architecture.
• Experience integrating data with BI tools such as Power BI, Tableau, or Looker.
• Familiarity with data governance tools such as Unity Catalog or Collibra (see the sketch below).
• Excellent communication and stakeholder management skills.
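For context on the Unity Catalog familiarity mentioned above: in practice this means working with its catalog.schema.table namespace and SQL-based access grants. A minimal hedged sketch follows, with every catalog, schema, table, and group name hypothetical.

```python
# Illustrative sketch only: Unity Catalog addresses tables through a
# catalog.schema.table namespace, and access is managed with SQL GRANTs.
# All names here (catalog, schema, table, group) are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("uc-governance-sketch").getOrCreate()

# Query a governed table through the three-level namespace.
policies = spark.table("insurance.silver.policies")
policies.select("policy_id", "effective_date").show(5)

# Grant read access to a BI analysts group (hypothetical principal).
spark.sql("GRANT SELECT ON TABLE insurance.silver.policies TO `bi-analysts`")
```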
Technology Doesn't Change the World, People Do.®

Robert Half is the world’s first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.

Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity - whenever you choose - even on the go. [Download the Robert Half app](https://www.roberthalf.com/us/en/mobile-app) and get 1-tap apply, notifications of AI-matched jobs, and much more.

All applicants applying for U.S. job openings must be legally authorized to work in the United States. Benefits are available to contract/temporary professionals, including medical, vision, dental, and life and disability insurance. Hired contract/temporary professionals are also eligible to enroll in our company 401(k) plan. Visit [roberthalf.gobenefits.net](https://roberthalf.gobenefits.net/) for more information.

© 2025 Robert Half. An Equal Opportunity Employer. M/F/Disability/Veterans. By clicking “Apply Now,” you’re agreeing to [Robert Half’s Terms of Use](https://www.roberthalf.com/us/en/terms).
- Charlotte, NC
- remote
- Contract / Temporary to Hire
-
- USD / Hourly
- 2025-11-05T16:53:45Z