We are looking for a skilled Data Engineer to join our team in Ann Arbor, Michigan, and contribute to the development of a modern, scalable data platform. In this role, you will focus on building efficient data pipelines, ensuring data quality, and enabling seamless integration across systems to support business analytics and decision-making. This position offers an exciting opportunity to work with cutting-edge technologies and play a key role in the transformation of our data environment.<br><br>Responsibilities:<br>• Design and implement robust data pipelines on Azure using tools such as Databricks, Spark, Delta Lake, and Airflow.<br>• Develop workflows to ingest and integrate data from diverse sources into Azure Data Lake.<br>• Build and maintain data transformation layers following medallion architecture principles.<br>• Apply data quality checks, validation processes, and deduplication techniques to ensure accuracy and reliability.<br>• Create reusable, parameterized notebooks to streamline batch and streaming data processes.<br>• Optimize merge and update logic in Delta Lake through efficient partitioning strategies.<br>• Collaborate with business and application teams to understand and fulfill data integration requirements.<br>• Enable downstream integrations with APIs, Power BI dashboards, and reporting systems.<br>• Establish monitoring, logging, and data lineage tracking using tools such as Unity Catalog and Azure Monitor.<br>• Participate in code reviews, agile development practices, and team design discussions.