Data Engineer
We are looking for a skilled Data Engineer to join our team in Kitchener, Ontario. In this role, you will design, build, and optimize data solutions to support business objectives, leveraging modern technologies and frameworks. You will work closely with cross-functional teams to implement robust data infrastructure and ensure seamless data integration across platforms.

Responsibilities:
• Develop, implement, and maintain scalable data pipelines using Databricks, Python, and Spark.
• Design and optimize data models and architectures, including Medallion Architecture, for both batch and streaming data processing.
• Integrate Azure Cloud components such as Azure Data Factory, Key Vault, and Blob Storage into data workflows.
• Collaborate with stakeholders to understand data requirements and deliver solutions that align with business goals.
• Utilize advanced data manipulation techniques to manage complex data structures and ensure high performance.
• Implement coding best practices, testing methodologies, and CI/CD pipelines within a DevOps framework.
• Participate in Agile or Spotify framework processes to ensure efficient project execution.
• Monitor and troubleshoot data workflows to maintain system reliability and integrity.
• Explore opportunities for incorporating Machine Learning and Artificial Intelligence concepts into data solutions.
• Provide documentation and training to support the use of developed data systems.
Requirements:
• Bachelor’s or Master’s degree in Informatics, Software Engineering, or a related field.
• At least 4 years of experience in Data Warehousing, including proficiency in Medallion Architecture and data processing techniques.
• Hands-on expertise with Databricks, Python, Spark, and dimensional modelling.
• Strong knowledge of Azure Cloud components and their integration into data workflows.
• Familiarity with coding standards, testing practices, and CI/CD pipelines in a DevOps environment.
• Advanced understanding of data modelling techniques and the trade-offs of different data structures.
• Experience working within Agile, Spotify, or similar project management frameworks.
• Preferred skills include Oracle databases, additional programming languages (e.g., Bash, Java, Scala), and foundational knowledge of ML and AI concepts.
Robert Half is the world’s first and largest specialized talent solutions firm that connects highly qualified job seekers to opportunities at great companies. We offer contract, temporary and permanent placement solutions for finance and accounting, technology, marketing and creative, legal, and administrative and customer support roles.
Robert Half works to put you in the best position to succeed. We provide access to top jobs, competitive compensation and benefits, and free online training. Stay on top of every opportunity, whenever you choose, even on the go. Download the Robert Half app (https://www.roberthalf.com/ca/en/mobile-app) and get 1-tap apply, notifications of AI-matched jobs, and much more.
This job posting is for a current vacancy with our client.
Our specialized recruiting professionals apply their expertise and utilize our proprietary AI to find you great job matches faster.
Questions? Call your local office at 1.888.490.4429. All applicants applying for Canadian job openings must be authorized to work in Canada.
Only job postings for jobs located in Quebec appear in French.
© 2025 Robert Half. By clicking “Apply,” you’re agreeing to Robert Half’s Terms of Use (https://www.roberthalf.com/ca/en/terms).
- Kitchener, ON
- Onsite
- Permanent
- 80000.00 - 90000.00 CAD / Yearly
- 2025-12-11T17:24:05Z