<p>We are looking for a skilled IT Applications Analyst to join our team in Toledo, Ohio. This contract-to-permanent position offers an exciting opportunity to contribute to the development and enhancement of core business applications while ensuring their optimal performance. The ideal candidate will have a strong background in systems analysis and a demonstrated ability to adapt quickly to changing priorities.</p><p><br></p><p>Responsibilities:</p><p>• Analyze business needs and translate them into technical requirements for system enhancements.</p><p>• Collaborate with stakeholders to gather, document, and refine system requirements.</p><p>• Configure and optimize Salesforce and other enterprise systems to improve functionality.</p><p>• Develop and maintain technical documentation for system processes and updates.</p><p>• Perform data analysis using SQL to support business decisions and resolve system issues.</p><p>• Coordinate EDI and ETL processes to ensure accurate and efficient data integration.</p><p>• Identify and troubleshoot issues within business systems, providing timely resolutions.</p><p>• Work closely with cross-functional teams to implement and test system changes.</p><p>• Monitor and evaluate system performance, recommending improvements as needed.</p><p>• Stay updated on industry trends to identify opportunities for innovation and system improvement.</p>
We are looking for a skilled Data Engineer to join our team in Ann Arbor, Michigan, and contribute to the development of a modern, scalable data platform. In this role, you will focus on building efficient data pipelines, ensuring data quality, and enabling seamless integration across systems to support business analytics and decision-making. This position offers an exciting opportunity to work with cutting-edge technologies and play a key role in the transformation of our data environment.<br><br>Responsibilities:<br>• Design and implement robust data pipelines on Azure using tools such as Databricks, Spark, Delta Lake, and Airflow.<br>• Develop workflows to ingest and integrate data from diverse sources into Azure Data Lake.<br>• Build and maintain data transformation layers following medallion architecture principles.<br>• Apply data quality checks, validation processes, and deduplication techniques to ensure accuracy and reliability.<br>• Create reusable, parameterized notebooks to streamline batch and streaming data processes.<br>• Optimize merge and update logic in Delta Lake by leveraging efficient partitioning strategies.<br>• Collaborate with business and application teams to understand and fulfill data integration requirements.<br>• Enable downstream integrations with APIs, Power BI dashboards, and reporting systems.<br>• Establish monitoring, logging, and data lineage tracking using tools such as Unity Catalog and Azure Monitor.<br>• Participate in code reviews, agile development practices, and team design discussions.