<p>We are looking for a skilled Financial Analyst to join our manufacturing client's team just south of Indianapolis. This role focuses on supporting strategic financial planning, analysis, and reporting across various departments, including operations and sales. The ideal candidate will excel in data-driven decision-making, financial modeling, and forecasting, while contributing to process improvements and automation initiatives.</p><p><br></p><p>Responsibilities:</p><p>• Prepare financial statements and management discussion and analysis (MD&amp;A) reports to support organizational reporting.</p><p>• Collaborate with sales and commercial teams to manage pricing, analyze discount requests, and maintain price lists.</p><p>• Partner with departmental leaders to drive key projects, initiatives, and ad-hoc analyses related to sales, margins, and SG&amp;A expenses.</p><p>• Develop and refine financial forecasts and models, including multi-scenario planning and quarterly/annual budgeting.</p><p>• Conduct variance analysis and communicate actionable insights to senior management.</p><p>• Provide financial guidance and build cross-functional collaboration with teams such as HR, Marketing, IT, and Supply Chain.</p><p>• Analyze and enhance existing processes, identifying opportunities for improvement and supporting implementation of new procedures.</p><p>• Maintain and analyze the cost accounting system, including production cost reports and inventory reconciliation.</p><p>• Generate estimates for new and proposed product costs while monitoring manufacturing costs and cycle counts.</p><p>• Assist in creating presentations for senior leadership and contribute to strategic planning efforts.</p>
We are looking for a skilled Data Engineer to join our team in Carmel, Indiana. In this long-term contract role, you will design, build, and optimize data pipelines and systems to support business needs. The ideal candidate will bring expertise in data engineering tools and frameworks, along with a passion for solving complex challenges.<br><br>Responsibilities:<br>• Develop and maintain robust data pipelines using modern frameworks and tools.<br>• Implement ETL processes to ensure accurate and efficient data transformation.<br>• Optimize data storage and retrieval systems for performance and scalability.<br>• Collaborate with cross-functional teams to understand data requirements and deliver solutions.<br>• Utilize Apache Spark and Hadoop for large-scale data processing.<br>• Work with Databricks to streamline data workflows and enhance analytics.<br>• Apply machine learning techniques using libraries such as scikit-learn, with Pandas for data preparation.<br>• Integrate Kafka for real-time data streaming and processing.<br>• Analyze and troubleshoot data-related issues to ensure system reliability.<br>• Document processes and workflows to support future development and maintenance.