<p>Robert Half is seeking talented engineers to join us in building products. You will have the opportunity to work on complex technical problems, build new features, and improve existing products across various platforms.</p><p><strong>Software Engineer, Machine Learning Responsibilities</strong></p><ul><li>Collaborate with cross-functional teams (product, design, operations, infrastructure) to build innovative application experiences</li><li>Implement custom user interfaces using the latest programming techniques and technologies</li><li>Develop reusable software components for interfacing with back-end platforms</li><li>Analyze and optimize code for quality, efficiency, and performance</li><li>Lead complex technical or product efforts and provide technical guidance to peers</li><li>Architect efficient and scalable systems that drive complex applications</li><li>Identify and resolve performance and scalability issues</li><li>Work on a variety of coding languages and technologies</li><li>Establish ownership of components, features, or systems with expert end-to-end understanding</li></ul><p><br></p>
<p><strong>Machine Learning Engineer (Hands‑On Builder) - Consultant</strong></p><p><strong>Location: </strong>Philadelphia, PA (Hybrid, 4 days onsite per week)</p><p><strong>Pay: </strong>Available on W2</p><p><strong>Duration: </strong>9-week contract, potential for extension or conversion</p><p><strong>Overview</strong></p><p>We are seeking an experienced <strong>Machine Learning Engineer</strong> who excels at building, training, and deploying real-world machine learning models from the ground up. This role is ideal for someone who enjoys transforming data into practical, scalable solutions and has deep hands-on experience with end-to-end ML pipelines.</p><p><strong>Key Responsibilities</strong></p><ul><li>Design, develop, and deploy production-grade machine learning models.</li><li>Build complete ML pipelines, including data preprocessing, modeling, evaluation, and deployment.</li><li>Collaborate with cross-functional teams to translate business needs into ML solutions.</li><li>Clearly communicate complex technical concepts to non-technical stakeholders.</li><li>Document models, methodology, and decisions to support transparency and reproducibility.</li></ul><p><br></p>
<p>We are looking for an AI Engineer to join our team. In this role, you will contribute to the development and implementation of AI and Machine Learning solutions that optimize renewable energy projects. You will work on creating scalable models, applications, and workflows that drive data-driven decision-making across the organization while adhering to established engineering standards and practices.</p><p><br></p><p>Responsibilities:</p><p>• Develop, test, and deploy AI/ML models and applications to enhance renewable energy planning, forecasting, and construction processes.</p><p>• Build and maintain MLOps workflows, including data pipelines, model packaging, versioning, monitoring, and retraining.</p><p>• Collaborate with IT and data teams to ensure AI solutions meet security, integration, and performance requirements.</p><p>• Break down technical requirements into actionable tasks and contribute to design implementation during reviews.</p><p>• Deliver scalable and secure solutions by following established standards and reference architectures.</p><p>• Support deployment of AI/ML solutions to cloud environments, with a focus on Azure.</p><p>• Create APIs and integrate AI solutions into enterprise systems to enable seamless operations.</p><p>• Utilize advanced machine learning frameworks such as TensorFlow and Scikit-learn to develop innovative solutions.</p><p>• Analyze data engineering concepts and apply them to enhance AI workflows.</p><p>• Provide technical input and partner with stakeholders to meet project objectives.</p>
We are looking for a highly experienced Lead Artificial Intelligence (AI) Engineer to spearhead the development and implementation of cutting-edge AI and Machine Learning solutions within our organization. This role is integral to driving innovation, optimizing operational processes, and delivering impactful business outcomes across various projects. As a senior technical expert, you will collaborate with cross-functional teams to design scalable systems, set technical standards, and mentor emerging talent in AI engineering.<br><br>Responsibilities:<br>• Design, develop, and deploy production-ready AI and machine learning solutions tailored to renewable energy planning, construction optimization, and risk management.<br>• Establish and enforce technical standards for AI/ML development, MLOps pipelines, and the management of model lifecycles.<br>• Collaborate with IT and data teams to ensure AI systems are secure, scalable, and seamlessly integrated into the enterprise environment.<br>• Work closely with business stakeholders to translate strategic goals into practical AI applications that drive measurable results.<br>• Lead initiatives to explore and implement generative AI, predictive analytics, and optimization technologies to enhance forecasting and operational efficiency.<br>• Mentor and guide technical teams, fostering knowledge-sharing and the adoption of best practices in AI development.<br>• Evaluate emerging AI tools and technologies to ensure the organization remains at the forefront of innovation.<br>• Oversee the deployment and scaling of AI models in production environments, ensuring performance and reliability.<br>• Drive the development of automation solutions using AI to streamline processes and improve productivity.<br>• Collaborate with partners and vendors to integrate AI solutions effectively into existing systems.
<p>Join our team as a DevOps Engineer specializing in Artificial Intelligence (AI) and Large Language Model (LLM) infrastructure. You will play a critical role in architecting, deploying, and optimizing scalable AI platforms using modern DevOps practices and state-of-the-art tools.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Build, automate, and manage CI/CD pipelines for deploying and maintaining AI/LLM workloads.</li><li>Collaborate with AI engineers and data scientists to streamline model deployment, versioning, and monitoring.</li><li>Design and maintain cloud infrastructure using Infrastructure as Code (IaC) platforms such as Terraform and Ansible.</li><li>Orchestrate and manage containerized AI environments using Kubernetes.</li><li>Implement robust monitoring and logging solutions utilizing Grafana and Prometheus.</li><li>Optimize AI model inference and training workloads—especially for NVIDIA GPU-powered environments.</li><li>Apply strict security and compliance standards for all infrastructure components.</li><li>Diagnose and resolve production issues, continuously improving reliability and scalability of AI services.</li></ul>
<p>Robert Half is hiring! We are looking for an experienced Artificial Intelligence (AI) Engineer to join our team. In this role, you will design and implement cutting-edge AI and machine learning solutions to enhance our SaaS platform. You will collaborate with cross-functional teams to optimize workflows, improve customer experiences, and drive innovation through intelligent features.</p><p><br></p><p>Responsibilities:</p><p>• Develop and deploy robust machine learning models for predictive analytics, generative AI, and other advanced capabilities within a SaaS environment.</p><p>• Create scalable data pipelines for model training, testing, and monitoring, ensuring optimal performance and reliability.</p><p>• Collaborate with product, engineering, and data teams to identify and implement AI-driven solutions that address business challenges.</p><p>• Design and integrate AI functionalities, such as recommendations and classification systems, while maintaining efficiency and accuracy.</p><p>• Incorporate AI models into cloud-based systems using APIs, microservices, and containerized infrastructure.</p><p>• Assess and implement third-party AI tools and frameworks to enhance productivity and product capabilities.</p><p>• Ensure models align with privacy, security, and fairness standards, maintaining compliance across all implementations.</p><p>• Document workflows, track experiments, and maintain reproducibility for all AI-related processes.</p><p>• Keep up-to-date with advancements in AI technologies, machine learning techniques, and SaaS architecture trends.</p><ul><li>Python, TypeScript, React</li></ul>
We are looking for a highly skilled Senior Software Engineer with expertise in artificial intelligence and machine learning to join our dynamic team in Jacksonville, Florida. This role demands a strong technical background and hands-on experience in developing and operationalizing cutting-edge AI solutions. You will have the opportunity to innovate and create impactful systems using the latest advancements in generative AI, reinforcement learning, and large language models (LLMs).<br><br>Responsibilities:<br>• Design and optimize large language models (LLMs) for specialized applications.<br>• Implement reinforcement learning algorithms and multi-agent systems to enhance automation capabilities.<br>• Develop generative AI tools for efficient data retrieval and visualization.<br>• Establish and maintain MLOps pipelines to ensure seamless deployment and monitoring of AI models.<br>• Collaborate with cross-functional teams to align technical solutions with business needs.<br>• Conduct ongoing research to integrate the latest AI advancements into system designs.<br>• Troubleshoot and resolve technical challenges related to AI model performance.<br>• Document processes and provide technical insights to stakeholders.<br>• Ensure compliance with industry standards and best practices for AI development.
We are looking for a highly skilled Sr. Software Engineer to join our team in Jacksonville, Florida. This role will focus on designing and implementing advanced AI solutions, leveraging cutting-edge technologies such as deep learning and transformer-based models to drive innovation and deliver impactful results. The ideal candidate will bring a combination of technical expertise, strategic thinking, and leadership capabilities to collaborate across teams and shape the future of AI within our organization.<br><br>Responsibilities:<br>• Develop and implement advanced AI models, including deep learning and transformer-based architectures, to address complex business challenges.<br>• Design and build automated pipelines for MLOps, ensuring seamless integration, deployment, and monitoring of AI solutions.<br>• Collaborate with leadership and stakeholders to align AI strategies with organizational objectives, identifying opportunities and mitigating risks.<br>• Lead cross-functional efforts to ensure AI systems are scalable, secure, and optimized for performance.<br>• Publish research findings, represent the organization at industry events, and contribute to safeguarding intellectual property.<br>• Evaluate technical capabilities and cost considerations to balance innovation with practicality in AI projects.<br>• Mentor team members, fostering a culture of collaboration, learning, and resilience.<br>• Continuously refine and improve AI systems based on performance metrics and user feedback.
<p>We’re seeking a Data Engineer to build and maintain scalable data pipelines that power analytics, reporting, and machine learning across the organization. You’ll turn raw data into clean, reliable, and accessible datasets that drive business decisions.</p><p>What You’ll Do</p><ul><li>Design and maintain data warehouses and data lakes</li><li>Build ETL/ELT pipelines integrating data from multiple systems</li><li>Optimize performance for large-scale datasets</li><li>Ensure data quality, security, and governance</li><li>Collaborate with analysts and ML teams to create analytics-ready datasets</li><li>Automate workflows and monitoring</li></ul><p><br></p>
We are looking for a skilled Data Engineer to join our team in Los Angeles, California. This role focuses on designing and implementing advanced data solutions to support innovative advertising technologies. The ideal candidate will have hands-on experience with large datasets, cloud platforms, and machine learning, and will play a critical role in shaping our data infrastructure.<br><br>Responsibilities:<br>• Develop and maintain robust data pipelines to ensure seamless data extraction, transformation, and loading processes.<br>• Design scalable architectures that support machine learning models and advanced analytics.<br>• Collaborate with cross-functional teams to deliver business intelligence tools, reporting solutions, and analytical dashboards.<br>• Implement real-time data streaming solutions using platforms like Apache Kafka and Apache Spark.<br>• Optimize database performance and ensure efficient data storage and retrieval.<br>• Build and manage resilient data science programs and personas to support AI initiatives.<br>• Lead and mentor a team of data scientists, machine learning engineers, and data architects.<br>• Design and implement strategies for maintaining large datasets, ensuring data integrity and accessibility.<br>• Create detailed technical documentation for workflows, processes, and system architecture.<br>• Stay up-to-date with emerging technologies to continuously improve data engineering practices.
We are looking for a skilled Senior Software Engineer to design and implement innovative software solutions. This role involves creating scalable systems, optimizing performance, and contributing to cutting-edge technologies, including machine learning and vector databases. Join our team in New York, New York, and help shape the future of software development.<br><br>Responsibilities:<br>• Design and develop scalable software solutions using Python and modern architectural patterns.<br>• Build and maintain distributed systems with a focus on multi-threaded programming.<br>• Develop and integrate RESTful APIs to ensure seamless communication between applications.<br>• Optimize data storage and retrieval using MongoDB and Redis.<br>• Explore and implement vector databases such as pgvector and Timescale.<br>• Collaborate on machine learning initiatives, including the application of large language models (LLMs).<br>• Utilize cloud platforms like Amazon Web Services (AWS) to enhance system performance and scalability.<br>• Conduct thorough testing and debugging to ensure high-quality software delivery.<br>• Stay updated on emerging technologies to drive innovation within the team.
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
<p>We are looking for an experienced AI Solutions Architect to join our team in Columbia, South Carolina. In this role, you will design and implement advanced AI solutions leveraging Microsoft Azure technologies to support business objectives. You will collaborate with cross-functional teams to develop scalable, secure, and innovative systems while staying updated on the latest AI trends and advancements.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement AI solutions utilizing Microsoft Azure services, including OpenAI, Cognitive Services, and Machine Learning.</p><p>• Lead the customization and deployment of Microsoft Copilot across platforms such as M365, Power Platform, and Dynamics.</p><p>• Architect efficient, scalable, and secure AI systems that align with organizational goals.</p><p>• Work closely with IT, data, and business teams to identify AI use cases and translate them into actionable solutions.</p><p>• Design and maintain robust data pipelines to support AI and machine learning workloads.</p><p>• Ensure compliance with security, privacy, and ethical AI standards throughout solution development.</p><p>• Provide technical leadership and mentorship to development teams, fostering best practices.</p><p>• Monitor the performance of AI models and implement continuous improvements to optimize results.</p><p>• Stay informed about advancements in Microsoft AI technologies and emerging trends in the industry.</p>
<p>Position Overview</p><p>We are seeking a highly skilled Data Scientist with deep expertise in Artificial Intelligence, Natural Language Processing (NLP), Computer Vision (CV), and Generative AI. This role is ideal for an innovative problem‑solver who thrives in a fast‑paced environment and is passionate about developing advanced AI/ML models that drive meaningful business value.</p><p>You will design, train, and deploy cutting‑edge AI models—including large language models (LLMs) and multi‑agent systems—turning complex data into scalable, high‑impact solutions.</p><p><br></p><p>Key Responsibilities</p><ul><li>Develop, train, and optimize machine learning and deep learning models</li><li>Build advanced AI solutions leveraging LLMs, multi‑agent systems, fine‑tuning methods, and inference optimization</li><li>Translate complex data science methodologies into clear, actionable insights for business stakeholders</li><li>Collaborate with cross‑functional teams to ensure AI solutions align with business needs</li><li>Create compelling presentations, dashboards, and data stories for non‑technical audiences</li><li>Contribute to innovation initiatives involving NLP, CV, Generative AI, and predictive analytics</li></ul>
<p>We are looking for a skilled and innovative Data Engineer to join our team in Grove City, Ohio. In this role, you will be responsible for designing and implementing advanced data pipelines, ensuring the seamless integration and accessibility of data across various systems. As a key player in our analytics and data infrastructure efforts, you will contribute to building a robust and scalable data ecosystem to support AI and machine learning initiatives.</p><p><br></p><p>Responsibilities:</p><p>• Design and develop scalable data pipelines to ingest, process, and transform data from multiple sources.</p><p>• Optimize data models to support analytics, forecasting, and AI/ML applications.</p><p>• Collaborate with internal teams and external partners to enhance data engineering capabilities.</p><p>• Implement and enforce data governance, security, and quality standards across hybrid cloud environments.</p><p>• Work closely with analytics and data science teams to ensure seamless data accessibility and integration.</p><p>• Develop and maintain data products and services to enable actionable insights.</p><p>• Troubleshoot and improve the performance of data workflows and storage systems.</p><p>• Align data systems across departments to create a unified and reliable data infrastructure.</p><p>• Support innovation by leveraging big data tools and frameworks such as Databricks and Spark.</p>
<p>Our Enterprise Data and Analytics team is growing. We’re looking for a Lead Data Scientist to help build and develop our Data Science team and lead us into the next generation of banking. We are reimagining how data is used across the bank to better serve our customers, support our communities, and make our colleagues’ lives better. Our goal is to be the best-performing regional bank in America, and we need data and analytics to meet that goal.</p><p>As we advance our data science and analytics capabilities, we want a Lead Data Scientist to develop experts in modeling complex business problems and discovering business insights using statistical, algorithmic, mining, and visualization techniques. We are looking for a leader who has a passion for developing others, driving change, and continuously improving and evolving the application of technologies to meet today’s and tomorrow’s challenges.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Prioritizes analytical projects based on business value and technological readiness</li><li>Performs large-scale experimentation and builds data-driven models to answer business questions</li><li>Conducts research on cutting-edge techniques and tools in machine learning, deep learning, and artificial intelligence</li><li>Evangelizes best practices to analytics and product teams</li><li>Acts as the go-to resource for machine learning across a range of business needs</li><li>Owns the entire model development process, from identifying business requirements and sourcing data to fitting models, presenting results, and production scoring</li><li>Provides leadership, coaching, and mentoring to team members and develops the team to work with all areas of the organization</li><li>Works with stakeholders to ensure that business needs are clearly understood and that services meet those needs</li><li>Anticipates and analyzes trends in technology while assessing emerging technologies’ impact</li></ul>
<p>Are you passionate about building AI-powered systems that automate and orchestrate complex workflows? Our team is seeking a Workflow Engineer with a focus on AI & LLM Platforms to design, implement, and scale cutting-edge solutions for next-generation agentic and data-driven applications.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Architect and implement workflow solutions integrating Large Language Models (LLMs) and machine learning components.</li><li>Build and optimize AI-powered, agentic workflow patterns for real-world business processes.</li><li>Develop robust orchestration logic using Python or Node.js.</li><li>Collaborate with AI engineers and data scientists to design and integrate agentic systems and applied data science solutions.</li><li>Implement Retrieval Augmented Generation (RAG) pipelines to enhance LLM- and data-driven applications.</li><li>Automate and monitor machine learning workflows, ensuring performance, reliability, and scalability.</li><li>Support deployment and lifecycle management of production AI/ML workflows on modern platforms.</li><li>Document best practices and promote knowledge sharing related to workflow automation and AI/LLM system integration.</li></ul>
<p><strong>Data Pipeline Development</strong></p><ul><li>Design, build, and optimize scalable ETL/ELT pipelines to support analytics and operational workflows.</li><li>Ingest structured, semi-structured, and unstructured data from multiple internal and external sources.</li><li>Automate and orchestrate data workflows using tools like Airflow, Azure Data Factory, AWS Glue, or similar.</li></ul><p><strong>Data Architecture & Modeling</strong></p><ul><li>Develop and maintain data models, data marts, and data warehouses (relational, dimensional, and/or cloud-native).</li><li>Implement best practices for data partitioning, performance optimization, and storage management.</li><li>Work with BI developers, data scientists, and analysts to ensure datasets are structured to meet business needs.</li></ul><p><strong>Cloud Engineering & Storage</strong></p><ul><li>Build and maintain cloud data environments (Azure, AWS, GCP), including storage, compute, and security components.</li><li>Deploy and manage scalable data systems such as Snowflake, Databricks, BigQuery, Redshift, or Synapse.</li><li>Optimize cloud data cost, performance, and governance.</li></ul><p><strong>Data Quality & Reliability</strong></p><ul><li>Implement data validation, error handling, and monitoring to ensure accuracy, completeness, and reliability.</li><li>Troubleshoot pipeline failures, performance issues, and data discrepancies.</li><li>Maintain documentation and data lineage for transparency and auditability.</li></ul><p><strong>Collaboration & Cross‑Functional Support</strong></p><ul><li>Partner with product, engineering, and analytics teams to translate business requirements into technical solutions.</li><li>Support self-service analytics initiatives by preparing high-quality datasets and data products.</li><li>Provide technical guidance on data best practices and engineering standards.</li></ul><p><br></p><p><br></p>
<p>We’re seeking a Senior AI/ML Engineer to build and scale real-time data pipelines, drive platform reliability, and lead core AI/ML engineering initiatives. You’ll work across teams to deliver high‑performance, secure, and cost‑efficient data and machine learning systems.</p><p><br></p><p>Responsibilities:</p><ul><li>Architect and maintain scalable data pipelines (GCP, Kafka, Flink).</li><li>Own reliability, observability, and performance of the data/ML platform.</li><li>Optimize infrastructure performance and cloud costs.</li><li>Standardize data modeling and testing (dbt or similar).</li><li>Implement Infrastructure as Code and automation best practices.</li><li>Ensure security, governance, and compliance across data systems.</li><li>Lead cross-functional initiatives and mentor engineering peers.</li><li>Drive continuous improvement in tools, processes, and standards.</li></ul><p><br></p><p><br></p>
We are looking for a dedicated Data Scientist to join our team in Louisville, Kentucky. In this role, you will leverage advanced data science techniques to analyze healthcare data, develop predictive models, and deliver actionable insights that enhance patient care and operational efficiency. This position offers the opportunity to make a meaningful impact by addressing health disparities and supporting innovative healthcare initiatives.<br><br>Responsibilities:<br>• Utilize machine learning, predictive analytics, and data mining techniques to uncover patterns and trends in healthcare datasets.<br>• Design and implement predictive models to anticipate patient outcomes and guide proactive clinical interventions.<br>• Create clear and impactful data visualizations to communicate analytical findings to both technical and non-technical audiences.<br>• Collaborate with Data Architects to develop scalable, cloud-based analytic solutions tailored to population health and value-based care.<br>• Establish and monitor KPIs to assess the effectiveness of data-driven strategies in improving patient outcomes and efficiency.<br>• Participate in healthcare projects that focus on chronic disease management and enhancing provider performance.<br>• Stay updated on emerging technologies and methodologies to continuously enhance data science capabilities within the organization.<br>• Ensure compliance with data security and privacy regulations while handling sensitive healthcare information.
<p>We’re looking for an AI Engineer who loves turning cutting-edge models into real-world products. You’ll build, prototype, and deploy AI-powered solutions — from intelligent automation to generative AI applications — that directly impact how teams work and make decisions.</p><p>What You’ll Do</p><ul><li>Design and deploy machine learning and generative AI solutions in production</li><li>Build AI-powered tools such as chat interfaces, recommendation systems, and intelligent workflows</li><li>Work with structured and unstructured data to train and fine-tune models</li><li>Develop APIs and services to integrate AI into web applications</li><li>Optimize model performance, accuracy, and scalability</li><li>Collaborate with product, engineering, and data teams to turn ideas into usable tools</li></ul><p><br></p>
<p>We are looking for a skilled Data Scientist to join our team in Saint Paul, Minnesota. This contract position offers the opportunity to collaborate with cutting-edge technologies and contribute to the advancement of AI and data initiatives. The ideal candidate will thrive in a business-facing role, engaging with leadership and stakeholders to drive impactful solutions.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement machine learning models using Python and Databricks to address complex business challenges.</p><p>• Collaborate with cross-functional teams to identify data-driven opportunities and deliver actionable insights.</p><p>• Work within open architectures to design and execute scalable data solutions.</p><p>• Participate in leadership meetings, effectively communicating technical concepts and recommendations to non-technical stakeholders.</p><p>• Mentor and support team members to enhance overall technical and business-facing capabilities.</p><p>• Utilize tools such as Apache Spark, Hadoop, and Kafka for efficient data processing and analysis.</p><p>• Design and execute ETL workflows to ensure robust data transformation and integration.</p><p>• Contribute to the continuous improvement of data science practices and methodologies.</p><p>• Stay updated on emerging technologies and trends in AI and machine learning to maintain a competitive edge.</p><p>• Assist in bridging skills gaps within the team by providing expertise and guidance.</p>
We are looking for an experienced Senior Data Engineer with a strong background in Python and modern data engineering tools to join our team in West Des Moines, Iowa. This is a long-term contract position that requires expertise in designing, building, and optimizing data pipelines and working with cloud-based data warehouses. If you thrive in a collaborative environment and have a passion for transforming raw data into actionable insights, we encourage you to apply.<br><br>Responsibilities:<br>• Develop, debug, and optimize Python-based data pipelines using frameworks such as Flask, Django, or FastAPI.<br>• Design and implement data transformations in a data warehouse using tools like dbt, ensuring high-quality analytics-ready datasets.<br>• Utilize Amazon Redshift and Snowflake for managing large-scale data storage and performing advanced querying and optimization.<br>• Automate data integration processes using platforms like Fivetran and orchestration tools such as Prefect or Airflow.<br>• Build reusable and maintainable data models to improve performance and scalability for analytics and reporting.<br>• Conduct data analysis and visualization leveraging Python libraries such as NumPy, Pandas, TensorFlow, and PyTorch.<br>• Manage version control for data engineering projects using Git and GitHub.<br>• Ensure data quality through automated testing and validation processes.<br>• Document workflows, code, and data transformations following best practices for readability and maintainability.<br>• Optimize cloud-based data warehouse and lake platforms for performance and integration of new data sources.