We are looking for a highly experienced Data/Information Architect to lead the design and implementation of enterprise-wide data architecture solutions. This long-term contract position offers the opportunity to shape scalable, efficient, and high-performance data systems that align with organizational goals. The ideal candidate will have a strong background in data modeling, governance frameworks, and modern lakehouse architectures, working collaboratively with both technical teams and business stakeholders to deliver actionable insights.<br><br>Responsibilities:<br>• Develop and maintain enterprise data models that support analytics, reporting, and operational needs.<br>• Implement best practices for data modeling in a lakehouse architecture, ensuring scalability and alignment with business requirements.<br>• Architect data structures using methodologies such as Star Schema, Snowflake Schema, and Data Vault for optimal performance.<br>• Collaborate with stakeholders to gather data requirements and ensure business objectives are met.<br>• Work alongside Data Engineers to create physical data models optimized for Microsoft Fabric and Delta Lake environments.<br>• Establish governance processes, including metadata management, data lineage, and quality assurance protocols.<br>• Define and manage canonical data models across key business domains to ensure consistency.<br>• Document and enforce data modeling standards to drive adoption across engineering teams.<br>• Assess and enhance existing data models, identifying areas for improvement in performance and alignment.<br>• Provide strategic guidance on data management tools and technologies to support organizational goals.
We are seeking a hands-on Senior Enterprise Architect in Artificial Intelligence (AI) to join our global Enterprise Architecture team. This role blends deep technical expertise with architectural design and practical implementation to drive AI-powered transformation initiatives.<br><br>As part of a forward-thinking global technology team, you’ll collaborate across business, data, and product functions to design and implement AI/ML solutions that enable digital products and services.<br><br>Key Responsibilities:<br>• Design and architect enterprise-scale AI/ML solutions across areas such as Machine Learning, Generative AI, Deep Learning, Virtual Assistants, and Cognitive Services (Vision/Image, Text/Language processing).<br>• Develop and communicate AI roadmaps, future-state architectures, and design artifacts.<br>• Rapidly prototype and build proofs of concept (PoCs) and MVPs for AI models and algorithms.<br>• Evaluate and recommend AI/ML tools, platforms, and frameworks; conduct ROI analysis.<br>• Experiment with and fine-tune LLMs, train custom models, and assess performance metrics.<br>• Perform data exploration, cleansing, and feature engineering to prepare datasets for model training.<br>• Guide and mentor engineering and data science teams through AI/ML solution design, deployment, and integration into enterprise workflows.<br>• Continuously scan industry innovations and apply emerging AI/ML technologies to business problems.<br><br>What We’re Looking For:<br>• Strong technical and business acumen in creating technology-driven solutions.<br>• Passion for experimenting with and adopting emerging AI/ML technologies.<br>• Excellent communication and influencing skills; ability to present complex technical concepts to both technical and non-technical audiences.<br>• Proven ability to balance timeliness, cost, and quality in solution design.<br>• Experience leading digital transformation, target operating models, and performance improvement initiatives.<br><br>Qualifications:<br>• Bachelor’s degree in a STEM or related field (MBA a plus).<br>• 5+ years in AI/ML solution architecture, prototyping, and experimentation.<br>• 5+ years working with AWS and/or Azure data, analytics, and AI services.<br>• 3+ years of experience with data science tools and frameworks.<br>• Recent, hands-on experience with Generative AI, LLMs, and Agentic AI platforms.<br>• Knowledge of cloud-native services (data storage, compute, networking, security).<br>• Strong understanding of statistical methods, data preprocessing, and feature engineering.
We are looking for a Senior Database Engineer to provide expert technical leadership for our global, cloud-based data infrastructure. This role involves designing, operating, and optimizing scalable, secure, and resilient database systems to support enterprise-scale workloads across AWS and Azure. As this is a Contract position with the possibility of becoming permanent, it offers an excellent opportunity to contribute to the development of cutting-edge database solutions while collaborating with cross-functional teams.<br><br>Responsibilities:<br>• Design and manage multi-region database architectures across AWS and Azure to support geo-distributed workloads.<br>• Architect and maintain relational, NoSQL, and document databases such as Snowflake, PostgreSQL, DynamoDB, Cosmos DB, and MongoDB.<br>• Lead hands-on database migrations between cloud platforms and legacy systems with a focus on scalability and reliability.<br>• Implement indexing strategies, optimize queries, and establish scaling patterns for handling large datasets and real-time applications.<br>• Enhance database performance to ensure high availability, low latency, and cost efficiency at an enterprise level.<br>• Support and refine data ingestion workflows and pipeline integrations using tools like AWS Glue, Step Functions, Lambda, and Azure Data Factory.<br>• Collaborate with Data Engineering teams to develop streaming solutions using Kafka, Kinesis, and AWS services.<br>• Apply robust security measures, including encryption, access controls, and secrets management, to protect database systems.<br>• Develop disaster recovery strategies and maintain backup solutions to ensure data integrity and availability.<br>• Monitor database systems using tools like CloudWatch, Azure Monitor, and Datadog, ensuring optimal reliability and performance.
We are looking for a Senior Database Engineer to take on a critical role in shaping the future of our global data platform. In this position, you will lead technical strategy, architect robust multi-cloud systems, and oversee initiatives to ensure reliability, scalability, and cost efficiency. You will have a hands-on approach, providing mentorship and collaborating with leadership to drive impactful technical decisions. This is a contract opportunity with the potential for a permanent position, located in Lehi, Utah.<br><br>Responsibilities:<br>• Develop and execute the technical roadmap for a scalable and reliable data infrastructure.<br>• Architect and implement multi-region, cross-account data platforms to support global operations.<br>• Establish and enforce engineering standards for database design, data pipelines, reliability, and observability.<br>• Lead post-incident reviews and implement solutions to prevent recurring issues.<br>• Collaborate with product and engineering teams to identify technical risks and optimize roadmaps.<br>• Design and oversee large-scale data migrations, ensuring fault tolerance and self-healing capabilities.<br>• Optimize database performance through indexing, query tuning, and capacity planning.<br>• Implement robust security measures, including encryption, secrets management, and access controls.<br>• Partner with cross-functional teams to align business requirements with technical solutions.<br>• Provide hands-on leadership in developing critical systems and resolving complex production incidents.
<p>We are looking for a highly skilled Senior Data Scientist to join our Enterprise Analytics team in Easton, OH. This contract-to-permanent position offers the opportunity to leverage cutting-edge data science techniques to drive business insights and optimize customer experiences. As part of a centralized team, you will collaborate across departments to influence strategic decisions and deliver actionable insights that align with organizational goals.</p><p><br></p><p>This will be a contract-to-permanent role, 4 days onsite per week in the Easton neighborhood of Columbus.</p><p><br></p><p>Responsibilities:</p><p>• Utilize advanced analytics techniques to analyze business data and uncover actionable insights.</p><p>• Develop and implement large-scale experiments and data-driven models to address complex business challenges.</p><p>• Conduct research on emerging tools and methodologies in machine learning, deep learning, and artificial intelligence.</p><p>• Define requirements for training and evolving deep learning models and algorithms.</p><p>• Present data-driven recommendations to product teams and other stakeholders to guide decision-making.</p><p>• Partner with cross-functional teams, including Marketing, Finance, IT, and Sales, to support organizational goals.</p><p>• Create intuitive data visualizations and performance metrics that equip stakeholders with actionable insights.</p><p>• Collaborate within an Agile development framework to support projects from inception to delivery.</p><p>• Influence the direction of the organization through impactful analytics and strategic consultation.</p><p>• Perform other duties as needed to support the broader analytics team.</p>
We are looking for an experienced Senior Data Engineer with a strong background in Python and modern data engineering tools to join our team in West Des Moines, Iowa. This is a long-term contract position that requires expertise in designing, building, and optimizing data pipelines and working with cloud-based data warehouses. If you thrive in a collaborative environment and have a passion for transforming raw data into actionable insights, we encourage you to apply.<br><br>Responsibilities:<br>• Develop, debug, and optimize Python-based data pipelines using frameworks such as Flask, Django, or FastAPI.<br>• Design and implement data transformations in a data warehouse using tools like dbt, ensuring high-quality analytics-ready datasets.<br>• Utilize Amazon Redshift and Snowflake for managing large-scale data storage and performing advanced querying and optimization.<br>• Automate data integration processes using platforms like Fivetran and orchestration tools such as Prefect or Airflow.<br>• Build reusable and maintainable data models to improve performance and scalability for analytics and reporting.<br>• Conduct data analysis and modeling leveraging Python libraries such as NumPy and Pandas, along with ML frameworks such as TensorFlow and PyTorch.<br>• Manage version control for data engineering projects using Git and GitHub.<br>• Ensure data quality through automated testing and validation processes.<br>• Document workflows, code, and data transformations following best practices for readability and maintainability.<br>• Optimize cloud-based data warehouse and lake platforms for performance and integration of new data sources.
<p><strong>Data Engineer (Python / AWS)</strong></p><p><strong>Location:</strong> Remote (Northeast / Greater Boston area preferred)</p><p><strong>Type:</strong> Full-Time</p><p><strong>Level:</strong> Mid-to-Senior Individual Contributor</p><p><strong>About the Role</strong></p><p>We are looking for a strong individual contributor who excels in the Python data ecosystem and enjoys building reliable, scalable data pipelines. This role sits within a data engineering group responsible for integrating large volumes of data from external partners and transforming it into usable datasets for internal teams. You’ll work with modern cloud tools while also helping our team gradually transition away from a legacy platform.</p><p>This position is ideal for someone who wants to stay hands-on, focus on technical execution, and remain in an IC role for the next several years. We’re not looking for someone who is aiming to move immediately into architecture or leadership.</p><p>This team is fully distributed, and although candidates in the Boston area can go into the office, the rest of the group is remote. 
Anyone local may occasionally sit with other teams when on site.</p><p><br></p><p><strong>What You’ll Do</strong></p><ul><li>Build and maintain ETL pipelines that ingest, clean, and aggregate data received from external vendors and large enterprise partners.</li><li>Develop Python‑based data processing workflows deployed on AWS cloud services.</li><li>Work with tools such as AWS Glue, Airflow, dbt, and PySpark to support data transformations and pipeline orchestration.</li><li>Help modernize existing workflows and assist in the gradual migration away from a legacy data system.</li><li>Collaborate with internal stakeholders to understand data needs, define requirements, and ensure reliable integration of partner data feeds.</li><li>Troubleshoot pipeline issues, optimize performance, and improve overall system stability.</li><li>Contribute to best practices around code quality, testing, documentation, and data governance.</li></ul><p><br></p>
<p><strong>Data Scientist</strong></p><p><strong>3-Month Contract-to-Hire</strong></p><p><strong>Location: </strong>Hybrid, 4 days onsite per week in Columbus, OH</p><p><strong>Pay: </strong>Available on W2</p><p>A leading financial services organization is seeking a data-driven <strong>Data Scientist</strong> to join its Enterprise Analytics team. In this role, you will leverage customer, product, channel, and digital data to uncover opportunities that enhance consumer experience and drive business growth. You will translate complex analytical findings into clear, actionable insights for stakeholders across Product, Marketing, Digital, and senior leadership.</p><p>This position offers the chance to influence strategic decisions across the enterprise while working within a collaborative, high-performing analytics and data science group.</p><p><br></p><p><strong>Responsibilities</strong></p><ul><li>Transform data from enterprise analytics tools into actionable insights related to customer behavior, digital sales, servicing performance, and value delivery.</li><li>Communicate findings and recommendations clearly to internal stakeholders and senior leadership.</li><li>Participate in cross-functional initiatives, providing opportunity analysis, measurement plans, dashboards, and post-release/testing analyses.</li><li>Translate complex technical concepts into intuitive visualizations, dashboards, and presentations.</li><li>Extract and analyze data using tools such as Python, R, and SAS.</li><li>Build intuitive, automated dashboards with a focus on anomaly detection and actionable visualization.</li><li>Partner with Digital, Marketing, Product, IT, Fraud, and Risk teams to evaluate initiatives and identify/mitigate risks.</li><li>Develop and refine KPIs and metrics to assess digital and omnichannel performance.</li><li>Create self-service assets that enable business partners to access and interpret data independently.</li><li>Apply automation techniques to streamline manual analytical 
processes.</li></ul>
<p><strong>Position Summary:</strong></p><ul><li>We are looking for a Data Operations Engineer to support and oversee the automated data‑pipeline environment built on AWS. This position bridges data engineering and customer operations, ensuring that incoming datasets are processed accurately, consistently, and securely within established ingestion and transformation frameworks.</li><li>Key responsibilities include monitoring automated workflows, troubleshooting processing failures, validating data quality, and helping onboard new customers by aligning their data formats to a standardized internal model.</li><li>The role requires strong proficiency in SQL and Python, practical experience with AWS services, and the ability to communicate effectively with external customers when data issues arise.</li></ul><p><strong>Responsibilities:</strong></p><p><strong>Data Pipeline Monitoring & Operations:</strong></p><ul><li>Monitor automated batch and streaming data pipelines in AWS</li><li>Identify, troubleshoot, and resolve data processing failures</li><li>Investigate file‑level errors, schema mismatches, and transformation issues</li><li>Perform root‑cause analysis and document resolutions</li><li>Ensure data integrity, completeness, and timeliness across environments</li><li>Escalate architectural or systemic issues to the Data Engineering team</li></ul><p><strong>Customer Data Onboarding & Implementation:</strong></p><ul><li>Collaborate directly with customers to understand their file formats and data structures</li><li>Create and maintain mapping templates to align customer data to a normalized data model</li><li>Validate sample files and run tests on ingestion workflows</li><li>Configure ingestion parameters within predefined frameworks</li><li>Support customer go‑live processes and initial data processing cycles</li></ul><p><strong>Data Quality & Continuous Improvement:</strong></p><ul><li>Write SQL queries to validate data accuracy and research 
anomalies</li><li>Develop lightweight Python scripts for validation, transformation checks, or automation tasks</li><li>Improve monitoring processes, internal documentation, and operational playbooks</li><li>Work with engineering teams to strengthen platform reliability and observability</li></ul><p><strong>Customer & Cross‑Functional Collaboration:</strong></p><ul><li>Communicate clearly with customers regarding file issues or data discrepancies</li><li>Partner with internal teams including Data Engineering, Product, and Support</li><li>Provide feedback to enhance scalability, resilience, and overall platform performance</li></ul>
We are looking for an experienced Business Intelligence (BI) Engineer to join our team in Englewood, Colorado. In this role, you will focus on developing and optimizing dashboards using Amazon QuickSight to support analytics initiatives across multiple projects. This position requires strong technical expertise, excellent communication skills, and a creative approach to dashboard design and data storytelling. This is a long-term contract opportunity within the telecom services industry.<br><br>Responsibilities:<br>• Develop enterprise-level dashboards using Amazon QuickSight to deliver actionable insights.<br>• Collaborate with stakeholders to gather and translate business requirements into reporting specifications and prototypes.<br>• Design and prototype interactive dashboards, ensuring clarity, usability, and impactful storytelling.<br>• Work alongside Data Engineering teams to define required datasets and support semantic layer needs.<br>• Utilize QuickSight Q and AI features to enhance data interpretation and guide strategic decision-making.<br>• Create calculated fields, parameters, and variables to optimize reporting functionalities.<br>• Ensure dashboards integrate both qualitative and quantitative data for comprehensive analysis.<br>• Conduct user experience (UX) design for dashboards, focusing on layout and accessibility.<br>• Support government projects requiring eligibility for federal security clearance and strict vetting processes.<br>• Communicate effectively with cross-functional teams to ensure alignment and successful project execution.
<p>Robert Half is hiring! </p><p>The Power Platform & Business Analytics Developer is responsible for building data analytics solutions, workflow automation, and system integrations using Microsoft Power Platform. This role develops Power BI reports and semantic models using Microsoft Fabric data warehouses, creates automated workflows and digital forms, integrates Power Platform solutions with ERP systems through REST APIs, and supports SharePoint Online document management and governance initiatives.</p><p><strong>Key Responsibilities:</strong></p><p><strong>Power BI & Business Analytics</strong></p><ul><li>Develop Power BI semantic models, datasets, dashboards, and reports using Microsoft Fabric data warehouses</li><li>Partner with business leaders to define and implement business metrics and KPIs</li></ul><p><strong>Power Automate & e-Form Development</strong></p><ul><li>Design and maintain automated workflows for e-form submissions and approval processes</li><li>Implement notifications, routing, and exception-handling mechanisms</li><li>Create digital forms using Power Apps or SharePoint Forms</li><li>Ensure workflows are scalable, secure, and properly monitored</li></ul><p><strong>RESTful API Integration</strong></p><ul><li>Develop integrations between Power Platform solutions and ERP systems using REST APIs</li><li>Create and maintain custom connectors for API integrations</li><li>Document integration mappings, workflows, and data flows</li></ul><p><strong>SharePoint Document Management</strong></p><ul><li>Implement approval workflows and automated notifications through Power Automate</li><li>Support business units in creating document libraries, workflows, and automations</li><li>Manage SharePoint Online architecture including metadata, content types, and permissions</li></ul><p><strong>Collaboration, Support & Governance</strong></p><ul><li>Work with stakeholders to translate business requirements into technical solutions</li><li>Document workflows, 
reporting assets, and integration architecture</li><li>Support governance processes including environment strategy and change management</li></ul><p><strong>Required Qualifications:</strong></p><ul><li>Bachelor’s degree in Computer Science, Data Analytics, Information Technology, or equivalent experience</li><li>2–5+ years of experience with Microsoft Power Platform (Power BI, Power Automate, Power Apps)</li><li>Strong Power BI development experience using Microsoft Fabric data warehouses</li><li>Experience designing complex automated workflows and building digital forms</li><li>Practical experience integrating systems using REST APIs</li><li>Knowledge of OAuth2, JSON, HTTP methods, and custom connectors</li><li>Experience supporting SharePoint Online architecture and workflow automation</li><li>Strong analytical, problem-solving, and documentation skills</li><li>Ability to effectively communicate and collaborate with non-technical stakeholders</li></ul><p><br></p>
<p>Robert Half is hiring! We are looking for a skilled Business Intelligence (BI) Engineer to join our team in Greenville, South Carolina. In this role, you will design and deliver high-performing, scalable, and secure BI solutions, enabling actionable insights across various business domains such as Policy, Claims, and Billing. You will collaborate with stakeholders and data engineering teams to build reusable models and optimize reporting solutions using industry-leading tools.</p><p><br></p><p>Responsibilities:</p><p>• Design and develop reporting solutions using data from multiple systems, including Policy, Claims, Billing, and third-party sources.</p><p>• Create reusable semantic and data models to support enterprise-wide analytics.</p><p>• Develop interactive dashboards and operational reports using tools such as Power BI and Cognos Analytics.</p><p>• Implement security measures, including role-based and row-level access controls, for BI solutions.</p><p>• Collaborate with stakeholders in Claims, Underwriting, and Finance to refine and standardize metrics.</p><p>• Build and optimize semantic models and executive-level dashboards leveraging Azure Data Lake and Power BI.</p><p>• Design star schemas for reporting and analytics, ensuring data lineage and governance.</p><p>• Perform data validation and reconciliation to maintain accuracy and consistency of BI outputs.</p><p>• Document data definitions, transformations, and business logic to support audit and regulatory reporting requirements.</p><p>• Optimize report performance through advanced query tuning and aggregation strategies.</p>
We are looking for an experienced Business Intelligence (BI) Engineer to join our team in Coppell, Texas. In this long-term contract role, you will play a key part in analyzing and interpreting user, product, and market data to support strategic business initiatives. You will collaborate with stakeholders across various departments to deliver actionable insights that drive efficiency and innovation.<br><br>Responsibilities:<br>• Partner with business stakeholders to understand priorities and develop timely, effective analytical solutions.<br>• Analyze user, product, and market data to support initiatives such as customer segmentation, competitor analysis, and predictive modeling.<br>• Provide insights and recommendations to enhance business intelligence roadmaps and development strategies.<br>• Develop, maintain, and leverage models to optimize product pricing, identify market opportunities, and improve margins.<br>• Collaborate with cross-functional teams to architect new strategies using data-driven insights.<br>• Identify and communicate key findings to support existing and future business initiatives.<br>• Evaluate business processes to identify opportunities for improvement and assist in proposing solutions.<br>• Document and define business functions and processes to ensure clarity and alignment.<br>• Utilize advanced tools like Power BI, Tableau, and Snowflake to manipulate and visualize data.<br>• Support decision-making by delivering high-quality analytic solutions that align with organizational goals.
<p>We are looking for a skilled Business Intelligence (BI) Engineer to join our team on a contract basis in Columbus, Ohio. In this role, you will leverage your expertise in data modeling, reporting, and visualization to empower data-driven decision-making. You will design impactful dashboards, streamline reporting processes, and contribute to the organization’s analytics strategy to deliver actionable insights.</p><p><br></p><p>Responsibilities:</p><p>• Develop and maintain comprehensive data models to support analytics and reporting needs.</p><p>• Create unified dashboards by consolidating multiple reports for improved accessibility and efficiency.</p><p>• Design visually engaging dashboards using Power BI to present clear and actionable insights.</p><p>• Analyze complex datasets to identify trends, generate user stories, and support decision-making.</p><p>• Utilize Azure tools such as Databases, Data Lake, and Data Factory to manage and transform data.</p><p>• Implement and enforce best practices in data governance to ensure accuracy and compliance.</p><p>• Support the organization’s data strategy to enhance reporting and business intelligence capabilities.</p><p>• Collaborate with stakeholders to understand requirements and deliver tailored solutions.</p><p>• Provide technical expertise to improve analytics workflows and processes.</p>
<p>We are looking for a skilled Power BI Data Visualization Specialist to join our team in Stamford, Connecticut. In this role, you will leverage your expertise in data visualization and storytelling to transform complex datasets into actionable business insights. Your contributions will support decision-making processes and enhance operational efficiency through well-designed dashboards and reports.</p><p><br></p><p>Responsibilities:</p><p>• Create dynamic dashboards and visualizations using Power BI to effectively communicate data insights.</p><p>• Interpret and analyze complex datasets to generate meaningful business recommendations.</p><p>• Ensure data accuracy and integrity by meticulously validating and reconciling information.</p><p>• Collaborate with stakeholders to understand their data needs and deliver tailored solutions.</p><p>• Design intuitive and user-friendly dashboards that align with visualization best practices.</p><p>• Utilize relational data sources to identify and explore data relationships.</p><p>• Develop Power BI reports that are embedded within applications when required.</p><p>• Apply knowledge of structured datasets to extract insights from operational systems.</p><p>• Partner with teams to support data-driven operations in areas such as finance, manufacturing, and supply chain.</p>
We are looking for a Data Visualization Specialist to join our team in Cincinnati, Ohio. In this role, you will leverage your expertise in business systems analysis to transform complex data into actionable insights through effective visualization techniques. The ideal candidate will have a passion for understanding business processes and applying technical skills to create impactful data solutions.<br><br>Responsibilities:<br>• Collaborate with stakeholders to analyze business processes and identify data visualization needs.<br>• Design and develop interactive dashboards and reports using tools such as Power BI, Tableau, and Qlik.<br>• Apply best practices in data visualization to ensure clear and accurate representation of data.<br>• Utilize data modeling techniques to enhance reporting capabilities and support business decision-making.<br>• Perform data analysis to identify opportunities for enrichment and aggregation, utilizing tools like Excel.<br>• Work with on-premises or cloud-based data platforms to ensure seamless integration and accessibility of data.<br>• Provide technical expertise and guidance on visualization tools and techniques to team members and clients.<br>• Stay updated on industry trends and advancements in data visualization technologies.
We are looking for an experienced Data and Analytics Manager to lead master data management initiatives within our organization. This role involves developing and implementing data governance policies and procedures to ensure accuracy, consistency, and reliability across all systems. As a long-term contract position based in Dallas, Texas, this opportunity offers the chance to collaborate with cross-functional teams and lead efforts to optimize data processes.<br><br>Responsibilities:<br>• Develop and oversee master data governance standards to ensure data accuracy, consistency, and reliability.<br>• Collaborate with stakeholders across departments to communicate business processes and data requirements.<br>• Approve and process requests for creating or modifying master data while auditing completed mappings for quality assurance.<br>• Lead a team of mapping specialists, allocating resources effectively and ensuring adherence to timelines and priorities.<br>• Create and maintain documentation to support team training and mapping verification processes.<br>• Monitor and audit team performance to ensure consistent quality and address knowledge gaps.<br>• Coordinate updates and progress reports for organizational collaboration meetings, ensuring alignment with business goals.<br>• Proactively communicate with stakeholders, providing regular updates on team efforts, challenges, and achievements.<br>• Review and analyze source data processes to ensure a thorough understanding of data mapping requirements.<br>• Track changes to IT architecture and processes to assess impacts on data usage and quality.
We are looking for a highly experienced Senior Machine Learning Engineer to join our team in Boston, Massachusetts. In this role, you will design, develop, and deploy cutting-edge machine learning systems that solve complex problems and scale effectively in production environments. This position offers an exciting opportunity to contribute to impactful projects, leveraging your expertise in machine learning, cloud infrastructure, and data engineering.<br><br>Responsibilities:<br>• Build and deploy machine learning models and solutions for production environments, ensuring they meet scalability and performance standards.<br>• Design and implement comprehensive ML pipelines, including data ingestion, feature engineering, model training, evaluation, and serving.<br>• Write clean, efficient code in Python and leverage its ML ecosystem, including frameworks such as TensorFlow, PyTorch, and scikit-learn.<br>• Work with large datasets to extract meaningful insights and develop complex queries using modern data processing tools.<br>• Utilize containerization technologies like Docker and cloud platforms such as AWS to ensure robust and scalable deployment.<br>• Apply MLOps best practices, including CI/CD pipelines, automated testing, and performance monitoring, to maintain reliable machine learning systems.<br>• Conduct research and apply deep machine learning and AI techniques, including statistical modeling and large language models.<br>• Solve complex analytical problems with pragmatic engineering approaches while maintaining scientific rigor.<br>• Collaborate with cross-functional teams to align machine learning solutions with business goals and mission-driven objectives.<br>• Monitor and address issues like data drift and model performance to ensure continuous improvement and reliability.
<p><strong>DevOps Engineer</strong></p><p>We are seeking a motivated <strong>DevOps Engineer</strong> to enhance automation, streamline deployments, and support modern cloud-native infrastructure. This role is ideal for someone who enjoys improving system reliability, optimizing pipelines, and enabling faster development workflows.</p><p><strong>Responsibilities</strong></p><ul><li>Build, maintain, and optimize CI/CD pipelines using tools like Azure DevOps, GitHub Actions, or Jenkins</li><li>Support containerized environments using Docker and Kubernetes</li><li>Manage infrastructure automation using Terraform, Helm, Ansible, or Bicep</li><li>Monitor application performance, system uptime, and deployment health</li><li>Troubleshoot build failures, pipeline issues, infrastructure drift, and deployment errors</li><li>Maintain configuration management across multiple environments</li><li>Collaborate with developers and cloud engineers during releases and application migrations</li><li>Implement logging, monitoring, and alerting solutions</li><li>Maintain documentation for deployments, pipelines, and CI/CD procedures</li></ul>
<p>We are looking for a Systems Administrator to help drive technology modernization and operational excellence. The ideal candidates are IT professionals with a strong background in systems administration or infrastructure operations, eager to support mission-critical initiatives and grow into senior infrastructure roles.</p><p><strong>Qualifications:</strong></p><ul><li>2–5 years of experience in systems administration or infrastructure operations; helpdesk/NOC backgrounds with relevant exposure also considered.</li><li>Proven expertise in Linux system administration.</li><li>Familiarity with enterprise infrastructure, including storage, virtualization, and networking.</li><li>Hands-on experience with monitoring systems such as Zabbix, Grafana, or Prometheus.</li><li>Basic scripting skills (e.g., Bash, Python) and a strong interest in further developing automation capabilities.</li><li>Excellent written communication for documentation and process development.</li><li>Ability to respond quickly and decisively during support rotation and system issues.</li><li>Comfortable leveraging AI tools for troubleshooting, documentation, and automation, with a disciplined approach to validating outputs.</li><li>Growth mindset: eagerness to learn, develop, and advance into senior infrastructure roles over time.</li></ul>
We are looking for an experienced DevOps Engineer to join our team in Raleigh, North Carolina. This role involves collaborating with cross-functional teams to optimize cloud infrastructure, automate processes, and support development pipelines. The ideal candidate will bring a strong background in both development and infrastructure, as well as leadership skills to guide projects and mentor team members.<br><br>Responsibilities:<br>• Design and implement automation solutions to streamline development and deployment processes.<br>• Manage containerization technologies, ensuring efficient application deployment and scalability.<br>• Collaborate with engineering teams to establish best practices for cloud infrastructure and DevOps strategies.<br>• Develop and maintain Infrastructure as Code (IaC) solutions using tools like Terraform and Ansible.<br>• Monitor system performance and create dashboards to provide insights into infrastructure health.<br>• Partner with senior leadership and technical teams to align DevOps initiatives with organizational goals.<br>• Lead project teams, providing mentorship and guidance to ensure successful execution.<br>• Contribute to backlog management, addressing tasks related to monitoring and operational improvements.<br>• Assist in setting up cloud environments and DevOps tools to support testing and development workflows.<br>• Work with Azure DevOps and other platforms to enhance operational efficiency and collaboration.
<p>Our company is seeking a talented DevOps Engineer to join our team on a contract basis. In this role, you will collaborate closely with development, QA, and IT teams to streamline and enhance deployment pipelines, automation, and cloud infrastructure. The ideal candidate is adept at solving complex problems, communicates well, and is proactive in identifying and implementing continuous improvement opportunities.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, build, and maintain CI/CD pipelines to support scalable application development and deployment.</li><li>Automate software build and deployment processes to improve reliability and efficiency.</li><li>Manage and optimize cloud infrastructure (such as AWS, Azure, or Google Cloud Platform).</li><li>Monitor system performance and troubleshoot issues related to infrastructure, deployment, and automation.</li><li>Collaborate with developers and other team members to identify requirements and deliver technical solutions aligned with project goals.</li><li>Maintain infrastructure as code using tools such as Terraform, Ansible, or similar.</li><li>Implement and uphold security best practices in DevOps workflows and cloud architectures.</li><li>Document procedures and share knowledge across the team to support a collaborative work environment.</li></ul>
We are looking for a skilled DevOps Engineer to join our team in Indianapolis, Indiana, within the chemicals manufacturing industry. In this contract-to-permanent role, you will play a vital part in designing, implementing, and optimizing automation and control systems for innovative research and development projects. The ideal candidate will possess hands-on expertise in system integration, process control, and electrical design, ensuring the seamless operation of advanced technologies.<br><br>Responsibilities:<br>• Design and implement custom control systems for process optimization and system integration in R&D projects.<br>• Develop and modify machine control software to enhance performance and improve operational efficiency.<br>• Create detailed process flow diagrams, schematics, and technical documentation for project execution.<br>• Evaluate equipment specifications and performance requirements to design appropriate solutions.<br>• Collaborate with external vendors to identify, assess, and integrate their technologies into existing processes.<br>• Conduct process mapping and optimization to improve current control systems.<br>• Assist in the installation, configuration, and validation of equipment to ensure successful integration.<br>• Utilize IIoT platforms and Rockwell systems to develop robust process control solutions.<br>• Manage and oversee electrical design and fabrication for custom automation solutions.<br>• Ensure all systems comply with safety standards and operational requirements.
<p>Join our dynamic technology team as a Site Reliability Engineer (SRE) or Platform Engineer, where you’ll play a central role in building, automating, and maintaining our modern infrastructure across both on-premise and cloud environments.</p><p><strong>Qualifications:</strong></p><ul><li>Bachelor’s degree in Computer Science, Engineering, or a related technical field.</li><li>3–5+ years of experience in SRE, Platform Engineering, or Systems Administration within fast-paced environments.</li><li>Strong Python scripting skills.</li><li>Deep hands-on experience with Kubernetes (deployment, management, troubleshooting); OpenShift experience is a plus.</li><li>Proficiency with Docker/Podman and internal image management.</li><li>Solid experience with Ansible and Terraform; Puppet knowledge is helpful.</li><li>Familiarity with CI/CD workflows; experience with ArgoCD (preferred) or Flux for GitOps.</li><li>Proficiency with Grafana and Prometheus; exposure to Grafana Cloud/Alloy is desirable.</li><li>Experience with incident management and on-call tools such as Rootly, Opsgenie, or PagerDuty.</li><li>Security-first mindset with exposure to DevSecOps practices, including SonarQube, SAST, and CVE scanning.</li><li>Proven experience with both on-premise and cloud infrastructure:<ul><li><strong>On-Premise:</strong> primary experience with Kubernetes clusters; familiarity with Proxmox is desirable.</li><li><strong>Cloud:</strong> AWS and GCP experience (with a growing footprint), managed via Terraform.</li></ul></li></ul><p>If you’re passionate about automation, reliability, and working at the forefront of scalable infrastructure, we invite you to apply.</p>