<p><strong>Data Engineer</strong></p><p>On-site | Austin, TX | Contract-to-Hire</p><p><br></p><p><strong>Responsibilities:</strong></p><ul><li>Design, build, and maintain scalable data pipelines and ETL/ELT processes</li><li>Develop and optimize data architectures for data lakes, warehouses, and analytics platforms</li><li>Ingest, transform, and integrate data from multiple sources (databases, APIs, streaming systems)</li><li>Ensure data quality, reliability, and performance across data systems</li><li>Collaborate with data scientists, analysts, and business stakeholders to support reporting and analytics needs</li><li>Optimize database performance, queries, and data storage strategies</li><li>Implement data governance, security, and compliance best practices</li><li>Automate data workflows and monitoring processes</li><li>Troubleshoot and resolve data pipeline failures and performance issues</li><li>Document data models, workflows, and technical processes</li></ul>
We are looking for a skilled Data Engineer to join our team in Houston, Texas. In this contract-to-permanent position, you will play a key role in designing, developing, and optimizing data solutions while collaborating with cross-functional teams to deliver impactful results. This role offers an excellent opportunity to contribute to innovative projects and mentor other developers.<br><br>Responsibilities:<br>• Design and implement scalable data solutions using tools such as Apache Spark, Hadoop, and Kafka.<br>• Build and maintain efficient ETL processes to ensure seamless data transformation and integration.<br>• Collaborate with product owners, business analysts, and stakeholders to gather requirements and translate them into technical solutions.<br>• Optimize and troubleshoot complex data workflows to enhance performance and reliability.<br>• Lead technical discussions and provide architectural guidance for best practices and development standards.<br>• Mentor entry-level developers and conduct code reviews to ensure high-quality deliverables.<br>• Integrate data solutions with existing systems and third-party tools using APIs and cloud platforms.<br>• Stay updated with the latest data engineering technologies and proactively recommend improvements.<br>• Work within Agile/Scrum teams to deliver solutions aligned with user stories and project goals.<br>• Ensure compliance with security and quality standards through thorough documentation and testing.
<p>Robert Half Technology is seeking a <strong>mid-to-senior level Data Engineer</strong> to support the modernization of an existing data environment for a client in Bellevue, Washington. This role will focus on <strong>rearchitecting data pipelines into Databricks</strong>, improving performance, and establishing scalable data architecture and governance. This is a hands-on role in a <strong>fast-paced, less structured environment</strong>, ideal for someone who takes ownership and can operate with autonomy.</p><p> </p><p><strong>Duration:</strong> Long-term contract with potential for extension or conversion</p><p><strong>Location:</strong> Bellevue, Washington (hybrid, 3 days onsite)</p><p><strong>Schedule:</strong> Monday-Friday (9AM-5PM PST)</p><p> </p><p><strong>Key Responsibilities</strong></p><ul><li>Rebuild and optimize existing <strong>Python-based ETL pipelines</strong> within Databricks </li><li>Design and implement scalable <strong>data ingestion and transformation processes</strong> </li><li>Architect and maintain <strong>data marts and data warehouse structures</strong> </li><li>Implement <strong>Medallion Architecture (Bronze, Silver, Gold layers)</strong> </li><li>Improve performance of data processing workflows (reduce runtimes, optimize queries) </li><li>Support migration and consolidation of data into Databricks </li><li>Document <strong>data pipelines, tables, and architecture</strong> for governance and maintainability </li><li>Define best practices for <strong>data storage, organization, and access</strong> </li><li>Ensure alignment with existing compliance and data standards </li></ul><p><br></p>
We are looking for a detail-oriented and innovative Data Engineer to join our team on a long-term contract basis. In this role, you will focus on designing, building, and maintaining data pipelines while ensuring data quality and integrity across various platforms. This position is based in Kaysville, Utah, and offers an exciting opportunity to contribute to the seamless flow of data within a dynamic environment.<br><br>Responsibilities:<br>• Develop, optimize, and maintain data pipelines to facilitate efficient data ingestion, transformation, and integration from diverse sources.<br>• Utilize Snowflake and other cloud platforms to manage and enhance data storage solutions for high performance and scalability.<br>• Create and maintain clean, efficient code to interface with APIs for seamless data extraction and integration.<br>• Implement rigorous data validation and quality assurance processes to ensure data accuracy and reliability.<br>• Leverage AI tools to enhance coding efficiency, automate repetitive tasks, and troubleshoot complex issues.<br>• Collaborate with cross-functional teams to align data workflows with organizational goals and objectives.<br>• Optimize SQL queries and workflows to improve performance and reduce processing time.<br>• Work with tools like dbt and Python to build and orchestrate scalable data workflows.<br>• Stay updated on emerging technologies and best practices in data engineering to continuously improve processes.
<p><strong>Mid-Level Data Engineer (On-Site | Los Angeles, CA)</strong></p><p><em>Build systems that actually drive business decisions.</em></p><p><br></p><p>This is not a “maintain the pipeline and go home” kind of role.</p><p><br></p><p>We’re looking for a sharp, mid-level Data Engineer who wants to operate close to the business, own meaningful projects end-to-end, and build systems that directly impact how decisions get made across an entire organization. You’ll join a small, high-performing team where your work won’t get buried—it will be seen, used, and relied on daily.</p><p><br></p><p>If you’re someone who enjoys solving messy problems, building from scratch, and working in a fast-paced, high-expectation environment, this is the kind of role where you’ll grow quickly.</p><p><br></p><p>What You’ll Do</p><ul><li>Design and build automated data systems (e.g., billing workflows, internal tools)</li><li>Create and maintain BI dashboards and reports using Python, Excel, and visualization tools</li><li>Write and optimize SQL queries and ETL pipelines for clean, reliable data flow</li><li>Analyze large datasets to uncover actionable insights and trends</li><li>Partner with stakeholders across the business to translate needs into technical solutions</li><li>Help improve data accessibility and usability across departments</li><li>Ensure data integrity and accuracy through audits and troubleshooting</li><li>Contribute to a growing data function with high visibility and ownership</li></ul><p>Why This Role Stands Out</p><ul><li>High ownership: You’ll build systems from the ground up, not just maintain them</li><li>Small team, big impact: Work directly with senior leadership and decision-makers</li><li>Growth opportunity: The team is expanding—this role can evolve quickly</li><li>Flexibility within intensity: While this is a high-performance environment, there’s trust and flexibility when needed</li></ul>
<p>Robert Half is seeking a Data Engineer to design, build, and maintain enterprise data infrastructure and analytics platforms. This role will serve as the technical owner of data architecture, ensuring data quality, governance, and accessibility across the organization.</p><p>This is a highly visible role supporting leadership and business teams by enabling reliable, data-driven decision-making through scalable data solutions and modern analytics tools.</p><p><br></p><p><strong>Job Responsibilities</strong></p><ul><li>Design and implement enterprise data architecture, including data models and integration patterns to establish a single source of truth </li><li>Build and manage analytics platforms to support reporting and business intelligence initiatives </li><li>Develop and maintain high-impact dashboards using Power BI or similar tools for leadership and operational teams </li><li>Design and build automated ETL/ELT pipelines across multiple systems and data sources </li><li>Define and enforce data governance standards, including metric definitions, data quality rules, and access controls </li><li>Monitor and optimize data pipeline performance, including troubleshooting failures and implementing automated error handling </li><li>Investigate and resolve data quality issues (e.g., duplicates, sync failures) and implement proactive monitoring solutions </li><li>Enable self-service analytics by creating user-friendly data models and supporting end users with training and documentation </li><li>Ensure compliance with data security and regulatory requirements, including proper data handling and access controls </li><li>Partner with IT leadership to recommend tools, technologies, and best practices to enhance data capabilities </li></ul>
We are looking for a skilled Data Engineer to join our team in Carmel, Indiana. In this long-term contract role, you will design, build, and optimize data pipelines and systems to support business needs. The ideal candidate will bring expertise in data engineering tools and frameworks, along with a passion for solving complex challenges.<br><br>Responsibilities:<br>• Develop and maintain robust data pipelines using modern frameworks and tools.<br>• Implement ETL processes to ensure accurate and efficient data transformation.<br>• Optimize data storage and retrieval systems for performance and scalability.<br>• Collaborate with cross-functional teams to understand data requirements and deliver solutions.<br>• Utilize Apache Spark and Hadoop for large-scale data processing.<br>• Work with Databricks to streamline data workflows and enhance analytics.<br>• Apply machine learning techniques using tools like scikit-learn and Pandas.<br>• Integrate Kafka for real-time data streaming and processing.<br>• Analyze and troubleshoot data-related issues to ensure system reliability.<br>• Document processes and workflows to support future development and maintenance.
<p>We are looking for an experienced Data Engineer to join our team in Cleveland, Ohio. In this role, you will design, implement, and optimize data solutions that support business intelligence and analytics needs. If you have a passion for working with cutting-edge technologies and thrive in a fast-paced environment, this opportunity is for you.</p><p><br></p><p>Responsibilities:</p><p>• Develop and refine data models to ensure optimal performance and scalability.</p><p>• Design and implement data warehouse solutions for managing structured and unstructured data.</p><p>• Create and maintain data integration processes to support analytics and data-driven applications.</p><p>• Establish robust data quality and validation protocols to guarantee accuracy and consistency.</p><p>• Collaborate with business intelligence teams and stakeholders to gather requirements and deliver tailored solutions.</p><p>• Monitor and address issues within data pipelines, including performance bottlenecks and system errors.</p><p>• Research and adopt emerging technologies and best practices to enhance data engineering capabilities.</p>
We are looking for a skilled Data Engineer to join our team on a long-term contract basis in Houston, Texas. In this role, you will design, build, and manage data pipelines and systems to support business operations and decision-making processes. This position offers an exciting opportunity to work with cutting-edge technologies within the energy and natural resources sector.<br><br>Responsibilities:<br>• Develop and maintain scalable data pipelines to efficiently process large volumes of data.<br>• Collaborate with cross-functional teams to gather requirements and design data solutions that meet business needs.<br>• Implement and optimize ETL processes to ensure the accuracy and reliability of data flows.<br>• Utilize technologies such as Apache Spark, Hadoop, and Kafka to manage and process data streams.<br>• Monitor and troubleshoot data systems to ensure optimal performance and reliability.<br>• Perform data integration from multiple sources to create unified datasets for analysis.<br>• Ensure data security and compliance with organizational and industry standards.<br>• Continuously evaluate and adopt new tools and technologies to enhance data engineering practices.<br>• Provide technical guidance and mentorship to entry-level team members as needed.
<p>We are supporting our client in hiring a Product Data Engineer who will take full ownership of their product information environment. This role centers on managing their PIM solution (Salsify), improving data structures, and building automated, API‑driven integrations that ensure product data is clean, scalable, and synchronized across platforms.</p><p>This position will be deeply involved in a major product‑data overhaul, including cleanup, restructuring, and long‑term system improvements. The ideal candidate is someone who enjoys solving data problems, building automated workflows, and improving the reliability of product information across systems.</p><p><br></p><p> Key Responsibilities</p><p>Product Data Platform Ownership</p><ul><li>Act as the primary administrator for the PIM platform</li><li>Define and maintain product attributes, hierarchies, and data relationships</li><li>Create validation rules, formulas, and workflows to enforce data standards</li><li>Manage permissions, governance, and platform configuration</li><li>Troubleshoot issues related to imports, exports, and publishing</li></ul><p>Integrations & Automation</p><ul><li>Manage integrations between the PIM and internal/external systems (eCommerce, retail, etc.)</li><li>Build and support API‑based data flows with a focus on reliability and scale</li><li>Develop automation using scripting (Python preferred)</li><li>Support event‑driven or automated pipelines to reduce manual work</li><li>Monitor integration performance and proactively resolve failures</li></ul><p>Product Data Improvements</p><ul><li>Contribute to a large‑scale product data cleanup and restructuring effort</li><li>Identify gaps in current data models and workflows</li><li>Partner with cross‑functional teams to define scalable data standards</li><li>Improve system design to support long‑term growth</li></ul><p>Channel Syndication</p><ul><li>Manage product data distribution to digital and retail channels</li><li>Ensure data meets 
channel‑specific requirements</li><li>Troubleshoot publishing issues and improve success rates</li><li>Support product launches and updates across channels</li></ul><p>Data Governance & Quality</p><ul><li>Establish naming conventions, validation rules, and governance standards</li><li>Define and track data quality KPIs (accuracy, completeness, timeliness)</li><li>Utilize or support data governance tools</li><li>Work with business teams to improve data accountability</li></ul><p>Reporting & Metrics</p><ul><li>Build dashboards and reports on data quality and system performance</li><li>Provide insights to leadership to support decision‑making</li><li>Track syndication outcomes and operational metrics</li></ul><p>Operational Support</p><ul><li>Handle day‑to‑day platform usage, enhancements, and issue resolution</li><li>Prioritize incoming requests and tickets</li><li>Ensure stability and reliability of product data operations</li></ul><p><br></p>
We are looking for a skilled Data Engineer to join our team in Wyoming, Michigan. This contract-to-permanent role offers an exciting opportunity to design, manage, and optimize data architecture and engineering solutions across a dynamic healthcare organization. The ideal candidate will play a key role in ensuring efficient data governance and infrastructure performance while collaborating with cross-functional teams.<br><br>Responsibilities:<br>• Develop and maintain robust data architectures and frameworks, including relational and graph databases, to meet business objectives.<br>• Create and manage data pipelines to extract, transform, and load data from various sources into data warehouses.<br>• Ensure data governance policies are implemented and monitored, including retention and backup protocols.<br>• Collaborate with teams across departments to translate business requirements into technical specifications.<br>• Monitor and optimize the performance of data assets, identifying opportunities for improvement.<br>• Design scalable and secure data solutions using cloud-based platforms like AWS and Microsoft Azure.<br>• Implement advanced tools and technologies, such as AI, to enhance data analytics and processing capabilities.<br>• Mentor and support team members by sharing technical expertise and providing guidance.<br>• Establish key performance indicators (KPIs) to measure database performance and drive continuous improvement.<br>• Stay up to date with emerging trends and advancements in data engineering and architecture.
<ul><li>Design, develop, and optimize data pipelines using Azure Data Services (Azure Data Factory, Azure Data Lake Storage, Azure Synapse).</li><li>Build and maintain scalable ETL/ELT workflows using Databricks (Spark, PySpark, Delta Lake).</li><li>Implement and manage data orchestration and dependency management using Dagster or similar tools.</li><li>Partner with analytics, data science, and product teams to ensure reliable, high-quality data availability.</li><li>Optimize data models and storage strategies for performance, scalability, and cost efficiency.</li><li>Ensure data quality, observability, and reliability through monitoring, logging, and automated validation.</li><li>Support CI/CD pipelines and infrastructure-as-code practices for data platforms.</li><li>Enforce data security, governance, and compliance best practices within Azure.</li></ul>
<p>A manufacturing and distribution company is looking for a Data Engineer with 3+ years of experience to join a dynamic team in Oklahoma City, Oklahoma. In this role, you will play a crucial part in developing, optimizing, and maintaining the data infrastructure that supports analytics, business intelligence initiatives, and data-driven decision-making, using Snowflake, Matillion, and other tools. The position is in-office to work closely with the team. No third parties, please.</p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Design, develop, and maintain scalable data pipelines to support data integration and real-time processing.</p><p>• Implement and manage data warehouse solutions, with a strong focus on Snowflake architecture and optimization.</p><p>• Write efficient and effective scripts and tools using Python to automate workflows and enhance data processing capabilities.</p><p>• Work with SQL Server to design, query, and optimize relational databases in support of analytics and reporting needs.</p><p>• Monitor and troubleshoot data pipelines, resolving any performance or reliability issues.</p><p>• Ensure data quality, governance, and integrity by implementing and enforcing best practices.</p>
We are looking for a skilled Data Engineer to join our team in Houston, Texas. This long-term contract position offers an exciting opportunity to work in the manufacturing industry, leveraging your expertise in data processing and engineering. You will play a pivotal role in designing, implementing, and optimizing data solutions to support critical business operations.<br><br>Responsibilities:<br>• Develop and maintain scalable data pipelines using tools such as Apache Spark and Python.<br>• Design efficient ETL processes to extract, transform, and load data from various sources.<br>• Collaborate with cross-functional teams to understand data requirements and deliver actionable insights.<br>• Implement and manage big data solutions using Apache Hadoop and Apache Kafka.<br>• Monitor and optimize the performance of data systems to ensure reliability and scalability.<br>• Ensure data quality and integrity through rigorous testing and validation processes.<br>• Troubleshoot and resolve issues related to data pipelines and infrastructure.<br>• Maintain documentation for data workflows and processes to ensure clarity and consistency.<br>• Stay updated on emerging technologies and best practices in data engineering to continuously improve systems.
We are looking for a skilled Data Engineer to join our team in Houston, Texas. This contract position offers an exciting opportunity to leverage your expertise in data processing and analytics within the dynamic energy and natural resources industry. You will play a pivotal role in designing, implementing, and optimizing data solutions to support critical business operations.<br><br>Responsibilities:<br>• Develop and maintain scalable data pipelines using Apache Spark, Python, and ETL processes.<br>• Design and implement data storage solutions utilizing Apache Hadoop for efficient data management.<br>• Build real-time data streaming architectures with Apache Kafka to support operational needs.<br>• Optimize data workflows to ensure high performance and reliability across systems.<br>• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.<br>• Perform data quality checks and validation to ensure accuracy and consistency of datasets.<br>• Troubleshoot and resolve technical issues related to data processing and integration.<br>• Document processes and workflows to ensure knowledge sharing and operational transparency.<br>• Monitor and improve system performance, ensuring the infrastructure meets business demands.
<p><strong>Data Engineer – CRM Integration (Hybrid in San Fernando Valley)</strong></p><p><strong>Location:</strong> San Fernando Valley (Hybrid – 3x per week onsite)</p><p><strong>Compensation:</strong> $140K–$170K annual base salary</p><p><strong>Job Type:</strong> Full Time, Permanent</p><p><strong>Overview:</strong></p><p>Join our growing technology team as a Data Engineer with a focus on CRM data integration. This permanent role will play a key part in supporting analytics and business intelligence across our organization. The position offers a collaborative hybrid environment and highly competitive compensation.</p><p><strong>Responsibilities:</strong></p><ul><li>Design, develop, and optimize data pipelines and workflows integrating multiple CRM systems (Salesforce, Dynamics, HubSpot, NetSuite, or similar).</li><li>Build and maintain scalable data architectures for analytics and reporting.</li><li>Manage and advance CRM data integrations, including real-time and batch processing solutions.</li><li>Deploy ML models, automate workflows, and support model serving using Azure Databricks (MLflow experience preferred).</li><li>Utilize Azure Synapse Analytics & Pipelines for high-volume data management.</li><li>Write advanced Python and Spark SQL code for ETL, transformation, and analytics.</li><li>Collaborate with BI and analytics teams to deliver actionable insights using Power BI.</li><li>Support streaming solutions with technologies like Kafka, Event Hubs, and Spark Streaming.</li></ul><p><br></p>
We are looking for a highly skilled Senior Google Cloud Engineer to join our team on a long-term contract basis in Salt Lake City, Utah. In this role, you will be responsible for designing, implementing, and maintaining secure, reliable, and scalable cloud infrastructure to support critical business operations. Your contributions will help ensure that teams across the organization can operate efficiently and deliver impactful solutions. If you are passionate about leveraging your expertise in Google Cloud to drive innovation, we want to hear from you.<br><br>Responsibilities:<br>• Design and manage production-grade workloads on Google Cloud, ensuring security, reliability, and cost-effectiveness.<br>• Develop and enforce infrastructure standards for identity, networking, data protection, and secrets management.<br>• Build and maintain automated CI/CD pipelines to streamline infrastructure provisioning, testing, and deployment.<br>• Implement Infrastructure as Code (IaC) solutions using tools like Terraform to enhance scalability and repeatability.<br>• Troubleshoot incidents, conduct root-cause analysis, and refine system monitoring and alerting mechanisms.<br>• Enhance and maintain private connectivity, firewall policies, and least-privilege access for secure cloud environments.<br>• Collaborate with cross-functional teams to review and optimize designs, threat models, and operational readiness.<br>• Mentor team members on cloud best practices, operational excellence, and sustainable on-call strategies.<br>• Improve visibility into cloud costs, system performance, and logging to support organizational goals.<br>• Participate in post-incident reviews to drive continuous improvement of backup, recovery, and capacity management strategies.
<p>We are seeking a Cloud Engineer to support the design, deployment, and maintenance of cloud infrastructure across our environments. This role focuses on building reliable, secure, and scalable cloud solutions while assisting development and IT teams with day‑to‑day cloud operations.</p><p><strong>Responsibilities</strong></p><ul><li>Deploy, configure, and manage cloud resources in AWS/Azure/GCP.</li><li>Maintain CI/CD pipelines and automate infrastructure tasks using tools such as Terraform, CloudFormation, or similar.</li><li>Monitor system performance, reliability, cost, and security posture across cloud workloads.</li><li>Support containerized applications (Docker/Kubernetes) and troubleshoot environment issues.</li><li>Implement cloud security best practices, including IAM, logging, patching, and network controls.</li><li>Assist with incident response, cloud migrations, and environment optimization.</li><li>Collaborate with engineering and IT teams to support application deployments and cloud operations.</li></ul><p><br></p>
<p>We are proactively building a network of experienced Cloud Engineers for upcoming consulting and contract opportunities with enterprise and high-growth technology organizations. These roles involve designing, implementing, and optimizing cloud infrastructure that supports scalable applications, modern DevOps practices, and secure enterprise environments.</p><p><br></p><p>The Cloud Engineer will be responsible for building and maintaining secure, scalable cloud environments across public cloud platforms. This role works closely with DevOps teams, developers, and infrastructure engineers to implement automation, improve system reliability, and ensure cloud resources are optimized for performance and cost efficiency.</p><p><strong>Key Responsibilities</strong></p><ul><li>Design, implement, and maintain cloud infrastructure across modern public cloud platforms.</li><li>Automate infrastructure provisioning and deployment using infrastructure-as-code tools.</li><li>Monitor system performance, reliability, and security across cloud environments.</li><li>Collaborate with development and DevOps teams to support CI/CD pipelines and application deployments.</li><li>Implement best practices around cloud security, networking, and identity management.</li><li>Troubleshoot infrastructure issues and optimize environments for scalability and performance.</li></ul>
<p><strong>Cloud Engineer</strong></p><p>We are seeking a talented <strong>Cloud Engineer</strong> to join our infrastructure team. This role is ideal for someone who enjoys building cloud-based solutions, optimizing deployments, and supporting scalable, secure environments. The ideal candidate will have strong problem-solving abilities, excellent communication skills, and a solid foundation in cloud architecture with room to grow into more advanced engineering responsibilities.</p><p><strong>Responsibilities</strong></p><ul><li>Deploy, configure, and manage cloud resources across Azure, AWS, and/or GCP</li><li>Implement cloud security controls including IAM/RBAC permissions, encryption, policies, and MFA</li><li>Build and maintain Infrastructure-as-Code templates using Terraform, ARM/Bicep, or CloudFormation</li><li>Support CI/CD pipelines and automated deployments for application and infrastructure releases</li><li>Monitor cloud performance, availability, cost usage, and alerts using native tools</li><li>Troubleshoot cloud networking issues including firewalls, VNETs/VPCs, routing, gateways, and load balancers</li><li>Support containerized workloads using Docker and Kubernetes</li><li>Collaborate with developers, DevOps teams, and systems administrators on cloud projects</li><li>Document cloud architectures, procedures, and operational guidelines</li><li>Assist with cloud migrations, modernization initiatives, and optimization efforts</li></ul><p><br></p>
<p>We are looking for a Cloud Engineer to join our team in the Fort Myers/Naples area of Florida. This is a contract-to-permanent position, fully onsite Monday to Friday.</p><p><br></p><p>Responsibilities:</p><ul><li>Review and design both high-level and detailed architecture.</li><li>Understand HERC business needs and identify necessary improvements in the AWS architecture to align with the AWS Well-Architected Framework and best practices.</li><li>Design, implement, and manage scalable, secure, and reliable cloud infrastructure on AWS.</li><li>Protect information, systems, and assets while delivering business value by advising best practices for the creation of a secure cloud environment.</li><li>Ensure that AWS architecture aligns with business needs and monitor performance for effective use of AWS resources.</li><li>Evaluate AWS products and services for impact on architecture and recommend adoption as necessary.</li><li>Advise application teams on deployment options and oversee provisioning of AWS architecture components.</li><li>Ensure development teams follow best practices and architectural standards.</li><li>Monitor and optimize cloud resources for performance, cost, and security.</li></ul>
We are looking for an experienced Cloud Engineer to join our team in Carmel, Indiana. In this long-term contract role, you will play a pivotal part in designing, implementing, and maintaining cloud-based solutions that drive business efficiency and scalability. If you have a strong background in cloud technologies and a passion for optimizing systems, we encourage you to apply.<br><br>Responsibilities:<br>• Develop and manage cloud infrastructure using AWS technologies, ensuring high availability and scalability.<br>• Implement and maintain automated deployment processes using Ansible.<br>• Configure and optimize Amazon EC2 instances and Auto Scaling groups for performance and cost efficiency.<br>• Monitor and troubleshoot cloud environments to ensure seamless operation.<br>• Collaborate with cross-functional teams to design cloud architectures tailored to business needs.<br>• Conduct regular system performance evaluations and recommend improvements.<br>• Ensure compliance with security protocols and best practices in cloud operations.<br>• Document processes and provide training to team members as needed.<br>• Assist in disaster recovery planning and execution for cloud-based systems.