<ul><li>Design, develop, and optimize data pipelines using Azure Data Services (Azure Data Factory, Azure Data Lake Storage, Azure Synapse).</li><li>Build and maintain scalable ETL/ELT workflows using Databricks (Spark, PySpark, Delta Lake).</li><li>Implement and manage data orchestration and dependency management using Dagster or similar tools.</li><li>Partner with analytics, data science, and product teams to ensure reliable, high-quality data availability.</li><li>Optimize data models and storage strategies for performance, scalability, and cost efficiency.</li><li>Ensure data quality, observability, and reliability through monitoring, logging, and automated validation.</li><li>Support CI/CD pipelines and infrastructure-as-code practices for data platforms.</li><li>Enforce data security, governance, and compliance best practices within Azure.</li></ul>
We are looking for a skilled Data Engineer to join our team in Wyoming, Michigan. This Contract to permanent role offers an exciting opportunity to design, manage, and optimize data architecture and engineering solutions across a dynamic healthcare organization. The ideal candidate will play a key role in ensuring efficient data governance and infrastructure performance while collaborating with cross-functional teams.<br><br>Responsibilities:<br>• Develop and maintain robust data architectures and frameworks, including relational and graph databases, to meet business objectives.<br>• Create and manage data pipelines to extract, transform, and load data from various sources into data warehouses.<br>• Ensure data governance policies are implemented and monitored, including retention and backup protocols.<br>• Collaborate with teams across departments to translate business requirements into technical specifications.<br>• Monitor and optimize the performance of data assets, identifying opportunities for improvement.<br>• Design scalable and secure data solutions using cloud-based platforms like AWS and Microsoft Azure.<br>• Implement advanced tools and technologies, such as AI, to enhance data analytics and processing capabilities.<br>• Mentor and support team members by sharing technical expertise and providing guidance.<br>• Establish key performance indicators (KPIs) to measure database performance and drive continuous improvement.<br>• Stay up to date with emerging trends and advancements in data engineering and architecture.
<p>We are looking for an experienced Data Engineer to design and support data exchange solutions that connect external business partners with internal systems. This role is primarily remote, with occasional work from various office locations; candidates must live in North Carolina, within two hours of Greensboro, NC. This role focuses on building reliable integration processes, transforming structured files and API-based data, and ensuring critical information is available for reporting and operational use. The ideal candidate brings strong technical depth in data movement and troubleshooting, along with a practical understanding of manufacturing and supply chain workflows.</p><p><br></p><p>Responsibilities:</p><p>• Build and maintain business-to-business data interfaces that onboard new partner organizations and align incoming data with internal database structures.</p><p>• Develop automated workflows that ingest, transform, validate, and deliver data using file-based exchanges, APIs, and structured transaction formats such as EDI and X12.</p><p>• Configure and manage end-to-end integration processes across system interfaces, including flat-file handling, file sharing, and reporting-related data movement.</p><p>• Lead data transformation efforts through the full lifecycle by designing solutions, testing functionality, deploying processes, and stabilizing production performance.</p><p>• Investigate integration failures or data quality issues, identify root causes, and implement corrective actions to restore reliable processing.</p><p>• Partner with business intelligence and reporting teams to provide access to accurate, usable data sources that support analysis and operational decision-making.</p><p>• Apply manufacturing and supply chain process knowledge to structure data flows that support purchasing, components, orders, and assembly-related transactions.</p><p>• Use available tools and platforms to execute integration projects independently, including extracting data 
from enterprise applications and translating it into usable formats.</p><p>• Create scalable data pipelines that enable customer and order transactions to move through systems with minimal manual intervention.</p>
<p>A manufacturing and distribution company is looking for a Data Engineer with 3+ years of experience to join a dynamic team in Oklahoma City, Oklahoma. In this role, you will be a key contributor in developing, optimizing, and maintaining the data infrastructure that supports analytics, business intelligence initiatives, and data-driven decision-making using Snowflake, Matillion, and other tools. This position is in-office to allow close collaboration with the team. No third parties, please.</p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Design, develop, and maintain scalable data pipelines to support data integration and real-time processing.</p><p>• Implement and manage data warehouse solutions, with a strong focus on Snowflake architecture and optimization.</p><p>• Write efficient and effective scripts and tools using Python to automate workflows and enhance data processing capabilities.</p><p>• Work with SQL Server to design, query, and optimize relational databases in support of analytics and reporting needs.</p><p>• Monitor and troubleshoot data pipelines, resolving any performance or reliability issues.</p><p>• Ensure data quality, governance, and integrity by implementing and enforcing best practices.</p>
<p>We are supporting our client in hiring a Product Data Engineer who will take full ownership of their product information environment. This role centers on managing their PIM solution (Salsify), improving data structures, and building automated, API‑driven integrations that ensure product data is clean, scalable, and synchronized across platforms.</p><p>This position will be deeply involved in a major product‑data overhaul, including cleanup, restructuring, and long‑term system improvements. The ideal candidate is someone who enjoys solving data problems, building automated workflows, and improving the reliability of product information across systems.</p><p><br></p><p> Key Responsibilities</p><p>Product Data Platform Ownership</p><ul><li>Act as the primary administrator for the PIM platform</li><li>Define and maintain product attributes, hierarchies, and data relationships</li><li>Create validation rules, formulas, and workflows to enforce data standards</li><li>Manage permissions, governance, and platform configuration</li><li>Troubleshoot issues related to imports, exports, and publishing</li></ul><p>Integrations & Automation</p><ul><li>Manage integrations between the PIM and internal/external systems (eCommerce, retail, etc.)</li><li>Build and support API‑based data flows with a focus on reliability and scale</li><li>Develop automation using scripting (Python preferred)</li><li>Support event‑driven or automated pipelines to reduce manual work</li><li>Monitor integration performance and proactively resolve failures</li></ul><p>Product Data Improvements</p><ul><li>Contribute to a large‑scale product data cleanup and restructuring effort</li><li>Identify gaps in current data models and workflows</li><li>Partner with cross‑functional teams to define scalable data standards</li><li>Improve system design to support long‑term growth</li></ul><p>Channel Syndication</p><ul><li>Manage product data distribution to digital and retail channels</li><li>Ensure data meets 
channel‑specific requirements</li><li>Troubleshoot publishing issues and improve success rates</li><li>Support product launches and updates across channels</li></ul><p>Data Governance & Quality</p><ul><li>Establish naming conventions, validation rules, and governance standards</li><li>Define and track data quality KPIs (accuracy, completeness, timeliness)</li><li>Utilize or support data governance tools</li><li>Work with business teams to improve data accountability</li></ul><p>Reporting & Metrics</p><ul><li>Build dashboards and reports on data quality and system performance</li><li>Provide insights to leadership to support decision‑making</li><li>Track syndication outcomes and operational metrics</li></ul><p>Operational Support</p><ul><li>Handle day‑to‑day platform usage, enhancements, and issue resolution</li><li>Prioritize incoming requests and tickets</li><li>Ensure stability and reliability of product data operations</li></ul><p><br></p>
<p>Position Overview</p><p>We are seeking a delivery‑focused Data Automation Engineer to design and implement innovative automation solutions across a Microsoft Azure‑based data analytics platform. This role partners closely with engineering teams and stakeholders to translate business requirements into scalable data engineering and AI‑enabled solutions.</p><p>The ideal candidate is hands‑on with Azure Data Factory, Synapse Pipelines, Apache Spark, Python, and SQL, and brings experience building reliable ETL pipelines across SQL and NoSQL environments. This role emphasizes performance optimization, automation, and proactive data quality within Agile DevOps delivery models.</p><p><br></p><p>Key Responsibilities</p><p>Data Engineering & Automation</p><ul><li>Develop high‑performance data pipelines using Azure Data Factory, Synapse Pipelines, Spark Notebooks, Python, and SQL.</li><li>Design ETL workflows supporting advanced analytics, reporting, and AI/ML use cases.</li><li>Implement data migration, integrity, quality, metadata, and security controls across pipelines.</li><li>Monitor, troubleshoot, and optimize pipelines for availability, scalability, and performance.</li></ul><p>Performance Testing & Optimization</p><ul><li>Execute ETL performance testing and validate load performance against benchmarks.</li><li>Analyze pipeline runtime, throughput, latency, and resource utilization.</li><li>Support tuning activities (e.g., query optimization, partitioning, indexing).</li><li>Validate data completeness and consistency after high‑volume processing.</li></ul><p>Platform Collaboration & DevOps Support</p><ul><li>Collaborate with DevOps and infrastructure teams to optimize compute, memory, and scaling.</li><li>Maintain versioning and configuration control across environments.</li><li>Support production, testing, development, and integration environments.</li><li>Actively participate in Agile delivery processes including Program Increment planning.</li></ul>
<p>We are seeking a talented and motivated Python Data Engineer to join our global team. In this role, you will be instrumental in expanding and optimizing our data assets to enhance analytical capabilities across the organization. You will collaborate closely with traders, analysts, researchers, and data scientists to gather requirements and deliver scalable data solutions that support critical business functions.</p><p><br></p><p>Responsibilities</p><ul><li>Develop modular and reusable Python components to connect external data sources with internal systems and databases.</li><li>Work directly with business stakeholders to translate analytical requirements into technical implementations.</li><li>Ensure the integrity and maintainability of the central Python codebase by adhering to existing design standards and best practices.</li><li>Maintain and improve the in-house Python ETL toolkit, contributing to the standardization and consolidation of data engineering workflows.</li><li>Partner with global team members to ensure efficient coordination and delivery.</li><li>Actively participate in internal Python development community and support ongoing business development initiatives with technical expertise.</li></ul>
<p><strong>DevOps Engineer</strong></p><p>We are seeking a motivated <strong>DevOps Engineer</strong> to enhance automation, streamline deployments, and support modern cloud-native infrastructure. This role is ideal for someone who enjoys improving system reliability, optimizing pipelines, and enabling faster development workflows.</p><p><strong>Responsibilities</strong></p><ul><li>Build, maintain, and optimize CI/CD pipelines using tools like Azure DevOps, GitHub Actions, or Jenkins</li><li>Support containerized environments using Docker and Kubernetes</li><li>Manage infrastructure automation using Terraform, Helm, Ansible, or Bicep</li><li>Monitor application performance, system uptime, and deployment health</li><li>Troubleshoot build failures, pipeline issues, infrastructure drift, and deployment errors</li><li>Manage configuration management across multiple environments</li><li>Collaborate with developers and cloud engineers during releases and application migrations</li><li>Implement logging, monitoring, and alerting solutions</li><li>Maintain documentation for deployments, pipelines, and CI/CD procedures</li></ul><p><br></p>
<p>DevOps Engineer</p><p>We’re looking for a DevOps Engineer who enjoys automating all the things and making the software development lifecycle run smoother, faster, and with fewer “why is this broken?” moments. You’ll support and improve CI/CD pipelines, development environments, and SDLC tooling across IT and Engineering.</p><p><br></p><p>What You’ll Do</p><ul><li>Build, improve, and maintain CI/CD pipelines and DevOps tooling</li><li>Collaborate with IT and Engineering to streamline SDLC processes (less friction, more shipping)</li><li>Administer development environments, automation, and build tools</li><li>Create clear documentation and training materials so others don’t have to guess</li></ul><p>What You Bring</p><ul><li>Hands-on experience designing and supporting CI/CD systems</li><li>Familiarity with tools like Git, containers, infrastructure-as-code, build, and test frameworks</li><li>Strong communication skills (you can explain complex things without heavy sighing)</li><li>Ability to work independently, learn quickly, and keep things running smoothly</li></ul><p>Education & Experience</p><ul><li>Bachelor’s degree in Computer Science, Software Engineering, or similar</li><li>4+ years in DevOps, software engineering, or related roles</li><li>2+ years rolling out CI/CD pipelines in real-world environments</li></ul><p><br></p>
<p>Position: DevOps Engineer (Mobile-First)</p><p>Location: Remote - Full Time | Direct Hire</p><p>Salary: $140,000 - 160,000 base annual salary + bonus + excellent benefits</p><p>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. ***</p><p>DevOps Engineer</p><p>We’re building something entirely new—and we’re doing it the right way.</p><p>This is a rare opportunity to join a brand-new digital transformation team inside a well‑established, financially strong enterprise. After decades of success with a traditional operating model, the organization is launching a remote‑first, mobile‑first digital platform designed to unify multiple services into a seamless consumer experience.</p><p>The product is 0→1, greenfield, and pre‑launch, with executive sponsorship and long‑term funding already secured. The pace, ownership, and innovation feel like a startup—without startup risk.</p><p>Why This Role Is Exciting</p><p>• Startup‑like environment inside a proven, stable enterprise</p><p>• New digital department with new leadership, tooling, and architecture</p><p>• True greenfield DevOps work—no legacy rebuilds</p><p>• Direct impact on a high‑visibility, consumer‑facing platform</p><p>What You’ll Work On</p><p>• Design, build, and support cloud infrastructure for a mobile‑first platform</p><p>• Build and maintain CI/CD pipelines for modern microservices</p><p>• Support containerized workloads using Docker and Kubernetes</p><p>• Implement Infrastructure as Code (IaC) for consistent environments</p><p>• Improve deployment reliability, scalability, and developer experience</p><p>Technical Environment</p><p>• Cloud: AWS (primary for the digital platform), cloud‑flexible architecture</p><p>• Containers: Docker, Kubernetes</p><p>• CI/CD: GitHub Actions, Jenkins</p><p>• IaC: Terraform, CloudFormation</p><p>• Observability: Prometheus, Grafana, 
CloudWatch, Datadog / ELK</p><p>• Security: IAM, RBAC, secrets management</p><p>• Scripting: Python, Bash</p><p>• Application Context: Flutter (web, mobile), NestJS, Postgres, API</p><p>What We’re Looking For</p><p>• 3–6 years of experience in DevOps, Cloud Engineering, or SRE</p><p>• Hands‑on AWS infrastructure experience</p><p>• Working knowledge of Kubernetes and containerized systems</p><p>• Experience building or supporting CI/CD pipelines</p><p>• Familiarity with Infrastructure as Code</p><p>• Strong collaboration skills and an ownership mindset</p><p>• Comfortable operating in ambiguity and building from scratch</p><p>What Success Looks Like</p><p>• Faster, safer deployments enabled by your automation</p><p>• Scalable, observable, and secure environments</p><p>• DevOps seen as a partner—not a bottleneck</p><p><br></p><p>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. Also, you may contact me by office: 515-303-4654. Or one click apply on our Robert Half website. No third party inquiries please. Our client cannot provide sponsorship and cannot hire C2C. ***</p>
<p>Our company is seeking a skilled and collaborative DevOps Engineer to join our technology team in St. Louis, MO. This role offers the opportunity to drive automation, streamline workflows, and optimize infrastructure for high-performance applications.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, deploy, and manage CI/CD pipelines to support agile development practices</li><li>Automate infrastructure provisioning, monitoring, and scaling in cloud or hybrid environments</li><li>Collaborate with development and IT teams to implement DevOps best practices</li><li>Troubleshoot and resolve issues in development, test, and production environments</li><li>Stay current with emerging DevOps technologies, tools, and methodologies</li></ul><p><br></p>
<p><strong>Location:</strong> Hybrid — <em>2 days per month on-site in New Hampshire</em></p><p><strong>Employment Type:</strong> Full-Time</p><p><strong>About the Role</strong></p><p>We’re seeking a talented <strong>Software Engineer</strong> with deep experience in <strong>Oracle APEX</strong> and <strong>PL/SQL</strong>. You should also have a strong background integrating third-party applications like <strong>Salesforce</strong>. This role is ideal for someone who enjoys collaborating with cross-functional teams, designing scalable solutions, and enhancing business systems through thoughtful engineering and integrations.</p><p><br></p><p>As part of our team, you’ll play a key role in building and maintaining applications that drive critical business workflows. You’ll leverage your Oracle APEX expertise to architect solutions and your integration experience to ensure smooth data flows between platforms.</p><p>This is a <strong>hybrid position</strong>, requiring <strong>two days per month on-site in New Hampshire</strong> for team collaboration, planning, or project workshops.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design, develop, and maintain applications using <strong>Oracle Application Express (APEX)</strong>.</li><li>Build, optimize, and troubleshoot <strong>integrations with third-party systems</strong>, including Salesforce and other enterprise platforms.</li><li>Develop APIs, data pipelines, and middleware solutions to support seamless cross-system communication.</li><li>Collaborate with business stakeholders to gather requirements and translate them into technical specifications.</li><li>Ensure application performance, security, and reliability through best practices.</li><li>Participate in code reviews, testing, deployment, and documentation of software solutions.</li><li>Support ongoing enhancements, bug fixes, and system improvements.</li></ul><p><strong>Required Qualifications</strong></p><ul><li><strong>Hands-on 
experience with Oracle APEX</strong> development.</li><li>Proven experience designing and implementing <strong>Salesforce integrations</strong> (REST/SOAP APIs, middleware tools, or direct platform integration).</li><li>Strong proficiency with <strong>SQL, PL/SQL</strong>, and Oracle database structures.</li><li>Experience working with APIs, integration frameworks, and data transformation workflows.</li><li>Solid understanding of software development best practices, including version control, testing, and documentation.</li><li>Excellent analytical, troubleshooting, and communication skills.</li><li>Ability to work in a hybrid environment and be on-site in New Hampshire <strong>twice per month</strong>.</li></ul><p><strong>Preferred Qualifications</strong></p><ul><li>Experience with additional integration platforms (e.g., MuleSoft, Boomi, Workato).</li><li>Background working in enterprise environments or supporting mission-critical systems.</li><li>Familiarity with Agile methodologies.</li><li>Knowledge of secure coding practices and data governance.</li></ul>
<p>We are seeking a Software Engineer to support the development of digital manufacturing solutions in a plant environment. This role involves building and implementing applications that enhance automation, data visibility, and overall operational efficiency.</p><p>Key responsibilities include developing software using technologies such as Python, C#, and .NET; supporting real-time monitoring and data integration across systems; and partnering with production and leadership teams to identify and implement process improvements. This individual will also help modernize legacy systems and support the adoption of Industry 4.0 initiatives.</p><p>The ideal candidate brings strong technical skills, a collaborative mindset, and the ability to drive projects from concept through implementation while working cross-functionally with operations and IT teams.</p>
<p><strong>Software Engineer</strong></p><p>On-site | Austin, TX | Contract</p><p><br></p><p><strong>Responsibilities:</strong></p><ul><li>Design, develop, test, and maintain software applications and systems</li><li>Write clean, scalable, and efficient code following best practices and coding standards</li><li>Collaborate with product managers, designers, and other engineers to define requirements and solutions</li><li>Participate in code reviews to ensure quality, performance, and maintainability</li><li>Troubleshoot, debug, and resolve software defects and production issues</li><li>Build and integrate APIs, services, and backend components</li><li>Optimize applications for performance, scalability, and reliability</li><li>Contribute to system architecture, technical design, and documentation</li><li>Follow SDLC processes using Agile, Scrum, or similar methodologies</li><li>Stay current with emerging technologies and continuously improve development practices</li></ul>
We are looking for a skilled Software Engineer to join our dynamic team in Lafayette, Louisiana. In this role, you will contribute to the design, development, and deployment of innovative software solutions, focusing either on ServiceNow module development or full-stack engineering. This position offers the flexibility of working onsite or remotely if based in Louisiana.<br><br>Responsibilities:<br>• Develop and customize applications and modules within the ServiceNow platform.<br>• Design and implement backend and full-stack features using Java, Python, and PostgreSQL.<br>• Ensure software scalability, reliability, and performance through clean coding practices and automated testing.<br>• Enhance development workflows by incorporating automation tools and DevOps methodologies.<br>• Collaborate with cross-functional teams, including product management and QA, to achieve roadmap goals.<br>• Foster coding, testing, and architectural best practices to maintain high engineering standards.<br>• Address performance issues and improve service reliability to meet customer expectations.<br>• Streamline development processes by introducing tools for onboarding and automation.<br>• Actively contribute to the integration of scalable architecture into enterprise-level solutions.<br>• Monitor and optimize software to reduce testing instability and system escalations.
<p>As a Senior Software Developer, you will play a critical role in designing, developing, and maintaining robust software solutions. You will collaborate with cross-functional teams, mentor junior developers, and contribute to the technical direction of projects. Your expertise will ensure the delivery of scalable, secure, and maintainable systems that align with business objectives.</p><p> </p><p><strong>Key Responsibilities:</strong></p><ul><li>Lead the design and development of software applications, ensuring adherence to best practices and coding standards.</li><li>Collaborate with product managers, architects, and stakeholders to gather requirements and define technical specifications.</li><li>Develop scalable and maintainable code using modern programming languages and frameworks.</li><li>Conduct code reviews, provide constructive feedback, and mentor junior team members.</li><li>Troubleshoot and resolve complex software issues, ensuring minimal downtime and impact.</li><li>Optimize application performance, scalability, and security through continuous improvement.</li><li>Contribute to the design and implementation of APIs, microservices, and distributed systems.</li><li>Stay updated on emerging technologies and recommend innovative solutions to enhance development practices.</li><li>Participate in Agile ceremonies, including sprint planning, daily stand-ups, and retrospectives.</li><li>Ensure proper documentation of code, processes, and technical decisions.</li></ul><p><br></p>
<p>We are looking for a talented Software Engineer to join our team in Jacksonville, Florida. This Contract to permanent position involves designing and developing web applications using a combination of Microsoft technologies and modern front-end frameworks. The ideal candidate thrives in collaborative environments, possesses strong problem-solving skills, and is passionate about delivering scalable and user-friendly software solutions.</p><p><br></p><p>Responsibilities:</p><p>• Design, develop, and maintain robust applications using C# and .NET technologies.</p><p>• Create intuitive and responsive front-end interfaces using Angular, JavaScript, and React.js.</p><p>• Develop and optimize SQL queries while managing relational databases, such as SQL Server.</p><p>• Build and consume RESTful APIs to support application integrations.</p><p>• Collaborate with cross-functional teams, including Product, QA, UX, and DevOps, to ensure high-quality deliverables.</p><p>• Participate in the full software development lifecycle, from requirements gathering to deployment and support.</p><p>• Conduct code reviews and contribute to the establishment of engineering best practices.</p><p>• Troubleshoot and resolve application issues across various environments.</p><p>• Maintain clear and comprehensive documentation for code, systems, and development processes.</p><p>• Stay informed about emerging technologies and industry trends to enhance development practices.</p>
We are looking for a skilled Software Engineer to join our team in the Plano/Richardson, Texas area. In this role, you will focus on crafting server-side components while contributing to the development and optimization of AI-driven tools and workflows. This position offers an exciting opportunity to work in an agile environment, drive innovation, and enhance engineering efficiency using cutting-edge technologies. Must be eligible to work in the U.S. or be a Permanent Resident.<br><br>Responsibilities:<br>• Design and implement high-quality server-side components that align with business needs and architectural standards.<br>• Develop and refine AI-driven coding tools, workflows, and infrastructure to improve engineering consistency and efficiency.<br>• Create and optimize coding agents, prompts, and workflows to streamline development processes.<br>• Integrate tools and plugins into a cohesive development pipeline that enhances productivity.<br>• Enable specification-driven development by leveraging AI for idea generation, implementation, and validation.<br>• Build and maintain unit, integration, and automation tests to ensure software reliability and performance.<br>• Troubleshoot and enhance existing applications while collaborating with stakeholders to identify improvement opportunities.<br>• Develop technical designs and models for assigned components, ensuring alignment with project goals.<br>• Participate in code reviews and provide constructive feedback to improve team output.<br>• Promote agile practices and collaborate with internal and external stakeholders to drive project success.
We are looking for a skilled Software Engineer to join our team in Jacksonville, Florida, on a Contract to permanent position. In this role, you will develop and enhance software applications using C++ while automating workflows through batch scripting. This position offers an excellent opportunity to contribute to high-performance systems and solve complex technical challenges.<br><br>Responsibilities:<br>• Design and implement software features and applications using the C++ programming language.<br>• Develop clean, efficient, and maintainable code that meets technical requirements and standards.<br>• Collaborate with cross-functional teams to gather and define technical specifications for projects.<br>• Perform unit testing and support integration testing to ensure software quality and reliability.<br>• Identify and resolve software defects through debugging and troubleshooting techniques.<br>• Automate repetitive tasks and workflows by creating and maintaining batch scripts.<br>• Optimize and maintain existing codebases to improve performance and functionality.<br>• Stay informed about the latest advancements and best practices in C++ development.
<p>We are looking for a Software Engineer to join our team in Albuquerque, New Mexico.</p><p>This role is predominantly remote with some onsite work as needed. Must be local to New Mexico or willing to relocate.</p><p><br></p><p>The Software Engineer will help create dependable software that supports farmers across the globe. This position offers the chance to work in a lean, highly autonomous environment where engineers shape product capabilities from concept through release. You will contribute to customer-facing SaaS applications, strengthen product quality through testing and observability, and evaluate technical approaches that improve efficiency, scalability, and performance.</p><p><br></p><p>Responsibilities:</p><p>• Develop and deliver meaningful enhancements for customer-facing SaaS products, taking features from design through production release.</p><p>• Collaborate with support and customer-facing teams to investigate user challenges and translate feedback into practical engineering improvements.</p><p>• Examine logs, monitoring data, and session insights to identify defects, diagnose root causes, and improve application reliability.</p><p>• Build and maintain automated test coverage across unit, integration, and end-to-end levels to support stable, high-quality releases.</p><p>• Research and apply new tools, frameworks, and architectural patterns that reduce operating costs and improve system performance.</p><p>• Contribute to cloud-hosted services and applications using modern backend and frontend technologies.</p><p>• Strengthen development workflows by improving continuous integration and deployment practices for faster, more reliable delivery.</p><p>• Support systems that interact with containerized environments, cloud infrastructure, and asynchronous messaging patterns where needed.</p><p>• Other duties as needed.</p>
<p>The AI/ML Engineer will design, build, and deploy production-grade machine learning and AI systems that power core products and features. This role bridges cutting-edge research with reliable, scalable engineering, turning prototypes into high-performance services that run 24/7 in production.</p><p> </p><p>Key Responsibilities:</p><ul><li>Design and implement end-to-end ML pipelines: data ingestion, feature engineering, training, evaluation, deployment, and monitoring</li><li>Develop, optimize, and productionize models using PyTorch/TensorFlow/JAX (including LLMs, vision, multimodal, and custom architectures)</li><li>Optimize inference for latency, memory, and cost (quantization, pruning, distillation, TensorRT, ONNX, vLLM)</li><li>Integrate models into backend systems via REST/gRPC APIs, event-driven architectures, or real-time serving</li><li>Own MLOps practices: experiment tracking (MLflow, W&B), model registry, CI/CD for ML, canary deployments, drift detection, and observability</li><li>Collaborate with data scientists to harden research prototypes into clean, tested, production-ready code</li><li>Build and maintain retrieval-augmented generation (RAG), agentic workflows, and prompt-engineered systems when appropriate (LangChain, LlamaIndex)</li><li>Continuously monitor, retrain, and improve live models to maintain performance and reliability</li></ul><p><br></p>
<p>Our company is seeking an innovative and driven AI/ML Engineer to join our technology team in St. Louis, Missouri. If you enjoy developing machine learning models and leveraging AI to solve real business challenges, we invite you to apply.</p><p><br></p><p>Key Responsibilities:</p><ul><li>Design, build, and deploy AI and ML solutions for various business applications</li><li>Collaborate with data scientists, analysts, and business stakeholders to define project requirements and deliver impactful results</li><li>Optimize and tune algorithms for accuracy, scalability, and performance</li><li>Stay current with advancements in machine learning, deep learning, and related technologies</li><li>Communicate findings and recommendations transparently to technical and non-technical teams</li></ul>
<p>We are looking for an experienced Platforms Engineer to support and enhance critical infrastructure in Syracuse, New York. This position focuses on maintaining stable, secure Linux-based environments while also supporting key platforms used by development teams. The role suits someone who can balance day-to-day operational excellence with long-term improvements in system performance, access control, and compliance readiness.</p>
<p><strong>Data Engineer (Python / AWS)</strong></p><p><strong>Location:</strong> Remote (Northeast / Greater Boston area preferred)</p><p><strong>Type:</strong> Full-Time</p><p><strong>Level:</strong> Mid-to-Senior Individual Contributor</p><p><strong>About the Role</strong></p><p>We are looking for a strong individual contributor who excels in the Python data ecosystem and enjoys building reliable, scalable data pipelines. This role sits within a data engineering group responsible for integrating large volumes of data from external partners and transforming it into usable datasets for internal teams. You’ll work with modern cloud tools while also helping our team gradually transition away from a legacy platform.</p><p>This position is ideal for someone who wants to stay hands-on, focus on technical execution, and remain in an IC role for the next several years. We’re not looking for someone who is aiming to move immediately into architecture or leadership.</p><p>This team is fully distributed, and although candidates in the Boston area can go into the office, the rest of the group is remote. 
Anyone local may occasionally sit with other teams when on site.</p><p><br></p><p><strong>What You’ll Do</strong></p><ul><li>Build and maintain ETL pipelines that ingest, clean, and aggregate data received from external vendors and large enterprise partners.</li><li>Develop Python‑based data processing workflows deployed on AWS cloud services.</li><li>Work with tools such as AWS Glue, Airflow, dbt, and PySpark to support data transformations and pipeline orchestration.</li><li>Help modernize existing workflows and assist in the gradual migration away from a legacy data system.</li><li>Collaborate with internal stakeholders to understand data needs, define requirements, and ensure reliable integration of partner data feeds.</li><li>Troubleshoot pipeline issues, optimize performance, and improve overall system stability.</li><li>Contribute to best practices around code quality, testing, documentation, and data governance.</li></ul><p><br></p>
<p>The Software Platform Engineer will design, build, and maintain a core Data & Machine Learning platform.</p><p><br></p><p>Responsibilities:</p><ul><li><strong>Platform Development:</strong> Design and implement new features for our AWS and Databricks-based platform, staying current with industry trends and advancements in AI.</li><li><strong>Core Component Implementation:</strong> Test and integrate central platform components that support our technology stack and serve tenants across the organization.</li><li><strong>Collaboration:</strong> Partner with other engineering teams to identify and deliver platform enhancements that solve specific business problems.</li><li><strong>Maintain Excellence:</strong> Uphold strict security protocols, compliance controls, and architectural principles in all aspects of your work.</li></ul>