We are looking for an experienced Artificial Intelligence (AI) Engineer to join our team in Atlanta, Georgia. This is a long-term contract position where you will play a pivotal role in advancing AI initiatives across clinical and business operations. The ideal candidate will have a strong technical background, excellent communication skills, and the ability to collaborate across multiple departments to drive innovative solutions in healthcare.<br><br>Responsibilities:<br>• Partner with various departments to identify, design, and implement AI solutions that address clinical, financial, and operational needs.<br>• Evaluate and integrate third-party AI tools and platforms, with a focus on healthcare applications such as NexTech, call center automation, AI-powered scribing, and clinical trial identification.<br>• Develop and support AI applications to enhance patient identification for trials, automate documentation, and improve workflows.<br>• Build and maintain AI-driven dashboards and analytics using tools like Power BI to provide actionable insights for clinical and business teams.<br>• Ensure AI integrations meet scalability, security, and compliance requirements, adhering to healthcare data privacy standards.<br>• Serve as a strategic advisor by proactively identifying opportunities for organizational improvement through AI.<br>• Collaborate with stakeholders across IT and non-IT teams to foster innovation and streamline operations.<br>• Stay updated on industry trends, regulatory standards, and emerging AI technologies relevant to healthcare.<br>• Provide technical leadership and guidance on AI-related projects, ensuring alignment with organizational goals.
<p><strong>Overview</strong></p><p>Our client is seeking an Atlanta-based AI Engineer to accelerate their artificial intelligence strategy and project implementation across clinical and business operations. This cross-functional role will serve as a trusted partner to both clinical and financial teams, driving AI adoption with tools such as NexTech (AI scribe and clinical trials), call center AI, AI-based revenue cycle management bots, and advanced analytics through Power BI and data integrations. The ideal candidate is a self-starter, an "athlete" or "one-man band," who can communicate effectively with both IT and non-IT professionals across different teams, catalyze innovation, and provide strategic guidance for optimizing workflows and extracting value from healthcare data.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Collaborate with 12 different departments to identify, scope, and deliver AI use cases in clinical, financial, and operational domains.</li><li>Evaluate, recommend, and integrate third-party AI vendors and solutions, especially healthcare applications such as NexTech, call center automation, AI-powered scribing, clinical trial identification, and revenue cycle management bots.</li><li>Build, customize, and deploy AI tools (including Gemini, ChatGPT, Claude, and Copilot) to create efficiencies, automate workflows, and generate actionable insights from company data (databases, emails, PowerPoint presentations).</li><li>Support clinical teams by automating patient identification for trials, improving documentation, and leveraging AI scribes in NexTech.</li><li>Assist management in leveraging AI-driven dashboards and analytics (Power BI) to monitor key business and clinical metrics.</li><li>Ensure scalable, secure, and compliant integration of AI applications.</li><li>Maintain up-to-date knowledge of healthcare data requirements, privacy/regulatory standards, and industry best practices.</li><li>Serve as an internal consultant, staying knowledgeable about company operations and proactively looking for opportunities to drive organizational change via AI.</li></ul>
<p><strong>Service & Automation Engineer</strong></p><p>This position supports customers with technical service needs related to advanced manufacturing equipment and automated production systems. The role focuses on troubleshooting, upgrade support, and automation software commissioning to ensure equipment performance, uptime, and customer satisfaction.</p><p>You will travel to customer sites and work independently to diagnose and resolve technical issues. Candidates should be self-driven, organized, and comfortable handling service requests both on-site and remotely. Travel may be required on short notice, with service visits typically lasting 1–2 weeks and occasionally longer. When not traveling, you will provide remote support and contribute to continuous improvement of service operations.</p><p><br></p><p>Key Responsibilities</p><p>Customer Support & Service</p><ul><li>Serve as a primary technical contact for customer inquiries and service requests.</li><li>Provide remote troubleshooting support using phone, email, and secure remote access tools.</li><li>Diagnose issues and guide customers through corrective actions.</li><li>Escalate complex problems to specialized engineering teams when needed.</li><li>Perform on-site service work such as troubleshooting, commissioning, maintenance, and repairs when remote resolution is not possible.</li></ul><p>PLC Programming & Troubleshooting</p><ul><li>Modify, test, and debug PLC programs across modern and legacy control platforms (examples include Siemens and Allen-Bradley systems).</li><li>Troubleshoot ladder logic, function blocks, and structured text issues.</li><li>Optimize control logic for performance, safety, and efficiency.</li><li>Perform online/offline edits during commissioning or service activities.</li></ul><p>Drive Configuration & Diagnostics</p><ul><li>Configure and troubleshoot VFDs, servo drives, and motion controllers.</li><li>Set motor parameters, feedback devices, and motion profiles.</li><li>Diagnose 
drive faults such as overcurrent, encoder errors, or communication issues.</li><li>Integrate drives with PLC systems via industrial networks (e.g., EtherNet/IP, Profinet).</li></ul><p>HMI / SCADA Support</p><ul><li>Create or modify operator interface screens including alarms, trends, recipes, and controls.</li><li>Connect PLC tags to HMI objects and verify communications.</li><li>Adjust user interfaces based on operator feedback.</li><li>Troubleshoot display, scripting, or communication issues.</li></ul><p>Quality & Continuous Improvement</p><ul><li>Confirm resolution of service issues and ensure equipment reliability.</li><li>Provide feedback to engineering teams on recurring problems or improvement opportunities.</li><li>Document service activities and customer interactions in internal systems.</li></ul><p>Collaboration</p><ul><li>Support spare-parts identification and service quotation activities.</li><li>Share technical knowledge with internal teams and participate in project discussions.</li></ul>
<p><strong>Software Engineer (Databricks/Data Platform)</strong></p><p><strong>Hybrid 3-4 days onsite in Alpharetta, GA</strong></p><p><strong>Duration through 10/30/26</strong></p><p><br></p><p>We are looking for an experienced Software Engineer III to join our team in Alpharetta, GA. In this role, you will play a critical part in supporting and developing a Databricks-based data platform, focusing on creating scalable and efficient solutions during the development phase. This is a long-term contract position, requiring in-office work three to four days per week.</p><p><br></p><p>Responsibilities:</p><ul><li>Develop and support Databricks notebooks, jobs, and workflows</li><li>Write, optimize, and maintain PySpark and Python code for data processing</li><li>Help design scalable, reliable, and efficient data pipelines</li><li>Apply Spark best practices (partitioning, caching, joins, file sizing)</li><li>Work with Delta Lake tables and data models</li><li>Perform data validation and quality checks during development</li><li>Support cluster configuration and sizing for development workloads</li><li>Identify performance bottlenecks early and recommend improvements</li><li>Collaborate with Data Engineers to ensure solutions are production-ready</li><li>Document development standards, patterns, and best practices</li></ul>
<p>We are seeking a Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. This role will support data-driven decision-making by ensuring reliable data flow, transformation, and accessibility across the organization.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain ETL/ELT data pipelines</li><li>Develop and optimize data models and data architectures</li><li>Integrate data from multiple sources (APIs, databases, third-party systems)</li><li>Ensure data quality, integrity, and reliability</li><li>Collaborate with data analysts, data scientists, and business stakeholders</li><li>Monitor and troubleshoot data pipeline performance issues</li><li>Implement best practices for data governance and security</li></ul><p><br></p>
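<p>As a rough illustration of the extract-transform-load cycle described above, here is a minimal Python sketch. The sample records, validation rules, and the <code>orders_clean</code> table are assumptions for illustration (with SQLite standing in for a warehouse), not a prescribed stack.</p>

```python
import sqlite3

# Extract: records as they might arrive from an API or upstream system.
raw_orders = [
    {"id": 1, "amount": "19.50", "country": "us"},
    {"id": 2, "amount": "5.00", "country": "DE"},
    {"id": 3, "amount": None, "country": "us"},  # bad record: missing amount
]

# Transform: normalize types and values, dropping records that fail validation.
clean = [
    {"id": r["id"], "amount": float(r["amount"]), "country": r["country"].upper()}
    for r in raw_orders
    if r["amount"] is not None
]

# Load: write the transformed rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders_clean (id INTEGER PRIMARY KEY, amount REAL, country TEXT)")
conn.executemany(
    "INSERT INTO orders_clean (id, amount, country) VALUES (:id, :amount, :country)", clean
)
row_count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders_clean").fetchone()
print(row_count, total)  # 2 24.5
```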
<p><strong>Position Summary:</strong></p><ul><li>We are looking for a Data Operations Engineer to support and oversee the automated data-pipeline environment built on AWS. This position bridges data engineering and customer operations, ensuring that incoming datasets are processed accurately, consistently, and securely within established ingestion and transformation frameworks.</li><li>Key responsibilities include monitoring automated workflows, troubleshooting processing failures, validating data quality, and helping onboard new customers by aligning their data formats to a standardized internal model.</li><li>The role requires strong proficiency in SQL and Python, practical experience with AWS services, and the ability to communicate effectively with external customers when data issues arise.</li></ul><p><strong>Responsibilities:</strong></p><p><strong>Data Pipeline Monitoring & Operations:</strong></p><ul><li>Monitor automated batch and streaming data pipelines in AWS</li><li>Identify, troubleshoot, and resolve data processing failures</li><li>Investigate file-level errors, schema mismatches, and transformation issues</li><li>Perform root-cause analysis and document resolutions</li><li>Ensure data integrity, completeness, and timeliness across environments</li><li>Escalate architectural or systemic issues to the Data Engineering team</li></ul><p><strong>Customer Data Onboarding & Implementation:</strong></p><ul><li>Collaborate directly with customers to understand their file formats and data structures</li><li>Create and maintain mapping templates to align customer data to a normalized data model</li><li>Validate sample files and run tests on ingestion workflows</li><li>Configure ingestion parameters within predefined frameworks</li><li>Support customer go-live processes and initial data processing cycles</li></ul><p><strong>Data Quality & Continuous Improvement:</strong></p><ul><li>Write SQL queries to validate data accuracy and research anomalies</li><li>Develop lightweight Python scripts for validation, transformation checks, or automation tasks</li><li>Improve monitoring processes, internal documentation, and operational playbooks</li><li>Work with engineering teams to strengthen platform reliability and observability</li></ul><p><strong>Customer & Cross-Functional Collaboration:</strong></p><ul><li>Communicate clearly with customers regarding file issues or data discrepancies</li><li>Partner with internal teams including Data Engineering, Product, and Support</li><li>Provide feedback to enhance scalability, resilience, and overall platform performance</li></ul>
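<p>The lightweight validation scripting mentioned above can be as simple as the following sketch. It is generic: the required columns and the sample feed contents are assumptions for illustration, not any customer's actual format.</p>

```python
import csv
import io

# Expected layout for a hypothetical customer feed after mapping to the internal model.
REQUIRED_COLUMNS = {"record_id", "event_ts", "value"}

def validate_feed(text: str) -> list[str]:
    """Return a list of human-readable issues found in a delimited feed."""
    issues = []
    reader = csv.DictReader(io.StringIO(text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        # Schema mismatch: no point checking rows against an incomplete header.
        issues.append(f"missing columns: {sorted(missing)}")
        return issues
    seen_ids = set()
    for line_no, row in enumerate(reader, start=2):  # line 1 is the header
        if not row["value"].strip():
            issues.append(f"line {line_no}: empty value")
        if row["record_id"] in seen_ids:
            issues.append(f"line {line_no}: duplicate record_id {row['record_id']}")
        seen_ids.add(row["record_id"])
    return issues

sample = "record_id,event_ts,value\n1,2024-01-01T00:00:00Z,42\n1,2024-01-01T01:00:00Z,\n"
print(validate_feed(sample))  # flags the empty value and the duplicate id on line 3
```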
We are looking for a skilled and innovative Sr. Engineer specializing in integrations (iPaaS) to join our team on a contract basis. This role is based in Atlanta, Georgia, and offers the opportunity to lead technical projects that drive seamless connectivity between systems and platforms. You will play a pivotal role in designing, implementing, and optimizing integration frameworks while collaborating with cross-functional teams to align technical solutions with business goals.<br><br>Responsibilities:<br>• Design and implement scalable integration frameworks that ensure efficient data pipelines and connectivity between systems.<br>• Provide technical leadership and guidance to a team of engineers, fostering accountability and a collaborative work environment.<br>• Develop and enhance integrations with third-party systems, focusing on data extraction, transformation, and loading processes.<br>• Define and uphold best practices for integration development, documentation, and team collaboration.<br>• Work closely with stakeholders, including product managers and engineering teams, to ensure alignment with business requirements.<br>• Manage the end-to-end delivery of integration projects, ensuring deadlines are met and quality standards are maintained.<br>• Lead initiatives to explore and integrate innovative technologies to optimize data handling and system connectivity.<br>• Utilize AI-driven tools and techniques to enhance coding productivity and integration efficiency.<br>• Collaborate with enterprise systems such as Salesforce, NetSuite, and other relevant platforms to build seamless integrations.
<p><strong>Key Responsibilities </strong> </p><ul><li>Architect and Implement Integrations Framework: Develop a scalable and resilient integrations framework that prioritizes ETL techniques and data pipeline efficiency. </li><li>Technical Leadership & Mentorship: Lead and mentor a team of 3 engineers, promoting a culture of extreme ownership, accountability, and clear, effective communication. </li><li>Develop Data Integrations: Design and develop robust integrations with third-party systems, emphasizing data extraction, transformation, and loading combined with API-driven approaches. </li><li>Establish Best Practices: Define and enforce best practices for integration design, development, documentation, and open team communication. </li><li>Collaborate with Stakeholders: Work closely with product managers, engineering teams, and other stakeholders, ensuring alignment with business objectives through transparent and proactive communication. </li><li>Oversee Project Delivery: Manage end-to-end delivery of integration projects, ensuring timely completion and accountability at every stage. </li><li>Drive Innovation: Lead initiatives to innovate our integration strategies and technologies, continuously improving our data handling and ETL processes.</li></ul>