We are looking for a highly skilled Data Analyst to join our team in Belgrade, Maine. This is a long-term contract position offering the opportunity to contribute to advanced data solutions and cloud-based projects. The role involves working with large-scale data migrations and developing modern data pipelines using cutting-edge technologies.<br><br>Responsibilities:<br>• Design and implement efficient data pipelines to ingest, transform, and deliver data across enterprise systems.<br>• Develop and manage Azure-based Data Lakes and associated cloud data services.<br>• Lead large-scale data migration projects from on-premise systems to Azure cloud environments.<br>• Create scalable data models and curated datasets to support business intelligence and reporting.<br>• Ensure data integrity, quality, governance, and security throughout all workflows.<br>• Collaborate with technical and business teams to translate requirements into impactful data solutions.<br>• Optimize data architectures to improve performance and scalability.<br>• Support business intelligence platforms like Power BI by delivering curated datasets and analytics layers.<br>• Monitor and troubleshoot data workflows to ensure seamless operations.
<p><strong>Software Engineer (Databricks/Data Platform)</strong></p><p><strong>Hybrid 3-4 days onsite in Alpharetta, GA</strong></p><p><strong>Duration through 10/30/26</strong></p><p><br></p><p>We are looking for an experienced Software Engineer III to join our team in Alpharetta, GA. In this role, you will play a critical part in supporting and developing a Databricks-based data platform, focusing on creating scalable and efficient solutions during the development phase. This is a long-term contract position, requiring in-office work three to four days per week.</p><p><br></p><p>Responsibilities:</p><ul><li>Develop and support Databricks notebooks, jobs, and workflows</li><li>Write, optimize, and maintain PySpark and Python code for data processing</li><li>Help design scalable, reliable, and efficient data pipelines</li><li>Apply Spark best practices (partitioning, caching, joins, file sizing)</li><li>Work with Delta Lake tables and data models</li><li>Perform data validation and quality checks during development</li><li>Support cluster configuration and sizing for development workloads</li><li>Identify performance bottlenecks early and recommend improvements</li><li>Collaborate with Data Engineers to ensure solutions are production-ready</li><li>Document development standards, patterns, and best practices</li></ul>
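The posting above asks for data validation and quality checks during development. As a purely illustrative sketch (the column names and rules here are hypothetical, and a real Databricks pipeline would express them over Spark DataFrames rather than plain Python), a row-level validation pass might look like:

```python
# Hypothetical row-level validation of the kind run while developing a pipeline:
# split input rows into valid and rejected, recording a reason for each reject.

def validate_rows(rows, required=("id", "amount")):
    """Return (valid, rejected) lists; rejects carry the failing rule."""
    valid, rejected = [], []
    for row in rows:
        missing = [col for col in required if row.get(col) is None]
        if missing:
            rejected.append({"row": row, "reason": f"missing: {', '.join(missing)}"})
        elif row["amount"] < 0:
            rejected.append({"row": row, "reason": "negative amount"})
        else:
            valid.append(row)
    return valid, rejected

valid, rejected = validate_rows([
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": -5.0},
    {"id": None, "amount": 3.0},
])
```

The same split-and-quarantine pattern carries over directly to PySpark, where each rule becomes a filter over a DataFrame and the rejects land in a quarantine table.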
<p>We are looking for a Data Engineer to strengthen and expand an established Microsoft Fabric data environment. This Long-term Contract position is ideal for someone who can turn business data into reliable, well-structured assets that support reporting and decision-making. The role requires a hands-on engineer who can shape data architecture, build scalable pipelines, and communicate clearly with both technical teams and business stakeholders.</p><p><br></p><p>Responsibilities:</p><p>• Expand and improve an existing Microsoft Fabric platform to support dependable, scalable analytics solutions.</p><p>• Create and maintain a layered data architecture across Bronze, Silver, and Gold tiers, with emphasis on delivering trusted and business-ready curated datasets.</p><p>• Build ingestion and transformation processes for Salesforce data along with information from additional enterprise sources.</p><p>• Develop data models that improve accuracy, usability, and reporting value by evaluating structure, relationships, and downstream needs.</p><p>• Support the shift away from older warehouse and spreadsheet-driven reporting practices by introducing more modern data engineering approaches.</p><p>• Work autonomously to manage priorities while providing regular updates on progress, technical decisions, and potential risks.</p><p>• Collaborate with business partners to understand reporting goals and translate them into practical data solutions.</p><p>• Contribute to data processing and integration workflows using technologies such as Python, Spark, ETL frameworks, and related platform tools.</p>
<p>We are looking for an experienced Amazon Connect Developer (Telephony) to join a contract-to-perm opportunity supporting a life insurance organization in West Des Moines, Iowa. This role is ideal for a senior team member who can contribute immediately across cloud telephony enhancements, targeted new development, and steady production support. The position requires strong recent experience with both Amazon Connect and Salesforce Service Cloud Voice, along with the ability to communicate clearly with business users and technical teams.</p><p><br></p><p>Responsibilities:</p><p>• Lead ongoing improvements to the telephony platform, including refining call flows, adjusting routing logic, improving queue performance, and updating voice prompts.</p><p>• Build and enhance integrations connecting Amazon Connect, Salesforce Service Cloud Voice, AWS Lambda, and related telephony components.</p><p>• Develop new solutions that support evolving contact center needs while maintaining alignment with existing cloud architecture.</p><p>• Investigate production issues across telephony and CRM platforms, identify root causes, and implement durable fixes to reduce repeat incidents.</p><p>• Participate in a rotating support schedule with other developers, providing user-focused assistance for agent experience issues, connectivity concerns, and device-related challenges.</p><p>• Design, update, and troubleshoot flows within Salesforce Service Cloud Voice and Omni-Channel to support efficient call handling.</p><p>• Create Lambda-based logic for telephony use cases such as external data lookups, custom decisioning, and system interactions using Node.js or Python.</p><p>• Partner with internal stakeholders to explain technical issues, gather requirements, and translate operational needs into effective platform updates.</p>
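The Lambda-based external data lookup mentioned above could be sketched as follows. This is only an illustration: the `lookup_policy` function and its return values are invented stand-ins for a real CRM or policy-system call, while the event shape (`Details` → `ContactData`) and the flat key/value response follow Amazon Connect's Lambda contact-flow integration.

```python
# Hedged sketch of an AWS Lambda handler invoked from an Amazon Connect contact
# flow: look up the caller by phone number, return attributes to the flow.

def lookup_policy(phone_number):
    """Stand-in for an external data lookup (e.g., a CRM or policy system)."""
    fake_db = {"+15155550100": {"policy_status": "active", "tier": "gold"}}
    return fake_db.get(phone_number, {"policy_status": "unknown", "tier": "none"})

def lambda_handler(event, context):
    # Connect passes caller details under event["Details"]["ContactData"].
    phone = event["Details"]["ContactData"]["CustomerEndpoint"]["Address"]
    result = lookup_policy(phone)
    # Connect expects a flat map of string keys/values in the response.
    return {k: str(v) for k, v in result.items()}

response = lambda_handler({
    "Details": {
        "ContactData": {"CustomerEndpoint": {"Address": "+15155550100"}},
        "Parameters": {},
    }
}, None)
```

The returned attributes become contact attributes in the flow, where routing logic can branch on values such as `policy_status`.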
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable, performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
<p>We are currently seeking a Data Engineer for a contract opportunity supporting a growing data and analytics organization. This role is focused on building and maintaining modern cloud-based data infrastructure, including scalable ELT pipelines, Snowflake data solutions, and automated data workflows.</p><p>This is a hands-on engineering role where you will design, develop, and support end-to-end data systems that enable reliable reporting, analytics, and business decision-making.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, build, and maintain scalable ELT/ETL data pipelines and workflows</li><li>Develop and optimize Snowflake-based data warehouse solutions</li><li>Build and maintain data models and transformation logic to support analytics and reporting</li><li>Write efficient and high-quality Python and SQL code to support data engineering processes</li><li>Develop reusable data engineering frameworks and backend data services</li><li>Implement and maintain CI/CD pipelines using GitHub and related tooling</li><li>Build automated testing frameworks to ensure data quality and reliability</li><li>Create reporting and visualization solutions using tools such as Power BI</li><li>Monitor production data systems and resolve performance or reliability issues</li><li>Support continuous improvement of data architecture, processes, and standards</li></ul>
We are looking for an experienced Lead Data Engineer to oversee the design, implementation, and management of advanced data infrastructure in Houston, Texas. This role requires expertise in architecting scalable solutions, optimizing data pipelines, and ensuring data quality to support analytics, machine learning, and real-time processing. The ideal candidate will have a deep understanding of Lakehouse architecture and Medallion design principles to deliver robust and governed data solutions.<br><br>Responsibilities:<br>• Develop and implement scalable data pipelines to ingest, process, and store large datasets using tools such as Apache Spark, Hadoop, and Kafka.<br>• Utilize cloud platforms like AWS or Azure to manage data storage and processing, leveraging services such as S3, Lambda, and Azure Data Lake.<br>• Design and operationalize data architecture following Medallion patterns to ensure data usability and quality across Bronze, Silver, and Gold layers.<br>• Build and optimize data models and storage solutions, including Databricks Lakehouses, to support analytical and operational needs.<br>• Automate data workflows using tools like Apache Airflow and Fivetran to streamline integration and improve efficiency.<br>• Lead initiatives to establish best practices in data management, facilitating knowledge sharing and collaboration across technical and business teams.<br>• Collaborate with data scientists to provide infrastructure and tools for complex analytical models, using programming languages like Python or R.<br>• Implement and enforce data governance policies, including encryption, masking, and access controls, within cloud environments.<br>• Monitor and troubleshoot data pipelines for performance issues, applying tuning techniques to enhance throughput and reliability.<br>• Stay updated with emerging technologies in data engineering and advocate for improvements to the organization's data systems.
<p>Robert Half Technology is seeking a <strong>mid-to-senior level Data Engineer</strong> to support the modernization of an existing data environment for a client in Bellevue, Washington. This role will focus on <strong>rearchitecting data pipelines into Databricks</strong>, improving performance, and establishing scalable data architecture and governance. This is a hands-on role in a <strong>fast-paced, less structured environment</strong>, ideal for someone who takes ownership and can operate with autonomy.</p><p> </p><p><strong>Duration:</strong> Long-term contract with potential for extension or conversion</p><p><strong>Location:</strong> Bellevue, Washington (hybrid, 3 days onsite)</p><p><strong>Schedule:</strong> Monday-Friday (9 AM-5 PM PST)</p><p> </p><p><strong>Key Responsibilities</strong></p><ul><li>Rebuild and optimize existing <strong>Python-based ETL pipelines</strong> within Databricks</li><li>Design and implement scalable <strong>data ingestion and transformation processes</strong></li><li>Architect and maintain <strong>data marts and data warehouse structures</strong></li><li>Implement <strong>Medallion Architecture (Bronze, Silver, Gold layers)</strong></li><li>Improve performance of data processing workflows (reduce runtimes, optimize queries)</li><li>Support migration and consolidation of data into Databricks</li><li>Document <strong>data pipelines, tables, and architecture</strong> for governance and maintainability</li><li>Define best practices for <strong>data storage, organization, and access</strong></li><li>Ensure alignment with existing compliance and data standards</li></ul><p><br></p>
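The Medallion layering referenced in this and several postings above can be illustrated with a toy example. This is plain Python rather than the Delta tables a real Databricks build would use, and every field name is hypothetical; the point is only the Bronze (raw) → Silver (cleaned) → Gold (aggregated) progression:

```python
# Toy Medallion sketch: Bronze keeps raw records as-is, Silver cleans and
# types them, Gold aggregates them into a reporting-ready summary.

bronze = [
    {"order_id": "1", "amount": "10.50", "region": "west "},
    {"order_id": "2", "amount": "4.25", "region": "west"},
    {"order_id": "3", "amount": "bad", "region": "east"},  # malformed row
]

def to_silver(rows):
    """Clean and type raw rows, dropping ones that fail to parse."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "amount": float(r["amount"]),
                        "region": r["region"].strip()})
        except ValueError:
            continue  # a real pipeline would quarantine these, not drop them
    return out

def to_gold(rows):
    """Aggregate cleaned rows into per-region totals for reporting."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
```

In Databricks each layer would be a Delta table, with the Silver step enforcing schema and quality rules so that Gold consumers can trust what they read.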
We are looking for a skilled Technical Business Analyst with expertise in data analysis to join our team in White Plains, New York. In this role, you will bridge the gap between business needs and technical solutions, ensuring data processes align with organizational goals. If you have a strong analytical background and experience working with data platforms and visualization tools, we encourage you to apply.<br><br>Responsibilities:<br>• Analyze and interpret complex datasets to identify trends and deliver actionable insights.<br>• Collaborate with stakeholders to gather and translate business requirements into technical solutions.<br>• Design and implement data pipelines using tools such as Python or Java to automate workflows.<br>• Optimize database performance and manage data warehousing using platforms like Microsoft SQL Server.<br>• Utilize cloud services, such as AWS or Azure, to support scalable data solutions.<br>• Develop and maintain ETL processes for efficient data extraction, transformation, and loading.<br>• Create dynamic dashboards and visualizations using tools like Tableau or Power BI.<br>• Ensure data integrity and accuracy through rigorous testing and validation.<br>• Provide technical support and training to non-technical team members.<br>• Document processes and workflows to ensure clarity and continuity.
<p><strong>Data Engineer – CRM Integration (Hybrid in San Fernando Valley)</strong></p><p><strong>Location:</strong> San Fernando Valley (Hybrid – 3x per week onsite)</p><p><strong>Compensation:</strong> $140K–$170K annual base salary</p><p><strong>Job Type:</strong> Full Time, Permanent</p><p><strong>Overview:</strong></p><p>Join our growing technology team as a Data Engineer with a focus on CRM data integration. This permanent role will play a key part in supporting analytics and business intelligence across our organization. The position offers a collaborative hybrid environment and highly competitive compensation.</p><p><strong>Responsibilities:</strong></p><ul><li>Design, develop, and optimize data pipelines and workflows integrating multiple CRM systems (Salesforce, Dynamics, HubSpot, NetSuite, or similar).</li><li>Build and maintain scalable data architectures for analytics and reporting.</li><li>Manage and advance CRM data integrations, including real-time and batch processing solutions.</li><li>Deploy ML models, automate workflows, and support model serving using Azure Databricks (MLflow experience preferred).</li><li>Utilize Azure Synapse Analytics & Pipelines for high-volume data management.</li><li>Write advanced Python and Spark SQL code for ETL, transformation, and analytics.</li><li>Collaborate with BI and analytics teams to deliver actionable insights using Power BI.</li><li>Support streaming solutions with technologies like Kafka, Event Hubs, and Spark Streaming.</li></ul><p><br></p>
<p>The Software Platform Engineer will design, build, and maintain a core Data & Machine Learning platform.</p><p><br></p><ul><li><strong>Platform Development:</strong> Design and implement new features for our AWS and Databricks-based platform, staying current with industry trends and advancements in AI.</li><li><strong>Core Component Implementation:</strong> Test and integrate central platform components that support our technology stack and serve tenants across the organization.</li><li><strong>Collaboration:</strong> Partner with other engineering teams to identify and deliver platform enhancements that solve specific business problems.</li><li><strong>Maintain Excellence:</strong> Uphold strict security protocols, compliance controls, and architectural principles in all aspects of your work.</li></ul>
We are looking for an experienced AWS Platform Engineer SR to join a Contract position supporting a growing data science platform in Dublin, Ohio. This role focuses on building, maintaining, and improving cloud infrastructure that enables analytics, AI/ML, and data-driven product teams to work efficiently at scale. The ideal candidate will bring strong experience in platform engineering, automation, and secure environment management across AWS-based ecosystems.<br><br>Responsibilities:<br>• Maintain and enhance cloud infrastructure that supports data science, analytics, and machine learning workloads across the platform.<br>• Build and release new environments through automated delivery pipelines, enabling scalable and repeatable deployments for technical teams.<br>• Administer large, multi-environment AWS landscapes and prepare the platform to support expanding business and engineering needs.<br>• Establish and oversee image lifecycle practices to improve consistency, governance, and operational stability across hosted environments.<br>• Configure and manage AWS accounts dedicated to the data science ecosystem while applying appropriate access controls and platform standards.<br>• Use tools such as Azure DevOps and Terraform to automate provisioning, deployment, and ongoing infrastructure management.<br>• Develop scripts and lightweight applications in Python to streamline platform tasks, integration needs, and operational support.<br>• Support database and data access technologies including Athena, Oracle, MySQL, and PostgreSQL within cloud-based solutions.<br>• Partner with network, database, infrastructure, and architecture teams to resolve issues, strengthen security controls, and support upgrades, patching, root cause analysis, and on-call needs.
We are looking for an experienced Penetration Tester to join our cybersecurity team and enhance the security of our Windows-based systems. The ideal candidate will play a key role in identifying vulnerabilities, conducting simulated attacks, and implementing strategies to safeguard critical infrastructure, including servers, endpoints, and Active Directory environments. This position offers an opportunity to apply advanced penetration testing techniques and collaborate with IT teams to strengthen security measures.<br><br>Responsibilities:<br>• Perform penetration tests and security evaluations of Windows environments, including Active Directory, servers, endpoints, and domain controllers.<br>• Execute red team scenarios to simulate real-world attack tactics, techniques, and procedures.<br>• Identify and exploit vulnerabilities within Windows systems, applications, and networks, documenting findings comprehensively.<br>• Develop and utilize custom scripts and tools using programming languages such as PowerShell, Python, C++, or C#.<br>• Work closely with IT teams to address security gaps and implement mitigation strategies.<br>• Prepare detailed reports outlining vulnerabilities and actionable recommendations to fortify Windows infrastructure.<br>• Stay updated on emerging threats, attack vectors, and techniques targeting Windows-based systems.<br>• Evaluate Group Policy Objects and other system configurations to ensure adherence to security standards.<br>• Support the improvement of cybersecurity practices by sharing insights and conducting knowledge transfer sessions.
<p><strong>Clearance Requirement:</strong> Public Trust clearance (Must currently hold or have the ability to obtain and maintain)</p><p><strong>Red Hat Enterprise Linux (RHEL) Systems Administrator</strong></p><p><strong>Location:</strong> Washington, DC (On-site, 5 days per week)</p><p><strong>Employment Type:</strong> 6 Month Contract, Potential for Extension or Conversion</p><p><strong>Pay: </strong>Available on W2</p><p><strong>Job Summary</strong></p><p>The RHEL Systems Administrator is responsible for administering, supporting, and maintaining Red Hat Enterprise Linux (RHEL) server environments. This role focuses on enterprise-scale Linux operations, automation, containerized platforms, cloud integration, monitoring, and system reliability. The ideal candidate has strong scripting skills, experience supporting containerized workloads, and the ability to troubleshoot, stabilize, and optimize systems in a secure, fast-paced environment.</p><p><strong>Key Responsibilities</strong></p><ul><li>Manage the full lifecycle of RHEL servers, including deployment, configuration, patching, reboot coordination, and ongoing maintenance in virtualized environments (VMware).</li><li>Administer enterprise RHEL environments using Red Hat Satellite, including repository management, content views, lifecycle environments, and patch orchestration.</li><li>Serve as the primary owner for implementing system and software updates across the RHEL platform.</li><li>Develop, test, and maintain automation using <strong>Bash and Python scripting</strong> to streamline administrative tasks and reduce manual intervention.</li><li>Support and maintain <strong>containerized environments</strong>, including <strong>Docker and Kubernetes</strong>, in coordination with application and platform teams.</li><li>Perform advanced system monitoring, performance tuning, and root cause analysis to ensure system stability and optimal performance.</li><li>Design, implement, and maintain <strong>backup and 
recovery strategies</strong> to ensure data integrity and system resiliency.</li><li>Perform daily operational support, including troubleshooting platform and application issues and responding to escalations.</li><li>Collaborate closely with IT security teams to ensure systems are fully patched, rebooted as necessary, and compliant with security standards.</li><li>Provide regular status updates and reporting on system health, patching, backups, and operational activities.</li><li>Create and maintain system documentation, operational procedures, and knowledge base articles.</li><li>Work cross-functionally with infrastructure, cloud, security, and application teams to support deployments and resolve issues efficiently.</li></ul>
<p>We are seeking an experienced CyberArk L2/L3 Administrator to support an enterprise Privileged Access Management (PAM) environment focused on server-based access control. This role will work alongside an existing CyberArk SME to manage privileged account onboarding, password rotation, incident response, and audit support. This is a hands-on operational role requiring strong CyberArk knowledge, solid troubleshooting ability, and comfort working in a fast-moving environment with on-call expectations.</p><p><br></p><p>This role is 4 days/week onsite in Marysville, OH</p><p><br></p><p><strong>CyberArk Administration</strong></p><p>• Support CyberArk EPV, PVWA, CPM, and PSM modules in a large enterprise environment</p><p>• Perform privileged account onboarding into CyberArk safes (currently a manual process)</p><p>• Manage password rotation, reconciliation, and platform configuration</p><p>• Monitor and troubleshoot access failures, rotation errors, and session issues</p><p>• Assist with time‑boxed privileged access and Break Glass workflows</p><p><strong>Operational Support</strong></p><p>• Work from ServiceNow request queues (access, activities, break/fix) with a 2‑day SLA</p><p>• Respond to severity‑based incidents; Sev1 requires 2‑hour response</p><p>• Participate in weekend on‑call rotations (Sat → Sun)</p><p>• Assist users with CyberArk workflows and provide training as needed</p><p><strong>Audit, Compliance & Security</strong></p><p>• Provide evidence for audits, including access approvals and session recording review</p><p>• Investigate suspicious activity using CyberArk logs and integrated SIEM alerts (QRadar)</p><p>• Support Disaster Recovery activities (e.g., adjusting password rotation parameters)</p><p><strong>Process Improvement & Automation</strong></p><p>• Recommend improvements to reduce manual onboarding</p><p>• Use scripting (PowerShell/Python) to streamline repeatable tasks</p><p>• Contribute to future automation between ServiceNow and
CyberArk</p>
We are looking for a Data Governance Manager to lead enterprise data governance efforts in Greenville, South Carolina. This role will shape policies, accountability models, and quality standards that strengthen how data is managed, protected, and used across the organization. The ideal candidate brings strong leadership skills, hands-on experience with governance tooling and Python, and the ability to partner with technical and business teams to advance a data-driven culture.<br><br>Responsibilities:<br>• Direct the development and execution of companywide data governance practices, ensuring policies and controls support business objectives.<br>• Lead and mentor data-focused team members while coordinating governance-related initiatives, priorities, and deliverables.<br>• Partner with leaders across business, technology, legal, and compliance functions to define governance needs and implement practical solutions.<br>• Create and maintain governance standards for data quality, stewardship, ownership, and lifecycle management from intake through archival or disposal.<br>• Oversee controls for data classification, access permissions, sharing protocols, and reference data to safeguard sensitive information.<br>• Establish processes for metadata, lineage, and asset documentation within Atlan to improve transparency and usability of enterprise data.<br>• Drive data quality improvement efforts through profiling, validation, and remediation strategies that increase consistency and trust in reporting and operations.<br>• Promote organization-wide understanding of data governance by delivering training, guidance, and clear communication on governance value and responsibilities.<br>• Ensure adherence to corporate policies and applicable privacy expectations through consistent oversight and enforcement of governance practices.
<p>ServiceNow CMDB Specialist</p><p>Location: Remote</p><p>Ability to obtain a Public Trust Clearance</p><p><br></p><p>Position Overview</p><p><br></p><p>We are seeking an experienced ServiceNow CMDB Specialist to support a critical, time‑bound initiative focused on improving the accuracy, reliability, and governance of configuration data across the ServiceNow platform. This role is responsible for hands‑on CMDB management, ServiceNow Discovery, and alignment with the Common Services Data Model (CSDM).</p><p>The ideal candidate brings deep functional knowledge of ServiceNow CMDB and Discovery, strong infrastructure fundamentals, and the ability to collaborate effectively with infrastructure, operations, and platform stakeholders. This role plays a key part in ensuring high‑quality configuration data is available to support IT operations, reporting, and decision‑making.</p><p><br></p><p>Key Responsibilities</p><p>CMDB Management & Governance</p><ul><li>Implement, configure, and maintain the ServiceNow CMDB to ensure data accuracy and integrity.</li><li>Develop and enforce CMDB policies, standards, business rules, and procedures.</li><li>Perform regular audits, reconciliation, and health checks to ensure compliance with organizational standards.</li><li>Configure and maintain the CMDB Health Dashboard.</li></ul><p>ServiceNow Discovery</p><ul><li>Configure discovery schedules, probes, and sensors.</li><li>Manage MID servers and credentials securely.</li><li>Evaluate and reconcile data sources to ensure optimal discovery coverage.</li><li>Understand and develop discovery patterns as needed.</li></ul><p>CMDB & CSDM Alignment</p><ul><li>Manage CI classes and relationships within the CMDB.</li><li>Ensure alignment between CMDB structure and the Common Services Data Model (CSDM).</li><li>Recommend and implement automated processes to retire stale CIs and relationships.</li></ul><p>Infrastructure & Technical Enablement</p><ul><li>Apply network fundamentals (IP addressing, 
protocols, ports, firewalls) to support discovery and CMDB accuracy.</li><li>Support Windows and Linux environments, including DNS, DHCP, and Active Directory integrations.</li><li>Manage service accounts and ensure secure system access.</li></ul><p>Automation, Reporting & Collaboration</p><ul><li>Use basic scripting (PowerShell, JavaScript, or Python) to automate CMDB and discovery processes.</li><li>Develop dashboards and reports to provide insights into CMDB data quality and coverage.</li><li>Partner with IT teams, business units, and vendors to support CMDB adoption and best practices.</li><li>Provide training and guidance to stakeholders on CMDB processes.</li></ul>
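The stale-CI retirement and basic-scripting items above could be sketched as follows. This is only an illustration: the CI records and the 90-day threshold are hypothetical, and a real implementation would query and update records through the ServiceNow Table API rather than operate on local dictionaries.

```python
# Sketch of the "retire stale CIs" idea: flag configuration items whose last
# discovery falls outside an allowed freshness window.
from datetime import date, timedelta

def find_stale_cis(cis, today, max_age_days=90):
    """Return names of CIs not discovered within the allowed window."""
    cutoff = today - timedelta(days=max_age_days)
    return [ci["name"] for ci in cis if ci["last_discovered"] < cutoff]

cis = [
    {"name": "web01", "last_discovered": date(2024, 6, 1)},
    {"name": "db01", "last_discovered": date(2024, 1, 15)},
]
stale = find_stale_cis(cis, today=date(2024, 6, 30))
```

In practice the flagged CIs would be moved through a retirement state rather than deleted outright, so their relationship history stays auditable.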
<p>We are supporting our client in hiring a Product Data Engineer who will take full ownership of their product information environment. This role centers on managing their PIM solution (Salsify), improving data structures, and building automated, API‑driven integrations that ensure product data is clean, scalable, and synchronized across platforms.</p><p>This position will be deeply involved in a major product‑data overhaul, including cleanup, restructuring, and long‑term system improvements. The ideal candidate is someone who enjoys solving data problems, building automated workflows, and improving the reliability of product information across systems.</p><p><br></p><p> Key Responsibilities</p><p>Product Data Platform Ownership</p><ul><li>Act as the primary administrator for the PIM platform</li><li>Define and maintain product attributes, hierarchies, and data relationships</li><li>Create validation rules, formulas, and workflows to enforce data standards</li><li>Manage permissions, governance, and platform configuration</li><li>Troubleshoot issues related to imports, exports, and publishing</li></ul><p>Integrations & Automation</p><ul><li>Manage integrations between the PIM and internal/external systems (eCommerce, retail, etc.)</li><li>Build and support API‑based data flows with a focus on reliability and scale</li><li>Develop automation using scripting (Python preferred)</li><li>Support event‑driven or automated pipelines to reduce manual work</li><li>Monitor integration performance and proactively resolve failures</li></ul><p>Product Data Improvements</p><ul><li>Contribute to a large‑scale product data cleanup and restructuring effort</li><li>Identify gaps in current data models and workflows</li><li>Partner with cross‑functional teams to define scalable data standards</li><li>Improve system design to support long‑term growth</li></ul><p>Channel Syndication</p><ul><li>Manage product data distribution to digital and retail channels</li><li>Ensure data meets 
channel‑specific requirements</li><li>Troubleshoot publishing issues and improve success rates</li><li>Support product launches and updates across channels</li></ul><p>Data Governance & Quality</p><ul><li>Establish naming conventions, validation rules, and governance standards</li><li>Define and track data quality KPIs (accuracy, completeness, timeliness)</li><li>Utilize or support data governance tools</li><li>Work with business teams to improve data accountability</li></ul><p>Reporting & Metrics</p><ul><li>Build dashboards and reports on data quality and system performance</li><li>Provide insights to leadership to support decision‑making</li><li>Track syndication outcomes and operational metrics</li></ul><p>Operational Support</p><ul><li>Handle day‑to‑day platform usage, enhancements, and issue resolution</li><li>Prioritize incoming requests and tickets</li><li>Ensure stability and reliability of product data operations</li></ul><p><br></p>
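One of the data-quality KPIs named above, completeness, can be made concrete with a small sketch. The product records and attribute names here are hypothetical, not taken from the client's PIM; the point is only how such a KPI might be computed before it feeds a dashboard.

```python
# Minimal completeness KPI: the share of products carrying a non-empty value
# for each required attribute.

def completeness(products, attributes):
    """Return {attribute: fraction of products with a non-empty value}."""
    total = len(products)
    return {
        attr: sum(1 for p in products if p.get(attr) not in (None, "")) / total
        for attr in attributes
    }

products = [
    {"sku": "A1", "title": "Widget", "description": "Blue widget"},
    {"sku": "A2", "title": "Gadget", "description": ""},
    {"sku": "A3", "title": "", "description": "Loose gadget"},
    {"sku": "A4", "title": "Sprocket", "description": "Steel"},
]
kpis = completeness(products, ["title", "description"])
```

Accuracy and timeliness KPIs follow the same shape: a rule applied per record, rolled up to a fraction that can be tracked over time.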
<p><em>The salary range for this position is $180,000-$200,000 plus bonus, and it comes with benefits, including medical, vision, dental, life, and disability insurance. To apply to this hybrid role please send your resume to [email protected]</em></p><p><br></p><p>You know what’s awesome? PTO. You know what else is awesome? A high-paying job that respects your work-life balance so you can enjoy your PTO. This role has perks that are unmatched by its competitors. Plus, this position doubles as a fast-track career advancement opportunity, as they prefer to promote from within.</p><p><br></p><p><strong>Job Description:</strong></p><p><strong>Project and Asset Accounting</strong></p><ul><li>Oversee project accounting (CapEx and OpEx), ensuring accurate tracking, cost allocations, and capitalization of project expenditures in compliance with U.S. GAAP.</li><li>Manage fixed asset accounting, including asset capitalization, depreciation schedules, impairment assessments, and disposals.</li><li>Manage intangible asset accounting, including capitalization, amortization schedules, impairment assessments, and divestitures/disposals.</li><li>Lead lease accounting under ASC 842, ensuring accurate lease classification, right-of-use asset accounting, and financial disclosures.</li><li>Collaborate with project management, finance, and operations teams to improve capital expenditure tracking and financial controls.</li><li>Provide finance support, analytics, and commentary for Mineral Reserve reporting (S-K 1300).</li><li>Provide finance support, analytics, and commentary for Properties & Facilities reporting.</li><li>Provide finance support, analytics, and commentary for Sustainability & Environmental reporting.</li><li>Support accounts receivable aging analysis and allowance for doubtful accounts analysis by leveraging credit assessments, past-due history, collections status, and related data.</li></ul><p><br></p><p><strong>Financial Compliance and Internal Controls</strong></p><ul><li>Maintain compliance with
U.S. GAAP, Sarbanes-Oxley (SOX), and corporate accounting policies.</li><li>Develop and implement internal controls to ensure financial accuracy and mitigate risk in project and asset accounting.</li><li>Support internal and external audits, ensuring proper documentation and adherence to regulatory requirements.</li><li>Lead process improvement initiatives to enhance financial reporting accuracy, efficiency, and consistency.</li></ul><p><br></p><p><strong>Data Analytics and Financial Insights</strong></p><ul><li>Utilize data tools such as Power Query, Power BI, Alteryx, and Python to develop financial models, automate reporting, and generate actionable insights.</li><li>Improve data governance and system integration to enhance financial reporting accuracy, accessibility, and automation.</li><li>Provide data analytics and reporting support across finance, shared services, and accounting to drive strategic decision-making and operational efficiency.</li></ul><p><br></p>
<p><strong>Platform Engineer – Data Science Platform</strong></p><p><strong>13 Week Contract to Hire </strong></p><p><strong>Onsite Hybrid: </strong>Columbus, OH or Dallas, TX or Minneapolis, MN</p><p><strong>Pay: </strong>Available on W2</p><p><strong>Job Summary</strong></p><p>We are seeking an experienced <strong>Platform Engineer</strong> to join a growing Platform Engineering team responsible for supporting and evolving a modern <strong>Data Science platform</strong>. This role focuses on building, managing, and securing cloud-based infrastructure that enables Data Science and AI/ML teams to operate efficiently at scale. The ideal candidate brings strong AWS expertise, hands-on infrastructure automation experience, and the ability to collaborate across technical and business teams.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Support and maintain ongoing <strong>Data Science infrastructure operations</strong></li><li>Design, build, and deploy <strong>AWS environments</strong> using automated <strong>CI/CD pipelines</strong></li><li>Manage and scale large, secure cloud environments to support current and future Data Science initiatives</li><li>Implement, own, and improve the <strong>image management lifecycle process</strong></li><li>Assist with the setup and ongoing management of <strong>AWS accounts</strong> dedicated to the Data Science platform</li><li>Develop and maintain infrastructure pipelines using <strong>CI/CD tools</strong> (e.g., Azure DevOps)</li><li>Build and manage environments using <strong>Infrastructure as Code (IaC)</strong> tools such as <strong>Terraform</strong></li><li>Develop scripts and applications using programming languages such as <strong>Python</strong></li><li>Manage and support database technologies including <strong>Athena, Oracle, MySQL, and PostgreSQL</strong></li><li>Leverage AWS services to enable <strong>Data Lake, Data Science, and AI/ML workloads</strong></li><li>Respond to requests from development 
and business users, removing technical roadblocks</li><li>Manage secured infrastructure environments, applying security controls and guardrails</li><li>Identify, remediate, and track infrastructure vulnerabilities within defined SLAs</li><li>Maintain audit logs and support compliance-related needs</li><li>Perform system upgrades, patching, and provide <strong>on-call support</strong> as required</li><li>Conduct root cause analysis and knowledge transfer sessions with internal teams</li><li>Collaborate closely with <strong>Network, Database, Infrastructure, and Architecture teams</strong> to align on platform strategy and delivery</li></ul>
<p><strong>AWS Infrastructure Engineer </strong></p><p><strong>13 Week Contract to Hire </strong></p><p><strong>Onsite Hybrid: </strong>Columbus, OH or Dallas, TX or Minneapolis, MN </p><p><strong>Pay: </strong>Available on W2</p><p><strong>Job Summary</strong></p><p>We are seeking an experienced <strong>Platform Engineer</strong> to join a growing Platform Engineering team responsible for supporting and evolving a modern <strong>Data Science platform</strong>. This role focuses on building, managing, and securing cloud-based infrastructure that enables Data Science and AI/ML teams to operate efficiently at scale. The ideal candidate brings strong AWS expertise, hands-on infrastructure automation experience, and the ability to collaborate across technical and business teams.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Support and maintain ongoing <strong>Data Science infrastructure operations</strong></li><li>Design, build, and deploy <strong>AWS environments</strong> using automated <strong>CI/CD pipelines</strong></li><li>Manage and scale large, secure cloud environments to support current and future Data Science initiatives</li><li>Implement, own, and improve the <strong>image management lifecycle process</strong></li><li>Assist with the setup and ongoing management of <strong>AWS accounts</strong> dedicated to the Data Science platform</li><li>Develop and maintain infrastructure pipelines using <strong>CI/CD tools</strong> (e.g., Azure DevOps)</li><li>Build and manage environments using <strong>Infrastructure as Code (IaC)</strong> tools such as <strong>Terraform</strong></li><li>Develop scripts and applications using programming languages such as <strong>Python</strong></li><li>Manage and support database technologies including <strong>Athena, Oracle, MySQL, and PostgreSQL</strong></li><li>Leverage AWS services to enable <strong>Data Lake, Data Science, and AI/ML workloads</strong></li><li>Respond to requests from development and business 
users, removing technical roadblocks</li><li>Manage secured infrastructure environments, applying security controls and guardrails</li><li>Identify, remediate, and track infrastructure vulnerabilities within defined SLAs</li><li>Maintain audit logs and support compliance-related needs</li><li>Perform system upgrades, patching, and provide <strong>on-call support</strong> as required</li><li>Conduct root cause analysis and knowledge transfer sessions with internal teams</li><li>Collaborate closely with <strong>Network, Database, Infrastructure, and Architecture teams</strong> to align on platform strategy and delivery</li></ul><p><br></p>
<p>In this role you will lead a team of dedicated platform engineers as we strategically migrate our data assets to the cloud. As a Manager on the Data Platform Team, you will report to the Director of Data Platforms and work closely with the Chief Data Office organization to influence the business and technology strategies for the company.</p><p><br></p><p>If you consider data a strategic asset, evangelize the value of good data and insights, have a deep understanding of data governance, are an experienced thought leader in cloud data migrations, and love building teams and mentoring talent, this role is for you.</p><p><br></p><p>You will be responsible for leading a Data Platform team for the bank, including innovating on the platform to enable next-generation capabilities. You will participate in the definition of the strategic roadmap for data technology and be responsible for executing cloud data initiatives. You will partner with other Technology organizations and CDAO leads to architect and enable an effective data ecosystem.</p><p><br></p><p>Key Responsibilities</p><ul><li>Manage a team of colleagues and contractors, providing resource allocation, coaching, and development.</li><li>Understand and translate the technical design from the Data Architect team into implemented physical data models that meet data governance, enterprise architecture, and business requirements for data warehousing. Manage data within the data warehouse to ensure platform efficiency.</li><li>Work with operational data and data acquisition teams to manage incoming sources, and with downstream systems to understand and support their needs for reporting and analytics.</li><li>Collaborate with key partners to prototype and recommend accurate data solutions that embrace new technologies.</li><li>Remain abreast of technology developments and proactively align future-state goals with strategic business goals.</li><li>Develop and maintain a technology roadmap for the aligned data platforms, including execution of strategic, tactical, and continuous improvement (CI) initiatives to further enhance the platform.</li><li>Maintain stable operations, ensuring operational metrics are proactively managed and reported.</li></ul>
<p>Robert Half is seeking a results-driven Data Analyst for a dynamic organization in St. Louis, Missouri. If you have a passion for uncovering actionable insights, are proficient with advanced analytics tools, and enjoy collaborating to support business strategy, we want to hear from you. This opportunity offers the chance to join a team shaping the future of data-driven decision making.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Collect, analyze, and interpret complex data sets utilizing advanced statistical methods and data visualization tools.</li><li>Generate business insights and present clear, actionable recommendations to stakeholders.</li><li>Develop and maintain dashboards and reports using tools such as Power BI, SQL, Excel, and others.</li><li>Collaborate cross-functionally with IT, Finance, Operations, and other business units to solve critical business challenges.</li><li>Identify trends, patterns, and opportunities for process improvements and new initiatives.</li></ul>
<p>Our company is seeking an innovative and driven AI/ML Engineer to join our technology team in St. Louis, Missouri. If you enjoy developing machine learning models and leveraging AI to solve real business challenges, we invite you to apply.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, build, and deploy AI and ML solutions for various business applications</li><li>Collaborate with data scientists, analysts, and business stakeholders to define project requirements and deliver impactful results</li><li>Optimize and tune algorithms for accuracy, scalability, and performance</li><li>Stay current with advancements in machine learning, deep learning, and related technologies</li><li>Communicate findings and recommendations transparently to technical and non-technical teams</li></ul>
<p>Our company is seeking a proactive and skilled Cybersecurity Engineer to join our IT team in St. Louis, Missouri. This role offers the opportunity to develop and enhance security measures, protect critical infrastructure, and drive innovative solutions in a fast-moving environment.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, implement, and manage security systems to safeguard network and data resources</li><li>Conduct vulnerability assessments and penetration tests to identify areas for improvement</li><li>Respond promptly to security incidents and develop mitigation strategies</li><li>Collaborate with IT and other departments to ensure compliance with industry standards and regulations</li><li>Stay current with emerging cybersecurity threats, tools, and best practices</li></ul><p><br></p>