<p>Our client is seeking a Data Architect for a multi-year assignment.</p><p><br></p><p>Job Overview:</p><p>The Data Architect will lead the design and implementation of enterprise data systems, ensuring alignment with business needs, data governance, and security standards. This role involves working closely with IT teams, business analysts, and data consumers to deliver scalable and secure data solutions.</p><p>This role will be onsite in Marysville, OH four days per week.</p><p><br></p><p>Daily Responsibilities:</p><ul><li>Translate high-level business requirements into data models, metadata, test data, and data quality standards</li><li>Manage senior business stakeholders to ensure project alignment with strategic roadmaps</li><li>Lead peer reviews and quality assurance of architectural artifacts</li><li>Define and manage standards, guidelines, and processes for data quality</li><li>Collaborate with IT and analytics teams to develop data solutions</li><li>Evaluate and recommend emerging technologies for data management and analytics</li><li>Establish governance frameworks for internal teams and vendor partners</li></ul><p>Project Focus:</p><ul><li>Apply data protection rules across storage, compute, and consumption layers</li><li>Design data protection solutions at database, table, column, and API levels</li><li>Architect data systems including databases, warehouses, and lakes</li><li>Select and implement database management systems with optimized schemas and security</li><li>Enhance data pipeline performance and ensure data governance</li></ul>
<p>We are looking for an experienced Senior Data Engineer to join our team. This role involves designing and implementing scalable data solutions, optimizing data workflows, and driving innovation in data architecture. The ideal candidate will possess strong leadership qualities and a passion for problem-solving in a fast-paced, cutting-edge environment.</p><p><br></p><p>Responsibilities:</p><p>• Develop high-performance data systems, including databases, APIs, and data integration pipelines, to support scalable solutions.</p><p>• Design and implement metadata-driven architectures and automate deployment processes using infrastructure-as-code principles.</p><p>• Promote best practices in software engineering, such as code reviews, testing, and continuous integration/delivery (CI/CD).</p><p>• Establish and maintain a robust data governance framework to ensure compliance and data integrity.</p><p>• Monitor processes and implement improvements, including query optimization, code refactoring, and efficiency enhancements.</p><p>• Leverage cloud platforms, particularly Azure and Databricks, to improve system architecture and scalability.</p><p>• Conduct data quality checks and build procedures to address and resolve data issues effectively.</p><p>• Create and maintain documentation for data architecture, standards, and best practices.</p><p>• Provide technical leadership to the team, guiding design discussions and fostering innovation in data infrastructure.</p><p>• Identify and implement opportunities for process optimization and automation to improve operational efficiency.</p>
We are looking for an experienced Data/Information Architect to join our team in Janesville, Wisconsin, on a contract-to-permanent basis. In this role, you will be responsible for designing and implementing data architecture solutions that ensure data quality, integrity, and accessibility. This position requires a strong understanding of data mapping, report building, and integration tools to support the organization's data-driven initiatives.<br><br>Responsibilities:<br>• Develop and maintain complex reports and queries to support data analysis and business intelligence needs.<br>• Perform detailed data mapping to identify sources, logic, and relationships within the organization’s data environment.<br>• Utilize Azure Data Factory and other integration tools to streamline data integration processes.<br>• Ensure data quality and integrity through effective data cleansing and validation techniques.<br>• Collaborate with cross-functional teams to design scalable data architectures using Microsoft technologies.<br>• Work with D365 and other cloud ERP systems to enhance data management capabilities.<br>• Leverage Power BI for advanced reporting and visualization.<br>• Optimize the organization’s use of Snowflake and other data platforms to improve performance.<br>• Implement ETL processes to support data migration and transformation efforts.<br>• Provide technical expertise in managing and improving the organization's Microsoft-based data environment.
<p>We are looking for an experienced Salesforce Pardot Architect to join our team on a contract basis. In this role, you will play a pivotal part in supporting marketing automation projects, focusing on platform migrations and integrations. This position offers a unique opportunity to collaborate with a skilled team while contributing to the successful implementation of Salesforce Marketing Cloud Account Engagement solutions.</p><p><br></p><p>Responsibilities:</p><p>• Lead the migration of data and workflows from HubSpot and Marketo to Salesforce Pardot (Marketing Cloud Account Engagement).</p><p>• Collaborate with teams to ensure proper data governance and establish best practices for data conversion processes.</p><p>• Design and implement workflows, architecture standards, and data migration strategies that align with project requirements.</p><p>• Evaluate and map 197 fields involved in the migration process to ensure seamless integration.</p><p>• Provide technical expertise in server virtualization, enterprise storage, and DevOps practices.</p><p>• Develop and maintain architecture standards to support scalability and reliability.</p><p>• Work closely with internal teams to troubleshoot and resolve migration-related challenges.</p><p>• Ensure compliance with Group Policy Objects (GPO) and other organizational policies.</p><p>• Offer guidance and mentorship to team members throughout the project lifecycle.</p><p>• Document processes and solutions to ensure clarity and knowledge transfer.</p>
<p><strong>SOLUTION ARCHITECT – Digital Transformation - 100% DIRECT HIRE </strong></p><p><strong>Drive Innovation in Commercial Platforms with a Modern Azure, React, APIs, .NET AI Tech Stack! </strong></p><p><strong>LOCATION: DES MOINES, IOWA - HYBRID in office. NOTE: NOT 100% REMOTE. </strong></p><p><strong>HYBRID: ONLY in the office a few times each month! SUPER FLEXIBLE! </strong></p><p>Ready to shape how global enterprises engage customers & partners digitally? Join a worldwide leader undergoing rapid digital transformation that seeks a Solution Architect to design and deliver next-gen digital platforms and web apps and elevate CX. Bring your strategic vision to the intersection of technology innovation and business growth.</p><p>For Immediate & Confidential Consideration: <strong> 📩 Carrie Danger, SVP Permanent Placement, on LinkedIn. 📞 Office: 515-259-6087 | Cell: 515-991-0863. 📧 Email found on my LinkedIn profile</strong></p><ul><li><strong>Compensation: Competitive base up to $155K</strong></li><li><strong>Bonus potential up to 20%</strong></li></ul><p><strong>Your Focused Mission:</strong></p><ul><li>Design scalable, secure cloud solutions with Azure, Databricks, React & Node.js</li><li>Lead delivery of cloud-native, API-first architectures for seamless CX</li><li>Translate complex technology concepts for business stakeholders</li><li>Transform business needs into reusable solutions and architecture documentation</li><li>Guide teams on best practices for data environments, APIs, and microservices</li><li>Collaborate cross-functionally to establish Communities of Practice and co-create architecture roadmaps with leadership</li><li>Ensure governance, compliance, and ongoing improvement of standards</li><li>Mentor and coach architecture & development teams</li></ul><p><strong>WHAT YOU NEED: </strong></p><ul><li>5+ years designing enterprise software architecture (cloud, APIs, web)</li><li>3+ years with 
Azure cloud platforms, including hybrid environments</li><li>Track record with API-first design, microservices, event-driven architecture, SOA</li><li>Hands-on Azure Databricks, Data Factory, Delta Lake, Synapse Analytics, Big Data</li><li>Skilled in React, Node.js, Python, Scala, SQL, or similar</li><li>AI development experience a plus</li><li>Proven ability to connect technology with business</li></ul><p><strong>WHY JOIN? </strong></p><ul><li>Strategic, high-impact role influencing global digital strategy</li><li>Collaborative & innovative teams</li><li>Hybrid flexibility & technical growth into digital, data, and cloud domains. <strong>For immediate and confidential consideration for this Solution Architect – Digital Transformation role, connect directly with Carrie Danger, SVP Permanent Placement Team (Iowa Region): 515-259-6087 (office), 515-991-0863 (cell), or via email (on LinkedIn). One-click apply is also available on the Robert Half site. Your information will never be shared without your direct consent.</strong></li></ul><p><br></p>
We are looking for an experienced Data/Information Architect to join our team in Houston, Texas. In this Contract to permanent employment position, you will play a pivotal role in designing, organizing, and optimizing data systems from the ground up to support business objectives. This role requires hands-on expertise in modern data technologies and methodologies to ensure the delivery of high-quality, scalable solutions.<br><br>Responsibilities:<br>• Develop and implement strategies for cleaning, organizing, and optimizing data across the organization.<br>• Design and build data platforms and architectures to meet business requirements and support analytics.<br>• Analyze existing data systems and recommend improvements for efficiency and scalability.<br>• Utilize AI tools and techniques to enhance data processes and commodity training.<br>• Collaborate with stakeholders to assess company needs and define data-related approaches.<br>• Work with cloud technologies and AWS to establish robust data solutions.<br>• Apply advanced knowledge of Spark to manage and process large-scale data efficiently.<br>• Partner with internal teams to ensure alignment with Agile Scrum methodologies.<br>• Create comprehensive documentation, including Business Requirement Documents, to guide data initiatives.<br>• Ensure data governance and security through the use of MDM technologies and best practices.
Qualifications<br><br>These specifications are general guidelines based on the minimum experience normally considered essential to the satisfactory performance of this position. The requirements listed below are representative of the knowledge, skill and/or ability required to perform the position in a satisfactory manner. Individual abilities may result in some deviation from these guidelines.<br><br>• A self-starting team player who possesses a bachelor’s degree in Information Technology, Engineering Technology, Computer Information Technology, or a related field<br>• Minimum of 10 years of technical experience directing data analytics teams (Enterprise Data Architecture, Data Analytics, Data Governance, ALM Teams, Release Management, Support Functions) within a hybrid cloud environment<br>• Minimum of 5 years of banking experience<br>• Background in large-scale migrations, hybrid cloud, and application modernization projects<br>• Experience with regulated or complex environments<br>• Familiarity with ITIL, Agile, and modern delivery frameworks<br>• Proven experience delivering enterprise Azure solutions in an IT- and business-facing capacity<br>• Strong understanding of core Azure services and architecture patterns across compute, networking, identity, storage, containers, and automation<br>• Knowledge of Infrastructure as Code (ARM, Bicep, Terraform), DevOps pipelines, and CI/CD practices<br>• Strong understanding of Gen AI principles (Microsoft Copilot), large language models (LLMs), and their applications in analytics<br>• Strong knowledge of security, governance, identity (Entra ID), and compliance considerations for Azure<br>• Microsoft Cloud certifications
<p>Description of Position</p><p> This role leads Oracle ERP architecture, cloud migrations, and system optimization initiatives. The architect partners with functional and technical teams to ensure performance, scalability, and alignment with business goals.</p>
Position: SENIOR DATA INTEGRATION ENGINEER - Architect the backbone of a Mobile-First Digital Experience<br>Location: REMOTE<br>Salary: UP TO $175K + BONUS + EXCEPTIONAL BENEFITS<br><br>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. ***<br><br>A nationally recognized company with a long-standing legacy of success is launching a bold new digital initiative—and this is your opportunity to help build it from the ground up.<br>With full executive support and the resources of a Fortune 500 parent, this newly formed department is creating a mobile-first product from scratch. It’s a greenfield, 0-to-1 launch with the pace and creativity of a startup, but with the stability and funding already secured. The first MVP is nearing launch, and we’re assembling a team of 20 innovators to bring it to life.<br>We’re looking for a Senior Data Integration Engineer to lead the design and development of a scalable, secure, and high-performing data ecosystem. 
You’ll play a critical role in building a custom Customer Data Platform (CDP), integrating internal and external systems, and enabling real-time personalization, analytics, and enterprise-wide insights.<br>What You’ll Do<br> • Build and maintain scalable data pipelines using Python and cloud-native tools.<br> • Design and implement a robust CDP to unify customer data across platforms.<br> • Develop APIs and data workflows for real-time and batch integrations.<br> • Optimize data environments for performance, security, and reliability.<br> • Collaborate with cross-functional teams to define data requirements and integration strategies.<br> • Implement data governance and privacy practices to ensure compliance.<br> • Leverage CI/CD pipelines and automation to streamline deployment and maintenance.<br> • Document architecture, data flows, and integration patterns for scalability.<br>What You Bring<br> • 10+ years of experience with RESTful APIs, SQL, Python, or PowerShell.<br> • 5+ years working with cloud platforms (Azure, AWS, GCP) and cloud-native tools (ADF, Glue, dbt, Airflow).<br> • Proven experience building or integrating with CDPs (Segment, Salesforce CDP, Adobe Experience Platform, or custom solutions).<br> • Experience with data integration tools (Apache NiFi, Talend, Informatica).<br> • Familiarity with event-driven architectures, message queues (Kafka), and microservices.<br> • Strong understanding of data privacy, security, and governance.<br> • Bonus: Experience with Snowflake, BigQuery, Databricks, and Infrastructure as Code tools (Terraform, Pulumi, CloudFormation).<br><br><br>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. Also, you may contact me by office: 515-303-4654 or mobile: 515-771-8142. Or one click apply on our Robert Half website. No third party inquiries please. 
Our client cannot provide sponsorship and cannot hire C2C. ***
<p>Robert Half is looking for a DBA to join our client's team in Murrieta. </p><p><br></p><p>SQL Server & ERP Administration</p><p>• Administer, monitor, and maintain multiple Microsoft SQL Server instances, with the Deltek Vantagepoint ERP database as the primary system of record.</p><p>• Write, optimize, and tune complex T-SQL queries, stored procedures, and views for both ad hoc analysis and application logic.</p><p>• Create and schedule SQL Server Agent jobs to automate data ingestion, maintenance routines, and reporting pipelines.</p><p>• Implement, document, and test backup and disaster recovery strategies to ensure business continuity.</p><p>• Manage database indexes, performance tuning, and query optimization to support application efficiency and stability.</p><p>• Plan and execute Deltek Vantagepoint upgrades, schema updates, and integrations, working closely with the ERP administrator and IT team.</p><p>• Design and publish SSRS reports and datasets to meet business and finance reporting requirements.</p><p><br></p><p><br></p><p>Cross-System Data Architecture</p><p>• Support integration between SQL Server and the Azure Databricks Lakehouse, ensuring consistent and secure data flow.</p><p>• Collaborate with data engineers and analysts to maintain data integrations to other critical systems.</p><p>• Develop and maintain SQL transformations and stored procedures that serve as upstream logic for analytics and reporting layers.</p><p>• Contribute to the management of Unity Catalog, metadata inventory, and data lineage documentation.</p><p>• Partner with Power BI administrators to optimize Fabric dataset refreshes, gateways, and source connections.</p><p><br></p><p>Data Governance, Security, & Monitoring</p><p>• Enforce data security and access controls aligned with IT and Data Governance policies.</p><p>• Participate in data issue management and quality improvement processes, ensuring system reliability and integrity.</p><p>• Monitor system performance using 
both native and custom monitoring tools; proactively identify and resolve issues.</p><p>• Maintain clear, comprehensive documentation for database configurations, schemas, and operational procedures.</p><p><br></p><p><br></p>
<p>The Business Architect will operate at a systems and enterprise level, owning end-to-end business and application workflows across multiple teams. This role focuses on defining how business processes, data, and systems interact, ensuring alignment between product strategy and technical delivery. The Business Architect will partner closely with product, engineering, and leadership to identify dependencies, integration points, and impacts across platforms while disseminating architectural guidance to multiple delivery teams.</p>
We are looking for a skilled Data Engineer to join our team in Los Angeles, California. This role focuses on designing and implementing advanced data solutions to support innovative advertising technologies. The ideal candidate will have hands-on experience with large datasets, cloud platforms, and machine learning, and will play a critical role in shaping our data infrastructure.<br><br>Responsibilities:<br>• Develop and maintain robust data pipelines to ensure seamless data extraction, transformation, and loading processes.<br>• Design scalable architectures that support machine learning models and advanced analytics.<br>• Collaborate with cross-functional teams to deliver business intelligence tools, reporting solutions, and analytical dashboards.<br>• Implement real-time data streaming solutions using platforms like Apache Kafka and Apache Spark.<br>• Optimize database performance and ensure efficient data storage and retrieval.<br>• Build and manage resilient data science programs and personas to support AI initiatives.<br>• Lead and mentor a team of data scientists, machine learning engineers, and data architects.<br>• Design and implement strategies for maintaining large datasets, ensuring data integrity and accessibility.<br>• Create detailed technical documentation for workflows, processes, and system architecture.<br>• Stay up-to-date with emerging technologies to continuously improve data engineering practices.
<p>We are seeking a highly skilled Senior Data Engineer with an AI/ML focus to design, build, and scale next-generation data and machine learning infrastructure. This role is ideal for a hands-on technical expert who thrives in building complex systems from the ground up, has deep experience in Google Cloud Platform (GCP), and is excited about stepping into management and technical leadership. You will work across engineering, data science, and executive leadership teams to architect cloud-native solutions, optimize real-time data pipelines, and help shape our long-term AI/ML engineering strategy.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Cloud & Platform Engineering</strong></p><ul><li>Architect, build, and maintain high-performance data and ML infrastructure on GCP using best-in-class cloud-native tools and services.</li><li>Lead the design of scalable cloud architectures, with a strong focus on resilience, automation, and cost-effective operation.</li><li>Build applications and services from scratch, ensuring they are modular, maintainable, and scalable.</li></ul><p><strong>Real-Time & Distributed Systems</strong></p><ul><li>Design and optimize real-time data processing pipelines capable of handling high-volume, low-latency traffic.</li><li>Implement and fine-tune load balancing strategies to support fault tolerance and performance across distributed systems.</li><li>Lead system design for high availability, horizontal scaling, and microservices communication patterns.</li></ul><p><strong>AI/ML Engineering</strong></p><ul><li>Partner with ML engineers and data scientists to deploy, monitor, and scale machine learning workflows.</li><li>Create and maintain ML-focused CI/CD pipelines, model deployment frameworks, and automated testing harnesses.</li></ul><p><strong>Open-Source & Code Quality</strong></p><ul><li>Contribute to and maintain open-source projects, including active GitHub repositories.</li><li>Champion best practices across 
code reviews, version control, and documentation.</li><li>Establish, document, and enforce advanced testing methodologies, including integration, regression, performance, and automated testing frameworks.</li></ul><p><strong>Leadership & Collaboration</strong></p><ul><li>Serve as a technical leader and mentor within the engineering team.</li><li>Collaborate effectively with senior leadership and executive stakeholders, translating complex engineering concepts into strategic insights.</li><li>Provide guidance and direction to junior engineers, with an eye toward growing into a people leadership role.</li></ul>
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
We are looking for an experienced individual to lead our Data Management and Client Reporting initiatives. This role requires a strategic thinker with a deep understanding of investment operations and expertise in Addepar. The successful candidate will oversee the integrity and evolution of reporting systems, manage a team member, and align data strategies with organizational objectives.<br><br>Responsibilities:<br>• Develop and implement firm-wide strategies for the Addepar platform to optimize data architecture, reporting capabilities, and system performance.<br>• Establish and maintain data governance standards to ensure accuracy, consistency, and compliance across client portfolios.<br>• Design and deliver tailored client reporting solutions in collaboration with advisory teams, ensuring alignment with regulatory and firm standards.<br>• Manage and mentor the Client Reporting Analyst, fostering growth and a culture of continuous improvement.<br>• Oversee the onboarding process for new clients and entities, ensuring seamless integration into Addepar and related systems.<br>• Collaborate with cross-functional teams, including Investment Operations, Technology, Compliance, and Advisors, to support firm-wide objectives.<br>• Identify opportunities to improve reporting processes, enhance efficiency, and drive innovation.<br>• Serve as the primary escalation point for all Addepar-related initiatives and provide expertise to resolve complex issues.
<p><strong>Systems Analyst / Architect – Intermediate</strong></p><p>26‑Week Contract | Onsite | Quincy, MA</p><p><br></p><p><strong>Overview</strong></p><p>We are seeking an Intermediate Systems Analyst/Architect for a 26‑week onsite engagement in Quincy, MA. The ideal candidate will support workflow optimization, software deployment governance, data analysis, and issue resolution across multiple teams. This role involves significant interaction with end users and requires strong communication, analytical ability, and technical fluency.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Become proficient in various proprietary data sources and integrate their use to track initiatives and the context surrounding them.</li><li>Guide end users through workflow processes to reduce friction in software development, deployment, and governance activities.</li><li>Generate, customize, and deliver reports or logs related to initiative metadata and historical activity for stakeholder review.</li><li>Analyze the root cause of system defects, workflow gaps, or technical issues.</li><li>Recommend mitigation strategies and improvements to streamline workflows and resolve recurring pain points.</li><li>Draft and communicate proposals to team members and development groups for enhancements, requirements, or fixes to improve system performance and address defects.</li></ul><p><br></p>
Position: SENIOR CDP DATA ANALYST - Help Build a Smarter Connected Digital Experience<br>Location: REMOTE<br>Salary: UP TO $140K + EXCEPTIONAL BENEFITS<br><br>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. ***<br><br>A nationally recognized company with a long history of success is launching a bold new digital initiative—and this is your opportunity to help shape it from the ground up.<br>This newly formed department is building a mobile-first product from scratch. It’s a greenfield, 0-to-1 launch with the pace and creativity of a startup, backed by the resources and stability of a Fortune 500 parent. The first MVP is nearing launch, and we’re assembling a team of 20 innovators to bring it to life.<br>As a Senior CDP Data Analyst, you’ll be a key player in designing and evolving a custom-built Customer Data Platform. Your work will unify customer insights across systems and empower smarter, faster decision-making across the organization.<br>What You’ll Be Doing<br> • Collaborate with data engineers, architects, and business stakeholders to define data requirements and use cases.<br> • Design data models and integration logic to support a unified customer view.<br> • Analyze customer behavior across platforms to uncover insights and segmentation opportunities.<br> • Build dashboards and visualizations that drive strategic decisions.<br> • Ensure data quality, consistency, and governance across the Customer Data Platform.<br> • Translate business needs into technical specifications and support iterative development.<br> • Advocate for data best practices and help standardize customer metrics across teams.<br>What You Bring<br> • 5+ years of experience in data analysis, with a focus on customer data and cross-platform integration.<br> • Advanced skills in SQL and Python, R, or similar languages.<br> • Experience with data 
visualization tools like Power BI or Tableau.<br> • Familiarity with cloud data platforms (Azure, AWS, GCP) and modern data warehousing.<br> • Strong communication skills and ability to work across technical and non-technical teams.<br> • Bonus: Experience with customer journey analytics, segmentation modeling, personalization strategies, and data privacy frameworks (GDPR, CCPA).<br>Why Join Now?<br> • Be part of a ground-floor team shaping a transformative digital product.<br> • Work in a fast-paced, agile environment with full executive support.<br> • Influence how data drives decisions across a nationally recognized organization.<br> • Enjoy the freedom to innovate—without legacy constraints.<br><br>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. Also, you may contact me by office: 515-303-4654 or mobile: 515-771-8142. Or one click apply on our Robert Half website. No third party inquiries please. Our client cannot provide sponsorship and cannot hire C2C. ***
<p>This role is responsible for designing and implementing enterprise integration solutions across cloud and on-prem systems. The Integration Architect owns integration strategy, API design, data flow architecture, and middleware configuration to support scalable, reliable system communication.</p><p>The position partners closely with application, data, and infrastructure teams to translate business and technical requirements into secure, high-availability integration patterns.</p>
We are looking for a talented Sr. Software Engineer to join our team in New York, New York. In this role, you will focus on backend development while working with cutting-edge technologies to design and implement scalable solutions. Ideal candidates will have experience in system architecture and cloud-based platforms, with a passion for building robust and innovative systems.<br><br>Responsibilities:<br>• Develop and maintain backend systems using Node.js and Golang to ensure optimal performance and scalability.<br>• Collaborate with frontend teams utilizing React.js, TypeScript, and Next.js to create seamless user experiences.<br>• Design and implement cloud-based solutions using Docker, Kubernetes, and Google Cloud Platform.<br>• Architect and optimize backend systems to support real-time processing and bidding functionalities.<br>• Work with Kafka to manage and process large data streams efficiently.<br>• Utilize AI and machine learning to enhance ad optimization and performance.<br>• Contribute to the design and development of systems within AdTech platforms.<br>• Perform system design tasks to ensure reliability and scalability of applications.<br>• Troubleshoot and resolve technical issues to maintain system stability and functionality.
<p><strong>Robert Half</strong> is actively partnering with an Austin-based client to identify an <strong>AI/ML Engineer (contract)</strong>. In this role, you’ll architect and deliver robust, scalable platforms that enable machine learning and data science at enterprise scale. <strong>This role is hybrid in Austin, TX.</strong></p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li><strong>Architect & Build:</strong> Design and implement distributed systems and data processing frameworks using modern cloud technologies (GCP, Azure). Ensure solutions are secure, scalable, and optimized for performance.</li><li><strong>Lead Projects:</strong> Own technical initiatives end-to-end—from concept through delivery. Influence product roadmaps and make critical decisions on architecture and trade-offs.</li><li><strong>Collaborate Across Teams:</strong> Participate in planning, design reviews, and code reviews. Anticipate cross-team dependencies and resolve conflicts proactively.</li><li><strong>Mentor & Grow Talent:</strong> Interview candidates, onboard new hires, and provide technical guidance to engineers and interns. Foster a culture of learning and excellence.</li></ul>
<p>Our client is seeking an experienced <strong>Salesforce</strong> <strong>Technical Lead</strong> with strong expertise in <strong>Salesforce development</strong> and <strong>release engineering</strong>. This individual will serve as a key liaison between the client's local team and Deloitte’s offshore development team, ensuring seamless coordination of Salesforce releases and efficient data management. The ideal candidate combines hands-on technical skills with leadership capabilities to drive successful implementations and maintain stakeholder confidence in Salesforce systems.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><p><strong>Technical Leadership & Coordination</strong></p><ul><li>Act as the primary point of contact between local stakeholders and offshore development teams.</li><li>Provide guidance and oversight on Salesforce development activities without direct people management responsibilities.</li><li>Ensure alignment of deliverables, timelines, and quality standards across teams.</li></ul><p><strong>Release Management & Engineering</strong></p><ul><li>Own and manage Salesforce release processes, including planning, scheduling, and execution.</li><li>Implement best practices for release engineering to minimize downtime and ensure smooth deployments.</li><li>Collaborate with offshore teams to avoid diverting development resources for release tasks.</li></ul><p><strong>Salesforce Development</strong></p><ul><li>Perform hands-on Salesforce development as needed, including customizations, integrations, and troubleshooting.</li><li>Demonstrate strong, hands-on competence across Salesforce technical capabilities.</li></ul><p><strong>Data Management</strong></p><ul><li>Extract, transform, and load (ETL) data within Salesforce environments.</li><li>Utilize tools such as Workbench or equivalent for data operations, ensuring accuracy and integrity.</li></ul><p><strong>DevOps & Process Improvement</strong></p><ul><li>Understand DevOps principles and practices 
relevant to Salesforce environments.</li><li>Gather requirements and lead implementation of DevOps-related initiatives when necessary.</li></ul><p><br></p>
We are seeking a highly skilled Lead-to-Order Architect to support a major organizational transformation toward a workstream-oriented model. This role focuses on the Lead-to-Order (LTO) value stream, covering processes from lead generation through quoting and estimating (not Quote-to-Cash). The position is part of IT and will play a critical role in designing and implementing integrated solutions across multiple plants.<br><br>Key Responsibilities:<br> • Develop and lead the forward-looking strategy and roadmap for the Lead-to-Order value stream, ensuring alignment with business goals and scalability.<br> • Lead technical implementation of LTO platforms (CRM, Estimating, Quoting, Pricing).<br> • Oversee architecture, design, development, implementation, and support of business-critical applications.<br> • Partner with department leaders to understand business needs and align application solutions accordingly.<br> • Prepare functional and technical specifications, configure software packages, and ensure successful integrations.<br> • Collaborate with IT teams for alignment on security, data, project management, infrastructure, and operations.<br> • Drive project milestones and serve as the go-to technical SME for the APEX program.<br> • Mentor team members and guide them through change management initiatives.<br><br>Day-to-Day Activities:<br> • Support implementation of Paperless Parts (estimating tool) across five plants.<br> • Work with UiPath (automation) and SnapLogic (iPaaS) for integrations.<br> • Partner with MDM resources to ensure proper master data management.<br> • Provide technical oversight and guidance to project managers and business analysts.<br> • Help define next steps in the process and ensure alignment with organizational goals.<br><br>Required Skills & Experience:<br> • Strong understanding of API design, integration platforms, automation tools, and algorithms.<br> • Familiarity with master data management and data governance concepts.<br> • Proficiency in creating solution diagrams and architectural documentation.<br> • Excellent analytical, conceptual thinking, and strategic planning skills.<br> • Strong communication and facilitation skills in a collaborative environment.<br> • 10+ years of experience in IT and enterprise applications.<br> • 5+ years in system design, architecture development, system integration, or technical leadership roles.<br> • Manufacturing industry experience highly preferred.<br> • Experience with Lead-to-Order, Quote-to-Cash, or Order Management disciplines.<br> • Exposure to ERP systems (SAP RTR/MMPP/P2P modules or Oracle Fusion Manufacturing) is ideal.<br><br>Education:<br>Bachelor’s degree in Computer Science, Information Systems, Business Administration, or related field.<br>Worksite: Remote (with potential for hybrid collaboration as needed)<br>Contract Length: 6+ months
<p>We are seeking a skilled Database Engineer to support and enhance data infrastructure and donor management systems. This role focuses on database development, optimization, and maintenance within a Microsoft-based environment with heavy MySQL usage. The Database Engineer will contribute to ongoing data automation and system integrations that support organizational operations.</p><p><strong>Key Responsibilities</strong></p><ul><li>Develop, optimize, and maintain MySQL and Microsoft SQL Server databases</li><li>Design, build, and manage ETL and SSIS processes for data movement across environments</li><li>Perform ongoing data maintenance, including data cleanup, schema changes, and structural updates</li><li>Query and extract data from donor management or CRM systems to support communications and reporting</li><li>Create, tune, and maintain stored procedures to improve performance and reliability</li><li>Automate recurring data tasks and workflows</li><li>Support API-driven integrations between internal systems</li><li>Collaborate with Data Engineering, Operations, HRIS, Service Desk, and Web teams</li><li>Troubleshoot and resolve data-related issues across platforms</li></ul><p><strong>Technical Environment</strong></p><ul><li><strong>Tech Stack:</strong> Microsoft and Open Source (Linux, .NET, MySQL)</li><li><strong>Primary Tools:</strong> MySQL, Microsoft SQL Server, SSIS, ETL pipelines, APIs</li><li><strong>Web Stack Collaboration:</strong> React and Strapi-based applications</li></ul>
<p><strong>Key Responsibilities</strong></p><ul><li>Design, implement, and own ETL/ELT pipelines—both batch and real‑time—using best-in-class tools and frameworks.</li><li>Architect and maintain scalable data platforms: data lakes, warehouses, and modern lakehouses.</li><li>Drive adoption of event-driven systems, ensuring schema evolution, idempotency, and end-to-end pipeline reliability.</li><li>Leverage AI-powered tooling and automation to optimize operations and reduce manual overhead.</li><li>Implement robust DataOps practices, including CI/CD, version control (Git), testing, and observability.</li><li>Collaborate with platform and AI teams to build AI‑ready, trustworthy data systems with lineage, governance, and cost-awareness.</li><li>Monitor and optimize performance and cost across cloud services.</li></ul>