<p>Our client is seeking a Data Architect for a multi-year assignment.</p><p><br></p><p>Job Overview:</p><p>The Data Architect will lead the design and implementation of enterprise data systems, ensuring alignment with business needs, data governance, and security standards. This role involves working closely with IT teams, business analysts, and data consumers to deliver scalable and secure data solutions.</p><p>This role will be onsite in Marysville, OH four days per week.</p><p><br></p><p>Daily Responsibilities:</p><ul><li>Translate high-level business requirements into data models, metadata, test data, and data quality standards</li><li>Manage senior business stakeholders to ensure project alignment with strategic roadmaps</li><li>Lead peer reviews and quality assurance of architectural artifacts</li><li>Define and manage standards, guidelines, and processes for data quality</li><li>Collaborate with IT and analytics teams to develop data solutions</li><li>Evaluate and recommend emerging technologies for data management and analytics</li><li>Establish governance frameworks for internal teams and vendor partners</li></ul><p>Project Focus:</p><ul><li>Apply data protection rules across storage, compute, and consumption layers</li><li>Design data protection solutions at database, table, column, and API levels</li><li>Architect data systems including databases, warehouses, and lakes</li><li>Select and implement database management systems with optimized schemas and security</li><li>Enhance data pipeline performance and ensure data governance</li></ul>
<p>We are looking for an experienced Senior Data Engineer to join our team. This role involves designing and implementing scalable data solutions, optimizing data workflows, and driving innovation in data architecture. The ideal candidate will possess strong leadership qualities and a passion for problem-solving in a fast-paced, cutting-edge environment.</p><p><br></p><p>Responsibilities:</p><p>• Develop high-performance data systems, including databases, APIs, and data integration pipelines, to support scalable solutions.</p><p>• Design and implement metadata-driven architectures and automate deployment processes using infrastructure-as-code principles.</p><p>• Promote best practices in software engineering, such as code reviews, testing, and continuous integration/delivery (CI/CD).</p><p>• Establish and maintain a robust data governance framework to ensure compliance and data integrity.</p><p>• Monitor processes and implement improvements, including query optimization, code refactoring, and efficiency enhancements.</p><p>• Leverage cloud platforms, particularly Azure and Databricks, to improve system architecture and scalability.</p><p>• Conduct data quality checks and build procedures to address and resolve data issues effectively.</p><p>• Create and maintain documentation for data architecture, standards, and best practices.</p><p>• Provide technical leadership to the team, guiding design discussions and fostering innovation in data infrastructure.</p><p>• Identify and implement opportunities for process optimization and automation to improve operational efficiency.</p>
<p>We are looking for an experienced Salesforce Pardot Architect to join our team on a contract basis. In this role, you will play a pivotal part in supporting marketing automation projects, focusing on platform migrations and integrations. This position offers a unique opportunity to collaborate with a skilled team while contributing to the successful implementation of Salesforce Marketing Cloud Account Engagement solutions.</p><p><br></p><p>Responsibilities:</p><p>• Lead the migration of data and workflows from HubSpot and Marketo to Salesforce Pardot (Marketing Cloud Account Engagement).</p><p>• Collaborate with teams to ensure proper data governance and establish best practices for data conversion processes.</p><p>• Design and implement workflows, architecture standards, and data migration strategies that align with project requirements.</p><p>• Evaluate and map 197 fields involved in the migration process to ensure seamless integration.</p><p>• Provide technical expertise in server virtualization, enterprise storage, and DevOps practices.</p><p>• Develop and maintain architecture standards to support scalability and reliability.</p><p>• Work closely with internal teams to troubleshoot and resolve migration-related challenges.</p><p>• Ensure compliance with Group Policy Objects (GPO) and other organizational policies.</p><p>• Offer guidance and mentorship to team members throughout the project lifecycle.</p><p>• Document processes and solutions to ensure clarity and knowledge transfer.</p>
<p><strong>SOLUTION ARCHITECT – Digital Transformation - 100% DIRECT HIRE</strong></p><p><strong>Drive Innovation in Commercial Platforms with a Modern Azure, React, APIs, .NET AI Tech Stack!</strong></p><p><strong>LOCATION: DES MOINES, IOWA - HYBRID in office. NOTE: NOT 100% REMOTE.</strong></p><p><strong>HYBRID: ONLY in the office a few times each month! SUPER FLEXIBLE!</strong></p><p>Ready to shape how global enterprises engage customers & partners digitally? Join a worldwide leader undergoing rapid digital transformation that seeks a Solution Architect to design and deliver next-gen digital platforms and web apps and to elevate CX. Bring your strategic vision to the intersection of technology innovation and business growth.</p><p>For Immediate & Confidential Consideration: <strong>📩 Message Carrie Danger, SVP Permanent Placement, on LinkedIn. 📞 Office: 515-259-6087 | Cell: 515-991-0863. 📧 Email found on my LinkedIn profile</strong></p><ul><li><strong>Compensation: Competitive base up to $155K</strong></li><li><strong>Bonus potential up to 20%</strong></li></ul><p><strong>Your Focused Mission:</strong></p><ul><li>Design scalable, secure cloud solutions with Azure, Databricks, React & Node.js</li><li>Lead delivery of cloud-native, API-first architectures for seamless CX</li><li>Translate complex technology concepts for business stakeholders</li><li>Transform business needs into reusable solutions and architecture documentation</li><li>Guide teams on best practices for data environments, APIs, and microservices</li><li>Collaborate cross-functionally to establish Communities of Practice and co-create architecture roadmaps with leadership</li><li>Ensure governance, compliance, and ongoing improvement of standards</li><li>Mentor and coach architecture & development teams</li></ul><p><strong>WHAT YOU NEED:</strong></p><ul><li>5+ years designing enterprise software architecture (cloud, APIs, web)</li><li>3+ years with Azure cloud platforms, including hybrid environments</li><li>Track record with API-first design, microservices, event-driven architecture, SOA</li><li>Hands-on Azure Databricks, Data Factory, Delta Lake, Synapse Analytics, Big Data</li><li>Skilled in React, Node.js, Python, Scala, SQL, or similar</li><li>AI development experience a plus</li><li>Proven ability to connect technology with the business</li></ul><p><strong>WHY JOIN?</strong></p><ul><li>Strategic, high-impact role influencing global digital strategy</li><li>Collaborative & innovative teams</li><li>Hybrid flexibility & technical growth into digital, data, and cloud domains. <strong>For immediate and confidential consideration on this Solution Architect – Digital Transformation role, connect directly with Carrie Danger, SVP Permanent Placement Team (Iowa Region): 515-259-6087 (office), 515-991-0863 (cell), or via email (on LinkedIn). One-click apply is also available on the Robert Half site. Your information will never be shared without your direct consent.</strong></li></ul><p><br></p>
We are looking for an experienced Data/Information Architect to join our team in Houston, Texas. In this Contract to permanent employment position, you will play a pivotal role in designing, organizing, and optimizing data systems from the ground up to support business objectives. This role requires hands-on expertise in modern data technologies and methodologies to ensure the delivery of high-quality, scalable solutions.<br><br>Responsibilities:<br>• Develop and implement strategies for cleaning, organizing, and optimizing data across the organization.<br>• Design and build data platforms and architectures to meet business requirements and support analytics.<br>• Analyze existing data systems and recommend improvements for efficiency and scalability.<br>• Utilize AI tools and techniques to enhance data processes and commodity training.<br>• Collaborate with stakeholders to assess company needs and define data-related approaches.<br>• Work with cloud technologies and AWS to establish robust data solutions.<br>• Apply advanced knowledge of Spark to manage and process large-scale data efficiently.<br>• Partner with internal teams to ensure alignment with Agile Scrum methodologies.<br>• Create comprehensive documentation, including Business Requirement Documents, to guide data initiatives.<br>• Ensure data governance and security through the use of MDM technologies and best practices.
QUALIFICATIONS<br><br>These specifications are general guidelines based on the minimum experience normally considered essential to satisfactory performance of this position. The requirements listed below are representative of the knowledge, skill, and/or ability required to perform the position in a satisfactory manner. Individual abilities may result in some deviation from these guidelines.<br><br>• A self-starting team player who possesses a bachelor’s degree in Information Technology, Engineering Technology, Computer Information Technology, or a related field<br>• Minimum of 10 years of technical experience directing data analytics teams (Enterprise Data Architecture, Data Analytics, Data Governance, ALM Teams, Release Management, Support Functions) within a hybrid cloud environment<br>• Minimum of 5 years of banking experience<br>• Background in large-scale migrations, hybrid cloud, and application modernization projects<br>• Experience with regulated or complex environments<br>• Familiarity with ITIL, Agile, and modern delivery frameworks<br>• Proven experience delivering enterprise Azure solutions in an IT- and business-facing capacity<br>• Strong understanding of core Azure services and architecture patterns across compute, networking, identity, storage, containers, and automation<br>• Knowledgeable in infrastructure as code (ARM, Bicep, Terraform), DevOps pipelines, and CI/CD practices<br>• Strong understanding of Gen AI principles (Microsoft Copilot), large language models (LLMs), and their applications in analytics<br>• Strong knowledge of security, governance, identity (Entra ID), and compliance considerations for Azure<br>• Microsoft Cloud certifications<br>• F
<p>Robert Half is looking for a DBA to join our client's team in Murrieta.</p><p><br></p><p>SQL Server & ERP Administration</p><p>• Administer, monitor, and maintain multiple Microsoft SQL Server instances, with the Deltek Vantagepoint ERP database as the primary system of record.</p><p>• Write, optimize, and tune complex T-SQL queries, stored procedures, and views for both ad hoc analysis and application logic.</p><p>• Create and schedule SQL Server Agent jobs to automate data ingestion, maintenance routines, and reporting pipelines.</p><p>• Implement, document, and test backup and disaster recovery strategies to ensure business continuity.</p><p>• Manage database indexes, performance tuning, and query optimization to support application efficiency and stability.</p><p>• Plan and execute Deltek Vantagepoint upgrades, schema updates, and integrations, working closely with the ERP administrator and IT team.</p><p>• Design and publish SSRS reports and datasets to meet business and finance reporting requirements.</p><p><br></p><p>Cross-System Data Architecture</p><p>• Support integration between SQL Server and the Azure Databricks Lakehouse, ensuring consistent and secure data flow.</p><p>• Collaborate with data engineers and analysts to maintain data integrations to other critical systems.</p><p>• Develop and maintain SQL transformations and stored procedures that serve as upstream logic for analytics and reporting layers.</p><p>• Contribute to the management of Unity Catalog, metadata inventory, and data lineage documentation.</p><p>• Partner with Power BI administrators to optimize Fabric dataset refreshes, gateways, and source connections.</p><p><br></p><p>Data Governance, Security, & Monitoring</p><p>• Enforce data security and access controls aligned with IT and Data Governance policies.</p><p>• Participate in data issue management and quality improvement processes, ensuring system reliability and integrity.</p><p>• Monitor system performance using both native and custom monitoring tools; proactively identify and resolve issues.</p><p>• Maintain clear, comprehensive documentation for database configurations, schemas, and operational procedures.</p><p><br></p>
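<p>For illustration only (not part of the client's posting): a minimal Python sketch of the kind of index-maintenance check this DBA role describes, using pyodbc against SQL Server's index-fragmentation DMV. The server, database, and 30% threshold are hypothetical placeholders.</p>
<pre><code>import pyodbc  # assumes a SQL Server ODBC driver is installed

# Hypothetical connection string; substitute the real server and database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql01;DATABASE=Vantagepoint;Trusted_Connection=yes;"
)

# Flag indexes whose fragmentation suggests a rebuild (30% is a common
# rule of thumb, not a Deltek-specific requirement).
FRAG_QUERY = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30 AND i.name IS NOT NULL
ORDER BY ips.avg_fragmentation_in_percent DESC;
"""

cursor = conn.cursor()
for table_name, index_name, frag in cursor.execute(FRAG_QUERY):
    # Emit rebuild statements for review rather than executing them blindly.
    print(f"ALTER INDEX [{index_name}] ON [{table_name}] REBUILD;  -- {frag:.1f}%")
</code></pre>
<p>A SQL Server Agent job of the sort the posting mentions could run a script like this nightly and route the output into the reporting pipeline.</p>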
<p>The Business Architect will operate at a systems and enterprise level, owning end-to-end business and application workflows across multiple teams. This role focuses on defining how business processes, data, and systems interact, ensuring alignment between product strategy and technical delivery. The Business Architect will partner closely with product, engineering, and leadership to identify dependencies, integration points, and impacts across platforms while disseminating architectural guidance to multiple delivery teams.</p>
We are looking for a skilled Data Engineer to join our team in Los Angeles, California. This role focuses on designing and implementing advanced data solutions to support innovative advertising technologies. The ideal candidate will have hands-on experience with large datasets, cloud platforms, and machine learning, and will play a critical role in shaping our data infrastructure.<br><br>Responsibilities:<br>• Develop and maintain robust data pipelines to ensure seamless data extraction, transformation, and loading processes.<br>• Design scalable architectures that support machine learning models and advanced analytics.<br>• Collaborate with cross-functional teams to deliver business intelligence tools, reporting solutions, and analytical dashboards.<br>• Implement real-time data streaming solutions using platforms like Apache Kafka and Apache Spark.<br>• Optimize database performance and ensure efficient data storage and retrieval.<br>• Build and manage resilient data science programs and personas to support AI initiatives.<br>• Lead and mentor a team of data scientists, machine learning engineers, and data architects.<br>• Design and implement strategies for maintaining large datasets, ensuring data integrity and accessibility.<br>• Create detailed technical documentation for workflows, processes, and system architecture.<br>• Stay up-to-date with emerging technologies to continuously improve data engineering practices.
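<p>As a hedged illustration of the real-time streaming bullet in the posting above: a minimal PySpark Structured Streaming job that reads from Kafka and lands parsed events as Parquet. The broker, topic, schema, and paths are invented for the example, and the job assumes the Spark-Kafka connector package is on the classpath.</p>
<pre><code>from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("ad-impressions-stream").getOrCreate()

# Hypothetical schema for an ad-impression event stream.
schema = StructType([
    StructField("impression_id", StringType()),
    StructField("campaign_id", StringType()),
    StructField("ts", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
    .option("subscribe", "ad-impressions")              # placeholder topic
    .load()
)

# Kafka delivers bytes; decode and parse the JSON payload into columns.
events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

# Checkpointing gives restart safety; the file sink is effectively exactly-once.
query = (
    events.writeStream.format("parquet")
    .option("path", "s3a://example-bucket/impressions/")       # placeholder path
    .option("checkpointLocation", "s3a://example-bucket/chk/")
    .outputMode("append")
    .start()
)
query.awaitTermination()
</code></pre>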
<p>We are seeking a highly skilled Senior Data Engineer with a passion for AI/ML to design, build, and scale next-generation data and machine learning infrastructure. This role is ideal for a hands-on technical expert who thrives in building complex systems from the ground up, has deep experience in Google Cloud Platform (GCP), and is excited about stepping into management and technical leadership. You will work across engineering, data science, and executive leadership teams to architect cloud-native solutions, optimize real-time data pipelines, and help shape our long-term AI/ML engineering strategy.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Cloud & Platform Engineering</strong></p><ul><li>Architect, build, and maintain high-performance data and ML infrastructure on GCP using best-in-class cloud-native tools and services.</li><li>Lead the design of scalable cloud architectures, with a strong focus on resilience, automation, and cost-effective operation.</li><li>Build applications and services from scratch, ensuring they are modular, maintainable, and scalable.</li></ul><p><strong>Real-Time & Distributed Systems</strong></p><ul><li>Design and optimize real-time data processing pipelines capable of handling high-volume, low-latency traffic.</li><li>Implement and fine-tune load balancing strategies to support fault tolerance and performance across distributed systems.</li><li>Lead system design for high availability, horizontal scaling, and microservices communication patterns.</li></ul><p><strong>AI/ML Engineering</strong></p><ul><li>Partner with ML engineers and data scientists to deploy, monitor, and scale machine learning workflows.</li><li>Create and maintain ML-focused CI/CD pipelines, model deployment frameworks, and automated testing harnesses.</li></ul><p><strong>Open-Source & Code Quality</strong></p><ul><li>Contribute to and maintain open-source projects, including active GitHub repositories.</li><li>Champion best practices across code reviews, version control, and documentation.</li><li>Establish, document, and enforce advanced testing methodologies, including integration, regression, performance, and automated testing frameworks.</li></ul><p><strong>Leadership & Collaboration</strong></p><ul><li>Serve as a technical leader and mentor within the engineering team.</li><li>Collaborate effectively with senior leadership and executive stakeholders, translating complex engineering concepts into strategic insights.</li><li>Provide guidance and direction to junior engineers, with an eye toward growing into a people leadership role.</li></ul>
<p><strong>Our client, a leading oil and gas company, is embarking on a key cloud modernization initiative and seeks a Data Engineer experienced with Databricks in AWS environments. This is an opportunity to shape the data landscape for an established energy leader.</strong></p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Lead the implementation and integration of Databricks within AWS-based infrastructure</li><li>Design, build, and optimize scalable data pipelines for large, multi-source datasets</li><li>Collaborate with data architects, business analysts, and project managers to deliver analytics solutions aligned with business goals</li><li>Develop and maintain ETL processes, ensuring data quality and reliability across systems</li><li>Automate data workflows and enable advanced analytics for operations, production, and business intelligence</li><li>Support data governance best practices, security, and compliance within a highly regulated sector</li><li>Document solutions and provide guidance on infrastructure and platform best practices</li></ul><p><br></p>
We are looking for an experienced individual to lead our Data Management and Client Reporting initiatives. This role requires a strategic thinker with a deep understanding of investment operations and expertise in Addepar. The successful candidate will oversee the integrity and evolution of reporting systems, manage a team member, and align data strategies with organizational objectives.<br><br>Responsibilities:<br>• Develop and implement firm-wide strategies for the Addepar platform to optimize data architecture, reporting capabilities, and system performance.<br>• Establish and maintain data governance standards to ensure accuracy, consistency, and compliance across client portfolios.<br>• Design and deliver tailored client reporting solutions in collaboration with advisory teams, ensuring alignment with regulatory and firm standards.<br>• Manage and mentor the Client Reporting Analyst, fostering growth and a culture of continuous improvement.<br>• Oversee the onboarding process for new clients and entities, ensuring seamless integration into Addepar and related systems.<br>• Collaborate with cross-functional teams, including Investment Operations, Technology, Compliance, and Advisors, to support firm-wide objectives.<br>• Identify opportunities to improve reporting processes, enhance efficiency, and drive innovation.<br>• Serve as the primary escalation point for all Addepar-related initiatives and provide expertise to resolve complex issues.
<p><strong>Role Summary</strong></p><p>As a Technical Project Manager focused on data and AWS cloud, you will lead the planning, execution, and delivery of engineering efforts involving data infrastructure, data platforms, analytics, and cloud services. You will partner with data engineering, analytics, DevOps, product, security, and business stakeholders to deliver on key strategic initiatives. You are comfortable navigating ambiguity, managing dependencies across teams, and ensuring alignment between technical direction and business priorities.</p><p><strong>Key Responsibilities</strong></p><ul><li>Lead end-to-end technical projects pertaining to AWS cloud, data platforms, data pipelines, ETL/ELT, analytics, and reporting.</li><li>Define project scope, objectives, success criteria, deliverables, and timelines in collaboration with stakeholders.</li><li>Create and maintain detailed project plans, roadmaps, dependency maps, risk & mitigation plans, status reports, and communication plans.</li><li>Track and monitor project progress, managing changes to scope, schedule, and resources.</li><li>Facilitate agile ceremonies (e.g., sprint planning, standups, retrospectives) or hybrid methodologies as appropriate.</li><li>Serve as the bridge between technical teams (data engineering, DevOps, platform, security) and business stakeholders (product, analytics, operations).</li><li>Identify technical and organizational risks, escalate when needed, propose mitigation or contingency plans.</li><li>Drive architectural and design discussions, ensure technical feasibility, tradeoff assessments, and alignment with cloud best practices.</li><li>Oversee vendor, third-party, or external partner integrations and workstreams.</li><li>Ensure compliance, security, governance, and operational readiness (e.g., data privacy, logging, monitoring, SLA) are baked into deliverables.</li><li>Conduct post-implementation reviews, lessons learned, and process improvements.</li><li>Present regularly to senior leadership on project status, challenges, KPIs, and outcomes.</li></ul>
<p><strong>Systems Analyst / Architect – Intermediate</strong></p><p>26‑Week Contract | Onsite | Quincy, MA</p><p><br></p><p><strong>Overview</strong></p><p>We are seeking an Intermediate Systems Analyst/Architect for a 26‑week onsite engagement in Quincy, MA. The ideal candidate will support workflow optimization, software deployment governance, data analysis, and issue resolution across multiple teams. This role involves significant interaction with end users and requires strong communication, analytical ability, and technical fluency.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Become proficient in various proprietary data sources and integrate their use to track initiatives and the context surrounding them.</li><li>Guide end users through workflow processes to reduce friction in software development, deployment, and governance activities.</li><li>Generate, customize, and deliver reports or logs related to initiative metadata and historical activity for stakeholder review.</li><li>Analyze the root cause of system defects, workflow gaps, or technical issues.</li><li>Recommend mitigation strategies and improvements to streamline workflows and resolve recurring pain points.</li><li>Draft and communicate proposals to team members and development groups for enhancements, requirements, or fixes to improve system performance and address defects.</li></ul><p><br></p>
<p>IoT Edge AI Data Engineer</p><p>Job Posting: IoT Edge Data Engineer – Hi-TECH AI SaaS Company - Direct Hire FTE</p><p>LOCATION: Pittsburgh, PA – HYBRID IN PITTSBURGH, PA</p><p>WORK AUTHORIZATION: F1 OR OPT CONSIDERED for the right candidate</p><p>***** APPLY TODAY IF YOU HAVE EDGE-DEVICE IoT DATA ENGINEERING EXPERIENCE. INTERVIEWS NOW....OFFER IN JANUARY!</p><p>Are you passionate about powering real-time AI and analytics at the intersection of devices, data, and the cloud? Our fast-growing, innovative SaaS industry disruptor is seeking an experienced IoT Edge AI Data Engineer to architect and build the next generation of edge-to-cloud data pipelines supporting sensor-driven platforms. You'll work with cutting-edge hardware, including NVIDIA Jetson, and develop solutions that transform raw sensor data into powerful, ML-ready streams and data pipelines.</p><p>YOU MUST HAVE: data cleansing, data ingestion, and data engineering experience with disparate, complex data sources and with getting data OFF OF edge devices! This is NOT a typical data science / data engineering role; you must have the edge IoT device experience.</p><p>**** To be considered for this pioneering team, reach out to Carrie Danger, Robert Half Technology SVP Permanent Placement, and apply to this position. ****</p><p>KEYS:</p><p>Design & implement high-volume data ingestion pipelines for IoT systems, from device through on-prem infrastructure to cloud platforms.</p><p>Develop edge computing solutions leveraging platforms like NVIDIA Jetson for near-real-time data processing, orchestration, & distribution.</p><p>Build robust frameworks to clean, validate, and prepare sensor, telemetry, and machine data for downstream ML engineering teams.</p><p>Architect and engineer infrastructure for reliable edge-to-cloud connectivity and data workflows.</p><p>Looking for experience using technologies such as MQTT, Kinesis, and related telemetry protocols.</p><p>You MUST HAVE on-prem hardware device experience: optimize on-prem and distributed data flows for scalable, resilient performance.</p><p>YOU NEED:</p><p>3–4+ years of data engineering experience, which MUST include designing and deploying device-to-cloud pipelines for IoT/edge environments.</p><p>Hands-on experience with edge hardware (e.g., NVIDIA Jetson) and low-level data engineering.</p><p>Proficiency in cloud data services, streaming, and orchestration (AWS, Azure, GCP, Kinesis, MQTT, Kafka, etc.).</p><p>Data cleansing, data ingestion, and data engineering experience getting data OFF OF edge devices!</p><p>MUST-HAVE SKILLS: MQTT, Telemetry, Sensor Data, Data Pipeline, Edge Computing, On-Prem, Device to Cloud, Data Orchestration</p><p>Strong software engineering skills: Python, C++.</p><p>Deep knowledge of sensor/telemetry data and proven capability in building scalable data infrastructure for ML/AI use cases.</p><p>Experience with on-premises distributed systems & cloud integration.</p><p>Prior work with edge AI or industrial IoT, developing robust real-time data pipelines.</p><p>Hardware/software integration experience.</p><p>Direct Hire FTE position up to $135K, plus potential other incentives. For immediate & confidential consideration on this IoT Edge AI Data Engineer role, hybrid in the Pittsburgh, PA or Des Moines, Iowa market, please call Carrie Danger, SVP, at 515-259-6087, or apply via LinkedIn.</p>
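<p>To make the "getting data OFF OF edge devices" requirement concrete, here is a minimal sketch, assuming an MQTT broker on the device LAN and a JSON temperature payload (all names, topics, and ranges are hypothetical; written against the paho-mqtt 1.x callback API):</p>
<pre><code>import json

import paho.mqtt.client as mqtt  # paho-mqtt 1.x callback style

clean_buffer = []  # readings validated at the edge before batch upload to the cloud

def on_message(client, userdata, msg):
    """Validate a raw sensor reading before it leaves the edge device."""
    try:
        reading = json.loads(msg.payload)
    except ValueError:
        return  # drop malformed telemetry
    # Hypothetical plausibility check for a temperature sensor.
    if not -40.0 <= reading.get("temp_c", float("nan")) <= 125.0:
        return
    reading["topic"] = msg.topic
    clean_buffer.append(reading)  # a real pipeline would batch-publish from here

client = mqtt.Client()
client.on_message = on_message
client.connect("edge-broker.local", 1883)  # placeholder broker on the Jetson's LAN
client.subscribe("sensors/#")
client.loop_forever()
</code></pre>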
<p>This role is responsible for designing and implementing enterprise integration solutions across cloud and on-prem systems. The Integration Architect owns integration strategy, API design, data flow architecture, and middleware configuration to support scalable, reliable system communication.</p><p>The position partners closely with application, data, and infrastructure teams to translate business and technical requirements into secure, high-availability integration patterns.</p>
<p><strong>Robert Half</strong> is actively partnering with an Austin-based client to identify an <strong>AI/ML Engineer (contract)</strong>. In this role, you’ll architect and deliver robust, scalable platforms that enable machine learning and data science at enterprise scale. <strong>This role is hybrid in Austin, TX.</strong></p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li><strong>Architect & Build:</strong> Design and implement distributed systems and data processing frameworks using modern cloud technologies (GCP, Azure). Ensure solutions are secure, scalable, and optimized for performance.</li><li><strong>Lead Projects:</strong> Own technical initiatives end-to-end—from concept through delivery. Influence product roadmaps and make critical decisions on architecture and trade-offs.</li><li><strong>Collaborate Across Teams:</strong> Participate in planning, design reviews, and code reviews. Anticipate cross-team dependencies and resolve conflicts proactively.</li><li><strong>Mentor & Grow Talent:</strong> Interview candidates, onboard new hires, and provide technical guidance to engineers and interns. Foster a culture of learning and excellence.</li></ul>
<p><strong>Key Responsibilities</strong></p><ul><li>Architect, implement, and optimize LAN/WAN, data center, and cloud/hybrid networking (AWS/Azure/GCP).</li><li>Lead configuration and lifecycle management for Cisco (Catalyst/Nexus), SD‑WAN (Cisco Viptela/Meraki/Fortinet), and wireless (Cisco/Meraki/Aruba).</li><li>Design and enforce network security: firewalls (Cisco ASA/FTD, Palo Alto, Fortinet), VPNs (IPsec/SSL), segmentation (VLAN/VRF), NAC (ISE/ClearPass).</li><li>Own network observability: implement and tune monitoring/alerting (SolarWinds, ThousandEyes, NetFlow, SNMP), packet analysis (Wireshark), and log pipelines (Syslog/SIEM).</li><li>Drive automation & IaC for network operations using Python, Ansible, Git, and templates/Golden Configs; integrate with CI/CD where applicable.</li><li>Ensure governance/compliance (e.g., NIST, ISO 27001, PCI, HIPAA) with documentation, standards, and change control (ITIL).</li><li>Troubleshoot complex, multi-domain incidents; perform root cause analysis and create permanent fixes.</li><li>Mentor junior engineers; contribute to runbooks, design docs, and architecture diagrams.</li><li>Partner with InfoSec, Cloud, and Apps teams on zero‑trust, SASE, ZTA, and secure connectivity patterns.</li></ul><p><br></p>
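<p>For flavor, a stdlib-only Python sketch of the golden-config drift check implied by the automation bullet above. The file paths are hypothetical, and a real workflow would first pull running configs via Ansible or a similar tool.</p>
<pre><code>import difflib
from pathlib import Path

def config_drift(golden_path, running_path):
    """Return a unified diff between a device's golden config and its running config."""
    golden = Path(golden_path).read_text().splitlines()
    running = Path(running_path).read_text().splitlines()
    return list(difflib.unified_diff(golden, running,
                                     fromfile="golden", tofile="running", lineterm=""))

# Placeholder paths; a nightly job would loop over every device backup.
drift = config_drift("golden/core-sw1.cfg", "backups/core-sw1.cfg")
if drift:
    print("\n".join(drift))  # in practice, feed this into ticketing or the SIEM
else:
    print("core-sw1: compliant with golden config")
</code></pre>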
<p>Our client is seeking an experienced <strong>Salesforce</strong> <strong>Technical Lead</strong> with strong expertise in <strong>Salesforce development</strong> and <strong>release engineering</strong>. This individual will serve as a key liaison between client's local team and Deloitte’s offshore development team, ensuring seamless coordination of Salesforce releases and efficient data management. The ideal candidate combines hands-on technical skills with leadership capabilities to drive successful implementations and maintain confidence in Salesforce systems.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><p><strong>Technical Leadership & Coordination</strong></p><ul><li>Act as the primary point of contact between local stakeholders and offshore development teams.</li><li>Provide guidance and oversight on Salesforce development activities without direct people management responsibilities.</li><li>Ensure alignment of deliverables, timelines, and quality standards across teams.</li></ul><p><strong>Release Management & Engineering</strong></p><ul><li>Own and manage Salesforce release processes, including planning, scheduling, and execution.</li><li>Implement best practices for release engineering to minimize downtime and ensure smooth deployments.</li><li>Collaborate with offshore teams to avoid diverting development resources for release tasks.</li></ul><p><strong>Salesforce Development</strong></p><ul><li>Perform hands-on Salesforce development as needed, including customizations, integrations, and troubleshooting.</li><li>Maintain high confidence and competence in Salesforce technical capabilities.</li></ul><p><strong>Data Management</strong></p><ul><li>Extract, transform, and load (ETL) data within Salesforce environments.</li><li>Utilize tools such as Workbench or equivalent for data operations, ensuring accuracy and integrity.</li></ul><p><strong>DevOps & Process Improvement</strong></p><ul><li>Understand DevOps principles and practices relevant to Salesforce environments.</li><li>Gather requirements and lead implementation of DevOps-related initiatives when necessary.</li></ul><p><br></p>
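<p>A hedged sketch of the Workbench-style ETL task this posting describes, using the third-party simple_salesforce library; the credentials, object, filter, and cleanup rule are all invented for illustration:</p>
<pre><code>from simple_salesforce import Salesforce

# Placeholder credentials; real projects pull these from a secrets manager.
sf = Salesforce(username="user@example.com", password="***", security_token="***")

# Extract: SOQL pulls the records needing cleanup.
records = sf.query_all(
    "SELECT Id, Name FROM Account WHERE BillingCountry = null"
)["records"]

# Transform: hypothetical rule backfilling a default country.
updates = [{"Id": r["Id"], "BillingCountry": "US"} for r in records]

# Load: Bulk API update, mirroring what Workbench would do interactively.
if updates:
    results = sf.bulk.Account.update(updates)
    failed = [r for r in results if not r["success"]]
    print(f"{len(updates) - len(failed)} updated, {len(failed)} failed")
</code></pre>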
We are seeking a highly skilled Lead-to-Order Architect to support a major organizational transformation toward a workstream-oriented model. This role focuses on the Lead-to-Order (LTO) value stream, covering processes from lead generation through quoting and estimating (not Quote-to-Cash). The position is part of IT and will play a critical role in designing and implementing integrated solutions across multiple plants.<br>Key Responsibilities:<br><br>Develop and lead the forward-looking strategy and roadmap for the Lead-to-Order value stream, ensuring alignment with business goals and scalability.<br>Lead technical implementation of LTO platforms (CRM, Estimating, Quoting, Pricing).<br>Oversee architecture, design, development, implementation, and support of business-critical applications.<br>Partner with department leaders to understand business needs and align application solutions accordingly.<br>Prepare functional and technical specifications, configure software packages, and ensure successful integrations.<br>Collaborate with IT teams for alignment on security, data, project management, infrastructure, and operations.<br>Drive project milestones and serve as the go-to technical SME for the APEX program.<br>Mentor team members and guide them through change management initiatives.<br><br>Day-to-Day Activities:<br><br>Support implementation of Paperless Parts (estimating tool) across five plants.<br>Work with UiPath (automation) and SnapLogic (iPaaS) for integrations.<br>Partner with MDM resources to ensure proper master data management.<br>Provide technical oversight and guidance to project managers and business analysts.<br>Help define next steps in the process and ensure alignment with organizational goals.<br><br>Required Skills & Experience:<br><br>Strong understanding of API design, integration platforms, automation tools, and algorithms.<br>Familiarity with master data management and data governance concepts.<br>Proficiency in creating solution diagrams and architectural documentation.<br>Excellent analytical, conceptual thinking, and strategic planning skills.<br>Strong communication and facilitation skills in a collaborative environment.<br>10+ years of experience in IT and enterprise applications.<br>5+ years in system design, architecture development, system integration, or technical leadership roles.<br>Manufacturing industry experience highly preferred.<br>Experience with Lead-to-Order, Quote-to-Cash, or Order Management disciplines.<br>Exposure to ERP systems (SAP RTR/MMPP/P2P modules or Oracle Fusion Manufacturing) is ideal.<br><br>Education:<br>Bachelor’s degree in Computer Science, Information Systems, Business Administration, or related field.<br>Worksite: Remote (with potential for hybrid collaboration as needed)<br>Contract Length: 6+ months
<p><strong>Key Responsibilities</strong></p><ul><li>Design, implement, and own ETL/ELT pipelines—both batch and real‑time—using best-in-class tools and frameworks.</li><li>Architect and maintain scalable data platforms: data lakes, warehouses, and modern lakehouses.</li><li>Drive adoption of event-driven systems, ensuring schema evolution, idempotency, and end-to-end pipeline reliability.</li><li>Leverage AI-powered tooling and automation to optimize operations and reduce manual overhead.</li><li>Implement robust DataOps practices, including CI/CD, version control (Git), testing, and observability.</li><li>Collaborate with platform and AI teams to build AI‑ready, trustworthy data systems with lineage, governance, and cost-awareness.</li><li>Monitor and optimize performance and cost across cloud services.</li></ul><p><br></p>
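<p>To ground the idempotency bullet above: a minimal, stdlib-only Python sketch of a replay-safe load step. Keying on a natural event ID means re-running the same batch cannot create duplicates; the table and field names are invented for illustration.</p>
<pre><code>import sqlite3  # stands in for the warehouse; the pattern is the point

def load_events(conn, events):
    """Idempotent load: replaying the same batch leaves the table unchanged."""
    conn.executemany(
        """INSERT INTO events (event_id, ts, payload)
           VALUES (:event_id, :ts, :payload)
           ON CONFLICT(event_id) DO UPDATE SET ts = excluded.ts, payload = excluded.payload""",
        events,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id TEXT PRIMARY KEY, ts TEXT, payload TEXT)")
batch = [{"event_id": "e1", "ts": "2024-01-01T00:00:00Z", "payload": "{}"}]
load_events(conn, batch)
load_events(conn, batch)  # replay is a no-op upsert, not a duplicate row
print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # -> 1
</code></pre>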
<p>This role provides hands-on technical leadership for the Salesforce platform, owning architecture, custom development, and integration strategy across Sales Cloud, Service Cloud, Experience Cloud, and custom applications. The individual will translate complex business requirements into scalable, secure solutions while enforcing architectural standards and development best practices.</p><p>The position blends deep development work with strategic platform ownership, including data modeling, API design, CI/CD implementation, and performance optimization. This role partners closely with developers, administrators, and business stakeholders to ensure the platform supports long-term growth and enterprise scalability.</p>
Role Description: We're hiring a Data Engineering Manager to lead our analytics engineering team and own the strategy, architecture, and delivery of our data platform. You'll manage two analysts, partner with software engineering on ETL/integration work, and make the technical decisions that shape how the organization consumes data. Your priorities are speed, security, and stability—in that order when they conflict. This role requires initiative and ownership. You'll operate with significant autonomy—identifying problems, developing solutions, and driving them to completion without waiting for direction. We need someone who stays close to the work and leads by example.<br>Roles & Responsibilities:<br>• Manage and develop a team of two data analysts focused on reporting, dashboards, and data modeling<br>• Own schema design, data modeling decisions, and warehouse architecture<br>• Optimize query performance, storage costs, and pipeline reliability<br>• Partner with engineering teams on data integration, ETL processes, and source system changes<br>• Establish and enforce standards for data quality, access controls, and documentation<br>• Evaluate and implement tooling decisions across the data stack<br>• Translate business requirements into scalable, maintainable data solutions<br>• Integrate multiple systems with Salesforce Sales Cloud, including internal proprietary and third-party applications and APIs<br>Qualifications:<br>• Bachelor’s Degree in Computer Science and Engineering or equivalent<br>• 5+ years of experience in data engineering, analytics engineering, or a related technical role<br>• 2+ years of experience managing or technically leading a team<br>• Deep expertise with a cloud data warehouse: Azure Data Warehouse or Snowflake<br>• Advanced SQL skills and a solid understanding of data modeling patterns (star schema, normalization tradeoffs, slowly changing dimensions)<br>• Hands-on experience with at least one BI platform (Power BI, Tableau, or Looker)<br>• Experience with Salesforce data structures and reporting (SOQL, custom objects, record relationships)<br>• Familiarity with ETL/ELT orchestration tools (Airflow, Azure Data Factory, Airbyte, etc.)<br>• Background in data governance, PII handling, or compliance requirements<br>• Ability to understand and document business logic, not just technical implementation<br>• Clear communication skills—you'll interface with both engineers and business stakeholders
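<p>Since the qualifications above call out slowly changing dimensions, here is a small, framework-free Python sketch of SCD Type 2 logic; the key and tracked columns are hypothetical, and a production version would typically be a warehouse MERGE instead:</p>
<pre><code>from datetime import date

def scd2_apply(dim_rows, incoming, key="customer_id", tracked=("segment", "region")):
    """Type 2 SCD: close the current version of a row and append a new one
    whenever a tracked attribute changes; unchanged rows are left alone."""
    today = date.today().isoformat()
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur is None:  # brand-new key: first version
            dim_rows.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        elif any(cur[c] != rec[c] for c in tracked):  # changed: expire and re-version
            cur["valid_to"], cur["is_current"] = today, False
            dim_rows.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
    return dim_rows

dim = scd2_apply([], [{"customer_id": 1, "segment": "SMB", "region": "East"}])
dim = scd2_apply(dim, [{"customer_id": 1, "segment": "Enterprise", "region": "East"}])
print(len(dim))  # 2: the expired SMB version plus the current Enterprise version
</code></pre>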
<p>Quick-moving contract-to-permanent opening with a hybrid schedule (in-office plus remote). The expectation is to be in our client's Tampa office 2 to 3 days per week, in part to mentor a junior data engineer.</p><p><br></p><p>If applying, please make sure you have at least 5 years of experience in Power BI, ETL development, Snowflake, and Azure. Sigma reporting experience is a huge plus, as our client is moving its reporting initiatives in that direction. A healthcare background is also strongly preferred, to better understand our client's technical workflows.</p><p><br></p><p>We are seeking an experienced Senior Data Engineer with 5+ years of hands-on experience to join our dynamic Data Engineering team. In this role, you will design, build, and optimize scalable data pipelines and analytics solutions in a fast-paced healthcare environment. You will play a pivotal role in enabling real-time insights for healthcare stakeholders, ensuring data integrity, compliance with HIPAA and other regulations, and seamless integration across multi-cloud ecosystems.</p><p><br></p><p>Key Responsibilities</p><p><br></p><p>Architect and implement end-to-end ETL/ELT pipelines using Azure Data Factory, Snowflake, and other tools to ingest, transform, and load healthcare data (e.g., EHR, claims, patient demographics) from diverse sources.</p><p>Design and maintain scalable data warehouses in Snowflake, optimizing for performance, cost, and healthcare-specific querying needs.</p><p>Develop interactive dashboards and reports in Power BI to visualize key healthcare metrics, such as patient outcomes, readmission rates, and resource utilization.</p><p>Collaborate with cross-functional teams (data scientists, analysts, clinicians) to translate business requirements into robust data solutions compliant with HIPAA, GDPR, and HITRUST standards.</p><p>Lead data modeling efforts, including dimensional modeling for healthcare datasets, ensuring data quality, governance, and lineage.</p><p>Integrate Azure services (e.g., Synapse Analytics, Databricks, Blob Storage) to build secure, high-availability data platforms.</p><p>Mentor junior engineers, conduct code reviews, and drive best practices in CI/CD pipelines for data engineering workflows.</p><p>Troubleshoot and optimize data pipelines for performance in high-volume healthcare environments (e.g., processing millions of claims daily).</p><p>Stay ahead of industry trends in healthcare data analytics and contribute to strategic initiatives like AI/ML integration for predictive care models.</p><p><br></p>
<p>We are looking for an experienced ERP Director to lead strategic and technical initiatives across enterprise applications, data platforms, and custom development. This role requires a visionary leader who can bridge technology and business needs to deliver secure, scalable, and insight-driven solutions. </p><p><br></p><p>Responsibilities:</p><p>• Develop and implement a comprehensive strategy for enterprise applications, custom software, and data platforms.</p><p>• Establish and enforce standards for application architecture, data pipelines, APIs, and system integrations.</p><p>• Collaborate with executive leadership to advise on application modernization and data-driven initiatives.</p><p>• Manage the lifecycle, performance, and enhancements of enterprise systems supporting key business functions such as finance, HR, and operations.</p><p>• Drive system upgrades, implementations, and optimizations for ERP platforms.</p><p>• Ensure seamless integration between ERP systems, custom applications, and data analytics platforms.</p><p>• Promote governance, scalability, and standardization across the application portfolio.</p><p>• Oversee the alignment of data and analytics strategies with business objectives.</p><p>• Lead teams in deploying enterprise applications and managing data platforms.</p><p>• Continuously evaluate emerging technologies to improve system performance and functionality.</p>