<p>We are looking for a skilled <strong>Data Engineer</strong> to design and build robust data solutions that align with business objectives. In this role, you will collaborate with cross-functional teams to develop and maintain scalable data architectures, pipelines, and models. Your expertise will ensure the quality, security, and compliance of data systems while contributing to the organization’s data-driven decision-making processes. Call 319-362-8606, or email your resume directly to Shania Lewis - Technology Recruiting Manager at Robert Half (email information is on LinkedIn). Let's talk!!</p><p><br></p><p><strong>Responsibilities:</strong></p><ul><li>Design and implement scalable data architectures, pipelines, and models.</li><li>Translate business requirements into practical data solutions.</li><li>Ensure data quality, security, and regulatory compliance.</li><li>Maintain and improve existing data infrastructure.</li><li>Optimize system performance for efficiency and reliability.</li><li>Research and recommend emerging data technologies.</li><li>Mentor team members and foster collaboration.</li><li>Enable effective analytics through robust data solutions.</li></ul>
We are looking for a skilled Engineer to develop and enhance software solutions that address complex challenges in the real estate and property industry. This long-term contract position involves designing, coding, testing, and maintaining scalable and secure software systems. Based in Minneapolis, Minnesota, this role offers an opportunity to contribute to impactful engineering projects while collaborating with cross-functional teams.<br><br>Responsibilities:<br>• Design and implement software solutions that align with customer needs and organizational goals.<br>• Develop, test, debug, and document code to ensure reliability and performance.<br>• Collaborate with team members to solve technical challenges and remove roadblocks.<br>• Apply knowledge of frameworks and systems design to create stable and scalable software.<br>• Participate in product planning and provide input on technical strategies and solutions.<br>• Troubleshoot and analyze complex issues to identify and resolve defects.<br>• Mentor developers who are early in their careers and provide technical guidance to the team.<br>• Explore and adopt new technologies to enhance product performance and lifecycle.<br>• Contribute to DevOps processes, including support rotations and subsystem knowledge-building.<br>• Assist in recruiting efforts by participating in interviews and evaluating potential team members.
<p><strong>Job Title:</strong> Cloud Data Engineer</p><p><strong>Location:</strong> Remote (occasional travel to the Washington D.C. metro area may be required)</p><p><strong>Clearance Required:</strong> Public Trust</p><p><strong>Position Overview</strong></p><p>We are seeking a customer-focused <strong>Cloud Data Engineer</strong> to join a dynamic team of subject matter experts and developers. This role involves designing and implementing full lifecycle data pipeline services for Azure-based data lake, SQL, and NoSQL data stores. The ideal candidate will be mission-driven, delivery-oriented, and skilled in translating business requirements into scalable data engineering solutions.</p><p><strong>Key Responsibilities</strong></p><ul><li>Maintain and operate legacy ETL processes using Microsoft SSIS, PowerShell, SQL procedures, SSAS, and .NET.</li><li>Develop and manage full lifecycle Azure cloud-native data pipelines.</li><li>Collaborate with stakeholders to understand data requirements and deliver effective solutions.</li><li>Design and implement data models and pipelines for various data architectures including relational, dimensional, lakehouse (medallion), warehouse, and mart.</li><li>Utilize Azure services such as Data Factory, Synapse Pipelines, Apache Spark Notebooks, Python, and SQL.</li><li>Migrate existing SSIS ETL scripts to Azure Data Factory and Synapse Pipelines.</li><li>Prepare data for advanced analytics, visualization, reporting, and AI/ML applications.</li><li>Ensure data integrity, quality, metadata management, and security across pipelines.</li><li>Monitor and troubleshoot data issues to maintain performance and availability.</li><li>Implement governance, CI/CD, and monitoring for automated platform operations.</li><li>Participate in Agile DevOps processes and continuous learning initiatives.</li><li>Maintain strict versioning and configuration control.</li></ul>
<p>Opportunity to work for a dynamic company with a global footprint and recognition in the travel & leisure industry. As a Sr. Data Engineer specializing in Databricks and Adobe architecture, you will collaborate across business domains (CRM, Marketing, Digital Marketing, Finance, etc.) and IT teams to architect, implement, and optimize data solutions. Your primary focus will be on designing and maintaining scalable data pipelines and integrations leveraging Databricks, Adobe Experience Platform (AEP), and related Azure cloud technologies.</p><p><br></p><p><strong><em>Location </em></strong>= Hybrid Role that includes 3 days onsite/week in the San Fernando Valley</p><p><strong><em>Type </em></strong>= Full Time Permanent Employment</p><p><br></p>
<p>We are seeking an innovative <strong>AI Engineer</strong> to design, develop, and deploy artificial intelligence solutions that drive business transformation. This role involves working with machine learning models, natural language processing, and automation technologies to solve complex problems and enhance operational efficiency. The ideal candidate is passionate about AI, has strong programming skills, and thrives in a collaborative environment.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design and implement AI and machine learning models for business applications.</li><li>Develop algorithms for predictive analytics, NLP, and computer vision.</li><li>Collaborate with data scientists and engineers to integrate AI solutions into existing systems.</li><li>Optimize models for performance, scalability, and accuracy.</li><li>Conduct research on emerging AI technologies and recommend innovative solutions.</li><li>Ensure compliance with data privacy and security standards.</li><li>Document processes, models, and workflows for future reference.</li></ul><p><br></p>
<p>Our client is seeking a <strong>Lead Engineer / Senior Lead Engineer</strong> to join their technology and systems team. This role sits at the intersection of <strong>technology, operations, and investment strategy</strong>, and will directly influence how the company manages and grows its portfolio.</p><p><br></p><p>Unlike traditional engineering roles, this position is <strong>deeply embedded with business teams</strong>—you’ll work side-by-side with Asset Management, Property Operations, and Construction leaders to design and deploy tools that directly improve operational performance and financial outcomes.</p><p>This is a <strong>hands-on builder role</strong> for someone who thrives on turning complex business needs into real, usable software solutions. You’ll write code, automate workflows, integrate data systems, and build analytics dashboards—all while collaborating closely with executives and operational leaders.</p><p>The company operates a large, vertically integrated portfolio across the Western U.S. and partners with major institutional investors. 
Its culture emphasizes <strong>teamwork, integrity, innovation, and excellence</strong>—valuing individuals who take ownership and drive measurable impact.</p><p><br></p><p><strong>Primary Responsibilities</strong></p><ul><li><strong>Partner with business teams:</strong> Collaborate directly with property management, asset management, and finance teams to identify pain points, inefficiencies, and opportunities for automation.</li><li><strong>Rapid prototyping & deployment:</strong> Design and deliver data tools, dashboards, and automations that drive immediate results.</li><li><strong>Systems integration:</strong> Connect and streamline workflows across platforms such as Yardi, Zendesk, Palantir Foundry, Excel/Sheets, and internal databases.</li><li><strong>Operational analytics:</strong> Develop analytical models to support forecasting, expense tracking, and portfolio performance analysis.</li><li><strong>AI & automation:</strong> Implement AI-based tools to optimize leasing, maintenance, and communication workflows.</li><li><strong>Measure business impact:</strong> Quantify improvements in efficiency, cost reduction, and performance resulting from technical solutions.</li><li><strong>Cross-functional collaboration:</strong> Translate between business and technical teams to ensure that solutions are practical, scalable, and aligned with business goals.</li></ul><p><br></p>
<p>We are seeking a highly skilled Database Engineer to support enterprise data modernization and migration efforts. This role focuses on evaluating, transforming, and managing data systems with a specialization in database migrations, particularly between on-premises and cloud platforms. Ideal candidates will be comfortable working with various data sources, building scalable integration solutions, and ensuring data integrity and performance throughout the process.</p><p>This position also includes a significant focus on designing and executing data migration strategies, documenting architecture, and ensuring compliance with data governance and security protocols.</p><p><br></p><p><strong>Core Responsibilities</strong></p><ul><li>Analyze existing database architectures to scope and plan efficient data migration strategies.</li><li>Develop comprehensive migration plans including risk assessments, resource allocation, and delivery timelines.</li><li>Execute data extraction, transformation, and loading (ETL) processes between systems.</li><li>Apply schema modifications and performance tuning during migrations.</li><li>Maintain data integrity, consistency, and security throughout all phases of migration.</li><li>Conduct thorough testing and validation to ensure migrated systems meet technical and business performance standards.</li><li>Identify and resolve issues during and after migration to ensure continued operational success.</li><li>Collaborate with business, IT, and vendor teams to align migration objectives with broader business goals.</li><li>Maintain clear and complete documentation for migration processes, configuration changes, and architecture.</li><li>Ensure all activities meet industry standards, internal policies, and data protection regulations.</li><li>Provide post-migration support and troubleshoot any arising database issues.</li><li>Analyze performance metrics post-migration and implement enhancements as needed.</li></ul>
<p>The Machine Learning Engineer will be responsible for creating and implementing scalable machine learning models to support our business objectives. You will collaborate with data scientists, engineers, and product teams to build intelligent systems that process large-scale data and provide actionable insights.</p><p> </p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and optimize machine learning models and algorithms for real-world applications.</li><li>Build and maintain scalable data pipelines to preprocess, clean, and analyze large datasets.</li><li>Deploy, monitor, and maintain machine learning models in production environments.</li><li>Collaborate with cross-functional teams to understand business requirements and translate them into ML solutions.</li><li>Conduct experiments and performance evaluations to improve model accuracy, efficiency, and scalability.</li><li>Leverage cloud platforms (AWS, Azure, GCP) and tools for model deployment and MLOps.</li><li>Document processes, methodologies, and results for reproducibility and knowledge sharing.</li><li>Stay up-to-date with the latest advancements in AI/ML technologies and incorporate them into projects as needed.</li></ul><p><br></p>
<p>Quick-moving contract-to-permanent opening with a hybrid in-office and remote schedule. You would be expected in our client's Tampa office 2 to 3 days per week, in part to mentor a junior data engineer.</p><p><br></p><p>If applying, please make sure you have at least 5 years of experience in Power BI, ETL development, Snowflake, and Azure. Sigma reporting experience is a huge plus, as our client is moving its reporting initiatives in that direction. A healthcare background is also strongly preferred in order to understand our client's technical workflows.</p><p><br></p><p>We are seeking an experienced Senior Data Engineer with 5+ years of hands-on experience to join our dynamic Data Engineering team. In this role, you will design, build, and optimize scalable data pipelines and analytics solutions in a fast-paced healthcare environment. You will play a pivotal role in enabling real-time insights for healthcare stakeholders, ensuring data integrity, compliance with HIPAA and other regulations, and seamless integration across multi-cloud ecosystems.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Architect and implement end-to-end ETL/ELT pipelines using Azure Data Factory, Snowflake, and other tools to ingest, transform, and load healthcare data (e.g., EHR, claims, patient demographics) from diverse sources.</li><li>Design and maintain scalable data warehouses in Snowflake, optimizing for performance, cost, and healthcare-specific querying needs.</li><li>Develop interactive dashboards and reports in Power BI to visualize key healthcare metrics, such as patient outcomes, readmission rates, and resource utilization.</li><li>Collaborate with cross-functional teams (data scientists, analysts, clinicians) to translate business requirements into robust data solutions compliant with HIPAA, GDPR, and HITRUST standards.</li><li>Lead data modeling efforts, including dimensional modeling for healthcare datasets, ensuring data quality, governance, and lineage.</li><li>Integrate Azure services (e.g., Synapse Analytics, Databricks, Blob Storage) to build secure, high-availability data platforms.</li><li>Mentor junior engineers, conduct code reviews, and drive best practices in CI/CD pipelines for data engineering workflows.</li><li>Troubleshoot and optimize data pipelines for performance in high-volume healthcare environments (e.g., processing millions of claims daily).</li><li>Stay ahead of industry trends in healthcare data analytics and contribute to strategic initiatives like AI/ML integration for predictive care models.</li></ul><p><br></p>
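As a minimal illustration of the kind of transform step such a healthcare pipeline performs (deduplication and basic validation before load), the sketch below uses Pandas with entirely hypothetical field names; it is not the client's actual schema or tooling.

```python
# Illustrative transform step in a healthcare ETL pipeline:
# deduplicate claim records and drop rows that fail basic validation.
# All field names and values here are made up for the example.
import pandas as pd

claims = pd.DataFrame({
    "claim_id": ["C1", "C2", "C3", "C3"],
    "patient_id": ["P10", None, "P12", "P12"],
    "amount": [250.0, 99.5, -5.0, 180.0],
})

# Deduplicate on claim_id, keeping the most recent (last-seen) row.
claims = claims.drop_duplicates(subset="claim_id", keep="last")

# Basic validation: require a patient_id and a non-negative amount.
valid = claims.dropna(subset=["patient_id"])
valid = valid[valid["amount"] >= 0]
```

In a production pipeline, rejected rows would typically be routed to a quarantine table for data-quality review rather than silently dropped.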
<p>We are looking for a skilled Software Engineer to join our team in Middletown, Ohio. This role offers a unique opportunity to contribute to the development of high-quality software solutions that drive efficiency and innovation within our organization. The successful candidate will focus on creating reliable data pipelines, customizing software systems, and improving business decision-making processes through advanced technology.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement robust data pipelines and automation processes, ensuring accurate data flow across the enterprise.</p><p>• Customize software systems, including targeted screen updates, workflow changes, and system enhancements using C# and .NET technologies.</p><p>• Build and extend functionalities for web services, endpoints, and Generic Inquiries to enhance system performance and usability.</p><p>• Monitor system performance and establish dashboards, alerts, and runbooks to ensure smooth operations and timely issue resolution.</p><p>• Execute cutovers and provide hypercare support, including performance tuning and rapid resolution of defects.</p><p>• Collaborate with team members to identify high-impact use cases for AI and technology improvements.</p><p>• Create proofs of concept and scale successful pilots into maintainable features that enhance business efficiency.</p><p>• Design and maintain data models, ensuring clean reconciliations and effective handling of historical changes in reference data.</p><p>• Utilize modern orchestration and transformation tools to optimize data migration and system integrations.</p><p>• Stay updated on emerging technologies and incorporate AI-assisted coding tools to improve development processes.</p>
We are looking for an experienced Data/ETL Engineer to join our team in Malvern, Pennsylvania. This role involves designing and implementing robust data pipelines to support business needs, while also contributing to AI-driven strategies through data preparation and insights. The ideal candidate will have strong expertise in data engineering, ETL processes, and machine learning models.<br><br>Responsibilities:<br>• Develop scalable ETL processes to extract, transform, and load data from diverse sources into a centralized data warehouse.<br>• Troubleshoot and resolve system and data-related errors to maintain data quality and integrity.<br>• Collaborate with stakeholders to translate business requirements into efficient data solutions.<br>• Clean, transform, and prepare large datasets for machine learning applications, addressing issues such as missing values and inconsistencies.<br>• Apply feature engineering techniques, including data scaling, normalization, and encoding, to optimize dataset usability.<br>• Design and deploy machine learning models, integrating AI solutions into existing systems.<br>• Perform model evaluation, hyperparameter tuning, and cross-validation to ensure optimal performance.<br>• Utilize Python and relevant libraries such as Pandas, NumPy, and TensorFlow to implement data processing workflows.<br>• Work within Agile teams to review user stories, estimate tasks, and contribute to sprint deliverables.<br>• Leverage cloud-based resources, including serverless tools like AWS Lambda and S3, to enhance data engineering processes.
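The feature-engineering steps named above (handling missing values, scaling, normalization, and encoding) can be sketched with the libraries the posting lists; this is an illustrative example with made-up column names, not part of the role's actual codebase.

```python
# Illustrative sketch of common feature-engineering steps using
# Pandas and NumPy. Column names ("age", "income", "region") are
# hypothetical.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "age": [25, 40, np.nan, 58],
    "income": [40_000, 85_000, 62_000, np.nan],
    "region": ["east", "west", "east", "south"],
})

# Handle missing values: impute numeric columns with the median.
for col in ["age", "income"]:
    df[col] = df[col].fillna(df[col].median())

# Min-max normalization of age to the [0, 1] range.
df["age_norm"] = (df["age"] - df["age"].min()) / (df["age"].max() - df["age"].min())

# Z-score scaling of income.
df["income_scaled"] = (df["income"] - df["income"].mean()) / df["income"].std()

# One-hot encoding of the categorical column.
df = pd.get_dummies(df, columns=["region"], prefix="region")
```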
We are looking for a skilled Generative AI Analyst/Prompt Engineer to join our team on a contract basis in New York, New York. In this role, you will leverage your expertise in data science and artificial intelligence to develop and optimize innovative solutions. This position is an excellent opportunity to work in the financial services industry while applying cutting-edge AI technologies to real-world challenges.<br><br>Responsibilities:<br>• Design, implement, and optimize AI-driven models and solutions tailored to business needs.<br>• Develop and maintain scalable data pipelines and workflows using tools like Apache Kafka, Spark, and Hadoop.<br>• Write and debug Python code, utilizing Jupyter Notebooks for effective data analysis and visualization.<br>• Collaborate with cross-functional teams to integrate AI models into existing systems and workflows.<br>• Research and experiment with state-of-the-art generative AI techniques to improve system performance.<br>• Build and manage APIs to facilitate seamless communication between applications and AI models.<br>• Utilize cloud-based technologies, including AWS, to deploy, scale, and monitor AI solutions.<br>• Visualize complex data insights through effective data visualization techniques and tools.<br>• Implement algorithms that solve specific business problems in a robust and efficient manner.
<p>We are looking for a skilled Data Warehouse Engineer to join our team in Malvern, Pennsylvania. This Contract-to-Permanent position offers the opportunity to work with cutting-edge data technologies and contribute to the optimization of data processes. The ideal candidate will have a strong background in Azure and Snowflake, along with experience in data integration and production support. This role is onsite 4 days a week, and this is non-negotiable. Please apply directly if you're interested.</p><p><br></p><p>Responsibilities:</p><p>• Develop, configure, and optimize Snowflake-based data solutions to meet business needs.</p><p>• Utilize Azure Data Factory to design and implement efficient ETL processes.</p><p>• Provide production support by monitoring and managing data workflows and tasks.</p><p>• Extract and analyze existing code from Talend to facilitate system migrations.</p><p>• Stand up and configure data repository processes to ensure seamless performance.</p><p>• Collaborate on the migration from Talend to Azure Data Factory, providing expertise on best practices.</p><p>• Leverage Python scripting to enhance data processing and automation capabilities.</p><p>• Apply critical thinking to solve complex data challenges and support transformation initiatives.</p><p>• Maintain and improve Azure Fabric-based solutions for data warehousing.</p><p>• Work within the context of financial services, ensuring compliance with industry standards.</p>
Position: SENIOR DATA INTEGRATION ENGINEER - Architect the backbone of a Mobile-First Digital Experience<br>Location: REMOTE<br>Salary: UP TO $175K + BONUS + EXCEPTIONAL BENEFITS<br><br>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. ***<br><br>A nationally recognized company with a long-standing legacy of success is launching a bold new digital initiative—and this is your opportunity to help build it from the ground up.<br>With full executive support and the resources of a Fortune 500 parent, this newly formed department is creating a mobile-first product from scratch. It’s a greenfield, 0-to-1 launch with the pace and creativity of a startup, but with the stability and funding already secured. The first MVP is nearing launch, and we’re assembling a team of 20 innovators to bring it to life.<br>We’re looking for a Senior Data Integration Engineer to lead the design and development of a scalable, secure, and high-performing data ecosystem. 
You’ll play a critical role in building a custom Customer Data Platform (CDP), integrating internal and external systems, and enabling real-time personalization, analytics, and enterprise-wide insights.<br>What You’ll Do<br> • Build and maintain scalable data pipelines using Python and cloud-native tools.<br> • Design and implement a robust CDP to unify customer data across platforms.<br> • Develop APIs and data workflows for real-time and batch integrations.<br> • Optimize data environments for performance, security, and reliability.<br> • Collaborate with cross-functional teams to define data requirements and integration strategies.<br> • Implement data governance and privacy practices to ensure compliance.<br> • Leverage CI/CD pipelines and automation to streamline deployment and maintenance.<br> • Document architecture, data flows, and integration patterns for scalability.<br>What You Bring<br> • 10+ years of experience with RESTful APIs, SQL, Python, or PowerShell.<br> • 5+ years working with cloud platforms (Azure, AWS, GCP) and cloud-native tools (ADF, Glue, dbt, Airflow).<br> • Proven experience building or integrating with CDPs (Segment, Salesforce CDP, Adobe Experience Platform, or custom solutions).<br> • Experience with data integration tools (Apache NiFi, Talend, Informatica).<br> • Familiarity with event-driven architectures, message queues (Kafka), and microservices.<br> • Strong understanding of data privacy, security, and governance.<br> • Bonus: Experience with Snowflake, BigQuery, Databricks, and Infrastructure as Code tools (Terraform, Pulumi, CloudFormation).<br><br><br>*** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. Also, you may contact me by office: 515-303-4654 or mobile: 515-771-8142. Or one click apply on our Robert Half website. No third party inquiries please. 
Our client cannot provide sponsorship and cannot hire C2C. ***
Position: SENIOR DATA INTEGRATION ENGINEER - ENTERPRISE DIGITAL TEAM<br> Location: REMOTE<br> Salary: UP TO $175,000 BASE + BONUS + BENEFITS<br> <br> *** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. ***<br> About the Role:<br> Join a newly launched Enterprise Digital team building cloud-native, mobile-first solutions from the ground up. You’ll design and implement a Customer Data Platform (CDP) and API frameworks to unify data across 45 operating companies.<br> Key Responsibilities:<br> • Develop scalable data pipelines and APIs using Python, SQL, and modern data tools.<br> • Build and optimize integrations across internal and third-party systems.<br> • Implement data governance, security, and compliance practices.<br> • Collaborate with product, engineering, and business teams on integration strategies.<br> Requirements:<br> • 10+ years with RESTful APIs and SQL/Python.<br> • 5+ years with cloud platforms (AWS, Azure, GCP) and tools (ADF, Glue, dbt, Airflow).<br> • Experience with CDPs (Segment, Salesforce CDP, Adobe Experience Platform).<br> • Familiarity with ETL/ELT, event-driven architectures, Kafka, microservices.<br> Nice to Have:<br> • IaC tools (Terraform, CloudFormation), data warehousing (Snowflake, BigQuery).<br> Why Join:<br> • Ground-floor opportunity in a fast-growing digital team.<br> • Influence architecture and technology decisions.<br> <br> *** For immediate and confidential consideration, please send a message to MEREDITH CARLE on LinkedIn or send an email to me with your resume. My email can be found on my LinkedIn page. Also, you may contact me by office: 515-303-4654 or mobile: 515-771-8142. Or one click apply on our Robert Half website. No third party inquiries please. Our client cannot provide sponsorship and cannot hire C2C. ***
<p>We are looking for an experienced <strong>Azure Data Architect</strong> to join our global technology team. This role plays a key part in designing and implementing scalable cloud-based data solutions that support enterprise integration, analytics, and transformation initiatives. The successful candidate will be skilled in working across multiple data sources, designing robust data pipelines, and supporting cloud data architecture with a focus on automation, performance, and governance.</p><p>This position focuses heavily on integrating and curating data from various systems into an Azure-based data environment. You'll collaborate with cross-functional teams to develop standards and frameworks for scalable ingestion, cleaning, transformation, and modeling across large datasets.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design and implement optimized data pipelines and architecture for efficient ingestion, transformation, and delivery across a wide range of data sources.</li><li>Analyze data quality and perform detailed investigations across structured and unstructured datasets.</li><li>Partner with data engineers and solution architects to apply best practices in data modeling, test design, validation, and integration.</li><li>Build and scale data platforms that meet both functional and non-functional requirements.</li><li>Develop curated and industrialized datasets to support analytical use cases and business intelligence reporting.</li><li>Document data architecture diagrams, lineage, metadata, and standards across enterprise systems.</li><li>Identify and design cloud-based solutions for migrating and hosting legacy on-prem datasets in a scalable, secure environment.</li><li>Work closely with business stakeholders to align data architecture to organizational objectives and operational needs.</li></ul><p><br></p>
We are looking for a skilled Senior C++ Software Engineer to join our client's team. This is a contract-to-permanent position and can be remote. This role focuses on developing innovative solutions for data recovery and incident response in critical situations, such as ransomware and malware attacks. The ideal candidate will possess strong technical expertise and a collaborative mindset, working closely with cross-functional teams to deliver impactful results.<br><br>Responsibilities:<br>• Design and implement advanced data recovery solutions to address ransomware and malware incidents.<br>• Analyze corrupt files to identify potential recovery methods, including reconstruction or data reversal.<br>• Conduct reverse engineering of malware and ransomware to support threat detection and mitigation.<br>• Collaborate with technical teams, including the Lead Developer and Customer Success Team, to prioritize and execute response strategies.<br>• Participate in regular team meetings to ensure alignment and progress on active projects.<br>• Solve complex technical challenges related to data recovery and security threats with innovative approaches.<br>• Contribute to the development of tools and strategies that enhance threat detection capabilities.<br>• Maintain clear communication with distributed teams to ensure seamless collaboration.
We are looking for a highly skilled Staff Software Engineer to join our Growth team in San Francisco, California. In this role, you will play a key part in enhancing our product's impact by developing and optimizing systems that drive user acquisition, engagement, retention, and monetization. You will collaborate closely with cross-functional teams, including engineering, product, design, and data, to deliver innovative solutions that significantly influence the company's growth.<br><br>Responsibilities:<br>Design and implement features across the full technology stack to enhance user acquisition, engagement, and retention.<br>Develop and optimize server-side event forwarding and maintain a clean, consistent event schema across platforms.<br>Ensure robust identity and attribution systems, enabling seamless tracking and data governance across web and app platforms.<br>Collaborate with designers to create and refine high-performance landing pages and micro-tools that improve conversion rates.<br>Work with cross-functional teams to identify impactful opportunities and conduct A/B testing to validate and implement growth strategies.<br>Enhance data pipelines to ensure accurate and reliable measurement for growth experiments and attribution.<br>Maintain and improve backend systems, including APIs and authentication processes.<br>Leverage technical expertise to automate processes and streamline data quality assurance tasks.
<p><strong>Title: Senior Backend Engineer (Remote – West Coast Required)</strong></p><p> <strong>Salary:</strong> Up to $200K + 15% Bonus</p><p> </p><p><strong>Overview</strong></p><p>We’re seeking a <strong>Senior Backend Engineer</strong> to join a fast-growing ecommerce technology team. This role is ideal for a hands-on developer who thrives in data-intensive, high-traffic environments and enjoys owning mission-critical backend systems. You’ll play a key role in building and optimizing inventory management and ecommerce integrations that ensure real-time accuracy across platforms.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Contribute as an individual developer on core backend systems for a large-scale ecommerce platform</li><li>Design and build microservices and RESTful APIs using <strong>Python (Django)</strong> and AWS infrastructure</li><li>Develop and deploy <strong>serverless applications (Lambda)</strong> and other AWS components for scalability and reliability</li><li>Collaborate with DevOps to implement CI/CD pipelines, automate testing, and improve local development environments</li><li>Partner with product and engineering leaders to deliver roadmap initiatives and technical improvements</li><li>Maintain system performance and data integrity for inventory synchronization across services</li><li>Participate in PR reviews, testing, and on-call rotations to ensure production stability</li></ul><p><br></p><p><strong>Requirements</strong></p><ul><li><strong>5–7+ years</strong> of backend development experience with <strong>Python</strong></li><li>Proficiency with <strong>Django</strong> (Flask experience acceptable)</li><li>Strong understanding of <strong>AWS infrastructure</strong> and DevOps practices</li><li>Experience in <strong>high-traffic ecommerce environments</strong> or similar large-scale systems</li><li>Familiarity with <strong>ETL, AWS Glue, Athena, S3, and PostgreSQL</strong></li><li>Experience building and securing 
RESTful APIs with <strong>OAuth/JWT</strong></li><li>Strong debugging skills and ability to work independently with minimal guidance</li><li>Excellent attention to detail and commitment to writing clean, maintainable code</li></ul><p>For immediate consideration, direct message Reid Gormly on LinkedIn today!</p>
<p>We are looking for a highly skilled Senior Full-Stack Engineer to join our team in Plano / Frisco, Texas. This role involves designing and building innovative web and mobile applications, focusing on delivering scalable and efficient solutions. The ideal candidate will have extensive experience in full-stack development and a deep understanding of modern technologies, including e-commerce platforms and cloud-native systems.</p><p><br></p><p>Responsibilities:</p><p>• Develop and maintain robust full-stack applications for web and mobile platforms using JavaScript, React or React Native, Node.js, Python, and PHP (OOP).</p><p>• Design scalable APIs, microservices, and event-driven systems to ensure optimal performance.</p><p>• Collaborate with Agile/Scrum teams to create high-quality, user-centric software solutions.</p><p>• Optimize and integrate data layers, tag management systems, and analytics platforms.</p><p>• Implement and enhance e-commerce solutions using Salesforce Commerce Cloud or similar frameworks.</p><p>• Set up and maintain CI/CD pipelines to streamline deployment processes.</p><p>• Leverage Cloudflare Workers to improve edge computing and system performance.</p><p>• Conduct thorough code reviews and provide mentorship to entry-level developers.</p><p>• Establish and enforce technical best practices across the team.</p><p>• Stay updated on emerging technologies and apply them to improve development processes.</p>
<p><strong>Job Title:</strong> Java/J2EE Developer </p><p><strong>Employment Type:</strong> 13 Week Contract to Hire</p><p><strong>Position Type:</strong> Individual Contributor</p><p><strong>Location: </strong>Columbus, OH (Hybrid 4x a week)</p><p><strong>Overview</strong></p><p>We are seeking a highly skilled <strong>IS Technical Specialist III</strong> to provide advanced technical and consultative support on complex systems and applications. This role involves designing, developing, and optimizing enterprise systems based on user specifications, troubleshooting hardware and software issues, and ensuring best practices in system development and support. The ideal candidate will have strong leadership capabilities, technical expertise, and the ability to work independently while collaborating with cross-functional teams.</p><p><strong>Key Responsibilities</strong></p><ul><li>Analyze, design, and develop systems based on user requirements and specifications.</li><li>Provide technical assistance and resolve complex hardware and software issues.</li><li>Lead projects and serve as a technical resource for team members and stakeholders.</li><li>Maintain in-depth knowledge of industry trends and emerging technologies to recommend innovative solutions.</li><li>Assist in identifying training needs and mentor less experienced staff as required.</li><li>Ensure adherence to best practices in system architecture, security, and performance optimization.</li></ul>
We are looking for an experienced Senior AI Engineer to join our team in San Francisco, California. In this role, you will leverage cutting-edge technologies to design and implement advanced AI solutions, ensuring optimal performance and scalability. This position offers the opportunity to work on innovative projects that push the boundaries of artificial intelligence.<br><br>Responsibilities:<br>• Develop and optimize AI models and systems to meet business objectives.<br>• Collaborate with cross-functional teams to integrate AI technologies into existing workflows.<br>• Design and implement algorithms using Python and other programming languages.<br>• Utilize Agentic AI and LangGraph to create intelligent solutions tailored to client needs.<br>• Conduct thorough testing and validation to ensure the accuracy and reliability of AI models.<br>• Analyze complex datasets to uncover insights and improve system performance.<br>• Stay updated on industry trends and advancements, applying them to enhance project outcomes.<br>• Provide technical guidance and mentorship to less experienced team members.<br>• Document processes and solutions to maintain transparency and facilitate knowledge sharing.
<p>Job Summary:</p><p><br></p><p>We are seeking a skilled and motivated System Engineer to join our team in New York City. As a System Engineer, you will be responsible for maintaining and optimizing our IT infrastructure, ensuring the smooth operation of our systems, and providing technical support to end-users. The ideal candidate should have 5+ years of experience in system administration, with a strong focus on Azure, Windows, Active Directory, VMware, Barracuda Backup appliances, Nimble SAN storage, and MFT/SFTP technologies.</p><p><br></p><p>Responsibilities:</p><p><br></p><p>Manage and maintain the company's IT infrastructure, including servers, storage systems, network devices, and related components.</p><p>Monitor and troubleshoot system performance, ensuring high availability and reliability of all systems.</p><p>Configure and administer Azure cloud services, including virtual machines, storage, networking, and security.</p><p>Oversee the Windows server environment, including installation, configuration, and maintenance of servers and services.</p><p>Manage Active Directory, including user accounts, group policies, security permissions, and domain services.</p><p>Perform virtualization tasks using VMware, including server provisioning, virtual machine management, and troubleshooting.</p><p>Administer and monitor the Barracuda Backup appliance for data backup and recovery operations.</p><p>Maintain and support Nimble SAN storage systems, ensuring optimal performance and availability.</p><p>Collaborate with cross-functional teams to implement and maintain managed file transfer (MFT) and secure file transfer protocol (SFTP) solutions.</p><p>Perform system upgrades, patches, and security updates in accordance with industry best practices.</p><p>Provide technical support to end-users, resolving issues related to hardware, software, and network connectivity.</p><p>Create and maintain documentation related to system configurations, 
procedures, and troubleshooting guides.</p>
<p>Robert Half is seeking a <strong>Backend Engineer</strong> to support a <strong>Financial Services/Insurance</strong> organization based in <strong>Remote (EST or PST preferred)</strong>. This role involves supporting <strong>underwriting and domain integration projects, building new APIs, and modernizing backend systems</strong>. The position is <strong>Remote</strong>, offered as a <strong>6+ month contract</strong> opportunity with <strong>potential to extend/convert</strong>. Apply today!</p><p><br></p><p><strong>Job Details:</strong></p><ul><li><strong>Schedule:</strong> Standard 40 hours/week (EST or PST acceptable)</li><li><strong>Duration:</strong> 6+ month contract (potential extension)</li><li><strong>Location:</strong> Remote (EST or PST preferred)</li></ul><p><br></p><p><strong>Job Responsibilities:</strong></p><ul><li>Write secure, performant backend code in Java (primary), C#, Node.js, or Python</li><li>Build and maintain new APIs (including 3rd-party integrations and authentication) as well as existing services</li><li>Participate in system design and architecture reviews to ensure scalability and alignment with enterprise standards</li><li>Develop and deploy cloud-native, event-driven, serverless applications in AWS</li><li>Support CI/CD pipeline automation and deployments (Azure DevOps, infrastructure-as-code modifications)</li><li>Manage and optimize SQL/Postgres data storage and ETL workflows; limited NoSQL/DynamoDB exposure</li><li>Provide root cause analysis and production support for backend issues</li><li>Mentor peers and promote best practices in development, DevOps, and testing</li></ul><p><br></p>