<p>Robert Half is hiring a highly skilled and innovative Intelligent Automation Engineer to design, develop, and deploy advanced automation solutions using Microsoft Power Automate, Python, and AI technologies. This role is ideal for a hands-on technologist passionate about streamlining business processes, integrating systems, and applying cutting-edge AI to drive intelligent decision-making. This role is a hybrid position based in Philadelphia. For consideration, please apply directly. </p><p><br></p><p>Key Responsibilities</p><ul><li>Design and implement end-to-end automation workflows using Microsoft Power Automate (Cloud & Desktop).</li><li>Develop Python scripts and APIs to support automation, system integration, and data pipeline management.</li><li>Integrate Power Automate with Azure services (Logic Apps, Functions, AI Services, App Insights) and enterprise platforms such as SharePoint, Dynamics 365, and Microsoft Teams.</li><li>Apply Generative AI, LLMs, and Conversational AI to enhance automation with intelligent, context-aware interactions.</li><li>Leverage Agentic AI frameworks (LangChain, AutoGen, CrewAI, OpenAI Function Calling) to build dynamic, adaptive automation solutions.</li></ul>
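<p>For illustration of the Python-plus-Power Automate work described above: one common pattern is invoking a cloud flow from Python through its "When an HTTP request is received" trigger. The sketch below is a minimal, hypothetical example; the flow URL, payload fields, and function name are placeholders, not details from this posting.</p>
<pre>
# Minimal sketch: trigger a Power Automate cloud flow from Python via its
# HTTP request trigger. FLOW_URL and the payload schema are hypothetical
# placeholders; a real flow URL is generated by Power Automate itself.
import requests

FLOW_URL = "https://prod-00.westus.logic.azure.com/workflows/EXAMPLE/triggers/manual/paths/invoke"

def start_invoice_flow(invoice_id, amount):
    """Post a JSON payload that the flow's trigger schema is assumed to accept."""
    payload = {"invoiceId": invoice_id, "amount": amount, "source": "python-automation"}
    response = requests.post(FLOW_URL, json=payload, timeout=30)
    response.raise_for_status()          # surface HTTP errors to the caller
    return response.status_code          # 202 Accepted is typical for async flows

if __name__ == "__main__":
    print(start_invoice_flow("INV-1001", 249.99))
</pre>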
<p>We’re hiring a C++ Engineer to build high-performance, real-time software systems where speed, reliability, and precision matter. You’ll work on complex systems including simulations, processing engines, and performance-critical applications.</p><p>What You’ll Do</p><ul><li>Develop and optimize C++ applications for performance and scalability</li><li>Work on multi-threaded and real-time systems</li><li>Debug complex production issues and optimize memory and CPU usage</li><li>Collaborate with cross-functional teams to design robust architectures</li><li>Write clean, maintainable, and testable code</li></ul><p><br></p>
We are looking for a skilled Software Engineer to join our team in Bethlehem, Pennsylvania. This role involves designing and optimizing data systems, managing tools for data orchestration, and ensuring secure and efficient operations. The ideal candidate will thrive in a collaborative environment while delivering impactful solutions for business intelligence and operations.<br><br>Responsibilities:<br>• Build and manage data orchestration tools, including creating variables, setting notifications, and configuring retries.<br>• Optimize Snowflake performance by adjusting warehouse sizing, clustering, and profiling queries.<br>• Schedule and oversee near real-time data loads using Snowflake Tasks and Streams.<br>• Implement rigorous data quality checks such as verifying freshness, row counts, and referential integrity.<br>• Monitor and control costs through usage dashboards and guardrails.<br>• Ensure secure operations by maintaining roles, managing secrets, and auditing logs.<br>• Develop and monitor Power BI datasets to support Finance and Operations teams.<br>• Collaborate with stakeholders to gather requirements and deliver tailored solutions.<br>• Enhance and maintain front-end data applications using tools like Streamlit and Python.<br>• Create detailed documentation, including runbooks, root cause analyses, and change tickets for releases.
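<p>As an illustration of the Streamlit and Snowflake work described above, here is a minimal, hypothetical sketch of a front-end data app that runs one query and displays the result. The connection parameters, table, and column names are placeholders, not details from this posting.</p>
<pre>
# Minimal sketch of a Streamlit front end over Snowflake (hypothetical names).
# Assumes the streamlit and snowflake-connector-python packages are installed.
import snowflake.connector
import streamlit as st

def load_daily_orders():
    conn = snowflake.connector.connect(
        account="example_account",      # placeholder connection details
        user="example_user",
        password="example_password",
        warehouse="ANALYTICS_WH",
        database="SALES_DB",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT order_date, COUNT(*) AS orders "
            "FROM orders GROUP BY order_date ORDER BY order_date"
        )
        return cur.fetch_pandas_all()   # returns a pandas DataFrame
    finally:
        conn.close()

st.title("Daily Orders (illustrative)")
st.dataframe(load_daily_orders())
</pre>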
<p>We are looking for a skilled Data Engineer to design and enhance scalable data solutions that meet diverse business objectives. This role involves collaborating with cross-functional teams to identify data requirements, improve existing pipelines, and ensure efficient data processing. The ideal candidate will bring expertise in server-side development, database management, and software deployment, working in a dynamic and fast-paced environment.</p><p><br></p><p>Responsibilities</p><ul><li>Enhance and optimize existing data storage platforms, including relational and NoSQL databases, to improve data accessibility, performance, and persistence</li><li>Apply advanced database techniques such as tuning, indexing, views, and stored procedures to support efficient and reliable data management</li><li>Develop server-side Python services utilizing concurrency patterns such as asynchronous programming and multi-threading, and leveraging libraries such as NumPy and Pandas</li><li>Design, build, and maintain APIs using modern frameworks, with experience across communication protocols including gRPC and socket-based implementations</li><li>Create, manage, and maintain CI/CD pipelines using DevOps and artifact management tools to enable efficient and reliable software delivery</li><li>Design and deploy applications in enterprise Linux environments, ensuring stability, performance, and scalability</li><li>Partner with cross-functional teams to gather requirements and deliver technical solutions aligned with business objectives</li><li>Follow software development lifecycle best practices to ensure high-quality, maintainable, and secure solutions</li><li>Work effectively in iterative, fast-paced development environments while consistently delivering high-quality outcomes on schedule</li></ul><p><br></p>
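<p>To illustrate the asynchronous server-side concurrency pattern mentioned above, here is a minimal sketch using only the standard library's asyncio; the source names and delays are hypothetical stand-ins for real API or database calls.</p>
<pre>
# Minimal sketch of an async service pattern: fan out several I/O-bound
# calls concurrently, then combine the results.
import asyncio
import time

async def fetch_source(name, delay):
    await asyncio.sleep(delay)                 # stands in for awaiting real I/O
    return {"source": name, "rows": int(delay * 100)}

async def main():
    started = time.perf_counter()
    results = await asyncio.gather(            # run all fetches concurrently
        fetch_source("orders", 0.3),
        fetch_source("customers", 0.2),
        fetch_source("inventory", 0.4),
    )
    total_rows = sum(r["rows"] for r in results)
    print(f"fetched {total_rows} rows in {time.perf_counter() - started:.2f}s")

if __name__ == "__main__":
    asyncio.run(main())
</pre>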
<p>We are seeking a skilled <strong>Front‑End Engineer</strong> to help modernize a suite of legacy enterprise applications. This role focuses on <strong>rebuilding and enhancing user interfaces</strong> by porting existing MVC‑based applications to a modern, component‑driven front‑end architecture using <strong>React and Next.js</strong>.</p><p>While the current environment includes <strong>.NET MVC applications</strong>, prior .NET experience is <em>not</em> required. The priority is strong front‑end engineering expertise and a passion for improving usability, performance, and maintainability of legacy systems.</p><p>What You’ll Be Working On</p><ul><li>Rebuilding legacy UI layers and migrating them to modern <strong>React / Next.js</strong> applications</li><li>Modernizing user interfaces to align with current UX, accessibility, and performance standards</li><li>Collaborating with backend, product, and design teams to define UI requirements and integration points</li><li>Translating existing MVC views and workflows into reusable, scalable front‑end components</li><li>Improving application responsiveness, maintainability, and overall user experience</li><li>Helping establish front‑end best practices and standards as part of the modernization effort</li></ul><p><br></p>
We are looking for an experienced Level 2 Helpdesk Engineer to join our dynamic team in New York, New York. In this role, you will provide advanced technical support, troubleshoot complex issues, and ensure smooth operations across various IT environments. This position is ideal for professionals who excel in problem-solving and are passionate about delivering exceptional service in a fast-paced, hands-on environment.<br><br>Responsibilities:<br>• Provide Level 2 technical support to diagnose and resolve complex IT issues across infrastructure, cloud services, and endpoint management.<br>• Troubleshoot and manage Microsoft 365 applications, including Exchange Online, Teams, SharePoint, and OneDrive.<br>• Administer Windows Server and Windows OS environments, ensuring optimal performance and reliability.<br>• Support VMware vSphere and vCenter environments, including setup and maintenance.<br>• Design and implement Azure and hybrid cloud solutions to meet client needs.<br>• Utilize PowerShell to automate tasks and streamline IT operations.<br>• Deploy and manage endpoint solutions using Microsoft Intune and Windows Autopilot.<br>• Collaborate with Level 1 engineers to provide guidance and mentorship.<br>• Maintain detailed documentation of processes, solutions, and technical standards.<br>• Participate in root cause analysis and implement long-term fixes to recurring issues.
We are looking for a skilled End User Support Engineer to manage and maintain macOS devices within our enterprise environment in Jacksonville, Florida. This position focuses on utilizing Jamf Pro for endpoint management and ensuring all systems meet organizational standards for security, compliance, and performance. The ideal candidate will have hands-on experience with macOS administration and a strong ability to provide advanced technical support and automation solutions.<br><br>Responsibilities:<br>• Administer and configure Jamf Pro for managing the lifecycle of macOS devices, including enrollment, updates, and compliance enforcement.<br>• Develop and implement policies, scripts, and profiles to streamline Mac management and enhance security.<br>• Collaborate with the Information Security team to ensure macOS devices meet compliance standards for encryption, updates, and application controls.<br>• Provide tier-3 technical support for macOS-related issues, including hardware, operating system, and applications.<br>• Integrate Jamf Pro with enterprise systems such as Intune and identity management tools to enhance functionality.<br>• Create and maintain detailed documentation, standard operating procedures, and knowledge articles for Mac support processes.<br>• Monitor device health, compliance status, and update progress through Jamf Pro and related tools.<br>• Plan and execute macOS upgrade cycles and manage application deployment strategies.<br>• Coordinate with vendors for Apple hardware procurement and lifecycle management.
We are looking for a highly skilled Software Engineer specializing in Angular and UI/UX design to join our team in West Des Moines, Iowa. In this long-term contract position, you will play a crucial role in enhancing user experience and developing intuitive front-end solutions. This opportunity is ideal for professionals passionate about crafting innovative and efficient interfaces.<br><br>Responsibilities:<br>• Design and implement dynamic front-end solutions using Angular to support user-centric applications.<br>• Collaborate with stakeholders to develop and refine user interface designs that enhance usability and engagement.<br>• Ensure seamless integration of UI/UX elements with back-end systems for optimal performance.<br>• Utilize agile methodologies to deliver high-quality software solutions within set timelines.<br>• Conduct thorough testing and debugging to ensure the reliability and functionality of applications.<br>• Participate in code reviews and provide constructive feedback to improve development processes.<br>• Stay updated on emerging trends and technologies in front-end development and UI/UX design.<br>• Support and maintain existing applications, ensuring they meet current standards and user needs.<br>• Work closely with cross-functional teams to align software development with project requirements.<br>• Contribute to the overall improvement of the accounting system's interface and functionality.
The Opportunity: Be part of a dynamic team that designs, develops, and optimizes data solutions supporting enterprise-level products across diverse industries. This role provides a clear track to higher-level positions, including Lead Data Engineer and Data Architect, for those who demonstrate vision, initiative, and impact.<br><br>Key Responsibilities:<br>• Design, develop, and optimize relational database objects and data models using Microsoft SQL Server and Snowflake.<br>• Build and maintain scalable ETL/ELT pipelines for batch and streaming data using SSIS and cloud-native solutions.<br>• Integrate and utilize Redis for caching, session management, and real-time analytics (see the sketch below).<br>• Develop and maintain data visualizations and reporting solutions using Sigma Computing, SSRS, and other BI tools.<br>• Collaborate across engineering, analytics, and product teams to deliver impactful data solutions.<br>• Ensure data security, governance, and compliance across all platforms.<br>• Participate in Agile Scrum ceremonies and contribute to continuous improvement within the data engineering process.<br>• Support database deployments using DevOps practices, including version control (Git) and CI/CD pipelines (Azure DevOps, Flyway, Octopus, SonarQube).<br>• Troubleshoot and resolve performance, reliability, and scalability issues across the data platform.<br>• Mentor entry-level team members and participate in design/code reviews.
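<p>A minimal sketch of the cache-aside pattern behind the Redis item above, using the redis-py client; the host, key format, and lookup function are hypothetical placeholders.</p>
<pre>
# Minimal cache-aside sketch with redis-py (hypothetical keys and host).
# Falls back to the "database" on a cache miss and stores the result with a TTL.
import json
import redis

cache = redis.Redis(host="localhost", port=6379, db=0)

def query_database(customer_id):
    # Placeholder for a real SQL Server / Snowflake lookup.
    return {"customer_id": customer_id, "ltv": 1234.56}

def get_customer(customer_id, ttl_seconds=300):
    key = f"customer:{customer_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                      # cache hit
    record = query_database(customer_id)               # cache miss
    cache.setex(key, ttl_seconds, json.dumps(record))  # expire after TTL
    return record

print(get_customer(42))
</pre>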
<p>We are looking for a skilled and innovative Data Engineer to join our team in Grove City, Ohio. In this role, you will be responsible for designing and implementing advanced data pipelines, ensuring the seamless integration and accessibility of data across various systems. As a key player in our analytics and data infrastructure efforts, you will contribute to building a robust and scalable data ecosystem to support AI and machine learning initiatives.</p><p><br></p><p>Responsibilities:</p><p>• Design and develop scalable data pipelines to ingest, process, and transform data from multiple sources.</p><p>• Optimize data models to support analytics, forecasting, and AI/ML applications.</p><p>• Collaborate with internal teams and external partners to enhance data engineering capabilities.</p><p>• Implement and enforce data governance, security, and quality standards across hybrid cloud environments.</p><p>• Work closely with analytics and data science teams to ensure seamless data accessibility and integration.</p><p>• Develop and maintain data products and services to enable actionable insights.</p><p>• Troubleshoot and improve the performance of data workflows and storage systems.</p><p>• Align data systems across departments to create a unified and reliable data infrastructure.</p><p>• Support innovation by leveraging big data tools and frameworks such as Databricks and Spark.</p>
<p><strong>Robert Half</strong> is actively partnering with an Austin-based client to identify a <strong>Security Engineer (contract).</strong> In this role, you will play a critical part in building, implementing, and maintaining the core security controls that safeguard our cloud platform, internal systems, and end users. You will help strengthen the security posture of a large-scale SaaS environment by developing secure, resilient, and scalable security solutions. As a Security Engineer II, you’ll partner closely with engineering, IT, security operations, and governance teams to apply strong security practices across a modern cloud ecosystem. Your background in cloud security, automation, and foundational security concepts will enable you to design and operate automated controls throughout the environment. This position offers the opportunity to broaden your expertise and directly contribute to protecting millions of users. <strong>This role is onsite in Austin, Tx. </strong></p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Configure, maintain, and enhance identity and access guardrails across cloud platforms (AWS, GCP, Azure) and enterprise identity systems (e.g., Okta).</li><li>Develop and support automated processes for asset visibility, inventory management, and SBOM (Software Bill of Materials) generation.</li><li>Assist in implementing data protection technologies, including encryption, secrets management, and key lifecycle operations.</li><li>Help define secure configurations for containerized workloads (Kubernetes, EKS) and infrastructure-as-code workflows (Terraform).</li><li>Work with engineering and product teams to validate and document system resilience strategies.</li><li>Support compliance and audit activities by collecting evidence and explaining security controls when needed.</li><li>Monitor, investigate, and respond to alerts from security tools and platforms; assist in driving remediation efforts.</li><li>Participate in the evaluation and testing of emerging security solutions and technologies.</li><li>Join the on-call rotation to provide after-hours support as required.</li></ul>
<p>We are a technology-driven organization building advanced, hardware-integrated software platforms used by customers in real-world environments. Our products combine data capture, analytics, and intuitive user interfaces to deliver reliable, high-impact solutions at scale.</p><p> </p><p>We’re hiring a <strong>Software Quality Assurance Engineer</strong> to join our team. This dual-focused role is responsible for fully automating the software testing suite while improving the operational efficiency of applications deployed in the field. The ideal candidate bridges quality assurance and technical support, with a strong emphasis on automation, diagnostics, and continuous improvement.</p><p> </p><p><strong><u>Key Responsibilities</u></strong></p><p> </p><p><strong>I. Software Quality & Automation Framework Development</strong></p><ul><li>Design, build, and maintain robust, scalable test automation frameworks.</li><li>Develop and execute automated UI and API tests using tools such as Selenium WebDriver and Postman.</li><li>Participate in software release cycles, performing validation and providing technical feedback to engineering teams.</li></ul><p><strong>II. Operational Efficiency & Tooling</strong></p><ul><li>Develop and maintain scripts (PowerShell, Bash) to automate system diagnostics, support tasks, and operational workflows.</li><li>Build lightweight internal tools to enhance diagnostic capabilities and streamline support processes (Python experience is a strong plus).</li><li>Document testing, automation, and support processes using tools such as Confluence, and manage workflows in Jira.</li></ul><p><strong>III. Technical Support & Cloud Operations</strong></p><ul><li>Resolve technical issues by providing timely, effective support via phone, email, and remote sessions, with a strong focus on Windows-based environments.</li><li>Use SQL and database expertise (including MySQL) to perform complex data queries and analysis while ensuring data integrity during troubleshooting.</li><li>Monitor and troubleshoot applications deployed in AWS environments, demonstrating foundational knowledge of EC2, S3, and CloudWatch.</li></ul><p> </p>
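<p>For illustration, here is a minimal Selenium WebDriver check in Python of the kind of automated UI test described above; the URL, locator, and expected text are hypothetical, and real suites would typically run under a test runner such as pytest.</p>
<pre>
# Minimal UI test sketch with Selenium WebDriver (hypothetical URL and locators).
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

def test_login_page_title():
    driver = webdriver.Chrome()                      # assumes a local chromedriver
    try:
        driver.get("https://example.com/login")      # placeholder application URL
        heading = WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located((By.TAG_NAME, "h1"))
        )
        assert "Sign in" in heading.text             # hypothetical expected text
    finally:
        driver.quit()

if __name__ == "__main__":
    test_login_page_title()
</pre>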
<p>Position Overview</p><p>We are seeking a Data Engineer to support and enhance a Databricks-based data platform during its development phase. This role is focused on building reliable, scalable data solutions early in the lifecycle, not production firefighting.</p><p>The ideal candidate brings hands-on experience with Databricks, PySpark, Python, and a working understanding of Azure cloud services. You will partner closely with Data Engineering teams to ensure pipelines, notebooks, and workflows are designed for long-term scalability and production readiness.</p><p><br></p><p>Key Responsibilities</p><ul><li>Develop and enhance Databricks notebooks, jobs, and workflows</li><li>Write and optimize PySpark and Python code for distributed data processing</li><li>Assist in designing scalable and reliable data pipelines</li><li>Apply Spark performance best practices: partitioning, caching, joins, file sizing (see the sketch below)</li><li>Work with Delta Lake tables, schemas, and data models</li><li>Perform data validation and quality checks during development cycles</li><li>Support cluster configuration, sizing, and tuning for development workloads</li><li>Identify performance bottlenecks early and recommend improvements</li><li>Partner with Data Engineers to prepare solutions for future production rollout</li><li>Document development standards, patterns, and best practices</li></ul>
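<p>A minimal PySpark sketch of the partitioning, caching, and Delta Lake practices referenced above; the paths, table layout, and column names are hypothetical placeholders.</p>
<pre>
# Minimal Databricks/PySpark sketch (hypothetical paths and columns):
# read a Delta table, cache a reused subset, and write partitioned output.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dev-pipeline-sketch").getOrCreate()

events = spark.read.format("delta").load("/mnt/dev/raw/events")       # placeholder path

recent = events.filter(F.col("event_date") >= "2024-01-01").cache()   # reused below

daily_counts = recent.groupBy("event_date", "event_type").count()

(daily_counts.repartition("event_date")          # align file layout with the partition column
    .write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("/mnt/dev/curated/daily_event_counts"))  # placeholder output path
</pre>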
Key Responsibilities:<br><br>· Design, develop, and maintain scalable backend systems to support data warehousing and data lake initiatives.<br><br>· Build and optimize ETL/ELT processes to extract, transform, and load data from various sources into centralized data repositories.<br><br>· Develop and implement integration solutions for seamless data exchange between systems, applications, and platforms.<br><br>· Collaborate with data architects, analysts, and other stakeholders to define and implement data models, schemas, and storage solutions.<br><br>· Ensure data quality, consistency, and security by implementing best practices and monitoring frameworks.<br><br>· Monitor and troubleshoot data pipelines and systems to ensure high availability and performance.<br><br>· Stay up-to-date with emerging technologies and trends in data engineering and integration to recommend improvements and innovations.<br><br>· Document technical designs, processes, and standards for the team and stakeholders.<br><br><br><br>Qualifications:<br><br>· Bachelor’s degree in Computer Science, Engineering, or a related field; equivalent experience considered.<br><br>· Proven experience as a Data Engineer with 5 or more years of experience; or in a similar backend development role.<br><br>· Strong proficiency in programming languages such as Python, Java, or Scala.<br><br>· Hands-on experience with ETL/ELT tools and frameworks (e.g., Apache Airflow, Talend, Informatica, etc.).<br><br>· Extensive knowledge of relational and non-relational databases (e.g., SQL, NoSQL, PostgreSQL, MongoDB).<br><br>· Expertise in building and managing enterprise data warehouses (e.g., Snowflake, Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake).<br><br>· Familiarity with cloud platforms (AWS, Azure, Google Cloud) and their data services.<br><br>· Experience with API integrations and data exchange protocols (e.g., REST, SOAP, JSON, XML).<br><br>· Solid understanding of data governance, security, and compliance standards.<br><br>· Strong analytical and problem-solving skills with attention to detail.<br><br>· Excellent communication and collaboration abilities.<br><br><br><br>Preferred Qualifications:<br><br>· Certifications in cloud platforms (AWS Certified Data Analytics, Azure Data Engineer, etc.)<br><br>· Experience with big data technologies (e.g., Apache Hadoop, Spark, Kafka).<br><br>· Knowledge of data visualization tools (e.g., Tableau, Power BI) for supporting downstream analytics.<br><br>· Familiarity with DevOps practices and tools (e.g., Docker, Kubernetes, Jenkins).
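<p>As a small illustration of the orchestration tools listed above (Apache Airflow here), the sketch below defines a minimal daily DAG; the task logic, schedule, and names are hypothetical placeholders.</p>
<pre>
# Minimal Apache Airflow DAG sketch (hypothetical task logic and schedule).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull rows from a source system or API.
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]

def transform_and_load(**context):
    rows = context["ti"].xcom_pull(task_ids="extract")
    total = sum(r["amount"] for r in rows)
    print(f"loaded {len(rows)} rows, total amount {total}")  # stand-in for a warehouse load

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # "schedule_interval" on older Airflow 2.x releases
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="transform_and_load", python_callable=transform_and_load)
    extract_task >> load_task
</pre>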
<p>Join our dynamic technology team as a Site Reliability Engineer (SRE) or Platform Engineer, where you’ll play a central role in building, automating, and maintaining our modern infrastructure across both on-premise and cloud environments.</p><p><strong>Qualifications:</strong></p><ul><li>Bachelor’s degree in Computer Science, Engineering, or a related technical field.</li><li>3–5+ years of experience in SRE, Platform Engineering, or Systems Administration within fast-paced environments.</li><li>Strong Python scripting skills.</li><li>Deep hands-on experience with Kubernetes (deployment, management, troubleshooting); OpenShift experience is a plus.</li><li>Proficiency with Docker/Podman and internal image management.</li><li>Solid experience with Ansible and Terraform; Puppet knowledge is helpful.</li><li>Familiarity with CI/CD workflows; experience with ArgoCD (preferred) or Flux for GitOps.</li><li>Proficiency with Grafana and Prometheus; exposure to Grafana Cloud/Alloy is desirable.</li><li>Experience with incident management and on-call tools such as Rootly, Opsgenie, or PagerDuty.</li><li>Security-first mindset with exposure to DevSecOps practices, including SonarQube, SAST, and CVE scanning.</li><li>Proven experience with both on-premise and cloud infrastructure:</li><li><strong>On-Premise:</strong> Primary experience with Kubernetes clusters; familiarity with Proxmox is desirable.</li><li><strong>Cloud:</strong> AWS and GCP experience (with a growing footprint), managed via Terraform.</li></ul><p>If you’re passionate about automation, reliability, and working at the forefront of scalable infrastructure, we invite you to apply.</p>
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming (see the sketch below).</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
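<p>A minimal PySpark Structured Streaming sketch for the real-time pipeline item above, reading from Kafka and writing to Delta; the brokers, topic, and storage paths are hypothetical placeholders, and an Event Hubs source would use its Kafka-compatible endpoint.</p>
<pre>
# Minimal Spark Structured Streaming sketch: Kafka in, Delta out.
# Brokers, topic, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")   # placeholder brokers
       .option("subscribe", "events")                        # placeholder topic
       .option("startingOffsets", "latest")
       .load())

# Kafka values arrive as bytes; cast to string before downstream parsing.
events = raw.select(F.col("value").cast("string").alias("body"),
                    F.col("timestamp"))

query = (events.writeStream
         .format("delta")
         .outputMode("append")
         .option("checkpointLocation", "/mnt/chk/events")    # placeholder path
         .start("/mnt/delta/events"))                        # placeholder path

query.awaitTermination()
</pre>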
<p>We are looking for a skilled Data Warehouse Engineer to join our team in Malvern, Pennsylvania. This Contract-to-Permanent position offers the opportunity to work with cutting-edge data technologies and contribute to the optimization of data processes. The ideal candidate will have a strong background in Azure and Snowflake, along with experience in data integration and production support. This role requires four days onsite per week, and that requirement is not negotiable. Please apply directly if you're interested.</p><p><br></p><p>Responsibilities:</p><p>• Develop, configure, and optimize Snowflake-based data solutions to meet business needs.</p><p>• Utilize Azure Data Factory to design and implement efficient ETL processes.</p><p>• Provide production support by monitoring and managing data workflows and tasks.</p><p>• Extract and analyze existing code from Talend to facilitate system migrations.</p><p>• Stand up and configure data repository processes to ensure seamless performance.</p><p>• Collaborate on the migration from Talend to Azure Data Factory, providing expertise on best practices.</p><p>• Leverage Python scripting to enhance data processing and automation capabilities.</p><p>• Apply critical thinking to solve complex data challenges and support transformation initiatives.</p><p>• Maintain and improve Azure Fabric-based solutions for data warehousing.</p><p>• Work within the context of financial services, ensuring compliance with industry standards.</p>
<p>Robert Half Technology is hiring a <strong>Senior Automation Engineer</strong> to support an expanding enterprise SaaS platform. This is a hands-on, highly technical role where you’ll <strong>advance automation strategy and architecture</strong>, strengthen test coverage across the stack, and partner closely with Engineering, UI/UX, DevOps, and Product to embed quality throughout the SDLC. 🚀</p><p>If you thrive in agile teams, enjoy building scalable automation frameworks, and can deliver quickly while raising engineering standards, this role is for you. ✅</p><p><strong> </strong></p><p>Responsibilities</p><ul><li>Own and evolve automation capabilities across business-critical testing processes (UI, API, integration, back-end, performance).</li><li>Design, develop, and execute automated tests across all layers of the stack:</li><li><strong>UI automation:</strong> Selenium</li><li><strong>API testing:</strong> Rest Assured (or equivalent)</li><li><strong>Data validation & test data creation:</strong> SQL</li><li>Serve as a full-stack Senior Automation Engineer in agile teams using <strong>Selenium + Maven in IntelliJ</strong>, applying strong <strong>Java/OOP</strong> design patterns and coding standards.</li><li>Build and maintain automation for functional, integration, API, and back-end tests (including file upload/download, email parsing, multi-browser, and security validations).</li><li>Develop advanced SQL queries to validate data integrity and generate automated test data.</li><li>Support containerized test execution: build/maintain frameworks within <strong>Docker</strong> and partner with DevOps on scalable environments.</li><li>Implement and execute performance/load testing frameworks; analyze results and communicate quality insights.</li><li>Contribute to unit test strategies; collaborate with developers to increase application-level test coverage.</li><li>Establish/enforce coding standards and conduct code reviews for QA automation initiatives.</li><li>Mentor automation engineers and promote best practices across the QA organization.</li><li>Troubleshoot and resolve blockers from automated regression suites; improve suite stability and reliability.</li><li>Participate in defect management, bug triage, backlog grooming, and sprint ceremonies; provide actionable QA feedback during design/build phases.</li><li>Support QA leadership with metrics, reporting, and trend analysis to measure automation effectiveness and product quality. 📈</li></ul><p><br></p>
<p>A <strong>Robert Half client</strong> is seeking a <strong>Senior Full Stack Engineer</strong> to join their product development team. This team designs and builds innovative web and mobile software experiences that support a fast-paced, global entertainment environment serving agents, executives, artists, and business partners.</p><p>This role is ideal for a hands-on engineer who enjoys building scalable applications, mentoring others, and contributing to product direction in a collaborative, agile environment.</p><p><br></p><p>As a <strong>Senior Full Stack Engineer</strong>, you will play a leadership role in designing, developing, and delivering high-quality web and mobile applications along with supporting APIs and services.</p><p><br></p><p>Responsibilities include:</p><ul><li>Designing and developing scalable web and mobile applications</li><li>Building and maintaining robust APIs and backend services</li><li>Providing architectural input and making key technical decisions</li><li>Leading code reviews and enforcing best practices and coding standards</li><li>Mentoring junior developers and fostering a culture of continuous improvement</li><li>Collaborating with product managers, designers, and engineers in an agile environment</li><li>Engaging with users and stakeholders to gather feedback and iteratively improve products</li><li>Supporting ongoing delivery and maintenance of production applications</li></ul><p><br></p>
<p><strong>Location:</strong> Hybrid — <em>2 days per month on-site in New Hampshire</em></p><p><strong>Employment Type:</strong> Full-Time</p><p><strong>About the Role</strong></p><p>We’re seeking a talented <strong>Software Engineer</strong> with deep experience in <strong>Oracle APEX</strong> and <strong>PL/SQL. </strong>You should also have a strong background integrating third-party applications like <strong>Salesforce</strong>. This role is ideal for someone who enjoys collaborating with cross-functional teams, designing scalable solutions, and enhancing business systems through thoughtful engineering and integrations.</p><p><br></p><p>As part of our team, you’ll play a key role in building and maintaining applications that drive critical business workflows. You’ll leverage your Oracle APEX expertise to architect solutions and your integration experience to ensure smooth data flows between platforms.</p><p>This is a <strong>hybrid position</strong>, requiring <strong>two days per month on-site in New Hampshire</strong> for team collaboration, planning, or project workshops.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design, develop, and maintain applications using <strong>Oracle Application Express (APEX)</strong>.</li><li>Build, optimize, and troubleshoot <strong>integrations with third-party systems</strong>, including Salesforce and other enterprise platforms.</li><li>Develop APIs, data pipelines, and middleware solutions to support seamless cross-system communication.</li><li>Collaborate with business stakeholders to gather requirements and translate them into technical specifications.</li><li>Ensure application performance, security, and reliability through best practices.</li><li>Participate in code reviews, testing, deployment, and documentation of software solutions.</li><li>Support ongoing enhancements, bug fixes, and system improvements.</li></ul><p><strong>Required Qualifications</strong></p><ul><li><strong>Hands-on experience with Oracle APEX</strong> development.</li><li>Proven experience designing and implementing <strong>Salesforce integrations</strong> (REST/SOAP APIs, middleware tools, or direct platform integration).</li><li>Strong proficiency with <strong>SQL, PL/SQL</strong>, and Oracle database structures.</li><li>Experience working with APIs, integration frameworks, and data transformation workflows.</li><li>Solid understanding of software development best practices, including version control, testing, and documentation.</li><li>Excellent analytical, troubleshooting, and communication skills.</li><li>Ability to work in a hybrid environment and be on-site in New Hampshire <strong>twice per month</strong>.</li></ul><p><strong>Preferred Qualifications</strong></p><ul><li>Experience with additional integration platforms (e.g., MuleSoft, Boomi, Workato).</li><li>Background working in enterprise environments or supporting mission-critical systems.</li><li>Familiarity with Agile methodologies.</li><li>Knowledge of secure coding practices and data governance.</li></ul>
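<p>For the Salesforce REST integration work described above, here is a minimal sketch shown in Python purely for illustration; the instance URL, API version, access token, and SOQL query are placeholders, and a production integration would handle OAuth token acquisition and refresh.</p>
<pre>
# Minimal Salesforce REST query sketch (hypothetical instance, token, and SOQL).
import requests

INSTANCE_URL = "https://example.my.salesforce.com"   # placeholder org instance
ACCESS_TOKEN = "REPLACE_WITH_OAUTH_TOKEN"            # obtained via an OAuth flow
API_VERSION = "v58.0"                                # placeholder API version

def query_accounts(soql):
    url = f"{INSTANCE_URL}/services/data/{API_VERSION}/query"
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    response = requests.get(url, headers=headers, params={"q": soql}, timeout=30)
    response.raise_for_status()
    return response.json().get("records", [])

if __name__ == "__main__":
    for record in query_accounts("SELECT Id, Name FROM Account LIMIT 5"):
        print(record.get("Id"), record.get("Name"))
</pre>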
<p>We are looking for a skilled Software Engineer IV to join our team in Boca Raton, Florida. This long-term contract position offers an exciting opportunity to work on cutting-edge cloud-native architecture and enterprise services platforms. You will play a vital role in designing, developing, and optimizing secure, scalable middleware solutions while contributing to the overall evolution of our systems.</p><p><br></p><p><strong>6 month contract with option for extension. </strong></p><p><br></p><p>Responsibilities:</p><p>• Develop and implement high-performance C++ services using modern standards to ensure scalability and efficiency.</p><p>• Design and manage cloud-native microservices within Kubernetes, focusing on optimization and resource allocation in Azure.</p><p>• Collaborate with architects and security teams to ensure platform compliance and implement "Security by Design" principles.</p><p>• Build and maintain RESTful APIs and middleware interfaces to facilitate seamless data exchange across applications.</p><p>• Monitor, troubleshoot, and improve production systems, ensuring high availability and performance.</p><p>• Lead technical discussions and provide guidance to entry level developers to foster a culture of innovation and excellence.</p><p>• Execute full lifecycle development, including requirements gathering, automated testing, and deployment through CI/CD pipelines.</p><p>• Partner with DevOps and Site Reliability Engineering teams to enhance system reliability and scalability.</p><p>• Stay up-to-date with industry trends and integrate emerging technologies into existing systems.</p><p>• Optimize database performance and integrate relational and NoSQL databases into the platform.</p>
<p><strong>POSITION OVERVIEW</strong></p><p>The Desktop Engineer focuses on providing high-level desktop support and systems engineering services. The individual in this position will manage and enhance internal tech infrastructure related to endpoints, automation, and software deployment. The responsibilities include scripting, image lifecycle support, patch management, application packaging, and cross-platform device maintenance. The position also plays a role in training and technical communication, ensuring security standards are upheld within the environment.</p><p><br></p><p><strong>KEY RESPONSIBILITIES</strong></p><p>Some of the core tasks and expectations include:</p><ul><li>Prepare and deploy endpoint devices using modern management tools such as Ivanti and Intune.</li><li>Maintain imaging standards and adjust based on departmental use cases and hardware evolution.</li><li>Build and support automation scripts (e.g., PowerShell, AutoIT) for device setup and software installs.</li><li>Provide support across both Windows and macOS ecosystems, including patching and compliance monitoring.</li><li>Curate and publish applications to a self-service portal after validation/testing.</li><li>Collaborate with IT operations and project teams to schedule, test, and roll out updates and patches.</li><li>Leverage ticketing and reporting systems (e.g., ServiceNow, Power BI) for issue resolution and trend tracking.</li><li>Support both virtual and physical desktops, including incident management and hardware repair.</li><li>Contribute to IT initiatives like system refreshes, process enhancements, and cross-team coordination.</li><li>Escalate technical barriers as needed while maintaining strong end-user communication.</li><li>Participate in formalized processes such as incident, change, and problem management workflows.</li></ul><p><br></p>
We are looking for a skilled Sr. Software Engineer to join our team in Peekskill, New York. In this role, you will leverage your expertise in data analysis and business intelligence tools to drive actionable insights and support organizational goals. The ideal candidate will be detail-oriented, adept at multitasking, and capable of working independently while delivering high-quality results.<br><br>Responsibilities:<br>• Analyze and interpret complex datasets from multiple sources, including claims data and electronic health records, to identify trends and actionable insights.<br>• Develop and document technical specifications, training materials, and provide ongoing system support to ensure sustainability.<br>• Apply actuarial concepts to evaluate risk contracts, focusing on areas such as cost of care, disease management, and quality measures.<br>• Transform datasets through quantitative and qualitative analysis to uncover patterns and predict outcomes that optimize health and financial performance.<br>• Communicate findings effectively across teams and stakeholders, using both verbal and visual presentations.<br>• Design and implement data-driven strategies to identify opportunities, mitigate risks, and meet client-specific needs.<br>• Collaborate with IT teams to address data quality issues and enhance the reliability of datasets.<br>• Assess contract performance against financial metrics and patient satisfaction indicators.<br>• Maintain organized workflows and ensure timely delivery of projects while paying attention to detail.
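<p>As a small illustration of the kind of claims-data analysis described above, the pandas sketch below computes per-member cost summaries; the column names, figures, and threshold are entirely hypothetical.</p>
<pre>
# Minimal pandas sketch: summarize hypothetical claims data per member.
import pandas as pd

claims = pd.DataFrame({
    "member_id": ["M1", "M1", "M2", "M3", "M3", "M3"],
    "claim_type": ["rx", "inpatient", "rx", "outpatient", "rx", "inpatient"],
    "paid_amount": [120.0, 5400.0, 80.0, 650.0, 95.0, 7200.0],
})

summary = (claims.groupby("member_id")
           .agg(total_paid=("paid_amount", "sum"),
                claim_count=("paid_amount", "size"))
           .sort_values("total_paid", ascending=False))

# Flag members above a hypothetical cost-of-care threshold.
summary["high_cost"] = summary["total_paid"] > 5000

print(summary)
</pre>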
We are looking for a skilled Data Engineer to join our team on a long-term contract basis in Columbus, Ohio. This role focuses on optimizing database performance, implementing efficient data solutions, and ensuring the reliability of data-driven processes. The ideal candidate will have a strong background in SQL Server and Azure SQL technologies, with a commitment to delivering high-quality, scalable solutions.<br><br>Responsibilities:<br>• Analyze and enhance stored procedures to improve database performance and efficiency.<br>• Develop and implement indexing strategies to optimize query execution.<br>• Collaborate with teams to refine reporting views and synchronize processes effectively.<br>• Conduct thorough code reviews to ensure adherence to quality standards and scalability.<br>• Utilize diagnostic tools to identify and resolve performance bottlenecks in SQL Server environments.<br>• Design and maintain reliable database solutions that meet organizational needs.<br>• Provide technical guidance and collaborate on troubleshooting efforts.<br>• Document processes and solutions to ensure clarity and future usability.<br>• Stay updated with advancements in SQL and Azure technologies to apply best practices.
Robert Half is hiring! Apply today!<br><br>Responsibilities:<br>• Develops and maintains a deep understanding of all solutions, applications, infrastructure, and overall company mission and objectives.<br>• Maintains all public, private, and hybrid cloud servers and associated environments, email (Microsoft 365/Exchange), and any associated platforms.<br>• Maintains, troubleshoots, and supports Active Directory, DHCP, DNS, Microsoft SQL databases, Microsoft 365, Exchange Online, Entra ID, Teams, and backup software.<br>• Manages the data center and all associated equipment, including hardware, software, and related applications.<br>• Participates in and performs on-premise-to-cloud and cloud-to-cloud migrations.<br>• Researches, recommends, and implements innovative and automated approaches for system administration tasks.<br>• Implements, tests, and troubleshoots Active Directory Group Policies (GPOs) and other policies utilized by workstation and server endpoints.<br>• Maintains and performs systems backups, data recovery, and any associated configurations, including replications and archivals.<br>• Maintains SAN (Storage Area Network) environments in both the Operations Center (HQ) and Disaster Recovery sites, ensuring proper data replication continuity.<br>• Recommends, schedules, and performs hardware and software upgrades in a timely manner, including system security patches.<br>• Installs all equipment with appropriate security features and tools following hardening standards, and resolves security issues and vulnerabilities related to the server, storage, and associated environments.<br>• Maintains user environments, file shares, and their associated permissions.<br>• Provides end-user support for all hardware, software, and peripheral devices.<br>• Works within the IT Help Desk and project management platforms by assuming, updating, and resolving escalated service requests and projects.<br>• Mentors and provides guidance to other team members on requests/projects.