<p>Join our dynamic technology team as a Site Reliability Engineer (SRE) or Platform Engineer, where you’ll play a central role in building, automating, and maintaining our modern infrastructure across both on-premise and cloud environments.</p><p><strong>Qualifications:</strong></p><ul><li>Bachelor’s degree in Computer Science, Engineering, or a related technical field.</li><li>3–5+ years of experience in SRE, Platform Engineering, or Systems Administration within fast-paced environments.</li><li>Strong Python scripting skills.</li><li>Deep hands-on experience with Kubernetes (deployment, management, troubleshooting); OpenShift experience is a plus.</li><li>Proficiency with Docker/Podman and internal image management.</li><li>Solid experience with Ansible and Terraform; Puppet knowledge is helpful.</li><li>Familiarity with CI/CD workflows; experience with ArgoCD (preferred) or Flux for GitOps.</li><li>Proficiency with Grafana and Prometheus; exposure to Grafana Cloud/Alloy is desirable.</li><li>Experience with incident management and on-call tools such as Rootly, Opsgenie, or PagerDuty.</li><li>Security-first mindset with exposure to DevSecOps practices, including SonarQube, SAST, and CVE scanning.</li><li>Proven experience with both on-premise and cloud infrastructure:</li><li><strong>On-Premise:</strong> Primary experience with Kubernetes clusters; familiarity with Proxmox is desirable.</li><li><strong>Cloud:</strong> AWS and GCP experience (with a growing footprint), managed via Terraform.</li></ul><p>If you’re passionate about automation, reliability, and working at the forefront of scalable infrastructure, we invite you to apply.</p>
<p>Position Overview</p><p>We are seeking a Data Engineer to support and enhance a Databricks‑based data platform during its development phase. This role is focused on building reliable, scalable data solutions early in the lifecycle—not production firefighting.</p><p>The ideal candidate brings hands‑on experience with Databricks, PySpark, Python, and a working understanding of Azure cloud services. You will partner closely with Data Engineering teams to ensure pipelines, notebooks, and workflows are designed for long‑term scalability and production readiness.</p><p><br></p><p>Key Responsibilities</p><ul><li>Develop and enhance Databricks notebooks, jobs, and workflows</li><li>Write and optimize PySpark and Python code for distributed data processing</li><li>Assist in designing scalable and reliable data pipelines</li><li>Apply Spark performance best practices: partitioning, caching, joins, file sizing</li><li>Work with Delta Lake tables, schemas, and data models</li><li>Perform data validation and quality checks during development cycles</li><li>Support cluster configuration, sizing, and tuning for development workloads</li><li>Identify performance bottlenecks early and recommend improvements</li><li>Partner with Data Engineers to prepare solutions for future production rollout</li><li>Document development standards, patterns, and best practices</li></ul>
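The Spark performance practices listed in the Databricks role above (partitioning, caching, file sizing) all come back to how evenly a partition key distributes rows. A minimal plain-Python sketch, with no Spark dependency and hypothetical column names (`tenant_id`, `event_id`), simulates hash partitioning and the skew check an engineer might run before choosing a partition column:

```python
from collections import Counter

def partition_counts(rows, key, num_partitions):
    """Simulate Spark-style hash partitioning: each row lands in
    hash(value) % num_partitions, as a shuffle on that key would."""
    counts = Counter()
    for row in rows:
        counts[hash(row[key]) % num_partitions] += 1
    return counts

def skew_ratio(counts):
    """Largest partition divided by the mean partition size; values
    much greater than 1 indicate a skewed key and straggler tasks."""
    sizes = list(counts.values())
    return max(sizes) / (sum(sizes) / len(sizes))

# Hypothetical event rows: 'tenant_id' is dominated by one tenant,
# while 'event_id' is unique per row.
rows = [{"tenant_id": "t0" if i % 10 else f"t{i}", "event_id": i}
        for i in range(10_000)]

skewed = skew_ratio(partition_counts(rows, "tenant_id", 8))
even = skew_ratio(partition_counts(rows, "event_id", 8))
```

A skewed key concentrates most rows in one task, so one executor does nearly all the work; picking a high-cardinality key (or salting) flattens the ratio back toward 1.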
<p>.NET Software Engineer (Full Stack)</p><p>Location: Pasadena, CA (Fully On-Site)</p><p> Salary: Up to $135K </p><p> </p><p>We’re looking for an experienced .NET Engineer to help build and maintain modern, cloud-based applications and APIs. You’ll work closely with product and engineering partners to deliver scalable, high-quality solutions using current technologies.</p><p><br></p><p>Responsibilities</p><ul><li>Build and maintain .NET Core applications and RESTful APIs</li><li>Develop user-facing features with React</li><li>Work with Azure services including Azure SQL and Service Bus (or similar)</li><li>Deploy and support production systems in Azure Cloud</li><li>Collaborate in Agile development cycles</li><li>Troubleshoot issues and contribute to clean, well-documented code</li></ul><p>For immediate consideration, direct message Reid Gormly on LinkedIn and apply now!</p>
We are looking for a skilled Data Engineer to join our team in San Antonio, Texas. This role offers an opportunity to design, develop, and optimize data solutions that support business operations and strategic decision-making. The ideal candidate will possess a strong technical background, excellent problem-solving skills, and the ability to collaborate effectively across departments.<br><br>Responsibilities:<br>• Develop, maintain, and optimize data pipelines using Azure Synapse Analytics, Microsoft Fabric, and Azure Data Factory.<br>• Implement advanced data modeling techniques and design scalable BI solutions that align with business objectives.<br>• Create and maintain dashboards and reports using Power BI, ensuring data accuracy and usability.<br>• Integrate data from various sources, including APIs and Dataverse, into Azure Data Lake Storage Gen2.<br>• Utilize tools like Delta Lake and Parquet to manage and structure data within a lakehouse architecture.<br>• Define and implement BI governance frameworks to ensure consistent data standards and practices.<br>• Collaborate with cross-functional teams such as Operations, Sales, Engineering, and Accounting to gather requirements and deliver actionable insights.<br>• Troubleshoot, document, and resolve data issues independently while driving continuous improvement initiatives.<br>• Lead or contribute to Agile/Scrum-based projects to deliver high-quality data solutions within deadlines.<br>• Stay updated on emerging technologies and trends to enhance data engineering practices.
<p>We are seeking a highly skilled Software Engineer with strong experience in .NET Core, database design (SQL & NoSQL), Azure cloud architecture, and RESTful API development. This role is ideal for a developer who thrives in modern cloud-native environments and enjoys building scalable, high‑performance applications.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, develop, and maintain .NET Core applications following best practices in clean code, performance, and security.</li><li>Build and optimize RESTful APIs for internal and external integrations.</li><li>Design, implement, and tune SQL databases (e.g., SQL Server, Azure SQL), including stored procedures, indexing strategies, and query optimization.</li><li>Work with NoSQL technologies (e.g., Cosmos DB, MongoDB, Redis) where appropriate for distributed, high‑throughput workloads.</li><li>Architect and deploy cloud solutions using Microsoft Azure, leveraging services such as App Service / Functions, Azure SQL / Cosmos DB, API Management, Azure Storage, and (optionally) Azure Kubernetes Service (AKS).</li><li>Apply cloud‑native design principles to ensure scalability, resilience, and security.</li><li>Collaborate with cross-functional teams (product, QA, DevOps) to deliver high‑quality software.</li><li>Contribute to CI/CD pipelines and automated testing frameworks.</li><li>Participate in code reviews, architectural discussions, and mentoring of team members.</li></ul>
<p>Job Summary:</p><p><br></p><p>We are seeking a skilled and motivated System Engineer to join our team in New York City. As a System Engineer, you will be responsible for maintaining and optimizing our IT infrastructure, ensuring the smooth operation of our systems, and providing technical support to end users. The ideal candidate should have 5+ years of experience in system administration, with a strong focus on Azure, Windows, Active Directory, VMware, Barracuda Backup appliances, Nimble SAN storage, and MFT/SFTP technologies.</p><p><br></p><p>Responsibilities:</p><ul><li>Manage and maintain the company's IT infrastructure, including servers, storage systems, network devices, and related components.</li><li>Monitor and troubleshoot system performance, ensuring high availability and reliability of all systems.</li><li>Configure and administer Azure cloud services, including virtual machines, storage, networking, and security.</li><li>Oversee the Windows Server environment, including installation, configuration, and maintenance of servers and services.</li><li>Manage Active Directory, including user accounts, group policies, security permissions, and domain services.</li><li>Perform virtualization tasks using VMware, including server provisioning, virtual machine management, and troubleshooting.</li><li>Administer and monitor the Barracuda Backup appliance for data backup and recovery operations.</li><li>Maintain and support Nimble SAN storage systems, ensuring optimal performance and availability.</li><li>Collaborate with cross-functional teams to implement and maintain managed file transfer (MFT) and secure file transfer protocol (SFTP) solutions.</li><li>Perform system upgrades, patches, and security updates in accordance with industry best practices.</li><li>Provide technical support to end users, resolving issues related to hardware, software, and network connectivity.</li><li>Create and maintain documentation related to system configurations, procedures, and troubleshooting guides.</li></ul>
We are looking for an experienced Senior Azure Cloud Infrastructure Engineer to join our team in Minneapolis, Minnesota. In this role, you will lead efforts in designing, implementing, and optimizing cloud-based solutions while contributing to the growth of our platform engineering initiatives. This is an exciting opportunity to utilize your expertise in Azure and cloud technologies to support innovative infrastructure strategies.<br><br>Responsibilities:<br>• Design and implement cloud infrastructure solutions using Microsoft Azure, ensuring scalability and security.<br>• Develop Infrastructure as Code using tools such as Terraform and Bicep to automate deployments and improve efficiency.<br>• Collaborate with cross-functional teams to integrate cloud services with networking, security, and application systems.<br>• Create and maintain scripts in Python, PowerShell, or Bash to streamline operational processes.<br>• Monitor cloud environments and establish governance frameworks to maintain compliance and performance.<br>• Configure CI/CD pipelines to support continuous integration and delivery.<br>• Apply knowledge of full-stack engineering to connect cloud systems with data services and application design.<br>• Analyze and improve cloud networking and security models to enhance infrastructure reliability.<br>• Provide technical leadership and mentorship to team members, fostering skill development and collaboration.<br>• Stay updated on emerging cloud technologies and recommend innovative solutions to meet business needs.
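The Azure role above pairs Infrastructure as Code with Python scripting for governance. As an illustrative sketch (the required tag set and the resource inventory shape are hypothetical, e.g. parsed from CLI output), a small compliance check like this is the kind of operational script the posting describes:

```python
# Hypothetical governance rule: every resource must carry the tags
# that an Azure Policy assignment would enforce.
REQUIRED_TAGS = {"owner", "env", "cost-center"}

def non_compliant(resources):
    """Return {resource_id: missing_tags} for resources failing the policy."""
    report = {}
    for res in resources:
        missing = REQUIRED_TAGS - set(res.get("tags", {}))
        if missing:
            report[res["id"]] = sorted(missing)
    return report

# Hypothetical inventory, e.g. parsed from `az resource list` JSON.
inventory = [
    {"id": "vm-web-01", "tags": {"owner": "platform", "env": "prod",
                                 "cost-center": "1001"}},
    {"id": "st-logs-01", "tags": {"env": "prod"}},
]

report = non_compliant(inventory)
```

Run on a schedule, the report feeds dashboards or ticket creation, turning a written governance framework into an enforced one.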
We’re working with a Metro Atlanta–based client who is seeking a hands-on Security Engineer to shape and strengthen their security posture across cloud, network, and endpoint environments. This strategic role calls for technical expertise, independent ownership, and excellent communication skills. You’ll lead initiatives to modernize security tools and architecture, mentor entry-level team members, and collaborate closely with IT teams to ensure a robust, forward-looking security strategy.<br><br>Key Responsibilities:<br>• Administer, monitor, and optimize security tools, including CrowdStrike, Secureworks, Mimecast, and Entra ID.<br>• Design and maintain secure cloud (Azure) and on-prem environments.<br>• Lead incident response, risk assessments, and vulnerability remediation.<br>• Develop and enforce security policies, standards, and documentation.<br>• Collaborate with IT teams to ensure secure architecture and operational practices.<br>• Provide guidance, mentorship, and training to team members.
<p>We are looking for an experienced Senior Data Engineer to join our team in Denver, Colorado. In this role, you will design and implement data solutions that drive business insights and operational efficiency. You will collaborate with cross-functional teams to manage data pipelines, optimize workflows, and ensure the integrity and security of data systems.</p><p><br></p><p>Responsibilities:</p><p>• Develop and maintain robust data pipelines to process and transform large datasets effectively.</p><p>• Advise on tools / technologies to implement. </p><p>• Collaborate with stakeholders to understand data requirements and translate them into technical solutions.</p><p>• Design and implement ETL processes to facilitate seamless data integration.</p><p>• Optimize data workflows and ensure system performance meets organizational needs.</p><p>• Work with Apache Spark, Hadoop, and Kafka to build scalable data systems.</p><p>• Create and maintain SQL queries for data extraction and analysis.</p><p>• Ensure data security and integrity by adhering to best practices.</p><p>• Troubleshoot and resolve issues in data systems to minimize downtime.</p><p>• Provide technical guidance and mentorship to less experienced team members.</p><p>• Stay updated on emerging technologies to enhance data engineering practices.</p>
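The ETL responsibilities in the Denver role above follow a standard extract–transform–load shape. A minimal, standard-library-only sketch (the feed, column names, and dedup rule are hypothetical) shows the three stages the posting refers to:

```python
import csv
import io

# Hypothetical raw feed: extract CSV rows, normalize types and casing
# and drop duplicates (transform), then load keyed by order id.
RAW = """order_id,region,amount
1001,EAST,25.50
1002,west,10.00
1002,west,10.00
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    seen, out = set(), []
    for row in rows:
        key = row["order_id"]
        if key in seen:          # drop duplicate records
            continue
        seen.add(key)
        out.append({"order_id": int(key),
                    "region": row["region"].lower(),
                    "amount": float(row["amount"])})
    return out

def load(rows):
    return {row["order_id"]: row for row in rows}

warehouse = load(transform(extract(RAW)))
```

In practice each stage would be a separate, independently testable step in a Spark job or orchestrated pipeline, but the contract between stages is the same.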
<p>Robert Half is seeking a Systems Engineer with knowledge of modern Microsoft toolsets and best practices, including Windows Server/Active Directory, O365, Cloud (AWS), and DevOps.</p><p><br></p><p>The System Engineer supports the design, implementation, and administration of enterprise infrastructure across both on-premises and cloud environments. The role is responsible for a mix of Microsoft-based systems, virtual infrastructure, AWS-hosted services, backups, monitoring, and automation—ensuring stable, secure, and efficient operations.</p><p>As we modernize our technology platforms, this role helps bridge traditional systems administration with cloud-native operations, contributing to infrastructure solutions that enhance reliability, compliance, and service delivery.</p><p>Essential Functions</p><p>•Support and maintain Windows Server environments (2016-2022), including Active Directory, DNS, DHCP, and Group Policy.</p><p>•Manage Microsoft 365 services such as Exchange Online, SharePoint, Teams, and Intune endpoint management.</p><p>•Maintain Microsoft Entra ID (Azure AD) and Entra Connect to support secure hybrid identity and authentication management.</p><p>•Provision and manage AWS services including EC2, RDS, S3, IAM, and VPCs, following best practices for cost, security, and resilience.</p><p>•Automate infrastructure deployment and management using Terraform, AWS CloudFormation, and PowerShell scripting.</p><p>•Support hybrid workloads and infrastructure-as-code practices aligned with DevOps principles.</p><p>•Administer VMware and/or Hyper-V environments, ensuring effective capacity planning, resource optimization, and snapshot management.</p><p>•Perform lifecycle support for physical servers, SAN/NAS storage, UPS systems, and datacenter equipment.</p><p>•Coordinate server patching, hardware upgrades, and system migrations with minimal business disruption.</p><p>•Use Datadog to monitor system and application performance, configure dashboards and alerts, and 
conduct performance tuning.</p><p>•Ensure infrastructure complies with internal security policies and external regulatory standards (FFIEC, NCUA, GLBA).</p><p>•Support vulnerability remediation efforts, access control management, and audit response processes.</p><p>•Administer backup and recovery solutions using Rubrik and AWS native services such as AWS Backup and EBS/RDS Snapshots.</p><p>•Test, document, and support business continuity plans for both on-prem and cloud-hosted workloads.</p><p>•Serve as Tier-3 support for infrastructure and cloud-related issues escalated by the help desk or project teams.</p><p>•Collaborate with engineers, application owners, and vendors to implement secure, scalable infrastructure solutions.</p><p>•Contribute to system documentation, SOPs, and architecture diagrams to support operational consistency and knowledge sharing.</p><p>•Understand system interdependencies and how infrastructure impacts broader IT services and business outcomes; proactively identify and address potential risks or gaps.</p><p>•Other duties as assigned.</p>
<p>We are seeking a highly skilled Data Engineer to design, build, and manage our data infrastructure. The ideal candidate is an expert in writing complex SQL queries, designing efficient database schemas, and developing ETL/ELT pipelines. This role ensures data accuracy, accessibility, and performance optimization to support business intelligence, analytics, and reporting initiatives.</p><p><br></p><p><strong><em><u>Key Responsibilities</u></em></strong></p><p><br></p><p><strong>Database Design & Management</strong></p><ul><li>Design, develop, and maintain relational databases, including SQL Server, PostgreSQL, and Oracle, as well as cloud-based data warehouses.</li></ul><p><strong>Strategic SQL & Data Engineering</strong></p><ul><li>Develop advanced, optimized SQL queries, stored procedures, and functions to process and analyze large, complex datasets and deliver actionable business insights.</li></ul><p><strong>Data Pipeline Automation & Orchestration</strong></p><ul><li>Build, automate, and orchestrate ETL/ELT workflows using SQL, Python, and cloud-native tools to integrate and transform data from diverse, distributed sources.</li></ul><p><strong>Performance Optimization</strong></p><ul><li>Tune SQL queries and optimize database schemas through indexing, partitioning, and normalization to improve data retrieval and processing performance.</li></ul><p><strong>Data Integrity & Security</strong></p><ul><li>Ensure data quality, consistency, and integrity across systems.</li><li>Implement data masking, encryption, and role-based access control (RBAC).</li></ul><p><strong>Documentation</strong></p><ul><li>Maintain comprehensive technical documentation, including database schemas, data dictionaries, and ETL workflows.</li></ul>
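The performance-optimization duties above center on indexing to speed up retrieval. As a self-contained sketch, using SQLite (Python's standard library) as a stand-in for SQL Server or PostgreSQL, with an illustrative `orders` table, the same query switches from a full table scan to an index search once an index on the filter column exists:

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    """Return the query plan text (last column of each plan row)."""
    return " ".join(detail for *_, detail in
                    conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)   # full table scan: every row examined
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # index search: only matching rows touched
```

The same discipline of reading the optimizer's plan before and after a change carries over directly to `EXPLAIN` in PostgreSQL or execution plans in SQL Server.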
<p><strong>Robert Half</strong> is actively partnering with an Austin-based client to identify a <strong>Security Engineer (contract).</strong> In this role, you will play a critical part in building, implementing, and maintaining the core security controls that safeguard our cloud platform, internal systems, and end users. You will help strengthen the security posture of a large-scale SaaS environment by developing secure, resilient, and scalable security solutions. As a Security Engineer II, you’ll partner closely with engineering, IT, security operations, and governance teams to apply strong security practices across a modern cloud ecosystem. Your background in cloud security, automation, and foundational security concepts will enable you to design and operate automated controls throughout the environment. This position offers the opportunity to broaden your expertise and directly contribute to protecting millions of users. <strong>This role is onsite in Austin, TX. </strong></p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Configure, maintain, and enhance identity and access guardrails across cloud platforms (AWS, GCP, Azure) and enterprise identity systems (e.g., Okta).</li><li>Develop and support automated processes for asset visibility, inventory management, and SBOM (Software Bill of Materials) generation.</li><li>Assist in implementing data protection technologies, including encryption, secrets management, and key lifecycle operations.</li><li>Help define secure configurations for containerized workloads (Kubernetes, EKS) and infrastructure-as-code workflows (Terraform).</li><li>Work with engineering and product teams to validate and document system resilience strategies.</li><li>Support compliance and audit activities by collecting evidence and explaining security controls when needed.</li><li>Monitor, investigate, and respond to alerts from security tools and platforms; assist in driving remediation efforts.</li><li>Participate in the 
evaluation and testing of emerging security solutions and technologies.</li><li>Join the on-call rotation to provide after-hours support as required.</li></ul>
We are looking for an experienced Cloud Security Engineer to join our team in Philadelphia, Pennsylvania. In this role, you will play a pivotal part in safeguarding cloud environments and ensuring compliance with industry security standards. This is a contract-to-permanent position within the healthcare sector, offering the opportunity to make a meaningful impact while developing your expertise.<br><br>Responsibilities:<br>• Design and implement cloud security solutions across multiple platforms, including Microsoft Azure and AWS.<br>• Monitor, analyze, and respond to security incidents using tools such as Splunk, Azure Sentinel, and ArcSight SIEM.<br>• Collaborate with cross-functional teams to integrate security measures within cloud applications.<br>• Evaluate and maintain compliance with security standards and regulatory requirements.<br>• Develop and optimize DevSecOps practices to enhance system security.<br>• Manage identity and access protocols using Microsoft Entra ID.<br>• Perform regular security assessments and provide recommendations for improvements.<br>• Troubleshoot and resolve security-related issues in UNIX and Microsoft environments.<br>• Support the deployment and configuration of Lawson systems within secure frameworks.<br>• Stay informed about emerging security threats and implement proactive solutions.
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
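The real-time pipeline work in the posting above (Kafka, Event Hubs, Spark Streaming) usually boils down to windowed aggregation over a stream of timestamped events. A dependency-free sketch of a tumbling window, with hypothetical click events, shows the grouping that a `groupBy(window(...))` performs in a stream processor:

```python
from collections import defaultdict

def tumbling_counts(events, window_secs):
    """Bucket (timestamp, key) events into fixed, non-overlapping
    windows and count occurrences per (window_start, key)."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical click events as (epoch_seconds, page) pairs.
events = [(0, "home"), (5, "home"), (12, "home"), (13, "cart"), (21, "home")]
counts = tumbling_counts(events, window_secs=10)
```

A real streaming engine adds the hard parts (late data, watermarks, state checkpointing), but the window arithmetic is exactly this.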
We are looking for an experienced and driven Sr. Software Engineer to join our team in New York, New York. As part of a dynamic financial services organization, you will play a critical role in developing and optimizing software solutions that support quantitative investment strategies. This position offers the opportunity to collaborate with a skilled global team and contribute to cutting-edge advancements in the world of automated trading.<br><br>Responsibilities:<br>• Design, develop, and maintain software systems that support quantitative trading and investment strategies.<br>• Collaborate with cross-functional teams to implement research-tested solutions into production environments.<br>• Optimize and enhance backend systems to ensure high performance and reliability.<br>• Develop APIs and integrate third-party tools to support data analysis and trading operations.<br>• Ensure the quality and accuracy of data by implementing robust validation and improvement processes.<br>• Participate in Agile Scrum workflows to deliver projects efficiently and effectively.<br>• Create comprehensive documentation for software systems and processes.<br>• Conduct performance testing and debugging to ensure system stability.<br>• Stay updated on emerging technologies and incorporate relevant innovations into development practices.<br>• Provide mentorship and guidance to less experienced team members when needed.
<p>We are looking for a skilled Software Engineer to join our dynamic team in Bristol, Connecticut. In this role, you will contribute to the development of high-performance systems and APIs that support sports data and content distribution across a variety of platforms. This is a Long-term Contract position where you will collaborate with talented professionals to deliver innovative solutions that enhance user experiences and drive impactful results.</p><p><br></p><p>Job Description:</p><p> • Experience with Java and open-source technologies like Spring, Tomcat, MySQL, Kafka, Elasticsearch, etc.</p><p> • Exposure to cloud-based technologies such as AWS EC2, ECS, SQS, S3, Lambda, CloudFormation, etc.</p><p> • Exposure to the full lifecycle of application development, including best practices for unit testing, code reviews, documentation, etc.</p><p> • Motivated self-starter with the ability to learn and adapt to new technologies; familiarity with AI coding tools (Claude, Amazon Q, Copilot) is a plus</p>
We are looking for a Senior Data Architect to join our team on a long-term contract basis in Marysville, Ohio. In this role, you will design and maintain advanced data solutions to support analytics and business intelligence initiatives within the automotive industry. This position offers an exciting opportunity to work with cutting-edge technologies and contribute to the optimization of data systems and processes.<br><br>Responsibilities:<br>• Design and implement scalable data pipelines using cloud-based services such as Glue, S3, and Redshift.<br>• Build and orchestrate workflows with tools like Step Functions, EventBridge, and Lambda.<br>• Develop and integrate CI/CD pipelines using GitHub and other DevOps tools for automated deployments.<br>• Create conceptual, logical, and physical data models tailored for both operational and analytical systems.<br>• Optimize database queries, normalize datasets, and apply performance tuning strategies.<br>• Utilize Python and PySpark for data transformation, automation, and engineering tasks.<br>• Monitor and assess pipeline performance using tools like CloudWatch and Glue job logs.<br>• Troubleshoot and resolve issues related to data quality, system performance, and compliance.<br>• Implement metadata management practices and audit trails to ensure adherence to governance standards.<br>• Collaborate with cross-functional teams and stakeholders to align data architecture with business goals.
We are looking for an experienced Senior Data Engineer with a strong background in Python and modern data engineering tools to join our team in West Des Moines, Iowa. This is a long-term contract position that requires expertise in designing, building, and optimizing data pipelines and working with cloud-based data warehouses. If you thrive in a collaborative environment and have a passion for transforming raw data into actionable insights, we encourage you to apply.<br><br>Responsibilities:<br>• Develop, debug, and optimize Python-based data pipelines using frameworks such as Flask, Django, or FastAPI.<br>• Design and implement data transformations in a data warehouse using tools like dbt, ensuring high-quality analytics-ready datasets.<br>• Utilize Amazon Redshift and Snowflake for managing large-scale data storage and performing advanced querying and optimization.<br>• Automate data integration processes using platforms like Fivetran and orchestration tools such as Prefect or Airflow.<br>• Build reusable and maintainable data models to improve performance and scalability for analytics and reporting.<br>• Conduct data analysis and visualization leveraging Python libraries such as NumPy, Pandas, TensorFlow, and PyTorch.<br>• Manage version control for data engineering projects using Git and GitHub.<br>• Ensure data quality through automated testing and validation processes.<br>• Document workflows, code, and data transformations following best practices for readability and maintainability.<br>• Optimize cloud-based data warehouse and lake platforms for performance and integration of new data sources.
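The data-quality responsibility above ("automated testing and validation") is typically expressed as dbt's built-in tests (`not_null`, `unique`, `accepted_values`). As a plain-Python sketch of the same checks over a hypothetical row set, labeled columns and values are illustrative:

```python
def run_checks(rows, not_null, unique, accepted):
    """Apply dbt-style data tests to a list of dict rows and return
    a list of failure labels (empty list means all checks passed)."""
    failures = []
    for col in not_null:                      # not_null test
        if any(r.get(col) is None for r in rows):
            failures.append(f"not_null:{col}")
    for col in unique:                        # unique test
        vals = [r.get(col) for r in rows]
        if len(vals) != len(set(vals)):
            failures.append(f"unique:{col}")
    for col, allowed in accepted.items():     # accepted_values test
        if any(r.get(col) not in allowed for r in rows):
            failures.append(f"accepted_values:{col}")
    return failures

# Hypothetical staging rows with a duplicate id and a bad status.
rows = [
    {"id": 1, "status": "active"},
    {"id": 2, "status": "paused"},
    {"id": 2, "status": "retired"},
]
failures = run_checks(rows, not_null=["id"], unique=["id"],
                      accepted={"status": {"active", "paused"}})
```

In a dbt project the same intent lives declaratively in a model's YAML, and the orchestrator (Prefect or Airflow, per the posting) fails the run when any test returns rows.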
We are looking for an experienced Cloud Developer to join our team in Houston, Texas. In this long-term contract position, you will play a pivotal role in designing, implementing, and optimizing cloud-based solutions tailored to the energy and natural resources industry. This role offers an exciting opportunity to work on cutting-edge AI technologies and contribute to the development of innovative solutions.<br><br>Responsibilities:<br>• Design and deploy scalable cloud solutions using AWS and Azure services.<br>• Develop and integrate AI-powered applications to enhance operational efficiency.<br>• Implement automation tools, such as Ansible, to streamline processes and improve system performance.<br>• Optimize cloud infrastructure through advanced techniques like auto-scaling and resource allocation.<br>• Collaborate with cross-functional teams to align AI strategies with business goals.<br>• Create and maintain pipelines for data processing and machine learning workflows using Azure ML and Azure Pipelines.<br>• Conduct process modeling and mapping to identify opportunities for AI integration.<br>• Deploy and manage AI models on cloud platforms, ensuring reliability and scalability.<br>• Monitor and troubleshoot cloud environments to ensure optimal performance.<br>• Stay updated with emerging cloud technologies and AI advancements to drive innovation.
Robert Half is hiring! Apply today!<br><br>Responsibilities:<br>• Develops and maintains a deep understanding of all solutions, applications, infrastructure, and overall company mission and objectives.<br>• Maintains all public, private, and hybrid cloud servers and associated environments, email (Microsoft 365/Exchange), and any associated platforms.<br>• Maintains, troubleshoots, and supports Active Directory, DHCP, DNS, Microsoft SQL databases, Microsoft 365, Exchange Online, Entra ID, Teams, and backup software.<br>• Manages the data center and all associated equipment, including hardware, software, and related applications.<br>• Participates in and performs on-premise-to-cloud and cloud-to-cloud migrations.<br>• Researches, recommends, and implements innovative and automated approaches for system administration tasks.<br>• Implements, tests, and troubleshoots Active Directory Group Policies (GPOs) and other policies utilized by workstation and server endpoints.<br>• Maintains and performs systems backups, data recovery, and any associated configurations, including replications and archivals.<br>• Maintains SAN (Storage Area Network) environments in both the Operations Center (HQ) and Disaster Recovery sites, ensuring proper data replication continuity.<br>• Recommends, schedules, and performs hardware and software upgrades in a timely manner, including system security patches.<br>• Installs all equipment with appropriate security features/tools following hardening standards, and resolves security issues/vulnerabilities related to the server, storage, and associated environments.<br>• Maintains user environments, file shares, and their associated permissions.<br>• Provides end-user support for all hardware, software, and peripheral devices.<br>• Works within the IT Help Desk and project management platforms by assuming, updating, and resolving escalated service requests and projects.<br>• Mentors and provides guidance to other team members on requests/projects.
<p>The AI/ML Solutions Architect will lead the design, development, and deployment of advanced AI/ML solutions. This role combines deep technical expertise with strategic thinking to ensure AI/ML initiatives are successfully integrated into business operations. You will work closely with data scientists, engineers, and stakeholders to create architectures that maximize performance, scalability, and reliability.</p><p> </p><p><strong>Key Responsibilities:</strong></p><ul><li>Design end-to-end AI/ML architectures, including data pipelines, model training, deployment, and monitoring.</li><li>Collaborate with stakeholders to define AI/ML solution requirements aligned with business objectives.</li><li>Provide technical leadership and guidance to teams implementing AI/ML models and systems.</li><li>Develop scalable and secure solutions using cloud platforms (AWS, Azure, GCP) and MLOps best practices.</li><li>Ensure seamless integration of AI/ML models into existing IT systems and workflows.</li><li>Conduct feasibility studies, prototyping, and performance evaluations for new technologies and frameworks.</li><li>Stay updated on advancements in AI/ML and recommend innovative solutions to meet emerging needs.</li><li>Document technical designs, workflows, and implementation plans to ensure clarity and reproducibility.</li></ul><p><br></p>
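The monitoring stage of an end-to-end AI/ML architecture like the one described above often includes an input-drift check comparing live feature statistics against the training baseline. A stdlib-only illustrative sketch (the z-score approach, threshold, and function name are assumptions, not a prescribed method):

```python
from statistics import mean, stdev

def drift_alert(baseline: list[float], live: list[float],
                z_threshold: float = 3.0) -> bool:
    """Flag drift when the live batch mean deviates from the
    baseline mean by more than z_threshold standard errors."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return mean(live) != mu
    # Standard error of the live batch mean under the baseline spread.
    se = sigma / len(live) ** 0.5
    z = abs(mean(live) - mu) / se
    return z > z_threshold
```

Production monitoring would track many features and use richer tests (e.g. population stability index), but the cache of baseline statistics plus a per-batch comparison is the core of the pattern.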
Title: Software Engineer<br>The Software Engineer is a full-stack developer responsible for designing, developing, testing, and maintaining software applications that support the organization’s mission. This role combines technical expertise across multiple platforms and programming languages with a strong understanding of business objectives. The successful candidate will deliver high-quality, maintainable solutions within an agile team environment, independently executing moderate to complex technical work and contributing to architecture and implementation efforts.<br>The Software Engineer collaborates with teammates, clients, vendors, and internal partners to develop custom applications, enhance vendor systems, and support integrations connecting systems to broader organizational platforms. Responsibilities include system design, database development, API creation and consumption, testing, and support for on-premise and cloud-based solutions. The role also supports deployment automation, infrastructure maintenance, and DevOps practices. 
Strong analytical, problem-solving, and communication skills are essential for success.<br><br>Key Responsibilities<br><br>Design, develop, test, and maintain custom software solutions aligned with organizational standards.<br>Translate technical requirements into functional components.<br>Participate in architecture discussions and contribute to decisions on system structure, integration, and performance optimization.<br>Develop and consume APIs and web services for interoperability.<br>Maintain and enhance database-driven applications using Oracle PL/SQL and APEX.<br>Implement unit testing and automated testing frameworks, and follow version control and release management best practices.<br>Support infrastructure and applications in on-premises and cloud environments.<br>Contribute to deployment automation and DevOps workflows.<br>Create and maintain documentation for systems, integrations, and processes.<br>Collaborate with stakeholders to clarify requirements and deliver technical solutions.<br>Support vendor system implementations and integrations.<br>Participate in agile ceremonies, code reviews, and knowledge sharing.<br>Engage in continuous learning to stay current with tools, languages, and frameworks.<br>Demonstrate commitment to diversity, inclusion, and cultural awareness.<br><br><br>Required Qualifications<br><br>Bachelor’s degree in Computer Science or a related field.<br>5–8 years of professional software engineering experience.<br>Expertise in:<br><br>Full-stack development.<br>Oracle PL/SQL, SQL, and APEX for complex queries, data migrations, schema management, and performance tuning.<br>One or more programming languages (Python, JavaScript, Java, C#).<br>Front-end development (HTML, CSS, JavaScript).<br>RESTful APIs and system integration.<br>Version control (Git, SVN) and collaborative workflows.<br>Internet communication protocols (DNS, SMTP) and security protocols (SSL/TLS).<br><br><br>Familiarity with DevOps principles and CI/CD tools.<br>Strong problem-solving, communication, and teamwork skills.<br><br><br>Preferred Qualifications<br><br>Experience with vendor system integration and ERP platforms (Advancement, SIS, Finance, HR).<br>Knowledge of data governance and compliance standards (FERPA, HIPAA, GDPR).<br>Familiarity with cloud platforms (AWS, Azure, GCP) and containerization (Docker, Kubernetes).<br>Exposure to Agile methodologies and tools.<br>Understanding of institutional workflows.<br>Practical experience in database administration and application upgrade procedures.
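The Oracle PL/SQL and API work listed above leans heavily on parameterized queries. A self-contained sketch using Python's stdlib sqlite3 as a stand-in for an Oracle connection (in practice the oracledb driver and `:name` bind variables would be used; the table and column names are hypothetical):

```python
import sqlite3

def top_accounts(conn: sqlite3.Connection, min_balance: float) -> list:
    """Parameterized query: bind variables (`?` here, `:name` in
    Oracle) prevent SQL injection and let the engine reuse plans."""
    cur = conn.execute(
        "SELECT name, balance FROM accounts WHERE balance >= ? "
        "ORDER BY balance DESC",
        (min_balance,),
    )
    return cur.fetchall()

# In-memory fixture standing in for an Oracle schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("acme", 1200.0), ("globex", 300.0), ("initech", 950.0)])
```

The same bind-variable discipline applies whether the statement lives in application code or inside a PL/SQL package.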
<p>Robert Half Technology is looking to hire a <strong>SharePoint Developer</strong> with <strong>React</strong> experience for a firm based in Seattle, Washington. The ideal candidate should have extensive hands-on experience in cloud development, with deep expertise in Microsoft 365, SharePoint Online, and Azure cloud services. The candidate must have in-depth knowledge of the SharePoint Framework (or expertise in developing custom applications using modern front-end technologies such as React, Angular, or Vue, along with strong full-stack development skills), API development, cloud-native best practices, and DevOps processes to deliver high-quality, scalable solutions.</p><p><br></p><p><strong>Duration:</strong> 6-month contract-to-hire</p><p><strong>Schedule:</strong> Monday-Friday (Core Business Hours) PST - Flexible</p><p><strong>Location: </strong>100% Remote</p><p><br></p><p><strong>Hands-On Development:</strong></p><ul><li>Develop and maintain scalable applications using C#, SPFx, HTML, CSS, React (or framework of choice), TypeScript, JavaScript, Power Platform, Azure Cloud Services, Microsoft Graph API, PnP JS, PowerShell, SharePoint REST, and custom APIs.</li><li>Develop and optimize Azure Functions, Web APIs, Runbooks, and cloud-native solutions.</li><li>Automate workflows using Power Automate and create low-code/no-code solutions using Power Apps.</li><li>Apply best practices in code quality, testing, and deployment to ensure all solutions are robust, reliable, and secure.</li><li>Work independently to troubleshoot issues, resolving technical challenges and clearing dependencies in a timely manner.</li><li>Follow coding and compliance standards, and contribute to and maintain technical documentation.</li><li>Leverage AI tools and technologies throughout the development process to automate repetitive tasks, enhance code quality, and boost overall productivity.</li></ul><p><strong>Agile Practices, DevOps & Continuous 
Innovation:</strong></p><ul><li>Continuously learn and adopt the latest features and updates in leading cloud platforms and technologies. Explore and identify opportunities to integrate AI and intelligent features into solutions.</li><li>Collaborate closely with team members within an Agile framework: actively participate in sprint planning, assist in defining acceptance criteria, identify technical dependencies, and size user stories effectively to ensure clear requirements and alignment with sprint goals.</li><li>Utilize Azure DevOps and GitHub to plan, track, and document work, ensuring transparency and effective project coordination.</li></ul>
The Opportunity: Be part of a dynamic team that designs, develops, and optimizes data solutions supporting enterprise-level products across diverse industries. This role provides a clear track to higher-level positions, including Lead Data Engineer and Data Architect, for those who demonstrate vision, initiative, and impact.<br><br>Key Responsibilities:<br>• Design, develop, and optimize relational database objects and data models using Microsoft SQL Server and Snowflake.<br>• Build and maintain scalable ETL/ELT pipelines for batch and streaming data using SSIS and cloud-native solutions.<br>• Integrate and utilize Redis for caching, session management, and real-time analytics.<br>• Develop and maintain data visualizations and reporting solutions using Sigma Computing, SSRS, and other BI tools.<br>• Collaborate across engineering, analytics, and product teams to deliver impactful data solutions.<br>• Ensure data security, governance, and compliance across all platforms.<br>• Participate in Agile Scrum ceremonies and contribute to continuous improvement within the data engineering process.<br>• Support database deployments using DevOps practices, including version control (Git) and CI/CD pipelines (Azure DevOps, Flyway, Octopus, SonarQube).<br>• Troubleshoot and resolve performance, reliability, and scalability issues across the data platform.<br>• Mentor entry-level team members and participate in design/code reviews.
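The Redis caching responsibility above typically follows the cache-aside pattern: check the cache first, fall back to the source of truth on a miss, then populate the cache with a TTL. A minimal sketch with a plain dict standing in for a Redis client (redis-py's get/setex would fill that role in production; all names here are illustrative):

```python
import time

class CacheAside:
    """Cache-aside with TTL. A dict stands in for Redis so the
    sketch is self-contained; the access pattern is the same."""

    def __init__(self, loader, ttl_seconds: float = 60.0):
        self._loader = loader      # e.g. a SQL Server/Snowflake query
        self._ttl = ttl_seconds
        self._store = {}           # key -> (expires_at, value)
        self.misses = 0

    def get(self, key):
        entry = self._store.get(key)
        if entry and entry[0] > time.monotonic():
            return entry[1]        # cache hit, still fresh
        self.misses += 1
        value = self._loader(key)  # cache miss: hit the database
        self._store[key] = (time.monotonic() + self._ttl, value)
        return value
```

The TTL bounds staleness; for session management the same shape applies with the session ID as the key.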
<p>We are looking for a skilled Senior Software Engineer to join our dynamic team. In this role, you will be instrumental in developing and maintaining innovative digital solutions, including customer-facing applications and internal systems. Your expertise will drive the modernization of our software while leveraging cutting-edge technologies and cloud infrastructure to advance our business goals.</p><p><br></p><p>Responsibilities:</p><p>• Develop and maintain cloud-based applications using modern frameworks and programming languages, including .NET Core and web frontend technologies.</p><p>• Lead the creation of new software solutions, guiding projects from initial concept to final delivery.</p><p>• Modernize and improve existing internal systems to enhance functionality and performance.</p><p>• Apply and promote best practices in software development, such as agile methodologies, testing, and robust instrumentation.</p><p>• Utilize and refine CI/CD pipelines to enable rapid and efficient software deployment.</p><p>• Mentor and provide guidance to team members, fostering collaboration and growth.</p><p>• Troubleshoot and resolve technical issues swiftly to minimize disruptions.</p><p>• Communicate effectively with team members, management, and external stakeholders, ensuring clarity and a high standard of communication.</p><p>• Document workflows and project details using internal management systems.</p><p>• Demonstrate a commitment to continuous learning and improvement in software craftsmanship.</p>