We are looking for a skilled Senior Front End Engineer to join our team in New York, New York. In this role, you will design and develop high-quality, scalable web applications and interfaces, ensuring seamless user experiences. The ideal candidate is passionate about technology, thrives in a collaborative environment, and has a proven track record of delivering commercial products.<br><br>Responsibilities:<br>• Build and maintain high-performance, user-centric web interfaces and APIs.<br>• Develop and implement solutions using React, Redux, TypeScript, and JavaScript.<br>• Collaborate closely with team members to deliver multiple products efficiently.<br>• Leverage cloud platforms to create scalable and reliable real-time systems.<br>• Troubleshoot and debug issues to ensure system reliability and performance.<br>• Participate in the entire software development lifecycle, including testing and deployment.<br>• Stay informed of emerging technologies and industry trends to enhance system capabilities.<br>• Apply attention to detail to create intuitive and engaging user interfaces.<br>• Optimize web applications for speed, scalability, and responsiveness.
<p><strong>About the Role:</strong></p><p>Are you fluent in RPG, CL, and the mystical language of green screens? Do you dream in fixed-format code and occasionally wonder why the AS400 is still running everything important in the world? If so, we need you!</p><p><strong>Responsibilities:</strong></p><ul><li>Develop, maintain, and enhance AS400 applications (because yes, they still exist—and they’re mission-critical).</li><li>Collaborate with business teams to turn requirements into elegant, old-school magic.</li><li>Debug issues like a detective in a world where the clues are all in uppercase.</li></ul><p><strong>Requirements:</strong></p><ul><li>Strong experience with AS400/iSeries programming (RPG, CL, DB2).</li><li>Ability to work independently and occasionally explain to younger developers what a “green screen” is.</li><li>A sense of humor—because you’ll need it when someone asks, “Can’t we just move this to the cloud?”</li></ul><p><strong>Why Join Us?</strong></p><ul><li>Competitive salary and benefits.</li><li>Work with technology that’s older than some of your coworkers—but still more reliable.</li><li>Be the hero who keeps the legacy alive while everyone else is chasing microservices.</li></ul><p><strong>Apply Today!</strong></p><p>If you’re ready to keep the AS400 humming and prove that old-school is still cool, send us your resume. Bonus points if you can tell us your favorite RPG joke.</p>
We are seeking a Senior Data Engineer to join a growing data engineering team responsible for building and scaling an enterprise data platform. This role will focus on developing cloud-based data pipelines within Google Cloud Platform (GCP) while also supporting elements of a legacy on-premise data warehouse environment during an ongoing cloud migration.<br><br>The ideal candidate will have strong experience building scalable data pipelines, event-driven data architectures, and cloud-native data services. This is a great opportunity to contribute to a rapidly expanding data ecosystem and help drive the transition to modern cloud data platforms.<br><br>Key Responsibilities<br><br>Design, build, and maintain data pipelines within Google Cloud Platform (GCP)<br><br>Develop event-driven data streaming solutions using Pub/Sub<br><br>Build and maintain Python-based services using Cloud Run<br><br>Develop and optimize BigQuery datasets and queries<br><br>Integrate new data sources into the enterprise data platform<br><br>Maintain and support existing ETL processes within SQL Server<br><br>Work with SSIS and stored procedures in legacy data environments<br><br>Monitor, troubleshoot, and optimize data pipeline performance<br><br>Collaborate with engineering teams to support data-driven initiatives<br><br>Participate in on-call rotations for production systems<br><br>Required Qualifications<br><br>5+ years of experience in Data Engineering<br><br>Strong experience with Google Cloud Platform (GCP)<br><br>Experience building data pipelines and ETL processes<br><br>Experience with Pub/Sub or event-driven data streaming<br><br>Strong experience with BigQuery<br><br>Proficiency in Python<br><br>Experience with Cloud Run or similar serverless services<br><br>Strong SQL experience including SQL Server<br><br>Experience with SSIS or similar ETL tools
We are looking for an experienced DevOps Engineer to join our team in Orem, Utah. In this role, you will collaborate with cross-functional teams to build, deploy, and maintain scalable and reliable systems, ensuring seamless integration and automation of workflows. You will work with cutting-edge technologies to enhance the infrastructure and optimize the development lifecycle.<br><br>Responsibilities:<br>• Design, implement, and maintain infrastructure-as-code solutions using tools like Terraform or AWS CDK.<br>• Develop and optimize containerized deployments using Docker and Kubernetes.<br>• Collaborate with software developers to integrate DevOps practices into the software development lifecycle.<br>• Set up and manage CI/CD pipelines using Bitbucket Pipelines or similar tools.<br>• Monitor and troubleshoot system performance using AWS services such as CloudWatch.<br>• Build scalable backend systems using .NET Core and C#, ensuring efficient data handling with PostgreSQL and Entity Framework Core.<br>• Develop and maintain frontend systems with React, TypeScript, and state management libraries like Redux Toolkit.<br>• Implement authentication and authorization solutions using AWS Cognito.<br>• Enhance testing frameworks and tools, including Vitest, Playwright, and React Testing Library.<br>• Support microservices architecture and ensure seamless communication between components.
<p>The Senior Data Engineer plays a key role in architecting, developing, and operating reliable, production-ready data solutions that enable analytics, automation, and operational processes across our client’s organization.</p><p><br></p><p>Operating within a modern, cloud-based data ecosystem, this role is responsible for bringing together data from internal platforms and external partners, transforming it into trusted, high-quality assets, and delivering it consistently to downstream users and systems. The work spans the full data lifecycle—ingestion, orchestration, transformation, and delivery—and blends advanced SQL development with Python-based pipeline and workflow automation.</p><p><br></p><p>This role sits at the intersection of data and systems engineering and works closely with Business Intelligence, Business Technology, and operational teams to ensure data solutions are scalable, dependable, and aligned with real business outcomes.</p><p><br></p>
We are looking for an experienced SAP Developer to join our team in San Antonio, Texas. This role involves designing, implementing, and supporting SAP solutions tailored to meet business needs while ensuring high-quality development standards. The ideal candidate will possess strong technical expertise and a proactive approach to problem-solving within a collaborative environment.<br><br>Responsibilities:<br>• Develop and maintain SAP solutions, including SAP ERP and S/4HANA, to support business operations.<br>• Collaborate with functional teams to identify and propose technical solutions for new business requirements.<br>• Create, test, and implement integrations using SAP PI/PO and other SAP technologies.<br>• Write comprehensive technical specifications and provide training and support during go-live phases.<br>• Troubleshoot and resolve SAP-related issues, ensuring minimal disruption to business processes.<br>• Drive continuous improvement initiatives within the SAP environment, delivering system enhancements and projects.<br>• Assist in SAP template rollouts and adhere to established methodologies.<br>• Work in a global, multi-country SAP environment, contributing to international projects as required.<br>• Ensure compliance with configuration management practices and maintain documentation standards.<br>• Stay updated on emerging SAP technologies and contribute to their adoption within the organization.
We are looking for an experienced Software Engineer to develop and maintain applications within the Symitar platform, contributing to the enhancement of our core banking systems. This role requires a proactive approach to creating tailored solutions that align with business needs while ensuring system performance and security compliance.<br><br>Responsibilities:<br>• Create and manage Symitar PowerOn scripts, batch jobs, and integrations to meet operational objectives.<br>• Support system upgrades and troubleshoot issues to ensure seamless functionality.<br>• Work closely with stakeholders to gather requirements and deliver customized solutions.<br>• Design, generate, and maintain reports using Symitar Quest and associated tools.<br>• Monitor and optimize system performance while ensuring data integrity and security standards.<br>• Provide ongoing production support and participate in the on-call rotation as required.<br>• Collaborate on system enhancements and ensure compatibility with core banking processes.
<p>We are looking for a Systems Administrator to help drive technology modernization and operational excellence. We are seeking IT professionals with a strong background in systems administration or infrastructure operations, eager to support mission-critical initiatives and grow into senior infrastructure roles.</p><p><strong>Qualifications:</strong></p><ul><li>2–5 years of experience in systems administration or infrastructure operations; helpdesk/NOC backgrounds with relevant exposure also considered.</li><li>Proven expertise in Linux system administration.</li><li>Familiarity with enterprise infrastructure, including storage, virtualization, and networking.</li><li>Hands-on experience with monitoring systems such as Zabbix, Grafana, or Prometheus.</li><li>Basic scripting skills (e.g., Bash, Python) and a strong interest in further developing automation capabilities.</li><li>Excellent written communication for documentation and process development.</li><li>Ability to respond quickly and decisively during support rotation and system issues.</li><li>Comfortable leveraging AI tools for troubleshooting, documentation, and automation with a disciplined approach to validating outputs.</li><li>Growth mindset: eagerness to learn, develop, and advance into senior infrastructure roles over time.</li></ul><p><br></p>
We are looking for a skilled Software Engineer to join our team in Bethlehem, Pennsylvania. This role involves designing and optimizing data systems, managing tools for data orchestration, and ensuring secure and efficient operations. The ideal candidate will thrive in a collaborative environment while delivering impactful solutions for business intelligence and operations.<br><br>Responsibilities:<br>• Build and manage data orchestration tools, including creating variables, setting notifications, and configuring retries.<br>• Optimize Snowflake performance by adjusting warehouse sizing, clustering, and profiling queries.<br>• Schedule and oversee near real-time data loads using Snowflake Tasks and Streams.<br>• Implement rigorous data quality checks such as verifying freshness, row counts, and referential integrity.<br>• Monitor and control costs through usage dashboards and guardrails.<br>• Ensure secure operations by maintaining roles, managing secrets, and auditing logs.<br>• Develop and monitor Power BI datasets to support Finance and Operations teams.<br>• Collaborate with stakeholders to gather requirements and deliver tailored solutions.<br>• Enhance and maintain front-end data applications using tools like Streamlit and Python.<br>• Create detailed documentation, including runbooks, root cause analyses, and change tickets for releases.
<p>We’re seeking a dynamic Infrastructure Engineer to design, build, and maintain our critical IT infrastructure. You’ll ensure high availability, scalability, and security across servers, storage, and cloud systems. If you’re passionate about creating resilient systems and solving complex technical challenges, join us to power our technology backbone!</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design and deploy infrastructure solutions, including servers, virtualization, and storage systems.</li><li>Manage and optimize on-premises and cloud environments (e.g., AWS, Azure, VMware).</li><li>Automate provisioning, monitoring, and scaling using tools like Terraform or Ansible.</li><li>Troubleshoot and resolve issues impacting system performance or availability.</li><li>Implement security best practices, including patch management and access controls.</li><li>Collaborate with DevOps and application teams to support CI/CD pipelines.</li><li>Document infrastructure designs and maintain operational runbooks.</li></ul><p><br></p>
<p>As a Backend Senior Software Engineer I, you are a trusted technical contributor who helps lead the design and delivery of complex features and systems within a domain. Deep expertise allows you to set direction for the team, mentor others through code reviews, pairing, and project leadership, and influence both team processes and technical outcomes. You promote engineering excellence and partner closely with cross-functional stakeholders to ensure our technical decisions align with product and business goals.</p><p><br></p><p>How We Work Together</p><p>Strategic Technical Leadership</p><p>• You lead technical decision-making with attention to both immediate customer impact and long-term business needs.</p><p>• You develop roadmaps in collaboration with product, the team, and leadership to guide future direction.</p><p>• You define architectural improvements that reduce operational overhead and strengthen scalability and reliability.</p><p>Cross-Team Collaboration & Influence</p><p>• You identify and advocate for cross-team improvements, optimizing workflows and enhancing tools with broad organizational impact.</p><p>• You foster collaboration through active feedback, shared learning, and a commitment to technical excellence.</p><p>• You communicate clearly and engagingly, introducing challenging ideas thoughtfully and translating complex concepts into accessible insights.</p><p>Problem Solving & Overcoming Obstacles</p><p>• You carefully balance trade-offs between business priorities, technical constraints, and customer needs to make data-driven decisions.</p><p>• You simplify systems where possible, reducing technical debt and improving long-term maintainability and scalability.</p><p>Mentorship & Growth</p><p>• You mentor others through coaching, feedback, and participation in technical hiring, ensuring our culture and mission are well represented.</p><p>• You seek and share continuous feedback, supporting both your own development and the growth of the 
team.</p><p><br></p><p>Ownership & Accountability</p><p>• You own end-to-end system design and execution for high-impact features or systems, with an emphasis on quality and reliability.</p><p>• You build in monitoring, logging, and failure-domain strategies to ensure resilience and performance.</p><p>• You track project progress, assess risks, and adapt plans while maintaining a long-term perspective.</p><p>Technology</p><p>• Backend engineer with a preference toward Java/Kotlin</p><p>• Deep expertise in designing, building, and maintaining large-scale systems, focusing on clean, scalable, and performant code.</p><p>• Proven SME in modern programming paradigms and languages, with rapid adaptability to new ones.</p><p>• Strong mastery of testing practices like unit, integration, end-to-end, and high code quality standards.</p><p>• Expertise in current dev tools: version control, CI/CD, IaC, containers, cloud architectures. Eager to adopt best-fit solutions.</p><p>• Passion for tracking industry trends, evaluating new tech, and driving sustainable adoptions aligned with business goals.</p><p>• Openness to incorporating AI into the dev workflow</p><p>• Bonus for Marketplace or Apps development</p><p>• Bonus for exposure to Kong or similar</p><p>• Bonus for Full Stack capabilities</p>
<p>Design, build, and optimize scalable data pipelines and data platforms within Azure environments. Collaborate with analytics and business teams to ensure reliable data ingestion, transformation, and accessibility for reporting and advanced analytics.</p>
<p>We’re hiring a <strong>C++ Software Engineer</strong> to build and maintain high‑quality, performant backend services and libraries that power core product features. You’ll work across the full SDLC—design, implementation, testing, performance tuning, and release—partnering with product and platform teams.</p><p><strong>What You’ll Do</strong></p><ul><li>Design and implement <strong>robust, modern C++ (C++17/20)</strong> services, libraries, and APIs</li><li>Optimize for <strong>performance, memory, and reliability</strong> (profiling, benchmarking, CPU/memory analysis)</li><li>Build <strong>unit/integration tests</strong> and contribute to CI/CD pipelines</li><li>Debug complex, concurrent systems (threads, async I/O, lock‑free patterns where applicable)</li><li>Collaborate on <strong>system design</strong> (docs, diagrams, trade‑offs) and code reviews</li><li>Integrate with <strong>networking, storage, and messaging</strong> components (REST/gRPC/TCP)</li><li>Write clear documentation and support production releases (on‑call rotation optional)</li></ul><p><br></p>
<p>POSITION SUMMARY</p><p>As a member of the Enterprise Architecture team, the Software Architect will be responsible for building cloud-native, highly scalable, enterprise-level software products, with Azure as the preferred cloud platform. This role will also be responsible for architecting and implementing microservice architectures for enterprise-level business applications that are aligned with the overall business strategy.</p><p><br></p><p>Essential Functions, Duties, and Responsibilities</p><p>• Evaluate internal functions and system development strategies, and recommend improvements.</p><p>• Develop system architecture models that align with the organization's strategies and goals.</p><p>• Organize team training sessions to improve employees' knowledge and skills and support organizational growth.</p><p>• Develop methods for compliance architecture, including data storage, metadata management, and change control.</p><p>• Identify and implement build-versus-buy strategies, mentor personnel, and maintain a view of the overall system strategy.</p><p>• Lead the effort to migrate existing custom applications to the cloud platform.</p><p>• Implement solution design standards and develop reference implementations.</p><p>• Provide documentation, training, and support for designed solutions.</p><p>• Contribute to the timely, high-quality delivery of customer-facing software projects.</p><p>• Assist business partners in becoming self-sufficient in designing business process workflows.</p><p>• Demonstrate an understanding of various cloud offerings: SaaS, PaaS, and IaaS.</p><p>• Maintain up-to-date industry knowledge of relational and NoSQL databases and emerging technologies in single sign-on, mobile development, DevOps, software architecture, and cloud offerings.</p><p><br></p>
<p><strong>Overview</strong></p><p>We are looking for a <strong>Data Engineer </strong>to design, build, and maintain data solutions that enable reporting, analytics, and informed decision‑making.</p><p><strong>Responsibilities</strong></p><ul><li>Design and maintain data pipelines and data models</li><li>Extract, transform, and load (ETL) data from multiple sources</li><li>Develop dashboards, reports, and analytics for business users</li><li>Ensure data accuracy, integrity, and governance</li><li>Collaborate with stakeholders to understand reporting needs</li></ul><p><br></p>
<p>We are seeking an Implementation Engineer to join our team and support the deployment of enterprise software solutions for public sector organizations nationwide. This role involves working closely with project managers and consultants to configure systems, train users, and ensure the successful implementation of software products. The position offers flexibility to work remotely or from one of our office locations.</p><p><strong>Responsibilities:</strong></p><ul><li>Conduct interviews with clients to gather requirements and document needs for system configuration.</li><li>Configure software platforms to align with customer workflows, utilizing standard industry modules and features.</li><li>Perform data migration and conversion using tools such as ETL engines and scripting languages (e.g., Python) to ensure a seamless transition from legacy systems.</li><li>Integrate spatial data and connectivity features to enhance overall system usability.</li><li>Deliver end-user training sessions and develop comprehensive documentation for client reference.</li><li>Provide on-site and remote support during system go-live phases to ensure a smooth transition.</li><li>Travel periodically to client locations to assist with implementation and training.</li><li>Stay up to date on security and privacy policies and report any incidents according to company procedures.</li><li>Collaborate with internal teams to ensure clients receive ongoing support after initial implementation.</li></ul><p><br></p>
<p><strong>Data Modeling and Analysis</strong></p><ul><li>Design data models and optimize performance: create the structure of data relationships to ensure efficient data retrieval and calculations.</li><li>Create calculated columns and measures: use DAX to calculate derived values and aggregate metrics.</li><li>Perform exploratory data analysis (EDA): use BI tools to explore data and identify trends and patterns.</li><li>Apply advanced data analysis techniques (e.g., statistical analysis, time series analysis, predictive modeling).</li><li>Integrate machine learning models into Power BI dashboards.</li><li>Build semantic models.</li></ul><p><strong>Dashboard Development and Visualization</strong></p><ul><li>Design dashboards: create visually appealing and interactive dashboards.</li><li>Create visualizations: use charts, graphs, and other visual elements to represent data.</li><li>Implement interactivity: add filters, slicers, and drill-down capabilities.</li></ul><p><strong>Skills and Qualifications</strong></p><ul><li>Expertise in SQL and DAX, and knowledge of Python and R.</li><li>Strong proficiency in Power BI.</li><li>Data modeling and visualization skills.</li><li>Strong problem-solving skills to address technical challenges and data quality issues.</li><li>Analytical skills with the capacity to analyze complex data problems and draw meaningful insights.</li></ul>
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
<p>Robert Half's marketing & creative client is seeking a Web Content Accessibility Specialist for a 6-month contract. This is a remote, 40-hour-per-week opportunity; candidates must be willing and able to work Eastern Time (ET) hours. The Web Accessibility Specialist will support ongoing accessibility audits, testing, and compliance initiatives across web, mobile, and digital platforms, ensuring digital experiences meet the latest accessibility standards and regulations.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Conduct automated and manual accessibility testing using tools such as Axe, WAVE, Lighthouse, and screen readers (NVDA, JAWS, VoiceOver)</li><li>Evaluate websites, applications, and digital properties for compliance with WCAG 2.1 AA (or higher)</li><li>Identify, document, and prioritize accessibility issues with clear remediation guidance</li><li>Create and maintain documentation</li><li>Collaborate with development, design, content, and product teams to embed accessibility best practices</li><li>Support remediation, compliance reviews, and respond to accessibility-related inquiries</li><li>Provide accessibility training and guidance to internal teams</li></ul><p><br></p>
<p>We are seeking a Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. This role will support data-driven decision-making by ensuring reliable data flow, transformation, and accessibility across the organization.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain ETL/ELT data pipelines</li><li>Develop and optimize data models and data architectures</li><li>Integrate data from multiple sources (APIs, databases, third-party systems)</li><li>Ensure data quality, integrity, and reliability</li><li>Collaborate with data analysts, data scientists, and business stakeholders</li><li>Monitor and troubleshoot data pipeline performance issues</li><li>Implement best practices for data governance and security</li></ul><p><br></p>
We are looking for an experienced Application Support Engineer to oversee and execute technical projects, ensuring efficient operations and seamless application functionality. This role requires a proactive individual with strong organizational and problem-solving skills, capable of managing various tasks, including project planning and vendor coordination. The position is based in Glen Rock, Pennsylvania.<br><br>Responsibilities:<br>• Develop, manage, and execute small to mid-size capital projects from inception to completion.<br>• Create detailed user and functional specifications, project plans, and scopes of work to guide project execution.<br>• Prepare comprehensive requests for quotes and other necessary documentation to support procurement processes.<br>• Identify and evaluate potential manufacturers, suppliers, integrators, and contractors to ensure optimal project outcomes.<br>• Conduct thorough reviews of competitive bids to select the most suitable vendors for project needs.<br>• Collaborate with stakeholders to ensure project alignment with organizational goals and requirements.<br>• Ensure compliance with industry standards and regulations throughout the project lifecycle.<br>• Utilize AutoCAD and Autodesk tools to design and refine technical solutions.<br>• Monitor project timelines and budgets, addressing any issues promptly to ensure successful delivery.<br>• Provide technical support and troubleshooting for applications to maintain operational efficiency.
<p>We are seeking a skilled Azure Cloud Engineer to design, implement, and maintain cloud infrastructure and services on the Microsoft Azure platform. This role will focus on building scalable, secure, and highly available cloud solutions, while supporting deployment automation, monitoring, and optimization of cloud resources. The Azure Cloud Engineer will collaborate closely with development, operations, and security teams to ensure cloud infrastructure aligns with business objectives.</p><p> </p><p>Key Responsibilities</p><ul><li>Design, deploy, and manage Azure-based cloud infrastructure, including virtual machines, storage, networking, and platform services</li><li>Implement Infrastructure as Code (IaC) using ARM templates, Terraform, or Ansible to automate provisioning and configuration</li><li>Configure and maintain Azure DevOps pipelines for CI/CD, release automation, and deployment of applications</li><li>Monitor cloud infrastructure using Azure Monitor, Application Insights, and Log Analytics</li><li>Optimize performance, cost, and scalability of cloud resources</li><li>Ensure security, compliance, and governance across cloud environments</li><li>Deploy and manage containerized applications using Azure Kubernetes Service (AKS), Docker, or other container platforms</li><li>Troubleshoot and resolve cloud infrastructure issues, ensuring high availability and reliability</li><li>Collaborate with cross-functional teams to support cloud migration, hybrid cloud, and digital transformation initiatives</li><li>Stay current with Azure updates, cloud best practices, and emerging cloud technologies</li></ul><p><br></p>
<p><strong>For immediate response please message Valerie Nielsen on LinkedIn or email!</strong></p><p><br></p><p><strong>Job Title:</strong> Senior Data Engineer</p><p> <strong>Location:</strong> Hybrid – Westwood (Los Angeles, CA) near University of California, Los Angeles</p><p> <strong>Compensation:</strong> $175,000 – $185,000 base salary + 10% annual bonus</p><p> <strong>Employment Type:</strong> Full-Time</p><p><br></p><p>Overview</p><p>We are seeking a <strong>Senior Data Engineer</strong> to join a growing data team in <strong>Westwood, CA</strong>. This role will focus on designing and building scalable data pipelines, supporting analytics and reporting initiatives, and improving data infrastructure across the organization.</p><p>The ideal candidate is highly experienced with <strong>Snowflake, dbt, Python</strong>, and modern data pipeline architecture, and enjoys working closely with analytics and business teams to deliver reliable, high-quality data. Experience integrating data from CRM platforms such as <strong>Salesforce</strong> is a strong plus.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, develop, and maintain <strong>scalable data pipelines</strong> supporting analytics, reporting, and operational data needs</li><li>Build and optimize data models and transformations using <strong>dbt</strong> within a <strong>Snowflake</strong> data warehouse environment</li><li>Develop robust ETL/ELT workflows using <strong>Python</strong> and modern data engineering best practices</li><li>Collaborate with analytics teams to deliver clean, reliable datasets used in <strong>Power BI</strong> dashboards and reporting</li><li>Ensure data quality, reliability, and performance across the data platform</li><li>Optimize Snowflake warehouse performance and manage cost-efficient data storage and compute usage</li><li>Integrate data from internal and external systems, including CRM and SaaS platforms</li><li>Partner with stakeholders across engineering, 
product, and business teams to define data requirements and solutions</li><li>Maintain documentation and promote data engineering standards and best practices</li></ul><p><br></p><p><br></p>
<p>We are looking for a skilled SQL Developer to join our client's team in Anaheim, California. In this role, you will design and optimize database structures, develop efficient data pipelines, and create reliable reports and dashboards to support organizational goals. The ideal candidate will have a strong background in database management and programming, with the ability to ensure data integrity and high performance across all systems.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement advanced database structures, stored procedures, and queries to support organizational needs.</p><p>• Optimize data transfer processes and ensure minimal latency when moving data between production systems and the data warehouse.</p><p>• Design and generate automated and on-demand reports for both internal stakeholders and external clients.</p><p>• Perform database performance tuning, including managing indexes and maintenance plans to ensure optimal efficiency.</p><p>• Maintain and manage the data warehouse to support reliable and scalable data storage solutions.</p><p>• Collaborate with project leaders and stakeholders to ensure database solutions align with business objectives.</p><p>• Work closely with vendors and IT teams to ensure databases are backed up, replicated, and maintained effectively.</p><p>• Conduct error handling, transaction rollbacks, and nested procedure creation to enhance database reliability.</p><p>• Utilize ETL processes to transform and load data accurately and efficiently.</p><p>• Evaluate and integrate products to align with the organization's data strategy.</p>