<p>We are looking for an experienced Product Quality Engineer to join our team in Costa Mesa, California. In this long-term contract position, you will play a critical role in ensuring product reliability and quality throughout its lifecycle. The ideal candidate is detail-oriented, proactive, and skilled at identifying and resolving quality issues efficiently.</p><p><br></p><p>Responsibilities:</p><p>• Conduct thorough inspections of incoming materials, assembly processes, and end-of-line tests to ensure compliance with quality standards.</p><p>• Provide prompt and effective resolutions for nonconformities identified during quality checks.</p><p>• Take ownership of reducing product risks and enhancing reliability across all stages of the product lifecycle.</p><p>• Collaborate with suppliers, manufacturing teams, and field operations to maintain and improve product quality.</p><p>• Lead root cause analysis and corrective action initiatives to address and rectify quality issues.</p><p>• Utilize product health data to troubleshoot problems, support refurbishment testing, and optimize overall quality.</p><p>• Drive continuous improvement by developing data-driven strategies and sharing lessons learned across departments.</p><p>• Enhance the Quality Management System by streamlining documentation and refining processes.</p><p>• Actively participate in investigations, projects, and initiatives to support organizational goals.</p><p>• Work hands-on in manufacturing environments to address quality concerns and ensure smooth operations.</p>
<p>Robert Half is currently partnering with a well-established company in San Diego that is looking for a Senior Data Engineer experienced in BigQuery, DBT (Data Build Tool), and GCP. This is a full-time (permanent placement) position that is 100% onsite in San Diego. We are looking for a Senior Data Engineer who is passionate about optimizing systems with advanced techniques in partitioning, indexing, and Google Sequences for efficient data processing. Experience with DBT is required.</p><p>Responsibilities:</p><ul><li>Design and implement scalable, high-performance data solutions on GCP.</li><li>Develop data pipelines, data warehouses, and data lakes using GCP services (BigQuery, DBT, etc.).</li><li>Build and maintain ETL/ELT pipelines to ingest, transform, and load data from various sources.</li><li>Ensure data quality, integrity, and security throughout the data lifecycle.</li><li>Design, develop, and implement a new version of a big data tool tailored to client requirements.</li><li>Leverage advanced expertise in DBT (Data Build Tool) and Google BigQuery to model and transform data pipelines.</li><li>Optimize systems with advanced techniques in partitioning, indexing, and Google Sequences for efficient data processing.</li><li>Collaborate cross-functionally with product and technical teams to align project deliverables with client goals.</li><li>Monitor, debug, and refine the performance of the big data tool throughout the development lifecycle.</li></ul><p><strong>Minimum Qualifications:</strong></p><ul><li>5+ years of experience in a data engineering role in GCP.</li><li>Proven experience in designing, building, and deploying data solutions on GCP.</li><li>Strong expertise in SQL, data warehouse design, and data pipeline development.</li><li>Understanding of cloud architecture principles and best practices.</li><li>Proven experience with DBT, BigQuery, and other big data tools.</li><li>Advanced knowledge of partitioning, indexing, and Google Sequences
strategies.</li><li>Strong problem-solving skills with the ability to manage and troubleshoot complex systems.</li><li>Excellent written and verbal communication skills, including the ability to explain technical concepts to non-technical stakeholders.</li><li>Experience with Looker or other data visualization tools.</li></ul>
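<p>To illustrate the kind of partitioning and DBT work this role centers on, here is a minimal, hypothetical dbt model targeting BigQuery. The model, source, and column names are invented for illustration and are not taken from the posting.</p>

```sql
-- Hypothetical dbt model (e.g. models/fct_events.sql) showing how
-- BigQuery partitioning and clustering are declared in a dbt config block.
{{ config(
    materialized = 'incremental',
    partition_by = {'field': 'event_date', 'data_type': 'date'},
    cluster_by   = ['customer_id']
) }}

select
    event_id,
    customer_id,
    date(event_ts) as event_date,
    payload
from {{ source('raw', 'events') }}
{% if is_incremental() %}
  -- on incremental runs, scan only recent partitions to limit cost
  where date(event_ts) >= date_sub(current_date(), interval 3 day)
{% endif %}
```

<p>Partitioning on a date column and clustering on a high-cardinality key is a common BigQuery pattern for pruning the bytes each query scans.</p>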
<p>Robert Half is looking for a DBA to join our client's team in Murrieta.</p><p><br></p><p>SQL Server & ERP Administration</p><p>• Administer, monitor, and maintain multiple Microsoft SQL Server instances, with the Deltek Vantagepoint ERP database as the primary system of record.</p><p>• Write, optimize, and tune complex T-SQL queries, stored procedures, and views for both ad hoc analysis and application logic.</p><p>• Create and schedule SQL Server Agent jobs to automate data ingestion, maintenance routines, and reporting pipelines.</p><p>• Implement, document, and test backup and disaster recovery strategies to ensure business continuity.</p><p>• Manage database indexes, performance tuning, and query optimization to support application efficiency and stability.</p><p>• Plan and execute Deltek Vantagepoint upgrades, schema updates, and integrations, working closely with the ERP administrator and IT team.</p><p>• Design and publish SSRS reports and datasets to meet business and finance reporting requirements.</p><p><br></p><p>Cross-System Data Architecture</p><p>• Support integration between SQL Server and the Azure Databricks Lakehouse, ensuring consistent and secure data flow.</p><p>• Collaborate with data engineers and analysts to maintain data integrations to other critical systems.</p><p>• Develop and maintain SQL transformations and stored procedures that serve as upstream logic for analytics and reporting layers.</p><p>• Contribute to the management of Unity Catalog, metadata inventory, and data lineage documentation.</p><p>• Partner with Power BI administrators to optimize Fabric dataset refreshes, gateways, and source connections.</p><p><br></p><p>Data Governance, Security, & Monitoring</p><p>• Enforce data security and access controls aligned with IT and Data Governance policies.</p><p>• Participate in data issue management and quality improvement processes, ensuring system reliability and integrity.</p><p>• Monitor system performance using
both native and custom monitoring tools; proactively identify and resolve issues.</p><p>• Maintain clear, comprehensive documentation for database configurations, schemas, and operational procedures.</p><p><br></p><p><br></p>
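<p>As a sketch of the backup-and-recovery duties above, the following hypothetical T-SQL is the kind of routine a SQL Server Agent job might run nightly. The database name and backup path are placeholders, not details from the posting.</p>

```sql
-- Hypothetical nightly full backup with checksum verification.
DECLARE @path nvarchar(260) =
    N'D:\Backups\Vantagepoint_' +
    CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';

BACKUP DATABASE [Vantagepoint]
    TO DISK = @path
    WITH CHECKSUM, COMPRESSION, INIT;

-- Validate the backup media without performing a full restore.
RESTORE VERIFYONLY FROM DISK = @path WITH CHECKSUM;
```

<p>Pairing CHECKSUM on the backup with RESTORE VERIFYONLY is a lightweight way to confirm a backup is readable before it is needed for disaster recovery; it does not replace periodic full restore tests.</p>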
<p>SUMMARY OF POSITION: </p><p>The Senior Data Analyst plays a pivotal role in transforming data into actionable insights that drive business decisions and operational excellence across the organization. As a senior member of the Data Team—part of the IT Division—the analyst will lead advanced data modeling, analytics, and visualization efforts, while mentoring peers and ensuring adherence to data governance and quality standards.</p><p><br></p><p>This position partners closely with stakeholders across Finance, Operations, HR, and Project Management to deliver trusted data solutions from our Azure Databricks Lakehouse and connected systems (Deltek Vantagepoint, Dynamics 365, Workday, and others). The ideal candidate is highly analytical, self-driven, and passionate about leveraging data to enable smarter business outcomes.</p><p><br></p><p>ESSENTIAL JOB FUNCTIONS:</p><p>• Lead advanced analytical projects using data sourced from the company’s Azure Databricks lakehouse, ensuring outputs align with strategic business objectives.</p><p>• Design and develop Power BI dashboards leveraging Microsoft Fabric capacity and robust data models for scalability, governance, and performance.</p><p>• Build and maintain data models and transformations in Databricks SQL using Delta tables, Unity Catalog, and Lakehouse architecture best practices.</p><p>• Collaborate with data engineers to enhance ingestion pipelines using Fivetran, Workato, REST APIs, and other connectors.</p><p>• Perform exploratory and diagnostic analyses using SQL, Python (pandas, numpy), and Power BI to uncover business trends, inefficiencies, and improvement opportunities.</p><p>• Ensure data quality and lineage through established data governance frameworks, including metadata documentation, business glossary maintenance, and data issue management.</p><p>• Develop stored procedures and SQL logic to support operational systems such as Deltek Vantagepoint and related financial integrations.</p><p>• Collaborate 
with business stewards and system owners to validate data accuracy and drive consistency across departments.</p><p>• Mentor junior analysts and help define Power BI development standards, DAX best practices, and model optimization techniques.</p><p>• Communicate insights effectively through visual storytelling, executive dashboards, and data narratives tailored for non-technical audiences.</p><p><br></p><p>EDUCATION & EXPERIENCE</p><p>• Bachelor’s degree in Computer Science, Data Science, Information Systems, Business Analytics, or a related field.</p><p>• Minimum of 5–7 years of professional experience in data analytics, BI development, or related technical roles.</p><p>• Experience working with cloud-based data platforms (Azure, Databricks, or Snowflake).</p><p>• Microsoft PL-300 (Power BI Data Analyst), DP-900, or Databricks Certified Data Analyst certification preferred.</p>
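<p>As a small illustration of the exploratory and diagnostic analysis described above, here is a hedged pandas sketch. The dataset, column names, and z-score threshold are invented for illustration and are not from the posting.</p>

```python
# Hypothetical diagnostic analysis: flag projects whose latest monthly
# cost deviates sharply from their own historical average (z-score).
import pandas as pd

costs = pd.DataFrame({
    "project": ["A", "A", "A", "B", "B", "B"],
    "month":   ["2024-01", "2024-02", "2024-03"] * 2,
    "cost":    [100.0, 110.0, 300.0, 80.0, 82.0, 79.0],
})

# Per-project mean and standard deviation over the full history.
stats = costs.groupby("project")["cost"].agg(["mean", "std"])
latest = costs[costs["month"] == "2024-03"].set_index("project")["cost"]

# z-score of the latest month against each project's own history.
z = (latest - stats["mean"]) / stats["std"]
outliers = sorted(z[z.abs() > 1.0].index)
print(outliers)  # -> ['A']
```

<p>The same pattern scales from a toy frame to a Databricks SQL extract: compute per-entity baselines, score the latest period against them, and surface only the deviations worth a stakeholder's attention.</p>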