We are looking for a skilled Database Analyst to join our team on a contract basis in Vista, California. In this role, you will play a key part in managing, analyzing, and reporting healthcare data while supporting technical and application needs across multiple locations. This position requires advanced expertise in database technologies and a strong commitment to delivering timely, accurate solutions.<br><br>Responsibilities:<br>• Export and manage database content using advanced tools and processes, ensuring data accuracy and reliability.<br>• Provide technical and application support to multiple locations, addressing issues promptly and with attention to detail.<br>• Collaborate with teams to identify data requirements and create detailed report specifications.<br>• Prepare statistical and narrative reports, as well as visual data presentations, for internal and external stakeholders.<br>• Develop analytics and strategies based on healthcare data to drive informed decision-making.<br>• Analyze and interpret complex health plan data, identifying trends and translating them into actionable insights.<br>• Communicate user needs and requests effectively to management.<br>• Assist clinic and IT staff by resolving issues submitted via the IT Help Desk.<br>• Participate in professional development activities, including workshops and educational programs, to enhance skills.<br>• Ensure alignment with the organization's mission, vision, and values through your work.
<p>Robert Half is currently partnering with a well-established company in San Diego that is looking for a Senior Data Engineer experienced in BigQuery, DBT (Data Build Tool), and GCP. This is a full-time (permanent placement) position that is 100% onsite in San Diego. We are looking for a Senior Data Engineer who is passionate about optimizing systems with advanced techniques in partitioning, indexing, and Google Sequences for efficient data processing. Must have experience with DBT!</p><p>Responsibilities:</p><ul><li>Design and implement scalable, high-performance data solutions on GCP.</li><li>Develop data pipelines, data warehouses, and data lakes using GCP services (BigQuery, DBT, etc.).</li><li>Build and maintain ETL/ELT pipelines to ingest, transform, and load data from various sources.</li><li>Ensure data quality, integrity, and security throughout the data lifecycle.</li><li>Design, develop, and implement a new version of a big data tool tailored to client requirements.</li><li>Leverage advanced expertise in DBT (Data Build Tool) and Google BigQuery to model and transform data pipelines.</li><li>Optimize systems with advanced techniques in partitioning, indexing, and Google Sequences for efficient data processing.</li><li>Collaborate cross-functionally with product and technical teams to align project deliverables with client goals.</li><li>Monitor, debug, and refine the performance of the big data tool throughout the development lifecycle.</li></ul><p><strong>Minimum Qualifications:</strong></p><ul><li>5+ years of experience in a data engineering role on GCP.</li><li>Proven experience designing, building, and deploying data solutions on GCP.</li><li>Strong expertise in SQL, data warehouse design, and data pipeline development.</li><li>Understanding of cloud architecture principles and best practices.</li><li>Proven experience with DBT, BigQuery, and other big data tools.</li><li>Advanced knowledge of partitioning, indexing, and Google Sequences strategies.</li><li>Strong problem-solving skills with the ability to manage and troubleshoot complex systems.</li><li>Excellent written and verbal communication skills, including the ability to explain technical concepts to non-technical stakeholders.</li><li>Experience with Looker or other data visualization tools.</li></ul>