We are looking for a skilled Business Systems Analyst to join a dynamic team in Cypress, California, specializing in delivering impactful reporting and analytics solutions. In this role, you will collaborate with stakeholders to define business needs, create functional designs, and oversee the development of Power BI solutions powered by Snowflake and other advanced database systems. This position requires a proactive approach to problem-solving and a strong ability to translate complex requirements into actionable insights.<br><br>Responsibilities:<br>• Conduct interviews with stakeholders to gather requirements, identify business challenges, and define clear objectives for future solutions.<br>• Prioritize and manage requirements to ensure focus on high-impact deliverables while assessing risks across applications and processes.<br>• Translate business requirements into detailed functional specifications, ensuring alignment with technical constraints and enterprise goals.<br>• Design user-friendly BI and reporting solutions, incorporating best practices for user interface and performance optimization.<br>• Develop persuasive business cases that articulate challenges, expected benefits, and solution value to stakeholders.<br>• Ensure that solutions adhere to technology standards and collaborate with architecture teams to maintain consistency across projects.<br>• Lead testing and validation efforts by creating use cases, documenting results, and developing reusable test scenarios.<br>• Build actionable Power BI dashboards and reports, leveraging Snowflake and relational database systems to provide valuable insights.<br>• Partner with cross-functional teams to ensure seamless integration of solutions into existing systems and workflows.<br>• Monitor and evaluate solution performance, providing recommendations for continuous improvement based on user feedback and data analysis.
<p>Our team is seeking an experienced Data Engineer to help architect, build, and optimize data infrastructure supporting enterprise decision-making and advanced analytics. This position is ideal for hands-on professionals who excel in SQL and Python and have proven expertise across enterprise cloud platforms, databases, and modern data engineering practices.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and maintain scalable data pipelines that ingest, process, and transform structured and unstructured data.</li><li>Write advanced SQL queries—including complex joins, aggregations, window functions, and CTEs—to support ETL, data analysis, and migration projects.</li><li>Develop, optimize, and productionize scripts for data movement and validation in vanilla Python; familiarity with PySpark is a plus.</li><li>Collaborate across teams to deliver robust APIs and services for data integration with frontend (React, Angular, Vue) and backend (Node.js, Express) frameworks.</li><li>Architect and maintain data models and schemas, ensuring alignment with business requirements and efficient data retrieval.</li><li>Manage production-level SQL (e.g., SQL Server) and NoSQL (e.g., MongoDB) databases; ensure data quality through continuous monitoring and rigorous testing.</li><li>Deploy and manage automated ETL pipelines and schema migrations using Azure Data Factory, Microsoft Fabric, or comparable technologies.</li><li>Continuously improve performance through query optimization and efficient use of data structures (lists, dictionaries, sets) in Python.</li><li>Implement agile methodologies, continuous delivery, and DevOps best practices for reliable and scalable data infrastructure.</li><li>Enforce rigorous data quality management and automated testing strategies to ensure accuracy and reliability of data assets.</li></ul><p><br></p>
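<p>For candidates gauging fit, the SQL skills listed above (CTEs, window functions, aggregations) look roughly like the following minimal sketch, run here against an in-memory SQLite database; the <code>orders</code> table and its columns are hypothetical and purely illustrative:</p>

```python
# Minimal sketch of a CTE + window function query of the kind the role
# describes. The "orders" table, its columns, and the sample data are
# hypothetical; SQLite (bundled with Python) supports window functions
# in all recent versions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
    INSERT INTO orders (region, amount) VALUES
        ('West', 100), ('West', 250), ('East', 80), ('East', 120), ('East', 300);
""")

query = """
WITH regional AS (                 -- CTE: aggregate per region
    SELECT region, SUM(amount) AS total
    FROM orders
    GROUP BY region
)
SELECT region,
       total,
       RANK() OVER (ORDER BY total DESC) AS rnk  -- window function over the CTE
FROM regional
ORDER BY rnk;
"""
for region, total, rnk in conn.execute(query):
    print(region, total, rnk)
```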
<p><strong>Senior Front-End Engineer (Hybrid – West LA)</strong></p><p><strong>Compensation:</strong> Up to $150K + bonus & benefits</p><p>We’re seeking a <strong>Senior Front-End Engineer</strong> to drive modern UI development for cloud-based business applications. This hybrid role (3 days onsite in West LA) will work closely with product, design, and offshore teams to deliver scalable, high-quality, and user-centric solutions.</p><p><strong>What You’ll Do:</strong></p><ul><li>Lead front-end architecture and development with Angular and modern web technologies.</li><li>Collaborate across teams to design and implement cloud-based solutions.</li><li>Ensure application performance, scalability, and best practices in code quality.</li><li>Mentor developers and champion user experience through intuitive design.</li><li>Troubleshoot issues and drive improvements through CI/CD and DevOps practices.</li></ul><p>For immediate consideration, direct message Reid Gormly on LinkedIn.</p><p><br></p><p><strong>Why Join:</strong></p><ul><li>Competitive salary up to $150K + bonus.</li><li>Hybrid schedule in West LA.</li><li>Strong career growth opportunities and professional development support.</li><li>Comprehensive benefits: medical, dental, vision, 401K with match, PTO, and more.</li></ul>
<p>Our team is seeking a highly experienced Software Engineer with deep expertise in front-end (ReactJS), back-end (NestJS), and mobile development (iOS - Swift, Android - Kotlin). The ideal candidate has a proven track record of building and scaling modern web and mobile applications, along with strong experience in cloud environments and modern DevOps practices (AWS, Terraform, Kubernetes, PostgreSQL, Redis, RabbitMQ, S3, CloudFront).</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Architect, design, and implement scalable, secure, and maintainable solutions across web, mobile, and backend platforms.</li><li>Develop user-friendly web applications using ReactJS.</li><li>Build performant, robust native mobile apps in Swift (iOS) and Kotlin (Android).</li><li>Design and implement RESTful APIs and backend services using NestJS.</li><li>Lead DevOps initiatives leveraging AWS, Terraform, Kubernetes (EKS), S3, and CloudFront.</li><li>Manage data storage and retrieval with PostgreSQL and Redis, and integrate message queues via RabbitMQ.</li><li>Review code, mentor junior developers, and promote best practices in coding, testing, and automation.</li><li>Collaborate cross-functionally with design, product, and business stakeholders to deliver exceptional software solutions.</li><li>Participate in and drive continuous improvement for deployment, monitoring, and reliability.</li></ul>
<p><strong><em><u>Client</u></em></strong> = Nationally Recognized Leader in the Travel & Leisure Industry</p><p><strong><em><u>Job Title</u></em></strong> = Data Engineer (mid- and senior-level candidates being considered)</p><p><strong><em><u>Location</u></em></strong> = San Fernando Valley</p><p><br></p>
<p>**** For a faster response on this position, please send a message to Jimmy Escobar on LinkedIn or send an email to Jimmy.Escobar@roberthalf(.com) with your resume. You can also call my office at 424-270-9193. ****</p><p><br></p><p>My client, a Burbank-based entertainment firm, is looking for a Java Tech Lead Developer to join their application development team. The position is hybrid: 3 days a week on-site and 2 days remote. The Java Tech Lead Developer should have at least 7 years of Java experience and some prior tech lead experience. Responsibilities include mentoring Java developers, developing Java applications with Spring Boot, and writing unit test cases. This is a great opportunity for a Java Tech Lead Developer to work for an enterprise organization that offers amazing benefits.</p>
<p><strong>Senior Data Engineer</strong></p><p><strong>Location:</strong> Calabasas, CA (Fully Remote if outside 50 miles)</p><p><strong>Compensation:</strong> $140K–$160K</p><p><strong>Reports to:</strong> Director of Data Engineering</p><p>Our entertainment client is seeking a <strong>Senior Data Engineer</strong> to design, build, and optimize enterprise data pipelines and cloud infrastructure. This hands-on role focuses on implementing scalable data architectures, developing automation, and driving modern data engineering best practices across the company.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design and maintain ELT/ETL pipelines in Snowflake, Databricks, and AWS.</li><li>Build and orchestrate workflows using Python, SQL, Airflow, and dbt.</li><li>Implement medallion/lakehouse architectures and event-driven pipelines.</li><li>Manage AWS services (Lambda, EC2, S3, Glue) and infrastructure-as-code (Terraform).</li><li>Optimize data performance, quality, and governance across systems.</li></ul><p>For immediate consideration, direct message Reid Gormly on LinkedIn and apply now!</p>
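<p>For candidates unfamiliar with the term, the medallion layering named in the responsibilities can be sketched in plain Python: bronze holds raw events as ingested, silver holds cleaned and typed records, and gold holds serving-ready aggregates. In practice this runs as Snowflake/Databricks tables managed by dbt or Airflow; the event shape and field names below are hypothetical:</p>

```python
# Library-free sketch of the medallion (bronze/silver/gold) pattern.
# Event shape and field names are hypothetical, for illustration only.

bronze = [  # raw layer: events exactly as ingested, including bad rows
    {"user": "a", "amount": "10.5", "ts": "2024-01-01"},
    {"user": "b", "amount": "bad",  "ts": "2024-01-01"},  # malformed amount
    {"user": "a", "amount": "4.5",  "ts": "2024-01-02"},
]

def to_silver(rows):
    """Cleansing layer: drop malformed rows and cast types."""
    out = []
    for r in rows:
        try:
            out.append({"user": r["user"], "amount": float(r["amount"]), "ts": r["ts"]})
        except ValueError:
            continue  # a real pipeline would route this to a quarantine table
    return out

def to_gold(rows):
    """Serving layer: aggregate totals per user."""
    totals = {}
    for r in rows:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # the malformed row is dropped before aggregation
```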
<p>We are looking for a skilled QA Analyst with finance-domain expertise to join our team in Southern California. In this long-term contract role, you will play a pivotal role in ensuring the reliability, compliance, and performance of critical account servicing systems. The position requires hands-on experience with mainframe testing environments and a strong ability to work within complex systems.</p><p><br></p><p>Responsibilities:</p><p>• Validate upgrades to the Customer Account Servicing System by executing structured test plans and performing manual script-based testing.</p><p>• Analyze system updates and classify them into enhancements, corrections, or regulatory changes while determining the appropriate testing methods.</p><p>• Conduct regression testing by planning and executing tests for impacted processes, ensuring compatibility between base code and custom modules.</p><p>• Develop and run detailed test scripts in Quality Center for both batch processing and online components, focusing on calculations, error handling, and system integrity.</p><p>• Execute full file production parallel tests by reconciling output files and documenting results.</p><p>• Collaborate with developers and operations teams to identify, record, and resolve defects, ensuring system stability.</p><p>• Coordinate business reviews for regulatory changes and enhancements, capturing approvals and maintaining traceability to test results.</p><p>• Manage and maintain audit-ready documentation, including test scripts, defect logs, and change control records.</p><p>• Prepare and curate test data, batch input files, and account information for mainframe testing environments.</p><p>• Drive root cause analysis for defects and escalate issues as needed while maintaining compliance with established practices.</p>
<p>**** For a faster response on this position, please send a message to Jimmy Escobar on LinkedIn or send an email to Jimmy.Escobar@roberthalf(.com) with your resume. You can also call my office at 424-270-9193. ****</p><p><br></p><p>We are looking for an experienced Senior Data Engineer with expertise in Databricks and the Adobe Experience Platform (AEP) to join our team on a long-term contract basis. In this role, you will design, implement, and optimize data solutions that support diverse business domains, leveraging advanced technologies such as Databricks, Azure cloud tools, and AEP. Based in Woodland Hills, California, this position offers the opportunity to collaborate with cross-functional teams and play a pivotal role in transforming data strategies to drive business success.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement scalable data pipelines to support integrations with Adobe Experience Platform and Databricks.</p><p>• Design data architecture solutions that ensure quality, reliability, and consistency across all data flows.</p><p>• Collaborate with business stakeholders and IT teams to deliver technical solutions aligned with organizational objectives.</p><p>• Optimize data workflows for improved performance and reduced latency in Databricks and Adobe environments.</p><p>• Monitor and troubleshoot issues within data pipelines to ensure seamless operations.</p><p>• Create and maintain comprehensive documentation for data architectures, integration processes, and workflows.</p><p>• Partner with analysts and stakeholders to gather requirements and deliver effective data solutions.</p><p>• Provide training and guidance on best practices for Databricks and Adobe data processes.</p><p>• Ensure the efficient use of Azure Synapse Analytics, Azure Data Factory, and related technologies in data integration projects.</p><p>• Drive continuous improvements in data strategies to support business growth.</p>