<p><strong>-- Client =</strong> Interactive Entertainment</p><p><strong>-- Location =</strong> Remote</p><p><strong>-- Comp =</strong> $150k-$200k annual base + benefits</p><p><strong>-- Work Authorization = </strong>Our client is NOT able to sponsor or transfer visas at this time</p><p><strong>-- Focus =</strong> Design, automate, and maintain CI/CD pipelines and infrastructure across Linux, Windows, and macOS environments</p><p><strong>-- MUST HAVES =</strong> <strong><em>Last 5+ years w/ a focus on DevOps using AWS, EKS,</em></strong> <strong><em>GitLab</em></strong></p><p><strong>-- Bonus Points =</strong> Okta Integration, Backstage.io (or similar dev portals), Mobile or Gaming CI/CD Pipelines, DataDog, OpenTofu, Ansible, CloudFormation, Docker, Python</p><p><br></p>
<p><strong>***Please email Valerie Nielsen for immediate response*** </strong></p><p><br></p><p><strong>Job Title:</strong> Data Engineer</p><p> <strong>Location:</strong> West Los Angeles, CA (Onsite)</p><p> <strong>Salary:</strong> $150,000 Base + Bonus</p><p><strong>Overview</strong></p><p> We are seeking a <strong>Data Engineer</strong> to join our team onsite in <strong>West Los Angeles</strong>. This role is ideal for someone early in their career who has strong technical fundamentals, enjoys working with data, and has curiosity around modern AI tools. The ideal candidate has a strong analytical mindset and enjoys solving complex data problems while building scalable pipelines and data models.</p><p><strong>Responsibilities</strong></p><ul><li>Build, maintain, and optimize data pipelines and ETL processes</li><li>Write efficient and scalable <strong>SQL and Python</strong> code for data transformation and analysis</li><li>Work with cloud data platforms in <strong>AWS or Azure</strong></li><li>Support data modeling, data warehouse development, and reporting pipelines</li><li>Collaborate with analytics and product teams to deliver clean, reliable datasets</li><li>Explore and leverage <strong>AI tools (e.g., Claude or similar)</strong> to improve workflows and productivity</li><li>Ensure data quality, performance, and scalability across systems</li></ul><p><br></p>
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
<p><strong>***For immediate response please email Valerie Nielsen***</strong></p><p><br></p><p><strong>Job Title:</strong> Engineering Manager (.NET)</p><p> <strong>Location:</strong> Woodland Hills, CA (Hybrid – 2–3 days onsite)</p><p> <strong>Openings:</strong> 2</p><p> <strong>Compensation:</strong> $200,000 – $220,000 Base + 10% Bonus</p><p><strong>Overview</strong></p><p> We are seeking a <strong>hands-on Engineering Manager (.NET)</strong> to lead a small team of engineers while actively contributing to the codebase. This role is ideal for a strong technical leader who enjoys mentoring developers, driving engineering best practices, and remaining deeply involved in architecture and development.</p><p>The Engineering Manager will oversee a team of <strong>3 software engineers</strong> and play a key role in building and scaling modern applications using the <strong>Microsoft .NET ecosystem</strong>.</p><p><strong>Responsibilities</strong></p><ul><li>Lead, mentor, and manage a team of <strong>3 software engineers</strong></li><li>Remain <strong>hands-on with development</strong> using <strong>.NET and ASP.NET</strong></li><li>Design and build scalable backend services and APIs</li><li>Participate in system architecture, technical design, and code reviews</li><li>Collaborate with product, data, and infrastructure teams to deliver new features</li><li>Improve engineering processes, code quality, and development standards</li><li>Support hiring, performance management, and team development</li><li>Ensure applications are reliable, scalable, and secure</li></ul><p><br></p>
<p><strong>Data Engineer – CRM Integration (Hybrid in San Fernando Valley)</strong></p><p><strong>Location:</strong> San Fernando Valley (Hybrid – 3x per week onsite)</p><p><strong>Compensation:</strong> $140K–$170K annual base salary</p><p><strong>Job Type:</strong> Full Time, Permanent</p><p><strong>Overview:</strong></p><p>Join our growing technology team as a Data Engineer with a focus on CRM data integration. This permanent role will play a key part in supporting analytics and business intelligence across our organization. The position offers a collaborative hybrid environment and highly competitive compensation.</p><p><strong>Responsibilities:</strong></p><ul><li>Design, develop, and optimize data pipelines and workflows integrating multiple CRM systems (Salesforce, Dynamics, HubSpot, NetSuite, or similar).</li><li>Build and maintain scalable data architectures for analytics and reporting.</li><li>Manage and advance CRM data integrations, including real-time and batch processing solutions.</li><li>Deploy ML models, automate workflows, and support model serving using Azure Databricks (MLflow experience preferred).</li><li>Utilize Azure Synapse Analytics &amp; Pipelines for high-volume data management.</li><li>Write advanced Python and Spark SQL code for ETL, transformation, and analytics.</li><li>Collaborate with BI and analytics teams to deliver actionable insights using Power BI.</li><li>Support streaming solutions with technologies like Kafka, Event Hubs, and Spark Streaming.</li></ul><p><br></p>
We are looking for an experienced Software Engineer to develop and maintain applications within the Symitar platform, contributing to the enhancement of our core banking systems. This role requires a proactive approach to creating tailored solutions that align with business needs while ensuring system performance and security compliance.<br><br>Responsibilities:<br>• Create and manage Symitar PowerOn scripts, batch jobs, and integrations to meet operational objectives.<br>• Support system upgrades and troubleshoot issues to ensure seamless functionality.<br>• Work closely with stakeholders to gather requirements and deliver customized solutions.<br>• Design, generate, and maintain reports using Symitar Quest and associated tools.<br>• Monitor and optimize system performance while ensuring data integrity and security standards.<br>• Provide ongoing production support and participate in the on-call rotation as required.<br>• Collaborate on system enhancements and ensure compatibility with core banking processes.
We are looking for a talented Data Engineer to join our team in Glendale, California. In this long-term contract role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and platforms that support critical business operations. Through collaboration with cross-functional teams, you will contribute to innovative data solutions that enhance decision-making processes and drive operational excellence.<br><br>Responsibilities:<br>• Develop, maintain, and optimize data pipelines to support the Core Data platform.<br>• Create tools and services to enhance data discovery, governance, and privacy.<br>• Collaborate with product managers, architects, and software engineers to ensure the success of data platforms.<br>• Apply technologies such as Airflow, Spark, Databricks, Delta Lake, and Kubernetes to build advanced data solutions.<br>• Establish and document best practices for pipeline configurations, naming conventions, and operational standards.<br>• Monitor and ensure the accuracy, reliability, and efficiency of datasets to meet service level agreements (SLAs).<br>• Participate in agile and scrum ceremonies to improve collaboration and team processes.<br>• Foster relationships with stakeholders to understand their needs and prioritize platform enhancements.<br>• Maintain detailed documentation to support data governance and quality initiatives.
We are looking for an experienced Platform / DevOps Engineer to join our team in Los Angeles, California. This role focuses on enhancing developer workflows, maintaining platform operations, and ensuring system observability to support production environments. As part of a long-term contract, you will play a key role in optimizing cloud resources, managing access controls, and troubleshooting issues across various tools and environments.<br><br>Responsibilities:<br>• Manage user access across Atlassian tools, Azure DevOps, GitHub, and other platforms, ensuring secure and compliant permissions.<br>• Process and oversee access requests using ServiceNow and internal workflows to maintain least-privilege access.<br>• Design, maintain, and troubleshoot CI/CD pipelines within Azure DevOps and GitHub Actions.<br>• Provide support for containerized applications using Docker and Kubernetes, including environment configuration.<br>• Collaborate with Systems Engineering teams to manage cloud resources and optimize configurations in Azure.<br>• Analyze system logs and metrics using Elastic tools to identify and resolve backend service issues.<br>• Investigate and troubleshoot issues related to server environments, databases, and backend services.<br>• Partner with engineering teams to identify root causes of system failures and implement preventive measures.<br>• Participate in incident response efforts and contribute to post-incident reviews and improvements.