We are looking for a skilled Software Engineer to join our team on a long-term contract basis in Rancho Cucamonga, California. In this role, you will collaborate with various teams, including engineering, inventory control, and planning, to streamline processes and create detailed documentation packages. Your expertise in software development and ability to work closely with machinists and assemblers will be essential in ensuring accurate and efficient workflows.<br><br>Responsibilities:<br>• Transform 3D models generated by engineers into comprehensive drawing packages, including individual drawings for each element in the assembly.<br>• Collaborate with inventory control and planning teams to maintain and organize online documentation, including job build materials, time elements, and work instructions.<br>• Coordinate and conduct meetings with engineers, machinists, and assemblers to discuss designs and develop clear, actionable instructions.<br>• Develop and implement software solutions using programming languages such as C#, .NET, and ASP.NET.<br>• Apply JavaScript and React.js to enhance functionality and usability of applications.<br>• Ensure accurate documentation and workflow processes for manufacturing operations.<br>• Troubleshoot and resolve technical issues related to software and documentation systems.<br>• Maintain effective communication across teams to ensure alignment on project goals and deliverables.<br>• Provide technical expertise and guidance to support ongoing improvement initiatives.
We are looking for an experienced Platform / DevOps Engineer to join our team in Los Angeles, California. This role focuses on enhancing developer workflows, maintaining platform operations, and ensuring system observability to support production environments. As part of a long-term contract, you will play a key role in optimizing cloud resources, managing access controls, and troubleshooting issues across various tools and environments.<br><br>Responsibilities:<br>• Manage user access across Atlassian tools, Azure DevOps, GitHub, and other platforms, ensuring secure and compliant permissions.<br>• Process and oversee access requests using ServiceNow and internal workflows to maintain least-privilege access.<br>• Design, maintain, and troubleshoot CI/CD pipelines within Azure DevOps and GitHub Actions.<br>• Provide support for containerized applications using Docker and Kubernetes, including environment configuration.<br>• Collaborate with Systems Engineering teams to manage cloud resources and optimize configurations in Azure.<br>• Analyze system logs and metrics using Elastic tools to identify and resolve backend service issues.<br>• Investigate and troubleshoot issues related to server environments, databases, and backend services.<br>• Partner with engineering teams to identify root causes of system failures and implement preventive measures.<br>• Participate in incident response efforts and contribute to post-incident reviews and improvements.
<p><strong>-- Client =</strong> Interactive Entertainment</p><p><strong>-- Location =</strong> Remote</p><p><strong>-- Comp =</strong> $150k-$200k annual base + benefits</p><p><strong>-- Work Authorization = </strong>Our client is NOT able to sponsor or transfer visas at this time</p><p><strong>-- Focus =</strong> Design, automate, and maintain CI/CD pipelines and infrastructure across Linux, Windows, and macOS environments</p><p><strong>-- MUST HAVES =</strong> <strong><em>Last 5+ years w/ a focus on DevOps using AWS, EKS,</em></strong> <strong><em>GitLab</em></strong></p><p><strong>-- Bonus Points =</strong> Okta Integration, Backstage.io (or similar dev portals), Mobile or Gaming CI/CD Pipelines, DataDog, OpenTofu, Ansible, CloudFormation, Docker, Python</p><p><br></p>
<p>We are seeking an experienced Senior Data Engineer to support and enhance enterprise business intelligence and analytics environments. This role focuses on designing, building, and maintaining scalable data pipelines and cloud‑based data platforms using AWS services. The ideal candidate brings deep hands‑on experience with AWS Glue, PySpark, Redshift, and serverless architectures, along with strong SQL and data analysis skills.</p><p>This role will collaborate closely with architecture, security, compliance, and development teams to ensure data solutions are performant, secure, and compliant with regulatory requirements.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain scalable ETL/ELT pipelines using AWS Glue with PySpark for large‑scale data processing</li><li>Develop and support serverless integrations using AWS Lambda for event‑driven workflows and system integrations</li><li>Design and optimize Amazon Redshift data warehouse solutions, including:<ul><li>Advanced SQL analytics</li><li>Stored procedures</li><li>Performance tuning</li></ul></li><li>Lead implementation of secure vendor file transfer and ingestion solutions using AWS Transfer Family</li><li>Design and implement database migration and replication pipelines using AWS Database Migration Service (DMS)</li><li>Build and manage workflow orchestration using Apache Airflow or similar orchestration tools</li><li>Analyze data quality, transformation logic, and pipeline performance using SQL and data analysis techniques</li><li>Troubleshoot and resolve production data pipeline and integration issues across AWS services</li><li>Provide technical guidance to development team members on:<ul><li>AWS best practices</li><li>Cost optimization</li><li>Performance optimization</li></ul></li><li>Partner with enterprise architecture, security, and compliance teams to ensure SOX and regulatory compliance</li></ul>
<p><strong>***For immediate response please email Valerie Nielsen***</strong></p><p><br></p><p><strong>Job Title:</strong> Machine Learning Engineer / Data Engineer</p><p><strong>Location:</strong> Culver City, CA (Onsite 4 days per week)</p><p><strong>Compensation:</strong> $200,000 Base + Bonus/Equity (if applicable)</p><p><strong>Overview</strong></p><p>We are seeking a <strong>Machine Learning Engineer / Data Engineer</strong> to help build scalable data and machine learning platforms that power intelligent products and decision systems. This role will focus on developing infrastructure and pipelines that enable multiple teams to leverage advanced analytics, real-time decisioning, and modern AI capabilities including LLM-based applications.</p><p>The ideal candidate has experience building <strong>data and ML platforms used across an organization</strong>, and enjoys working at the intersection of <strong>data engineering, machine learning infrastructure, and production AI systems</strong>.</p><p><strong>Responsibilities</strong></p><ul><li>Design and build <strong>scalable data and machine learning platforms</strong> used by multiple internal teams</li><li>Develop and maintain <strong>ML pipelines, feature stores, and training workflows</strong></li><li>Build infrastructure supporting <strong>LLM-powered applications</strong>, including embeddings, vector search, and <strong>RAG pipelines</strong></li><li>Develop systems for <strong>real-time decisioning</strong>, including pricing, personalization, and recommendation engines</li><li>Build and maintain <strong>experimentation platforms and A/B testing infrastructure</strong></li><li>Optimize data pipelines and ML workflows for <strong>performance and scalability</strong>, including GPU-based training environments</li><li>Collaborate with product, engineering, and data teams to operationalize machine learning models in production</li></ul><p><br></p>
We are looking for a talented Data Engineer to join our team in Glendale, California. In this long-term contract role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and platforms that support critical business operations. Through collaboration with cross-functional teams, you will contribute to innovative data solutions that enhance decision-making processes and drive operational excellence.<br><br>Responsibilities:<br>• Develop, maintain, and optimize data pipelines to support the Core Data platform.<br>• Create tools and services to enhance data discovery, governance, and privacy.<br>• Collaborate with product managers, architects, and software engineers to ensure the success of data platforms.<br>• Apply technologies such as Airflow, Spark, Databricks, Delta Lake, and Kubernetes to build advanced data solutions.<br>• Establish and document best practices for pipeline configurations, naming conventions, and operational standards.<br>• Monitor and ensure the accuracy, reliability, and efficiency of datasets to meet service level agreements (SLAs).<br>• Participate in agile and scrum ceremonies to improve collaboration and team processes.<br>• Foster relationships with stakeholders to understand their needs and prioritize platform enhancements.<br>• Maintain detailed documentation to support data governance and quality initiatives.
<p><strong>***Please email Valerie Nielsen for immediate response***</strong></p><p><br></p><p><strong>Job Title:</strong> Data Engineer</p><p><strong>Location:</strong> West Los Angeles, CA (Onsite)</p><p><strong>Salary:</strong> $150,000 Base + Bonus</p><p><strong>Overview</strong></p><p>We are seeking a <strong>Data Engineer</strong> to join our team onsite in <strong>West Los Angeles</strong>. This role is ideal for someone early in their career who has strong technical fundamentals, enjoys working with data, and has curiosity around modern AI tools. The ideal candidate has a strong analytical mindset and enjoys solving complex data problems while building scalable pipelines and data models.</p><p><strong>Responsibilities</strong></p><ul><li>Build, maintain, and optimize data pipelines and ETL processes</li><li>Write efficient and scalable <strong>SQL and Python</strong> code for data transformation and analysis</li><li>Work with cloud data platforms in <strong>AWS or Azure</strong></li><li>Support data modeling, data warehouse development, and reporting pipelines</li><li>Collaborate with analytics and product teams to deliver clean, reliable datasets</li><li>Explore and leverage <strong>AI tools (e.g., Claude or similar)</strong> to improve workflows and productivity</li><li>Ensure data quality, performance, and scalability across systems</li></ul><p><br></p>
<p><strong>Network Security Administrator</strong></p><p><strong>Position Summary</strong></p><p>The Network Security Administrator is responsible for protecting the organization’s network infrastructure through proactive monitoring, configuration, and management of next‑generation firewalls and related security technologies. This role focuses heavily on <strong>Next‑Generation Firewalls (NGFW)</strong>, ensuring secure and reliable connectivity across enterprise environments. The administrator will be a key contributor to incident response, threat mitigation, and continuous improvement of security controls.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Firewall &amp; Network Security Administration</strong></p><ul><li>Configure, manage, and maintain <strong>Next‑Generation Firewalls (NGFW)</strong>, including policies, objects, NAT rules, App‑ID, User‑ID, threat prevention profiles, remote‑access VPN, ACLs, and IDS/IPS policies.</li><li>Perform regular firewall rule reviews, cleanup, and optimization to improve performance and reduce risk.</li></ul><p><strong>Security Operations &amp; Monitoring</strong></p><ul><li>Monitor network security alerts, traffic anomalies, and firewall logs using vendor‑specific tools, SIEM platforms, and packet capture utilities.</li><li>Investigate and remediate network‑based security incidents, coordinating with SOC or incident response teams as necessary.</li><li>Perform security event correlation and escalation following established SOPs.</li></ul><p><strong>Network Infrastructure Support</strong></p><ul><li>Support secure network connectivity across data centers, branch sites, cloud environments, and remote access solutions.</li><li>Troubleshoot Layer 2/3 issues related to routing, switching, VPN tunnels, and connectivity impacts to security appliances.</li><li>Assist in deploying secure architectures for new network builds, migrations, and cloud integrations.</li></ul><p><br></p>
<p>Role Overview</p><p>This role is responsible for developing, enhancing, and maintaining enterprise analytics solutions, including dashboards, reporting assets, data models, and custom web applications used by operational and executive stakeholders. The position focuses on transforming complex data into actionable insights through scalable analytics solutions within an agile development environment.</p><p>The ideal candidate brings strong <strong>MicroStrategy expertise</strong>, modern <strong>data warehousing experience</strong>, and the ability to leverage <strong>AI‑assisted development tools</strong> to deliver high‑quality analytics products efficiently.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, develop, and maintain <strong>MicroStrategy dashboards, reports, and semantic/data models</strong></li><li>Support and enhance <strong>custom analytics dashboards and web applications</strong> built with <strong>React</strong></li><li>Migrate select legacy reporting solutions to modern <strong>React‑based frameworks</strong></li><li>Perform <strong>data modeling, SQL development, and performance tuning</strong></li><li>Translate business and reporting requirements into technical specifications and analytics solutions</li><li>Use <strong>AI‑assisted coding tools</strong> (e.g., Cursor, Claude Code) to accelerate development, prototyping, and documentation while maintaining coding standards</li><li>Collaborate with technical leads and architects to adopt new tools, standards, and best practices</li><li>Optimize reporting performance across datasets, dashboards, and databases</li><li>Work effectively with <strong>remote and cross‑functional team members</strong></li></ul><p><br></p>