We are looking for a skilled Software Engineer to join our team on a long-term contract basis in Rancho Cucamonga, California. In this role, you will collaborate with various teams, including engineering, inventory control, and planning, to streamline processes and create detailed documentation packages. Your expertise in software development and ability to work closely with machinists and assemblers will be essential in ensuring accurate and efficient workflows.<br><br>Responsibilities:<br>• Transform 3D models generated by engineers into comprehensive drawing packages, including individual drawings for each element in the assembly.<br>• Collaborate with inventory control and planning teams to maintain and organize online documentation, including job build materials, time elements, and work instructions.<br>• Coordinate and conduct meetings with engineers, machinists, and assemblers to discuss designs and develop clear, actionable instructions.<br>• Develop and implement software solutions using programming languages such as C#, .NET, and ASP.NET.<br>• Apply JavaScript and React.js to enhance functionality and usability of applications.<br>• Ensure accurate documentation and workflow processes for manufacturing operations.<br>• Troubleshoot and resolve technical issues related to software and documentation systems.<br>• Maintain effective communication across teams to ensure alignment on project goals and deliverables.<br>• Provide technical expertise and guidance to support ongoing improvement initiatives.
<p>We are looking for a skilled and experienced Software Engineering Manager to lead a dynamic team in Los Angeles, California. This role requires a balance of hands-on development and leadership, with a focus on modernizing a legacy monolithic application into a scalable, service-oriented architecture. The ideal candidate will have a strong technical background, excellent communication skills, and a passion for mentoring engineers while driving platform evolution.</p><p><br></p><p>Responsibilities:</p><p>• Lead a team of engineers, dedicating 60-70% of your time to hands-on development and 30-40% to mentoring and guiding team members.</p><p>• Develop and refactor services using .NET technologies, ensuring alignment with service-oriented architecture principles.</p><p>• Break down tightly integrated application functionalities into reusable and well-defined services.</p><p>• Manage the delivery of new features and improvements while balancing platform modernization efforts.</p><p>• Collaborate with architecture and platform teams to define service boundaries, integration patterns, and standards.</p><p>• Design and implement APIs, service interfaces, and integration patterns that enhance scalability and extensibility.</p><p>• Review and approve code contributions to ensure they meet quality, performance, and security standards.</p><p>• Build and maintain shared components, utilities, and frameworks to accelerate development and promote reuse.</p><p>• Ensure adherence to best practices in CI/CD, DevOps, and operational readiness across the team.</p>
We are looking for an experienced Platform / DevOps Engineer to join our team in Los Angeles, California. This role focuses on enhancing developer workflows, maintaining platform operations, and ensuring system observability to support production environments. As part of a long-term contract, you will play a key role in optimizing cloud resources, managing access controls, and troubleshooting issues across various tools and environments.<br><br>Responsibilities:<br>• Manage user access across Atlassian tools, Azure DevOps, GitHub, and other platforms, ensuring secure and compliant permissions.<br>• Process and oversee access requests using ServiceNow and internal workflows to maintain least-privilege access.<br>• Design, maintain, and troubleshoot CI/CD pipelines within Azure DevOps and GitHub Actions.<br>• Provide support for containerized applications using Docker and Kubernetes, including environment configuration.<br>• Collaborate with Systems Engineering teams to manage cloud resources and optimize configurations in Azure.<br>• Analyze system logs and metrics using Elastic tools to identify and resolve backend service issues.<br>• Investigate and troubleshoot issues related to server environments, databases, and backend services.<br>• Partner with engineering teams to identify root causes of system failures and implement preventive measures.<br>• Participate in incident response efforts and contribute to post-incident reviews and improvements.
<p><strong>-- Client =</strong> Interactive Entertainment</p><p><strong>-- Location =</strong> Remote</p><p><strong>-- Comp =</strong> $150k-$200k annual base + benefits</p><p><strong>-- Work Authorization = </strong>we are NOT able to sponsor or transfer visas at this time</p><p><strong>-- Focus =</strong> Design, automate, and maintain CI/CD pipelines and infrastructure across Linux, Windows, and macOS environments</p><p><strong>-- MUST HAVES =</strong> <strong><em>Last 5+ years w/ a focus on DevOps using AWS, EKS, GitLab</em></strong></p><p><strong>-- Bonus Points =</strong> Okta Integration, Backstage.io (or similar dev portals), Mobile or Gaming CI/CD Pipelines, DataDog, OpenTofu, Ansible, CloudFormation, Docker, Python</p><p><br></p>
<p>We are seeking an experienced Senior Data Engineer to support and enhance enterprise business intelligence and analytics environments. This role focuses on designing, building, and maintaining scalable data pipelines and cloud‑based data platforms using AWS services. The ideal candidate brings deep hands‑on experience with AWS Glue, PySpark, Redshift, and serverless architectures, along with strong SQL and data analysis skills.</p><p>This role will collaborate closely with architecture, security, compliance, and development teams to ensure data solutions are performant, secure, and compliant with regulatory requirements.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain scalable ETL/ELT pipelines using AWS Glue with PySpark for large‑scale data processing</li><li>Develop and support serverless integrations using AWS Lambda for event‑driven workflows and system integrations</li><li>Design and optimize Amazon Redshift data warehouse solutions, including:<ul><li>Advanced SQL analytics</li><li>Stored procedures</li><li>Performance tuning</li></ul></li><li>Lead implementation of secure vendor file transfer and ingestion solutions using AWS Transfer Family</li><li>Design and implement database migration and replication pipelines using AWS Database Migration Service (DMS)</li><li>Build and manage workflow orchestration using Apache Airflow or similar orchestration tools</li><li>Analyze data quality, transformation logic, and pipeline performance using SQL and data analysis techniques</li><li>Troubleshoot and resolve production data pipeline and integration issues across AWS services</li><li>Provide technical guidance to development team members on:<ul><li>AWS best practices</li><li>Cost optimization</li><li>Performance optimization</li></ul></li><li>Partner with enterprise architecture, security, and compliance teams to ensure SOX and regulatory compliance</li></ul>
<p><strong>***For immediate response please email Valerie Nielsen***</strong></p><p><br></p><p><strong>Job Title:</strong> Engineering Manager (.NET)</p><p> <strong>Location:</strong> Woodland Hills, CA (Hybrid – 2–3 days onsite)</p><p> <strong>Openings:</strong> 2</p><p> <strong>Compensation:</strong> $200,000 – $220,000 Base + 10% Bonus</p><p><strong>Overview</strong></p><p> We are seeking a <strong>hands-on Engineering Manager (.NET)</strong> to lead a small team of engineers while actively contributing to the codebase. This role is ideal for a strong technical leader who enjoys mentoring developers, driving engineering best practices, and remaining deeply involved in architecture and development.</p><p>The Engineering Manager will oversee a team of <strong>3 software engineers</strong> and play a key role in building and scaling modern applications using the <strong>Microsoft .NET ecosystem</strong>.</p><p><strong>Responsibilities</strong></p><ul><li>Lead, mentor, and manage a team of <strong>3 software engineers</strong></li><li>Remain <strong>hands-on with development</strong> using <strong>.NET and ASP.NET</strong></li><li>Design and build scalable backend services and APIs</li><li>Participate in system architecture, technical design, and code reviews</li><li>Collaborate with product, data, and infrastructure teams to deliver new features</li><li>Improve engineering processes, code quality, and development standards</li><li>Support hiring, performance management, and team development</li><li>Ensure applications are reliable, scalable, and secure</li></ul><p><br></p>
<p><strong>***For immediate response please email Valerie Nielsen***</strong></p><p><br></p><p><strong>Job Title:</strong> Machine Learning Engineer / Data Engineer</p><p> <strong>Location:</strong> Culver City, CA (Onsite 4 days per week)</p><p> <strong>Compensation:</strong> $200,000 Base + Bonus/Equity (if applicable)</p><p><strong>Overview</strong></p><p> We are seeking a <strong>Machine Learning Engineer / Data Engineer</strong> to help build scalable data and machine learning platforms that power intelligent products and decision systems. This role will focus on developing infrastructure and pipelines that enable multiple teams to leverage advanced analytics, real-time decisioning, and modern AI capabilities including LLM-based applications.</p><p>The ideal candidate has experience building <strong>data and ML platforms used across an organization</strong>, and enjoys working at the intersection of <strong>data engineering, machine learning infrastructure, and production AI systems</strong>.</p><p><strong>Responsibilities</strong></p><ul><li>Design and build <strong>scalable data and machine learning platforms</strong> used by multiple internal teams</li><li>Develop and maintain <strong>ML pipelines, feature stores, and training workflows</strong></li><li>Build infrastructure supporting <strong>LLM-powered applications</strong>, including embeddings, vector search, and <strong>RAG pipelines</strong></li><li>Develop systems for <strong>real-time decisioning</strong>, including pricing, personalization, and recommendation engines</li><li>Build and maintain <strong>experimentation platforms and A/B testing infrastructure</strong></li><li>Optimize data pipelines and ML workflows for <strong>performance and scalability</strong>, including GPU-based training environments</li><li>Collaborate with product, engineering, and data teams to operationalize machine learning models in production</li></ul><p><br></p>
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
We are looking for a talented Data Engineer to join our team in Glendale, California. In this long-term contract role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and platforms that support critical business operations. Through collaboration with cross-functional teams, you will contribute to innovative data solutions that enhance decision-making processes and drive operational excellence.<br><br>Responsibilities:<br>• Develop, maintain, and optimize data pipelines to support the Core Data platform.<br>• Create tools and services to enhance data discovery, governance, and privacy.<br>• Collaborate with product managers, architects, and software engineers to ensure the success of data platforms.<br>• Apply technologies such as Airflow, Spark, Databricks, Delta Lake, and Kubernetes to build advanced data solutions.<br>• Establish and document best practices for pipeline configurations, naming conventions, and operational standards.<br>• Monitor and ensure the accuracy, reliability, and efficiency of datasets to meet service level agreements (SLAs).<br>• Participate in agile and scrum ceremonies to improve collaboration and team processes.<br>• Foster relationships with stakeholders to understand their needs and prioritize platform enhancements.<br>• Maintain detailed documentation to support data governance and quality initiatives.
<p><strong>***Please email Valerie Nielsen for immediate response*** </strong></p><p><br></p><p><strong>Job Title:</strong> Data Engineer</p><p> <strong>Location:</strong> West Los Angeles, CA (Onsite)</p><p> <strong>Salary:</strong> $150,000 Base + Bonus</p><p><strong>Overview</strong></p><p> We are seeking a <strong>Data Engineer</strong> to join our team onsite in <strong>West Los Angeles</strong>. This role is ideal for someone early in their career who has strong technical fundamentals, enjoys working with data, and has curiosity around modern AI tools. The ideal candidate has a strong analytical mindset and enjoys solving complex data problems while building scalable pipelines and data models.</p><p><strong>Responsibilities</strong></p><ul><li>Build, maintain, and optimize data pipelines and ETL processes</li><li>Write efficient and scalable <strong>SQL and Python</strong> code for data transformation and analysis</li><li>Work with cloud data platforms in <strong>AWS or Azure</strong></li><li>Support data modeling, data warehouse development, and reporting pipelines</li><li>Collaborate with analytics and product teams to deliver clean, reliable datasets</li><li>Explore and leverage <strong>AI tools (e.g., Claude or similar)</strong> to improve workflows and productivity</li><li>Ensure data quality, performance, and scalability across systems</li></ul><p><br></p>
<p><strong>Data Engineer – CRM Integration (Hybrid in San Fernando Valley)</strong></p><p><strong>Location:</strong> San Fernando Valley (Hybrid – 3x per week onsite)</p><p><strong>Compensation:</strong> $140K–$170K annual base salary</p><p><strong>Job Type:</strong> Full Time, Permanent</p><p><strong>Overview:</strong></p><p>Join our growing technology team as a Data Engineer with a focus on CRM data integration. This permanent role will play a key part in supporting analytics and business intelligence across our organization. The position offers a collaborative hybrid environment and highly competitive compensation.</p><p><strong>Responsibilities:</strong></p><ul><li>Design, develop, and optimize data pipelines and workflows integrating multiple CRM systems (Salesforce, Dynamics, HubSpot, NetSuite, or similar).</li><li>Build and maintain scalable data architectures for analytics and reporting.</li><li>Manage and advance CRM data integrations, including real-time and batch processing solutions.</li><li>Deploy ML models, automate workflows, and support model serving using Azure Databricks (MLflow experience preferred).</li><li>Utilize Azure Synapse Analytics & Pipelines for high-volume data management.</li><li>Write advanced Python and Spark SQL code for ETL, transformation, and analytics.</li><li>Collaborate with BI and analytics teams to deliver actionable insights using Power BI.</li><li>Support streaming solutions with technologies like Kafka, Event Hubs, and Spark Streaming.</li></ul><p><br></p>
<p><strong>Network Security Administrator</strong></p><p><strong>Position Summary</strong></p><p>The Network Security Administrator is responsible for protecting the organization’s network infrastructure through proactive monitoring, configuration, and management of next‑generation firewalls and related security technologies. This role focuses heavily on <strong>Next-Generation Firewalls (NGFW)</strong>, ensuring secure and reliable connectivity across enterprise environments. The administrator will be a key contributor to incident response, threat mitigation, and continuous improvement of security controls.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Firewall & Network Security Administration</strong></p><ul><li>Configure, manage, and maintain <strong>Next‑Generation Firewalls (NGFW)</strong>, including policies, objects, NAT rules, App‑ID, User‑ID, threat prevention profiles, remote access VPN, ACLs, and IDS/IPS policies.</li><li>Perform regular firewall rule reviews, cleanup, and optimization to improve performance and reduce risk.</li></ul><p><strong>Security Operations & Monitoring</strong></p><ul><li>Monitor network security alerts, traffic anomalies, and firewall logs using vendor-specific tools, SIEM platforms, and packet capture utilities.</li><li>Investigate and remediate network‑based security incidents, coordinating with SOC or incident response teams as necessary.</li><li>Perform security event correlation and escalation following established SOPs.</li></ul><p><strong>Network Infrastructure Support</strong></p><ul><li>Support secure network connectivity across data centers, branch sites, cloud environments, and remote access solutions.</li><li>Troubleshoot layer 2/3 issues related to routing, switching, VPN tunnels, and connectivity impacts to security appliances.</li><li>Assist in deploying secure architectures for new network builds, migrations, and cloud integrations.</li></ul><p><br></p>
<p>Role Overview</p><p>This role is responsible for developing, enhancing, and maintaining enterprise analytics solutions, including dashboards, reporting assets, data models, and custom web applications used by operational and executive stakeholders. The position focuses on transforming complex data into actionable insights through scalable analytics solutions within an agile development environment.</p><p>The ideal candidate brings strong <strong>MicroStrategy expertise</strong>, modern <strong>data warehousing experience</strong>, and the ability to leverage <strong>AI‑assisted development tools</strong> to deliver high‑quality analytics products efficiently.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, develop, and maintain <strong>MicroStrategy dashboards, reports, and semantic/data models</strong></li><li>Support and enhance <strong>custom analytics dashboards and web applications</strong> built with <strong>React</strong></li><li>Migrate select legacy reporting solutions to modern <strong>React‑based frameworks</strong></li><li>Perform <strong>data modeling, SQL development, and performance tuning</strong></li><li>Translate business and reporting requirements into technical specifications and analytics solutions</li><li>Use <strong>AI‑assisted coding tools</strong> (e.g., Cursor, Claude Code) to accelerate development, prototyping, and documentation while maintaining coding standards</li><li>Collaborate with technical leads and architects to adopt new tools, standards, and best practices</li><li>Optimize reporting performance across datasets, dashboards, and databases</li><li>Work effectively with <strong>remote and cross‑functional team members</strong></li></ul><p><br></p>