We are looking for a skilled Software Engineer to join our team on a long-term contract basis in Rancho Cucamonga, California. In this role, you will collaborate with various teams, including engineering, inventory control, and planning, to streamline processes and create detailed documentation packages. Your expertise in software development and ability to work closely with machinists and assemblers will be essential in ensuring accurate and efficient workflows.<br><br>Responsibilities:<br>• Transform 3D models generated by engineers into comprehensive drawing packages, including individual drawings for each element in the assembly.<br>• Collaborate with inventory control and planning teams to maintain and organize online documentation, including job build materials, time elements, and work instructions.<br>• Coordinate and conduct meetings with engineers, machinists, and assemblers to discuss designs and develop clear, actionable instructions.<br>• Develop and implement software solutions using programming languages such as C#, .NET, and ASP.NET.<br>• Apply JavaScript and React.js to enhance functionality and usability of applications.<br>• Ensure accurate documentation and workflow processes for manufacturing operations.<br>• Troubleshoot and resolve technical issues related to software and documentation systems.<br>• Maintain effective communication across teams to ensure alignment on project goals and deliverables.<br>• Provide technical expertise and guidance to support ongoing improvement initiatives.
<p>We are looking for a Senior Packaging Engineer to join our Product Development team (hybrid in greater Los Angeles). In this role, you will play a pivotal part in the lifecycle of packaging solutions for a variety of beauty and personal care brands. The position requires a creative and technical mindset to deliver cost-effective, functional, and visually appealing packaging that aligns with brand standards.</p><p><br></p><p>Responsibilities:</p><p>• Develop and oversee packaging specifications, codes, and artwork to ensure accuracy and alignment with product requirements.</p><p>• Collaborate with Product Development and Formulation teams to evaluate packaging compatibility with product formulas and ensure optimal performance.</p><p>• Coordinate with suppliers and contract manufacturers to obtain samples, conduct testing, and ensure packaging components meet quality, cost, and timeline expectations.</p><p>• Create and maintain technical drawings, die lines, and component specifications using relevant systems.</p><p>• Review and approve artwork mechanicals, ensuring proper fit and adherence to packaging standards.</p><p>• Partner with Supply Chain and Operations teams to design shippers and unit cartons, identify cost-saving opportunities, and resolve production or vendor challenges.</p><p>• Track packaging deliverables and ensure components are ready for brand project milestones through regular team meetings.</p><p>• Organize and maintain pre-production samples and component libraries, ensuring materials are accessible and up-to-date.</p><p>• Assist in the creation and management of molds, collaborating with design and manufacturing partners to meet specifications.</p><p>• Proactively address challenges, communicate effectively across teams, and maintain a solutions-oriented approach.</p>
<p><strong>***For immediate response please email Valerie Nielsen***</strong></p><p><br></p><p><strong>Job Title:</strong> Machine Learning Engineer / Data Engineer</p><p> <strong>Location:</strong> Culver City, CA (Onsite 4 days per week)</p><p> <strong>Compensation:</strong> $200,000 Base + Bonus/Equity (if applicable)</p><p><strong>Overview</strong></p><p> We are seeking a <strong>Machine Learning Engineer / Data Engineer</strong> to help build scalable data and machine learning platforms that power intelligent products and decision systems. This role will focus on developing infrastructure and pipelines that enable multiple teams to leverage advanced analytics, real-time decisioning, and modern AI capabilities including LLM-based applications.</p><p>The ideal candidate has experience building <strong>data and ML platforms used across an organization</strong>, and enjoys working at the intersection of <strong>data engineering, machine learning infrastructure, and production AI systems</strong>.</p><p><strong>Responsibilities</strong></p><ul><li>Design and build <strong>scalable data and machine learning platforms</strong> used by multiple internal teams</li><li>Develop and maintain <strong>ML pipelines, feature stores, and training workflows</strong></li><li>Build infrastructure supporting <strong>LLM-powered applications</strong>, including embeddings, vector search, and <strong>RAG pipelines</strong></li><li>Develop systems for <strong>real-time decisioning</strong>, including pricing, personalization, and recommendation engines</li><li>Build and maintain <strong>experimentation platforms and A/B testing infrastructure</strong></li><li>Optimize data pipelines and ML workflows for <strong>performance and scalability</strong>, including GPU-based training environments</li><li>Collaborate with product, engineering, and data teams to operationalize machine learning models in production</li></ul><p><br></p>
We are looking for a talented Data Engineer to join our team in Glendale, California. In this long-term contract role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and platforms that support critical business operations. Through collaboration with cross-functional teams, you will contribute to innovative data solutions that enhance decision-making processes and drive operational excellence.<br><br>Responsibilities:<br>• Develop, maintain, and optimize data pipelines to support the Core Data platform.<br>• Create tools and services to enhance data discovery, governance, and privacy.<br>• Collaborate with product managers, architects, and software engineers to ensure the success of data platforms.<br>• Apply technologies such as Airflow, Spark, Databricks, Delta Lake, and Kubernetes to build advanced data solutions.<br>• Establish and document best practices for pipeline configurations, naming conventions, and operational standards.<br>• Monitor and ensure the accuracy, reliability, and efficiency of datasets to meet service level agreements (SLAs).<br>• Participate in agile and scrum ceremonies to improve collaboration and team processes.<br>• Foster relationships with stakeholders to understand their needs and prioritize platform enhancements.<br>• Maintain detailed documentation to support data governance and quality initiatives.
<p><strong>***Please email Valerie Nielsen for immediate response*** </strong></p><p><br></p><p><strong>Job Title:</strong> Data Engineer</p><p> <strong>Location:</strong> West Los Angeles, CA (Onsite)</p><p> <strong>Salary:</strong> $150,000 Base + Bonus</p><p><strong>Overview</strong></p><p> We are seeking a <strong>Data Engineer</strong> to join our team onsite in <strong>West Los Angeles</strong>. This role is ideal for someone early in their career who has strong technical fundamentals, enjoys working with data, and has curiosity around modern AI tools. The ideal candidate has a strong analytical mindset and enjoys solving complex data problems while building scalable pipelines and data models.</p><p><strong>Responsibilities</strong></p><ul><li>Build, maintain, and optimize data pipelines and ETL processes</li><li>Write efficient and scalable <strong>SQL and Python</strong> code for data transformation and analysis</li><li>Work with cloud data platforms in <strong>AWS or Azure</strong></li><li>Support data modeling, data warehouse development, and reporting pipelines</li><li>Collaborate with analytics and product teams to deliver clean, reliable datasets</li><li>Explore and leverage <strong>AI tools (e.g., Claude or similar)</strong> to improve workflows and productivity</li><li>Ensure data quality, performance, and scalability across systems</li></ul><p><br></p>
<p>We are seeking a Tech Lead for our client in the South Bay. The ideal candidate will have 5-10 years of experience developing large-scale enterprise applications and backend systems, along with experience in a senior technical role such as technical lead, team lead, or another hands-on leadership position. The Tech Lead will work with Digital Services Development and deliver exceptional support to build meaningful digital experiences and strengthen customer experience. The Tech Lead will support all digital applications architecture, development, production support, and solutioning.</p><p>This candidate must have: </p><p>- Strong experience in Java/J2EE and related open-source frameworks (Spring Boot, Spring Integration, Spring Data JPA, Hibernate, etc.)</p><p>- Experience with public cloud providers, particularly AWS.</p><p>- Proficiency in microservices/backend development, event-driven architectures, and building enterprise integration solutions with various application servers and messaging systems (JBoss/WebSphere Application Server, Kafka, Red Hat AMQ, JMS, and others).</p><p>- Familiarity with containerized app deployments (Docker, Kubernetes, or Red Hat OpenShift) and serverless architecture.</p><p>- Experience with database systems (RDBMS such as RDS, NoSQL such as DynamoDB, etc.).</p><p>- Strong knowledge of object-oriented design patterns, current IT trends, the modern technology landscape, architecture principles, and advanced development techniques.</p><p>- Expertise in Agile practices and development methodology.</p><p>- Familiarity with DevOps practices and tools for continuous integration and deployment.</p><p><br></p>
<p><strong>Our client is seeking a Senior AWS Data Engineer for a long term, multi-year assignment.</strong></p><p><br></p><p><strong>This role is onsite 4 days/week in Torrance, CA. </strong></p><p><br></p><p>This role supports and enhances enterprise business intelligence and analytics environments, with a focus on designing, building, and maintaining scalable data pipelines and cloud‑based data platforms using AWS services. The ideal candidate brings deep hands‑on experience with AWS Glue, PySpark, Redshift, and serverless architectures, along with strong SQL and data analysis skills.</p><p>You will collaborate closely with architecture, security, compliance, and development teams to ensure data solutions are performant, secure, and compliant with regulatory requirements.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design, build, and maintain scalable ETL/ELT pipelines using AWS Glue with PySpark for large‑scale data processing</li><li>Develop and support serverless integrations using AWS Lambda for event‑driven workflows and system integrations</li><li>Design and optimize Amazon Redshift data warehouse solutions, including:<ul><li>Advanced SQL analytics</li><li>Stored procedures</li><li>Performance tuning</li></ul></li><li>Lead implementation of secure vendor file transfer and ingestion solutions using AWS Transfer Family</li><li>Design and implement database migration and replication pipelines using AWS Database Migration Service (DMS)</li><li>Build and manage workflow orchestration using Apache Airflow or similar orchestration tools</li><li>Analyze data quality, transformation logic, and pipeline performance using SQL and data analysis techniques</li><li>Troubleshoot and resolve production data pipeline and integration issues across AWS services</li><li>Provide technical guidance to development team members on:<ul><li>AWS best practices</li><li>Cost optimization</li><li>Performance optimization</li></ul></li><li>Partner with enterprise architecture, security, and compliance teams to ensure SOX and regulatory compliance</li></ul>