<p><strong>Machine Learning Engineer II </strong>(Contract)</p><p><strong>Location: </strong>Hybrid – Philadelphia, PA <strong>or </strong>Washington, DC | Onsite 3–4 days per week</p><p><strong>Assignment Length:</strong> 38 Weeks, Potential for Extension</p><p><strong>Position Overview</strong></p><p>We are seeking a <strong>Research / Machine Learning Engineer II</strong> to support advanced AI/ML initiatives across large-scale consumer-facing platforms, including Search, Browse, Personalization, Campaign Management, and Voice/NLP technologies. This role is heavily focused on <strong>model validation, quality, and automation</strong>, with an emphasis on building and enhancing machine learning models that validate and support AI-driven tools developed by engineering teams.</p><p>The ideal candidate brings a strong quality mindset, hands-on experience with machine learning models, and a deep interest in testing and validating large language models (LLMs) and AI systems in production-like environments.</p><p><strong>Key Responsibilities</strong></p><ul><li>Design, build, and enhance <strong>machine learning models primarily used for validation and quality assurance</strong> of AI/ML-driven tools.</li><li>Develop models that assist in testing, validating, and improving automation frameworks used by engineering and tooling teams.</li><li>Enhance and support existing AI/ML automation tools, including those working with <strong>speech and NLP data</strong>.</li><li>Implement <strong>prompt-based interactions with Large Language Models (LLMs)</strong> to support validation and test use cases.</li><li>Research, evaluate, and experiment with various ML models across multiple domains to determine best-fit solutions.</li><li>Contribute software development efforts toward <strong>proof-of-concept initiatives</strong> in AI/ML, NLP, and related strategic areas (e.g., Computer Vision where applicable).</li><li>Collaborate closely with cross-functional engineering, tooling, 
and SDET teams across multiple locations.</li><li>Support and mentor engineering teams by promoting modern software development, data practices, and quality-driven AI development.</li><li>Ensure AI/ML solutions meet expectations for <strong>performance, reliability, scalability, and product quality</strong>.</li></ul>
We are looking for a highly experienced Senior Machine Learning Engineer to join our team in Boston, Massachusetts. In this role, you will design, develop, and deploy cutting-edge machine learning systems that solve complex problems and scale effectively in production environments. This position offers an exciting opportunity to contribute to impactful projects, leveraging your expertise in machine learning, cloud infrastructure, and data engineering.<br><br>Responsibilities:<br>• Build and deploy machine learning models and solutions for production environments, ensuring they meet scalability and performance standards.<br>• Design and implement comprehensive ML pipelines, including data ingestion, feature engineering, model training, evaluation, and serving.<br>• Write clean, efficient code in Python and leverage its ML ecosystem, including libraries such as TensorFlow, PyTorch, and scikit-learn.<br>• Work with large datasets to extract meaningful insights and develop complex queries using modern data processing tools.<br>• Utilize containerization technologies like Docker and cloud platforms such as AWS to ensure robust and scalable deployment.<br>• Apply MLOps best practices, including CI/CD pipelines, automated testing, and performance monitoring, to maintain reliable machine learning systems.<br>• Conduct research and apply advanced machine learning and AI techniques, including statistical modeling and large language models.<br>• Solve complex analytical problems with pragmatic engineering approaches while maintaining scientific rigor.<br>• Collaborate with cross-functional teams to align machine learning solutions with business goals and mission-driven objectives.<br>• Monitor and address issues like data drift and model performance to ensure continuous improvement and reliability.
<p>As our portfolio of AI-driven solutions continues to expand, we’re looking for an experienced <strong>Machine Learning Engineer</strong> to join our high-impact data science team. This role offers the opportunity to work across trading, operations, and support functions—delivering production-grade machine learning systems that solve real business problems.</p><p>You’ll collaborate with data scientists, software engineers, and commercial stakeholders to design, build, and deploy models that drive decision-making and innovation. From project scoping to model deployment, you’ll have visibility and influence across the full ML lifecycle.</p><p>🔧 Core Responsibilities</p><ul><li>Act as a thought partner to commercial teams, identifying high-value opportunities for AI/ML applications</li><li>Lead the design, development, and deployment of machine learning systems, with a focus on <strong>NLP</strong>, <strong>LLMs</strong>, and <strong>Generative AI</strong></li><li>Prioritize projects based on business impact and evolving market conditions</li><li>Collaborate with cross-functional teams to gather requirements and align solutions with strategic goals</li><li>Integrate ML solutions—including GenAI—into existing platforms to ensure seamless user experiences and scalable adoption</li><li>Participate in code reviews, experiment design, and tooling decisions to maintain high engineering standards</li><li>Share knowledge and mentor colleagues to build machine learning fluency across the organization</li></ul><p><br></p>
<p><strong>***For immediate response please email Valerie Nielsen***</strong></p><p><br></p><p><strong>Job Title:</strong> Machine Learning Engineer / Data Engineer</p><p> <strong>Location:</strong> Culver City, CA (Onsite 4 days per week)</p><p> <strong>Compensation:</strong> $200,000 Base + Bonus/Equity (if applicable)</p><p><strong>Overview</strong></p><p> We are seeking a <strong>Machine Learning Engineer / Data Engineer</strong> to help build scalable data and machine learning platforms that power intelligent products and decision systems. This role will focus on developing infrastructure and pipelines that enable multiple teams to leverage advanced analytics, real-time decisioning, and modern AI capabilities including LLM-based applications.</p><p>The ideal candidate has experience building <strong>data and ML platforms used across an organization</strong>, and enjoys working at the intersection of <strong>data engineering, machine learning infrastructure, and production AI systems</strong>.</p><p><strong>Responsibilities</strong></p><ul><li>Design and build <strong>scalable data and machine learning platforms</strong> used by multiple internal teams</li><li>Develop and maintain <strong>ML pipelines, feature stores, and training workflows</strong></li><li>Build infrastructure supporting <strong>LLM-powered applications</strong>, including embeddings, vector search, and <strong>RAG pipelines</strong></li><li>Develop systems for <strong>real-time decisioning</strong>, including pricing, personalization, and recommendation engines</li><li>Build and maintain <strong>experimentation platforms and A/B testing infrastructure</strong></li><li>Optimize data pipelines and ML workflows for <strong>performance and scalability</strong>, including GPU-based training environments</li><li>Collaborate with product, engineering, and data teams to operationalize machine learning models in production</li></ul><p><br></p>
<p>We are seeking a skilled AI Engineer to join our dynamic technology team. The ideal candidate has hands-on experience integrating advanced AI and large language model (LLM) features into applications, as well as a strong background in designing and delivering AI-driven solutions. In this role, you will work closely with product, engineering, and data teams to build and enhance innovative products using the latest AI frameworks and tools.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><p><br></p><ul><li>Design, develop, and integrate AI and LLM features into new or existing applications, ensuring scalable and reliable deployment.</li><li>Collaborate with cross-functional teams to define technical requirements and deliver AI-driven functionality in production environments.</li><li>Utilize AI frameworks, APIs, and platforms such as OpenAI, LangChain, vector databases, and machine learning libraries to accelerate solution development.</li><li>Lead prompt engineering, fine-tuning, and model optimization initiatives to improve performance and user outcomes.</li><li>Evaluate and select the most appropriate AI/ML models, tools, and platforms for project needs.</li><li>Produce documentation and conduct code reviews, testing, and performance monitoring of AI-driven products.</li><li>Stay up to date with advancements in artificial intelligence, generative models, and industry best practices.</li></ul><p><br></p>
We are looking for an experienced Artificial Intelligence (AI) Engineer to join our team in Atlanta, Georgia. This is a long-term contract position where you will play a pivotal role in advancing AI initiatives across clinical and business operations. The ideal candidate will have a strong technical background, excellent communication skills, and the ability to collaborate across multiple departments to drive innovative solutions in healthcare.<br><br>Responsibilities:<br>• Partner with various departments to identify, design, and implement AI solutions that address clinical, financial, and operational needs.<br>• Evaluate and integrate third-party AI tools and platforms, with a focus on healthcare applications such as NexTech, call center automation, AI-powered scribing, and clinical trial identification.<br>• Develop and support AI applications to enhance patient identification for trials, automate documentation, and improve workflows.<br>• Build and maintain AI-driven dashboards and analytics using tools like Power BI to provide actionable insights for clinical and business teams.<br>• Ensure AI integrations meet scalability, security, and compliance requirements, adhering to healthcare data privacy standards.<br>• Serve as a strategic advisor by proactively identifying opportunities for organizational improvement through AI.<br>• Collaborate with stakeholders across IT and non-IT teams to foster innovation and streamline operations.<br>• Stay updated on industry trends, regulatory standards, and emerging AI technologies relevant to healthcare.<br>• Provide technical leadership and guidance on AI-related projects, ensuring alignment with organizational goals.
<p>We’re seeking an AI Workflow Engineer to join a newly formed team focused on advancing internal AI and LLM platforms. In this role, you’ll collaborate with technical colleagues across multiple departments, helping design and build agentic workflow systems and business logic orchestration powered by leading AI solutions.</p><p><br></p><p>We’re looking for professionals with a development background (Python, Node.js, or similar) who are passionate about moving beyond traditional model training and instead want to tackle applied business applications of AI. You’ll be responsible for writing code to coordinate workflows, integrate data sources, and support sophisticated agentic architectures. Experience working with orchestration tools (n8n), integration frameworks, Retrieval-Augmented Generation (RAG), and MCP tools is a big plus.</p><p><br></p><p>Your work will involve:</p><ul><li>Developing AI-driven workflows and conversational agents that connect with internal and external data sources</li><li>Working hands-on with code—even if some tasks use low-code or no-code platforms</li><li>Implementing and optimizing solutions for applied data science challenges</li><li>Reviewing, improving, and championing future initiatives based on evolving business requirements</li></ul><p>Ideal candidates have worked directly with AI or LLM platforms, possess strong development or applied machine learning experience, and are comfortable in fast-moving, innovative environments. Familiarity with NVIDIA-backed infrastructure is desirable but not required.</p><p><br></p><p>If you’re excited about shaping agentic AI solutions, integrating business logic, and driving new workflows, we encourage you to apply.</p>
We are looking for an Artificial Intelligence (AI) Engineer to join our IT team in Rye, New York. In this role, you will design and implement AI-driven automation systems to enhance operational efficiency, streamline workflows, and support data-driven decision-making across the organization. This position offers an exciting opportunity to apply cutting-edge AI technologies to real-world challenges while fostering innovation within the company.<br><br>Responsibilities:<br>• Develop and deploy AI-powered solutions to automate workflows and increase productivity across various departments.<br>• Design and implement intelligent agents capable of assisting employees and automating decision-making processes.<br>• Research and evaluate emerging AI technologies to identify tools that can improve operational efficiency.<br>• Collaborate with stakeholders to identify opportunities for AI-driven process improvements and automation.<br>• Create detailed documentation of AI solutions, architectures, and workflows for knowledge sharing within the organization.<br>• Participate in pilot programs and proof-of-concept initiatives to experiment with new AI applications.<br>• Optimize and manage data structures and workflows to support AI-driven applications.<br>• Ensure scalability and reliability of deployed AI systems to meet business needs.<br>• Stay updated on advancements in AI and machine learning to continuously improve solutions.
We are looking for an experienced Artificial Intelligence (AI) Engineer to join our innovative team in Dallas, Texas. In this role, you will design and implement cutting-edge AI solutions, leveraging your expertise to solve complex problems and drive technological advancements. This position offers the opportunity to work on impactful projects that integrate advanced machine learning and AI technologies.<br><br>Responsibilities:<br>• Design, develop, and optimize AI models and algorithms to address specific business needs.<br>• Implement computer vision solutions to enhance functionality and improve detection systems.<br>• Collaborate with cross-functional teams to integrate AI technologies into existing systems.<br>• Utilize frameworks such as TensorFlow to build and deploy scalable machine learning models.<br>• Identify opportunities for innovation and propose strategies for AI-driven improvements.<br>• Conduct research on emerging AI trends to ensure solutions remain cutting-edge.<br>• Test, validate, and refine AI systems to ensure accuracy and reliability.<br>• Develop documentation and provide guidance for the integration and application of AI technologies.<br>• Monitor system performance and troubleshoot issues to maintain optimal functionality.<br>• Support the development of agentic AI systems to improve automation and decision-making processes.
We are looking for a skilled Artificial Intelligence (AI) Engineer to join our team in Indianapolis, Indiana. In this long-term contract role, you will apply your expertise in machine learning, computer vision, and AI technologies to develop innovative solutions for complex challenges. If you're passionate about advancing AI capabilities and enjoy working in a collaborative environment, we’d love to hear from you.<br><br>Responsibilities:<br>• Design and implement machine learning models to solve real-world problems.<br>• Develop advanced computer vision algorithms for object detection and recognition.<br>• Optimize AI frameworks using TensorFlow to enhance system performance.<br>• Conduct data analysis and preprocessing to improve model accuracy and reliability.<br>• Collaborate with cross-functional teams to integrate AI solutions into existing workflows.<br>• Test and validate models to ensure robustness and scalability.<br>• Research emerging trends in artificial intelligence to incorporate cutting-edge techniques.<br>• Provide technical guidance and mentorship to entry-level team members.<br>• Document processes and technical specifications for knowledge sharing and future reference.
We are looking for a skilled Artificial Intelligence (AI) Engineer to design, implement, and optimize cutting-edge AI solutions in Microsoft Azure. This role requires a proactive individual with expertise in integrating and maintaining advanced AI tools while ensuring compliance, security, and performance. Join our dynamic team in Dallas, Texas, to contribute to innovative projects and collaborate with cross-functional teams.<br><br>Responsibilities:<br>• Develop and maintain scalable AI solutions using Microsoft Azure platforms, including Cosmos DB, Function Apps, and Azure OpenAI.<br>• Ensure strict compliance with data governance policies and manage sensitive information securely.<br>• Monitor system performance and troubleshoot issues using Application Insights to enhance user experience and optimize costs.<br>• Manage and improve AI-enhanced chatbot functionalities, focusing on document embedding and vector search pipelines.<br>• Collaborate with cross-functional teams to integrate AI features and uphold quality standards.<br>• Implement best practices for secure integrations with Azure Key Vault and other tools.<br>• Optimize cloud resources, deployment pipelines, and rate limits for seamless operations.<br>• Document processes and solutions while providing analytics to support continuous improvement.
<p>We are seeking an experienced MLOps Engineer to design, deploy, monitor, and maintain machine learning solutions in production across AWS, Microsoft Azure, and Snowflake environments. This role will collaborate closely with data scientists, platform engineers, and cloud teams to operationalize ML models, automate pipelines, and build reliable, secure, and scalable ML/data platforms.</p><p>The ideal candidate brings strong hands-on expertise across the end-to-end ML lifecycle, cloud-native deployment, CI/CD automation, model monitoring, and production-grade data pipelines.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design and implement end-to-end ML pipelines for ingestion, feature engineering, training, validation, deployment, and monitoring.</li><li>Deploy and manage ML models in production across AWS, Azure, and Snowflake ecosystems.</li><li>Build batch and real-time inference pipelines using cloud-native and platform-native services.</li><li>Automate model packaging, testing, releases, and rollback using CI/CD best practices.</li><li>Integrate ML workflows with AWS SageMaker, AWS Lambda, Azure Machine Learning, Azure Data Factory, and Snowflake.</li><li>Build and maintain orchestration workflows using Airflow, Azure Data Factory, or similar tools.</li><li>Implement experiment tracking, model registries, and model governance processes.</li><li>Monitor model accuracy, drift, latency, throughput, pipeline performance, and infrastructure usage.</li><li>Establish advanced deployment strategies (canary, shadow, blue-green, rollback).</li><li>Collaborate with cross-functional teams to transition models from research to production.</li><li>Ensure security, compliance, traceability, and access control for ML systems and data.</li><li>Optimize platform reliability, performance, and cost across AWS, Azure, and Snowflake.</li></ul>
<p><strong>Data Modeling and Analysis</strong></p><ul><li>Design data models and optimize performance: creating the structure of data relationships to ensure efficient data retrieval and calculations.</li><li>Create calculated columns and measures: using DAX to calculate derived values and aggregate metrics.</li><li>Perform exploratory data analysis (EDA): using BI tools to explore data and identify trends and patterns.</li><li>Apply advanced data analysis techniques (e.g., statistical analysis, time series analysis, predictive modeling).</li><li>Integrate machine learning models into Power BI dashboards.</li><li>Experience building semantic models.</li></ul><p><strong>Dashboard Development and Visualization</strong></p><ul><li>Designing dashboards: creating visually appealing and interactive dashboards.</li><li>Creating visualizations: using charts, graphs, and other visual elements to represent data.</li><li>Implementing interactivity: adding filters, slicers, and drill-down capabilities.</li><li>Expertise in SQL and DAX, with working knowledge of Python and R.</li><li>Strong proficiency in Power BI.</li><li>Data modeling and visualization skills.</li><li>Strong problem-solving skills to address technical challenges and data quality issues.</li><li>Analytical skills with the capacity to analyze complex data problems and draw meaningful insights.</li></ul>
We are looking for a talented Data Engineer to join our team in Glendale, California. In this long-term contract role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and platforms that support critical business operations. Through collaboration with cross-functional teams, you will contribute to innovative data solutions that enhance decision-making processes and drive operational excellence.<br><br>Responsibilities:<br>• Develop, maintain, and optimize data pipelines to support the Core Data platform.<br>• Create tools and services to enhance data discovery, governance, and privacy.<br>• Collaborate with product managers, architects, and software engineers to ensure the success of data platforms.<br>• Apply technologies such as Airflow, Spark, Databricks, Delta Lake, and Kubernetes to build advanced data solutions.<br>• Establish and document best practices for pipeline configurations, naming conventions, and operational standards.<br>• Monitor and ensure the accuracy, reliability, and efficiency of datasets to meet service level agreements (SLAs).<br>• Participate in agile and scrum ceremonies to improve collaboration and team processes.<br>• Foster relationships with stakeholders to understand their needs and prioritize platform enhancements.<br>• Maintain detailed documentation to support data governance and quality initiatives.
We are looking for a skilled Data Engineer to join our team in Foxborough, Massachusetts, on a long-term contract basis. In this role, you will design, optimize, and maintain data pipelines and storage solutions, leveraging modern tools to ensure high performance and reliability. This position offers an exciting opportunity to collaborate across teams and implement cutting-edge practices in data engineering and analytics.<br><br>Responsibilities:<br>• Optimize Amazon Redshift performance by configuring distribution keys, sort keys, and fine-tuning queries.<br>• Develop and maintain robust data pipelines using AWS Glue and orchestrate workflows with Airflow.<br>• Manage semantic layers and metadata to support reliable analytics and AI-driven insights.<br>• Implement best practices for data partitioning, compression, and columnar storage formats.<br>• Monitor and troubleshoot data workflows to ensure high availability, reliability, and automated observability.<br>• Automate data processing tasks using Python and AWS native tools.<br>• Enforce data security and governance policies, including row- and column-level controls, using Lake Formation and AWS services.<br>• Oversee compliance monitoring and auditing through CloudWatch, CloudTrail, and similar tools.<br>• Continuously refine and improve data architecture by adopting emerging AWS best practices and patterns.<br>• Collaborate closely with Operations, Data Governance, and other teams to align with standards and achieve delivery objectives.
We are looking for a Senior Data Engineer to develop and optimize enterprise data systems that support analytics and digital solutions. In this role, you will design and implement robust data architectures, ensuring seamless data integration and transformation processes across the organization. Your expertise will drive the creation of reliable pipelines and scalable infrastructure, enabling advanced analytics and machine learning capabilities.<br><br>Responsibilities:<br>• Design and implement scalable data pipelines using Databricks, Spark, and Delta Lake to support enterprise-level analytics.<br>• Develop and maintain efficient data models tailored for AI, analytics, and operational systems.<br>• Lead Master Data Management initiatives to establish unified and accurate data records across platforms.<br>• Create batch and near-real-time data processing workflows for structured and semi-structured datasets.<br>• Collaborate with AI and software development teams to ensure delivery of high-quality datasets for machine learning.<br>• Define and enforce data architecture standards, ensuring scalability, reliability, and governance.<br>• Troubleshoot and optimize data systems to maintain performance and reliability in complex environments.<br>• Partner with cloud and IT teams to integrate modern data platforms and ensure seamless functionality.
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
<p>We are seeking a Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. This role will support data-driven decision-making by ensuring reliable data flow, transformation, and accessibility across the organization.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain ETL/ELT data pipelines</li><li>Develop and optimize data models and data architectures</li><li>Integrate data from multiple sources (APIs, databases, third-party systems)</li><li>Ensure data quality, integrity, and reliability</li><li>Collaborate with data analysts, data scientists, and business stakeholders</li><li>Monitor and troubleshoot data pipeline performance issues</li><li>Implement best practices for data governance and security</li></ul><p><br></p>
<p>We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and analytics solutions that support enterprise reporting and advanced dashboards. This role will work with cross‑cloud data sources, including SAP, GCP, and BigQuery, and partner closely with analytics and business teams to deliver high‑quality, analytics‑ready datasets powering BI and AI initiatives.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain data pipelines following <strong>Medallion Architecture (Bronze, Silver, Gold)</strong> best practices.</li><li>Develop and support ETL processes pulling data from <strong>SAP, Google Cloud Platform (GCP), and BigQuery</strong>.</li><li>Ensure high data quality, reliability, and performance across ingestion and transformation layers.</li><li>Support analytics and visualization teams by delivering clean, well‑modeled datasets for:<ul><li><strong>Power BI dashboards using DAX</strong></li><li><strong>Google Looker dashboards using LookML</strong></li></ul></li><li>Collaborate with stakeholders to understand data requirements and translate them into scalable data models.</li><li>Maintain documentation on data sources, transformations, and architecture.</li><li>Support AI and API‑driven initiatives, including planned usage of <strong>Google ADK for API integrations</strong>.</li></ul><p><br></p>
<p>We are looking for a talented Data Engineer to join our team in Fort Lauderdale, Florida. This long-term contract position offers the opportunity to work on cutting-edge technologies and contribute to the development of efficient data pipelines and processes. The ideal candidate will have a strong background in data engineering and a passion for delivering high-quality solutions that drive business success.</p><p><br></p><p>Responsibilities:</p><p>• Design and implement scalable data pipelines using Snowflake, Python, and other relevant tools.</p><p>• Collaborate with stakeholders to gather and refine data requirements, ensuring alignment with business needs.</p><p>• Develop and maintain data models to support analytics, reporting, and operational processes.</p><p>• Optimize data warehouse performance by tuning queries and managing resources effectively.</p><p>• Ensure data quality through rigorous testing and governance protocols.</p><p>• Implement security and compliance measures to protect sensitive data.</p><p>• Research and integrate emerging technologies to enhance system capabilities.</p><p>• Support ETL processes for data extraction, transformation, and loading.</p><p>• Work with technologies such as Apache Spark, Hadoop, and Kafka to manage and process large datasets.</p><p>• Provide technical guidance and support to team members and stakeholders.</p>
We are looking for an experienced Data Engineer to join our team on a long-term contract basis. Based in Houston, Texas, this role offers an exciting opportunity to work with cutting-edge data technologies, design scalable solutions, and contribute to data-driven decision-making processes. If you are passionate about optimizing data systems and driving innovation, we encourage you to apply.<br><br>Responsibilities:<br>• Develop, maintain, and optimize scalable data pipelines using Apache Spark and Python.<br>• Implement ETL processes to ensure seamless extraction, transformation, and loading of data across systems.<br>• Collaborate with cross-functional teams to integrate Apache Hadoop and Apache Kafka into the data architecture.<br>• Monitor and troubleshoot data systems to ensure reliability and performance.<br>• Design and maintain data models, ensuring alignment with business requirements.<br>• Conduct thorough testing and validation of data processes to guarantee accuracy.<br>• Document data workflows and processes for future reference and team collaboration.<br>• Provide technical guidance and support to team members on data engineering best practices.<br>• Stay current on emerging technologies and trends in big data and analytics.<br>• Contribute to improving data governance and security protocols.
We are looking for a skilled Data Engineer to join our team in Houston, Texas. This contract-to-permanent position offers an exciting opportunity to work at the intersection of data engineering, analytics, and business strategy. If you have a strong background in building and optimizing data pipelines and are passionate about leveraging technology to drive insights, we encourage you to apply.<br><br>Responsibilities:<br>• Design, develop, and optimize scalable data pipelines and workflows to support business analytics.<br>• Collaborate with cross-functional teams to gather and analyze data requirements.<br>• Implement ETL processes to extract, transform, and load data from diverse sources.<br>• Utilize tools such as Apache Spark and Hadoop to manage large-scale data processing.<br>• Integrate streaming data systems using Apache Kafka to enhance real-time analytics.<br>• Monitor and troubleshoot data flow and systems to ensure high performance and reliability.<br>• Develop and maintain documentation for data engineering processes and systems.<br>• Ensure data security and integrity across all platforms and processes.<br>• Work closely with stakeholders to translate business needs into technical solutions.<br>• Stay updated with industry trends and emerging technologies to improve data engineering practices.
<p>I’m building a world-class team to power our next generation of data products. We’re looking for a Senior Data Engineer who knows AWS inside and out, someone who can <strong>design secure, scalable data pipelines</strong>, <strong>own ETL/ELT workflows</strong>, <strong>engineer cloud data infrastructure</strong>, and <strong>deliver dimensional and semantic models</strong> that our analysts, data scientists, and applications can trust.</p><p>You’ll work closely with product, security, platform engineering, and analytics to move our architecture toward a <strong>real-time, governed, cost-aware</strong>, and <strong>highly automated</strong> data ecosystem.</p><p><strong>What You’ll Do</strong></p><ul><li><strong>Design & build end-to-end pipelines</strong> on AWS (batch and streaming) using services like <strong>Glue, EMR, Lambda, Step Functions, Kinesis, MSK</strong>, and <strong>Fargate</strong>.</li><li><strong>Develop robust ETL/ELT</strong> (PySpark, Spark SQL, SQL, Python) for structured, semi-structured, and unstructured data at scale.</li><li><strong>Own data storage & processing layers</strong>: <strong>S3 (Lake/Lakehouse), Redshift (or Snowflake on AWS), DynamoDB</strong>, and <strong>Athena</strong> with strong partitioning, compaction, and performance tuning.</li><li><strong>Implement data models</strong> (3NF, dimensional/star, Data Vault, Lakehouse medallion) for analytics and operational workloads.</li><li><strong>Engineer secure infrastructure-as-code</strong> with <strong>Terraform</strong> (or <strong>CDK</strong>) across multi-account setups; implement CI/CD via <strong>GitHub Actions</strong> or <strong>AWS CodeBuild/CodePipeline</strong>.</li><li><strong>Harden security & governance</strong>: use <strong>IAM</strong>, <strong>Lake Formation</strong>, <strong>KMS</strong>, <strong>Secrets Manager</strong>, <strong>VPC/PrivateLink</strong>, the <strong>Glue Data Catalog</strong>, and fine-grained access controls.
Partner with SecOps on compliance (e.g., <strong>SOC 2</strong>, <strong>FedRAMP</strong>, <strong>HIPAA</strong> depending on dataset).</li><li><strong>Observability & reliability</strong>: build monitoring with <strong>CloudWatch</strong>, <strong>OpenTelemetry</strong>, and data quality checks (e.g., <strong>Great Expectations</strong>, <strong>Deequ</strong>), implement SLOs and alerts.</li><li><strong>Champion best practices</strong>: code reviews, testing (unit/integration), documentation, runbooks, and blameless postmortems.</li><li><strong>Mentor</strong> mid-level engineers and collaborate on architectural decisions, standards, and technical roadmaps.</li></ul>
<p>We are looking for an experienced Data Engineer to join our team on a contract basis in Columbus, Ohio. In this role, you will take on a leadership position, driving the development and optimization of data pipelines that support enterprise-wide analytics and decision-making. You will also play a key role in mentoring team members, fostering collaboration, and ensuring the integrity and quality of data across various business functions.</p><p><br></p><p>Responsibilities:</p><p>• Design, develop, and maintain efficient data pipelines to support enterprise analytics and reporting.</p><p>• Collaborate with business analysts and data science teams to refine data requirements and ensure alignment with organizational goals.</p><p>• Enhance and automate data integration and management processes to improve operational efficiency.</p><p>• Lead efforts to ensure data quality by testing for accuracy, consistency, and conformity to business rules.</p><p>• Provide training and guidance to team members and other stakeholders on data pipelining and preparation techniques.</p><p>• Partner with data governance teams to promote vetted content into the curated data catalog for reuse.</p><p>• Stay current with emerging technologies and assess their impact on existing systems and processes.</p><p>• Offer leadership, coaching, and mentorship to team members, encouraging attention to detail in their professional development.</p><p>• Work closely with stakeholders to understand business needs and ensure solutions meet those requirements.</p><p>• Perform additional duties as assigned to support organizational objectives.</p>