We are looking for a skilled Data Engineer to support our organization's data initiatives in Savannah, Georgia. This contract-to-permanent role focuses on managing, optimizing, and securing data systems to drive strategic decision-making and improve overall performance. The ideal candidate will work closely with technology teams, analytics departments, and business stakeholders to ensure seamless data integration, accuracy, and scalability.<br><br>Responsibilities:<br>• Design and implement robust data lake and warehouse architectures to support organizational needs.<br>• Develop efficient ETL pipelines to process and integrate data from multiple sources.<br>• Collaborate with analytics teams to create and refine data models for reporting and visualization.<br>• Monitor and maintain data systems to ensure quality, security, and availability.<br>• Troubleshoot data-related issues and perform in-depth analyses to identify solutions.<br>• Define and manage organizational data assets, including SaaS tools and platforms.<br>• Partner with IT and security teams to meet compliance and governance standards.<br>• Document workflows, pipelines, and architecture for knowledge sharing and long-term use.<br>• Translate business requirements into technical solutions that meet reporting and analytics needs.<br>• Provide guidance and mentorship to team members on data usage and best practices.
We are looking for an experienced Data Engineer to join our team in Cincinnati, Ohio. This long-term contract position offers the opportunity to work on cutting-edge data engineering projects while collaborating with multidisciplinary teams to deliver high-quality solutions. The ideal candidate will have a strong background in Databricks and big data technologies, along with a passion for optimizing data processes and systems.<br><br>Responsibilities:<br>• Design, build, and enhance data pipelines using Databricks Runtime, Delta Lake, Autoloader, and Structured Streaming.<br>• Implement secure and governed data access protocols utilizing Unity Catalog, workspace controls, and audit configurations.<br>• Manage and integrate structured and unstructured data from diverse sources, including APIs and cloud storage.<br>• Develop and maintain notebook-based workflows and manage jobs using Databricks Workflows and Jobs.<br>• Apply best practices for performance tuning, scalability, and cost optimization in Databricks environments.<br>• Collaborate with data scientists, analysts, and business stakeholders to deliver clean and reliable datasets.<br>• Support continuous integration and deployment processes for Databricks jobs and system configurations.<br>• Ensure high standards of data quality and security across all engineering tasks.<br>• Troubleshoot and resolve issues to maintain operational efficiency in data pipelines.
<p>IMMEDIATE HIRE NEEDED. Interviews to begin the first week of February.</p><p><br></p><p>We are looking for a skilled Snowflake Marketing Data Engineer to join our team in Tampa, Florida on a hybrid in-office schedule (2 to 3 days remote per week). Hybrid candidates are preferred, but fully remote candidates may be considered depending on the strength of the match.</p><p><br></p><p>In this role, you will be responsible for designing, implementing, and maintaining data solutions that support critical business operations. Your expertise will play a key part in driving data-driven decisions and optimizing performance across various platforms.</p><p><br></p><p>Responsibilities:</p><p>• Develop and maintain ETL processes to efficiently extract, transform, and load data from multiple sources.</p><p>• Analyze marketing data to uncover insights and support strategic decision-making.</p><p>• Create and manage dashboards and reports using Power BI to visualize data effectively.</p><p>• Integrate and leverage tools like Braze and Google Analytics to enhance data tracking and reporting capabilities.</p><p>• Collaborate with cross-functional teams to ensure the accuracy and reliability of data systems.</p><p>• Optimize database performance and troubleshoot any issues related to data pipelines.</p><p>• Document data workflows and provide training to stakeholders on best practices.</p><p>• Work with cloud-based platforms, such as Snowflake, to store and manage large datasets.</p><p>• Ensure data security and compliance with company policies and standards.</p>
We are looking for an experienced Data Engineer to join our team in New York, New York. In this role, you will design, build, and maintain data infrastructure to support business intelligence and analytics needs. The ideal candidate will have a strong technical background, a passion for working with complex datasets, and expertise in cloud-based data platforms.<br><br>Responsibilities:<br>• Develop, implement, and optimize ETL pipelines to ensure efficient data processing and integration.<br>• Design and maintain scalable data solutions, including data warehouses and data lakes.<br>• Collaborate with cross-functional teams to identify data requirements and deliver actionable insights.<br>• Utilize Snowflake, AWS, and other cloud-based platforms to manage data infrastructure and ensure performance optimization.<br>• Leverage Python and SQL to build robust data workflows and automate processes.<br>• Employ orchestration tools like Airflow and dbt to streamline data operations.<br>• Support data analytics and visualization efforts by enabling the creation of impactful dashboards using tools such as Tableau.<br>• Work with marketing and product data sources, including platforms like Google Analytics, to extract and integrate valuable insights.<br>• Implement CI/CD pipelines and DevOps practices to enhance data engineering processes.<br>• Ensure data security and compliance across all systems and tools.
<p>The Senior Data Engineer plays a key role in architecting, developing, and operating reliable, production-ready data solutions that enable analytics, automation, and operational processes across our client’s organization.</p><p><br></p><p>Operating within a modern, cloud-based data ecosystem, this role is responsible for bringing together data from internal platforms and external partners, transforming it into trusted, high-quality assets, and delivering it consistently to downstream users and systems. The work spans the full data lifecycle—ingestion, orchestration, transformation, and delivery—and blends advanced SQL development with Python-based pipeline and workflow automation.</p><p><br></p><p>This role sits at the intersection of data and systems engineering and works closely with Business Intelligence, Business Technology, and operational teams to ensure data solutions are scalable, dependable, and aligned with real business outcomes.</p>
We are looking for an experienced Data Engineer to join our dynamic team in Mayville, Wisconsin. In this role, you will play a key part in developing and enhancing reporting and analytics solutions within a modern data environment. The ideal candidate is passionate about transforming complex data into actionable insights, improving processes, and creating reliable reporting systems. This is a long-term contract position offering the opportunity to make a meaningful impact within a collaborative and forward-thinking team.<br><br>Responsibilities:<br>• Design, develop, and maintain scalable data pipelines to support reporting and analytics needs.<br>• Create and optimize Power BI dashboards and reports to deliver accessible and trustworthy insights.<br>• Automate workflows using Power Automate to improve operational efficiency.<br>• Develop scripts using languages such as PowerShell or Python to streamline data processing tasks.<br>• Integrate and manage data sources including Oracle, Snowflake (hosted within Azure), and other enterprise systems.<br>• Collaborate with stakeholders to gather requirements and deliver customized solutions.<br>• Support the transition to cloud-based data environments, including Azure Data Warehouse and Fabric.<br>• Troubleshoot and resolve data-related issues, ensuring data integrity and reliability.<br>• Document processes and workflows to ensure clarity and maintainability.<br>• Stay updated on industry trends to recommend and implement innovative data solutions.
We are looking for an experienced Data Engineer to join our team in Jacksonville, Florida. In this role, you will take the lead in designing and building a cutting-edge Azure lakehouse platform that enables business leaders to access analytics through natural language queries. This position combines hands-on technical expertise with leadership responsibilities, offering an opportunity to mentor a team of skilled engineers while driving innovation.<br><br>Responsibilities:<br>• Architect and develop a robust Azure lakehouse platform, utilizing Azure Data Lake Gen2, Delta Lake, and PySpark to create efficient data pipelines.<br>• Implement a semantic layer and metric store to ensure consistent data translation and definitions across the organization.<br>• Design and maintain real-time and batch data pipelines, incorporating medallion architecture, schema evolution, and data contracts.<br>• Build retrieval systems for large language models (LLMs) using Azure OpenAI and vectorized Delta tables to support chat-based analytics.<br>• Ensure data quality, lineage, and observability through tools like Great Expectations and Unity Catalog, while optimizing costs through partitioning and compaction.<br>• Develop automated systems for anomaly detection and alerting using Azure ML pipelines and Event Grid.<br>• Collaborate with product and operations teams to translate complex business questions into actionable data models and queries.<br>• Lead and mentor a team of data and Python engineers, establishing best practices in CI/CD, code reviews, and documentation.<br>• Ensure compliance with security, privacy, and governance standards by designing and implementing robust data handling protocols.
We are looking for a skilled Data Engineer to join our team in Wyoming, Michigan. This contract-to-permanent role offers an exciting opportunity to design, manage, and optimize data architecture and engineering solutions across a dynamic healthcare organization. The ideal candidate will play a key role in ensuring efficient data governance and infrastructure performance while collaborating with cross-functional teams.<br><br>Responsibilities:<br>• Develop and maintain robust data architectures and frameworks, including relational and graph databases, to meet business objectives.<br>• Create and manage data pipelines to extract, transform, and load data from various sources into data warehouses.<br>• Ensure data governance policies are implemented and monitored, including retention and backup protocols.<br>• Collaborate with teams across departments to translate business requirements into technical specifications.<br>• Monitor and optimize the performance of data assets, identifying opportunities for improvement.<br>• Design scalable and secure data solutions using cloud-based platforms like AWS and Microsoft Azure.<br>• Implement advanced tools and technologies, such as AI, to enhance data analytics and processing capabilities.<br>• Mentor and support team members by sharing technical expertise and providing guidance.<br>• Establish key performance indicators (KPIs) to measure database performance and drive continuous improvement.<br>• Stay up to date with emerging trends and advancements in data engineering and architecture.
We are looking for a highly skilled Data Scientist to contribute to a long-term contract position within the healthcare industry. This role focuses on supporting the enterprise-wide launch of Power BI by creating and delivering engaging, high-quality learning materials. The ideal candidate will work remotely, collaborating closely with leadership and subject matter experts to empower analytics and non-analytics professionals to efficiently use Power BI in their daily tasks.<br><br>Responsibilities:<br>• Develop scalable learning experiences tailored to diverse user personas and varying levels of technical expertise.<br>• Collaborate with the data literacy program team and Power BI specialists to ensure instructional content aligns with program objectives.<br>• Translate complex concepts related to Power BI and business intelligence into accessible and engaging educational materials.<br>• Design and deliver training programs using instructional design best practices and tools such as Camtasia, Adobe Creative Suite, or Articulate.<br>• Conduct user interviews to understand learning challenges and tailor content to meet specific needs.<br>• Enhance or create new data literacy resources, such as courses, modules, and curricula, to address emerging needs and best practices.<br>• Evaluate and adapt existing educational materials to make them sustainable and applicable across the organization.<br>• Participate in marketing efforts for the Data Literacy Program, including speaking engagements, blog posts, and other creative channels.<br>• Identify opportunities for new program initiatives that support analytics tools and data literacy.<br>• Serve as a subject matter expert in data literacy on national platforms through networking and conference participation.
We are looking for an experienced Data Engineering Manager to lead the strategic development and management of our enterprise data warehouse in Columbus, Ohio. This position combines technical expertise with leadership responsibilities to ensure data assets are efficiently structured, integrated, and utilized for operational processes, analytics, compliance, and external partnerships. The ideal candidate will drive innovation while maintaining robust data architecture standards to support the organization's long-term goals.<br><br>Responsibilities:<br>• Oversee the design, implementation, and optimization of the enterprise data warehouse and associated reporting systems.<br>• Ensure seamless data integration between source systems, analytics platforms, and reporting tools to maintain accuracy and reliability.<br>• Collaborate with various teams to align data structures and solutions with organizational objectives.<br>• Provide strategic direction for data architecture and recommend scalable solutions aligned with industry best practices.<br>• Develop and enforce standards for enterprise reporting, key performance indicators, and consistent data definitions.<br>• Promote uniformity in business rules and metric calculations across departments to ensure credible and authoritative data outputs.<br>• Review and validate data workflows, transformations, and reports to ensure completeness and accuracy.<br>• Identify and implement system improvements to enhance the functionality and efficiency of data platforms.<br>• Address and resolve issues related to data integrity or reporting disruptions, ensuring minimal downtime.<br>• Mentor team members and provide technical guidance to build a highly skilled and capable team.
<p>We are looking for a skilled Data Analyst / Engineer to join our team on a contract basis remotely. This role focuses on financial data processing, automation, and reporting within a dynamic environment. The ideal candidate will excel at managing data workflows, automating manual processes, and delivering accurate insights to support business decisions.</p><p><br></p><p>Responsibilities:</p><p>• Extract and reconcile financial data from multiple databases, ensuring accuracy and consistency across accounts receivable, accounts payable, and the general ledger.</p><p>• Automate manual reporting processes by developing repeatable daily and month-end pipelines for reliable and auditable data.</p><p>• Design and oversee data workflows across development, production, and utility databases, ensuring secure and efficient access.</p><p>• Create and deliver advanced Excel-based reports using macros, formulas, and Power Query to enhance usability for finance teams.</p><p>• Implement data validation and snapshot techniques to support reconciliation and decision-making processes.</p><p>• Ensure the traceability and accuracy of financial data by establishing robust controls and audit mechanisms.</p><p>• Collaborate with stakeholders to understand reporting requirements and translate them into scalable solutions.</p><p>• Utilize expertise in SQL and Teradata Data Warehouse to optimize database objects and queries for performance.</p><p>• Develop and maintain documentation for automated processes and data workflows to ensure clarity and continuity.</p>
We are looking for an experienced Data Architect to design and implement cutting-edge data solutions that meet the evolving needs of our enterprise. This role involves building secure, scalable, and high-performing data platforms while leveraging modern technologies and aligning with organizational goals. The ideal candidate will have expertise in cloud-based architecture, data governance, and advanced analytics, driving innovation across diverse business functions.<br><br>Responsibilities:<br>• Develop comprehensive data architecture strategies for advanced analytics and big data solutions using Azure Databricks.<br>• Design and implement Databricks Delta Lake-based Lakehouse architecture, utilizing PySpark Jobs, Databricks Workflows, Unity Catalog, and Medallion architecture.<br>• Optimize and configure Databricks clusters, notebooks, and workflows to ensure efficiency and scalability.<br>• Integrate Databricks with Azure services such as Azure Data Lake Storage, Azure Data Factory, Azure Key Vault, and Microsoft Fabric.<br>• Establish and enforce best practices for data governance, security, and cost management.<br>• Collaborate with data engineers, analysts, and business stakeholders to translate functional requirements into robust technical solutions.<br>• Provide technical mentoring and leadership to team members focused on Databricks and Azure technologies.<br>• Monitor, troubleshoot, and enhance data pipelines and workflows to maintain reliability and performance.<br>• Ensure compliance with organizational and regulatory standards regarding data security and privacy.<br>• Document configurations, processes, and governance standards to support long-term scalability and usability.
<p><strong>Overview</strong></p><p>We are seeking a Senior Data Engineer to support a major Salesforce Phase 2 data migration initiative. This role will focus heavily on building and optimizing data pipelines, developing ETL workflows, and moving CRM data from Salesforce into Databricks.</p><p>The engineer will work closely with a senior team member, contribute to Scrum ceremonies, and play a key role in developing the core CRM data environment used by the advertising organization.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Data Engineering & Migration</strong></p><ul><li>Develop ETL jobs that move and transform Salesforce data into Databricks.</li><li>Build, test, and maintain high‑volume data pipelines across AWS + Databricks.</li><li>Perform data migration, data integration, and pipeline development (including Mulesoft-related work).</li><li>Ensure all pipelines are reliable, scalable, and optimized for production.</li></ul><p><strong>Development & Infrastructure</strong></p><ul><li>Use Python and PySpark to build ETL components and transformation logic.</li><li>Leverage Spark/PySpark for distributed processing at scale (must‑have).</li><li>Use Terraform to provision and manage cloud infrastructure.</li><li>Set up CI/CD pipelines using Concourse or GitHub Actions for automated deployments.</li></ul><p><strong>Quality, Documentation & Support</strong></p><ul><li>Document ETL processes, pipelines, and data flows.</li><li>Participate in testing, QA, and validation of migrated datasets.</li><li>Provide post‑delivery support and proactively mitigate project risks or single points of failure (SPOF).</li><li>Troubleshoot production issues and implement long‑term fixes to maintain pipeline stability.</li></ul><p><strong>Collaboration</strong></p><ul><li>Work closely with engineering teammates to translate business requirements into working pipelines.</li><li>Participate in weekly Scrum ceremonies.</li><li>Contribute to shared best practices and 
continuous improvement across the data engineering team.</li></ul><p><br></p>
<p>We are looking for an experienced Data Analyst to join our team on a long-term contract basis for a global finance firm. This role is fully remote. This role requires someone with strong attention to detail who excels in data reconciliation, fraud investigation, and analytics. You will play a key role in analyzing data for accuracy and identifying potential discrepancies or fraudulent activities.</p><p><br></p><p><strong><u>Responsibilities:</u></strong></p><p>• Conduct regular account reconciliations to ensure data consistency and accuracy.</p><p>• Investigate suspected fraudulent activities using advanced analytics tools.</p><p>• Perform in-depth data analysis to identify trends and anomalies.</p><p>• Utilize VLOOKUP and other Excel functions to organize and interpret complex datasets.</p><p>• Collaborate with cross-functional teams to address data discrepancies and improve processes.</p><p>• Develop and implement anti-fraud strategies based on identified risks.</p><p>• Maintain detailed documentation of findings and reconciliation processes.</p><p>• Provide actionable insights to support decision-making and enhance operational efficiency.</p><p>• Ensure compliance with relevant regulations and standards during data analysis and investigations.</p>
We are looking for a skilled Data Analyst to join our team on a long-term contract basis in Cincinnati, Ohio. In this role, you will leverage your expertise to analyze data patterns, identify potential fraudulent activities, and support investigations with actionable insights. This position offers an excellent opportunity to contribute to fraud prevention and detection strategies while working with diverse datasets.<br><br>Responsibilities:<br>• Analyze large datasets to identify trends, anomalies, and potential fraudulent activities.<br>• Develop and implement fraud detection models and analytics tools.<br>• Collaborate with fraud investigation teams to provide data-driven insights and recommendations.<br>• Monitor and assess data for suspected fraudulent behavior, ensuring timely identification and reporting.<br>• Create detailed reports and visualizations to support anti-fraud initiatives.<br>• Enhance existing fraud prevention strategies through continuous data analysis.<br>• Conduct regular audits of data systems to ensure accuracy and reliability.<br>• Work closely with cross-functional teams to improve fraud detection capabilities.<br>• Maintain and safeguard data integrity throughout all analytical processes.
<p><strong>Data Engineer (Python / AWS)</strong></p><p><strong>Location:</strong> Remote (Northeast / Greater Boston area preferred)</p><p><strong>Type:</strong> Full-Time</p><p><strong>Level:</strong> Mid-to-Senior Individual Contributor</p><p><strong>About the Role</strong></p><p>We are looking for a strong individual contributor who excels in the Python data ecosystem and enjoys building reliable, scalable data pipelines. This role sits within a data engineering group responsible for integrating large volumes of data from external partners and transforming it into usable datasets for internal teams. You’ll work with modern cloud tools while also helping our team gradually transition away from a legacy platform.</p><p>This position is ideal for someone who wants to stay hands-on, focus on technical execution, and remain in an IC role for the next several years. We’re not looking for someone who is aiming to move immediately into architecture or leadership.</p><p>This team is fully distributed, and although candidates in the Boston area can go into the office, the rest of the group is remote. 
Anyone local may occasionally sit with other teams when on site.</p><p><br></p><p><strong>What You’ll Do</strong></p><ul><li>Build and maintain ETL pipelines that ingest, clean, and aggregate data received from external vendors and large enterprise partners.</li><li>Develop Python‑based data processing workflows deployed on AWS cloud services.</li><li>Work with tools such as AWS Glue, Airflow, dbt, and PySpark to support data transformations and pipeline orchestration.</li><li>Help modernize existing workflows and assist in the gradual migration away from a legacy data system.</li><li>Collaborate with internal stakeholders to understand data needs, define requirements, and ensure reliable integration of partner data feeds.</li><li>Troubleshoot pipeline issues, optimize performance, and improve overall system stability.</li><li>Contribute to best practices around code quality, testing, documentation, and data governance.</li></ul><p><br></p>
We are looking for an experienced Data/Information Architect to join our team in Philadelphia, Pennsylvania. In this long-term contract position, you will play a crucial role in designing and implementing data architecture solutions that support organizational goals. This opportunity is ideal for professionals passionate about building robust data frameworks and contributing to the healthcare industry.<br><br>Responsibilities:<br>• Develop and implement comprehensive data architecture strategies to support business objectives.<br>• Design and maintain data models using tools such as Erwin Data Modeler, Toad Data Modeler, and SQL.<br>• Collaborate with stakeholders to optimize data management processes and ensure seamless integration across platforms.<br>• Create and manage digital file systems, ensuring proper organization and accessibility.<br>• Provide expertise in database systems including SQL Server, Oracle, DB2, and Teradata.<br>• Utilize Python and SQL to develop scripts and automate data processing workflows.<br>• Ensure the accuracy and compliance of legal documentation within data systems.<br>• Work with Epic Software and AEM Architect to align data solutions with healthcare requirements.<br>• Perform data analysis to identify trends and improve system performance.<br>• Use Office tools to document processes and communicate findings effectively.
<p>We are seeking a <strong>Mid–Senior Level Full Stack Developer</strong> to support and maintain a suite of existing enterprise applications. This role will focus on <strong>stabilizing, upgrading, and maintaining legacy systems</strong> that are currently in “keep the lights on” mode.</p><p>These applications will eventually be sunset, so the work will primarily involve <strong>maintenance, upgrades, and support of existing functionality rather than new product development or greenfield engineering.</strong></p><p>The ideal candidate is comfortable working within <strong>established systems</strong>, addressing <strong>technical debt, upgrades, and out-of-support components</strong>, and collaborating in an <strong>Agile environment</strong>.</p><p>Key Responsibilities</p><ul><li>Maintain and support existing enterprise applications</li><li>Implement <strong>system upgrades and updates</strong> to ensure applications remain stable and secure</li><li>Address <strong>out-of-support technologies and compatibility changes</strong></li><li>Troubleshoot and resolve issues in legacy application environments</li><li>Participate in <strong>Agile / SAFe ceremonies</strong>, including stand-ups, ticket management, and story refinement</li><li>Collaborate with development teams to ensure continuity of application functionality</li><li>Contribute primarily to <strong>back-end development</strong>, while still supporting full stack functionality as needed</li></ul><p>Required Qualifications</p><ul><li><strong>Mid–Senior level developer experience</strong> (not a Lead or Architect role)</li><li>Strong experience with <strong>TypeScript</strong></li><li>Strong experience with <strong>.NET (4.6 – current versions)</strong></li><li>Experience working with <strong>legacy ASP.NET applications</strong></li><li>Experience with <strong>Python</strong> (some flexibility; willingness to work in Python environments)</li><li>Experience working within <strong>Agile or SAFe 
frameworks</strong></li><li>Ability to maintain and enhance <strong>legacy or enterprise systems</strong></li></ul><p>Nice to Have</p><ul><li>Experience modernizing or supporting <strong>applications approaching end-of-life</strong></li><li>Strong debugging and troubleshooting skills in complex environments</li><li>Experience working on <strong>backend-heavy workloads within full stack applications</strong></li></ul><p><br></p>
<p>Robert Half is seeking a <strong>Full-Stack Developer </strong>to support a manufacturing organization based in Oregon. This role blends hands-on day-to-day maintenance and direct collaboration with business stakeholders, focusing on designing, building, and maintaining secure, scalable, full-stack .NET applications using modern frameworks and best practices.</p><p><br></p><p><strong><u>Job Details </u></strong></p><p><strong>Duration: </strong>4-month contract with potential to extend or convert </p><p><strong>Location: </strong>100% Remote - Must work PST timezone </p><p><strong>Schedule: </strong>Monday - Friday Core Business Hours (PST)</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design, develop, test, and maintain full-stack applications using <strong>C#, modern .NET, and ASP.NET</strong>.</li><li>Build and consume <strong>RESTful APIs</strong>, ensuring proper security, versioning, pagination, and error handling.</li><li>Develop responsive front-end interfaces using <strong>JavaScript/TypeScript, HTML5, CSS3</strong>, and modern frameworks.</li><li>Build applications using <strong>Blazor (Server and/or WebAssembly)</strong> and <strong>Telerik UI components</strong>.</li><li>Design, develop, and maintain <strong>SQL Server databases</strong>, including <strong>T-SQL</strong>, stored procedures, indexing, and performance tuning.</li><li>Partner directly with business users to analyze requirements, recommend solutions, and translate needs into technical designs.</li><li>Document solutions clearly from both <strong>technical</strong> and <strong>end-user</strong> perspectives.</li><li>Apply <strong>SOLID principles</strong>, common design patterns, <strong>Git-based workflows</strong>, and <strong>Agile development practices</strong>.</li></ul><p><br></p>
We are looking for a Senior Database Engineer to provide expert technical leadership for our global, cloud-based data infrastructure. This role involves designing, operating, and optimizing scalable, secure, and resilient database systems to support enterprise-scale workloads across AWS and Azure. As a contract position with the possibility of becoming permanent, it offers an excellent opportunity to contribute to the development of cutting-edge database solutions while collaborating with cross-functional teams.<br><br>Responsibilities:<br>• Design and manage multi-region database architectures across AWS and Azure to support geo-distributed workloads.<br>• Architect and maintain relational, NoSQL, and document databases such as Snowflake, PostgreSQL, DynamoDB, Cosmos DB, and MongoDB.<br>• Lead hands-on database migrations between cloud platforms and legacy systems with a focus on scalability and reliability.<br>• Implement indexing strategies, optimize queries, and establish scaling patterns for handling large datasets and real-time applications.<br>• Enhance database performance to ensure high availability, low latency, and cost efficiency at an enterprise level.<br>• Support and refine data ingestion workflows and pipeline integrations using tools like AWS Glue, Step Functions, Lambda, and Azure Data Factory.<br>• Collaborate with Data Engineering teams to develop streaming solutions using Kafka, Kinesis, and AWS services.<br>• Apply robust security measures, including encryption, access controls, and secrets management, to protect database systems.<br>• Develop disaster recovery strategies and maintain backup solutions to ensure data integrity and availability.<br>• Monitor database systems using tools like CloudWatch, Azure Monitor, and Datadog, ensuring optimal reliability and performance.
We are looking for a Senior Database Engineer to take on a critical role in shaping the future of our global data platform. In this position, you will lead technical strategy, architect robust multi-cloud systems, and oversee initiatives to ensure reliability, scalability, and cost efficiency. You will take a hands-on approach, providing mentorship and collaborating with leadership to drive impactful technical decisions. This is a contract opportunity with the potential for a permanent position, located in Lehi, Utah.<br><br>Responsibilities:<br>• Develop and execute the technical roadmap for a scalable and reliable data infrastructure.<br>• Architect and implement multi-region, cross-account data platforms to support global operations.<br>• Establish and enforce engineering standards for database design, data pipelines, reliability, and observability.<br>• Lead post-incident reviews and implement solutions to prevent recurring issues.<br>• Collaborate with product and engineering teams to identify technical risks and optimize roadmaps.<br>• Design and oversee large-scale data migrations, ensuring fault tolerance and self-healing capabilities.<br>• Optimize database performance through indexing, query tuning, and capacity planning.<br>• Implement robust security measures, including encryption, secrets management, and access controls.<br>• Partner with cross-functional teams to align business requirements with technical solutions.<br>• Provide hands-on leadership in developing critical systems and resolving complex production incidents.
We are looking for an experienced Senior Data Scientist to join our dynamic team in Boston, Massachusetts. In this role, you will leverage your expertise in statistical modeling, machine learning, and cloud-based analytics to drive impactful decisions and solutions. The ideal candidate will bring a strong technical background, a passion for working with regulated data, and a commitment to ethical AI practices.<br><br>Responsibilities:<br>• Develop and implement advanced statistical models and machine learning algorithms to solve complex business problems.<br>• Monitor and evaluate the performance of AI models, ensuring reliability, fairness, and compliance with ethical standards.<br>• Collaborate with engineering and product teams to translate data-driven insights into actionable strategies.<br>• Utilize cloud-based tools such as AWS SageMaker and Redshift to design and deploy scalable analytics solutions.<br>• Handle sensitive healthcare or clinical trial datasets while adhering to strict data privacy and security regulations.<br>• Conduct exploratory data analysis and create visualizations to communicate findings effectively.<br>• Build and optimize ETL pipelines for efficient data transformation and integration.<br>• Apply Bayesian statistics and time-series forecasting techniques to improve predictive accuracy.<br>• Maintain comprehensive documentation of data science workflows and processes.<br>• Stay updated on industry trends and advancements to continuously enhance methodologies and tools.
We are looking for an experienced Senior Data Engineer with a strong background in Python and modern data engineering tools to join our team in West Des Moines, Iowa. This is a long-term contract position that requires expertise in designing, building, and optimizing data pipelines and working with cloud-based data warehouses. If you thrive in a collaborative environment and have a passion for transforming raw data into actionable insights, we encourage you to apply.<br><br>Responsibilities:<br>• Develop, debug, and optimize Python-based data pipelines using frameworks such as Flask, Django, or FastAPI.<br>• Design and implement data transformations in a data warehouse using tools like dbt, ensuring high-quality analytics-ready datasets.<br>• Utilize Amazon Redshift and Snowflake for managing large-scale data storage and performing advanced querying and optimization.<br>• Automate data integration processes using platforms like Fivetran and orchestration tools such as Prefect or Airflow.<br>• Build reusable and maintainable data models to improve performance and scalability for analytics and reporting.<br>• Conduct data analysis and visualization using Python libraries such as NumPy and Pandas, and build models with machine learning frameworks such as TensorFlow and PyTorch.<br>• Manage version control for data engineering projects using Git and GitHub.<br>• Ensure data quality through automated testing and validation processes.<br>• Document workflows, code, and data transformations following best practices for readability and maintainability.<br>• Optimize cloud-based data warehouse and lake platforms for performance and integration of new data sources.
<p>We are looking for an experienced Data Analyst III to join our team. In this role, you will apply advanced mathematical and data modeling techniques to deliver insightful business analyses and recommendations. You will collaborate with multiple business groups and senior stakeholders to drive informed decision-making and enhance processes. This is a long-term contract position offering an exciting opportunity to work on complex projects and influence strategic outcomes.</p><p><br></p><p>Responsibilities:</p><p>• Conduct detailed analyses to identify trends and provide actionable recommendations for business solutions.</p><p>• Summarize and present findings through reports, charts, and presentations to stakeholders.</p><p>• Develop and refine analytical models to support future business decisions.</p><p>• Collaborate with business teams to gather requirements and design effective data analysis strategies.</p><p>• Retrieve, verify, and prepare data from various sources for accurate reporting.</p><p>• Create advanced queries and tools to simplify data management and reporting processes.</p><p>• Forecast outcomes and analyze trends to support strategic planning and process improvements.</p><p>• Act as a liaison between departments, providing data-driven insights and answering queries about business processes.</p><p>• Mentor and guide less experienced team members, assigning tasks and ensuring project deliverables.</p><p>• Support cross-functional projects and provide input to external groups, vendors, or agencies as needed.</p>
We are looking for a Senior Engineer to join our team on a long-term contract basis in Appleton, Wisconsin. In this role, you will leverage your technical expertise to design and implement software solutions that align with organizational strategies. You will play a pivotal role in driving innovation, ensuring system scalability, and collaborating with cross-functional teams to deliver impactful outcomes.<br><br>Responsibilities:<br>• Develop and implement software solutions that meet business-critical needs, applying sound engineering practices.<br>• Collaborate with team members to establish and refine engineering standards, templates, and frameworks.<br>• Utilize technical expertise to solve complex problems and remove roadblocks, ensuring smooth project execution.<br>• Drive automation and process improvements to enhance system stability, scalability, and resilience.<br>• Actively participate in product planning, helping teams break down and prioritize work effectively.<br>• Promote the use of DevOps principles, including CI/CD pipelines, to improve deployment and build processes.<br>• Evaluate and select technology vendors, contributing to proof-of-concept initiatives.<br>• Share knowledge and mentor other engineers to foster growth within the team.<br>• Continuously assess and adopt new technologies to maintain cutting-edge software solutions.<br>• Provide technical guidance and support for subsystem maintenance and troubleshooting.