<p>We are looking for an experienced Data Architect to join our team on a long-term contract basis in Cleveland, Ohio. This role involves designing scalable enterprise data platforms, ensuring data quality, and implementing robust data governance frameworks. You will play a pivotal role in leveraging Azure services and AI-driven analytics to optimize data architecture and enhance operational insights.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement enterprise-wide data architectures and canonical data models.</p><p>• Establish data ownership protocols, governance standards, and quality benchmarks.</p><p>• Analyze and stabilize data pipelines across distributed systems and platforms.</p><p>• Perform detailed data analysis and reconciliation to identify and resolve integrity issues.</p><p>• Design and implement monitoring tools to validate and improve data quality.</p><p>• Enhance data observability and lineage tracking to streamline governance processes.</p><p>• Utilize AI-driven analytics and automation to detect anomalies and accelerate decision-making.</p><p>• Collaborate with engineering teams to align data architecture with integration services and platform requirements.</p><p>• Optimize event-driven and distributed data systems for scalability and reliability.</p><p>• Conduct hands-on work with Azure services, such as Azure Data Factory and Synapse, to implement solutions.</p>
We are looking for an experienced IT Site & ERP Administrator to oversee and enhance the management of ERP systems and IT operations at our Solon, Ohio location. In this role, you will ensure system integrity, lead IT-related projects, and provide specialized user support. The ideal candidate will possess strong technical expertise and a proactive approach to problem-solving in a dynamic, regulated environment.<br><br>Responsibilities:<br>• Administer and manage the local ERP system (ERP BusinessOne), ensuring smooth operation and reliability.<br>• Diagnose, address, and resolve technical issues, including system errors and bugs, while documenting resolutions for future reference.<br>• Translate business needs into system configurations through effective communication and a structured change management process.<br>• Provide training and share expertise with users to support their roles and enhance system utilization.<br>• Develop, maintain, and improve data reports to meet business requirements.<br>• Implement new system configurations and update existing ones to optimize performance.<br>• Manage user access rights by defining and maintaining permissions.<br>• Lead and execute local IT projects aligned with the organization’s global IT strategy.<br>• Oversee IT adherence to best practices and compliance with procedures in a regulated manufacturing environment.<br>• Collaborate with external providers for issue escalation and resolution when necessary.
<p><strong>Overview</strong></p><p>Reporting to the Manager, IT Network Infrastructure, the IT Network Security Admin is responsible for providing technical assistance and support related to the client's network systems on a 24/7 basis.</p><p> </p><p><strong>Responsibilities</strong></p><p>· Maintain and administer network security tools</p><p>· Perform escalated troubleshooting for network issues</p><p>· Configure and install various network security devices and services</p><p>· Manage all SLAs</p><p>· Perform maintenance and upgrades for security configurations and end-of-life equipment cycles</p><p>· Communicate all updates to customers</p><p>· Work within the established configuration and change management system to ensure awareness and approval policies are correctly upheld</p><p>· Work closely with selected vendors on projects and new technology implementations</p><p>· Perform deeper troubleshooting using diagnostic tools and techniques</p><p>· Determine the best solution for each specific issue</p><p>· Escalate unresolved issues to next-level support</p><p>· Perform daily maintenance and reporting</p><p>· Perform additional duties as required</p><p><br></p>
We are looking for a skilled Salesforce Administrator to oversee the strategy, implementation, and ongoing management of the Salesforce platform. In this role, you will collaborate with stakeholders to translate business needs into technical solutions while ensuring the system operates smoothly and efficiently. This position requires a proactive individual who can manage the product roadmap, execute hands-on configurations, and optimize platform capabilities.<br><br>Responsibilities:<br>• Act as the primary administrator for Salesforce, handling configurations such as objects, workflows, automations, permissions, and integrations.<br>• Collaborate with stakeholders to gather requirements, prioritize enhancements, and develop tailored solutions to meet business objectives.<br>• Plan and maintain the Salesforce product roadmap, ensuring successful releases and adoption across departments.<br>• Create and maintain training materials, documentation, and best practices to support system users.<br>• Monitor data integrity, system performance, and compliance with security standards.<br>• Troubleshoot technical issues, manage support tickets, and provide ongoing assistance to users.<br>• Identify opportunities for platform optimization and lead improvement initiatives.<br>• Facilitate cross-functional communication to align Salesforce capabilities with organizational goals.<br>• Ensure consistent updates and upgrades align with company needs and industry standards.
<p>We are seeking an experienced NetSuite ERP Administrator to support, maintain, and optimize the NetSuite ERP environment within a manufacturing organization. This role partners closely with accounting, operations, and other business units to ensure the ERP system effectively supports material planning, financial processes, and reporting needs. The position also provides light IT systems support related to user access and device connectivity.</p><p><br></p><p>Key NetSuite Responsibilities</p><p>· Administer, configure, and support NetSuite ERP, ensuring system stability and data integrity</p><p>· Collaborate with accounting, manufacturing, and operations teams to support material management, planning, and financial workflows</p><p>· Manage user roles, permissions, and access within NetSuite</p><p>· Develop and maintain reports and queries using SQL and NetSuite reporting tools</p><p>· Troubleshoot ERP-related issues and coordinate with internal teams or vendors as needed</p><p>· Support integration points and system enhancements (SPS Commerce is used for these processes)</p><p>· Familiarity with the BlendApps Bartender Label Integration is a plus</p><p><br></p><p>Other Responsibilities</p><p>We are a lean team and we all pitch in where needed.</p><p>· Provide basic Microsoft Active Directory support, including user account management</p><p>· Assist with endpoint management when needed</p><p>· Assist with printer and access-related issues as they relate to system users</p>
We are looking for an experienced Epicor Database Developer to join our team in New Orleans, Louisiana. The ideal candidate will be skilled in designing and managing database systems, optimizing stored procedures, and working with Epicor ERP solutions. This role is perfect for someone who thrives on solving complex data challenges and ensuring seamless system integrations.<br><br>Responsibilities:<br>• Develop, maintain, and optimize SQL databases to support business processes and data requirements.<br>• Write and refine stored procedures and queries using T-SQL to ensure efficient data handling.<br>• Design and implement ETL processes to extract, transform, and load data between systems.<br>• Collaborate with stakeholders to analyze data needs and develop solutions within Epicor ERP systems.<br>• Troubleshoot and resolve database performance issues to ensure reliable operations.<br>• Implement best practices for database security and data integrity.<br>• Work closely with cross-functional teams to support system integrations and upgrades.<br>• Provide documentation and training for database processes and workflows.<br>• Stay updated on the latest developments in database technologies and Epicor ERP enhancements.
<p>The individual in this position will handle daily support practices, incident support, projects, and related tasks, and will provide feedback to the technical and leadership teams. Network Engineers install and maintain network & telecom infrastructure, collect network performance data, and resolve network issues in collaboration with vendors and IT teams. They work at organizations' centralized network operations or data centers. The candidate is expected to have demonstrated competency in networking while also having the technical aptitude to progress toward cross-disciplinary technology operations support.</p><p>Responsibilities</p><p>• Provide Tier 3 technical support.</p><p>• Travel occasionally (primarily day travel by automobile) to branch sites within a region to address technical needs.</p><p>• Resolve incidents through thorough troubleshooting, delivering data-driven options using the OSI model.</p><p>• Execute or implement changes and infrastructure requests.</p><p>• Install and configure network & telecom hardware and software, including routers, switches, firewalls, wireless access points, CUBEs, and Adtran devices.</p><p>• Monitor network performance to identify and resolve issues promptly.</p><p>• Maintain network security by implementing security measures and protocols per the Bank’s standards.</p><p>• Manage network backup and recovery processes to ensure data integrity and availability.</p><p>• Perform regular maintenance and updates on network & telecom systems to ensure optimal performance.</p><p>• Troubleshoot and provide client support for technology and infrastructure issues.</p><p>• Document and update incident technical details in the service desk ticketing system.</p><p>• Maintain comprehensive documentation of network configurations, changes, and inventory.</p><p>• Generate regular reports on network performance and security incidents, and usage statistics.</p><p>• Communicate with key stakeholders to provide frequent updates on technical issues and the status of remediation efforts.</p><p>• Execute standard operational processes involving technology and client interfaces.</p><p>• Perform all triage functions.</p><p>• Apply extensive, in-depth familiarity with Cisco IOS, UCS, UCM, UCCX, CER, and monitoring platforms.</p><p>• Perform hardware replacements.</p><p>• Troubleshoot software- or configuration-related issues.</p><p>• Adhere to operating procedures.</p><p>• Identify and remediate potential major impacting events within the network.</p><p>• Work with vendors across the technology stack to provide the highest possible uptime for network services.</p><p>• Quickly identify the Layer 1 and Layer 2 connections to an IP address.</p><p><br></p>
<p><strong>Senior Data Engineer</strong></p><p><strong>Location:</strong> Philadelphia, PA (Hybrid/Onsite as required)</p><p><strong>Employment Type: </strong>39-Week Contract, Potential for Extension</p><p><strong>Position Overview</strong></p><p>We are seeking an experienced <strong>Data Engineer</strong> to support the development and ongoing operation of a large-scale, cloud-based IoT platform. This role focuses on building and supporting scalable, secure, and high‑performance infrastructure, tooling, and frameworks that enable engineering teams to efficiently develop, test, deploy, and operate modern microservices.</p><p>The ideal candidate brings strong cloud engineering experience, a passion for quality and security, and the ability to collaborate in a fast‑paced Agile environment.</p><p><strong>Key Responsibilities</strong></p><ul><li>Develop, operate, and support DevOps and platform engineering tools that enable cloud-based IoT services</li><li>Build and promote horizontal tools, frameworks, and best practices supporting microservices, CI/CD, security, monitoring, and performance</li><li>Collaborate with engineering teams to define development standards, workflows, and methodologies</li><li>Design and implement shared libraries and frameworks to support scalable and highly available systems</li><li>Support production platform operations, troubleshooting, and continuous improvement with focus on quality, performance, and security</li><li>Translate system architecture and product requirements into well-designed, tested software solutions</li><li>Work in an Agile environment delivering incremental, high-quality software</li><li>Provide technical guidance and promote modern engineering practices across teams</li></ul>
We are looking for a skilled Data Platform Engineer to join our team on a long-term contract basis in Cleveland, Ohio. In this role, you will be responsible for managing and maintaining cloud-based analytics platforms, ensuring their stability, performance, and reliability. This is an excellent opportunity to work in a dynamic environment with cutting-edge technologies, including Kubernetes and containerized applications.<br><br>Responsibilities:<br>• Oversee the daily administration and operational support of cloud-based analytics platforms.<br>• Install, configure, monitor, and troubleshoot platform components and services to ensure optimal performance.<br>• Manage deployments within Kubernetes environments, addressing any related issues.<br>• Monitor system health and integrate tools for logging, alerting, and observability.<br>• Resolve performance, connectivity, and access issues to maintain system reliability.<br>• Configure and manage data source connections and platform integrations.<br>• Identify and mitigate potential capacity or performance risks by recommending improvements.<br>• Collaborate with internal teams, including data, engineering, and infrastructure, to meet organizational goals.<br>• Provide user support in a customer-facing or internal capacity, addressing technical concerns effectively.
We are looking for an experienced Data Engineer to join our team in Chicago, Illinois. In this role, you will design and implement data solutions that drive business insights and support strategic decision-making. Your expertise in Microsoft Fabric and Azure Databricks will be key in optimizing data workflows and ensuring the reliability of our data systems.<br><br>Responsibilities:<br>• Develop, implement, and maintain scalable data pipelines to support business analytics and reporting needs.<br>• Utilize Microsoft Fabric and Azure Databricks to design efficient data architectures and workflows.<br>• Collaborate with cross-functional teams to understand data requirements and deliver tailored solutions.<br>• Ensure data integrity and security across all systems and processes.<br>• Optimize data storage and retrieval processes for improved performance and scalability.<br>• Monitor system performance and troubleshoot issues as needed to ensure seamless operations.<br>• Document processes and procedures to maintain a clear record of data engineering solutions.<br>• Stay updated with emerging technologies and industry best practices to enhance data engineering capabilities.
Additional Skills:<br><br>Deep hands-on expertise with dbt (Cloud or Core), including model development, testing, macros, packages, documentation, scheduling, and performance optimization.<br>Strong command of dbt project structure, materializations (including incremental models and snapshots), and integration with BI-owned metric certification and semantic layers.<br>Ability to evaluate when to leverage community dbt packages versus building custom solutions.<br>Expert-level SQL for complex analytical transformations and performance optimization.<br>Strong data modeling skills across dimensional (Kimball), Data Vault, and domain-oriented patterns, including temporal modeling, SCDs, and surrogate keys.<br>Proven judgment in balancing normalization vs. denormalization for performance, flexibility, and downstream analytics use cases.<br>Experience designing and implementing automated data quality testing and validation frameworks.<br>Familiarity with data quality tooling (e.g., Great Expectations) and core data quality dimensions across analytics workflows.<br>Familiarity with modern analytics stacks and how analytics engineering integrates with cloud data platforms, ingestion tools, dbt, and BI systems.<br>Working knowledge of DataOps practices such as version control, CI/CD, and automated testing.<br>Knowledge of K–12 education data domains and metrics, including enrollment, attendance, assessments, staffing, and multi-state reporting requirements.<br>Familiarity with education data privacy (FERPA), academic calendars, and operational rhythms.<br>Proven ability to lead technical teams, facilitate requirements and design discussions, and manage competing stakeholder priorities.<br>Strong communication and change management skills, translating technical capabilities into clear business value. 
<br><br>Required experience:<br><br>Bachelor’s degree in Computer Science, Information Systems, Data Science, Statistics, Mathematics, or a related field, or equivalent practical experience.<br>7+ years of experience in analytics engineering, data engineering, data analytics, or closely related technical roles.<br>3+ years of experience in technical leadership or people management, leading analytics, data, or BI teams.<br>Demonstrated hands-on experience with dbt (2+ years) building and maintaining production data models and transformations.<br>Strong data modeling expertise, with a proven track record designing dimensional models, analytics data marts, or business-facing data products.<br>Expert-level SQL skills, including complex analytical queries and performance optimization.<br>Experience partnering with non-technical stakeholders to gather requirements and translate them into effective technical solutions.<br><br>Preferred Education and Experience:<br><br>Master’s degree in Data Science, Statistics, Computer Science, or a related analytical field.<br>dbt Analytics Engineering certification or equivalent demonstrated expertise.<br>Hands-on experience with Snowflake or comparable cloud data warehouse platforms.<br>Experience working with K–12 education data, student information systems, or education analytics.<br>Experience building data solutions for multi-state or geographically distributed organizations.<br>Exposure to data governance practices, including business glossaries and data quality frameworks.<br>Familiarity with modern data stack tools (e.g., ingestion, orchestration, BI, and data quality platforms).<br>Experience leading analytics teams using Agile or iterative delivery methodologies.
We are looking for a talented Data Engineer to join our team in Glendale, California. In this long-term contract role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and platforms that support critical business operations. Through collaboration with cross-functional teams, you will contribute to innovative data solutions that enhance decision-making processes and drive operational excellence.<br><br>Responsibilities:<br>• Develop, maintain, and optimize data pipelines to support the Core Data platform.<br>• Create tools and services to enhance data discovery, governance, and privacy.<br>• Collaborate with product managers, architects, and software engineers to ensure the success of data platforms.<br>• Apply technologies such as Airflow, Spark, Databricks, Delta Lake, and Kubernetes to build advanced data solutions.<br>• Establish and document best practices for pipeline configurations, naming conventions, and operational standards.<br>• Monitor and ensure the accuracy, reliability, and efficiency of datasets to meet service level agreements (SLAs).<br>• Participate in agile and scrum ceremonies to improve collaboration and team processes.<br>• Foster relationships with stakeholders to understand their needs and prioritize platform enhancements.<br>• Maintain detailed documentation to support data governance and quality initiatives.
We are looking for a skilled Data Engineer to join our team in Foxborough, Massachusetts, on a long-term contract basis. In this role, you will design, optimize, and maintain data pipelines and storage solutions, leveraging modern tools to ensure high performance and reliability. This position offers an exciting opportunity to collaborate across teams and implement cutting-edge practices in data engineering and analytics.<br><br>Responsibilities:<br>• Optimize Amazon Redshift performance by configuring distribution keys, sort keys, and fine-tuning queries.<br>• Develop and maintain robust data pipelines using AWS Glue and orchestrate workflows with Airflow.<br>• Manage semantic layers and metadata to support reliable analytics and AI-driven insights.<br>• Implement best practices for data partitioning, compression, and columnar storage formats.<br>• Monitor and troubleshoot data workflows to ensure high availability, reliability, and automated observability.<br>• Automate data processing tasks using Python and AWS native tools.<br>• Enforce data security and governance policies, including row- and column-level controls, using Lake Formation and AWS services.<br>• Oversee compliance monitoring and auditing through CloudWatch, CloudTrail, and similar tools.<br>• Continuously refine and improve data architecture by adopting emerging AWS best practices and patterns.<br>• Collaborate closely with Operations, Data Governance, and other teams to align with standards and achieve delivery objectives.
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul><p><br></p>
<p>We are looking for a detail-oriented Data Migration Specialist to join our team on a contract basis in Maple Plain, Minnesota. In this role, you will collaborate with cross-functional teams to assess, cleanse, and migrate data while ensuring its accuracy and usability. This position offers an exciting opportunity to contribute to critical data transformation projects within the manufacturing industry.</p><p><br></p><p>Responsibilities:</p><p>• Assess existing data to identify quality issues, duplication, and structural gaps, ensuring readiness for migration from HubSpot to Salesforce.</p><p>• Cleanse and standardize data using Snowflake, including deduplication, normalization, and application of business rules.</p><p>• Develop and document field mappings and transformation logic to align data with Salesforce requirements.</p><p>• Support test migrations, validate data integrity, and reconcile discrepancies post-migration.</p><p>• Prepare comprehensive documentation, including migration steps, validation checks, and governance guidelines.</p><p>• Standardize field naming conventions, formats, and reference data for consistent usage.</p><p>• Collaborate with stakeholders to define data entry standards and long-term maintenance expectations.</p><p>• Produce migration-ready datasets and ensure alignment with Salesforce’s data model.</p><p>• Deliver clear work instructions and governance documentation to prevent future data issues.</p><p>• Facilitate stakeholder sign-off on data quality and migration outcomes.</p>
<p>We are seeking a Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. This role will support data-driven decision-making by ensuring reliable data flow, transformation, and accessibility across the organization.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain ETL/ELT data pipelines</li><li>Develop and optimize data models and data architectures</li><li>Integrate data from multiple sources (APIs, databases, third-party systems)</li><li>Ensure data quality, integrity, and reliability</li><li>Collaborate with data analysts, data scientists, and business stakeholders</li><li>Monitor and troubleshoot data pipeline performance issues</li><li>Implement best practices for data governance and security</li></ul><p><br></p>
<p>We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and analytics solutions that support enterprise reporting and advanced dashboards. This role will work with cross‑cloud data sources, including SAP, GCP, and BigQuery, and partner closely with analytics and business teams to deliver high‑quality, analytics‑ready datasets powering BI and AI initiatives.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain data pipelines following <strong>Medallion Architecture (Bronze, Silver, Gold)</strong> best practices.</li><li>Develop and support ETL processes pulling data from <strong>SAP, Google Cloud Platform (GCP), and BigQuery</strong>.</li><li>Ensure high data quality, reliability, and performance across ingestion and transformation layers.</li><li>Support analytics and visualization teams by delivering clean, well‑modeled datasets for:</li><li><strong>Power BI dashboards using DAX</strong></li><li><strong>Google Looker dashboards using LookML</strong></li><li>Collaborate with stakeholders to understand data requirements and translate them into scalable data models.</li><li>Maintain documentation on data sources, transformations, and architecture.</li><li>Support AI and API‑driven initiatives, including planned usage of <strong>Google ADK for API integrations</strong></li></ul><p><br></p><p><br></p>
We are looking for an experienced Data Engineer to join our team on a long-term contract basis. Based in Houston, Texas, this role offers an exciting opportunity to work with cutting-edge data technologies, design scalable solutions, and contribute to data-driven decision-making processes. If you are passionate about optimizing data systems and driving innovation, we encourage you to apply.<br><br>Responsibilities:<br>• Develop, maintain, and optimize scalable data pipelines using Apache Spark and Python.<br>• Implement ETL processes to ensure seamless extraction, transformation, and loading of data across systems.<br>• Collaborate with cross-functional teams to integrate Apache Hadoop and Apache Kafka into the data architecture.<br>• Monitor and troubleshoot data systems to ensure reliability and performance.<br>• Design and maintain data models, ensuring alignment with business requirements.<br>• Conduct thorough testing and validation of data processes to guarantee accuracy.<br>• Document data workflows and processes for future reference and team collaboration.<br>• Provide technical guidance and support to team members on data engineering best practices.<br>• Stay current on emerging technologies and trends in big data and analytics.<br>• Contribute to improving data governance and security protocols.
<p>We are looking for an experienced Data Engineer to join our team on a contract basis in Columbus, Ohio. In this role, you will take on a leadership position, driving the development and optimization of data pipelines that support enterprise-wide analytics and decision-making. You will also play a key role in mentoring team members, fostering collaboration, and ensuring the integrity and quality of data across various business functions.</p><p><br></p><p>Responsibilities:</p><p>• Design, develop, and maintain efficient data pipelines to support enterprise analytics and reporting.</p><p>• Collaborate with business analysts and data science teams to refine data requirements and ensure alignment with organizational goals.</p><p>• Enhance and automate data integration and management processes to improve operational efficiency.</p><p>• Lead efforts to ensure data quality by testing for accuracy, consistency, and conformity to business rules.</p><p>• Provide training and guidance to team members and other stakeholders on data pipelining and preparation techniques.</p><p>• Partner with data governance teams to promote vetted content into the curated data catalog for reuse.</p><p>• Stay updated on emerging technologies and assess their impact on current systems and processes.</p><p>• Offer leadership, coaching, and mentorship to team members, encouraging attention to detail in their development.</p><p>• Work closely with stakeholders to understand business needs and ensure solutions meet those requirements.</p><p>• Perform additional duties as assigned to support organizational objectives.</p>
<p>We are seeking a highly skilled Data Engineer to design, build, and manage our data infrastructure. The ideal candidate is an expert in writing complex SQL queries, designing efficient database schemas, and developing ETL/ELT pipelines. This role ensures data accuracy, accessibility, and performance optimization to support business intelligence, analytics, and reporting initiatives.</p><p><br></p><p><strong><em><u>Key Responsibilities</u></em></strong></p><p><br></p><p><strong>Database Design & Management</strong></p><ul><li>Design, develop, and maintain relational databases, including SQL Server, PostgreSQL, and Oracle, as well as cloud-based data warehouses.</li></ul><p><strong>Strategic SQL & Data Engineering</strong></p><ul><li>Develop advanced, optimized SQL queries, stored procedures, and functions to process and analyze large, complex datasets and deliver actionable business insights.</li></ul><p><strong>Data Pipeline Automation & Orchestration</strong></p><ul><li>Build, automate, and orchestrate ETL/ELT workflows using SQL, Python, and cloud-native tools to integrate and transform data from diverse, distributed sources.</li></ul><p><strong>Performance Optimization</strong></p><ul><li>Tune SQL queries and optimize database schemas through indexing, partitioning, and normalization to improve data retrieval and processing performance.</li></ul><p><strong>Data Integrity & Security</strong></p><ul><li>Ensure data quality, consistency, and integrity across systems.</li><li>Implement data masking, encryption, and role-based access control (RBAC).</li></ul><p><strong>Documentation</strong></p><ul><li>Maintain comprehensive technical documentation, including database schemas, data dictionaries, and ETL workflows.</li></ul>
<p>The Database Engineer will design, develop, and maintain database solutions that meet the needs of our business and clients. You will be responsible for ensuring the performance, availability, and security of our database systems while collaborating with software engineers, data analysts, and IT teams.</p><p> </p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, implement, and maintain highly available and scalable database systems (e.g., SQL, NoSQL).</li><li>Optimize database performance through indexing, query optimization, and capacity planning.</li><li>Create and manage database schemas, tables, stored procedures, and triggers.</li><li>Develop and maintain ETL (Extract, Transform, Load) processes for data integration.</li><li>Ensure data integrity and consistency across distributed systems.</li><li>Monitor database performance and troubleshoot issues to ensure minimal downtime.</li><li>Collaborate with software development teams to design database architectures that align with application requirements.</li><li>Implement data security best practices, including encryption, backups, and access controls.</li><li>Stay updated on emerging database technologies and recommend solutions to enhance efficiency.</li><li>Document database configurations, processes, and best practices for internal knowledge sharing.</li></ul><p><br></p>
We are looking for a skilled Data Engineer to join our team in Wayne, Pennsylvania, on a contract-to-permanent basis. This role offers an exciting opportunity to design, implement, and optimize data pipelines while integrating applications with various digital marketplaces. The ideal candidate will bring strong technical expertise and a collaborative mindset to support business insights and analytics effectively.<br><br>Responsibilities:<br>• Develop and maintain data pipelines and ensure seamless application connectivity with digital marketplaces such as TikTok Shop, Shopify, and Amazon.<br>• Collaborate closely with business teams to understand requirements and provide actionable analytics.<br>• Lead the creation of scalable and efficient data solutions tailored to business needs.<br>• Apply expertise in Python, Snowflake, and other relevant technologies to deliver high-quality results.<br>• Facilitate and support integrations with e-commerce platforms, leveraging previous experience where applicable.<br>• Build robust APIs and ensure their effective implementation.<br>• Utilize Microsoft SQL for database management and optimization.<br>• Provide technical guidance and mentorship to ensure project success.<br>• Troubleshoot and resolve issues related to data workflows and integrations.<br>• Continuously evaluate and improve processes to enhance efficiency and performance.
We are looking for an experienced Data Engineer to join our team on a contract basis in Madison, Wisconsin. In this role, you will focus on designing, building, and optimizing robust data pipelines and cloud-based data architectures. This position requires a strong technical background and the ability to work with various data sources, tools, and platforms to drive seamless data integration and transformation.<br><br>Responsibilities:<br>• Design and develop scalable data pipelines to support business needs and analytics.<br>• Utilize Snowflake and cloud-based platforms, such as Azure, to manage and optimize data architecture.<br>• Integrate and customize data ingestion processes for platforms like Shopify, Oracle, and NetSuite.<br>• Collaborate with teams to connect data sources and deliver data-driven solutions for dashboards and AI applications.<br>• Implement and manage ETL processes to ensure data accuracy and reliability.<br>• Work with tools like Apache Spark, Hadoop, and Kafka to process and analyze large datasets.<br>• Develop APIs and custom applications to facilitate seamless data movement and integration.<br>• Leverage AWS services, including AWS Data Pipeline and CloudFormation, to enhance data workflows.<br>• Troubleshoot and resolve data pipeline issues to maintain system efficiency and performance.
<p><strong>Data Engineer</strong></p><p>On-site | Austin, TX | Contract-to-Hire</p><p><br></p><p><strong>Responsibilities:</strong></p><ul><li>Design, build, and maintain scalable data pipelines and ETL/ELT processes</li><li>Develop and optimize data architectures for data lakes, warehouses, and analytics platforms</li><li>Ingest, transform, and integrate data from multiple sources (databases, APIs, streaming systems)</li><li>Ensure data quality, reliability, and performance across data systems</li><li>Collaborate with data scientists, analysts, and business stakeholders to support reporting and analytics needs</li><li>Optimize database performance, queries, and data storage strategies</li><li>Implement data governance, security, and compliance best practices</li><li>Automate data workflows and monitoring processes</li><li>Troubleshoot and resolve data pipeline failures and performance issues</li><li>Document data models, workflows, and technical processes</li></ul>