We are looking for a skilled Data Engineer to join our team in Wayne, Pennsylvania, on a contract-to-permanent basis. This role offers an exciting opportunity to design, implement, and optimize data pipelines while integrating applications with various digital marketplaces. The ideal candidate will bring strong technical expertise and a collaborative mindset to support business insights and analytics effectively.<br><br>Responsibilities:<br>• Develop and maintain data pipelines and ensure seamless application connectivity with digital marketplaces such as TikTok Shop, Shopify, and Amazon.<br>• Collaborate closely with business teams to understand requirements and provide actionable analytics.<br>• Lead the creation of scalable and efficient data solutions tailored to business needs.<br>• Apply expertise in Python, Snowflake, and other relevant technologies to deliver high-quality results.<br>• Facilitate and support integrations with e-commerce platforms, leveraging previous experience where applicable.<br>• Build robust APIs and ensure their effective implementation.<br>• Utilize Microsoft SQL for database management and optimization.<br>• Provide technical guidance and mentorship to ensure project success.<br>• Troubleshoot and resolve issues related to data workflows and integrations.<br>• Continuously evaluate and improve processes to enhance efficiency and performance.
The Opportunity: Be part of a dynamic team that designs, develops, and optimizes data solutions supporting enterprise-level products across diverse industries. This role provides a clear track to higher-level positions, including Lead Data Engineer and Data Architect, for those who demonstrate vision, initiative, and impact.<br><br>Responsibilities:<br>• Design, develop, and optimize relational database objects and data models using Microsoft SQL Server and Snowflake.<br>• Build and maintain scalable ETL/ELT pipelines for batch and streaming data using SSIS and cloud-native solutions.<br>• Integrate and utilize Redis for caching, session management, and real-time analytics.<br>• Develop and maintain data visualizations and reporting solutions using Sigma Computing, SSRS, and other BI tools.<br>• Collaborate across engineering, analytics, and product teams to deliver impactful data solutions.<br>• Ensure data security, governance, and compliance across all platforms.<br>• Participate in Agile Scrum ceremonies and contribute to continuous improvement within the data engineering process.<br>• Support database deployments using DevOps practices, including version control (Git) and CI/CD pipelines (Azure DevOps, Flyway, Octopus, SonarQube).<br>• Troubleshoot and resolve performance, reliability, and scalability issues across the data platform.<br>• Mentor entry-level team members and participate in design/code reviews.
<p><strong>Overview</strong></p><p>We are seeking a Senior Data Engineer to support a major Salesforce Phase 2 data migration initiative. This role will focus heavily on building and optimizing data pipelines, developing ETL workflows, and moving CRM data from Salesforce into Databricks.</p><p>The engineer will work closely with a senior team member, contribute to Scrum ceremonies, and play a key role in developing the core CRM data environment used by the advertising organization.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Data Engineering & Migration</strong></p><ul><li>Develop ETL jobs that move and transform Salesforce data into Databricks.</li><li>Build, test, and maintain high‑volume data pipelines across AWS + Databricks.</li><li>Perform data migration, data integration, and pipeline development (including Mulesoft-related work).</li><li>Ensure all pipelines are reliable, scalable, and optimized for production.</li></ul><p><strong>Development & Infrastructure</strong></p><ul><li>Use Python and PySpark to build ETL components and transformation logic.</li><li>Leverage Spark/PySpark for distributed processing at scale (must‑have).</li><li>Use Terraform to provision and manage cloud infrastructure.</li><li>Set up CI/CD pipelines using Concourse or GitHub Actions for automated deployments.</li></ul><p><strong>Quality, Documentation & Support</strong></p><ul><li>Document ETL processes, pipelines, and data flows.</li><li>Participate in testing, QA, and validation of migrated datasets.</li><li>Provide post‑delivery support and proactively mitigate project risks or single points of failure (SPOF).</li><li>Troubleshoot production issues and implement long‑term fixes to maintain pipeline stability.</li></ul><p><strong>Collaboration</strong></p><ul><li>Work closely with engineering teammates to translate business requirements into working pipelines.</li><li>Participate in weekly Scrum ceremonies.</li><li>Contribute to shared best practices and 
continuous improvement across the data engineering team.</li></ul><p><br></p>
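As a hedged illustration of the transformation logic this Salesforce-to-Databricks role centers on, the sketch below reshapes CRM-style records into a target schema in plain Python; a production pipeline would express the same logic as PySpark DataFrame operations, and all field names here are hypothetical.

```python
from datetime import datetime

# Hypothetical illustration: reshape Salesforce-style CRM records into a
# warehouse schema before loading. A real pipeline would express this as
# PySpark DataFrame transformations running on Databricks.
def transform_account(record: dict) -> dict:
    """Map raw Salesforce-style fields to the target schema."""
    return {
        "account_id": record["Id"],
        "account_name": record["Name"].strip(),
        # Timestamps here arrive as ISO-8601 strings.
        "created_at": datetime.fromisoformat(record["CreatedDate"]),
        # Custom checkbox fields often arrive as strings; default to inactive.
        "is_active": record.get("Active__c", "false").lower() == "true",
    }

raw = [
    {"Id": "001A", "Name": " Acme Corp ", "CreatedDate": "2024-05-01T12:00:00", "Active__c": "True"},
    {"Id": "001B", "Name": "Globex", "CreatedDate": "2024-06-15T08:30:00"},
]
transformed = [transform_account(r) for r in raw]
```

The same per-record mapping would typically be tested on small fixtures like this before being run over the full migration volume.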
We are looking for a highly skilled Data Scientist for a long-term contract position within the healthcare industry. This role focuses on supporting the enterprise-wide launch of Power BI by creating and delivering engaging, high-quality learning materials. The ideal candidate will work remotely, collaborating closely with leadership and subject matter experts to empower analytics and non-analytics professionals to efficiently use Power BI in their daily tasks.<br><br>Responsibilities:<br>• Develop scalable learning experiences tailored to diverse user personas and varying levels of technical expertise.<br>• Collaborate with the data literacy program team and Power BI specialists to ensure instructional content aligns with program objectives.<br>• Translate complex concepts related to Power BI and business intelligence into accessible and engaging educational materials.<br>• Design and deliver training programs using instructional design best practices and tools such as Camtasia, Adobe Creative Suite, or Articulate.<br>• Conduct user interviews to understand learning challenges and tailor content to meet specific needs.<br>• Enhance or create new data literacy resources, such as courses, modules, and curricula, to address emerging needs and best practices.<br>• Evaluate and adapt existing educational materials to make them sustainable and applicable across the organization.<br>• Participate in marketing efforts for the Data Literacy Program, including speaking engagements, blog posts, and other creative channels.<br>• Identify opportunities for new program initiatives that support analytics tools and data literacy.<br>• Serve as a subject matter expert in data literacy on national platforms through networking and conference participation.
We are looking for a skilled Cloud Engineer to join our team in Wayne, Pennsylvania. This role requires a deep understanding of cloud technologies, particularly Microsoft Azure and Microsoft 365, as well as expertise in infrastructure and identity management. You will play a key part in ensuring seamless operations and resolving complex technical challenges.<br><br>Responsibilities:<br>• Diagnose and address technical issues across Microsoft Azure, Microsoft 365, Entra ID, Intune, and hybrid or on-premise server environments.<br>• Act as a Tier 3 escalation point to manage and resolve advanced infrastructure, cloud, identity, and networking problems.<br>• Develop and maintain detailed documentation of client systems, including Azure resources, Intune configurations, and support actions.<br>• Collaborate with cross-functional teams to implement and optimize cloud solutions.<br>• Ensure the security and compliance of cloud and on-premise environments.<br>• Provide technical guidance and recommendations to clients regarding best practices in cloud and infrastructure management.<br>• Monitor system performance and proactively identify areas for improvement.<br>• Support the integration of new technologies into existing environments.<br>• Assist in training and mentoring team members who are new to cloud and infrastructure topics.
We are looking for an experienced Data/Information Architect to join our team in Philadelphia, Pennsylvania. In this long-term contract position, you will play a crucial role in designing and implementing data architecture solutions that support organizational goals. This opportunity is ideal for professionals passionate about building robust data frameworks and contributing to the healthcare industry.<br><br>Responsibilities:<br>• Develop and implement comprehensive data architecture strategies to support business objectives.<br>• Design and maintain data models using tools such as Erwin Data Modeler, Toad Data Modeler, and SQL.<br>• Collaborate with stakeholders to optimize data management processes and ensure seamless integration across platforms.<br>• Create and manage digital file systems, ensuring proper organization and accessibility.<br>• Provide expertise in database systems including SQL Server, Oracle, DB2, and Teradata.<br>• Utilize Python and SQL to develop scripts and automate data processing workflows.<br>• Ensure the accuracy and compliance of legal documentation within data systems.<br>• Work with Epic Software and AEM Architect to align data solutions with healthcare requirements.<br>• Perform data analysis to identify trends and improve system performance.<br>• Use Office tools to document processes and communicate findings effectively.
<p>Are you passionate about next-generation data engineering, AI, and modern cloud technologies? Our company is seeking an innovative and driven Snowflake Solutions Engineer to join our IT team in a fully remote capacity. In this role, you will lead the design and implementation of advanced Snowflake-native applications and AI-powered data solutions, creating measurable business impact utilizing Snowflake’s latest platform features. This is an exceptional opportunity to work at the forefront of data, leveraging Streamlit, Cortex AI, and emerging Snowflake technologies.</p><p><strong>Key Responsibilities:</strong></p><p><strong>Snowflake Native Application Development (30%)</strong></p><ul><li>Design and build interactive data applications using Snowflake Streamlit to enable intuitive, self-service analytics and operational workflows for business users.</li><li>Develop reusable frameworks and component libraries for rapid application delivery.</li><li>Integrate Snowflake Native Apps and third-party marketplace applications to continuously extend platform capabilities.</li><li>Create custom UDFs and stored procedures to support advanced business logic.</li></ul><p><strong>Data Architecture and Modern Platform Design (30%)</strong></p><ul><li>Develop cutting-edge data architecture solutions spanning data warehousing, data lakes, and lakehouse approaches.</li><li>Implement medallion (bronze-silver-gold) patterns to maintain data quality and governance.</li><li>Recommend optimal architecture patterns for structured analytics, semi-structured data, and AI/ML workloads.</li><li>Establish best practices for data organization, storage optimization, and query performance.</li></ul><p><strong>AI & Advanced Analytics Collaboration (15%)</strong></p><ul><li>Partner with AI/data science teams to support and enhance Snowflake-based AI workloads.</li><li>Enable implementation of Snowflake Cortex AI features for practical business cases.</li><li>Guide data access and feature 
engineering for ML model requirements.</li><li>Contribute platform expertise to AI proof-of-concept initiatives.</li></ul><p><strong>Security, Governance, & Technical Leadership (15%)</strong></p><ul><li>Design and implement RBAC hierarchies, enforcing least privilege principles.</li><li>Define security best practices including network policies and encryption; implement row/column security and data masking.</li><li>Apply tag-based policies for advanced governance.</li><li>Monitor and optimize application performance, cost, and user experience.</li><li>Lead architectural discussions, create technical documentation, and share best practices.</li></ul><p><br></p>
We are looking for a skilled Software Engineer to join our team in Bethlehem, Pennsylvania. This role involves designing and optimizing data systems, managing tools for data orchestration, and ensuring secure and efficient operations. The ideal candidate will thrive in a collaborative environment while delivering impactful solutions for business intelligence and operations.<br><br>Responsibilities:<br>• Build and manage data orchestration tools, including creating variables, setting notifications, and configuring retries.<br>• Optimize Snowflake performance by adjusting warehouse sizing, clustering, and profiling queries.<br>• Schedule and oversee near real-time data loads using Snowflake Tasks and Streams.<br>• Implement rigorous data quality checks such as verifying freshness, row counts, and referential integrity.<br>• Monitor and control costs through usage dashboards and guardrails.<br>• Ensure secure operations by maintaining roles, managing secrets, and auditing logs.<br>• Develop and monitor Power BI datasets to support Finance and Operations teams.<br>• Collaborate with stakeholders to gather requirements and deliver tailored solutions.<br>• Enhance and maintain front-end data applications using tools like Streamlit and Python.<br>• Create detailed documentation, including runbooks, root cause analyses, and change tickets for releases.
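The data quality checks this role names (freshness, row counts, referential integrity) can be sketched in plain Python as below; in practice these would typically run as SQL against Snowflake, and the thresholds and sample values are illustrative assumptions.

```python
from datetime import datetime, timedelta

# Illustrative data-quality checks of the kind described above; in
# production these would usually be SQL queries against Snowflake,
# wired into the orchestration tool's notification and retry logic.
def check_freshness(last_loaded: datetime, now: datetime, max_age: timedelta) -> bool:
    """Freshness: the most recent load must be recent enough."""
    return now - last_loaded <= max_age

def check_row_count(actual: int, expected_min: int) -> bool:
    """Row count: guard against partial or empty loads."""
    return actual >= expected_min

def check_referential_integrity(child_keys, parent_keys) -> set:
    """Referential integrity: child rows must reference existing parents.
    Returns the set of orphaned keys; an empty set means the check passes."""
    return set(child_keys) - set(parent_keys)

now = datetime(2024, 1, 2, 12, 0)
fresh = check_freshness(datetime(2024, 1, 2, 11, 0), now, timedelta(hours=2))
counts_ok = check_row_count(actual=10_500, expected_min=10_000)
orphans = check_referential_integrity(["a", "b", "x"], ["a", "b", "c"])
```

A failing check (a stale load, a short row count, a non-empty orphan set) would feed the alerting and runbook workflow the posting describes.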
<p><strong>AWS Big Data Architect (with Hadoop) </strong></p><p><strong>Location:</strong> Hybrid 4x Onsite – Philadelphia, PA</p><p><strong>Contract Duration:</strong> April 6, 2026 – December 31, 2026</p><p><strong>Employment Type:</strong> W2 Contract</p><p><strong>Overview</strong></p><p>We are seeking a highly skilled <strong>AWS Big Data Architect / Senior Data Engineer</strong> to design, develop, and deliver scalable Big Data Warehouse solutions. This is a hands-on role suited for someone who is passionate about technology, thrives in a collaborative environment, and can work effectively with both technical and non-technical stakeholders. The ideal candidate excels in fast-paced settings and is committed to producing high-quality, impactful results.</p><p>This role offers the opportunity to collaborate with engineering teams across the enterprise and influence broader data and technology strategies.</p><p><strong>Key Responsibilities</strong></p><ul><li>Design and develop scalable Big Data Warehouse solutions across the full data supply chain.</li><li>Build and implement metadata management solutions.</li><li>Create and maintain technical documentation, user documentation, data models, data dictionaries, glossaries, process flows, and architecture diagrams.</li><li>Enhance and expand the enterprise Data Lake environment.</li><li>Solve complex data integration challenges across multiple systems.</li><li>Design and execute strategies for real-time data analysis and decision-making.</li><li>Collaborate with business partners, analysts, developers, architects, and engineers to support ongoing data quality initiatives.</li><li>Work closely with Data Science teams to improve actionable insights.</li><li>Continuously expand knowledge of new tools, platforms, and technologies.</li></ul>
We are looking for a skilled and detail-oriented Data Analyst to join our team in Allentown, Pennsylvania. This is a contract-to-permanent position that offers the opportunity to grow within a dynamic and fast-paced environment. The ideal candidate will excel in analyzing data, performing reconciliations, and supporting financial decision-making with actionable insights.<br><br>Responsibilities:<br>• Manage and oversee intercompany transactions, ensuring accurate billing and compliance.<br>• Perform detailed reconciliations, including cash, third-party accounts, and insurance data.<br>• Contribute to audits, budgeting processes, forecasting activities, and special projects as needed.<br>• Analyze performance metrics such as sales, labor, food costs, and customer behavior to identify trends.<br>• Develop user-friendly dashboards and reports for operational and executive teams.<br>• Ensure data accuracy and connectivity across inventory, accounting, and organizational systems.<br>• Translate data insights into practical recommendations to enhance service and operational efficiency.<br>• Uphold internal controls and compliance standards while handling sensitive financial information.<br>• Support a collaborative environment by providing clear communication and actionable data insights.
<p><strong>Job Summary</strong></p><p>We are seeking a motivated and analytical Data Analyst to support product quality, device performance, and customer experience improvements across a large consumer device ecosystem. In this role, you will help drive data‑informed decisions by analyzing telemetry, reliability KPIs, and customer satisfaction trends. You’ll collaborate closely with engineering, operations, and customer experience teams to ensure data accuracy, generate actionable insights, and support strategic initiatives that help enhance overall device reliability.</p><p>This role is ideal for someone who thrives in cross‑functional environments, enjoys solving complex problems, and can translate data into compelling, executive‑ready insights.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Analyze device performance data, reliability KPIs, telemetry metrics, and CSAT feedback to identify improvement opportunities.</li><li>Support key initiatives related to error tracking, ticket resolution trends, and release impact assessments.</li><li>Partner with stakeholders across engineering, operations, and customer experience to align priorities and ensure timely delivery of insights.</li><li>Create compelling presentations and executive‑level decks to communicate data findings and recommendations.</li><li>Participate in working groups tackling high‑impact reliability issues, contributing to root cause analysis and solution tracking.</li><li>Assist in expanding global data capabilities through improved data access, partnerships, and data enablement initiatives.</li></ul>
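A minimal sketch of the kind of telemetry and CSAT analysis this role describes, in Python; the records, field names, and thresholds below are hypothetical assumptions, not the actual dataset.

```python
from statistics import mean

# Hypothetical telemetry records; real data would come from the device
# ecosystem's reliability and customer-satisfaction pipelines.
telemetry = [
    {"device": "d1", "errors": 0, "csat": 4.5},
    {"device": "d2", "errors": 3, "csat": 3.2},
    {"device": "d3", "errors": 1, "csat": 4.8},
]

# Reliability KPI: share of devices reporting no errors.
error_free_rate = sum(1 for t in telemetry if t["errors"] == 0) / len(telemetry)

# Average customer-satisfaction score across devices.
avg_csat = round(mean(t["csat"] for t in telemetry), 2)

# Flag devices below an assumed CSAT threshold for root-cause review.
flagged = [t["device"] for t in telemetry if t["csat"] < 4.0]
```

Summaries like these (error-free rate, average CSAT, a flagged-device list) are the raw material for the executive-level decks and working-group discussions the posting mentions.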