We are looking for an experienced Data Engineer to join our team in Newtown Square, Pennsylvania. In this long-term contract position, you will play a pivotal role in designing and implementing robust data solutions to support organizational goals. This is an exciting opportunity to lead the development of modern data architectures and collaborate with diverse teams to drive impactful results.<br><br>Responsibilities:<br>• Lead the implementation of an enterprise Snowflake data lake, ensuring timely delivery and optimal performance.<br>• Oversee the integration of multiple data sources, including Oracle Financials, PostgreSQL, and Salesforce, into a unified data platform.<br>• Collaborate with finance teams to facilitate a transition to a 12-month accounting calendar and support accelerated financial close processes.<br>• Develop and maintain multi-source analytics dashboards to enhance operational insights and decision-making.<br>• Manage day-to-day operations of the Snowflake platform, focusing on performance tuning and cost optimization.<br>• Ensure data quality and reliability, providing business users with a trustworthy platform.<br>• Document architectural designs, data workflows, and operational procedures to support sustainable data management.<br>• Coordinate with external vendors to meet project deadlines and ensure successful implementations.
<p><strong>Senior Data Engineer</strong></p><p><strong>Location:</strong> Philadelphia, PA (Hybrid/Onsite as required)</p><p><strong>Employment Type: </strong>39-Week Contract, Potential for Extension</p><p><strong>Project Focus:</strong> Salesforce → Databricks Data Migration</p><p><strong>About the Role</strong></p><p>We are seeking a <strong>Senior Data Engineer</strong> to support a major Salesforce data migration initiative. This role is centered on building, optimizing, and maintaining high‑quality data pipelines that feed into Databricks, with a strong emphasis on Spark/PySpark and Python-based ETL development. The engineer will work closely with a senior team member, participate in Agile ceremonies, and contribute to the development of a core CRM data platform.</p><p><strong>Key Responsibilities</strong></p><p><strong>Data Engineering & Development</strong></p><ul><li>Develop ETL jobs and data pipelines that migrate and integrate data between Salesforce, AWS, and Databricks.</li><li>Build, test, and maintain scalable data pipelines on AWS + Databricks environments.</li><li>Use Python as a primary language for data engineering tasks and ETL job creation.</li><li>Utilize Spark and PySpark for all high‑volume processing and transformation work (<strong>must‑have</strong>).</li><li>Support integration and pipeline development, including MuleSoft-related components.</li><li>Provide documentation, testing, QA, and post‑delivery support for all data engineering outputs.</li><li>Identify and mitigate risks, including eliminating single points of failure (SPOFs).</li></ul><p><strong>Infrastructure & DevOps Collaboration</strong></p><ul><li>Use Terraform for infrastructure provisioning and environment management.</li><li>Set up and manage CI/CD pipelines using Concourse or GitHub Actions to ensure consistent and reliable deployments.</li><li>Troubleshoot pipeline issues, resolve defects efficiently, and maintain reliable operations.</li></ul><p><strong>Cross-Team 
Collaboration</strong></p><ul><li>Partner with engineering, architecture, and technical product teams to translate requirements into scalable data solutions.</li><li>Contribute to best practices, knowledge-sharing, and continuous improvement across the engineering organization.</li><li>Participate in weekly Scrum ceremonies and collaborate in an Agile environment.</li></ul>
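<p>To illustrate the kind of ETL transformation work this role involves, here is a minimal, framework-free Python sketch of mapping Salesforce-style records onto a target schema. This is an assumption-laden illustration only: the real pipelines would run as PySpark jobs on Databricks, and the field names (<code>Id</code>, <code>CreatedDate</code>, <code>AnnualRevenue</code>) and target schema are hypothetical.</p>

```python
from datetime import datetime

def transform_account(raw: dict) -> dict:
    """Map one raw Salesforce-style record onto a hypothetical target schema,
    casting types and normalizing nulls along the way."""
    return {
        "account_id": raw["Id"],
        "account_name": (raw.get("Name") or "").strip(),
        # Salesforce timestamps arrive as ISO-8601 strings ending in "Z".
        "created_at": datetime.fromisoformat(
            raw["CreatedDate"].replace("Z", "+00:00")
        ),
        # Treat a missing revenue figure as 0.0 rather than propagating None.
        "annual_revenue": float(raw.get("AnnualRevenue") or 0.0),
    }

def run_batch(records: list) -> list:
    """Transform a batch, skipping records that lack a primary key."""
    return [transform_account(r) for r in records if r.get("Id")]
```

<p>In the actual role the same shape of logic would typically be expressed as PySpark DataFrame transformations so it scales across the cluster rather than a single process.</p>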
We are looking for an experienced Data Analyst to join our team on a long-term contract basis in Wilmington, Delaware. In this role, you will play a pivotal part in designing and developing software solutions for both desktop and web applications. You will collaborate closely with business stakeholders to understand and document requirements, ensuring the delivery of robust and effective reporting and application solutions.<br><br>Responsibilities:<br>• Design and implement front-end and back-end software for complex desktop and web applications.<br>• Collaborate with business users and management to identify, analyze, and document application and reporting requirements.<br>• Develop stable and efficient solutions for Power BI reporting and application development projects.<br>• Manage all stages of the software development lifecycle, including requirements gathering, design, coding, testing, deployment, and ongoing support.<br>• Utilize best practices in software development and explore innovative solutions to meet business objectives.<br>• Optimize database objects and ensure efficient data handling and retrieval.<br>• Apply Agile methodologies to project management and software development tasks.<br>• Create and maintain comprehensive documentation for business requirements and application designs.<br>• Work independently and as part of a team to deliver high-quality results.<br>• Support scheduled updates and releases with a focus on system stability and performance.
We are looking for an experienced Data Analyst to support healthcare initiatives in Philadelphia, Pennsylvania. This is a long-term contract position that requires strong analytical skills and a focus on fraud detection and prevention. The ideal candidate will leverage data-driven insights to enhance decision-making and ensure the integrity of healthcare operations.<br><br>Responsibilities:<br>• Conduct detailed data analyses to identify patterns of suspected fraud and anomalies in healthcare systems.<br>• Develop and implement fraud detection models using advanced analytics tools and techniques.<br>• Collaborate with cross-functional teams to investigate potential fraudulent activities and propose actionable solutions.<br>• Utilize platforms such as Epic and ChartMaxx to extract and analyze data effectively.<br>• Generate comprehensive reports and dashboards to present findings and support decision-making.<br>• Monitor ongoing healthcare operations to ensure compliance with anti-fraud protocols.<br>• Optimize data workflows and processes to enhance efficiency and accuracy.<br>• Stay updated on industry trends and best practices in fraud analytics and healthcare data analysis.<br>• Provide recommendations to improve system integrity and prevent future fraudulent activities.
<p>We are looking for a skilled Data Warehouse Analyst to join our team in New Jersey. In this role, you will transform logistics challenges into actionable insights through advanced data analysis and reporting. By collaborating with cross-functional teams, you will play a pivotal role in enhancing operational efficiency and driving key business decisions.</p><p><br></p><p>Responsibilities:</p><p>• Collaborate with Operations, Transportation, and Finance teams to establish and refine KPIs that drive logistics and fulfillment performance.</p><p>• Develop and optimize labor planning and forecasting models for warehouse and delivery operations, partnering closely with recruitment teams.</p><p>• Analyze distribution and fulfillment data to uncover performance trends and identify cost-saving opportunities.</p><p>• Design and maintain dashboards and reports to provide real-time insights into logistics metrics, including delivery times, warehouse productivity, and route optimization.</p><p>• Automate reporting processes to improve accuracy and timeliness of operational data.</p><p>• Continuously enhance data integrity and streamline workflows to optimize logistics operations.</p><p>• Work on data modeling and warehousing projects to support scalable analytics and reporting solutions.</p><p>• Partner with stakeholders to deliver clear and actionable insights to improve decision-making processes.</p><p>• Investigate and implement tools and techniques to improve overall business intelligence capabilities.</p>
<p><strong>Overview</strong></p><p>We are seeking a Senior Data Engineer to support a major Salesforce Phase 2 data migration initiative. This role will focus heavily on building and optimizing data pipelines, developing ETL workflows, and moving CRM data from Salesforce into Databricks.</p><p>The engineer will work closely with a senior team member, contribute to Scrum ceremonies, and play a key role in developing the core CRM data environment used by the advertising organization.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Data Engineering & Migration</strong></p><ul><li>Develop ETL jobs that move and transform Salesforce data into Databricks.</li><li>Build, test, and maintain high‑volume data pipelines across AWS + Databricks.</li><li>Perform data migration, data integration, and pipeline development (including MuleSoft-related work).</li><li>Ensure all pipelines are reliable, scalable, and optimized for production.</li></ul><p><strong>Development & Infrastructure</strong></p><ul><li>Use Python and PySpark to build ETL components and transformation logic.</li><li>Leverage Spark/PySpark for distributed processing at scale (must‑have).</li><li>Use Terraform to provision and manage cloud infrastructure.</li><li>Set up CI/CD pipelines using Concourse or GitHub Actions for automated deployments.</li></ul><p><strong>Quality, Documentation & Support</strong></p><ul><li>Document ETL processes, pipelines, and data flows.</li><li>Participate in testing, QA, and validation of migrated datasets.</li><li>Provide post‑delivery support and proactively mitigate project risks or single points of failure (SPOFs).</li><li>Troubleshoot production issues and implement long‑term fixes to maintain pipeline stability.</li></ul><p><strong>Collaboration</strong></p><ul><li>Work closely with engineering teammates to translate business requirements into working pipelines.</li><li>Participate in weekly Scrum ceremonies.</li><li>Contribute to shared best practices and 
continuous improvement across the data engineering team.</li></ul><p><br></p>
<p>We are looking for an experienced Database Architect to oversee the management, optimization, and security of our database platforms. This role involves administering both on-premises Microsoft SQL Server and cloud-based systems such as Azure SQL, Microsoft Fabric, and Snowflake. You will ensure the performance, availability, and compliance of enterprise data assets while collaborating with cross-functional teams to support data-driven initiatives across the organization.</p><p><br></p><p>Responsibilities:</p><p>• Manage and maintain database systems, including Microsoft SQL Server and cloud environments such as Azure SQL and Snowflake.</p><p>• Develop and implement backup, restore, and disaster recovery protocols, ensuring alignment with recovery objectives and conducting regular tests.</p><p>• Design high-availability configurations and capacity plans to support organizational data needs.</p><p>• Optimize database performance by monitoring query execution, addressing bottlenecks, and improving resource utilization.</p><p>• Build and maintain data pipelines for ingestion and integration into Snowflake, utilizing tools like Microsoft Data Gateway and secure file transfers.</p><p>• Strengthen database security through role-based access control, encryption, and adherence to data retention policies.</p><p>• Automate routine database tasks using scripts and tools such as PowerShell and Azure SQL.</p><p>• Monitor database platforms using alerts, dashboards, and automated tools to ensure system health and reliability.</p><p>• Support audits and regulatory examinations by providing evidence of database configurations and resolving findings.</p><p>• Collaborate with analytics, infrastructure, and security teams to ensure seamless data connectivity, modeling, and secure vendor integrations.</p>
We are looking for an experienced Database Technology Manager to oversee and optimize database systems, ensuring seamless integration and performance across platforms. This role demands a strong technical background in system administration and database technologies. As a long-term contract position, it offers an excellent opportunity for individuals seeking stability and growth. Join us in Philadelphia, Pennsylvania, where you will play a pivotal role in managing critical technological resources.<br><br>Responsibilities:<br>• Manage and maintain database systems to ensure optimal functionality and performance.<br>• Administer Active Directory and ensure its seamless integration with other platforms.<br>• Oversee Citrix technologies to enhance remote access and system efficiency.<br>• Provide desktop administration support, addressing hardware and software issues.<br>• Troubleshoot and resolve computer hardware challenges to minimize downtime.<br>• Deliver remote desktop support to ensure continuous access for end-users.<br>• Manage Blackbaud systems, ensuring data integrity and accessibility.<br>• Develop and implement strategies to improve database security and reliability.<br>• Collaborate with cross-functional teams to align database solutions with organizational goals.<br>• Document processes and maintain accurate records for compliance and future reference.
We are looking for a skilled Software Engineer to join our team in Bethlehem, Pennsylvania. This role involves designing and optimizing data systems, managing tools for data orchestration, and ensuring secure and efficient operations. The ideal candidate will thrive in a collaborative environment while delivering impactful solutions for business intelligence and operations.<br><br>Responsibilities:<br>• Build and manage data orchestration tools, including creating variables, setting notifications, and configuring retries.<br>• Optimize Snowflake performance by adjusting warehouse sizing, clustering, and profiling queries.<br>• Schedule and oversee near real-time data loads using Snowflake Tasks and Streams.<br>• Implement rigorous data quality checks such as verifying freshness, row counts, and referential integrity.<br>• Monitor and control costs through usage dashboards and guardrails.<br>• Ensure secure operations by maintaining roles, managing secrets, and auditing logs.<br>• Develop and monitor Power BI datasets to support Finance and Operations teams.<br>• Collaborate with stakeholders to gather requirements and deliver tailored solutions.<br>• Enhance and maintain front-end data applications using tools like Streamlit and Python.<br>• Create detailed documentation, including runbooks, root cause analyses, and change tickets for releases.
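<p>The data quality checks described above (freshness, row counts, referential integrity) can be sketched in plain Python. This is an illustrative sketch only: the thresholds, and the idea of comparing today's count against yesterday's, are assumptions rather than the team's actual rules, and in practice the inputs would come from Snowflake metadata queries.</p>

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_loaded_at: datetime, max_age: timedelta) -> bool:
    """Pass only if the most recent load is within the allowed age window."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_age

def check_row_count(current: int, previous: int, max_drop_pct: float = 10.0) -> bool:
    """Pass unless the row count dropped more than max_drop_pct versus the prior load."""
    if previous == 0:
        return current >= 0
    drop_pct = (previous - current) / previous * 100
    return drop_pct <= max_drop_pct

def check_referential_integrity(child_keys: set, parent_keys: set) -> set:
    """Return orphaned child keys that have no matching parent row (empty set = pass)."""
    return child_keys - parent_keys
```

<p>Checks like these are typically wired into the orchestration tool so that a failure triggers the notification and retry configuration mentioned above.</p>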
We are looking for a skilled Cloud Engineer to join our team in Wayne, Pennsylvania. This role requires a deep understanding of cloud technologies, particularly Microsoft Azure and Microsoft 365, as well as expertise in infrastructure and identity management. You will play a key part in ensuring seamless operations and resolving complex technical challenges.<br><br>Responsibilities:<br>• Diagnose and address technical issues across Microsoft Azure, Microsoft 365, Entra ID, Intune, and hybrid or on-premise server environments.<br>• Act as a Tier 3 escalation point to manage and resolve advanced infrastructure, cloud, identity, and networking problems.<br>• Develop and maintain detailed documentation of client systems, including Azure resources, Intune configurations, and support actions.<br>• Collaborate with cross-functional teams to implement and optimize cloud solutions.<br>• Ensure the security and compliance of cloud and on-premise environments.<br>• Provide technical guidance and recommendations to clients regarding best practices in cloud and infrastructure management.<br>• Monitor system performance and proactively identify areas for improvement.<br>• Support the integration of new technologies into existing environments.<br>• Assist in training and mentoring team members who are new to cloud and infrastructure topics.
The Opportunity: Be part of a dynamic team that designs, develops, and optimizes data solutions supporting enterprise-level products across diverse industries. This role provides a clear track to higher-level positions, including Lead Data Engineer and Data Architect, for those who demonstrate vision, initiative, and impact.<br><br>Key Responsibilities:<br>• Design, develop, and optimize relational database objects and data models using Microsoft SQL Server and Snowflake.<br>• Build and maintain scalable ETL/ELT pipelines for batch and streaming data using SSIS and cloud-native solutions.<br>• Integrate and utilize Redis for caching, session management, and real-time analytics.<br>• Develop and maintain data visualizations and reporting solutions using Sigma Computing, SSRS, and other BI tools.<br>• Collaborate across engineering, analytics, and product teams to deliver impactful data solutions.<br>• Ensure data security, governance, and compliance across all platforms.<br>• Participate in Agile Scrum ceremonies and contribute to continuous improvement within the data engineering process.<br>• Support database deployments using DevOps practices, including version control (Git) and CI/CD pipelines (Azure DevOps, Flyway, Octopus, SonarQube).<br>• Troubleshoot and resolve performance, reliability, and scalability issues across the data platform.<br>• Mentor entry-level team members and participate in design/code reviews.
Professional Qualifications:<br>• 5+ years of experience in IT as a Technical Business Analyst/Systems Analyst, creating functional requirements for complex projects within the Insurance P&C domain.<br>• Strong understanding of the underwriting process, including the policy lifecycle, coverages, endorsements, forms, and rating.<br>• Experience working with Underwriting and Claims applications.<br>• Experience in data integration and data quality projects.<br>• Ability to work with technical teams (developers, architects, QA, infrastructure), business users, and software vendors to document requirements on time.<br>• Excellent understanding of how technology impacts the business.<br>• Excellent team player with a proven record of individual contribution.<br><br>Preferred Technical Skills:<br>• Insurance Policy Administration System experience (Duck Creek, Guidewire, etc.)<br>• Understanding of XML and/or JSON formats is a plus<br>• Strong SQL skills for querying SQL databases
We are looking for a skilled and detail-oriented Business Analyst to contribute to the development and improvement of our Global Portal. In this role, you will work closely with diverse teams, including Product Management, IT, Operations, and Business Stakeholders, to create solutions that enhance the digital experience for customers. This is a long-term contract position based in Piscataway, New Jersey.<br><br>Responsibilities:<br>• Analyze and document business, functional, and non-functional requirements by conducting workshops, interviews, and system evaluations.<br>• Develop detailed process maps to identify inefficiencies, gaps, and opportunities for optimization and automation.<br>• Collaborate with cross-functional teams to create and maintain clear use cases and workflow diagrams.<br>• Facilitate alignment among Product Management, Development, QA, and Stakeholders to ensure mutual understanding of priorities and requirements.<br>• Support the testing phase by creating test cases, assisting in execution, and validating results during User Acceptance Testing.<br>• Provide training and knowledge transfer for both internal teams and external customers, ensuring seamless adoption of new features.<br>• Act as a subject matter expert for the Global Portal, ensuring consistency in customer experience across different regions and products.<br>• Partner with change management teams to prepare businesses for new portal enhancements and ensure successful implementation.<br>• Coordinate with Product and Program Managers to monitor project progress, address risks, and manage scope adjustments.
<p>Our team is seeking an experienced Network Engineer to join our growing IT group. In this role, you will support and enhance critical network infrastructure across multiple sites, with a primary focus on Cisco technologies and enterprise network operations. This hands-on position is ideal for technically driven professionals who value service excellence, quality documentation, and continuous learning.</p><p>Key Responsibilities:</p><ul><li>Install, configure, and support network equipment and devices (routers, switches, firewalls) throughout their life cycle.</li><li>Conduct site visits to branch and office locations for installations and troubleshooting as needed.</li><li>Review, audit, and remediate network devices and configurations to ensure compliance with internal and industry security standards.</li><li>Provide backup support and participate in vendor escalations for Cisco unified communications systems (training provided as needed).</li><li>Maintain and update network systems, including regular software and firmware upgrades.</li><li>Create and update technical documentation, diagrams, and departmental procedure runbooks.</li><li>Respond to and resolve support requests from internal teams.</li><li>Analyze, diagnose, and document root causes of network problems; propose and implement corrective actions.</li><li>Interface with third-party vendors for escalations and resolution of complex issues.</li><li>Meet deadlines on daily tasks as well as short- and long-term technology projects.</li><li>Mentor and share technical knowledge with other IT team members.</li><li>Participate in an after-hours on-call rotation for critical incidents and scheduled changes.</li></ul>
We are looking for an experienced Web Developer to join our team in Wilmington, Delaware. The ideal candidate will have a strong background in Python and React, along with a passion for creating scalable and efficient web applications. This role involves close collaboration with cross-functional teams to design, develop, and maintain high-quality digital solutions.<br><br>Responsibilities:<br>• Design and develop web applications using Python frameworks such as Django, Flask, or FastAPI.<br>• Create responsive and dynamic user interfaces with React, ensuring seamless user experiences.<br>• Optimize web applications for performance, scalability, and security.<br>• Write clear, maintainable, and well-documented code to ensure long-term usability.<br>• Diagnose and resolve technical issues, including debugging and upgrading existing systems.<br>• Utilize relational and NoSQL databases such as PostgreSQL, MySQL, or MongoDB for data management.<br>• Implement secure authentication, authorization, and data protection protocols.<br>• Collaborate with team members to integrate APIs and other third-party services.<br>• Stay updated on emerging web technologies and incorporate them into projects as needed.<br>• Participate in code reviews to maintain high development standards.
We are looking for a skilled and dedicated Cyber Security Engineer to join our team in Chesterbrook, Pennsylvania. This contract-to-permanent position involves overseeing information security governance, managing vendor relationships, and mitigating risks to ensure a secure and compliant environment. The ideal candidate will bring hands-on expertise in security practices, coupled with strong analytical and communication skills, to drive the implementation of robust security programs.<br><br>Responsibilities:<br>• Act as the primary liaison with offshore teams to ensure compliance with organizational security policies and standards.<br>• Monitor vendor performance against service level agreements and identify areas for improvement.<br>• Develop and enforce governance practices to align operations with security and compliance requirements.<br>• Collaborate with business units to ensure security measures are integrated into vendor projects.<br>• Conduct assessments to evaluate supplier compliance with confidentiality, integrity, and availability standards.<br>• Provide expert advice on information security, analyzing vulnerabilities and recommending remediation strategies.<br>• Draft and maintain organizational security policies and procedures, ensuring adherence to compliance standards.<br>• Prepare detailed reports on security governance and vulnerabilities for stakeholders and leadership teams.<br>• Facilitate regular risk assessments and vulnerability scans, ensuring timely resolution of findings.<br>• Support special projects and contribute to the continuous improvement of security practices.
<p>Our client is seeking a Financial Data Specialist. The Financial Data Specialist will be responsible for capturing, analyzing, and validating information. This individual will work closely with both internal teams and external market participants, providing high-quality data support and resolving complex inquiries. The ideal candidate is detail-oriented, proactive, and thrives in a fast-paced, data-driven environment.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><p> • Build and maintain strong relationships with internal stakeholders and external market participants, including exchanges, financial institutions, and legal entities</p><p> • Serve as a subject-matter expert (SME) </p><p> • Collaborate cross-functionally to identify dependencies impacting data quality and ensure accurate data delivery across platforms</p><p> • Respond to client inquiries in a timely and professional manner, delivering high-quality support</p><p> • Escalate and communicate data issues or client feedback to appropriate teams to drive improvements</p><p> • Identify opportunities to enhance processes, workflows, and data accuracy</p>
<p>Are you passionate about next-generation data engineering, AI, and modern cloud technologies? Our company is seeking an innovative and driven Snowflake Solutions Engineer to join our IT team in a fully remote capacity. In this role, you will lead the design and implementation of advanced Snowflake-native applications and AI-powered data solutions, creating measurable business impact utilizing Snowflake’s latest platform features. This is an exceptional opportunity to work at the forefront of data, leveraging Streamlit, Cortex AI, and emerging Snowflake technologies.</p><p><strong>Key Responsibilities:</strong></p><p><strong>Snowflake Native Application Development (30%)</strong></p><ul><li>Design and build interactive data applications using Snowflake Streamlit to enable intuitive, self-service analytics and operational workflows for business users.</li><li>Develop reusable frameworks and component libraries for rapid application delivery.</li><li>Integrate Snowflake Native Apps and third-party marketplace applications to continuously extend platform capabilities.</li><li>Create custom UDFs and stored procedures to support advanced business logic.</li></ul><p><strong>Data Architecture and Modern Platform Design (30%)</strong></p><ul><li>Develop cutting-edge data architecture solutions spanning data warehousing, data lakes, and lakehouse approaches.</li><li>Implement medallion (bronze-silver-gold) patterns to maintain data quality and governance.</li><li>Recommend optimal architecture patterns for structured analytics, semi-structured data, and AI/ML workloads.</li><li>Establish best practices for data organization, storage optimization, and query performance.</li></ul><p><strong>AI & Advanced Analytics Collaboration (15%)</strong></p><ul><li>Partner with AI/data science teams to support and enhance Snowflake-based AI workloads.</li><li>Enable implementation of Snowflake Cortex AI features for practical business cases.</li><li>Guide data access and feature 
engineering for ML model requirements.</li><li>Contribute platform expertise to AI proof-of-concept initiatives.</li></ul><p><strong>Security, Governance, & Technical Leadership (15%)</strong></p><ul><li>Design and implement RBAC hierarchies, enforcing least privilege principles.</li><li>Define security best practices including network policies and encryption; implement row/column security and data masking.</li><li>Apply tag-based policies for advanced governance.</li><li>Monitor and optimize application performance, cost, and user experience.</li><li>Lead architectural discussions, create technical documentation, and share best practices.</li></ul><p><br></p>
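<p>The medallion (bronze-silver-gold) pattern referenced above amounts to progressive refinement: land raw data untouched, then validate and type it, then aggregate it for business use. A toy, framework-free Python sketch follows; in practice each layer would be a Snowflake table fed by Streams and Tasks, and the record fields here are entirely hypothetical.</p>

```python
# Bronze layer: raw events land as-is, including malformed rows.
bronze = [
    {"order_id": "A1", "amount": "19.99", "region": "east"},
    {"order_id": "A2", "amount": "5.00",  "region": "west"},
    {"order_id": None, "amount": "oops",  "region": "east"},  # bad row
]

def to_silver(rows):
    """Silver layer: validated, typed, deduplicated records."""
    out, seen = [], set()
    for r in rows:
        try:
            oid, amt = r["order_id"], float(r["amount"])
        except (TypeError, ValueError):
            continue  # quarantine malformed rows instead of failing the load
        if oid and oid not in seen:
            seen.add(oid)
            out.append({"order_id": oid, "amount": amt, "region": r["region"]})
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (revenue per region)."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals
```

<p>Keeping the bronze layer immutable is the key design choice: silver and gold can always be rebuilt from it when validation rules or aggregations change.</p>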