We are looking for an experienced Data Engineer to join our team in Newtown Square, Pennsylvania. In this long-term contract position, you will play a pivotal role in designing and implementing robust data solutions to support organizational goals. This is an exciting opportunity to lead the development of modern data architectures and collaborate with diverse teams to drive impactful results.<br><br>Responsibilities:<br>• Lead the implementation of an enterprise Snowflake data lake, ensuring timely delivery and optimal performance.<br>• Oversee the integration of multiple data sources, including Oracle Financials, PostgreSQL, and Salesforce, into a unified data platform.<br>• Collaborate with finance teams to facilitate a transition to a 12-month accounting calendar and support accelerated financial close processes.<br>• Develop and maintain multi-source analytics dashboards to enhance operational insights and decision-making.<br>• Manage day-to-day operations of the Snowflake platform, focusing on performance tuning and cost optimization.<br>• Ensure data quality and reliability, providing business users with a trustworthy platform.<br>• Document architectural designs, data workflows, and operational procedures to support sustainable data management.<br>• Coordinate with external vendors to meet project deadlines and ensure successful implementations.
<p><strong>Senior Data Engineer</strong></p><p><strong>Location:</strong> Philadelphia, PA (Hybrid/Onsite as required)</p><p><strong>Employment Type: </strong>39 Week Contract, Potential for Extension</p><p><strong>Position Overview</strong></p><p>We are seeking an experienced <strong>Data Engineer</strong> to support the development and ongoing operation of a large-scale, cloud-based IoT platform. This role focuses on building and supporting scalable, secure, and high‑performance infrastructure, tooling, and frameworks that enable engineering teams to efficiently develop, test, deploy, and operate modern microservices.</p><p>The ideal candidate brings strong cloud engineering experience, a passion for quality and security, and the ability to collaborate in a fast‑paced Agile environment.</p><p><strong>Key Responsibilities</strong></p><ul><li>Develop, operate, and support DevOps and platform engineering tools that enable cloud-based IoT services</li><li>Build and promote horizontal tools, frameworks, and best practices supporting microservices, CI/CD, security, monitoring, and performance</li><li>Collaborate with engineering teams to define development standards, workflows, and methodologies</li><li>Design and implement shared libraries and frameworks to support scalable and highly available systems</li><li>Support production platform operations, troubleshooting, and continuous improvement with focus on quality, performance, and security</li><li>Translate system architecture and product requirements into well-designed, tested software solutions</li><li>Work in an Agile environment delivering incremental, high-quality software</li><li>Provide technical guidance and promote modern engineering practices across teams</li></ul>
We are looking for an experienced Data Analyst to support healthcare initiatives in Philadelphia, Pennsylvania. This is a long-term contract position that requires strong analytical skills and a focus on fraud detection and prevention. The ideal candidate will leverage data-driven insights to enhance decision-making and ensure the integrity of healthcare operations.<br><br>Responsibilities:<br>• Conduct detailed data analyses to identify patterns of suspected fraud and anomalies in healthcare systems.<br>• Develop and implement fraud detection models using advanced analytics tools and techniques.<br>• Collaborate with cross-functional teams to investigate potential fraudulent activities and propose actionable solutions.<br>• Utilize platforms such as Epic and ChartMaxx to extract and analyze data effectively.<br>• Generate comprehensive reports and dashboards to present findings and support decision-making.<br>• Monitor ongoing healthcare operations to ensure compliance with anti-fraud protocols.<br>• Optimize data workflows and processes to enhance efficiency and accuracy.<br>• Stay updated on industry trends and best practices in fraud analytics and healthcare data analysis.<br>• Provide recommendations to improve system integrity and prevent future fraudulent activities.
<p>We are looking for a skilled Data Warehouse Analyst to join our team in New Jersey. In this role, you will transform logistics challenges into actionable insights through advanced data analysis and reporting. By collaborating with cross-functional teams, you will play a pivotal role in enhancing operational efficiency and driving key business decisions.</p><p><br></p><p>Responsibilities:</p><p>• Collaborate with Operations, Transportation, and Finance teams to establish and refine KPIs that drive logistics and fulfillment performance.</p><p>• Develop and optimize labor planning and forecasting models for warehouse and delivery operations, partnering closely with recruitment teams.</p><p>• Analyze distribution and fulfillment data to uncover performance trends and identify cost-saving opportunities.</p><p>• Design and maintain dashboards and reports to provide real-time insights into logistics metrics, including delivery times, warehouse productivity, and route optimization.</p><p>• Automate reporting processes to improve accuracy and timeliness of operational data.</p><p>• Continuously enhance data integrity and streamline workflows to optimize logistics operations.</p><p>• Work on data modeling and warehousing projects to support scalable analytics and reporting solutions.</p><p>• Partner with stakeholders to deliver clear and actionable insights to improve decision-making processes.</p><p>• Investigate and implement tools and techniques to improve overall business intelligence capabilities.</p>
<p><strong>Overview</strong></p><p>We are seeking a Senior Data Engineer to support a major Salesforce Phase 2 data migration initiative. This role will focus heavily on building and optimizing data pipelines, developing ETL workflows, and moving CRM data from Salesforce into Databricks.</p><p>The engineer will work closely with a senior team member, contribute to Scrum ceremonies, and play a key role in developing the core CRM data environment used by the advertising organization.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Data Engineering & Migration</strong></p><ul><li>Develop ETL jobs that move and transform Salesforce data into Databricks.</li><li>Build, test, and maintain high‑volume data pipelines across AWS and Databricks.</li><li>Perform data migration, data integration, and pipeline development (including MuleSoft-related work).</li><li>Ensure all pipelines are reliable, scalable, and optimized for production.</li></ul><p><strong>Development & Infrastructure</strong></p><ul><li>Use Python and PySpark to build ETL components and transformation logic.</li><li>Leverage Spark/PySpark for distributed processing at scale (must‑have).</li><li>Use Terraform to provision and manage cloud infrastructure.</li><li>Set up CI/CD pipelines using Concourse or GitHub Actions for automated deployments.</li></ul><p><strong>Quality, Documentation & Support</strong></p><ul><li>Document ETL processes, pipelines, and data flows.</li><li>Participate in testing, QA, and validation of migrated datasets.</li><li>Provide post‑delivery support and proactively mitigate project risks or single points of failure (SPOF).</li><li>Troubleshoot production issues and implement long‑term fixes to maintain pipeline stability.</li></ul><p><strong>Collaboration</strong></p><ul><li>Work closely with engineering teammates to translate business requirements into working pipelines.</li><li>Participate in weekly Scrum ceremonies.</li><li>Contribute to shared best practices and 
continuous improvement across the data engineering team.</li></ul><p><br></p>
The Opportunity:<br>Be part of a dynamic team that designs, develops, and optimizes data solutions supporting enterprise-level products across diverse industries. This role provides a clear track to higher-level positions, including Lead Data Engineer and Data Architect, for those who demonstrate vision, initiative, and impact.<br><br>Key Responsibilities:<br>• Design, develop, and optimize relational database objects and data models using Microsoft SQL Server and Snowflake.<br>• Build and maintain scalable ETL/ELT pipelines for batch and streaming data using SSIS and cloud-native solutions.<br>• Integrate and utilize Redis for caching, session management, and real-time analytics.<br>• Develop and maintain data visualizations and reporting solutions using Sigma Computing, SSRS, and other BI tools.<br>• Collaborate across engineering, analytics, and product teams to deliver impactful data solutions.<br>• Ensure data security, governance, and compliance across all platforms.<br>• Participate in Agile Scrum ceremonies and contribute to continuous improvement within the data engineering process.<br>• Support database deployments using DevOps practices, including version control (Git) and CI/CD pipelines (Azure DevOps, Flyway, Octopus, SonarQube).<br>• Troubleshoot and resolve performance, reliability, and scalability issues across the data platform.<br>• Mentor entry-level team members and participate in design/code reviews.
<p><strong>Machine Learning Engineer II </strong>(Contract)</p><p><strong>Location: </strong>Hybrid – Philadelphia, PA <strong>or </strong>Washington, DC | Onsite 3–4 days per week</p><p><strong>Assignment Length:</strong> 38 Weeks, Potential for Extension</p><p><strong>Position Overview</strong></p><p>We are seeking a <strong>Research / Machine Learning Engineer II</strong> to support advanced AI/ML initiatives across large-scale consumer-facing platforms, including Search, Browse, Personalization, Campaign Management, and Voice/NLP technologies. This role is heavily focused on <strong>model validation, quality, and automation</strong>, with an emphasis on building and enhancing machine learning models that validate and support AI-driven tools developed by engineering teams.</p><p>The ideal candidate brings a strong quality mindset, hands-on experience with machine learning models, and a deep interest in testing and validating large language models (LLMs) and AI systems in production-like environments.</p><p><strong>Key Responsibilities</strong></p><ul><li>Design, build, and enhance <strong>machine learning models primarily used for validation and quality assurance</strong> of AI/ML-driven tools.</li><li>Develop models that assist in testing, validating, and improving automation frameworks used by engineering and tooling teams.</li><li>Enhance and support existing AI/ML automation tools, including those working with <strong>speech and NLP data</strong>.</li><li>Implement <strong>prompt-based interactions with Large Language Models (LLMs)</strong> to support validation and test use cases.</li><li>Research, evaluate, and experiment with various ML models across multiple domains to determine best-fit solutions.</li><li>Contribute software development efforts toward <strong>proof-of-concept initiatives</strong> in AI/ML, NLP, and related strategic areas (e.g., Computer Vision where applicable).</li><li>Collaborate closely with cross-functional engineering, tooling, 
and SDET teams across multiple locations.</li><li>Support and mentor engineering teams by promoting modern software development, data practices, and quality-driven AI development.</li><li>Ensure AI/ML solutions meet expectations for <strong>performance, reliability, scalability, and product quality</strong>.</li></ul>
Professional Qualifications:<br>• 5+ years of IT experience as a Technical Business Analyst/Systems Analyst, including creating functional requirements for complex projects within the Insurance P&C domain.<br>• Strong understanding of the underwriting process, including the policy lifecycle, coverages, endorsements, forms, rating, etc.<br>• Experience working with Underwriting and Claims applications.<br>• Experience with data integration and data quality projects.<br>• Ability to work with technical teams (developers, architects, QA, infrastructure), business users, and software vendors to document requirements on time.<br>• Excellent understanding of how technology impacts the business.<br>• Excellent team player with a proven record of individual contribution.<br><br>Preferred Technical Skills:<br>• Insurance Policy Administration System experience (Duck Creek, Guidewire, etc.)<br>• Understanding of XML and/or JSON formats is a plus<br>• Strong SQL skills for querying SQL databases
<p>Our team is seeking an experienced Network Engineer to join our growing IT group. In this role, you will support and enhance critical network infrastructure across multiple sites, with a primary focus on Cisco technologies and enterprise network operations. This hands-on position is ideal for technically driven professionals who value service excellence, quality documentation, and continuous learning.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Install, configure, and support network equipment and devices (routers, switches, firewalls) throughout their life cycle.</li><li>Conduct site visits to branch and office locations for installations and troubleshooting as needed.</li><li>Review, audit, and remediate network devices and configurations to ensure compliance with internal and industry security standards.</li><li>Provide backup support and participate in vendor escalations for Cisco unified communications systems (training provided as needed).</li><li>Maintain and update network systems, including regular software and firmware upgrades.</li><li>Create and update technical documentation, diagrams, and departmental procedure runbooks.</li><li>Respond to and resolve support requests from internal teams.</li><li>Analyze, diagnose, and document root causes of network problems; propose and implement corrective actions.</li><li>Interface with third-party vendors for escalation and resolution of complex issues.</li><li>Meet deadlines on daily tasks as well as short- and long-term technology projects.</li><li>Mentor and share technical knowledge with other IT team members.</li><li>Participate in after-hours on-call rotation for critical incidents and scheduled changes.</li></ul>
<p>Robert Half is seeking a skilled Network Systems Engineer to join our IT team and provide comprehensive computer and network support across the US. This role’s core focus is managing and maintaining the Enterprise Network Infrastructure, while also delivering high-level (Tier-2 and Tier-3) technical services to the organization.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Assist senior engineers with installation, maintenance, administration, and upgrades of Enterprise Network Infrastructure platforms.</li><li>Monitor network systems and data centers to ensure top performance and reliability.</li><li>Troubleshoot, analyze, and resolve system outages, collaborating with vendors as needed.</li><li>Work with vendors, clients, carriers, and technical teams on network implementation, optimization, security, and ongoing management.</li><li>Develop and help administer preventative maintenance programs, business continuity, and disaster recovery plans; document and test these protocols regularly.</li><li>Ensure adherence to internal and industry design, programming, and application standards.</li><li>Manage support ticket queues, resolving issues efficiently and with excellent service.</li><li>Availability to work weekends and after-hours for operational issues, patch cycles, and critical updates as required.</li><li>Perform additional duties and projects as assigned.</li></ul><p><br></p>
We are looking for a skilled and dedicated Cyber Security Engineer to join our team in Chesterbrook, Pennsylvania. This contract-to-permanent position involves overseeing information security governance, managing vendor relationships, and mitigating risks to ensure a secure and compliant environment. The ideal candidate will bring hands-on expertise in security practices, coupled with strong analytical and communication skills, to drive the implementation of robust security programs.<br><br>Responsibilities:<br>• Act as the primary liaison with offshore teams to ensure compliance with organizational security policies and standards.<br>• Monitor vendor performance against service level agreements and identify areas for improvement.<br>• Develop and enforce governance practices to align operations with security and compliance requirements.<br>• Collaborate with business units to ensure security measures are integrated into vendor projects.<br>• Conduct assessments to evaluate supplier compliance with confidentiality, integrity, and availability standards.<br>• Provide expert advice on information security, analyzing vulnerabilities and recommending remediation strategies.<br>• Draft and maintain organizational security policies and procedures, ensuring adherence to compliance standards.<br>• Prepare detailed reports on security governance and vulnerabilities for stakeholders and leadership teams.<br>• Facilitate regular risk assessments and vulnerability scans, ensuring timely resolution of findings.<br>• Support special projects and contribute to the continuous improvement of security practices.
<p>Our client is seeking a Financial Data Specialist. The Financial Data Specialist will be responsible for capturing, analyzing, and validating information. This individual will work closely with both internal teams and external market participants, providing high-quality data support and resolving complex inquiries. The ideal candidate is detail-oriented, proactive, and thrives in a fast-paced, data-driven environment.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Build and maintain strong relationships with internal stakeholders and external market participants, including exchanges, financial institutions, and legal entities</li><li>Serve as a subject-matter expert (SME)</li><li>Collaborate cross-functionally to identify dependencies impacting data quality and ensure accurate data delivery across platforms</li><li>Respond to client inquiries in a timely and professional manner, delivering high-quality support</li><li>Escalate and communicate data issues or client feedback to appropriate teams to drive improvements</li><li>Identify opportunities to enhance processes, workflows, and data accuracy</li></ul>
<p>Are you passionate about next-generation data engineering, AI, and modern cloud technologies? Our company is seeking an innovative and driven Snowflake Solutions Engineer to join our IT team in a fully remote capacity. In this role, you will lead the design and implementation of advanced Snowflake-native applications and AI-powered data solutions, creating measurable business impact utilizing Snowflake’s latest platform features. This is an exceptional opportunity to work at the forefront of data, leveraging Streamlit, Cortex AI, and emerging Snowflake technologies.</p><p><strong>Key Responsibilities:</strong></p><p><strong>Snowflake Native Application Development (30%)</strong></p><ul><li>Design and build interactive data applications using Snowflake Streamlit to enable intuitive, self-service analytics and operational workflows for business users.</li><li>Develop reusable frameworks and component libraries for rapid application delivery.</li><li>Integrate Snowflake Native Apps and third-party marketplace applications to continuously extend platform capabilities.</li><li>Create custom UDFs and stored procedures to support advanced business logic.</li></ul><p><strong>Data Architecture and Modern Platform Design (30%)</strong></p><ul><li>Develop cutting-edge data architecture solutions spanning data warehousing, data lakes, and lakehouse approaches.</li><li>Implement medallion (bronze-silver-gold) patterns to maintain data quality and governance.</li><li>Recommend optimal architecture patterns for structured analytics, semi-structured data, and AI/ML workloads.</li><li>Establish best practices for data organization, storage optimization, and query performance.</li></ul><p><strong>AI & Advanced Analytics Collaboration (15%)</strong></p><ul><li>Partner with AI/data science teams to support and enhance Snowflake-based AI workloads.</li><li>Enable implementation of Snowflake Cortex AI features for practical business cases.</li><li>Guide data access and feature 
engineering for ML model requirements.</li><li>Contribute platform expertise to AI proof-of-concept initiatives.</li></ul><p><strong>Security, Governance, & Technical Leadership (15%)</strong></p><ul><li>Design and implement RBAC hierarchies, enforcing least privilege principles.</li><li>Define security best practices including network policies and encryption; implement row/column security and data masking.</li><li>Apply tag-based policies for advanced governance.</li><li>Monitor and optimize application performance, cost, and user experience.</li><li>Lead architectural discussions, create technical documentation, and share best practices.</li></ul><p><br></p>