We are looking for an experienced Data Analyst to support healthcare initiatives in Philadelphia, Pennsylvania. This is a long-term contract position that requires strong analytical skills and a focus on fraud detection and prevention. The ideal candidate will leverage data-driven insights to enhance decision-making and ensure the integrity of healthcare operations.<br><br>Responsibilities:<br>• Conduct detailed data analyses to identify patterns of suspected fraud and anomalies in healthcare systems.<br>• Develop and implement fraud detection models using advanced analytics tools and techniques.<br>• Collaborate with cross-functional teams to investigate potential fraudulent activities and propose actionable solutions.<br>• Utilize platforms such as Epic and ChartMaxx to extract and analyze data effectively.<br>• Generate comprehensive reports and dashboards to present findings and support decision-making.<br>• Monitor ongoing healthcare operations to ensure compliance with anti-fraud protocols.<br>• Optimize data workflows and processes to enhance efficiency and accuracy.<br>• Stay updated on industry trends and best practices in fraud analytics and healthcare data analysis.<br>• Provide recommendations to improve system integrity and prevent future fraudulent activities.
<p>We are looking for a skilled Data Warehouse Analyst to join our team in New Jersey. In this role, you will transform logistics challenges into actionable insights through advanced data analysis and reporting. By collaborating with cross-functional teams, you will play a pivotal role in enhancing operational efficiency and driving key business decisions.</p><p><br></p><p>Responsibilities:</p><p>• Collaborate with Operations, Transportation, and Finance teams to establish and refine KPIs that drive logistics and fulfillment performance.</p><p>• Develop and optimize labor planning and forecasting models for warehouse and delivery operations, partnering closely with recruitment teams.</p><p>• Analyze distribution and fulfillment data to uncover performance trends and identify cost-saving opportunities.</p><p>• Design and maintain dashboards and reports to provide real-time insights into logistics metrics, including delivery times, warehouse productivity, and route optimization.</p><p>• Automate reporting processes to improve accuracy and timeliness of operational data.</p><p>• Continuously enhance data integrity and streamline workflows to optimize logistics operations.</p><p>• Work on data modeling and warehousing projects to support scalable analytics and reporting solutions.</p><p>• Partner with stakeholders to deliver clear and actionable insights to improve decision-making processes.</p><p>• Investigate and implement tools and techniques to improve overall business intelligence capabilities.</p>
We are looking for a skilled and dedicated Cyber Security Engineer to join our team in Chesterbrook, Pennsylvania. This contract-to-permanent position involves overseeing information security governance, managing vendor relationships, and mitigating risks to ensure a secure and compliant environment. The ideal candidate will bring hands-on expertise in security practices, coupled with strong analytical and communication skills, to drive the implementation of robust security programs.<br><br>Responsibilities:<br>• Act as the primary liaison with offshore teams to ensure compliance with organizational security policies and standards.<br>• Monitor vendor performance against service level agreements and identify areas for improvement.<br>• Develop and enforce governance practices to align operations with security and compliance requirements.<br>• Collaborate with business units to ensure security measures are integrated into vendor projects.<br>• Conduct assessments to evaluate supplier compliance with confidentiality, integrity, and availability standards.<br>• Provide expert advice on information security, analyzing vulnerabilities and recommending remediation strategies.<br>• Draft and maintain organizational security policies and procedures, ensuring adherence to compliance standards.<br>• Prepare detailed reports on security governance and vulnerabilities for stakeholders and leadership teams.<br>• Facilitate regular risk assessments and vulnerability scans, ensuring timely resolution of findings.<br>• Support special projects and contribute to the continuous improvement of security practices.
We are looking for a Desktop Support Analyst to deliver hands-on technical support for employees in New York, New York. This long-term contract position is ideal for someone who enjoys resolving user issues, maintaining reliable workstation performance, and providing responsive service across a fast-paced work environment. The role will support day-to-day desktop operations, assist remote and international teams, and contribute to a consistent, high-quality end-user experience.<br><br>Responsibilities:<br>• Deliver first- and second-line technical assistance for hardware, software, and infrastructure-related incidents and service requests across the organization.<br>• Provide in-person floor support on a rotating schedule, assisting employees directly and ensuring all requests are properly recorded in the service management system.<br>• Take full ownership of assigned tickets from initial intake through final resolution, including user updates, troubleshooting, and timely closure.<br>• Support colleagues in international offices by providing remote assistance that aligns with established service standards and response expectations.<br>• Follow defined escalation procedures to route complex issues appropriately and maintain dependable support delivery.<br>• Investigate recurring technical problems, identify underlying causes, and create clear knowledge documentation for both engineers and end users.<br>• Administer user lifecycle activities such as onboarding, offboarding, account support, and related end-user access tasks.<br>• Configure, maintain, and troubleshoot laptops, desktop hardware, mobile devices, remote access tools, and Windows 10 workstation environments.<br>• Assist with event technology support and coordinate Zoom-based meeting and interview connections with domestic and international participants.<br>• Participate in after-hours on-call coverage and contribute to time-sensitive projects and organization-wide IT communications as needed.
<p>Are you passionate about next-generation data engineering, AI, and modern cloud technologies? Our company is seeking an innovative and driven Snowflake Solutions Engineer to join our IT team in a fully remote capacity. In this role, you will lead the design and implementation of advanced Snowflake-native applications and AI-powered data solutions, creating measurable business impact utilizing Snowflake’s latest platform features. This is an exceptional opportunity to work at the forefront of data, leveraging Streamlit, Cortex AI, and emerging Snowflake technologies.</p><p><strong>Key Responsibilities:</strong></p><p><strong>Snowflake Native Application Development (30%)</strong></p><ul><li>Design and build interactive data applications using Snowflake Streamlit to enable intuitive, self-service analytics and operational workflows for business users.</li><li>Develop reusable frameworks and component libraries for rapid application delivery.</li><li>Integrate Snowflake Native Apps and third-party marketplace applications to continuously extend platform capabilities.</li><li>Create custom UDFs and stored procedures to support advanced business logic.</li></ul><p><strong>Data Architecture and Modern Platform Design (30%)</strong></p><ul><li>Develop cutting-edge data architecture solutions spanning data warehousing, data lakes, and lakehouse approaches.</li><li>Implement medallion (bronze-silver-gold) patterns to maintain data quality and governance.</li><li>Recommend optimal architecture patterns for structured analytics, semi-structured data, and AI/ML workloads.</li><li>Establish best practices for data organization, storage optimization, and query performance.</li></ul><p><strong>AI & Advanced Analytics Collaboration (15%)</strong></p><ul><li>Partner with AI/data science teams to support and enhance Snowflake-based AI workloads.</li><li>Enable implementation of Snowflake Cortex AI features for practical business cases.</li><li>Guide data access and feature 
engineering for ML model requirements.</li><li>Contribute platform expertise to AI proof-of-concept initiatives.</li></ul><p><strong>Security, Governance, & Technical Leadership (15%)</strong></p><ul><li>Design and implement RBAC hierarchies, enforcing least privilege principles.</li><li>Define security best practices including network policies and encryption; implement row/column security and data masking.</li><li>Apply tag-based policies for advanced governance.</li><li>Monitor and optimize application performance, cost, and user experience.</li><li>Lead architectural discussions, create technical documentation, and share best practices.</li></ul><p><br></p>
<p><strong>Data Scientist (Big Data) III – Contractor</strong></p><p><strong>Employment Type:</strong> 27-Week Contract, Potential for Extension or Conversion</p><p><strong>Location: </strong>MUST CURRENTLY RESIDE in Philadelphia Region</p><p><strong>Pay: </strong>Available on W2 </p><p><strong>Position Overview</strong></p><p>The Senior Data Scientist (Big Data) will support large‑scale data science initiatives by designing, developing, and deploying advanced analytical and machine learning solutions. This role collaborates closely with data engineers, analysts, software developers, and business stakeholders to deliver scalable, production‑ready data products that drive data‑informed decision making.</p><p>The successful candidate will apply statistical modeling, machine learning, and big data technologies to solve complex business problems, while also providing technical guidance and mentorship across project teams.</p><p><strong>Key Responsibilities</strong></p><ul><li>Lead complex, cross‑functional data science initiatives delivering solutions across multiple technologies and platforms.</li><li>Design, develop, and deploy data mining, statistical, machine learning, and graph‑based algorithms for large‑scale data sets.</li><li>Partner with data engineering teams to ensure proper implementation, performance, and operational use of analytical solutions.</li><li>Review and assess data science programs and models at an enterprise level to evaluate suitability, performance, and scalability.</li><li>Build and maintain scalable big‑data analytics solutions supporting accurate targeting, forecasting, and advanced insights.</li><li>Develop and support end‑to‑end machine learning pipelines, including data preparation, training, testing, validation, and deployment.</li><li>Establish performance metrics, monitoring, and evaluation procedures for models in production.</li><li>Translate complex analytical findings 
into clear, actionable insights for technical and non‑technical stakeholders.</li><li>Provide mentorship and technical guidance to junior team members.</li><li>Contribute to data strategy, methodology selection, and continuous improvement of analytics capabilities.</li><li>Support testing, validation, and user acceptance activities to ensure alignment with business requirements.</li><li>Perform additional related duties as needed to support analytics and data initiatives.</li></ul>
<p>We are looking for a Data Engineer to join a team focused on building reliable, scalable data solutions. In this role, you will create and enhance cloud-based data pipelines, organize data for analytics, and help ensure that business teams have access to trusted information. This position also partners closely with technical and non-technical stakeholders to turn reporting and data needs into practical engineering outcomes.</p><p><br></p><p>Responsibilities:</p><p>• Create and support scalable data ingestion and transformation workflows using Azure Data Factory, Databricks, and PySpark.</p><p>• Connect and consolidate data from enterprise platforms, operational databases, telematics feeds, APIs, and other internal or external sources.</p><p>• Structure and manage data within Azure Data Lake and lakehouse environments to support performance, accessibility, and long-term maintainability.</p><p>• Design curated datasets, data models, and schemas that improve usability for analytics, business intelligence, and downstream reporting.</p><p>• Apply governance and lineage practices through Unity Catalog while promoting strong data quality, consistency, and security standards.</p><p>• Work with business stakeholders and cross-functional teams to gather requirements, define technical specifications, and deliver data solutions aligned with operational needs.</p><p>• Improve pipeline stability and efficiency by troubleshooting failures, resolving performance issues, and refining storage and query strategies.</p><p>• Support Power BI reporting by preparing datasets, assisting with model improvements, and helping maintain reporting standards and governance practices.</p><p>• Use GitHub-based development practices for version control, peer review, CI/CD, and disciplined deployment processes.</p><p>• Mentor less-experienced engineers and contribute to a collaborative environment focused on continuous improvement and dependable delivery.</p>
<p>We are seeking a skilled and motivated Data Engineer to join our team, with deep hands-on experience building and optimizing data pipelines and lakehouse solutions in Databricks. In this role, you will collaborate with cross-functional teams to design, develop, and operate scalable, reliable data products that drive business value.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain batch and streaming data pipelines using Databricks (Spark, Delta Lake, Jobs/Workflows).</li><li>Partner with data scientists, analysts, and application teams to deliver trusted, well-modeled data sets and features in the Databricks Lakehouse.</li><li>Optimize Spark jobs (partitioning, caching, join strategies) and Databricks cluster configurations for performance, scalability, and cost.</li><li>Implement data quality checks, observability, governance, and security controls (e.g., Unity Catalog, access policies) within Databricks.</li><li>Troubleshoot and resolve pipeline failures, data issues, and production incidents; perform root-cause analysis and implement preventative improvements.</li></ul><p><br></p>