<p>Robert Half has a brand-new opening for a Data Engineer with a reputable client here in Tampa.</p><p>Full-time position, HYBRID schedule out of their Tampa office.</p><p>Compensation ranges from $100K to $115K depending on experience.</p><p>*Medical benefits are also 100% covered after the onboarding period*</p><p><br></p><p>Data Engineer (BI/ETL) focused on building and optimizing ETL/ELT pipelines, migrating and cleaning data between internal, vendor, and legacy systems, and improving data quality. SQL is absolutely required, and this role leans heavily into backend data movement — not dashboarding.</p><p><br></p><p><strong>Top Skills Looking For:</strong></p><ul><li>Strong <strong>SQL </strong>(non-negotiable)</li><li>Experience designing and maintaining <strong>ETL / ELT pipelines</strong> using frameworks such as <strong>Apache Airflow, dbt (Data Build Tool), or equivalent orchestration systems</strong>, with the ability to schedule, monitor, and recover complex multi-stage jobs.</li><li><strong>Experience moving data across multiple systems</strong></li></ul><p>Description:</p><p>Build and maintain business intelligence solutions spanning law enforcement, detention, human resources, finance, and the integration of data from agency criminal justice partners.</p><p>• Design and develop BI solutions.</p><p>• Gather user requirements, develop technical and functional requirements, produce reporting solutions, and document the design and development process, metadata, and business rules.</p><p>• Model, implement, and maintain databases and data marts to support BI reporting.</p><p>• Develop extract, transform, load (ETL) processes to support the loading of data into data marts.</p><p>• Monitor the data quality of existing databases and data marts and recommend governance and controls around self-service BI/analytics, keeping pace with the BI industry's best practices.</p><p>• Perform other related duties as required.</p>
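<p>To make the "backend data movement" this role centers on concrete, here is a minimal sketch of a staging-to-target load with de-duplication and an upsert, using only Python's standard library (the real role would use Airflow/dbt orchestration; all table and column names here are hypothetical):</p>

```python
import sqlite3

# Minimal extract-transform-load sketch; table and column names are hypothetical.
def run_etl(conn):
    conn.execute("CREATE TABLE IF NOT EXISTS staging_orders (id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)")
    # Extract: rows arrive in a staging table from an upstream/legacy system.
    conn.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                     [(1, 10.0), (2, 5.5), (2, 5.5)])  # note the duplicate row
    # Transform + load: drop nulls, de-duplicate, and upsert into the target.
    conn.execute("""
        INSERT INTO orders (id, amount)
        SELECT DISTINCT id, amount FROM staging_orders WHERE id IS NOT NULL
        ON CONFLICT(id) DO UPDATE SET amount = excluded.amount
    """)
    conn.execute("DELETE FROM staging_orders")  # clear staging after a successful load
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

<p>An orchestrator such as Airflow would schedule this step, retry it on failure, and chain it with downstream jobs; the data-movement logic itself stays in SQL, as the posting emphasizes.</p>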
We are looking for a skilled Data Engineer to join our team in Tampa, Florida. This is a Contract to permanent position, offering an excellent opportunity to contribute to innovative business intelligence solutions while advancing your career. The ideal candidate will have a strong background in data engineering, database design, and analytics, with the ability to solve complex problems and deliver high-quality results.<br><br>Responsibilities:<br>• Design and implement robust business intelligence solutions tailored to meet organizational needs.<br>• Collaborate with stakeholders to gather user requirements and translate them into technical and functional specifications.<br>• Create and maintain databases and data marts that support analytics and reporting activities.<br>• Develop and optimize ETL processes to efficiently load data into data marts.<br>• Monitor and ensure the accuracy, consistency, and quality of data within databases and reporting systems.<br>• Recommend and implement governance practices to improve self-service BI and analytics capabilities.<br>• Develop automated data validation checks to maintain data integrity and accuracy.<br>• Utilize dimensional modeling and star/snowflake schemas to design effective data warehouses.<br>• Troubleshoot and debug issues across application and database layers to ensure smooth operations.<br>• Perform exploratory data analysis to identify trends, anomalies, and areas for improvement.
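<p>The "dimensional modeling and star/snowflake schemas" bullet above can be illustrated with a tiny star schema: one fact table keyed to a dimension table and aggregated through a join. This is only a sketch with hypothetical table names, using Python's built-in SQLite:</p>

```python
import sqlite3

# Hypothetical star schema: a fact table joined to a single dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (product_key INTEGER REFERENCES dim_product,
                              qty INTEGER, revenue REAL);
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO fact_sales VALUES (1, 3, 30.0), (1, 1, 10.0), (2, 2, 50.0);
""")
rows = conn.execute("""
    SELECT d.name, SUM(f.revenue)          -- aggregate facts by dimension attribute
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
# rows -> [('gadget', 50.0), ('widget', 40.0)]
```

<p>A snowflake schema simply normalizes the dimension further (e.g., splitting product category into its own table); the fact-to-dimension join pattern stays the same.</p>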
<p>I’m building a world-class team to power our next generation of data products. We’re looking for a Senior Data Engineer who knows AWS inside and out—someone who can <strong>design secure, scalable data pipelines</strong>, <strong>own ETL/ELT workflows</strong>, <strong>engineer cloud data infrastructure</strong>, and <strong>deliver dimensional and semantic models</strong> that our analysts, data scientists, and applications can trust.</p><p>You’ll work closely with product, security, platform engineering, and analytics to move our architecture toward a <strong>real-time, governed, cost-aware</strong>, and <strong>highly automated</strong> data ecosystem.</p><p><strong>What You’ll Do</strong></p><ul><li><strong>Design & build end-to-end pipelines</strong> on AWS (batch and streaming) using services like <strong>Glue, EMR, Lambda, Step Functions, Kinesis, MSK</strong>, and <strong>Fargate</strong>.</li><li><strong>Develop robust ETL/ELT</strong> (PySpark, Spark SQL, SQL, Python) for structured, semi-structured, and unstructured data at scale.</li><li><strong>Own data storage & processing layers</strong>: <strong>S3 (Lake/Lakehouse), Redshift (or Snowflake on AWS), DynamoDB</strong>, and <strong>Athena</strong> with strong partitioning, compaction, and performance tuning.</li><li><strong>Implement data models</strong> (3NF, dimensional/star, Data Vault, Lakehouse medallion) for analytics and operational workloads.</li><li><strong>Engineer secure infrastructure-as-code</strong> with <strong>Terraform</strong> (or <strong>CDK</strong>) across multi-account setups; implement CI/CD via <strong>GitHub Actions</strong> or <strong>AWS CodeBuild/CodePipeline</strong>.</li><li><strong>Harden security & governance</strong>: use <strong>IAM</strong>, <strong>Lake Formation</strong>, <strong>KMS</strong>, <strong>Secrets Manager</strong>, <strong>VPC/PrivateLink</strong>, <strong>Glue Catalog</strong>, and fine-grained access controls. 
Partner with SecOps on compliance (e.g., <strong>SOC 2</strong>, <strong>FedRAMP</strong>, <strong>HIPAA</strong> depending on dataset).</li><li><strong>Observability & reliability</strong>: build monitoring with <strong>CloudWatch</strong>, <strong>OpenTelemetry</strong>, and data quality checks (e.g., <strong>Great Expectations</strong>, <strong>Deequ</strong>), implement SLOs and alerts.</li><li><strong>Champion best practices</strong>: code reviews, testing (unit/integration), documentation, runbooks, and blameless postmortems.</li><li><strong>Mentor</strong> mid-level engineers and collaborate on architectural decisions, standards, and technical roadmaps.</li></ul><p><br></p>
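<p>The "data quality checks (e.g., Great Expectations, Deequ)" bullet above refers to declarative, expectation-style validation. As a rough illustration of the pattern only (the real tools offer far richer check suites; the column names below are hypothetical):</p>

```python
# Expectation-style data-quality checks, sketched with the standard library only.
def expect_not_null(rows, column):
    bad = [r for r in rows if r.get(column) is None]
    return {"check": f"{column} not null", "passed": not bad, "failures": len(bad)}

def expect_unique(rows, column):
    values = [r[column] for r in rows if r.get(column) is not None]
    return {"check": f"{column} unique", "passed": len(values) == len(set(values)),
            "failures": len(values) - len(set(values))}

def validate(rows, checks):
    results = [check(rows) for check in checks]
    # A production pipeline would emit these results to monitoring
    # (e.g., CloudWatch) and alert when any check fails its SLO.
    return all(r["passed"] for r in results), results

records = [{"id": 1, "email": "a@x.io"},
           {"id": 2, "email": None},
           {"id": 2, "email": "b@x.io"}]
ok, results = validate(records, [
    lambda rows: expect_not_null(rows, "email"),
    lambda rows: expect_unique(rows, "id"),
])
# ok -> False: one null email and one duplicate id
```

<p>Great Expectations and Deequ generalize exactly this shape — named, reusable assertions over datasets with machine-readable pass/fail results that feed alerting.</p>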
We are looking for a talented Data Engineer to join our team in Grand Rapids, Michigan. In this role, you will focus on designing, building, and optimizing robust data solutions using Snowflake and other cloud-based technologies. You will work closely with business intelligence and analytics teams to deliver scalable, high-performance data pipelines that support organizational goals.<br><br>Responsibilities:<br>• Design and implement scalable data models, schemas, and tables within Snowflake, including staging, integration, and presentation layers.<br>• Develop and optimize data pipelines using Snowflake tools such as Snowpipe, Streams, Tasks, and stored procedures.<br>• Ensure data security and access through role-based controls and best practices for data sharing.<br>• Build and maintain ETL pipelines leveraging tools like dbt, Matillion, Fivetran, Informatica, or Azure-native solutions.<br>• Integrate data from diverse sources such as APIs, IoT devices, and NoSQL databases to create unified datasets.<br>• Enhance performance by utilizing clustering, partitioning, caching, and efficient warehouse sizing strategies.<br>• Collaborate with cloud technologies such as AWS, Azure, or Google Cloud to support Snowflake infrastructure and operations.<br>• Implement automated workflows and CI/CD processes for seamless deployment of data solutions.<br>• Maintain high standards for data accuracy, completeness, and reliability while supporting governance and documentation.<br>• Work closely with analytics, reporting, and business teams to troubleshoot issues and deliver scalable solutions.
We are looking for a skilled Data Engineer to join our team in Wayne, Pennsylvania, on a contract to permanent basis. This role offers an exciting opportunity to design, implement, and optimize data pipelines while integrating applications with various digital marketplaces. The ideal candidate will bring strong technical expertise and a collaborative mindset to support business insights and analytics effectively.<br><br>Responsibilities:<br>• Develop and maintain data pipelines and ensure seamless application connectivity with digital marketplaces such as TikTok Shop, Shopify, and Amazon.<br>• Collaborate closely with business teams to understand requirements and provide actionable analytics.<br>• Lead the creation of scalable and efficient data solutions tailored to business needs.<br>• Apply expertise in Python, Snowflake, and other relevant technologies to deliver high-quality results.<br>• Facilitate and support integrations with e-commerce platforms, leveraging previous experience where applicable.<br>• Build robust APIs and ensure their effective implementation.<br>• Utilize Microsoft SQL for database management and optimization.<br>• Provide technical guidance and mentorship to ensure project success.<br>• Troubleshoot and resolve issues related to data workflows and integrations.<br>• Continuously evaluate and improve processes to enhance efficiency and performance.
<p>Robert Half is hiring! We are looking for an experienced Data Engineer to join our team in Greenville, South Carolina. This role offers an exciting opportunity to work with modern data technologies, ensuring the efficient operation and optimization of data pipelines and systems. The ideal candidate will bring a strong technical background, leadership skills, and a proactive approach to maintaining and improving data infrastructure.</p><p><br></p><p>Responsibilities:</p><p>• Oversee daily data loads and ensure the smooth operation of data pipelines and related systems.</p><p>• Troubleshoot and resolve issues such as pipeline failures, performance bottlenecks, schema mismatches, and cloud resource disruptions.</p><p>• Conduct root-cause analyses and implement permanent solutions to prevent recurring issues.</p><p>• Maintain and optimize existing data processes, refactoring or retiring outdated workflows as necessary.</p><p>• Design and build scalable data ingestion pipelines using technologies such as Azure Data Factory, Databricks, and Synapse Pipelines.</p><p>• Collaborate with teams to create and improve operational runbooks, monitoring dashboards, and incident response workflows.</p><p>• Develop reusable ingestion patterns for platforms like Guidewire DataHub, InfoCenter, and other business data sources.</p><p>• Lead the implementation of real-time and event-driven data engineering solutions to enable operational insights and automation.</p><p>• Partner with architects to modernize data workloads using advanced frameworks like Delta Lake and Medallion Architecture.</p><p>• Mentor entry-level engineers, enforce coding best practices, and review code to ensure quality and compliance.</p>
<p>IMMEDIATE HIRE NEEDED. Interviews to begin the first week of February. </p><p><br></p><p>We are looking for a skilled Snowflake Marketing Data Engineer to join our team in Tampa, Florida, on a hybrid in-office schedule (2 to 3 days remote per week). Hybrid is preferred; remote candidates may be considered depending on the strength of the match. </p><p><br></p><p>In this role, you will be responsible for designing, implementing, and maintaining data solutions that support critical business operations. Your expertise will play a key part in driving data-driven decisions and optimizing performance across various platforms.</p><p><br></p><p>Responsibilities:</p><p>• Develop and maintain ETL processes to efficiently extract, transform, and load data from multiple sources.</p><p>• Analyze marketing data to uncover insights and support strategic decision-making.</p><p>• Create and manage dashboards and reports using Power BI to visualize data effectively.</p><p>• Integrate and leverage tools like Braze and Google Analytics to enhance data tracking and reporting capabilities.</p><p>• Collaborate with cross-functional teams to ensure the accuracy and reliability of data systems.</p><p>• Optimize database performance and troubleshoot any issues related to data pipelines.</p><p>• Document data workflows and provide training to stakeholders on best practices.</p><p>• Work with cloud-based platforms, such as Snowflake, to store and manage large datasets.</p><p>• Ensure data security and compliance with company policies and standards.</p>
We are looking for an experienced Data Engineer to join our team in New York, New York. In this role, you will design, build, and maintain data infrastructure to support business intelligence and analytics needs. The ideal candidate will have a strong technical background, a passion for working with complex datasets, and expertise in cloud-based data platforms.<br><br>Responsibilities:<br>• Develop, implement, and optimize ETL pipelines to ensure efficient data processing and integration.<br>• Design and maintain scalable data solutions, including data warehouses and data lakes.<br>• Collaborate with cross-functional teams to identify data requirements and deliver actionable insights.<br>• Utilize Snowflake, AWS, and other cloud-based platforms to manage data infrastructure and ensure performance optimization.<br>• Leverage Python and SQL to build robust data workflows and automate processes.<br>• Employ orchestration tools like Airflow and dbt to streamline data operations.<br>• Support data analytics and visualization efforts by enabling the creation of impactful dashboards using tools such as Tableau.<br>• Work with marketing and product data sources, including platforms like Google Analytics, to extract and integrate valuable insights.<br>• Implement CI/CD pipelines and DevOps practices to enhance data engineering processes.<br>• Ensure data security and compliance across all systems and tools.
<p>The Senior Data Engineer plays a key role in architecting, developing, and operating reliable, production-ready data solutions that enable analytics, automation, and operational processes across our client’s organization.</p><p><br></p><p>Operating within a modern, cloud-based data ecosystem, this role is responsible for bringing together data from internal platforms and external partners, transforming it into trusted, high-quality assets, and delivering it consistently to downstream users and systems. The work spans the full data lifecycle—ingestion, orchestration, transformation, and delivery—and blends advanced SQL development with Python-based pipeline and workflow automation.</p><p><br></p><p>This role sits at the intersection of data and systems engineering and works closely with Business Intelligence, Business Technology, and operational teams to ensure data solutions are scalable, dependable, and aligned with real business outcomes.</p>
<p>We are looking for a talented Data Engineer to join our team in Miami, Florida. This long-term contract position offers the opportunity to work on cutting-edge technologies and contribute to the development of efficient data pipelines and processes. The ideal candidate will have a strong background in data engineering and a passion for delivering high-quality solutions that drive business success.</p><p><br></p><p>Responsibilities:</p><p>• Design and implement scalable data pipelines using Snowflake, Python, and other relevant tools.</p><p>• Collaborate with stakeholders to gather and refine data requirements, ensuring alignment with business needs.</p><p>• Develop and maintain data models to support analytics, reporting, and operational processes.</p><p>• Optimize data warehouse performance by tuning queries and managing resources effectively.</p><p>• Ensure data quality through rigorous testing and governance protocols.</p><p>• Implement security and compliance measures to protect sensitive data.</p><p>• Research and integrate emerging technologies to enhance system capabilities.</p><p>• Support ETL processes for data extraction, transformation, and loading.</p><p>• Work with technologies such as Apache Spark, Hadoop, and Kafka to manage and process large datasets.</p><p>• Provide technical guidance and support to team members and stakeholders.</p>
We are looking for an experienced Data Engineer to join our dynamic team in Mayville, Wisconsin. In this role, you will play a key part in developing and enhancing reporting and analytics solutions within a modern data environment. The ideal candidate is passionate about transforming complex data into actionable insights, improving processes, and creating reliable reporting systems. This is a long-term contract position offering the opportunity to make a meaningful impact within a collaborative and forward-thinking team.<br><br>Responsibilities:<br>• Design, develop, and maintain scalable data pipelines to support reporting and analytics needs.<br>• Create and optimize Power BI dashboards and reports to deliver accessible and trustworthy insights.<br>• Automate workflows using Power Automate to improve operational efficiency.<br>• Develop scripts using languages such as PowerShell or Python to streamline data processing tasks.<br>• Integrate and manage data sources including Oracle, Snowflake (hosted within Azure), and other enterprise systems.<br>• Collaborate with stakeholders to gather requirements and deliver customized solutions.<br>• Support the transition to cloud-based data environments, including Azure Data Warehouse and Fabric.<br>• Troubleshoot and resolve data-related issues, ensuring data integrity and reliability.<br>• Document processes and workflows to ensure clarity and maintainability.<br>• Stay updated on industry trends to recommend and implement innovative data solutions.
<p>Job Title: Data Engineer</p><p><strong>Location:</strong> Washington, DC (Hybrid – Downtown DC Office)</p><p><strong>Company:</strong> Robert Half </p><p><strong>Employment Type: </strong>Contract-to-Hire</p><p>Role Overview</p><p>As a Data Engineer at Robert Half, you will be the backbone of our data-driven decision-making process. You aren't just "moving data"; you are architecting the flow of information that powers our localized market analytics and global recruitment engines. In the DC market, this often involves handling high-compliance data environments and integrating cutting-edge AI frameworks into traditional ETL workflows.</p><p><br></p>
We are looking for an experienced Data Engineer to join our team in Jacksonville, Florida. In this role, you will take the lead in designing and building a cutting-edge Azure lakehouse platform that enables business leaders to access analytics through natural language queries. This position combines hands-on technical expertise with leadership responsibilities, offering an opportunity to mentor a team of skilled engineers while driving innovation.<br><br>Responsibilities:<br>• Architect and develop a robust Azure lakehouse platform, utilizing Azure Data Lake Gen2, Delta Lake, and PySpark to create efficient data pipelines.<br>• Implement a semantic layer and metric store to ensure consistent data translation and definitions across the organization.<br>• Design and maintain real-time and batch data pipelines, incorporating medallion architecture, schema evolution, and data contracts.<br>• Build retrieval systems for large language models (LLMs) using Azure OpenAI and vectorized Delta tables to support chat-based analytics.<br>• Ensure data quality, lineage, and observability through tools like Great Expectations and Unity Catalog, while optimizing costs through partitioning and compaction.<br>• Develop automated systems for anomaly detection and alerting using Azure ML pipelines and Event Grid.<br>• Collaborate with product and operations teams to translate complex business questions into actionable data models and queries.<br>• Lead and mentor a team of data and Python engineers, establishing best practices in CI/CD, code reviews, and documentation.<br>• Ensure compliance with security, privacy, and governance standards by designing and implementing robust data handling protocols.
We are looking for a skilled Data Engineer to join our team in Wyoming, Michigan. This Contract to permanent role offers an exciting opportunity to design, manage, and optimize data architecture and engineering solutions across a dynamic healthcare organization. The ideal candidate will play a key role in ensuring efficient data governance and infrastructure performance while collaborating with cross-functional teams.<br><br>Responsibilities:<br>• Develop and maintain robust data architectures and frameworks, including relational and graph databases, to meet business objectives.<br>• Create and manage data pipelines to extract, transform, and load data from various sources into data warehouses.<br>• Ensure data governance policies are implemented and monitored, including retention and backup protocols.<br>• Collaborate with teams across departments to translate business requirements into technical specifications.<br>• Monitor and optimize the performance of data assets, identifying opportunities for improvement.<br>• Design scalable and secure data solutions using cloud-based platforms like AWS and Microsoft Azure.<br>• Implement advanced tools and technologies, such as AI, to enhance data analytics and processing capabilities.<br>• Mentor and support team members by sharing technical expertise and providing guidance.<br>• Establish key performance indicators (KPIs) to measure database performance and drive continuous improvement.<br>• Stay up to date with emerging trends and advancements in data engineering and architecture.
We are looking for a skilled Data Engineer to join our team in Washington, District of Columbia. In this role, you will play a key part in designing and implementing secure, scalable solutions to support data and analytics initiatives. This is a long-term contract position, offering the opportunity to work with cutting-edge technologies and contribute to impactful projects.<br><br>Responsibilities:<br>• Develop, test, and maintain robust data pipelines and engineering solutions to support analytics and integrate new data sources.<br>• Collaborate with team members, stakeholders, and external vendors to evaluate and implement reliable, scalable, and secure technologies.<br>• Create efficient, automated processes to handle repetitive data management tasks.<br>• Conduct targeted data manipulation and analysis across diverse datasets.<br>• Implement advanced security measures within data warehouses and analytics platforms to counter evolving threats.<br>• Document technical processes and solutions to ensure seamless collaboration and knowledge sharing.<br>• Monitor and optimize system performance to ensure scalability and reliability.<br>• Stay updated on emerging data engineering trends and incorporate them into workflows.
<p>Position Overview</p><p>We are seeking a talented <strong>Data Engineer</strong> with strong experience in <strong>Python, AWS, and Databricks</strong> to design and build scalable data pipelines and modern data platforms. The ideal candidate will help develop and maintain data infrastructure that supports analytics, machine learning, and business intelligence initiatives. This role requires hands-on experience working with large datasets, cloud-native architectures, and distributed data processing frameworks.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain <strong>scalable data pipelines and ETL/ELT workflows</strong> using Python and cloud technologies.</li><li>Develop and optimize data solutions using <strong>AWS services and Databricks</strong>.</li><li>Build and manage <strong>data lakes and data warehouses</strong> for structured and unstructured data.</li><li>Implement <strong>data transformation and processing pipelines</strong> using Apache Spark within Databricks.</li><li>Integrate data from multiple sources including APIs, databases, and streaming systems.</li><li>Ensure <strong>data quality, governance, security, and compliance</strong> across the data platform.</li><li>Monitor pipeline performance and troubleshoot <strong>data pipeline failures or latency issues</strong>.</li><li>Collaborate with <strong>data analysts, data scientists, and business stakeholders</strong> to deliver reliable datasets.</li><li>Optimize storage and compute costs within the AWS ecosystem.</li></ul><p><br></p>
<p>We are seeking a skilled <strong>Azure Data Engineer</strong> to design, build, and maintain scalable data solutions on the Microsoft Azure platform. The ideal candidate will have strong experience developing data pipelines, optimizing data architectures, and supporting analytics and business intelligence initiatives. This role will work closely with data analysts, data scientists, and business stakeholders to ensure reliable, high-quality data is available for reporting and advanced analytics.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, develop, and maintain <strong>scalable data pipelines and ETL/ELT processes</strong> using Azure data services.</li><li>Build and manage data solutions using tools such as <strong>Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, and Azure Databricks</strong>.</li><li>Develop and optimize <strong>data models, transformations, and storage strategies</strong> for large-scale structured and unstructured datasets.</li><li>Ensure <strong>data quality, integrity, and security</strong> across the data platform.</li><li>Monitor and troubleshoot data workflows, pipeline failures, and performance issues.</li><li>Collaborate with data analysts, BI developers, and data scientists to deliver reliable datasets for reporting and analytics.</li><li>Implement <strong>data governance and best practices</strong> for data management and documentation.</li><li>Automate data processes and deployments using <strong>CI/CD pipelines and infrastructure-as-code practices</strong>.</li><li>Optimize cost and performance of Azure data services.</li><li>Stay current with new Azure features, tools, and industry best practices.</li></ul><p><br></p>
<p>Architect and lead the modern data platform using <strong>Microsoft Fabric</strong>. You’ll define data models, pipelines, governance, and performance patterns that power analytics at scale.</p><p><strong>What You’ll Do</strong></p><ul><li>Design end‑to‑end architectures leveraging Fabric (OneLake, Lakehouses, Warehouses)</li><li>Define medallion/layered models, dimensional designs, and semantic layers</li><li>Lead ingestion and transformation with Dataflows Gen2 / Data Factory / Notebooks</li><li>Establish governance (data quality, lineage, security, RLS/OLS)</li><li>Optimize performance and cost; standardize reusable patterns</li><li>Mentor data engineers/analysts; review solutions and set best practices</li><li>Partner with business to translate use cases into scalable models</li></ul><p><br></p>
We are looking for an experienced Data Architect to design and implement cutting-edge data solutions that meet the evolving needs of our enterprise. This role involves building secure, scalable, and high-performing data platforms while leveraging modern technologies and aligning with organizational goals. The ideal candidate will have expertise in cloud-based architecture, data governance, and advanced analytics, driving innovation across diverse business functions.<br><br>Responsibilities:<br>• Develop comprehensive data architecture strategies for advanced analytics and big data solutions using Azure Databricks.<br>• Design and implement Databricks Delta Lake-based Lakehouse architecture, utilizing PySpark Jobs, Databricks Workflows, Unity Catalog, and Medallion architecture.<br>• Optimize and configure Databricks clusters, notebooks, and workflows to ensure efficiency and scalability.<br>• Integrate Databricks with Azure services such as Azure Data Lake Storage, Azure Data Factory, Azure Key Vault, and Microsoft Fabric.<br>• Establish and enforce best practices for data governance, security, and cost management.<br>• Collaborate with data engineers, analysts, and business stakeholders to translate functional requirements into robust technical solutions.<br>• Provide technical mentoring and leadership to team members focused on Databricks and Azure technologies.<br>• Monitor, troubleshoot, and enhance data pipelines and workflows to maintain reliability and performance.<br>• Ensure compliance with organizational and regulatory standards regarding data security and privacy.<br>• Document configurations, processes, and governance standards to support long-term scalability and usability.
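<p>The Medallion architecture mentioned above layers data as bronze (raw), silver (cleaned and typed), and gold (business aggregates). A rough sketch of the idea, with plain Python standing in for Delta Lake tables and entirely hypothetical record fields:</p>

```python
import json

# Illustrative medallion-style layering; in Databricks each layer would be a
# Delta table, and the field names here are made up for the example.
bronze = [                      # bronze: raw events exactly as ingested
    '{"id": 1, "amt": "10.5"}',
    '{"id": 2, "amt": "bad"}',  # malformed amount
    '{"id": 1, "amt": "10.5"}', # duplicate of the first event
]

def to_silver(raw):
    """Silver: parsed, typed, de-duplicated records; malformed rows dropped."""
    seen, out = set(), []
    for line in raw:
        rec = json.loads(line)
        try:
            rec["amt"] = float(rec["amt"])
        except ValueError:
            continue                      # in practice: quarantine, don't discard
        if rec["id"] not in seen:
            seen.add(rec["id"])
            out.append(rec)
    return out

def to_gold(silver):
    """Gold: a business-level aggregate served to analysts and BI tools."""
    return {"total_amt": sum(r["amt"] for r in silver), "records": len(silver)}

gold = to_gold(to_silver(bronze))
# gold -> {"total_amt": 10.5, "records": 1}
```

<p>The point of the layering is that each refinement step is reproducible from the layer below it, which is what makes lineage, auditing, and reprocessing tractable at scale.</p>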
<p>Robert Half is seeking an experienced Data Architect to design and lead scalable, secure, and high-performing enterprise data solutions. This role will focus on building next-generation cloud data platforms, driving adoption of modern analytics technologies, and ensuring alignment with governance and security standards.</p><p><br></p><p>You’ll serve as a hands-on technical leader, partnering closely with engineering, analytics, and business teams to architect data platforms that enable advanced analytics and AI/ML initiatives. This position blends deep technical expertise with strategic thinking to help unlock the value of data across the organization.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design and implement end-to-end data architecture for big data and advanced analytics platforms.</li><li>Architect and build Delta Lake–based lakehouse environments from the ground up, including DLT pipelines, PySpark jobs, workflows, Unity Catalog, and Medallion architecture.</li><li>Develop scalable data models that meet performance, security, and governance requirements.</li><li>Configure and optimize clusters, notebooks, and workflows to support ETL/ELT pipelines.</li><li>Integrate cloud data platforms with supporting services such as data storage, orchestration, secrets management, and analytics tools.</li><li>Establish and enforce best practices for data governance, security, and cost optimization.</li><li>Collaborate with data engineers, analysts, and stakeholders to translate business requirements into technical solutions.</li><li>Provide technical leadership and mentorship to team members.</li><li>Monitor, troubleshoot, and optimize data pipelines to ensure reliability and efficiency.</li><li>Ensure compliance with organizational and regulatory standards related to data privacy and security.</li><li>Create and maintain documentation for architecture, processes, and governance standards.</li></ul>
<p>We are looking for a skilled Web Manager to oversee and optimize the performance of our digital platforms, ensuring an exceptional user experience for both B2B and B2C audiences. In this role, you will drive technical and creative strategies to enhance website functionality, visibility, and scalability, while maintaining alignment with global brand standards. Your expertise will be vital in delivering state-of-the-art solutions that support long-term growth and innovation.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement a strategic roadmap to enhance website performance, user experience, and technical functionality across B2B and B2C platforms.</p><p>• Optimize site architecture and content to align with AI-driven search trends and improve visibility through advanced SEO techniques.</p><p>• Lead initiatives to ensure seamless platform and database migrations, maintaining data integrity and minimizing downtime.</p><p>• Collaborate with global teams to align on shared technological goals and ensure consistency across digital properties.</p><p>• Manage and mentor a team of developers and designers to deliver high-quality web projects that meet business objectives.</p><p>• Translate complex business needs into actionable technical requirements, bridging the gap between marketing, sales, and IT.</p><p>• Monitor website analytics and performance metrics to identify areas for improvement and implement data-driven solutions.</p><p>• Establish standards for code quality, site maintenance, and security to ensure reliable and stable digital operations.</p><p>• Stay updated on emerging technologies and best practices to maintain a competitive edge in web development.</p>
We are looking for an experienced Systems Manager to lead and develop a team responsible for maintaining and optimizing critical hosted environments. This role requires a strong technical background combined with excellent leadership skills to ensure the delivery of high-quality infrastructure services to enterprise clients across diverse industries.<br><br>Responsibilities:<br>• Lead and mentor a team of systems and infrastructure engineers in managing a 24/7 hosted environment.<br>• Oversee operations across virtualized platforms, cloud hosting services, and application delivery systems.<br>• Ensure the infrastructure meets high availability, security, and performance standards.<br>• Manage incident response processes, conduct root-cause analyses, and implement continuous improvement strategies.<br>• Collaborate with cross-functional teams to align technical capabilities with business and customer requirements.<br>• Develop and enforce engineering best practices, documentation protocols, and proactive monitoring solutions.<br>• Handle capacity planning, configuration management, and lifecycle upgrades for servers, storage systems, and virtualization platforms.<br>• Contribute to strategic planning for scalability, modernization, and the introduction of new technical solutions.<br>• Recruit, train, and retain skilled engineering talent while fostering a collaborative and innovative team environment.
<p>We are looking for a highly motivated and detail-oriented Development Manager to join our team in Los Angeles, California. In this long-term contract role, you will play a critical part in securing funding opportunities, developing compelling proposals, and supporting organizational growth within the non-profit sector. This position requires strong collaboration, strategic thinking, and excellent communication skills to ensure the success of various fundraising initiatives.</p><p><br></p><p>Responsibilities:</p><p>• Identify and pursue funding opportunities from various sources, including foundations, corporations, and government agencies.</p><p>• Lead the creation and submission of grant proposals.</p><p>• Collaborate with program leaders to craft comprehensive program narratives and budgets for funding applications.</p><p>• Research and track potential funding opportunities to align with organizational program needs.</p><p>• Attend bidder conferences to gather insights and information on government funding opportunities.</p><p>• Coordinate proposal development across departments to maintain consistency and avoid redundancy.</p><p>• Manage and update records of inquiries, grant proposals, and funding applications for accurate tracking.</p><p>• Edit and review grant proposals to ensure precision, clarity, and competitiveness.</p><p>• Assist in preparing annual funding requests for programs like the Emergency Food and Shelter Program.</p><p>• Ensure all submissions comply with established guidelines and deadlines.</p>
We are looking for an experienced Data and Analytics Manager to lead master data management initiatives within our organization. This role involves developing and implementing data governance policies and procedures to ensure accuracy, consistency, and reliability across all systems. This long-term contract position, based in Dallas, Texas, offers the chance to collaborate with cross-functional teams and lead efforts to optimize data processes.<br><br>Responsibilities:<br>• Develop and oversee master data governance standards to ensure data accuracy, consistency, and reliability.<br>• Collaborate with stakeholders across departments to communicate business processes and data requirements.<br>• Approve and process requests for creating or modifying master data while auditing completed mappings for quality assurance.<br>• Lead a team of mapping specialists, allocating resources effectively and ensuring adherence to timelines and priorities.<br>• Create and maintain documentation to support team training and mapping verification processes.<br>• Monitor and audit team performance to ensure consistent quality and address knowledge gaps.<br>• Coordinate updates and progress reports for organizational collaboration meetings, ensuring alignment with business goals.<br>• Proactively communicate with stakeholders, providing regular updates on team efforts, challenges, and achievements.<br>• Review and analyze source data processes to ensure a thorough understanding of data mapping requirements.<br>• Track changes to IT architecture and processes to assess impacts on data usage and quality.