<p>Key Responsibilities</p><ul><li>Design, build, automate, and maintain Jira workflows to support operational efficiency and enterprise reporting needs.</li><li>Develop and manage custom, dynamic dashboards sourced from Jira data, focused on delivering actionable insights rather than static or manual updates.</li><li>Create and maintain performance reporting by agent, including productivity, throughput, and workflow efficiency metrics.</li><li>Integrate Azure DevOps data into customized reporting solutions, including:<ul><li>Building custom reports directly from Azure DevOps data</li><li>Feeding Azure DevOps data into bespoke dashboards outside of standard, out-of-the-box configurations</li></ul></li><li>Design and maintain end-to-end workflow automation and reporting pipelines that power dashboard updates and analytics.</li><li>Own the full lifecycle of workflows, reporting pipelines, and dashboards, including enhancements, optimization, and long-term support.</li><li>Respond directly to reporting requests from stakeholders, delivering quick turnarounds while ensuring scalability and sustainability.</li><li>Collaborate closely with business users to translate reporting and operational needs into effective Jira and Azure DevOps solutions.</li></ul><p><br></p>
<p><strong>Overview</strong></p><p>We are seeking a Senior Data Engineer to support a major Salesforce Phase 2 data migration initiative. This role will focus heavily on building and optimizing data pipelines, developing ETL workflows, and moving CRM data from Salesforce into Databricks.</p><p>The engineer will work closely with a senior team member, contribute to Scrum ceremonies, and play a key role in developing the core CRM data environment used by the advertising organization.</p><p><br></p><p><strong>Key Responsibilities</strong></p><p><strong>Data Engineering & Migration</strong></p><ul><li>Develop ETL jobs that move and transform Salesforce data into Databricks.</li><li>Build, test, and maintain high‑volume data pipelines across AWS and Databricks.</li><li>Perform data migration, data integration, and pipeline development (including MuleSoft-related work).</li><li>Ensure all pipelines are reliable, scalable, and optimized for production.</li></ul><p><strong>Development & Infrastructure</strong></p><ul><li>Use Python and PySpark to build ETL components and transformation logic.</li><li>Leverage Spark/PySpark for distributed processing at scale (must‑have).</li><li>Use Terraform to provision and manage cloud infrastructure.</li><li>Set up CI/CD pipelines using Concourse or GitHub Actions for automated deployments.</li></ul><p><strong>Quality, Documentation & Support</strong></p><ul><li>Document ETL processes, pipelines, and data flows.</li><li>Participate in testing, QA, and validation of migrated datasets.</li><li>Provide post‑delivery support and proactively mitigate project risks and single points of failure (SPOFs).</li><li>Troubleshoot production issues and implement long‑term fixes to maintain pipeline stability.</li></ul><p><strong>Collaboration</strong></p><ul><li>Work closely with engineering teammates to translate business requirements into working pipelines.</li><li>Participate in weekly Scrum ceremonies.</li><li>Contribute to shared best practices and
continuous improvement across the data engineering team.</li></ul><p><br></p>
<p>We are seeking a highly experienced Senior Data Engineer to lead the design, development, and operationalization of advanced data and AI/ML solutions. This role requires a strong technical foundation in cloud platforms, modern data engineering frameworks, ML system deployment, and semantic data modeling. The ideal candidate combines deep technical expertise with strong leadership and communication skills to guide teams and drive strategic initiatives across the organization.</p><p><br></p><p><strong>Technical Leadership</strong></p><ul><li>Lead the end-to-end design, development, deployment, and maintenance of large-scale data engineering and machine learning pipelines.</li><li>Architect and operationalize AI/ML systems in production environments, ensuring high reliability, performance, and observability.</li><li>Leverage cloud platforms (GCP or AWS) to build scalable, secure, and cost‑efficient data and ML infrastructure.</li><li>Utilize streaming and real-time processing technologies such as Apache Kafka and Apache Flink to support event-driven architectures and advanced analytics use cases.</li><li>Develop robust data transformations and semantic models using tools such as dbt.</li><li>Implement and maintain Infrastructure as Code using Terraform or similar frameworks.</li><li>Ensure cloud architectures follow best practices for security, compliance, and governance.</li></ul><p><strong>Team & Cross-Functional Leadership</strong></p><ul><li>Provide technical leadership, mentorship, and guidance to data engineers, ML engineers, and other stakeholders.</li><li>Collaborate closely with Data Science, DevOps, Security, and Product teams to ensure cohesive delivery of data and ML initiatives.</li><li>Communicate complex technical concepts clearly to both technical and non-technical audiences, supporting informed decision‑making.</li></ul><p><strong>Operational Excellence</strong></p><ul><li>Maintain production AI/ML systems with a focus on reliability,
monitoring, versioning, and lifecycle management.</li><li>Establish and uphold engineering best practices, coding standards, CI/CD frameworks, and documentation.</li><li>Continuously evaluate emerging technologies, frameworks, and methodologies to strengthen the organization’s data and ML capabilities.</li></ul><p><br></p>
We are looking for an experienced Senior Data Engineer to join our team in Woodbury, Minnesota. In this role, you will play a key part in designing and optimizing data systems, ensuring scalability and reliability for business-critical operations. The ideal candidate will have a strong background in data engineering and a passion for leveraging technology to drive impactful solutions.<br><br>Responsibilities:<br>• Redesign and optimize complex business logic embedded in Postgres functions to improve performance and maintainability.<br>• Develop scalable database schemas and create data models that are optimized for analytics and AI applications.<br>• Implement database partitioning, indexing, and performance tuning to ensure data growth is supported efficiently.<br>• Build and maintain production-grade data pipelines from data ingestion to end-user consumption.<br>• Establish robust processes for data quality assurance, monitoring, and operational reliability within pipelines.<br>• Troubleshoot and resolve data-related and performance issues directly in production environments.<br>• Collaborate with cross-functional teams to ensure seamless integration of data systems into business processes.
<p>We are looking for an experienced Senior Data Engineer to join our team in Boston, Massachusetts. In this role, you will be responsible for designing and building a robust data platform from the ground up, playing a pivotal part in shaping the data strategy and supporting AI-driven initiatives. This is a unique opportunity to contribute to the creation of a new data engineering function within a dynamic financial services environment. This role is hybrid, onsite in Boston 3 days a week. </p><p><br></p><p>Responsibilities:</p><p>• Design, develop, and implement a scalable data platform using Microsoft Fabric and other technologies within the Microsoft ecosystem.</p><p>• Collaborate with stakeholders to define the data strategy and implement solutions that align with business goals.</p><p>• Oversee and manage external consultants assisting with the development of the data platform.</p><p>• Support AI enablement initiatives by ensuring the data architecture meets analytical and operational needs.</p><p>• Create and maintain ETL processes to ensure efficient data extraction, transformation, and loading.</p><p>• Optimize database performance across SQL, NoSQL, and other database systems.</p><p>• Utilize Python for data engineering tasks, including scripting and automation.</p><p>• Work closely with IT and analytics teams to ensure seamless integration of the data platform into existing systems.</p><p>• Provide technical leadership and guidance while exploring future opportunities to build and expand the data engineering function.</p><p>• Ensure compliance with industry standards and best practices in data security and management.</p>