<p>We are looking for an experienced Senior Data Engineer to join our team in Boston, Massachusetts. In this role, you will be responsible for designing and building a robust data platform from the ground up, playing a pivotal part in shaping the data strategy and supporting AI-driven initiatives. This is a unique opportunity to contribute to the creation of a new data engineering function within a dynamic financial services environment. This role is hybrid, with three days per week onsite in Boston.</p><p><br></p><p>Responsibilities:</p><p>• Design, develop, and implement a scalable data platform using Microsoft Fabric and other technologies within the Microsoft ecosystem.</p><p>• Collaborate with stakeholders to define the data strategy and implement solutions that align with business goals.</p><p>• Oversee and manage external consultants assisting with the development of the data platform.</p><p>• Support AI enablement initiatives by ensuring the data architecture meets analytical and operational needs.</p><p>• Create and maintain ETL processes to ensure efficient data extraction, transformation, and loading.</p><p>• Optimize database performance across SQL, NoSQL, and other database systems.</p><p>• Use Python for data engineering tasks, including scripting and automation.</p><p>• Work closely with IT and analytics teams to ensure seamless integration of the data platform into existing systems.</p><p>• Provide technical leadership and guidance while exploring future opportunities to build and expand the data engineering function.</p><p>• Ensure compliance with industry standards and best practices in data security and management.</p>
We are looking for a highly experienced Senior Machine Learning Engineer to join our team in Boston, Massachusetts. In this role, you will design, develop, and deploy cutting-edge machine learning systems that solve complex problems and scale effectively in production environments. This position offers an exciting opportunity to contribute to impactful projects, leveraging your expertise in machine learning, cloud infrastructure, and data engineering.<br><br>Responsibilities:<br>• Build and deploy machine learning models and solutions for production environments, ensuring they meet scalability and performance standards.<br>• Design and implement comprehensive ML pipelines, including data ingestion, feature engineering, model training, evaluation, and serving.<br>• Write clean, efficient Python code and leverage its ML ecosystem, including frameworks such as TensorFlow, PyTorch, and scikit-learn.<br>• Work with large datasets to extract meaningful insights and develop complex queries using modern data processing tools.<br>• Use containerization technologies such as Docker and cloud platforms such as AWS to ensure robust and scalable deployment.<br>• Apply MLOps best practices, including CI/CD pipelines, automated testing, and performance monitoring, to maintain reliable machine learning systems.<br>• Conduct research and apply advanced machine learning and AI techniques, including statistical modeling and large language models.<br>• Solve complex analytical problems with pragmatic engineering approaches while maintaining scientific rigor.<br>• Collaborate with cross-functional teams to align machine learning solutions with business goals and mission-driven objectives.<br>• Monitor and address issues such as data drift and model performance degradation to ensure continuous improvement and reliability.
We are looking for a Quality Assurance Tester to join our team in Burlington, Massachusetts, on a long-term contract basis. This role involves ensuring the reliability and efficiency of software systems through rigorous testing and collaboration with cross-functional teams. You will play a key role in enhancing product quality by implementing testing strategies and frameworks.<br><br>Responsibilities:<br>• Develop and execute detailed test plans to validate software functionality and performance.<br>• Implement automated testing frameworks and tools to streamline the quality assurance process.<br>• Collaborate with software development teams to identify and resolve issues throughout the development lifecycle.<br>• Conduct benchmarking and competitor analysis to ensure product quality meets industry standards.<br>• Perform continuous integration testing using tools such as Jenkins and TeamCity.<br>• Evaluate supplier contributions and sourcing strategies to ensure compatibility with project requirements.<br>• Provide onsite training and guidance to team members on testing methodologies and tools.<br>• Analyze project plans and recommend improvements to enhance testing efficiency.<br>• Troubleshoot and resolve testing challenges to maintain workflow continuity.<br>• Document and report test results, ensuring transparency and actionable insights.
<p><strong>M365 Implementation Engineer</strong></p><p>Location: Remote</p><p>Department: Professional Services</p><p>Type: Full-Time</p><p><br></p><p><strong>About the Role</strong></p><p>We are seeking an experienced M365 Implementation Engineer to join our Professional Services team. This position combines hands-on engineering work with frequent client interaction. You will design, deploy, and optimize Microsoft 365 solutions while collaborating directly with customers through daily video calls, workshops, and project updates.</p><p>This role is ideal for someone who enjoys both the technical side of M365 and the client-facing aspects of consulting.</p><p><br></p><p><strong>What You’ll Do</strong></p><ul><li>Lead the deployment, configuration, and migration of Microsoft 365 services in client environments.</li><li>Deliver solutions across collaboration, communication, and cloud productivity platforms within the M365 ecosystem.</li><li>Meet with clients regularly through video calls to gather requirements, present progress, and provide technical guidance.</li><li>Develop automated workflows, apps, and dashboards using the Power Platform to streamline business processes.</li><li>Implement best practices around identity, governance, compliance, and security within the M365 tenant.</li><li>Troubleshoot escalated issues and support clients throughout project delivery.</li><li>Work closely with project managers and stakeholders to translate requirements into effective technical solutions.</li></ul><p><br></p>
<p><strong>Data Engineer (Python / AWS)</strong></p><p><strong>Location:</strong> Remote (Northeast / Greater Boston area preferred)</p><p><strong>Type:</strong> Full-Time</p><p><strong>Level:</strong> Mid-to-Senior Individual Contributor</p><p><strong>About the Role</strong></p><p>We are looking for a strong individual contributor who excels in the Python data ecosystem and enjoys building reliable, scalable data pipelines. This role sits within a data engineering group responsible for integrating large volumes of data from external partners and transforming it into usable datasets for internal teams. You’ll work with modern cloud tools while also helping our team gradually transition away from a legacy platform.</p><p>This position is ideal for someone who wants to stay hands-on, focus on technical execution, and remain in an IC role for the next several years. We’re not looking for someone aiming to move immediately into architecture or leadership.</p><p>This team is fully distributed; candidates in the Boston area may work from the office, but the rest of the group is remote.
Anyone local may occasionally sit with other teams when onsite.</p><p><br></p><p><strong>What You’ll Do</strong></p><ul><li>Build and maintain ETL pipelines that ingest, clean, and aggregate data received from external vendors and large enterprise partners.</li><li>Develop Python-based data processing workflows deployed on AWS cloud services.</li><li>Work with tools such as AWS Glue, Airflow, dbt, and PySpark to support data transformations and pipeline orchestration.</li><li>Help modernize existing workflows and assist in the gradual migration away from a legacy data system.</li><li>Collaborate with internal stakeholders to understand data needs, define requirements, and ensure reliable integration of partner data feeds.</li><li>Troubleshoot pipeline issues, optimize performance, and improve overall system stability.</li><li>Contribute to best practices around code quality, testing, documentation, and data governance.</li></ul><p><br></p>