<p>We are looking for an experienced Technical Lead to oversee and guide technical operations in North Brunswick, New Jersey. This role involves managing a diverse team, ensuring quality standards, and driving innovation in product development. The ideal candidate will thrive in a collaborative environment and possess strong leadership skills to manage technical projects effectively.</p><p><br></p><p>Responsibilities:</p><ul><li>Lead and mentor the U.S.-based technical team across multiple sites, driving professional development and team growth.</li><li>Manage quality control systems and champion continuous improvement to optimize operational performance.</li><li>Ensure adherence to internal standards and maintain robust quality management systems.</li><li>Partner with global technical teams to develop and enforce standardized practices.</li><li>Act as the primary technical contact for key clients, delivering customized solutions and addressing specific needs.</li><li>Strategically allocate resources across technical projects to align with business objectives and timelines.</li></ul>
<p><strong>Data Engineer (Hybrid, Los Angeles)</strong></p><p><strong>Location:</strong> Los Angeles, California</p><p><strong>Compensation:</strong> $140,000 - $175,000 per year</p><p><strong>Work Environment:</strong> Hybrid, with onsite requirements</p><p>Are you passionate about crafting highly scalable and performant data systems? Do you have expertise in Azure Databricks, Spark SQL, and real-time data pipelines? We are searching for a talented and motivated <strong>Data Engineer</strong> to join our team in Los Angeles. You'll work in a hybrid environment that combines onsite collaboration with the flexibility of remote work.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and implement data pipelines and ETL workflows using cutting-edge Azure technologies (e.g., Databricks, Synapse Analytics, Synapse Pipelines).</li><li>Manage and optimize big data processes, ensuring scalability, efficiency, and data accuracy.</li><li>Build and work with real-time data pipelines leveraging technologies such as Kafka, Event Hubs, and Spark Streaming.</li><li>Apply advanced skills in Python and Spark SQL to build data solutions for analytics and machine learning.</li><li>Collaborate with business analysts and stakeholders to implement impactful dashboards using Power BI.</li><li>Architect and support the seamless integration of diverse data sources into a central platform for analytics, reporting, and model serving via MLflow.</li></ul>
<p>We are offering an opportunity for a Software Engineer at our location in Jacksonville, Florida. In this role, you will be tasked with developing, testing, and deploying software solutions using a modern technology stack. You will work closely with product owners, designers, and other engineers in an Agile environment to deliver excellent user experiences. </p><p><br></p><p>Responsibilities: </p><p><br></p><p>• Contribute to the creation and maintenance of software applications utilizing .NET Core, C#, and Angular JS.</p><p>• Develop and implement RESTful APIs to ensure seamless integration between various systems.</p><p>• Participate actively in designing and implementing microservices architecture.</p><p>• Write efficient SQL queries and manage SQL Server databases.</p><p>• Engage in all stages of the software development lifecycle, including gathering requirements, designing, coding, testing, and deploying.</p><p>• Write unit and integration tests, and contribute to automated testing initiatives.</p><p>• Utilize Git for version control and collaborate effectively with the team through code reviews.</p><p>• Apply Docker and containerization technologies for application deployment and management.</p><p>• Participate in Agile ceremonies such as sprint planning, daily stand-ups, sprint reviews, and retrospectives.</p><p>• Troubleshoot software defects and production issues.</p><p>• Stay abreast of the latest technology trends and best practices.</p><p>• Contribute to technical documentation and knowledge sharing within the team.</p><p>• Collaborate effectively with cross-functional teams including product, design, and QA.</p>
<p>We are seeking a <strong>Senior Data Engineer</strong> with deep expertise in <strong>Power BI, Microsoft Fabric, DAX, and SQL</strong> to join our growing analytics team. This role will focus on building and optimizing data models, pipelines, and reporting solutions to support enterprise-wide business intelligence and decision-making. You’ll collaborate closely with analysts, stakeholders, and IT teams to ensure data solutions are scalable, efficient, and insightful.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, build, and maintain scalable data pipelines and ETL processes within the Microsoft ecosystem.</li><li>Develop and optimize complex SQL queries, stored procedures, and views for efficient data extraction and transformation.</li><li>Build and enhance <strong>Power BI dashboards</strong> and reports with advanced DAX calculations to provide actionable insights.</li><li>Work with <strong>Microsoft Fabric</strong> to manage data integration, governance, and scalable analytics workloads.</li><li>Partner with business stakeholders to gather requirements and translate them into technical solutions.</li><li>Ensure data accuracy, security, and availability across multiple business units.</li><li>Monitor, troubleshoot, and optimize data infrastructure for performance and cost efficiency.</li><li>Mentor junior team members and contribute to best practices around data engineering and BI.</li></ul><p><br></p>
<p><strong>Job Summary:</strong></p><p> The Software Engineer will design, develop, and maintain web and desktop applications using .NET technologies. This role involves collaborating with cross-functional teams to deliver robust and scalable solutions.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Develop and maintain applications using C#, ASP.NET, and .NET Core.</li><li>Write clean, efficient, and well-documented code.</li><li>Participate in code reviews and testing.</li><li>Integrate APIs and work with SQL Server databases.</li><li>Troubleshoot and optimize application performance.</li></ul><p><br></p>
<p><strong>Job Title: DevOps Engineer</strong></p><p><strong>Location:</strong> Alpharetta, GA</p><p><strong>Duration:</strong> 1 Year (Potential for Extension)</p><p><strong>Employment Type:</strong> Contract</p><p><strong>About the Role</strong></p><p>We are seeking a skilled and proactive <strong>DevOps Engineer</strong> to join our team. This team is responsible for managing and evolving a complex CI/CD pipeline that spans Octopus Deploy, GitHub, and AWS infrastructure. The ideal candidate will be passionate about automation, deployment stability, and continuous improvement.</p><p>This role offers the opportunity to work with a collaborative team managing over 50 environments and 400+ servers, with a focus on customizing, stabilizing, and automating deployment workflows.</p><p><strong>Key Responsibilities</strong></p><ul><li>Customize and stabilize CI/CD pipelines using Octopus Deploy and GitHub Actions with a strong emphasis on security.</li><li>Develop and maintain deployment scripts using PowerShell.</li><li>Troubleshoot and support existing configurations involving Terraform and Ansible.</li><li>Collaborate with development and SRE teams to streamline deployment processes and reduce manual intervention.</li><li>Manage secrets securely using AWS KMS.</li><li>Create and maintain reusable scripts and templates for deployment automation.</li></ul>
<p>The Data Engineer will architect, develop, and maintain high-quality data pipelines and infrastructure to support analytics, reporting, and data science teams. This position requires hands-on expertise in ETL development, cloud platforms, and data warehousing to ensure the availability and accuracy of critical business data.</p>
<p>We are looking for a dedicated Software Automation Engineer to join our team in Schaumburg, Illinois. In this role, you will play a key part in designing and implementing automation frameworks to optimize engineering processes. This is a long-term contract position, offering an excellent opportunity to work onsite in a collaborative and innovative environment.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement automation frameworks tailored to engineering processes.</p><p>• Utilize CREO to design and refine automation solutions.</p><p>• Collaborate with team members to ensure seamless integration of automation systems.</p><p>• Conduct thorough testing and debugging to enhance system reliability.</p><p>• Provide technical support and guidance for automation-related challenges.</p><p>• Document processes and solutions to maintain clear project records.</p><p>• Work independently while receiving support from senior engineers.</p><p>• Ensure compliance with industry standards and best practices in automation design.</p><p>• Manage project timelines and deliverables effectively.</p><p>• Contribute to the continuous improvement of automation tools and methodologies. </p><p>• Integrate with Windchill PLM.</p>
We are looking for an Application Support Engineer to provide technical assistance and expertise for design-related applications, including AutoCAD and other specialized tools. In this role, you will collaborate with users, teams, and vendors to ensure seamless operation, troubleshoot issues, and optimize workflows. This position is integral to maintaining the efficiency of design and drafting operations within the organization.<br><br>Responsibilities:<br>• Deliver technical support for design applications such as AutoCAD, addressing errors, performance concerns, and integration challenges.<br>• Assist design teams with application workflows, file management, and template configurations.<br>• Coordinate with vendors and internal IT teams to resolve complex software issues and ensure system reliability.<br>• Develop and update documentation, training resources, and best practices for application usage.<br>• Oversee software updates, patches, licensing, and the deployment of new tools.<br>• Partner with design, engineering, and operations teams to improve workflows and boost productivity.<br>• Troubleshoot hardware and peripheral devices, including printers and plotters, to support end users.<br>• Monitor system performance and proactively address potential issues to maintain optimal functionality.
<p><strong>Job Summary:</strong></p><p> The Data Engineer will build and optimize data pipelines, ensuring reliable data collection, storage, and access for analytics and machine learning use cases. This role will work closely with analytics and engineering teams to design scalable data architectures.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Develop and maintain ETL/ELT pipelines for structured and unstructured data.</li><li>Integrate data from multiple sources into cloud data warehouses (Snowflake, BigQuery, Redshift).</li><li>Optimize data storage and retrieval for performance and scalability.</li><li>Collaborate with analysts and data scientists to ensure data quality.</li><li>Implement data governance and security best practices.</li></ul><p><br></p>
<p><strong>Key Responsibilities</strong></p><ul><li>Design, develop, and maintain efficient, reliable data pipelines and ETL processes using <strong>Microsoft Fabric</strong> or <strong>Databricks</strong> for large-scale data processing.</li><li>Collaborate with stakeholders to understand business requirements and create data solutions that align with technical and business strategies.</li><li>Optimize and improve workflows for data ingestion, transformation, and delivery to ensure high performance at scale.</li><li>Implement scalable architectures for big data processing, ensuring data quality, security, and governance best practices.</li><li>Develop and maintain reusable code to ensure consistency and reliability for future workflows.</li><li>Monitor and troubleshoot production environments, ensuring data systems perform as expected.</li><li>Stay current with emerging technologies, trends, and tools related to cloud-based data engineering platforms.</li></ul>
<p>Our client in Wisconsin has an immediate and critical opening for a Senior Software Engineer to join their team on a long-term contract. They are looking to bring in a staff engineer-level individual who not only writes excellent code but also provides technical leadership, mentorship, and strong partnership with their in-house Architect.</p><p><br></p><p>This is not a heads-down coding role; it is a seat at the table for someone who can guide and direct work across a team, own design responsibilities, and jump in with hands-on development as needed. You’ll help shape technical strategy while also staying close to delivery.</p><p><br></p><p>What You’ll Be Doing:</p><ul><li>Lead technical efforts in a collaborative Scrum team environment.</li><li>Partner closely with the Architect to co-own solution design and implementation.</li><li>Provide guidance and task direction to other engineers while contributing code yourself.</li><li>Bring proactive energy to problem-solving: speak up, share ideas, and drive initiatives forward.</li><li>Balance and bridge cloud and on-premises environments with a strong understanding of DevOps and platform capabilities.</li><li>Support modern application architecture using tools like Kafka, GitHub, SQL Server, and OpenShift.</li></ul><p>Must-Have Technical Skills:</p><ul><li>Java (Core & Frameworks)</li><li>SQL Server</li><li>GitHub</li><li>OpenShift</li><li>Kafka</li><li>Cloud & On-Premises Development (on-premises experience is critical)</li><li>DevOps & Test Automation exposure</li><li>Experience working in Agile/Scrum teams</li></ul>
We are looking for an experienced Sr. Software Engineer to join our team in Jacksonville, Florida. This is a long-term contract position offering the opportunity to develop and deploy innovative applications while working with cutting-edge technologies. The ideal candidate will have strong problem-solving skills and a deep understanding of software development processes.<br><br>Responsibilities:<br>• Design, develop, and implement applications that meet business requirements and technical specifications.<br>• Utilize Domino Designer tools to create, modify, and maintain applications.<br>• Manage and optimize Domino databases, ensuring efficient access and organization of data.<br>• Write and troubleshoot scripts using LotusScript and JavaScript to enhance application functionality.<br>• Build web-based interfaces using JavaScript and other web development tools.<br>• Analyze and resolve technical issues related to Domino applications and infrastructure.<br>• Collaborate with team members and stakeholders to ensure successful project delivery.<br>• Apply domain knowledge to understand and support business workflows and processes.<br>• Develop APIs to facilitate seamless integration between systems.<br>• Stay updated on industry trends and technologies to continuously improve application development practices.
We are looking for a skilled Big Data Engineer to join our team in Westfield, Indiana. This role involves leveraging advanced technologies to design, implement, and optimize big data solutions that drive our business objectives. The ideal candidate will have extensive experience in data engineering and a passion for building scalable systems.<br><br>Responsibilities:<br>• Design, develop, and implement scalable big data solutions using Python, Apache Spark, and other relevant technologies.<br>• Build and optimize ETL pipelines to efficiently handle large volumes of structured and unstructured data.<br>• Manage and process data using frameworks such as Apache Hadoop and Apache Kafka.<br>• Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.<br>• Utilize cloud platforms like Amazon Web Services (AWS) to deploy and maintain data systems.<br>• Ensure the security, reliability, and performance of big data architectures.<br>• Troubleshoot and resolve issues related to data systems and pipelines.<br>• Monitor and analyze system performance to identify opportunities for improvement.<br>• Stay updated on emerging technologies and incorporate them into data engineering practices as appropriate.
<p>We are looking for a skilled Senior Software Engineer to join our team in Oxford, Massachusetts. In this role, you will lead backend development efforts, design robust APIs, and contribute to the creation of scalable web services. This position requires a strong background in Java and JavaScript, along with experience in cloud technologies and agile methodologies. This role is hybrid, onsite three days a week. Candidates must be U.S. citizens or Green Card holders.</p><p><br></p><p>Responsibilities:</p><p>• Develop and enhance backend systems using Java, Spring Boot, and related technologies.</p><p>• Design and implement APIs to ensure seamless integration across systems.</p><p>• Build and maintain scalable web services to support business needs.</p><p>• Collaborate with cross-functional teams to define business logic and technical requirements.</p><p>• Utilize cloud platforms such as AWS for deploying and managing applications.</p><p>• Participate in Agile Scrum processes to deliver high-quality software solutions.</p><p>• Conduct A/B testing to optimize system performance and user experience.</p><p>• Implement client-side scripting using JavaScript and Ajax for interactive web applications.</p><p>• Manage and track development tasks using Atlassian Jira tools.</p><p>• Provide technical guidance and mentorship to less experienced team members.</p>
<p>Our client, a popular, local gaming and hospitality property in Las Vegas, is seeking an <strong>IT Applications Support Specialist</strong> to join their onsite technology team. This role serves as the escalation point from the Help Desk, supporting hotel, casino, and food & beverage operations with a focus on <strong>Micros POS systems, Aristocrat Gaming, and Oracle Fusion applications</strong>. You’ll collaborate cross-functionally with IT, Casino Operations, and vendor partners while delivering exceptional guest service in a fast-paced environment. This will be a short-term CTH role, and fully onsite on property. </p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Serve as the primary administrator for <strong>Micros POS</strong>, including configuration, deployment, programming, and maintenance.</li><li>Provide application support for <strong>Aristocrat Gaming, Oracle Fusion, Bartec, Lotus Notes, and Visual Basic</strong> systems.</li><li>Configure, deploy, and maintain IT hardware/software across hotel, casino, bar, and front-of-house operations.</li><li>Perform and monitor <strong>backups, restores, and system log reviews</strong> to ensure operational integrity.</li><li>Ensure compliance with <strong>Nevada Gaming Control Board regulations</strong> and company SOPs.</li><li>Partner with internal teams and vendors to resolve escalated issues while maintaining a high standard of guest service.</li><li>Cross-train and support other IT functions within the department.</li></ul><p><br></p>
<p>We are looking for an experienced Data Engineer to join our team in Houston. In this role, you will design and optimize data pipeline architecture, ensuring the seamless integration and analysis of structured and unstructured data. You will collaborate with stakeholders to deliver reliable and scalable data solutions that align with organizational objectives.</p><p><br></p><p>Responsibilities:</p><p>• Design, develop, and maintain scalable data pipelines using Microsoft Fabric components, including Azure Data Factory, Lakehouse, Data Warehouses, and Dataflows.</p><p>• Implement solutions for extracting, integrating, and processing large volumes of structured and unstructured data.</p><p>• Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver tailored solutions.</p><p>• Optimize and troubleshoot data pipelines to ensure high performance and reliability.</p><p>• Apply data governance and security best practices to maintain data integrity and compliance.</p><p>• Monitor and enhance data infrastructure to ensure scalability and availability.</p><p>• Stay informed on advancements in data engineering and Microsoft Fabric technologies to drive innovation.</p><p>• Manage the Data Platform roadmap, including capacity planning, future-proofing, and optimization.</p><p>• Address data quality issues by tracing lineage, implementing cleansing processes, and ensuring consistency.</p><p>• Coordinate with IT Operations and external vendors to resolve production issues and improve platform capabilities.</p>
We are looking for an experienced Software Engineer III to join our dynamic team in Alpharetta, Georgia. This long-term contract position offers an exciting opportunity to design, develop, and optimize enterprise-level software solutions. The ideal candidate will bring strong technical expertise, a collaborative mindset, and a passion for delivering high-quality, scalable applications.<br><br>Responsibilities:<br>• Design, develop, and implement cloud-based data applications and APIs using industry best practices.<br>• Build and maintain microservices and middleware APIs within a microservice architecture.<br>• Troubleshoot and resolve issues in existing systems, ensuring adherence to secure coding standards.<br>• Collaborate with cross-functional teams to gather and finalize technical requirements.<br>• Create and review detailed technical specifications for complex system components.<br>• Apply DevOps strategies to streamline development and deployment processes.<br>• Upgrade legacy C++ applications to enterprise Java while maintaining functionality and performance.<br>• Conduct code reviews and implement coding best practices to ensure quality and consistency.<br>• Analyze and resolve complex technical challenges to support ongoing project needs.<br>• Stay updated on emerging technologies to enhance system design and development.
<p>We are looking for an experienced Senior Data Engineer to join our team in Oxford, Massachusetts. In this role, you will design and maintain data platforms, leveraging cutting-edge technologies to optimize processes and drive analytical insights. This position requires a strong background in Python development, cloud technologies, and big data tools. This role is hybrid, onsite three days a week. Candidates must be U.S. citizens or Green Card holders.</p><p><br></p><p>Responsibilities:</p><p>• Develop, implement, and maintain scalable data platforms to support business needs.</p><p>• Utilize Python and PySpark to design and optimize data workflows.</p><p>• Collaborate with cross-functional teams to integrate data solutions with existing systems.</p><p>• Leverage Snowflake and other cloud technologies to manage and store large datasets.</p><p>• Implement and refine algorithms for data processing and analytics.</p><p>• Work with Apache Spark and Hadoop to build robust data pipelines.</p><p>• Create APIs to enhance data accessibility and integration.</p><p>• Monitor and troubleshoot data platforms to ensure optimal performance.</p><p>• Stay updated on emerging trends in big data and cloud technologies to continuously improve solutions.</p><p>• Participate in technical discussions and provide expertise during team reviews.</p>
<p>We’re seeking a skilled and motivated Software Engineer to join our dynamic team in Jacksonville, Florida. This Contract-to-Permanent opportunity offers the chance to work on impactful projects, drive automation, and contribute to process optimization in a collaborative and forward-thinking environment.</p><p>Key Responsibilities:</p><ul><li>Design and develop robust software applications using C++ to meet business and technical requirements.</li><li>Write clean, efficient, and maintainable code following industry best practices.</li><li>Collaborate with cross-functional teams to gather requirements and ensure smooth system integration.</li><li>Perform unit testing and support integration testing to ensure software quality and reliability.</li><li>Diagnose and resolve software and hardware-related issues effectively.</li><li>Create and refine batch scripts to automate workflows and enhance operational efficiency.</li><li>Maintain and improve existing software systems to boost performance and functionality.</li><li>Stay current with advancements in C++ and related technologies to apply modern development techniques.</li><li>Offer technical insights and recommendations to enhance development processes and team productivity.</li></ul><p><br></p>
<p><strong>Job Summary:</strong></p><p> The DevOps Engineer will streamline development and deployment pipelines, automate processes, and ensure system reliability. You will bridge software development and IT operations to enable faster, more stable releases.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Implement CI/CD pipelines using Jenkins, GitHub Actions, or Azure DevOps.</li><li>Automate infrastructure with Terraform or Ansible.</li><li>Monitor system performance and reliability.</li><li>Collaborate with development teams to optimize release processes.</li><li>Manage containerized environments with Docker and Kubernetes.</li></ul><p><br></p>
<p>We are looking for experienced and innovative Software Engineers (mid and senior level) to join our team in Sandy, Utah. In this role, you’ll design and develop advanced solutions that drive automation in the cybersecurity domain, impacting enterprise customers worldwide. You’ll work on systems at scale, collaborate with industry-leading professionals, and see the direct results of your contributions.</p><p><br></p><p>Responsibilities:</p><p>• Develop and implement creative solutions using advanced technologies to enhance platform functionality.</p><p>• Design secure and efficient APIs to streamline and automate security operations.</p><p>• Lead efforts in continuous integration, deployment, and optimization of complex software systems.</p><p>• Automate key stages of the software development lifecycle for improved efficiency.</p><p>• Collaborate with internal and external teams to ensure seamless product functionality and user satisfaction.</p><p>• Take ownership of projects, ensuring timely delivery and high-quality output.</p><p>• Mentor team members, fostering a culture of knowledge sharing and collaboration.</p><p>• Create scalable systems that address enterprise-level challenges and deliver measurable results.</p>
We are looking for a skilled Data Engineer to join our team in Carmel, Indiana. This is a long-term contract opportunity for an individual with a strong background in building and optimizing data pipelines and systems. The ideal candidate will have a passion for working with large-scale data and a proven ability to leverage modern tools and technologies to deliver high-quality solutions.<br><br>Responsibilities:<br>• Design, implement, and maintain scalable data pipelines and ETL processes to support business needs.<br>• Develop and optimize solutions using tools such as Apache Spark, Python, and Apache Hadoop.<br>• Manage and integrate streaming data platforms like Apache Kafka to ensure real-time data processing.<br>• Utilize technologies such as AWS Lambda, Step Functions, Glue, and Redshift for cloud-based data solutions.<br>• Collaborate with cross-functional teams to understand data requirements and provide innovative solutions.<br>• Ensure data quality and integrity through rigorous testing and validation processes.<br>• Create and maintain documentation for data workflows, processes, and system architecture.<br>• Implement infrastructure as code using Terraform to enhance system reliability and scalability.<br>• Troubleshoot and resolve data-related technical issues promptly.<br>• Stay updated on emerging trends and technologies in data engineering to continuously improve practices.
We are looking for a skilled Data Engineer to join our team in Chicago, Illinois. In this long-term contract role, you will contribute to the development and maintenance of robust data infrastructures that support business intelligence solutions. If you thrive in a collaborative environment and enjoy solving complex problems, this position offers an excellent opportunity to apply your expertise and drive impactful projects forward.<br><br>Responsibilities:<br>• Design, build, and maintain scalable data infrastructures to support business intelligence needs.<br>• Develop and implement data models and frameworks that optimize data analysis and reporting.<br>• Collaborate with business partners to understand data requirements and deliver effective solutions.<br>• Lead initiatives in business intelligence projects, ensuring alignment with organizational goals.<br>• Apply compliance standards within project scope, document processes, and participate in related activities.<br>• Utilize industry knowledge to enhance data solutions and provide informed recommendations.<br>• Work on the full data warehouse lifecycle, including data analysis, dimensional modeling, and design.<br>• Solve complex problems using proven methodologies and adapt them to new challenges.<br>• Assist team members informally with troubleshooting and knowledge sharing.<br>• Leverage tools such as Apache Spark, Python, Hadoop, Kafka, and ETL processes to optimize data workflows.
<p><strong>Key Responsibilities:</strong></p><ul><li>Design and implement scalable data architectures using Azure services, especially Azure Data Factory</li><li>Build, deploy, and maintain data pipelines for structured and unstructured data</li><li>Lead the integration of Microsoft Fabric into enterprise data solutions</li><li>Collaborate with data scientists to operationalize AI/ML models and embed them into business processes</li><li>Ensure data governance, security, and best practices across the entire data lifecycle</li><li>Partner with cross-functional teams to translate business needs into technical solutions</li></ul><p><br></p><p><strong>Nice to Have:</strong></p><ul><li>Microsoft certifications (e.g., Azure Data Engineer, Azure Solutions Architect)</li><li>Experience with Power BI integration</li><li>Exposure to data mesh or data fabric concepts</li></ul><p><br></p>