<p>We are seeking a talented and motivated Python Data Engineer to join our global team. In this role, you will be instrumental in expanding and optimizing our data assets to enhance analytical capabilities across the organization. You will collaborate closely with traders, analysts, researchers, and data scientists to gather requirements and deliver scalable data solutions that support critical business functions.</p><p><br></p><p>Responsibilities</p><ul><li>Develop modular and reusable Python components to connect external data sources with internal systems and databases.</li><li>Work directly with business stakeholders to translate analytical requirements into technical implementations.</li><li>Ensure the integrity and maintainability of the central Python codebase by adhering to existing design standards and best practices.</li><li>Maintain and improve the in-house Python ETL toolkit, contributing to the standardization and consolidation of data engineering workflows.</li><li>Partner with global team members to ensure efficient coordination and delivery.</li><li>Actively participate in the internal Python development community and support ongoing business development initiatives with technical expertise.</li></ul>
<p>As our portfolio of AI-driven solutions continues to expand, we’re looking for an experienced <strong>Machine Learning Engineer</strong> to join our high-impact data science team. This role offers the opportunity to work across trading, operations, and support functions—delivering production-grade machine learning systems that solve real business problems.</p><p>You’ll collaborate with data scientists, software engineers, and commercial stakeholders to design, build, and deploy models that drive decision-making and innovation. From project scoping to model deployment, you’ll have visibility and influence across the full ML lifecycle.</p><p>🔧 Core Responsibilities</p><ul><li>Act as a thought partner to commercial teams, identifying high-value opportunities for AI/ML applications</li><li>Lead the design, development, and deployment of machine learning systems, with a focus on <strong>NLP</strong>, <strong>LLMs</strong>, and <strong>Generative AI</strong></li><li>Prioritize projects based on business impact and evolving market conditions</li><li>Collaborate with cross-functional teams to gather requirements and align solutions with strategic goals</li><li>Integrate ML solutions—including GenAI—into existing platforms to ensure seamless user experiences and scalable adoption</li><li>Participate in code reviews, experiment design, and tooling decisions to maintain high engineering standards</li><li>Share knowledge and mentor colleagues to build machine learning fluency across the organization</li></ul><p><br></p>
<p><strong>Full Stack Python Developer</strong></p><p>We are looking for a talented <strong>Full Stack Python Developer</strong> to join our team in <strong>Houston, Texas</strong>. In this role, you will collaborate with technical and non-technical stakeholders to design and develop innovative applications that support global commercial and operational functions. This position offers an exciting opportunity to create impactful solutions that enhance decision-making and optimize processes.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Design, develop, and maintain full-stack applications using <strong>Python</strong>, <strong>React</strong>, and <strong>C#</strong>.</li><li>Rapidly prototype and iterate on solutions based on user feedback.</li><li>Analyze business requirements and manage project lifecycles independently.</li><li>Collaborate with cross-functional teams to identify opportunities for innovation.</li><li>Optimize and maintain relational databases such as <strong>PostgreSQL</strong> or <strong>Oracle</strong>.</li><li>Develop APIs and integrate existing tools and platforms to enhance system capabilities.</li><li>Stay current with advancements in Python, React, and related technologies.</li><li>Provide technical expertise and support to commercial teams.</li><li>Troubleshoot and resolve issues within existing applications.</li></ul><p><br></p>
We are looking for an experienced Lead Data Engineer to oversee the design, implementation, and management of advanced data infrastructure in Houston, Texas. This role requires expertise in architecting scalable solutions, optimizing data pipelines, and ensuring data quality to support analytics, machine learning, and real-time processing. The ideal candidate will have a deep understanding of Lakehouse architecture and Medallion design principles to deliver robust and governed data solutions.<br><br>Responsibilities:<br>• Develop and implement scalable data pipelines to ingest, process, and store large datasets using tools such as Apache Spark, Hadoop, and Kafka.<br>• Utilize cloud platforms like AWS or Azure to manage data storage and processing, leveraging services such as S3, Lambda, and Azure Data Lake.<br>• Design and operationalize data architecture following Medallion patterns to ensure data usability and quality across Bronze, Silver, and Gold layers.<br>• Build and optimize data models and storage solutions, including Databricks Lakehouses, to support analytical and operational needs.<br>• Automate data workflows using tools like Apache Airflow and Fivetran to streamline integration and improve efficiency.<br>• Lead initiatives to establish best practices in data management, facilitating knowledge sharing and collaboration across technical and business teams.<br>• Collaborate with data scientists to provide infrastructure and tools for complex analytical models, using programming languages like Python or R.<br>• Implement and enforce data governance policies, including encryption, masking, and access controls, within cloud environments.<br>• Monitor and troubleshoot data pipelines for performance issues, applying tuning techniques to enhance throughput and reliability.<br>• Stay updated with emerging technologies in data engineering and advocate for improvements to the organization's data systems.
<p>Position Overview</p><p>We are seeking a delivery‑focused Data Automation Engineer to design and implement innovative automation solutions across a Microsoft Azure‑based data analytics platform. This role partners closely with engineering teams and stakeholders to translate business requirements into scalable data engineering and AI‑enabled solutions.</p><p>The ideal candidate is hands‑on with Azure Data Factory, Synapse Pipelines, Apache Spark, Python, and SQL, and brings experience building reliable ETL pipelines across SQL and NoSQL environments. This role emphasizes performance optimization, automation, and proactive data quality within Agile DevOps delivery models.</p><p><br></p><p>Key Responsibilities</p><p>Data Engineering & Automation</p><ul><li>Develop high‑performance data pipelines using Azure Data Factory, Synapse Pipelines, Spark Notebooks, Python, and SQL.</li><li>Design ETL workflows supporting advanced analytics, reporting, and AI/ML use cases.</li><li>Implement data migration, integrity, quality, metadata, and security controls across pipelines.</li><li>Monitor, troubleshoot, and optimize pipelines for availability, scalability, and performance.</li></ul><p>Performance Testing & Optimization</p><ul><li>Execute ETL performance testing and validate load performance against benchmarks.</li><li>Analyze pipeline runtime, throughput, latency, and resource utilization.</li><li>Support tuning activities (e.g., query optimization, partitioning, indexing).</li><li>Validate data completeness and consistency after high‑volume processing.</li></ul><p>Platform Collaboration & DevOps Support</p><ul><li>Collaborate with DevOps and infrastructure teams to optimize compute, memory, and scaling.</li><li>Maintain versioning and configuration control across environments.</li><li>Support production, testing, development, and integration environments.</li><li>Actively participate in Agile delivery processes including Program Increment planning.</li></ul>
<p>As the Software Engineer, you will design and develop automated solutions and system integrations to optimize our business operations. You will be a key player in gathering requirements from non-technical stakeholders, translating them into technical specifications, and ensuring that the delivered solutions meet their needs. You will also foster and maintain strong relationships with stakeholders, ensuring they have confidence in the technology solutions that support their business processes. Your advanced skills in solution design, AWS, and programming languages will be critical to delivering scalable, reliable, and impactful solutions.</p><p><br></p><p>Responsibilities</p><ul><li><strong>Automation Development:</strong> Design, develop, and oversee the maintenance of automation scripts and tools to streamline and optimize business processes.</li><li><strong>Cloud Integration:</strong> Architect and manage integrations between various systems and AWS services, ensuring seamless data flow and system interoperability.</li><li><strong>Solution Design:</strong> Architect scalable and reliable integration solutions that align with business requirements and technical constraints.</li><li><strong>Testing & Validation:</strong> Oversee and participate in the testing of automation and integration solutions to ensure functionality, reliability, and security.</li><li><strong>Documentation:</strong> Maintain detailed documentation of automation processes, integration workflows, and system configurations.</li><li><strong>Continuous Improvement:</strong> Lead efforts to identify process improvement opportunities, proposing and implementing innovative automation solutions across the organization.</li><li><strong>Support & Troubleshooting:</strong> Provide high-level support for existing automation and integration solutions, troubleshooting issues and implementing fixes as necessary.</li></ul><p><br></p>
<p><br></p><p>The Software Platform Engineer will design, build, and maintain our core Data & Machine Learning platform.</p><p><br></p><ul><li><strong>Platform Development:</strong> Design and implement new features for our AWS and Databricks-based platform, staying current with industry trends and advancements in AI.</li><li><strong>Core Component Implementation:</strong> Test and integrate central platform components that support our technology stack and serve tenants across the organization.</li><li><strong>Collaboration:</strong> Partner with other engineering teams to identify and deliver platform enhancements that solve specific business problems.</li><li><strong>Maintain Excellence:</strong> Uphold strict security protocols, compliance controls, and architectural principles in all aspects of your work.</li></ul><p><br></p>