<p>Robert Half Technology is looking for a programmer analyst who combines a real passion for formulating and defining systems scope and objectives with an understanding of software and applications programming and industry requirements. Through research and fact-finding, you'd make recommendations for developing or modifying applications or databases.</p><p>What you get to do every single day:</p><ul><li>Build and test programming changes for each phase of systems development prior to implementation; write test cases and expected results and review results for conformance to requirements (see the sketch below); may plan simple tests or a defined subset of a larger system test, and may make recommendations for acceptance or rejection if requirements are not fully met</li><li>Analyze user requests for systems changes or improvements; document functional requirements, assess cost, feasibility, and utility, and develop recommendations as to how, when, or whether to proceed with the changes</li><li>Act as the in-house guide on applications, systems, and/or processes, helping internal clients identify and resolve processing, reporting, and programming problems; consultation can take the form of troubleshooting and/or education</li><li>Provide ongoing training and assistance for end users and other partner groups for a particular application, system, or process</li><li>Analyze processing procedures and develop recommendations for improvements</li><li>Build and maintain dictionaries for the applications and systems supported by the analyst</li><li>Analyze and document issues; work with other programmers to correct code problems</li><li>Assist in developing communication content for specific system changes being implemented in production</li></ul>
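<p>For illustration only, a minimal sketch of the "write test cases and expected results" workflow this role describes, using Python's standard unittest framework; the apply_discount business rule is a hypothetical stand-in, not something named in the posting:</p><pre>
import unittest

def apply_discount(amount: float, rate: float) -> float:
    """Hypothetical business rule under test: apply a percentage discount."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return round(amount * (1 - rate), 2)

class ApplyDiscountTests(unittest.TestCase):
    def test_expected_result_for_valid_rate(self):
        # Expected result stated up front, then checked for conformance
        self.assertEqual(apply_discount(100.0, 0.15), 85.0)

    def test_rejects_out_of_range_rate(self):
        # A failing requirement would be grounds to recommend rejection
        with self.assertRaises(ValueError):
            apply_discount(100.0, 1.5)

if __name__ == "__main__":
    unittest.main()
</pre>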
<ul><li>Interact with and support the Front Office (traders) on data management, quantitative analysis, and application development</li><li>Knowledge of the PJM, New York, and/or Northeast power markets; congestion trading and/or financial transmission rights experience is a plus</li><li>Work with an Enterprise Data Warehouse, with a comprehensive understanding of relational and non-relational databases</li><li>Develop complex logic in database schemas based on diverse business processes spanning multiple subsystems and large datasets</li><li>Create and enhance standard and custom reports and workflows per operational demands</li><li>Comply with all ENGIE NA policies and procedures</li><li>Create and maintain technical documentation</li></ul>
<p><strong>Skills and Knowledge:</strong></p><ul><li>Excellent understanding of relational database design</li><li>Strong technical experience in database development: performing DDL operations, writing queries and stored procedures, and optimizing database objects (see the sketch below)</li><li>Working knowledge of ETL using SSIS or a comparable tool</li><li>Solid reporting skills, preferably using SSRS</li><li>Establishment and implementation of reporting tools to create reports and dashboards using SSRS, Power BI, Tableau, or a similar analytics tool</li><li>Experience with data management standards such as data governance is a plus</li><li>Effective analyst able to work closely with non-technical users</li><li>Knowledge of MS Access, MS Visual Studio, Crystal Reports, or Datawatch Monarch is a plus</li><li>Proficiency in interpersonal communication, presentation, and problem solving</li></ul>
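<p>As a rough illustration of the DDL and stored-procedure work listed above, here is a minimal Python sketch using pyodbc against SQL Server; the connection string, table, and procedure names are hypothetical and stand in for whatever the warehouse actually uses:</p><pre>
import pyodbc  # assumption: SQL Server reached via ODBC; any RDBMS driver would do

# Hypothetical connection string and object names
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dwhost;DATABASE=edw;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# DDL: create a settlement table keyed for frequent date/node lookups
cursor.execute("""
    CREATE TABLE dbo.DailySettlement (
        SettleDate DATE NOT NULL,
        Node VARCHAR(50) NOT NULL,
        Price DECIMAL(12, 4) NOT NULL,
        CONSTRAINT PK_DailySettlement PRIMARY KEY (SettleDate, Node)
    );
""")

# Stored procedure encapsulating a parameterized report query
cursor.execute("""
    CREATE PROCEDURE dbo.GetSettlementByDate @SettleDate DATE AS
    BEGIN
        SELECT Node, Price FROM dbo.DailySettlement WHERE SettleDate = @SettleDate;
    END
""")
conn.commit()

# Call the procedure with a bound parameter rather than string concatenation
cursor.execute("EXEC dbo.GetSettlementByDate @SettleDate = ?", "2024-01-15")
for node, price in cursor.fetchall():
    print(node, price)
</pre>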
<p>Position Title: Data Engineer</p><p>Location: Onsite – Houston Area</p><p>Compensation:</p><ul><li>Base Salary: $120K–$130K</li><li>Bonus: ~10%</li></ul><p>Overview:</p><p>We’re hiring a Data Engineer to lead the development and optimization of enterprise-grade data pipelines and infrastructure. This role is essential to enabling high-quality analytics, reporting, and business intelligence across the organization. The ideal candidate will bring deep expertise in Azure-based data tools, strong SQL and BI capabilities, and a collaborative mindset to support cross-functional data initiatives.</p><p>Key Responsibilities:</p><ul><li>Design, build, and maintain scalable data pipelines using Azure Data Factory, Microsoft Fabric, PySpark, and Spark SQL (see the sketch below)</li><li>Develop ETL processes to extract, transform, and load data from diverse sources</li><li>Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality, actionable data solutions</li><li>Ensure data integrity, security, and compliance with governance standards</li><li>Optimize pipeline performance and troubleshoot data infrastructure issues</li><li>Manage the data platform roadmap, including capacity planning and vendor coordination</li><li>Support reporting and analytics needs using Power BI and SQL</li><li>Drive continuous improvement in data quality, accessibility, and literacy across the organization</li><li>Monitor usage, deprecate unused datasets, and implement data cleansing processes</li><li>Lead initiatives to enhance data modeling, visualization standards, and reporting frameworks</li></ul>
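<p>A minimal sketch of the kind of PySpark/Spark SQL pipeline described above, assuming a configured Spark environment with access to ADLS storage; the storage paths, column names, and aggregation logic are hypothetical:</p><pre>
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("daily_sales_pipeline").getOrCreate()

# Extract: read raw files from a hypothetical ADLS container
# (credentials are assumed to be configured elsewhere, e.g. in the cluster)
raw = spark.read.option("header", True).csv(
    "abfss://raw@exampleaccount.dfs.core.windows.net/sales/"
)

# Transform: clean and aggregate with Spark SQL
raw.createOrReplaceTempView("raw_sales")
daily = spark.sql("""
    SELECT order_date, region, SUM(CAST(amount AS DOUBLE)) AS total_amount
    FROM raw_sales
    WHERE amount IS NOT NULL
    GROUP BY order_date, region
""")

# Load: write the curated output back to the lake as Parquet
daily.write.mode("overwrite").parquet(
    "abfss://curated@exampleaccount.dfs.core.windows.net/daily_sales/"
)
</pre>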
<p>The Salesforce Technical Architect will deliver technical expertise in design, development, coding, testing, and enhancements to the Salesforce ecosystem. This role involves leading design sessions, architecting solutions from the ground up, and mentoring team members on best practices.</p><p>Key Responsibilities:</p><ul><li>Architect Salesforce.com solutions from the ground up</li><li>Design, develop, test, and deploy high-quality business solutions on the Salesforce platform</li><li>Lead design/coding sessions and resolve technical/design conflicts</li><li>Implement solutions across Sales Cloud, Service Cloud, Force.com, Chatter, and AppExchange</li><li>Develop using Apex, Visualforce, Lightning Components, SOQL, and related technologies</li><li>Integrate Salesforce with other applications using SOAP, REST, and middleware tools like MuleSoft (see the sketch below)</li><li>Mentor and coach less-experienced team members</li></ul>
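<p>As a hedged illustration of the REST integration work mentioned above, a minimal Python sketch that runs a SOQL query through Salesforce's REST API; the instance URL, API version, and access token are placeholders, and a real integration would obtain the token via an OAuth 2.0 flow:</p><pre>
import requests

# Placeholder values; a real integration would obtain these via OAuth 2.0
INSTANCE_URL = "https://example.my.salesforce.com"
ACCESS_TOKEN = "00D...session_token"  # assumption: token already issued

def query_accounts(soql: str) -> dict:
    """Run a SOQL query against the Salesforce REST query endpoint."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v59.0/query",  # API version is an assumption
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    result = query_accounts("SELECT Id, Name FROM Account LIMIT 5")
    for record in result.get("records", []):
        print(record["Id"], record["Name"])
</pre>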
<p>Job Description: Technical Analyst – Data Science</p><p>We are seeking a highly analytical and technically skilled Technical Analyst – Data Science to support business needs through data analysis, solution development, and collaboration with cross-functional teams. This role blends technical expertise with business acumen to deliver impactful insights and scalable data solutions.</p><p>Key Responsibilities:</p><ul><li>Analyze business processes and translate requirements into technical specifications</li><li>Design and develop scripts, APIs, and automation tools for advanced analytics</li><li>Build and maintain data pipelines to support data science initiatives</li><li>Collaborate with stakeholders and data scientists to define problems and implement solutions</li><li>Develop reports and visualizations to communicate insights and metrics</li><li>Ensure data quality and reliability across analytics workflows</li><li>Support Agile development processes including sprint planning and UAT</li><li>Document technical approaches and contribute to team knowledge sharing</li><li>Apply geospatial analysis techniques to support location-based decision-making (see the sketch below)</li></ul>
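<p>A toy sketch of the geospatial bullet above, using the shapely library to test whether customer locations fall inside a service area; the coordinates, area, and names are made up for illustration:</p><pre>
from shapely.geometry import Point, Polygon

# Hypothetical service area and customer locations, as (lon, lat) pairs
service_area = Polygon([(-95.6, 29.5), (-95.0, 29.5), (-95.0, 30.1), (-95.6, 30.1)])
customers = {"A": Point(-95.4, 29.8), "B": Point(-94.8, 29.9)}

# Flag which customers fall inside the service area (a point-in-polygon test)
for name, location in customers.items():
    status = "inside" if service_area.contains(location) else "outside"
    print(name, status)
</pre>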
<p>Job Summary: We are seeking an experienced Data Engineer with 8+ years of experience to architect, build, and maintain scalable data infrastructure and pipelines. This role is pivotal in enabling advanced analytics and data-driven decision-making across the organization. The ideal candidate will have deep expertise in data architecture, cloud platforms, and modern data engineering tools.</p><p>Key Responsibilities:</p><ul><li>Design, develop, and maintain scalable and efficient data pipelines and ETL processes.</li><li>Architect data solutions that support business intelligence, machine learning, and operational reporting.</li><li>Collaborate with cross-functional teams to gather requirements and deliver data solutions aligned with business goals.</li><li>Ensure data quality, integrity, and security across all systems and platforms.</li><li>Optimize data workflows and troubleshoot performance issues.</li><li>Integrate structured and unstructured data from various internal and external sources.</li><li>Implement and enforce data governance policies and best practices.</li></ul><p>Preferred Skills:</p><ul><li>Experience with real-time data streaming technologies (e.g., Kafka, Spark Streaming); see the consumer sketch below.</li><li>Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).</li><li>Knowledge of CI/CD pipelines and version control systems (e.g., Git).</li><li>Relevant certifications in cloud or data engineering technologies.</li></ul>
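<p>A minimal sketch of real-time stream consumption with the kafka-python client, as mentioned under Preferred Skills; the broker address, topic name, and message schema are assumptions for illustration:</p><pre>
import json

from kafka import KafkaConsumer  # assumption: kafka-python is installed

# Hypothetical broker and topic; JSON-encoded order events are assumed
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

# Consume order events and maintain a running total per region
totals = {}
for message in consumer:
    order = message.value
    region = order.get("region", "unknown")
    totals[region] = totals.get(region, 0.0) + float(order.get("amount", 0))
    print(region, totals[region])
</pre>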