We are looking for a Software Developer to join a team supporting application development efforts in Texas. This is a Contract position suited for someone who is detail-oriented and can build, enhance, and maintain web-based solutions using Microsoft technologies. The role offers the opportunity to contribute across the full development lifecycle while partnering with technical stakeholders to deliver reliable, scalable applications.<br><br>Responsibilities:<br>• Design, develop, and maintain software applications using C#, .NET, and ASP.NET technologies.<br>• Create and enhance web-based features that support usability, performance, and long-term maintainability.<br>• Write clean, efficient code and participate in code reviews to uphold development standards.<br>• Troubleshoot application issues, identify root causes, and implement effective fixes in a timely manner.<br>• Collaborate with team members and business partners to translate technical needs into practical software solutions.<br>• Support testing activities, debugging efforts, and deployment readiness for new and updated functionality.<br>• Contribute to ongoing improvements of existing systems built on the .NET Framework.<br>• Develop interactive front-end components using JavaScript to improve user experience and application functionality.
We are looking for a Data Engineer to support the design, development, and optimization of modern data solutions in Houston, Texas. This Long-term Contract position is ideal for someone who enjoys building reliable pipelines, working with large-scale datasets, and improving the flow of information across systems. The role offers the opportunity to contribute technical expertise in a collaborative environment focused on performance, scalability, and data quality.<br><br>Responsibilities:<br>• Build and maintain scalable data pipelines that collect, transform, and deliver data for analytics and operational use.<br>• Develop ETL processes that improve the accuracy, consistency, and availability of data across multiple sources.<br>• Use Python and Apache Spark to process large datasets efficiently and support advanced data engineering workflows.<br>• Work with Hadoop-based environments to manage distributed data processing and storage activities.<br>• Integrate streaming and messaging solutions using Apache Kafka to support timely data movement and event-driven processing.<br>• Monitor pipeline performance, troubleshoot failures, and implement enhancements that strengthen reliability and efficiency.<br>• Partner with technical and business stakeholders to understand data needs and translate them into practical engineering solutions.
<p><strong>Key Responsibilities:</strong></p><ul><li>Design and implement <strong>scalable cloud storage solutions in AWS</strong>, with a strong focus on <strong>S3 bucket architecture, optimization, and governance</strong></li><li>Support and help lead a small solutions team (Cloud Engineering & Storage) while contributing hands-on to technical delivery</li><li>Partner with business units and project managers to <strong>gather storage and infrastructure requirements</strong>, translating them into scalable cloud solutions</li><li>Evaluate current <strong>data center and cloud environments</strong>, recommending improvements, migrations, and modernization strategies</li><li>Manage and optimize <strong>cloud storage platforms (including file/object storage systems such as FFX or similar technologies)</strong></li><li>Provide support for <strong>production environments</strong>, ensuring reliability, performance, and cost efficiency across storage platforms</li><li>Communicate effectively with both technical and non-technical stakeholders, including limited <strong>end-user interaction and support</strong></li></ul><p><br></p>
<p>Position Overview</p><p>We are seeking a delivery‑focused Data Automation Engineer to design and implement innovative automation solutions across a Microsoft Azure‑based data analytics platform. This role partners closely with engineering teams and stakeholders to translate business requirements into scalable data engineering and AI‑enabled solutions.</p><p>The ideal candidate is hands‑on with Azure Data Factory, Synapse Pipelines, Apache Spark, Python, and SQL, and brings experience building reliable ETL pipelines across SQL and NoSQL environments. This role emphasizes performance optimization, automation, and proactive data quality within Agile DevOps delivery models.</p><p><br></p><p>Key Responsibilities</p><p>Data Engineering & Automation</p><ul><li>Develop high‑performance data pipelines using Azure Data Factory, Synapse Pipelines, Spark Notebooks, Python, and SQL.</li><li>Design ETL workflows supporting advanced analytics, reporting, and AI/ML use cases.</li><li>Implement data migration, integrity, quality, metadata, and security controls across pipelines.</li><li>Monitor, troubleshoot, and optimize pipelines for availability, scalability, and performance.</li></ul><p>Performance Testing & Optimization</p><ul><li>Execute ETL performance testing and validate load performance against benchmarks.</li><li>Analyze pipeline runtime, throughput, latency, and resource utilization.</li><li>Support tuning activities (e.g., query optimization, partitioning, indexing).</li><li>Validate data completeness and consistency after high‑volume processing.</li></ul><p>Platform Collaboration & DevOps Support</p><ul><li>Collaborate with DevOps and infrastructure teams to optimize compute, memory, and scaling.</li><li>Maintain versioning and configuration control across environments.</li><li>Support production, testing, development, and integration environments.</li><li>Actively participate in Agile delivery processes including Program Increment planning.</li></ul>
We are looking for a Business Intelligence (BI) Engineer to join a growing team in the Energy/Natural Resources sector. This contract opportunity with potential for a permanent role is ideal for someone who can turn complex business data into clear, actionable reporting and dashboard solutions. The position will focus on building reliable business intelligence assets, partnering with stakeholders to understand reporting needs, and delivering insights through Microsoft Power BI and related BI technologies.<br><br>Responsibilities:<br>• Design and develop interactive dashboards, visual reports, and data models that support operational and strategic decision-making.<br>• Work closely with business partners to gather reporting objectives, translate requirements into technical solutions, and deliver meaningful analytics.<br>• Build, optimize, and maintain Power BI datasets, reports, and dashboards to ensure accuracy, usability, and performance.<br>• Create calculated measures and logic using DAX to support advanced reporting needs and data interpretation.<br>• Validate data from multiple sources, troubleshoot reporting issues, and resolve inconsistencies to maintain dependable business intelligence outputs.<br>• Improve reporting processes by identifying opportunities to streamline workflows, enhance data visibility, and strengthen analytics capabilities.<br>• Document BI solutions, reporting standards, and technical design details to support ongoing maintenance and knowledge sharing.<br>• Support evolving business intelligence initiatives, including changes to reporting environments or related systems, as needed.
<p><strong><u>Essential Duties and Responsibilities:</u></strong></p><ul><li>Design and deploy F5 BIG-IP solutions, including LTM (Local Traffic Manager), DNS, and APM (Access Policy Manager).</li><li>Design and deploy Security Assertion Markup Language (SAML) / OpenID Connect (OIDC) authentication methodologies.</li><li>Configure and manage advanced F5 iRules and policies to support business-critical applications.</li><li>Optimize application performance by implementing load balancing, SSL offloading, and traffic routing solutions.</li><li>Troubleshoot and resolve issues related to F5 devices, ensuring high availability and performance.</li><li>Collaborate with cross-functional teams to integrate F5 solutions into existing network infrastructure.</li><li>Monitor F5 devices and applications using analytics tools to detect and mitigate potential risks.</li><li>Implement F5 WAF (Web Application Firewall) configurations to protect against web-based threats.</li><li>Automate routine F5 tasks using APIs, Ansible, or other automation frameworks.</li><li>Maintain and update F5OS, system documentation, policies, and procedures.</li><li>Stay updated on the latest F5 technologies and industry best practices.</li></ul><p><br></p>