We are looking for an experienced Cloud Administrator to oversee and enhance our cloud and hybrid infrastructure, ensuring optimal performance, security, and scalability. This role requires proactive management of enterprise systems and a hands-on approach to Azure-based technologies. You will play a critical role in implementing best practices, collaborating with teams, and driving innovation within our IT operations.<br><br>Responsibilities:<br>• Design, implement, and maintain robust Azure-based infrastructure, including virtual machines, networking, storage, and identity management.<br>• Manage Azure Active Directory configurations, ensuring secure access and identity governance.<br>• Optimize cloud resources to achieve maximum performance, scalability, and cost efficiency.<br>• Implement and monitor security protocols across both cloud and on-premises environments.<br>• Ensure compliance with industry standards while managing firewalls, VPNs, and endpoint security solutions.<br>• Administer hybrid systems that integrate on-premises and Azure cloud infrastructures.<br>• Troubleshoot and resolve complex infrastructure issues while monitoring system performance.<br>• Develop and maintain automation scripts using tools such as PowerShell, Terraform, or Azure templates.<br>• Collaborate with DevOps teams to improve CI/CD pipelines and adopt cloud-native practices.<br>• Mentor less experienced team members, contribute to strategic IT planning, and support disaster recovery initiatives.
We are in search of a Land Analyst to join our team situated in Houston, Texas. In this role, you will manage customer applications and maintain accurate customer records, as well as monitor customer accounts and take appropriate action. This opportunity is a perfect fit for individuals who thrive in a dynamic, fast-paced team environment.<br><br>Responsibilities:<br>• Processing customer credit applications with accuracy and efficiency.<br>• Maintaining accurate customer credit records.<br>• Communicating effectively with operators on issues related to interest calculations, title support requests, curative, suspense, direct deposit, and billing and revenue.<br>• Reviewing assignments, leases, contracts, and title opinions to determine and verify company interest.<br>• Ensuring correct wells, leases, and contracts are assigned during the review of draft assignments.<br>• Managing the workover AFE function, including verification of ownership and interest, internal routing, tracker maintenance, and timely communication of elections to operators.<br>• Creating well, lease, and contract exhibits for documents to be filed of record.<br>• Setting up and maintaining JIB and revenue decks representing company interests in Bolo.<br>• Handling the setup and maintenance of operator and vendor records in Bolo.<br>• Assisting with acquisition, divestiture, due diligence, audits, and other special projects as needed.<br>• Providing necessary documentation and explanations related to land administration during external audits.
<p><strong>Telecom Analyst – Telecom Optimization (Contract)</strong></p><p><strong>Role Summary</strong></p><p>This contract role supports telecom cost optimization and modernization efforts by evaluating legacy analog voice and data services. The Telecom Analyst will analyze usage, billing, and inventory data to determine whether services should be retained, replaced, or disconnected, and will provide clear, data‑driven recommendations to stakeholders. The position requires hands‑on experience with legacy telecommunications circuits and strong analytical and communication skills.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Analyze telecommunications billing, usage, and inventory data to identify cost‑saving opportunities</li><li>Evaluate legacy analog lines and data circuits by location, usage, and business need</li><li>Identify redundant, underutilized, or obsolete services for disconnect or replacement</li><li>Collaborate with site contacts, TEM vendors, and telecom carriers to gather and validate required data</li><li>Develop clear recommendations to retain, replace, or disconnect telecom services</li><li>Track findings, decisions, and actions in designated tracking and reporting systems</li><li>Manage project timelines, deliverables, and execution of approved disconnect or optimization initiatives</li><li>Provide regular status updates, resolve issues, and maintain clear communication with stakeholders throughout the project lifecycle</li></ul><p><strong>Key Interactions</strong></p><ul><li>Site leadership and local contacts to validate service requirements and business impact</li><li>Telecom Expense Management (TEM) vendors for billing, inventory, and reporting data</li><li>Telecom carriers to confirm circuit details, disconnects, and service changes</li><li>Internal project stakeholders to provide updates, recommendations, and execution status</li></ul>
<p><strong>Principal Data Scientist (AI/ML Focus)</strong></p><p><strong>Service Type:</strong> 42-Week Contract</p><p><strong>Worksite:</strong> Onsite, Monday–Thursday — Houston, TX</p><p><strong>Pay:</strong> Available on W2</p><p><strong>Position Overview</strong></p><p>We are seeking a <strong>Principal Data Scientist</strong> with deep expertise in <strong>AI, Machine Learning, Natural Language Processing (NLP), Computer Vision (CV), and Generative AI</strong>. This role requires a strong technical foundation, excellent communication skills, and the ability to translate complex methodologies into meaningful business outcomes.</p><p>The ideal candidate is proactive, innovative, and passionate about developing advanced AI-driven solutions using modern architectures including <strong>LLMs, deep learning models, multi-agent systems, and generative AI techniques</strong>.</p><p><strong>Requirements</strong></p><ul><li>Strong background in <strong>NLP, Computer Vision, and Generative AI</strong>.</li><li>Broad background in <strong>Artificial Intelligence</strong>.</li><li>Excellent verbal and written communication skills.</li></ul><p><strong>Key Responsibilities</strong></p><ul><li>Develop, train, and optimize <strong>machine learning and deep learning models</strong>.</li><li>Build advanced AI solutions using <strong>LLMs, multi-agent systems, fine-tuning techniques, and inference optimization</strong>.</li><li>Transform complex data science methodologies into actionable insights.</li><li>Collaborate closely with stakeholders to develop high-value, data-driven solutions.</li><li>Create clear, compelling presentations, dashboards, and deliverables for non-technical audiences.</li><li>Drive full lifecycle AI/ML projects from ideation through deployment.</li></ul>
We are looking for an experienced AWS/Databricks Engineer to join our team in Houston, Texas. This is a long-term contract position ideal for professionals with a strong background in data engineering and cloud technologies. The role will focus on leveraging Python and Databricks to optimize data processes and enhance system performance.<br><br>Responsibilities:<br>• Develop and implement scalable data engineering solutions using Python and Databricks.<br>• Collaborate with cross-functional teams to design and optimize data workflows.<br>• Migrate and enhance existing Python scripts to Databricks for improved functionality.<br>• Utilize cloud technologies to support data integration and analytics processes.<br>• Implement algorithms and data visualization methods to present actionable insights.<br>• Design and maintain APIs to streamline data interactions and integrations.<br>• Work with tools like Apache Kafka, Spark, and Hadoop to manage large-scale data systems.<br>• Perform data analysis and develop strategies to improve system efficiency.<br>• Ensure high-quality data pipelines and address performance bottlenecks.<br>• Stay updated on emerging trends in data engineering and recommend innovative solutions.
<p>We are seeking a skilled <strong>Azure Data Engineer</strong> to design, build, and maintain scalable data solutions on the Microsoft Azure platform. The ideal candidate will have strong experience developing data pipelines, optimizing data architectures, and supporting analytics and business intelligence initiatives. This role will work closely with data analysts, data scientists, and business stakeholders to ensure reliable, high-quality data is available for reporting and advanced analytics.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, develop, and maintain <strong>scalable data pipelines and ETL/ELT processes</strong> using Azure data services.</li><li>Build and manage data solutions using tools such as <strong>Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, and Azure Databricks</strong>.</li><li>Develop and optimize <strong>data models, transformations, and storage strategies</strong> for large-scale structured and unstructured datasets.</li><li>Ensure <strong>data quality, integrity, and security</strong> across the data platform.</li><li>Monitor and troubleshoot data workflows, pipeline failures, and performance issues.</li><li>Collaborate with data analysts, BI developers, and data scientists to deliver reliable datasets for reporting and analytics.</li><li>Implement <strong>data governance and best practices</strong> for data management and documentation.</li><li>Automate data processes and deployments using <strong>CI/CD pipelines and infrastructure-as-code practices</strong>.</li><li>Optimize cost and performance of Azure data services.</li><li>Stay current with new Azure features, tools, and industry best practices.</li></ul><p><br></p>
<p>Position Overview</p><p>We are seeking a talented <strong>Data Engineer</strong> with strong experience in <strong>Python, AWS, and Databricks</strong> to design and build scalable data pipelines and modern data platforms. The ideal candidate will help develop and maintain data infrastructure that supports analytics, machine learning, and business intelligence initiatives. This role requires hands-on experience working with large datasets, cloud-native architectures, and distributed data processing frameworks.</p><p><br></p><p>Key Responsibilities</p><ul><li>Design, build, and maintain <strong>scalable data pipelines and ETL/ELT workflows</strong> using Python and cloud technologies.</li><li>Develop and optimize data solutions using <strong>AWS services and Databricks</strong>.</li><li>Build and manage <strong>data lakes and data warehouses</strong> for structured and unstructured data.</li><li>Implement <strong>data transformation and processing pipelines</strong> using Apache Spark within Databricks.</li><li>Integrate data from multiple sources including APIs, databases, and streaming systems.</li><li>Ensure <strong>data quality, governance, security, and compliance</strong> across the data platform.</li><li>Monitor pipeline performance and troubleshoot <strong>data pipeline failures or latency issues</strong>.</li><li>Collaborate with <strong>data analysts, data scientists, and business stakeholders</strong> to deliver reliable datasets.</li><li>Optimize storage and compute costs within the AWS ecosystem.</li></ul><p><br></p>