<p><strong>Position Summary:</strong></p><ul><li>We are looking for a Data Operations Engineer to support and oversee the automated data‑pipeline environment built on AWS. This position bridges data engineering and customer operations, ensuring that incoming datasets are processed accurately, consistently, and securely within established ingestion and transformation frameworks.</li><li>Key responsibilities include monitoring automated workflows, troubleshooting processing failures, validating data quality, and helping onboard new customers by aligning their data formats to a standardized internal model.</li><li>The role requires strong proficiency in SQL and Python, practical experience with AWS services, and the ability to communicate effectively with external customers when data issues arise.</li></ul><p><strong>Responsibilities:</strong></p><p><strong>Data Pipeline Monitoring & Operations:</strong></p><ul><li>Monitor automated batch and streaming data pipelines in AWS</li><li>Identify, troubleshoot, and resolve data processing failures</li><li>Investigate file‑level errors, schema mismatches, and transformation issues</li><li>Perform root‑cause analysis and document resolutions</li><li>Ensure data integrity, completeness, and timeliness across environments</li><li>Escalate architectural or systemic issues to the Data Engineering team</li></ul><p><strong>Customer Data Onboarding & Implementation:</strong></p><ul><li>Collaborate directly with customers to understand their file formats and data structures</li><li>Create and maintain mapping templates to align customer data to a normalized data model</li><li>Validate sample files and run tests on ingestion workflows</li><li>Configure ingestion parameters within predefined frameworks</li><li>Support customer go‑live processes and initial data processing cycles</li></ul><p><strong>Data Quality & Continuous Improvement:</strong></p><ul><li>Write SQL queries to validate data accuracy and research 
anomalies</li><li>Develop lightweight Python scripts for validation, transformation checks, or automation tasks</li><li>Improve monitoring processes, internal documentation, and operational playbooks</li><li>Work with engineering teams to strengthen platform reliability and observability</li></ul><p><strong>Customer & Cross‑Functional Collaboration:</strong></p><ul><li>Communicate clearly with customers regarding file issues or data discrepancies</li><li>Partner with internal teams including Data Engineering, Product, and Support</li><li>Provide feedback to enhance scalability, resilience, and overall platform performance</li></ul>
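<p>As a rough illustration of the SQL- and Python-based data validation described above, the sketch below runs a small set of quality checks against a staged customer file. It is a minimal, hypothetical example: the table and column names (<code>stg_orders</code>, <code>order_id</code>, <code>amount</code>) and the SQLite backend are illustrative assumptions, not the actual platform.</p>

```python
import sqlite3

# Hypothetical staging table standing in for an ingested customer file.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id TEXT, amount REAL);
    INSERT INTO stg_orders VALUES ('A1', 10.0), ('A2', NULL), ('A2', 5.5);
""")

checks = {
    # Every row must carry an amount.
    "null_amounts": "SELECT COUNT(*) FROM stg_orders WHERE amount IS NULL",
    # order_id is expected to be unique within a file.
    "duplicate_ids": """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM stg_orders
            GROUP BY order_id HAVING COUNT(*) > 1
        )
    """,
}

results = {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}
failures = {name: n for name, n in results.items() if n > 0}
print(failures)  # a non-empty dict means the file fails validation
```

<p>In practice each check would be a named rule in an operational playbook, so a failed ingestion run can be traced to a specific, documented data-quality condition.</p>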
<p><strong>Overview</strong></p><p>The SQL Database Administrator will be responsible for ensuring the optimal performance, availability, and security of our database systems. This role requires a keen understanding of SQL Server architecture, with a focus on performance profiling, permissions management, and data integrity practices. The ideal candidate will possess strong technical skills in SQL and related database technologies, along with an analytical mindset. Azure experience is also required.</p><p><br></p><p><strong><u>Responsibilities</u></strong></p><p><strong>Performance Profiling:</strong></p><ul><li>Monitor, analyze, and optimize the performance of databases to ensure they meet the needs of the business.</li><li>Use tools such as SQL Server Management Studio (SSMS) to identify and resolve performance bottlenecks.</li><li>Regularly conduct performance reviews and make necessary adjustments to database configurations for optimal operation.</li></ul><p><strong>Permissions Management:</strong></p><ul><li>Ensure that database users and applications have the appropriate permissions to access and modify tables, stored procedures, and other database objects.</li><li>Maintain column-level encryption for data across clustered server environments.</li><li>Develop and implement policies for access controls and audit logging.</li></ul><p><strong>Database Optimization:</strong></p><ul><li>Update databases to leverage foreign keys, indexes, and other features that enhance performance and integrity.</li><li>Perform database normalization to eliminate redundancy and improve data consistency.</li><li>Optimize stored procedures and application queries for efficiency and speed.</li><li>Apply load balancing, clustering, database tuning, and denormalization experience to maintain optimal performance.</li><li>Leverage Data-tier application tools to maintain schema versioning and consistency across multiple databases.</li></ul><p><strong>Collaboration and 
Support:</strong></p><ul><li>Work closely with developers and other stakeholders to support database needs and address technical challenges.</li><li>Provide training and guidance on SQL Server technologies and best practices to team members as needed.</li><li>Support the team when creating new columns/tables to optimize storage and data retrieval efficiency based on factors such as data types.</li><li>Stay updated with the latest trends in database technology to continually improve our systems.</li></ul><p><strong>Backup, Recovery, and Replication:</strong></p><ul><li>Develop, implement, and maintain a comprehensive backup and disaster recovery strategy.</li><li>Monitor backup logs and storage systems to ensure that all backups are successful, complete, and performed on schedule.</li><li>Conduct regular tests of the restore processes to validate effectiveness and to refresh lower environments with production-like data.</li><li>Manage replication of databases across multiple servers in the cluster and review updates before release to mitigate their impact on replication.</li></ul><p><strong>Data Sanitization:</strong></p><ul><li>Ensure that sensitive data is properly sanitized in lower environments to reduce unnecessary exposure of personal or sensitive information.</li></ul>
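<p>A minimal sketch of the kind of sanitization described above, applied before refreshing a lower environment with production-like data. The field names (<code>email</code>, <code>ssn</code>) and masking conventions are hypothetical examples, not a prescribed implementation; hashing the email local part keeps the value stable so joins across sanitized tables still work.</p>

```python
import hashlib

def mask_email(email: str) -> str:
    """Replace the local part with a stable hash; route to a safe domain."""
    local, _, _domain = email.partition("@")
    digest = hashlib.sha256(local.encode()).hexdigest()[:10]
    return f"user_{digest}@example.com"

def mask_ssn(ssn: str) -> str:
    """Keep only the last four digits, a common masking convention."""
    return "***-**-" + ssn[-4:]

row = {"customer_id": 42, "email": "jane.doe@corp.com", "ssn": "123-45-6789"}
sanitized = {
    "customer_id": row["customer_id"],   # non-sensitive key: keep as-is
    "email": mask_email(row["email"]),
    "ssn": mask_ssn(row["ssn"]),
}
print(sanitized["ssn"])  # ***-**-6789
```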
<p><strong>Responsibilities</strong></p><ul><li>Lead end-to-end delivery of large-scale technology programs across Corporate Functions</li><li>Drive program governance, execution tracking, and risk/issue management across multiple workstreams</li><li>Coordinate across business, product, architecture, and engineering teams to ensure alignment on scope, priorities, and timelines</li><li>Manage interdependencies across integration initiatives (e.g., platform consolidation, data migration, vendor systems)</li><li>Establish and maintain program plans, RAID logs, and executive-level reporting</li><li>Ensure alignment with regulatory, risk, and control requirements in a banking environment</li><li>Facilitate decision-making and escalation management with senior stakeholders (CIO, product owners, vendors)</li></ul><p><br></p>
<p><strong>Overview</strong></p><p>Our client is seeking a Senior Software Engineer to add to their team as they continue building out integrations on a weekly basis. This is a 90-day contract-to-hire position and is 100% remote. Client can only hire in these approved states: Florida, Georgia, Iowa, Kentucky, Maryland, Michigan, Missouri, North Carolina, Nebraska, New York, Ohio, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas, Virginia, Washington, Wisconsin, or West Virginia.</p><p><br></p><p><strong>Key Responsibilities</strong></p><ul><li>Architect and Implement Integrations Framework: Develop a scalable and resilient integrations framework that prioritizes ETL techniques and data pipeline efficiency.</li><li>Technical Leadership & Mentorship: Lead and mentor a team of 3 engineers, promoting a culture of extreme ownership, accountability, and clear, effective communication.</li><li>Develop Data Integrations: Design and develop robust integrations with third-party systems, emphasizing data extraction, transformation, and loading combined with API-driven approaches.</li><li>Establish Best Practices: Define and enforce best practices for integration design, development, documentation, and open team communication.</li><li>Collaborate with Stakeholders: Work closely with product managers, engineering teams, and other stakeholders, ensuring alignment with business objectives through transparent and proactive communication.</li><li>Oversee Project Delivery: Manage end-to-end delivery of integration projects, ensuring timely completion and accountability at every stage.</li><li>Drive Innovation: Lead initiatives to innovate our integration strategies and technologies, continuously improving our data handling and ETL processes.</li></ul>
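<p>The extract-transform-load pattern named in the responsibilities above can be sketched in a few lines. This is a toy example under stated assumptions: the record shape, the <code>invoices</code> table, and the SQLite target are all hypothetical stand-ins for the client's third-party systems and framework.</p>

```python
import sqlite3

def extract():
    # Stand-in for pulling raw records from a third-party API.
    return [{"id": "1", "total": "10.50"}, {"id": "2", "total": "3.25"}]

def transform(records):
    # Normalize string payloads into typed rows before loading.
    return [(int(r["id"]), float(r["total"])) for r in records]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS invoices (id INTEGER, total REAL)")
    conn.executemany("INSERT INTO invoices VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT SUM(total) FROM invoices").fetchone()[0])  # 13.75
```

<p>Keeping the three stages as separate functions is what makes such a framework testable and resilient: each stage can be retried, validated, or mocked independently.</p>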