<p>Candidates must have more than five (5) years of hands-on experience with large, complex ADMS/OMS environments, preferably GE V ADMS v3.11 or higher. This experience must include planning, developing, and implementing test strategies and plans, as well as creating and executing comprehensive test cases.</p><p>We need a contractor to assume the responsibilities of a QA Test Lead for OMS (see attached job description).</p><ul><li>Coordinate with the development team, business analysts, and product/business owners to gain a thorough understanding of the application and testing requirements.</li><li>Review requirements and design documents, log all questions into the clarification tracker, and ensure there are no unresolved queries before testing begins.</li><li>Determine which items are in scope and out of scope for testing.</li><li>Create test scenarios, stories, and test cases that cover all functionalities and requirements, ensuring they are comprehensive and effective.</li><li>Prepare test data, confirming its availability and accuracy.</li><li>Develop a Requirement Traceability Matrix (RTM) to ensure all requirements are validated.</li><li>Support the Test Lead in creating the test plan.</li><li>Upload test cases into the test management tool for execution.</li><li>Execute thorough testing by running all test cases to confirm the application functions as intended.</li><li>Conduct regression testing to verify that fixes have not introduced new issues.</li><li>Document test results with screenshots, including date and timestamp, and mask any PII.</li><li>Identify and report defects found during testing, documenting steps to reproduce, test data, screenshots, and relevant details.</li><li>Retest defects and re-execute test cases once defects are resolved.</li><li>Track application downtime in the downtime tracker.</li><li>Report any risks to the Test Lead and Project Manager, outlining the impact on testing or the project.</li></ul>
<p><strong>About the Role:</strong></p><p>We are seeking a versatile and forward-thinking Full Stack Developer to join our dynamic team. The ideal candidate will be proficient across multiple programming languages and frameworks, with a strong foundation in AI integration, testing, and performance optimization. This role requires a developer who thrives in a fast-paced environment and is passionate about building secure, scalable, and innovative web applications.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, develop, and maintain full stack applications using Ruby, Ruby on Rails, Python, Django, HTML, CSS, and JavaScript.</li><li>Integrate AI frameworks, APIs, and plugins to enhance application capabilities.</li><li>Conduct thorough debugging, unit testing, and regression testing to ensure code quality.</li><li>Ensure applications meet security and compliance standards, including US and EU regulations.</li><li>Implement and manage payment systems integration.</li><li>Perform load testing to validate performance under high traffic conditions.</li><li>Collaborate with cross-functional teams to define and implement web architecture (preferred but not required).</li></ul><p><br></p><p><br></p>
<p>We are seeking an experienced <strong>Business Analyst</strong> with a strong background in the <strong>Property & Casualty Insurance</strong> industry to support complex technology initiatives. This role will partner with business stakeholders, technical teams, and third-party vendors to gather, analyze, and document functional requirements for underwriting, claims, and data-focused projects.</p><p>The ideal candidate brings deep knowledge of the <strong>P&C insurance domain</strong>, particularly underwriting operations and policy administration, along with strong technical analysis skills and the ability to translate business needs into clear requirements.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Gather, analyze, and document business and functional requirements for complex IT projects.</li><li>Work closely with business users, developers, architects, QA teams, infrastructure teams, and software vendors to ensure requirements are clearly defined and delivered on time.</li><li>Support initiatives related to <strong>underwriting</strong>, <strong>claims applications</strong>, <strong>data integration</strong>, and <strong>data quality</strong>.</li><li>Translate business processes and needs into detailed functional specifications, workflows, and use cases.</li><li>Partner with stakeholders to improve system functionality and support business objectives across the insurance policy lifecycle.</li><li>Assist with requirements validation, testing support, and issue resolution throughout the project lifecycle.</li><li>Identify opportunities where technology can improve operational efficiency and business outcomes.</li></ul><p><br></p>
<p>The Senior Software Engineer is a hands-on technical leadership position responsible for designing, building, and maintaining high-quality software solutions. This role emphasizes both individual development work and ownership of design decisions for features and subsystems. Modern tools, including AI-assisted development and architectural support, are leveraged to drive delivery while maintaining accountability for technical outcomes.</p><p><br></p><p><strong>Responsibilities:</strong></p><p><br></p><ul><li>Design, implement, test, and maintain scalable, secure, and reliable applications and services.</li><li>Act as a senior technical contributor, with responsibility for the design and implementation of features and subsystems.</li><li>Contribute actively to development tasks, applying advanced coding expertise in several programming languages and frameworks.</li><li>Participate in architectural discussions and support incremental evolution of systems with team leads.</li><li>Conduct code reviews and mentor engineering team members, fostering best practices and ongoing improvement.</li><li>Translate requirements from product owners, business analysts, and stakeholders into technical solutions.</li><li>Identify and mitigate technical risks in assigned systems and projects.</li><li>Support and enhance cloud-based applications (Azure, AWS) with emphasis on performance, reliability, and scalability.</li><li>Collaborate effectively with onshore and offshore teams to ensure successful project execution.</li><li>Keep abreast of industry trends and new technologies to encourage innovation.</li><li>Utilize AI-assisted tools to expedite design, documentation, and implementation, while ensuring technical quality.</li><li>Lead and support AI-related initiatives, drawing on prior experience with AI/ML technologies; recommend and implement suitable AI tools and frameworks.</li><li>Test and demonstrate emerging AI tools and platforms via proofs of concept (POCs) to highlight 
business value.</li><li>Guide customers in leveraging AI to optimize business processes; support teams working on business-facing AI efforts.</li><li>Collaborate with stakeholders to contribute to defining an AI roadmap aligned with organizational strategy and technology objectives.</li></ul>
<p>We are seeking an experienced <strong>HCM Developer</strong> to support, enhance, and maintain Oracle HCM integrations and related technical solutions. This role will be responsible for reviewing Oracle updates and release documentation, assessing impacts to current and future integrations, resolving production issues, coordinating testing efforts, and developing technical solutions that align with evolving business needs. The ideal candidate will collaborate closely with functional teams, business stakeholders, and internal and external integration partners to ensure seamless integration performance and ongoing system optimization.</p><p><strong>Key Responsibilities:</strong></p><ul><li>Review Oracle updates, upgrades, and release documentation to determine impacts on existing and new integrations.</li><li>Analyze Oracle release notes for changes that may affect current integrations, including file layouts, new or renamed fields, and field size changes.</li><li>Work closely with internal and external integration partners to communicate changes, testing requirements, and technical updates.</li><li>Assist with production integration issues and troubleshoot problems in collaboration with business teams and integration partners.</li><li>Identify, recommend, and help implement solutions for production and integration-related issues.</li><li>Log service requests and work with Oracle Support as needed to resolve technical issues.</li><li>Develop new code and modify existing code to support integration and business requirements.</li><li>Coordinate testing efforts with end users, functional leads, and integration partners.</li><li>Develop technical documentation and integration requirements to support retrofit efforts and code updates by integration partners.</li><li>Partner with functional leads to coordinate testing, implementation, and rollout activities.</li><li>Create and maintain project plans to manage integration updates, timelines, and testing activities.</li><li>Work with functional leads to understand business requirements as new modules and functionality are introduced, especially where integration automation may be affected.</li><li>Stay current on Oracle HCM technologies, releases, and emerging features to identify opportunities for improved automation and business solutions.</li><li>Perform other related duties as assigned.</li></ul><p><br></p>
<p>We are seeking a skilled and motivated Data Engineer to join our team, with deep hands-on experience building and optimizing data pipelines and lakehouse solutions in Databricks. In this role, you will collaborate with cross-functional teams to design, develop, and operate scalable, reliable data products that drive business value.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Design, build, and maintain batch and streaming data pipelines using Databricks (Spark, Delta Lake, Jobs/Workflows).</li><li>Partner with data scientists, analysts, and application teams to deliver trusted, well-modeled data sets and features in the Databricks Lakehouse.</li><li>Optimize Spark jobs (partitioning, caching, join strategies) and Databricks cluster configurations for performance, scalability, and cost.</li><li>Implement data quality checks, observability, governance, and security controls (e.g., Unity Catalog, access policies) within Databricks.</li><li>Troubleshoot and resolve pipeline failures, data issues, and production incidents; perform root-cause analysis and implement preventative improvements.</li></ul><p><br></p>
<p>We are seeking a hands-on technical leader with deep expertise in software development, infrastructure, and web application design. This role is responsible for shaping infrastructure strategy to ensure systems are scalable, efficient, and reliable. The ideal candidate will lead initiatives that tackle questions such as how to improve application speed and how to prepare platforms to seamlessly support growing traffic, ultimately driving solutions that enhance performance and reliability across the organization.</p><p><br></p><p><strong>Responsibilities</strong></p><ul><li>Define and drive backend, infrastructure, and DevOps strategy</li><li>Architect scalable, secure, and reliable systems</li><li>Diagnose and resolve performance, infrastructure, and integration issues</li><li>Oversee backend development, including coding standards, integrations, and APIs</li><li>Lead infrastructure tooling, cloud strategy, automation efforts, and CI/CD pipelines</li><li>Evaluate new tools and frameworks to improve efficiency and system performance</li><li>Establish metrics and KPIs to measure infrastructure performance</li><li>Implement proactive monitoring, logging, and alerting solutions</li><li>Manage infrastructure budgets with a focus on cost optimization</li><li>Communicate technical solutions clearly and effectively to stakeholders at all levels</li><li>Conduct all code reviews, ensuring quality, consistency, and best practices across the team</li><li>Manage frontend and database infrastructure, addressing performance bottlenecks and recurring issues caused by poor architecture</li></ul><p><br></p>
<p>We are seeking an Integration Engineer to provide technical leadership across the end-to-end development and delivery of new insurance products and capabilities, enabling new business opportunities and operational efficiencies. This role will lead the design, development, and integration of underwriting frameworks and platforms across multiple lines of business using Agile and DevOps methodologies.</p><p>This position requires hands-on development experience, deep integration expertise, and the ability to technically lead globally distributed teams to deliver scalable, secure, and high-quality cloud-native platforms and services aligned with business priorities.</p><p><br></p><p><strong>Key Responsibilities:</strong></p><ul><li>Provide hands-on technical leadership for application design, development, and integration of secure and scalable solutions.</li><li>Lead end-to-end development, implementation, and integration of new insurance products and packages using underwriting frameworks.</li><li>Design and implement platforms enabling continuous delivery of features and capabilities aligned with business priorities.</li><li>Partner closely with Product Managers/Owners, Business Analysts, Developers, and enterprise teams to define optimal technical solutions.</li><li>Lead high-level design efforts, perform code reviews, and guide implementation by development teams.</li><li>Develop select modules while reviewing and mentoring other engineers’ code.</li><li>Mentor and coach junior engineers across multiple Scrum teams.</li><li>Leverage an Agile-based operating model across distributed teams.</li><li>Prepare technical documentation and deliver sprint and system demonstrations.</li><li>Manage production workloads and act as an SME for Level 3 production support issues.</li><li>Perform complex troubleshooting, root cause analysis, and resolution of critical production defects.</li><li>Communicate status, risks, and dependencies in accordance with agreed communication plans.</li></ul>
<p>We are seeking a highly skilled Data Engineer to design, build, and manage our data infrastructure. The ideal candidate is an expert in writing complex SQL queries, designing efficient database schemas, and developing ETL/ELT pipelines. This role ensures data accuracy, accessibility, and performance optimization to support business intelligence, analytics, and reporting initiatives.</p><p><br></p><p><strong><em><u>Key Responsibilities</u></em></strong></p><p><br></p><p><strong>Database Design & Management</strong></p><ul><li>Design, develop, and maintain relational databases, including SQL Server, PostgreSQL, and Oracle, as well as cloud-based data warehouses.</li></ul><p><strong>Strategic SQL & Data Engineering</strong></p><ul><li>Develop advanced, optimized SQL queries, stored procedures, and functions to process and analyze large, complex datasets and deliver actionable business insights.</li></ul><p><strong>Data Pipeline Automation & Orchestration</strong></p><ul><li>Build, automate, and orchestrate ETL/ELT workflows using SQL, Python, and cloud-native tools to integrate and transform data from diverse, distributed sources.</li></ul><p><strong>Performance Optimization</strong></p><ul><li>Tune SQL queries and optimize database schemas through indexing, partitioning, and normalization to improve data retrieval and processing performance.</li></ul><p><strong>Data Integrity & Security</strong></p><ul><li>Ensure data quality, consistency, and integrity across systems.</li><li>Implement data masking, encryption, and role-based access control (RBAC).</li></ul><p><strong>Documentation</strong></p><ul><li>Maintain comprehensive technical documentation, including database schemas, data dictionaries, and ETL workflows.</li></ul>