We are looking for an experienced Data Engineer to join our team on a contract basis in Cleveland, Ohio. This role involves creating scalable data solutions, optimizing database environments, and supporting business intelligence reporting for manufacturing metrics. The ideal candidate will have expertise in modern data engineering practices and a strong ability to collaborate with stakeholders.<br><br>Responsibilities:<br>• Redesign and optimize existing data models to improve efficiency and scalability.<br>• Structure and organize incoming data to ensure seamless integration into reporting systems.<br>• Build advanced time intelligence features within Power BI to enhance reporting capabilities.<br>• Craft operational reports that provide actionable insights on manufacturing metrics.<br>• Develop and implement reporting solutions that deliver measurable business value.<br>• Utilize modern data transformation tools, such as dbt, to streamline workflows.<br>• Support analytical reporting and contract review processes by ensuring accurate data representation.<br>• Assist in establishing a robust database environment that integrates well with Power BI.<br>• Collaborate with stakeholders to understand data requirements and translate them into actionable solutions.<br>• Explore and implement forward-thinking data engineering practices to enhance system performance.
<p>We are looking for a skilled Data Engineer to join our team. In this role, you will play a key part in designing and building scalable data solutions to support our mission of improving cancer care. The ideal candidate will thrive in a collaborative environment and have a strong background in developing robust data pipelines and working with cloud-based platforms.</p><p><br></p><p>Responsibilities:</p><p>• Design and implement scalable, componentized processes to enhance the business intelligence platform.</p><p>• Develop and optimize robust data pipelines to support data integration and transformation.</p><p>• Collaborate with cross-functional teams to translate requirements into actionable tasks and deliverables.</p><p>• Evaluate and utilize big data and cloud technologies to deliver effective solutions.</p><p>• Troubleshoot and resolve technical issues efficiently, while identifying opportunities for process improvements.</p><p>• Maintain clear documentation and communicate effectively with team members about code functionality.</p><p>• Adapt to shifting priorities and requirements in an agile development environment.</p><p>• Implement instrumentation, logging, and alerting practices to ensure system reliability.</p>
We are looking for a skilled and experienced Senior Data Engineer to join our team in New York, New York. This role is ideal for someone who thrives on working with complex datasets, building scalable data solutions, and collaborating with cross-functional teams. If you have a passion for leveraging data to drive strategic decisions, we encourage you to apply.<br><br>Responsibilities:<br>• Design, implement, and maintain scalable data architectures to support business intelligence and analytics needs.<br>• Develop and optimize data pipelines and workflows for efficient data processing and integration.<br>• Collaborate with stakeholders to understand data requirements and translate them into actionable solutions.<br>• Leverage cloud platforms, such as AWS, Azure, or Google Cloud, to manage and enhance data infrastructure.<br>• Utilize Big Data technologies, including Apache Spark and Databricks, to process and analyze large datasets.<br>• Write efficient scripts and perform data manipulation using Python and other relevant programming languages.<br>• Ensure data quality and integrity by implementing robust validation and monitoring processes.<br>• Provide technical guidance and support to team members on best practices in data engineering.<br>• Stay updated on emerging data technologies and tools to continuously improve systems.<br>• Partner with API developers to integrate data systems and enable seamless data access.
<p>We are looking for an experienced Informatica and AWS Data Engineer to join our team in Southern California. In this long-term, multi-year position, you will play a pivotal role in configuring and managing Informatica Cloud Catalog, Governance, and Marketplace systems, ensuring seamless integration with various platforms and tools. This opportunity is ideal for professionals with a strong background in data governance, security, and compliance, as well as expertise in cloud technologies and database systems.</p><p><br></p><p>Responsibilities:</p><p>• Configure and implement role-based and policy-based access controls within Informatica Cloud Catalog and Governance systems.</p><p>• Develop and set up connections for diverse platforms, including mainframe databases, cloud services, S3, Athena, and Redshift.</p><p>• Troubleshoot and resolve issues encountered during connection creation and data profiling.</p><p>• Optimize performance by identifying and addressing bottlenecks in profiling workflows.</p><p>• Configure and manage Cloud Marketplace integrations to enforce policy-based data protections.</p><p>• Review and communicate Informatica upgrade schedules, exploring new features and coordinating timelines with business and technical teams.</p><p>• Collaborate with infrastructure teams to establish clusters for managing profiling workloads efficiently.</p><p>• Support governance initiatives by classifying and safeguarding sensitive financial and customer data.</p><p>• Create and manage metadata, glossaries, and data quality rules across regions to ensure compliance with governance policies.</p><p>• Set up user groups, certificates, and IP whitelisting to maintain secure access and operations.</p>
We are looking for a skilled Data Engineer to join our team on a long-term contract basis in Columbus, Ohio. In this role, you will focus on optimizing database structures, queries, and workflows to ensure high performance and scalability in complex data environments. This position requires a proactive approach to improving system efficiency and collaborating across teams to meet strict performance standards.<br><br>Responsibilities:<br>• Analyze and refine database queries, stored procedures, and indexing strategies to enhance performance.<br>• Evaluate and optimize database schemas, tables, and relationships for improved efficiency.<br>• Collaborate with application and product teams to ensure performance benchmarks are consistently met.<br>• Implement monitoring and tuning strategies using cloud-native tools such as CloudWatch and Performance Insights.<br>• Establish best practices for coding, data modeling, and workflow optimization within database systems.<br>• Work closely with DevOps and cloud engineering teams to optimize cloud-hosted databases including Aurora and Redshift.<br>• Identify and troubleshoot performance bottlenecks, providing effective long-term solutions.<br>• Develop and maintain data pipelines to support seamless data integration and transformation processes.<br>• Provide technical recommendations to enhance system scalability and meet performance SLAs.
We are looking for a Senior Data Engineer to join our team in New York, New York. In this role, you will be instrumental in designing, developing, and optimizing data pipelines and systems to support business needs. This position requires a strong technical background, a passion for clean coding practices, and expertise in data governance and orchestration.<br><br>Responsibilities:<br>• Develop and maintain scalable data pipelines using Python and Apache Spark, ensuring adherence to clean coding principles.<br>• Implement data governance practices, including data quality management and lineage tracking.<br>• Manage diverse data ingestion patterns to facilitate seamless integration of data sources.<br>• Utilize tools such as Apache Airflow for data orchestration and Docker or Kubernetes for containerization.<br>• Build and maintain CI/CD pipelines to streamline deployment processes.<br>• Collaborate with stakeholders, bringing accountability and an entrepreneurial mindset to daily tasks.<br>• Apply knowledge of cloud platforms like Microsoft Azure and Databricks for advanced data solutions.<br>• Leverage experience with commodity and energy trading data to deliver business-focused insights.
<p>We are on the lookout for a Data Engineer in Basking Ridge, New Jersey (1-2 days a week on-site). In this role, you will develop and maintain business intelligence and analytics solutions, integrating complex data sources for decision support systems. You will also be expected to take a hands-on approach to application development, particularly with the Microsoft Azure suite.</p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Develop and maintain advanced analytics solutions using tools such as Apache Kafka, Apache Pig, Apache Spark, and AWS Technologies.</p><p>• Work extensively with the Microsoft Azure suite for application development.</p><p>• Implement algorithms and develop APIs.</p><p>• Handle integration of complex data sources for decision support systems in the enterprise data warehouse.</p><p>• Utilize Cloud Technologies and Data Visualization tools to enhance business intelligence.</p><p>• Work with various types of data including Clinical Trials Data, Genomics and Bio Marker Data, Real World Data, and Discovery Data.</p><p>• Maintain familiarity with key industry best practices in a regulated “GxP” environment.</p><p>• Work with commercial pharmaceutical/business information, Supply Chain, Finance, and HR data.</p><p>• Leverage Apache Hadoop for handling large datasets.</p>
<p>We’re seeking a dynamic Video Analytics Engineer to join a forward-thinking Smart City initiative as a contractor. This role focuses on developing advanced computer vision solutions that enhance urban living through intelligent video analytics and automation.</p><p>Key Responsibilities</p><ul><li>Support setup, configuration, and management of video camera systems, including instrumentation and security.</li><li>Evaluate third-party computer vision tools and libraries to drive innovation.</li><li>Implement algorithms for object tracking, anomaly detection, and scene recognition.</li><li>Optimize performance and reliability of real-time video workflows.</li><li>Integrate machine learning models with enterprise video management systems and workflow engines.</li><li>Design and deploy scalable computer vision solutions for Smart City use cases.</li><li>Collaborate with cross-functional stakeholders across public safety, transportation, infrastructure, and operations to align technology with community needs.</li></ul>
We are looking for a skilled Business Intelligence (BI) Engineer to join our team in Jenkintown, Pennsylvania. As a key contributor, you will design, develop, and implement advanced data solutions to support business analytics and decision-making processes. This role requires a strong technical background and a collaborative mindset to drive innovation and optimize data platforms.<br><br>Responsibilities:<br>• Design and implement dimensional and semantic data models to enhance business analytics and reporting capabilities.<br>• Develop and optimize data pipelines using modern orchestration tools like Apache Airflow or Azure Data Factory.<br>• Create and manage interactive dashboards and visualizations using Power BI, ensuring accuracy and usability.<br>• Leverage Microsoft Fabric architecture to integrate centralized semantic models and external data platforms.<br>• Administer cloud-based data warehouses like Azure Synapse or Snowflake, ensuring performance and scalability.<br>• Collaborate with cross-functional teams to address data governance, quality frameworks, and compliance standards.<br>• Utilize Python and data science libraries to process, analyze, and develop machine learning workflows.<br>• Implement DataOps methodologies, including CI/CD practices and version control for data solutions.<br>• Conduct advanced statistical analysis and predictive modeling to inform business strategies.<br>• Partner with stakeholders to translate technical concepts into actionable insights for diverse audiences.
We are looking for an experienced Data Scientist / AI Engineer to join our team in Raleigh, North Carolina. This Contract-to-permanent position offers an exciting opportunity to work on cutting-edge projects involving machine learning, artificial intelligence, and advanced data solutions. The ideal candidate will bring a strong technical background, innovative problem-solving skills, and a passion for leveraging data to drive impactful results.<br><br>Responsibilities:<br>• Design and implement complex Python applications, ensuring scalability and reliability.<br>• Develop and optimize machine learning models, including those based on large language models (LLMs).<br>• Utilize serverless architectures and Amazon Web Services (AWS) to build and deploy data solutions.<br>• Work with chains and vector databases to solve multifaceted problems and enhance data processing capabilities.<br>• Perform ETL processes to transform and load data efficiently while maintaining data integrity.<br>• Collaborate with cross-functional teams to identify opportunities for AI-driven solutions.<br>• Analyze large datasets to extract actionable insights and build predictive models.<br>• Maintain and improve existing data pipelines and frameworks to ensure optimal performance.<br>• Stay updated on emerging trends in AI and machine learning to incorporate best practices into projects.
We are looking for an experienced SCADA Engineer III to join our team in Allentown, Pennsylvania. This is a long-term contract position that offers the opportunity to contribute to critical substation projects while working in a hybrid or fully remote environment, depending on candidate qualifications. The ideal candidate will leverage their expertise in computer engineering, communication protocols, and SCADA systems to support the design and troubleshooting of substation communication networks.<br><br>Responsibilities:<br>• Develop configuration settings for substation projects, adhering to established organizational standards.<br>• Create and refine point assignment sheets in collaboration with substation design engineers.<br>• Apply communication standards set by network engineers to ensure seamless integration into substation designs.<br>• Diagnose and resolve data and communication issues in coordination with system operators and Relay Test personnel.<br>• Utilize programming languages, including PLC ladder logic and Lua, to optimize control systems.<br>• Analyze and interpret network topology to support substation communication frameworks.<br>• Ensure effective data exchange between devices by implementing communication protocols.<br>• Support supervisory control and data acquisition (SCADA) systems through design, review, and troubleshooting efforts.<br>• Utilize spreadsheets and related tools to organize and manage project data.<br>• Maintain compliance with organizational and industry standards in all engineering tasks.
We are looking for a skilled AI Engineer to join our dynamic team in New York, New York. This role requires expertise in advanced data platforms and analytics tools, with a focus on leveraging machine learning and data science techniques to drive impactful solutions. If you're passionate about data innovation and have a strong consulting background, we encourage you to apply.<br><br>Responsibilities:<br>• Design, implement, and optimize solutions using Azure Databricks, Data Lake, Synapse, and Spark.<br>• Develop and enforce data governance strategies, including the use of Unity Catalog.<br>• Utilize Python to build, test, and deploy machine learning models and data science workflows.<br>• Collaborate with stakeholders to identify opportunities for AI-driven improvements in business processes.<br>• Maintain and enhance data pipelines to ensure scalability and reliability across projects.<br>• Provide technical leadership and mentorship to team members, ensuring best practices in data engineering and analytics.<br>• Translate complex data problems into actionable insights and solutions for clients.<br>• Stay updated on industry trends and emerging technologies to continuously improve solutions.<br>• Support clients in adopting AI and machine learning tools to address business challenges.
We are looking for a skilled Data Warehouse Engineer to join our team in Malvern, Pennsylvania. This Contract-to-Permanent position offers the opportunity to work with cutting-edge data technologies and contribute to the optimization of data processes. The ideal candidate will have a strong background in Azure and Snowflake, along with experience in data integration and production support.<br><br>Responsibilities:<br>• Develop, configure, and optimize Snowflake-based data solutions to meet business needs.<br>• Utilize Azure Data Factory to design and implement efficient ETL processes.<br>• Provide production support by monitoring and managing data workflows and tasks.<br>• Extract and analyze existing code from Talend to facilitate system migrations.<br>• Stand up and configure data repository processes to ensure seamless performance.<br>• Collaborate on the migration from Talend to Azure Data Factory, providing expertise on best practices.<br>• Leverage Python scripting to enhance data processing and automation capabilities.<br>• Apply critical thinking to solve complex data challenges and support transformation initiatives.<br>• Maintain and improve Microsoft Fabric-based solutions for data warehousing.<br>• Work within the context of financial services, ensuring compliance with industry standards.
<p>We are looking for an experienced Director of Data Operations to join our team. This is a long-term contract position offering the opportunity to work on challenging projects that span multiple technical domains. The ideal candidate will bring a depth of expertise in software development, programming, and technical problem-solving, contributing to impactful solutions and innovations.</p><p><br></p><p><strong>Responsibilities:</strong></p><ul><li>Work closely with business stakeholders to understand ongoing and future data needs.</li><li>Build relationships at every level of the organization and develop a deep understanding of the business context.</li><li>Work with the primary client to prioritize and manage day-to-day activities, team utilization, and data deliveries, utilizing Agile project and task management.</li><li>Lead data analytics team members to drive full data lifecycle activities, including new platform, data engineering, and reporting capabilities.</li><li>Lead day-to-day operations to ensure all stakeholder commitments are met.</li><li>Lead the migration from the legacy data warehouse and business intelligence platform to the new cloud-native Azure platform.</li><li>Be hands-on with the technologies, such as constructing SQL queries, in order to facilitate effective data analysis, subject domain understanding, troubleshooting, task scoping, data quality checks, and demonstrations.</li><li>Establish, maintain, and coordinate Agile development and operations, including Program Increment Planning.</li><li>Manage design, development, deployment, and operation of Microsoft Azure-based data solutions.</li><li>Coordinate support from infrastructure and network support teams, vendors, and government service providers.</li><li>Maintain continuous learning to propose new approaches and solutions to meet client needs.</li><li>Incorporate and drive best practices in cloud-based data infrastructure, data architecture, data quality assurance, data security, data standards, data management, data governance, data engineering, performance optimization, reporting, and analytics.</li><li>Collaborate with the client to maintain the technical roadmap, ensuring incorporation of the latest tools and technologies such as Generative AI.</li><li>Ensure appropriate project knowledge management and documentation.</li></ul>
<p>We are seeking a highly skilled Data Architect for a long-term contract opportunity in Southern California. This person will help to lead the implementation and optimization of cloud-based data catalog and quality tools. This role will support enterprise data governance initiatives, compliance reporting, and dashboard development for various business personas.</p><p><br></p><p>Key Responsibilities:</p><ul><li>Implement cloud-based data catalog tools for metadata ingestion and data quality profiling</li><li>Build domain/sub-domain level data governance dashboards and quality reports</li><li>Develop and demonstrate dashboards for roles such as Data Owner, Data Steward, Data Engineer, and Data Privacy Officer</li><li>Collaborate with cross-functional teams including business data owners, stewards, architects, platform owners, and project managers</li><li>Set up and run Informatica CDGC scanners, profiling, and sampling</li><li>Support PII classification using AI-based tools (e.g., CLAIRE Gen AI)</li><li>Produce audit reports for compliance with SOX, CCPA, and other legal requirements</li><li>Define and operationalize governance councils using catalog tools and processes</li></ul>
We are looking for an experienced Lead Software Engineer to join our team in Philadelphia, Pennsylvania. In this role, you will play a pivotal part in maintaining and enhancing our software systems, ensuring their reliability and scalability. This position offers an opportunity to lead a small development team, collaborate with stakeholders, and contribute to the technical evolution of our applications.<br><br>Responsibilities:<br>• Analyze and document existing software packages, reports, applications, and stored procedures to ensure a thorough understanding of critical business operations.<br>• Monitor and maintain the stability of legacy codebases and data processes, promptly identifying and resolving production issues.<br>• Lead responses to stakeholder requests for new features, workflows, or data services while ensuring compatibility with current system architecture.<br>• Design and implement system enhancements or new components, integrating them seamlessly into existing infrastructure.<br>• Mentor and guide team members by conducting code reviews, sharing best practices, and promoting maintainable development standards.<br>• Collaborate with the Data Team to optimize query performance, stored procedures, and data integrations for maximum system efficiency.<br>• Work closely with business users to translate technical requirements into functional solutions that align with operational goals.<br>• Identify and manage technical debt, developing plans that balance business priorities and system risks.<br>• Contribute to the improvement of documentation standards, deployment procedures, and system monitoring processes to ensure sustainable application management.
<p>Position Summary</p><p>We’re looking for a Sr. SQL Developer responsible for solving complex data and workflow challenges and building tools that directly impact production, fulfillment, and reporting systems. You’ll work closely with cross-functional teams to design, develop, and support custom software solutions that drive operational efficiency and scalability.</p><p><br></p><p>Top Responsibilities</p><ol><li>Design & Develop Custom Solutions: Build and maintain Windows-based and web applications that integrate with production systems, inventory tools, and third-party platforms.</li><li>Data Engineering & Automation: Create data workflows using SQL Server, SSIS, and other tools to automate print production, reporting, and invoicing processes.</li><li>System Integration: Develop interfaces with devices and systems such as barcode scanners, shipping tools, and ERP platforms.</li><li>Project Collaboration: Partner with technical leads and account managers to define project goals, test solutions, and ensure successful deployment.</li><li>Mentorship & Support: Guide junior developers and provide ongoing support for deployed applications, including training and documentation.</li></ol>
<p>Our client is seeking a Platform Software Engineer III – ATI to support the underlying infrastructure, platforms, and services that power applications and ELT/ETL processes. The ideal candidate will have a strong foundation in infrastructure automation, cloud technologies (AWS/Azure), monitoring, CI/CD frameworks, and platform reliability. Hands-on experience with programming and scripting languages and tools such as Python, Spark, Jupyter Notebooks, Java, SQL, and shell scripting is essential.</p><p>This role focuses on building and maintaining scalable, reliable, and secure platform components that enable development teams to deliver features efficiently. Responsibilities include designing and implementing system improvements, managing infrastructure as code, optimizing CI/CD workflows, supporting production environments, collaborating across teams, and participating in all phases of the development life cycle.</p><p>What You’ll Do:</p><ul><li>Design and develop complex platform components for core infrastructure, CI/CD pipelines, and cloud-native services.</li><li>Implement and maintain components across the platform stack, including IaC, middleware, orchestration, and monitoring.</li><li>Conduct peer code reviews, provide guidance, and mentor entry-level engineers.</li><li>Refactor and improve existing platform code for scalability, maintainability, and performance.</li><li>Deliver robust, testable modules using TDD methodologies.</li><li>Lead the design of infrastructure modules, CI/CD integrations, and system automation with high availability and fault tolerance.</li><li>Produce and maintain technical design documentation.</li><li>Define and review integration and unit testing strategies.</li><li>Debug and resolve infrastructure-level issues, deployment failures, and automation bugs.</li><li>Provide Level III support for critical production issues.</li><li>Collaborate across teams to improve platform services and reliability.</li></ul>
<p>We are seeking an Enterprise Architect – AI/Data to design and implement enterprise-level AI and data architectures that align with business objectives and enhance operational capabilities. This role requires deep expertise in AI/ML technologies, big data platforms, and enterprise architecture frameworks.</p><p>Key Responsibilities:</p><ul><li>Design and develop end-to-end AI and data architectures ensuring scalability, security, and performance</li><li>Create architectural blueprints and roadmaps for AI and data integration across the organization</li><li>Lead the development of data platforms and AI-driven systems for advanced analytics and automation</li><li>Define AI strategy and oversee its implementation across business units</li><li>Deploy and integrate AI models and tools into production systems</li><li>Design scalable cloud and on-premises data architectures using Azure, AWS, or Google Cloud</li><li>Work with big data technologies (Hadoop, Spark) and data lake architectures</li><li>Manage integration of AI models into big data platforms for structured and unstructured data</li><li>Collaborate with business and technical teams to translate needs into AI/data solutions</li><li>Stay current with AI, ML, and data technology advancements and best practices</li><li>Assess and mitigate risks related to AI and data architectures, ensuring compliance and security</li></ul>
<p><strong>Robert Half is seeking a Backend Engineer</strong> to support a <strong>Financial Services/Insurance</strong> organization based in <strong>Bellevue, WA (Remote U.S. candidates considered)</strong>. This role involves building a new greenfield <strong>Document Management System (DMS) on AWS</strong> to support enterprise insurance operations. The position is <strong>Remote</strong>, offered as a <strong>6-month contract with potential to extend/convert</strong>. Apply today!</p><p><br></p><p>Job Details</p><ul><li><strong>Schedule:</strong> Standard business hours (Pacific preferred, flexible across U.S. time zones)</li><li><strong>Duration:</strong> 6 months (contract, with potential to extend/convert)</li><li><strong>Location:</strong> Remote – U.S. only</li></ul><p>Job Responsibilities</p><ul><li>Design, develop, and deliver a robust document management system (DMS) deployed on AWS.</li><li>Build and optimize document processing pipelines, storage systems (search indexing, object key strategy, metadata schemas, lifecycle/retention, batch processing).</li><li>Contribute to backend architecture decisions: microservices, API contracts, event handling, datastore design, security, and access control.</li><li>Partner with product owners, architects, and analysts to translate business requirements into scalable backend solutions.</li><li>Manage the full software lifecycle—development, CI/CD, infrastructure, and deployment.</li><li>Ensure scalability, performance optimization, and proactive management of technical debt.</li><li>Support compliance in regulated environments (HIPAA, PCI, insurance/financial data).</li></ul>
We are looking for an experienced Applications Architect to join our team in Chicago, Illinois. This long-term contract position focuses on designing and implementing scalable, secure, and cloud-native solutions using cutting-edge technologies. The ideal candidate will possess expertise in data engineering, application development, and cloud architecture to drive innovation and support dynamic, metadata-driven environments.<br><br>Responsibilities:<br>• Develop and maintain data pipelines and workflows using Azure Data Factory and SQL Server to ensure seamless data integration.<br>• Design flexible data models and implement governance protocols to optimize performance and ensure data security.<br>• Architect cloud-native solutions on Microsoft Azure, leveraging AI/ML integration for enhanced analytics and automation.<br>• Build enterprise-grade applications using .NET Blazor and low-code tools, ensuring scalability and user-centered design.<br>• Automate workflows and processes using Azure Logic Apps, Function Apps, and Power Automate.<br>• Manage hosting and deployment through Azure Service Plans and Web Services, ensuring reliability and efficiency.<br>• Collaborate with cross-functional teams to align data engineering and application development efforts with organizational goals.<br>• Lead integration of Azure Data Platform with Power BI to deliver actionable business intelligence solutions.<br>• Mentor team members on best practices in AI, cloud technologies, and application design.<br>• Optimize database and data lake performance across structured and unstructured data environments.
We are in search of a Sr. Software Engineer to join our team based in Woodbridge, New Jersey. In this role, you will be tasked with the development of high-quality software solutions using modern scripting languages and AWS, ensuring scalability, performance, and maintainability. This position will also require you to collaborate closely with various teams to translate business requirements into technical specifications.<br><br>Responsibilities:<br>• Develop robust software solutions using scripting languages such as TypeScript, JavaScript, and Python, ensuring cross-platform compatibility and performance optimization.<br>• Design, develop, and deploy software solutions using React and AWS, adhering to architecture decisions.<br>• Manage APIs to support seamless integration across systems, ensuring adherence to performance, scalability, and security standards.<br>• Optimize the performance and cost-efficiency of applications hosted in AWS, leveraging services such as database management, serverless computing, and cloud storage solutions.<br>• Design, develop, and optimize SQL queries and database structures to support application requirements, ensuring data integrity, performance, and security in SQL-based systems.<br>• Work closely with cross-functional teams, including Business Solution Engineers and QA, to ensure successful project delivery.<br>• Communicate technical concepts clearly to non-technical audiences and promote best practices in coding, testing, and deployment across the team.<br>• Foster a culture of collaboration, innovation, and technical excellence by sharing your experience and knowledge with a team of engineers.<br>• Ensure adherence to security best practices in cloud-based development.
<p>We are offering an exciting opportunity in New Jersey for an Application Support Engineer. This role is integral to our operations, with a focus on end-user application architecture, deployment, and support. The successful candidate will work closely with data management and core systems to ensure productivity and operational excellence.</p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Administer user access, maintaining control over all applications and policies for entitlement allocation.</p><p>• Manage an inventory of application-related content and provide guidance on data hygiene to end-users.</p><p>• Offer first-tier end-user application support and coordinate related application dependency support.</p><p>• Maintain, update, and own the application inventory, including key information about application use and support.</p><p>• Analyze documents and recommend modifications to systems based on user or system design specifications, regulations, industry best practices, and auditor/examiner recommendations.</p><p>• Utilize established project management methodologies to plan, lead, and participate in projects.</p><p>• Manage vendor relationships, defining needs, requirements, and parameters. This includes selection, onboarding, and ongoing management of vendors.</p><p>• Ensure secure computing practices across all areas, actively designing and improving security within core subject areas and across the institution.</p><p>• Prioritize and drive bridge resolutions and interrupt-driven tasks.</p>
<p>Job Summary:</p><p><br></p><p>We are seeking a skilled and motivated System Engineer to join our team in New York City. As a System Engineer, you will be responsible for maintaining and optimizing our IT infrastructure, ensuring the smooth operation of our systems, and providing technical support to end-users. The ideal candidate should have 5+ years of experience in system administration, with a strong focus on Azure, Windows, Active Directory, VMware, Barracuda Backup Appliance, Nimble SAN storage, and MFT/SFTP technologies.</p><p><br></p><p>Responsibilities:</p><p><br></p><p>• Manage and maintain the company's IT infrastructure, including servers, storage systems, network devices, and related components.</p><p>• Monitor and troubleshoot system performance, ensuring high availability and reliability of all systems.</p><p>• Configure and administer Azure cloud services, including virtual machines, storage, networking, and security.</p><p>• Oversee the Windows server environment, including installation, configuration, and maintenance of servers and services.</p><p>• Manage Active Directory, including user accounts, group policies, security permissions, and domain services.</p><p>• Perform virtualization tasks using VMware, including server provisioning, virtual machine management, and troubleshooting.</p><p>• Administer and monitor the Barracuda Backup Appliance for data backup and recovery operations.</p><p>• Maintain and support Nimble SAN storage systems, ensuring optimal performance and availability.</p><p>• Collaborate with cross-functional teams to implement and maintain managed file transfer (MFT) and secure file transfer protocol (SFTP) solutions.</p><p>• Perform system upgrades, patches, and security updates in accordance with industry best practices.</p><p>• Provide technical support to end-users, resolving issues related to hardware, software, and network connectivity.</p><p>• Create and maintain documentation related to system configurations, procedures, and troubleshooting guides.</p>