We are looking for a skilled Software Developer to lead the implementation of a new ServiceNow instance from the ground up. This long-term contract role is based in Cincinnati, Ohio, and requires close collaboration with business analysts and stakeholders to deliver customized solutions that align with organizational goals. The ideal candidate will bring expertise in ServiceNow workspaces, UX, and asset management to create scalable and efficient systems.

Responsibilities:
• Design and implement a new ServiceNow instance tailored to organizational needs and industry best practices.
• Partner with business analysts and stakeholders to gather requirements and propose effective solutions.
• Develop and customize workflows, forms, integrations, reports, dashboards, and process automation.
• Utilize ServiceNow Workflow, Flow Designer, JavaScript, and other tools to enhance system functionality.
• Configure and optimize ServiceNow workspaces and user interfaces for seamless usability.
• Focus on asset management solutions, ensuring scalability and alignment with organizational objectives.
• Implement and manage security roles, access controls, and permissions to maintain compliance standards.
• Provide ongoing support and enhancements to ensure the ServiceNow system remains effective and up to date.
• Troubleshoot and resolve technical issues, ensuring system reliability.
• Collaborate with cross-functional teams to deliver enterprise-wide solutions.
We are looking for an experienced Data Engineer to join our team in Cincinnati, Ohio. This long-term contract position offers the opportunity to work on cutting-edge data engineering projects while collaborating with multidisciplinary teams to deliver high-quality solutions. The ideal candidate will have a strong background in Databricks and big data technologies, along with a passion for optimizing data processes and systems.

Responsibilities:
• Design, build, and enhance data pipelines using Databricks Runtime, Delta Lake, Auto Loader, and Structured Streaming.
• Implement secure and governed data access protocols utilizing Unity Catalog, workspace controls, and audit configurations.
• Manage and integrate structured and unstructured data from diverse sources, including APIs and cloud storage.
• Develop and maintain notebook-based workflows and manage jobs using Databricks Workflows and Jobs.
• Apply best practices for performance tuning, scalability, and cost optimization in Databricks environments.
• Collaborate with data scientists, analysts, and business stakeholders to deliver clean and reliable datasets.
• Support continuous integration and deployment processes for Databricks jobs and system configurations.
• Ensure high standards of data quality and security across all engineering tasks.
• Troubleshoot and resolve issues to maintain operational efficiency in data pipelines.
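To illustrate the kind of work the first bullet describes, here is a minimal sketch of an Auto Loader-to-Delta streaming pipeline. The paths, schema location, and table name are illustrative assumptions, not details from the posting, and the code assumes it runs inside a Databricks (Spark) session.

```python
# Hypothetical sketch of a Databricks Auto Loader -> Delta Lake pipeline.
# All paths and table names below are illustrative assumptions.

def autoloader_options(file_format: str, schema_location: str) -> dict:
    """Build the cloudFiles reader options used by Auto Loader."""
    return {
        "cloudFiles.format": file_format,
        "cloudFiles.schemaLocation": schema_location,
    }

def build_bronze_stream(spark, source_path: str, target_table: str, checkpoint: str):
    """Incrementally ingest raw files with Auto Loader and append them
    to a Delta table, tracking progress in the checkpoint location."""
    return (
        spark.readStream.format("cloudFiles")
        .options(**autoloader_options("json", checkpoint + "/_schema"))
        .load(source_path)
        .writeStream.format("delta")
        .option("checkpointLocation", checkpoint)
        .trigger(availableNow=True)  # drain the backlog, then stop
        .toTable(target_table)
    )
```

In practice a pipeline like this would be one task in a Databricks Workflow, with Unity Catalog governing access to the target table.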
We are looking for a skilled Data Engineer to join our team on a long-term contract basis. This position offers the opportunity to work remotely while contributing to critical data management and integration efforts. The ideal candidate will have hands-on experience with customer master data in ECC6, and the ability to create, maintain, and manage data effectively.

Responsibilities:
• Develop and maintain customer master data within ECC6, ensuring data accuracy and consistency.
• Create new customer profiles and manage existing ones, maintaining high standards of data integrity.
• Support the integration process by working with custom tables related to customer data.
• Collaborate with cross-functional teams to ensure seamless data flow and effective data management.
• Utilize tools such as Apache Spark, Python, and ETL processes to extract, transform, and load data efficiently.
• Leverage Apache Hadoop for scalable data storage and processing solutions.
• Implement Apache Kafka to enable real-time data streaming and integration.
• Troubleshoot and resolve data-related issues, ensuring system reliability.
• Provide documentation and training to stakeholders on data management processes.
• Stay updated on industry best practices and emerging technologies to enhance data engineering workflows.
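The ETL responsibilities above might look something like the following sketch: a hypothetical transform step that normalizes customer master records before loading. The field names and cleanup rules are illustrative assumptions only; in a real pipeline this logic could run as a plain-Python step or be applied per record inside a Spark job.

```python
# Hypothetical ETL transform for customer master records.
# Field names (customer_id, name, country) are illustrative assumptions.

def normalize_customer(record: dict) -> dict:
    """Normalize one customer record: trim and upper-case IDs and country
    codes, and collapse whitespace in names."""
    return {
        "customer_id": str(record.get("customer_id", "")).strip().upper(),
        "name": " ".join(str(record.get("name", "")).split()).title(),
        "country": str(record.get("country", "")).strip().upper() or "UNKNOWN",
    }

def run_etl(records: list) -> list:
    """Extract -> transform -> load: drop records without a customer_id,
    normalize the rest."""
    return [normalize_customer(r) for r in records if r.get("customer_id")]
```

The same normalization function could be wrapped in a Spark UDF or mapped over a Kafka consumer's messages, which keeps the business rules testable independently of the runtime.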