We are looking for a skilled Software Developer to lead the implementation of a new ServiceNow instance from the ground up. This long-term contract role is based in Cincinnati, Ohio, and requires close collaboration with business analysts and stakeholders to deliver customized solutions that align with organizational goals. The ideal candidate will bring expertise in ServiceNow workspaces, UX design, and asset management to create scalable, efficient systems.<br><br>Responsibilities:<br>• Design and implement a new ServiceNow instance tailored to organizational needs and industry best practices.<br>• Partner with business analysts and stakeholders to gather requirements and propose effective solutions.<br>• Develop and customize workflows, forms, integrations, reports, dashboards, and process automation.<br>• Use ServiceNow Workflow, Flow Designer, JavaScript, and related tools to extend system functionality.<br>• Configure and optimize ServiceNow workspaces and user interfaces for seamless usability.<br>• Build asset management solutions that scale with the organization and support its objectives.<br>• Implement and manage security roles, access controls, and permissions to meet compliance standards.<br>• Provide ongoing support and enhancements to keep the ServiceNow system effective and current.<br>• Troubleshoot and resolve technical issues to ensure system reliability.<br>• Collaborate with cross-functional teams to deliver enterprise-wide solutions.
We are looking for an experienced Data Engineer to join our team in Cincinnati, Ohio. This long-term contract position offers the opportunity to work on cutting-edge data engineering projects while collaborating with multidisciplinary teams to deliver high-quality solutions. The ideal candidate will have a strong background in Databricks and big data technologies, along with a passion for optimizing data processes and systems.<br><br>Responsibilities:<br>• Design, build, and enhance data pipelines using Databricks Runtime, Delta Lake, Auto Loader, and Structured Streaming.<br>• Implement secure, governed data access using Unity Catalog, workspace controls, and audit configurations.<br>• Manage and integrate structured and unstructured data from diverse sources, including APIs and cloud storage.<br>• Develop and maintain notebook-based workflows, and schedule and manage jobs with Databricks Workflows.<br>• Apply best practices for performance tuning, scalability, and cost optimization in Databricks environments.<br>• Collaborate with data scientists, analysts, and business stakeholders to deliver clean, reliable datasets.<br>• Support continuous integration and deployment (CI/CD) for Databricks jobs and system configurations.<br>• Maintain high standards of data quality and security across all engineering tasks.<br>• Troubleshoot and resolve issues to keep data pipelines running efficiently.