We’re seeking a hands-on Data Quality Analyst to lead data profiling, cleansing, migration validation, and ongoing governance during a multi-phase ERP/CRM transition. You’ll partner with Business Systems, Finance, Operations, and RevOps to define data standards, remediate quality issues at the source, and ensure that data moved into NetSuite is complete, accurate, consistent, and reconciled back to SAP B1 and Salesforce.

What You’ll Do

Data Discovery & Standards

Profile legacy data sets (SAP B1, Salesforce, downstream extracts) to quantify quality issues (duplicates, nulls, invalid values, orphaned records, referential breaks).
Define and socialize data quality rules and validation thresholds for critical objects (customers, vendors, items, chart of accounts, price books, opportunities, orders, invoices, inventory, BOMs).
Establish data dictionaries and mapping specs (source → staging → NetSuite), including transformations, reference data, and business rules.

Cleansing, Mapping & Readiness

Design and execute cleansing plans (standardization, deduplication, survivorship rules, address/phone/email normalization, code set alignment).
Build transformation logic for fields that change across systems (e.g., COA structure, unit of measure, tax handling, multi‑subsidiary/entity mappings).
Coordinate test loads with the migration team; track and close defects in an issue log.

Migration Testing & Reconciliation

Create test cases and acceptance criteria for mock loads, CRP/SIT, UAT, and cutover; verify row counts, referential integrity, and business-calculated balances.
Reconcile financial and operational data between legacy systems and NetSuite (e.g., AR/AP aging, inventory on hand/valuation, open orders, deferred revenue).
Produce “go/no‑go” quality dashboards before each migration wave; document sign‑offs.

Data Governance & Post‑Go‑Live

Implement data controls (validation rules, picklists, reference tables) to prevent regression in NetSuite post‑go‑live.
Define a stewardship model and RACI; set up SLAs and monitoring for ongoing data quality KPIs.
Train business users on data standards; build quick-reference guides and SOPs.

Key Outcomes (First 90–180 Days)

Baseline data quality assessment with quantified risk and a remediation plan.
Approved source‑to‑target mappings and transformation logic for priority objects.
Successful mock loads with ≥99% record acceptance for in‑scope entities.
Financial and operational reconciliation within agreed tolerances (e.g., ±0.5%).
Post‑go‑live data quality dashboard in production, with automated monitoring and defined ownership.

Required Experience & Skills

4–7+ years in data quality, data migration, or master data management roles.
Hands-on experience with ERP/CRM data, preferably SAP Business One, Salesforce, and NetSuite object models (customers, items, vendors, transactions, COA).
Strong SQL and data profiling skills; comfortable with large datasets and complex joins.
Proven experience designing and executing data validation and reconciliation for ERP cutovers (financials, order-to-cash, procure-to-pay, inventory).
Solid understanding of accounting data structures (GL, subledgers, multi-entity) and operational data (price lists, BOMs, inventory locations).
Excellent documentation skills (mappings, dictionaries, rules) and strong stakeholder communication.