DBT Data Engineer (DBT & Snowflake)
<h3><strong>DBT Data Engineer (Snowflake, Azure)</strong> | Insurance | London (Hybrid)</h3><p>Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced <strong>DBT Data Engineer</strong> to join a major <strong>insurance client engagement</strong>.</p><p>The role focuses on scaling a <strong>Snowflake Data Warehouse</strong> and expanding its <strong>DBT Cloud</strong> modelling capabilities to support new analytics and regulatory data use cases across the business.</p><h3><strong>Assignment Details</strong></h3><ul><li>Initial duration: 6-12 months (until March 2026; extension likely given the scope)</li><li>Location: Hybrid, with a minimum of 2 days per week in London</li><li>Day rate: £450 PAYE (plus 12.07% holiday pay; employer's NI and tax are deducted at source, with no umbrella company admin fees)</li></ul><h3><strong>Role Overview</strong></h3><p>You'll be part of a growing data engineering function focused on <strong>DBT model development</strong>, <strong>Snowflake optimisation</strong>, and <strong>data governance</strong> across multiple data domains.<br /> This role suits a technically strong engineer with proven <strong>DBT Cloud</strong> experience who can take ownership of data pipelines and drive best practices in transformation, testing, and automation.</p><h3><strong>Key Skills & Experience</strong></h3><ul><li>Deep <strong>DBT Cloud</strong> expertise, including models, macros, tests, documentation, and CI/CD integration.</li><li>Hands-on experience developing and optimising in the <strong>Snowflake Cloud Data Warehouse</strong> (schemas, warehouses, security, Time Travel, performance tuning).</li><li>Familiarity with <strong>Snowflake cost monitoring, governance, replication</strong>, and environment management.</li><li>Strong understanding of <strong>data modelling</strong> (star/snowflake schemas, SCDs, lineage).</li><li>Proven <strong>Azure</strong> experience (Data Factory, Synapse, Databricks) for orchestration and integration.</li><li>Proficiency in <strong>SQL</strong> for complex analytical transformations and optimisations.</li><li>Comfortable working in agile teams and using <strong>Azure DevOps</strong> for CI/CD workflows.</li></ul><h3><strong>Nice to Have</strong></h3><ul><li><strong>Python</strong> or <strong>PySpark</strong> for automation and data quality testing.</li><li>Knowledge of <strong>data governance and security frameworks</strong> (RBAC, masking, encryption).</li><li>Prior experience in <strong>financial services or insurance</strong> environments.</li></ul><p>All candidates must complete standard screening (Right to Work, DBS, credit/sanctions, and employment verification checks).</p><p>This is an exciting opportunity for a <strong>DBT-focused Data Engineer</strong> to join a high-performing consulting team and help build, scale, and optimise modern Snowflake data solutions within a leading insurance organisation.</p><p>Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience may apply. Rates of pay and salary ranges depend on your experience, qualifications and training. If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data: roberthalf.com/gb/en/privacy-notice.</p>
Tags: azure, snowflake, dbt, engineer, data
- City of London, London
- remote
- Contract
- 450 - 500 GBP / Daily
- 2025-11-10T11:19:52Z