<p><strong>DBT Cloud Data Engineer (Snowflake, Azure)</strong> | Insurance | London (Hybrid)</p><p>Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced <strong>DBT Data Engineer</strong> to join a major <strong>insurance client engagement</strong>. The role focuses on scaling a <strong>Snowflake Data Warehouse</strong> and expanding its <strong>DBT Cloud</strong> modelling capabilities to support new analytics and regulatory data use cases across the business.</p><p><strong>Assignment Details</strong></p><ul><li>Initial Duration: 6 months (extension likely given the scope)</li><li>Location: Hybrid - minimum of 2 days per week in London</li><li>Day Rate: £450 p/day via PAYE (plus 12.07% holiday pay; employer's NI and tax are deducted at source and, unlike umbrella arrangements, there are no umbrella company admin fees)</li></ul><p>You'll be part of a growing data engineering function focused on <strong>DBT model development</strong>, <strong>Snowflake optimisation</strong>, and <strong>data governance</strong> across multiple data domains. 
This role suits a technically strong engineer with proven <strong>DBT Cloud</strong> experience who can take ownership of data pipelines and drive best practices in transformation, testing, and automation.</p><p><strong>Key Skills & Experience</strong></p><ul><li><strong>Deep DBT Cloud expertise, including models, macros, tests, documentation, and CI/CD integration.</strong></li><li><strong>Hands-on experience developing and optimising in Snowflake Cloud Data Warehouse (schemas, warehouses, security, time travel, performance tuning).</strong></li><li>Familiarity with <strong>Snowflake cost monitoring, governance, replication</strong>, and environment management.</li><li>Strong understanding of <strong>data modelling</strong> (star/snowflake schemas, SCDs, lineage).</li><li>Proven <strong>Azure</strong> experience (Data Factory, Synapse, Databricks) for orchestration and integration.</li><li>Proficient in <strong>SQL</strong> for complex analytical transformations and optimisations.</li><li>Comfortable working in agile teams and using <strong>Azure DevOps</strong> for CI/CD workflows.</li><li>Prior experience in <strong>financial services or insurance</strong> environments would be desirable.</li></ul><p><strong>All candidates must complete standard screening (Right to Work, DBS, credit/sanctions, employment verification).</strong></p><p>This is an exciting opportunity for a DBT-focused Data Engineer to join a high-performing consulting team and help build, scale, and optimise modern Snowflake data solutions within a leading insurance organisation.</p><p>Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. 
If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data: roberthalf.com/gb/en/privacy-notice.</p>
<p>Robert Half Technology are assisting a market-leading AI-driven data platform organisation to recruit a Data Platform Engineer paying c.£125k - £150k per annum. Remote working (UK based).</p><p><strong>Role</strong></p><ul><li>The Data Platform Engineer will be an expert-level Snowflake engineer covering data engineering and security/governance features</li><li>Build & maintain GenAI Platform solutions focused on security and governance for engineering delivery</li><li>Build & maintain Python & SQL-based platform automation processes</li><li>Build & maintain DataOps processes for SDLC delivery</li><li>Build & maintain data quality metrics & observability to help drive data quality standards</li><li>Build & maintain administration systems and applications for monitoring, alerting, data observability, access management, platform metrics, and end-user transparency</li><li>Assist Data Engineering with building Python & SQL data replication & data pipelines on large & often complex data sets</li><li>Design data models for both short-term and long-term use cases to support data warehouse scalability</li><li>Identify opportunities for improvement & optimisation to deliver greater scalability & delivery velocity</li><li>Collaborate with engineers within GenAI Engineering & Analytics Engineering to build scalable solutions within the Data Platform</li><li>Perform root cause analysis on often complex errors to help ensure data pipeline availability</li><li>Help drive technical & architectural decisions on the data platform, including decisions on data architecture, data engineering processes, data quality frameworks, data access security & governance frameworks, DataOps processes & data consumption models</li><li>Help test new features and partner tools, both to provide feedback internally and to determine their value for internal analytics & data platform integration</li><li>Work closely with key stakeholders across the organisation, including Infra, embedded analytics teams, Product and Engineering, to support both technical implementation & requirements gathering</li><li>Proactively drive innovation internally, with dedicated innovation time & projects that aim to be transformational for the platform, team or company as a whole</li></ul><p><strong>Profile</strong></p><ul><li>The Data Platform Engineer will have experience and a strong understanding of GenAI development; security & governance concepts are a must</li><li>5+ years of relevant experience in Data Engineering / Data Platform Engineering / Data Architecture</li><li>Expert-level SQL & Python is a must</li><li>Experience in DSS is a strong plus</li><li>Prior experience with Snowflake required</li><li>Prior experience with DevOps technologies such as GitHub Actions, Azure DevOps or Jenkins</li><li>Strong understanding of data architecture & data modelling concepts</li><li>Prior experience building and maintaining replication & data pipelines in a cloud data warehouse or data lake environment</li><li>Excellent analytical and creative problem-solving skills - the confidence to ask questions to bring clarity, share ideas and challenge the norm</li></ul><p><strong>Company</strong></p><ul><li>Market-leading AI-driven data platform</li><li>Remote working - UK based</li><li>Offices in London</li><li>£125k - £150k per annum</li></ul><p><strong>Salary & Benefits</strong></p><p>The salary range/rates of pay are dependent upon your experience, qualifications or training.</p><p>Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. 
If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data: roberthalf.com/gb/en/privacy-notice.</p>
<h1><strong>Data Engineer (Python) - Leading Asset Manager | London (Hybrid)</strong></h1><p><strong>Brand-new team. High autonomy. Big impact and a high bonus!</strong></p><p>A top Asset Manager is building a <strong>new data function from scratch</strong>, and they need a sharp Data Engineer to help define and deliver their future data strategy. You'll work directly with senior stakeholders, own your pipelines, and operate in a fast-paced commercial environment. You will be joining a small but growing team!</p><p><strong>Hybrid:</strong> 3 days in the London office, 2 days WFH (flexible when needed).</p><h3><strong>The Role</strong></h3><ul><li><p>Build and optimise scalable data pipelines for a new Data Lake</p></li><li><p>Work with <strong>Python, SQL, Spark, AWS, Docker, CI/CD</strong></p></li><li><p>Orchestrate workflows using <strong>Airflow, Prefect, or Dagster</strong></p></li><li><p>Support ETL, API integrations, and high-quality data validation</p></li><li><p>Translate business needs into technical solutions with full ownership</p></li></ul><h3><strong>What You Need</strong></h3><ul><li><p>STEM degree (Computer Science, Engineering, Maths, etc.)</p></li><li><p>2+ years in data engineering / analytics / software</p></li><li><p>Strong Python + SQL</p></li><li><p>2+ years with ETL, APIs, CI/CD, Docker, AWS</p></li><li><p>Experience with <strong>Airflow / Prefect / Dagster</strong> ideal but not essential</p></li><li><p>Background in <strong>Financial Services, FinTech, Insurance, PE/VC, or Banking</strong></p></li></ul><h3><strong>Package & Benefits</strong></h3><ul><li><p><strong>High performance bonus</strong></p></li><li><p><strong>25 days holiday</strong></p></li><li><p><strong>Pension & private healthcare</strong></p></li><li><p>Join at inception - shape the team and tech stack</p></li></ul><p>Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. 
If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data: roberthalf.com/gb/en/privacy-notice.</p>