Job description
We are looking for a highly skilled Senior Data Engineer to join our growing data team and play a key role in building scalable, secure, and high‑performance data solutions for the insurance business. You will work closely with data scientists, actuaries, product teams, and engineering to design and deliver data pipelines, analytics platforms, and cloud‑native architectures that power underwriting, claims, pricing, and customer insights.
This role is ideal for someone who enjoys working at the intersection of data engineering, machine learning, and platform architecture, and wants to drive real impact in a modern, AI‑driven insurance environment.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes for structured and unstructured insurance data.
- Apply deep experience with data management architectures such as Data Warehouse, Data Lake, Data Fabric, and Data Hub (Databricks certification or equivalent work experience preferred).
- Work with large, complex structured and unstructured data sets drawn from multiple sources.
- Build and maintain production‑ready ML pipelines, including feature engineering, model training, deployment, and monitoring.
- Collaborate with actuaries, underwriters, and business analysts to translate insurance operations requirements into technical solutions.
- Optimize data warehouse and lakehouse architectures to support real-time analytics and machine learning models.
- Partner with cross-functional teams on digital transformation projects, including customer insights, claims automation, and risk modeling.
- Architect and maintain scalable ETL/ELT pipelines with scheduling, caching, partitioning, modeling, schema evolution, and lineage tracking to support both batch and real-time streaming.
- Partner with analytics and product teams to operationalize AI-driven data solutions across the insurance business.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or a related discipline.
- A minimum of 5 years of experience in data engineering, analytics engineering, or database management.
- Strong proficiency in SQL and Python for data manipulation and automation.
- Familiarity with cloud platforms (AWS or Azure) and modern data warehousing concepts.
- Hands-on with CI/CD (GitLab/Jenkins), Terraform, automation scripting.
- Proven experience in data pipeline development and maintenance.
- Strong expertise in distributed data processing and streaming architectures.
- Solid knowledge of data modeling and ETL/ELT processes.