Data Engineer

Who We Are
Since its inception, Aventum Group has sought to take a different approach to insurance. We are on a mission to be ‘the most inspiring specialty (re)insurance group in the world’.
At the heart of Aventum are our people. Our employees collaborate in dynamic, service-focused teams. Together, we strive daily to achieve our goals and objectives with a shared dedication to revolutionising the insurance industry for the better.
The Company offers a competitive benefits package via a flexible benefits platform. In addition to core benefits, employees can tailor their benefits according to their individual needs.
Employee development is key to the ongoing development of Aventum on the whole. We invest in our people, empowering them to grow their careers and advance within the Group. Our dynamic culture is rooted in the continuous desire of our people to learn and challenge themselves.
Role Summary
You will work in various settings to build systems that collect, manage, and convert raw data into usable information for data scientists and business analysts to interpret. Your ultimate goal is to make data accessible so that Aventum stakeholders can use it to evaluate and optimise their performance.
Role Accountabilities
Strategy
Support the design, development, implementation, management, and support of enterprise-level ETL/ELT processes and environments.
Use technical and business processes to combine data from multiple sources into a unified, single view of the data. Act as an architect, making strategic decisions.
Share responsibility for accessing, validating, and querying data from various repositories using available tools. Build and maintain data integration processes using SQL services and other ETL/ELT processes and scripting tools, and support ongoing requests and projects relating to the data warehouse, MI, or fast-moving financial data.
Design the infrastructure and architecture of the big data platform.
Evaluate, compare, and improve different approaches, including design-pattern innovation, data lifecycle design, data ontology alignment, annotated datasets, and Elasticsearch approaches.
Develop and maintain reliable data pipelines and schemas that feed other data processes; this includes both the technical processes and the business logic needed to transform data from disparate sources into cohesive, meaningful, and valuable data, with quality, governance, and compliance considerations.
Customise and manage integration tools, databases, warehouses, and analytical systems.
Identify and eliminate non-value-adding activities through automation or outsourcing.
Operations
Design and implement the management, monitoring, security, and privacy of data using the full stack of Azure data services to satisfy business needs.
Ensure non-functional system characteristics such as security, maintainability, quality, performance, and reliability are captured, prioritised, and incorporated into products.
Leverage Agile, CI/CD, and DevOps methodologies to deliver high-quality products on time.
Architect, build, test, and maintain the data platform as a whole. Develop and support a wide range of data transformations and migrations for the whole business.
Construct custom ETL processes: design and implement data pipelines, data marts, and schemas; access versatile data sources; and apply data quality measures.
Monitor the complete process and apply necessary infrastructure changes to speed up query execution and analyse data more efficiently; this includes database optimisation techniques (data partitioning, database indexing, and denormalisation) and efficient data ingestion (data mining techniques and different data ingestion APIs).
Respond to errors and alerts to correct and re-process data. Investigate data mismatches and apply solutions. Perform data scrubbing and analysis to troubleshoot data integration issues and determine root causes.
Any additional duties as assigned.
Role Requirements
Bachelor’s degree or equivalent in an engineering/numerate subject (e.g. Engineering, Statistics, Maths, Computer Sciences)
Experience in full-stack development and applying it to build data science products (e.g. could include some or all of Python/R, Linux scripting, SQL, and Docker, coupled with front ends such as JavaScript)
Some experience as a developer building data pipelines and schemas, implementing data warehouses, and developing SQL databases
Hands-on experience using Synapse or related tools with cloud-based resources (e.g. Stored Procedures, ADF, NoSQL databases, JSON/XML data formats)
Hands-on experience with Azure Functions, Azure Service Bus, and Azure Data Factory data integration techniques
Knowledge of Data Modelling concepts, monitoring, designs and techniques
Knowledge of Data Warehouse project lifecycle, tools, technologies, best practices
Experience using cloud computing platforms (ADLS Gen2), the Microsoft stack (Synapse, Databricks, Fabric, Profisee), Snowflake data integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Apache Airflow, Apache Iceberg, Apache Spark, Apache Hudi, Apache Kafka, Power BI, and Azure ML is a plus
Experience with Azure SQL Database, Cosmos DB, NoSQL, MongoDB
Experience with Agile, DevOps methodologies
Awareness and knowledge of ETL/ELT, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure), and integration testing
Skills and Abilities
Knowledge of Python, SQL, SSIS, and Spark.
Demonstrable ability to develop complex SQL queries and stored procedures
Relationship-building and stakeholder management
Management Duties
No
We are an equal opportunity employer, and we are proud to share that 93% of our employees say they can be themselves at work. We aim to hire our industry's finest people because the best people drive the best outcomes. And we forever challenge the status quo because we know there are always ways to improve things. Because together, we're limitless.
We value applicants from all backgrounds and foster a culture of inclusivity. We understand the need for flexibility, so we work in a hybrid model. Please let us know if you require any reasonable adjustments during the recruitment process.
FCA Conduct Rules
Under the Senior Managers and Certification Regime, the FCA and Aventum expect that:
You must act with integrity
You must act with due skill, care and diligence
You must be open and cooperative with the FCA, the PRA and other regulators
You must pay due regard to the interests of customers and treat them fairly
You must observe proper standards of market conduct
You must act to deliver good outcomes for retail customers
Apply now
By submitting your information, Aventum may contact you further if we have a legitimate interest to do so. You will be able to unsubscribe from all email communications at any time.