Excellent opportunity for a suitably experienced Senior Data Engineer with a strong Azure services skill set: Data Factory, Logic Apps, Azure Functions, Azure Synapse, and/or Data Lake.
A great role leading the design and implementation of scalable data solutions, including the development of real-time and ETL/ELT data pipelines across various source systems.
You will be responsible for maintaining and optimising the data warehouse, building advanced Power BI reports, and contributing to the development of AI-driven solutions leveraging OpenAI's APIs.
You will be required to have 7+ years' experience as a Data Engineer.
This Senior Data Engineer role is based on-site in London, Monday to Friday; there is no remote working option for this role.
Senior Data Engineer
Location: Central London W1
Salary: Negotiable, dependent upon experience
Key Responsibilities:
* Design and implement scalable, robust real-time data pipelines integrating data from diverse source systems (e.g., APIs, databases, flat files).
* Build and maintain a high-performance data warehouse to support reporting, analytics, and AI workloads.
* Develop interactive and insightful Power BI dashboards and reports for business stakeholders.
* Design and implement data integration and transformation logic using Azure Data Factory, Azure Functions, and Logic Apps.
* Utilize Python and SQL for data wrangling, transformation, and analysis across large-scale datasets (an illustrative sketch follows this list).
* Leverage OpenAI APIs to develop AI/ML-based solutions for natural language understanding, data enrichment, and automation.
* Collaborate with cross-functional teams including product managers, analysts, and data scientists to align data strategy with business goals.
* Monitor and optimize pipeline performance, ensuring data quality, consistency, and reliability.
* Implement best practices for data governance, security, and compliance.
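Purely as an illustration of the kind of pipeline work described above, the sketch below pulls records from a hypothetical source API, reshapes them with pandas, enriches a free-text field via the OpenAI API, and loads the result into a warehouse staging table. The endpoint URL, connection string, model choice, and table/column names are all assumptions for the example, not details of this role.

```python
# Illustrative sketch only - endpoint, table and column names are hypothetical.
import os

import pandas as pd
import requests
from openai import OpenAI
from sqlalchemy import create_engine

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
engine = create_engine(os.environ["WAREHOUSE_CONN_STR"])  # e.g. an Azure SQL / Synapse connection string


def summarise(text: str) -> str:
    """Enrich a free-text field with a one-sentence summary via the OpenAI API."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Summarise in one sentence: {text}"}],
    )
    return response.choices[0].message.content


# Extract: pull raw records from a (hypothetical) source system API.
raw = requests.get("https://example.com/api/orders", timeout=30).json()

# Transform: normalise into a DataFrame, fix types, enrich the notes field.
df = pd.json_normalize(raw)
df["order_date"] = pd.to_datetime(df["order_date"])
df["notes_summary"] = df["notes"].fillna("").map(summarise)

# Load: append to a warehouse staging table for downstream modelling and Power BI reporting.
df.to_sql("stg_orders", engine, schema="staging", if_exists="append", index=False)
```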
Required Qualifications:
* 7-8 years of hands-on experience as a Data Engineer, Data Architect, or similar role.
* Proven experience in designing and maintaining ETL/ELT pipelines across complex data ecosystems.
* Proficiency in Python for scripting, data manipulation, and automation.
* Strong SQL skills for querying and managing large datasets in relational and cloud-based environments.
* Hands-on experience with Azure services: Data Factory, Logic Apps, Azure Functions, Azure Synapse, and/or Data Lake (see the Azure Functions sketch after this list).
* Expertise in building data models and visualizations using Power BI.
* Experience with OpenAI APIs or similar AI/ML services for building intelligent data products.
* Solid understanding of data warehousing principles and architectures.
* Excellent analytical, problem-solving, and communication skills.
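As a second illustration of the Azure Functions experience listed above, here is a minimal sketch of an HTTP-triggered Azure Function (Python v1 programming model) that applies a SQL transformation step in the warehouse, of the sort an Azure Data Factory pipeline might invoke. The connection string variable, schema, and table names are assumptions for the example.

```python
# Illustrative sketch only - schema, table names and connection string are hypothetical.
import logging
import os

import azure.functions as func
import pyodbc

# Hypothetical transformation: copy new rows from staging into a reporting table.
TRANSFORM_SQL = """
INSERT INTO reporting.orders (order_id, order_date, amount)
SELECT order_id, order_date, amount
FROM staging.stg_orders s
WHERE NOT EXISTS (
    SELECT 1 FROM reporting.orders r WHERE r.order_id = s.order_id
);
"""


def main(req: func.HttpRequest) -> func.HttpResponse:
    """HTTP-triggered function that runs one transformation step against the warehouse."""
    conn = pyodbc.connect(os.environ["SQL_CONN_STR"])
    try:
        cursor = conn.cursor()
        cursor.execute(TRANSFORM_SQL)
        rows = cursor.rowcount
        conn.commit()
    finally:
        conn.close()
    logging.info("Transformation applied, %s rows inserted.", rows)
    return func.HttpResponse(f"Inserted {rows} rows.", status_code=200)
```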
Preferred Qualifications:
* Knowledge of DevOps practices and CI/CD pipelines.
* Familiarity with cloud-native data platforms (Databricks, Fabric).
* Understanding of data privacy regulations (e.g., GDPR) and best practices for secure data handling.