Azure Data Engineer
Fulfil your potential in hospitals that make history:
Charing Cross, Hammersmith, St Mary’s, Queen Charlotte’s & Chelsea and Western Eye.
With five world-renowned hospitals, Imperial College Healthcare NHS Trust is full of opportunity if you are looking to develop your healthcare career.
We are an NHS Trust of approximately 14,000 people, providing care for over a million and a half patients from north west London and beyond every year.
We have a rich heritage and an ambitious vision for the future of our patients and local communities.
With our partners, Imperial College London and The Royal Marsden NHS Foundation Trust, we form Imperial College Academic Health Science Centre, one of six academic health science centres in the UK, working to ensure the rapid translation of research for better patient care and excellence in education.
We are proud of our heritage in innovation and we are early adopters of new insights in technologies, techniques and treatments for improving health.
Job overview
We are looking for an experienced Azure Database Engineer to lead our cloud migration projects and support our API projects, preferably with experience of a healthcare data environment. The ideal candidate will have a deep understanding of Microsoft Azure and a proven track record in designing and implementing database solutions. This role involves working closely with stakeholders to ensure a smooth transition from our existing on-premises platforms to Azure. You will be responsible for designing and implementing efficient, scalable, cost-effective and secure cloud solutions using Microsoft Azure.
The successful candidate will possess a deep understanding of Azure services including, but not limited to, Azure Blob Storage, Azure App Services, Azure Functions, Azure Synapse Analytics, Azure Storage and Azure Networking. An understanding of Snowflake, Power Automate, Pandas, NLP and Power BI would be desirable.
Your expertise will be crucial in guiding our cancer projects to success, ensuring alignment with requirements, best practices, and security standards. You will collaborate with cross-functional teams to understand data requirements and translate them into robust architectural solutions while orchestrating and ingesting data from multiple sources (EPR, Cancer database, Imaging, Histopathology, Endoscopy, Chemotherapy, Radiotherapy etc).
Main duties of the job
- Lead the migration from on-premises to cloud-based solutions.
- Manage the existing and new APIs.
- Create and maintain data pipelines, data storage solutions, and data processing systems on the Azure platform.
- Develop and implement processes to extract, transform, and load (ETL) data from various sources into data warehouses or data lakes.
- Work with Azure SQL Database, Azure Data Lake Storage, Azure Cosmos DB, and other storage solutions to design scalable and efficient data storage systems.
- Utilise big data technologies like Azure Databricks and Apache Spark to handle and analyse large volumes of data.
- Design, implement and optimise data models to support business requirements and performance needs.
- Work with data scientists, analysts, and other stakeholders to ensure data solutions meet business needs.
- Identify and implement internal process improvements, such as automating manual processes and optimising data delivery.
- Maintain data quality, integrity and security across all data solutions.
Working for our organisation
At Imperial College Healthcare you can achieve extraordinary things with extraordinary people, working with leading clinicians pushing boundaries in patient care.
Become part of a vibrant team living our values - expert, kind, collaborative and aspirational. You’ll get an experience like no other and will fast forward your career.
Benefits include career development, flexible working, wellbeing support and a staff recognition scheme. You can also make use of optional benefits including the Cycle to Work and car lease schemes, a season ticket loan and membership options for onsite leisure facilities.
We are committed to equal opportunities and improving the working lives of our staff and will consider applications to work flexibly, part time or job share. Please talk to us at interview.
Detailed job description and main responsibilities
The full job description provides an overview of the key tasks and responsibilities of the role and the person specification outlines the qualifications, skills, experience and knowledge required.
For both overviews please view the Job Description attachment with the job advert.
Person specification
Education/ Qualifications
Essential criteria
- Professional qualification at master's level in an IT-related discipline.
- Evidence of continued professional development, such as Azure Solutions Architect Associate or Azure DevOps Engineer Expert certification, or higher.
- Azure IoT Developer Specialty Certification
- Degree in mathematics or a health-related discipline.
Experience
Essential criteria
- Proven experience as an Azure Engineer or Architect, with a strong portfolio of successful cloud solutions and projects.
- Experience with cloud migration and modernisation projects.
- Extensive proven experience of clinical, administrative and/or operational workflows and processes, working with multiple systems.
- Experience of communicating complex issues and data concepts to a wide range of stakeholders.
- Experience in data integration role with the ability to apply logical thinking using a wide range of data processing tools and techniques to build and integrate solutions in a highly complex environment.
- Experience of working with multiple data types both structured and unstructured.
- Experience of metadata documentation and development.
- Significant experience working in Data Warehousing including DBA roles.
- Good understanding of software development lifecycle, methodologies/platforms such as Agile and GitHub.
- Experience of NHS cancer systems and knowledge of cancer patient pathways.
- Understanding of Snowflake environment.
- Experience of Power BI.
Skills/Knowledge/Abilities
Essential criteria
- Expertise in Azure Databricks and Azure Synapse.
- Strong understanding of data modelling, ETL / ELT processes, and data warehousing concepts.
- Demonstrable knowledge and skills to identify and elaborate user and business needs to enable effective design, development and testing of services and business change.
- Advanced modelling skills.
- Advanced knowledge of Python and SQL (e.g. NumPy, Pandas, Scikit-Learn, NLP) and familiarity with a deep-learning framework.
- Highly advanced skills in SQL Server, SSIS, DBT or Snowflake.
- Knowledge of APIs and XML/JSON.
- Although your programming language of choice (e.g. R, MATLAB or C) is not important, we do require the ability to become a fluent Python programmer in a short timeframe.
- Proficiency with automation tools and scripting languages such as PowerShell, Azure CLI or Python, and with XML and JSON.