HARSHA MAKENENI

London

Summary

Senior Data Engineer and ETL Informatica Consultant with extensive experience in delivering large-scale data solutions across multiple industries including Insurance, Telecommunications, and Banking. Expertise in designing and developing data platforms using modern tools such as Informatica, Azure Data Factory, and Snowflake. Proven ability to build scalable data warehouses and integration pipelines, applying best practices in data modeling and the Kimball methodology. Strong programming skills in Python and PySpark, with a track record of successful project delivery and collaboration with cross-functional teams.

Overview

11 years of professional experience

Work History

Lead Data Engineer

CIPD
10.2024 - 03.2025
  • Company Overview: The Chartered Institute of Personnel and Development (CIPD) is a professional association for human resource management professionals, founded in 1913.
  • Informatica Environment Implementation: Designed and deployed a new Informatica environment to support the organisation’s digital transformation strategy, ensuring scalability and seamless integration with Salesforce, NetSuite, and Semarchy for optimized data flow across systems.
  • Data Integration & Migration: Spearheaded migration from Integra CRM by re-engineering Informatica integrations and optimizing ETL workflows, enhancing performance and minimizing downtime during critical data transfers.
  • Process Documentation & Review: Conducted a comprehensive review of existing Informatica processes, identifying inefficiencies and updating documentation to standardise workflows, improve consistency, and reduce manual errors.
  • Proof of Concept (PoC) Execution: Developed and executed a PoC to evaluate Informatica’s integration capabilities, defined future-state data models, and ensured alignment with enterprise platforms like Salesforce and NetSuite.
  • Scalability & Performance Optimization: Led a scalability review of platform infrastructure, pinpointed performance bottlenecks, and implemented optimization strategies to support growing data volumes and ensure high availability.
  • Strategy & Rollout Recommendations: Authored a structured rollout plan based on PoC results, including technical considerations, roles and responsibilities, and clear milestones to secure stakeholder alignment and drive full-scale implementation.
  • Automated Complex Data Pipelines: Streamlined operations by automating complex data pipelines using Databricks Jobs and scheduled workflows, reducing manual intervention by 25% and accelerating time-to-insight.
  • Managed Databricks Infrastructure: Administered and optimized Azure-based Databricks infrastructure for cost-effective scalability and high availability, ensuring efficient processing of large-scale data workloads.
  • SQL to Spark SQL Conversion: Engineered a dynamic solution to convert legacy SQL to Spark SQL using stored procedures and Databricks utilities, improving system compatibility and enhancing data transformation capabilities.
  • Data Migration & Post-Migration Support: Designed and implemented a robust framework for migrating data from SQL Server to Databricks Delta Lake, built Azure Data Factory pipelines for daily loads, and provided post-migration support, documentation, and performance tuning.
  • Technical Environment: Azure Databricks, Azure Data Factory, SQL Server, T-SQL, IICS (Informatica Intelligent Cloud Services), Salesforce, NetSuite, Semarchy, SQL, Data Modeling, Performance Tuning, Process Standardization, Knowledge Management, PoC Development, Stakeholder Collaboration, Digital Transformation
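The SQL-to-Spark-SQL conversion described above can be sketched as a small rule-based rewriter. This is an illustrative Python sketch under assumed inputs, not the actual Databricks utility built on the project; the rewrite rules shown (GETDATE, TOP, bracketed identifiers, ISNULL) are typical T-SQL constructs chosen for illustration.

```python
import re

# Hypothetical rewrite rules for common T-SQL constructs that Spark SQL
# does not accept; a real migration utility would cover far more cases.
RULES = [
    (re.compile(r"\bGETDATE\(\)", re.IGNORECASE), "current_timestamp()"),
    (re.compile(r"\bISNULL\(", re.IGNORECASE), "coalesce("),
    (re.compile(r"\[([^\]]+)\]"), r"`\1`"),  # [col] -> `col`
]
TOP = re.compile(r"\bSELECT\s+TOP\s+(\d+)\s", re.IGNORECASE)

def tsql_to_spark_sql(sql: str) -> str:
    """Apply simple textual rewrites to make legacy T-SQL Spark-compatible."""
    # SELECT TOP n ... -> SELECT ... LIMIT n
    m = TOP.search(sql)
    if m:
        sql = TOP.sub("SELECT ", sql).rstrip().rstrip(";") + f" LIMIT {m.group(1)}"
    for pattern, repl in RULES:
        sql = pattern.sub(repl, sql)
    return sql

print(tsql_to_spark_sql("SELECT TOP 10 [Name], ISNULL([City], 'n/a') FROM dbo.Customers;"))
# -> SELECT `Name`, coalesce(`City`, 'n/a') FROM dbo.Customers LIMIT 10
```

In practice such a rewriter is wrapped in a stored-procedure inventory scan so every legacy statement is converted and re-validated before cutover.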

Senior Data Engineer

NHBC
01.2024 - 09.2024
  • Company Overview: The National House Building Council (NHBC) is a key organization in the UK, with its primary purpose focused on raising construction standards for new homes. Established in 1936, NHBC is the largest provider of new home warranties in the UK, offering consumer protection through its 10-year Buildmark warranty.
  • As an ETL Data Migration Specialist, I oversaw the end-to-end data migration process, ensuring a smooth and successful data transition from source systems to the target environment, while maintaining data integrity throughout.
  • Extracted data from various sources, including relational databases, flat files, and cloud storage, and automated data extraction processes using scripting to enhance efficiency and accuracy.
  • Led the data migration from legacy systems to cloud-based platforms (primarily Microsoft Azure), ensuring compliance with security standards, and maintaining data consistency throughout the migration lifecycle.
  • Designed and implemented scalable data migration frameworks using Azure Data Factory, Azure Databricks, and Azure Synapse Analytics for efficient ETL/ELT processes, handling both structured and unstructured data types.
  • Collaborated closely with business analysts, data owners, and stakeholders to gather data requirements, define migration strategies, and validate the data’s accuracy post-migration, ensuring business needs were met.
  • Built and optimized Azure Data Factory pipelines, automating complex data flows across on-premises and cloud systems while ensuring minimal disruption to ongoing business operations during migration.
  • Leveraged Azure Databricks for advanced data transformation, cleansing, and enrichment, improving data quality for downstream analytics and reporting by applying business rules and data quality standards.
  • Implemented robust data validation and reconciliation processes, ensuring accurate data mapping between source and target systems and preventing discrepancies during the migration.
  • Developed and maintained monitoring and logging solutions using Azure Monitor, Log Analytics, and custom logging frameworks to track migration performance, handle exceptions, and ensure data quality, allowing for timely detection and resolution of issues.
  • Provided detailed technical documentation on migration procedures, including data architecture, mappings, troubleshooting steps, and cutover planning, while coordinating with IT and business teams to ensure minimal downtime and a smooth system transition.
  • Technical Environment: Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure SQL Database, Azure Blob Storage, Delta Lake, SQL Server, Python, PySpark, Spark SQL, T-SQL, Git, Azure DevOps, Informatica Intelligent Cloud Services (CDI, CAI & CDQ), Snowflake, DBT, Agile Methodology, Confluence
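The validation and reconciliation work described above amounts to comparing source and target extracts key by key. A minimal Python sketch, assuming rows are keyed in-memory dicts rather than the actual Azure/Databricks tables used on the project:

```python
def reconcile(source: dict, target: dict) -> dict:
    """Compare two keyed extracts and report discrepancies.

    `source` and `target` map a business key to a row (any comparable value);
    a real migration would compare hashes of full rows per key at scale.
    """
    missing_in_target = sorted(source.keys() - target.keys())
    unexpected_in_target = sorted(target.keys() - source.keys())
    mismatched = sorted(
        k for k in source.keys() & target.keys() if source[k] != target[k]
    )
    return {
        "source_rows": len(source),
        "target_rows": len(target),
        "missing_in_target": missing_in_target,
        "unexpected_in_target": unexpected_in_target,
        "mismatched": mismatched,
    }

report = reconcile(
    {"A1": ("Alice", 100), "A2": ("Bob", 200), "A3": ("Cara", 300)},
    {"A1": ("Alice", 100), "A2": ("Bob", 250), "A4": ("Dan", 400)},
)
print(report)
```

A report like this feeds the sign-off gate: migration completes only when all three discrepancy lists are empty or every exception is explained.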

Senior Data Engineer

Department for Education
04.2023 - 12.2023
  • Developed end-to-end data pipelines using Informatica DEI, sourcing data from various systems and loading it into SQL Server across Landing, Staging, and Target layers.
  • Built scalable data ingestion frameworks in Azure Data Factory and Databricks, integrating third-party APIs and on-prem SQL Server data into Delta Lake.
  • Performed complex data cleansing and transformations using Python and Informatica, aligning with defined data quality rules and business logic.
  • Created a consolidated learner attendance view using Azure Synapse Analytics, enabling unified reporting and analytics across the organization.
  • Optimized data pipelines and SQL queries to handle 100M–600M records, improving performance and throughput and reducing processing times.
  • Built and managed the Databricks metastore hierarchy, including catalogs and schemas, while implementing Unity Catalog for secure, role-based access to Delta Live Tables.
  • Designed and implemented a PySpark-based synthetic test data framework, enabling consistent and reusable test data generation across environments.
  • Delivered production-grade pipelines in agile sprints, meeting “definition of done” criteria and receiving approvals from Delivery Managers, Product Owners, and Data Engineering Leads.
  • Led defect resolution and testing efforts, including root cause analysis, automated test design, execution, and obtaining stakeholder signoffs for releases.
  • Collaborated with Business Analysts and Data Modelers to finalize data models, including required tables, views, and columns, ensuring alignment with business requirements.
  • Technical Environment: Informatica Data Engineering Integration (DEI), Azure Data Factory v2, Azure Databricks, Databricks Delta Lake, Azure DevOps, T-SQL (SSMS), Azure Databricks Dashboard, Agile Methodology (Sprints)
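The synthetic test data framework mentioned above was PySpark-based; the core idea of seeded, reproducible record generation can be shown in plain Python. Field names and values here are illustrative, not the project's actual schema.

```python
import random

def generate_learners(n: int, seed: int = 42) -> list:
    """Deterministically generate synthetic learner attendance records.

    Seeding the RNG makes the same dataset reproducible across dev, test,
    and UAT environments, so test results stay comparable between runs.
    """
    rng = random.Random(seed)
    statuses = ["present", "absent", "late"]
    return [
        {
            "learner_id": f"L{100000 + i}",   # hypothetical key format
            "week": rng.randint(1, 52),
            "status": rng.choice(statuses),
        }
        for i in range(n)
    ]

rows = generate_learners(3)
print(rows)
```

The same pattern scales up in PySpark by parallelising the generator over an index range and writing the result as a Delta table.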

Senior Informatica Cloud Services Developer

Vistra
11.2021 - 03.2023
  • Company Overview: Vistra is all about empowering businesses that are creating positive change in the world. As a leading fund administrator and corporate service provider, they partner with organizations that are driving innovation, whether in technology, investment strategies, or business models.
  • Delivered all relevant Informatica and IICS development items within Vistra’s enterprise data integration program, ensuring smooth integration and effective data flow across critical business systems.
  • Worked on VForce integration to Enterprise via VIP, ensuring seamless data flow between systems.
  • Developed ETL logic on Informatica Cloud (IICS), orchestrating data migration from various source systems, and using REST APIs to integrate multiple systems in real-time. This included synchronizing data between SAP (ERP and billing), Viewpoint (entity management software), and Salesforce (CRM and CPQ).
  • Led the integration development using Informatica Cloud Data Integration (IICS – CDI Service) for SAP S/4HANA, aligning data models, security policies, and migration timelines with client-side stakeholders.
  • Designed and implemented ETL workflows to extract, transform, and load data from legacy systems to Snowflake and Azure-based data platforms.
  • Extracted data from relevant SAP modules (e.g., financial data, HR records) and created connections for SAP and Azure SQL Database within the Informatica Cloud interface.
  • Managed data migration and transformation, optimizing complex datasets into Snowflake with best practices in partitioning, clustering, and query optimization.
  • Utilized Informatica Cloud’s pre-built connectors to establish a connection with SAP, while performing data profiling and working with cross-functional teams to improve data quality and accuracy.
  • Led the development and implementation of data quality rules using Informatica Cloud Data Quality (CDQ), including email validation, parsing, cleansing, and address doctor validation, reducing errors and improving data accuracy.
  • Conducted code review sessions with peer developers to ensure quality, performed troubleshooting and bug fixes, and ensured all development was delivered in line with industry standards and within the program's fixed milestones and endpoints.
  • Technical Environment: Informatica Intelligent Cloud Services (CDI, CAI & CDQ), Salesforce (CRM and CPQ), Data Profiling & Cleansing, T-SQL (SSMS), Azure Data Factory, Snowflake, Azure Databricks, Azure Data Lake, Azure DevOps, AWS, OIC, Postman API, SAP, S/4HANA, SoapUI, Visual Studio 2013, Agile Methodology, Jira
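The data quality rules above (email validation, parsing, cleansing) can be approximated in plain Python. This is an illustrative stand-in, not the Informatica CDQ rule logic itself, and the regex is deliberately simplified.

```python
import re

# Simplified pattern; production rules (e.g. Informatica CDQ) are far stricter
# and typically also verify deliverability and reference data.
EMAIL = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def cleanse_email(raw: str):
    """Trim, lowercase, and validate an email; return None when invalid."""
    candidate = raw.strip().lower()
    return candidate if EMAIL.match(candidate) else None

print(cleanse_email("  Harsha.M@Example.COM "))  # -> harsha.m@example.com
print(cleanse_email("not-an-email"))             # -> None
```

Routing the `None` results to an exception queue rather than loading them is what drives the error-rate reduction such rules deliver.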

Senior Data Engineer

ReAssure
08.2021 - 10.2021
  • Company Overview: The Phoenix Group has completed its acquisition of ReAssure, positioning itself as the UK's largest long-term savings and retirement business, with approximately 14 million policies.
  • Used SAP BusinessObjects to develop universes and generate reports.
  • Technical Environment: Business Objects, SAP BO, Crystal Reports, Dashboards, SQL, ETL, Informatica

Senior Data Engineer

CIPD
05.2021 - 08.2021
  • Company Overview: The Chartered Institute of Personnel and Development (CIPD) is a major professional body for HR and people development, founded in 1913. It's one of the leading organizations in the UK and internationally, offering resources, qualifications, and support for HR professionals.
  • Supported the integration and deployment of multiple web methods, including analyzing and replacing DDM web methods within the HIP.
  • Developed and implemented contingency plans for operational delays, ensuring smooth knowledge transfer throughout the process.
  • Led release planning efforts to minimize the impact on application teams and reduce the time required for business regression testing, while ensuring other applications continued using existing DDM Platform Migration methods.
  • Coordinated the full application switchover, ensuring other applications successfully transitioned to the HIP methods in a subsequent migration phase.
  • Delivered the web methods and managed the orchestration of methods by the Supplier as part of the deployment process.
  • Prepared "As-Is" analysis documents for each orchestrated method, detailing their current functionality within DDM, and collaborated with CIPD's DDM development team for review and sign-off.
  • Created comprehensive functional description documents for each HIP web method, outlining service access, inputs/outputs, and logical data processing, with IICS implementation details available from the platform GUI.
  • Led the deployment of REST APIs in AWS, interacted with all parties supporting the applications in scope, assisted with integration testing, provided fixes and support during testing and production deployment, and drove technical discussions for the design of a replacement solution for Data Dissemination modules while engineering complex C# code.
  • Technical Environment: Informatica Intelligent Cloud Services (CDI & CAI), Azure Data Factory, Azure Databricks, Azure Data Lake, Azure DevOps, T-SQL (SSMS), AWS, SoapUI, Integra, Visual Studio 2013, Agile Methodology, Jira

Data Engineer

MTVH
01.2021 - 04.2021
  • Company Overview: Metropolitan Thames Valley Housing (MTVH) manages approximately 57,000 homes, providing affordable housing solutions across London, the Southeast, East Midlands, and the East of England. The organization is committed to supporting communities by offering safe, sustainable, and inclusive housing for people in need.
  • Designed and developed integration solutions using Informatica Intelligent Cloud Services (iPaaS), enabling seamless data flows across MTVH systems and API development, including CSV, Web API, and XML formats, within multiple IICS projects.
  • Led data integration, quality management, and administration through IICS (CDI), ensuring smooth operation and efficient data management across various applications and systems.
  • Utilized Snowflake ETL processes to transform and load data into Snowflake Data Warehouse, managing role-based access control (privileges for databases, tables, etc.) and applying change data capture (CDC) methodologies for efficient data transformation.
  • Developed and maintained ETL pipelines using Azure ADF to load data into Azure SQL DB, optimizing data processes and ensuring timely data availability for business users.
  • Worked with large datasets, identifying and resolving inconsistencies as part of the Data Quality Team, leveraging SQL and UNIX scripting skills to maintain accurate and high-quality data across systems.
  • Built and deployed Business Services using JSON files for web service interactions, integrated REST APIs via IICS REST V2 Connection, and ensured the development and validation of data integration mappings, specifications, and deployments.
  • Contributed to strategic data initiatives by designing integration strategies and schedules, supporting AWS and Azure-based applications, and maintaining data repositories for seamless access to manageable datasets. Engaged with senior stakeholders to deliver technical solutions and ensure alignment with business objectives.
  • Technical Environment: Informatica Intelligent Cloud Services (iPaaS-CDI, CAI), Azure Data Factory, Azure SQL Database, AWS, Snowflake Data Lake, T-SQL (SSMS), PL/SQL (Oracle), Oracle SQL Developer, FileZilla, Postman, Batch Scripts, Semarchy
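The change data capture (CDC) methodology mentioned above can be sketched as a snapshot diff in plain Python. The real pipelines ran through IICS and Snowflake, and log-based CDC tools read changes from the database journal instead; the insert/update/delete split shown is the standard one.

```python
def capture_changes(previous: dict, current: dict) -> dict:
    """Diff two keyed snapshots into insert/update/delete sets (snapshot CDC).

    `previous` and `current` map a primary key to a row tuple; only the
    changed rows then need to be applied to the target warehouse.
    """
    inserts = {k: current[k] for k in current.keys() - previous.keys()}
    deletes = sorted(previous.keys() - current.keys())
    updates = {
        k: current[k]
        for k in current.keys() & previous.keys()
        if current[k] != previous[k]
    }
    return {"insert": inserts, "update": updates, "delete": deletes}

# Hypothetical tenancy snapshots, keyed by tenancy ID.
changes = capture_changes(
    {"T1": ("Flat 1", "let"), "T2": ("Flat 2", "void")},
    {"T1": ("Flat 1", "let"), "T2": ("Flat 2", "let"), "T3": ("Flat 3", "void")},
)
print(changes)
```

Applying only these deltas, rather than reloading full tables, is what keeps daily loads into the warehouse fast and cheap.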

Business Intelligence Consultant

Billigence
11.2020 - 12.2020
  • Company Overview: Client for Billigence - Vodafone is the largest mobile and fixed network operator in Europe and the world’s leading provider of IoT connectivity.
  • Designed and developed new workflows for Robotic Process Automation (RPA) to automate report generation, significantly reducing processing time and enhancing business insight delivery.
  • Collaborated with clients to gather requirements and translate them into high-level BI architecture, including the design of ETL processes and analytical workflows; documented all development activities and solutions.
  • Technical Environment: Alteryx Designer, MS Excel, Python

Data Engineer

Thomson Reuters
06.2019 - 01.2020
  • Company Overview: Thomson Reuters is a global leader in delivering news and information for professional markets. The company provides trusted intelligence, cutting-edge technology, and expert human insight to help clients make informed decisions across legal, financial, tax, and media sectors.
  • SAP Integration & ODBC Setup: Installed the SAP HANA ODBC driver. Created ODBC data sources and set up ODBC connections in Informatica PowerCenter for seamless data transfer and migration into SAP.
  • Databricks Migration & Automation: Built a dynamic framework to migrate data from SQL Server to Databricks Delta Lake. Converted incompatible SQL code into Spark SQL using stored procedures, SQL functions, and Databricks.
  • Azure Data Pipelines: Developed and automated Azure Pipelines for daily data loads between SQL Server and Databricks, enabling smooth bi-directional data flow.
  • Code & Object Migration: Automated the migration of legacy database objects (e.g., tables, views) into Delta Lake to support modern analytics and storage requirements.
  • Data Quality & Testing: Performed data migration including quality assurance, cleansing, validation, and profiling. Integrated data into the target warehouse and conducted comprehensive testing post-migration.
  • Collaboration & Compliance: Collaborated with cross-functional teams to define migration goals, timelines, and success criteria. Ensured adherence to industry regulations and best practices using cloud connectors like Azure Synapse, ADLS, Azure Blob, and SQL Server.
  • Technical Environment: Informatica, Azure Data Factory, SSIS, Oracle 19c, SQL Server 2016, Unix, T-SQL, Autosys

Senior BI Developer

Moody’s Investors Service
01.2017 - 05.2019
  • Company Overview: Moody’s Investors Service is a leading provider of international financial research and credit ratings on debt instruments issued by commercial and government entities. Alongside Standard & Poor’s and Fitch Group, Moody’s is recognized as one of the “Big Three” credit rating agencies globally, playing a key role in global capital markets by assessing credit risk and informing investment decisions.
  • Delivered end-to-end Data Warehouse solutions by coordinating with business users, translating needs into functional specifications, and conducting feasibility checks.
  • Led data modelling, ETL, and data migration using Informatica, SQL Server, Oracle, Azure (Data Lake, Blob, SQL DW), and IDQ.
  • Optimized ETL and DB performance, authored standards, and built complex SQL for validation, transformation, and reporting.
  • Implemented exception handling and human Tasks via IDQ, and ensured data quality through profiling, cleansing, and validation.
  • Acted as Technical Lead for global teams (3-4 members), driving SDLC/Agile projects, deployment, and error recovery strategies.
  • Coordinated migration cycles and developed integration components on cloud platforms, consistently meeting deadlines in fast-paced and evolving environments while ensuring successful implementation.
  • Delivered complex solutions with an eye on scalability, performance optimization, and collaboration with stakeholders across all phases of the project lifecycle, ensuring business needs were met with precision and timeliness.
  • Moody’s Investors Service is a leading provider of international financial research and credit ratings on debt instruments issued by commercial and government entities. Alongside Standard & Poor’s and Fitch Group, Moody’s is recognized as one of the “Big Three” credit rating agencies globally, playing a key role in global capital markets by assessing credit risk and informing investment decisions.
  • Technical Environment: Informatica PowerCenter, Informatica Data Quality (IDQ), SQL Server, Oracle, Azure Data Lake Gen1, Azure Blob Storage, Azure SQL Data Warehouse, Azure Data Factory, T-SQL, PL/SQL, Complex SQL, Python (for scripting/automation), Agile/Scrum, SDLC, Git, JIRA, Control-M, UNIX, WinSCP

ETL Developer

Rogers Communications
04.2015 - 12.2016

BI Developer

X.L. Global Services Inc
08.2014 - 04.2015

Education

MSc - Computer Networking

University of Bedfordshire
01.2011

Bachelor of Engineering - Electronics and Communication Engineering

Anna University
01.2008

Skills

  • Oracle 9i
  • Oracle 10g
  • Oracle 11g
  • Sybase 12.5
  • Sybase 15.7
  • MS SQL Server 2012
  • MS SQL Server 2014
  • MS SQL Server 2017
  • DB2
  • Azure Databricks
  • Azure Data Factory
  • Data Lake
  • Delta Lake
  • Informatica Intelligent Cloud Services
  • SSIS
  • Informatica PowerCenter
  • IDQ 9.x
  • IDQ 10.x
  • DEI
  • Alteryx
  • Unix
  • Windows 2K
  • Windows NT
  • Windows XP
  • Windows 7
  • Oracle SQL Developer
  • Rapid SQL
  • T-SQL
  • SSRS
  • Power BI
  • PL/SQL
  • RDBMS SQL
  • Stored Procedures
  • Python
  • SAP
  • Control M
  • AutoSys
  • SQL Server Agent
  • Alteryx Connect
  • Tortoise SVN
  • JIRA
  • WinSCP
  • FileZilla
  • Postman
  • Soap UI
  • GitHub
