
GUIDE

ABU DHABI DEMAND SIDE MANAGEMENT &


ENERGY RATIONALISATION STRATEGY 2030

EVALUATION, MEASUREMENT & VERIFICATION


(EM&V) PROTOCOL

EFFECTIVE DATE: 02/01/2022

P.O. Box 32800, Abu Dhabi, U.A.E | T+ 971 2 2070777 www.doe.gov.ae



Contents
1 Introduction ................................................................................... 5
1.1 Purpose ............................................................................................. 5
1.2 Scope ................................................................................................ 5
1.3 References ........................................................................................ 6
1.4 Distribution ........................................................................................ 7

2 Definitions & Abbreviations ........................................................... 8


3 Background ................................................................................ 11
3.1 Abu Dhabi DSM Strategy ................................................................ 11
3.2 Evaluation, Measurement & Verification ......................................... 11
3.3 Abu Dhabi EM&V Protocol .............................................................. 13

4 Key EM&V Concepts .................................................................. 14


4.1 Gross and Net Savings ................................................................... 14
4.1.1 Calculating Net Savings ................................................................ 19
4.2 Program Overlaps ........................................................................... 20
4.3 Estimated vs. Evaluated Savings .................................................... 21
4.4 Calculating Demand Savings .......................................................... 22

5 EM&V Governance Framework .................................................. 25


5.1 International Best Practice .............................................................. 25
5.1.1 Who Develops the EM&V Protocol? ............................................. 25
5.1.2 Role of the Regulator in Program Evaluation ................................ 26
5.1.3 Third-Party Evaluations ................................................................ 26
5.1.4 Evaluation Levels ......................................................................... 30
5.2 Governance Structure: Roles & Responsibilities ............................. 33
5.2.1 DSM Strategy Owner: Department of Energy ............................... 33
5.2.2 DSM Program Owners .................................................................. 34
5.2.3 DSM Program Stakeholders ......................................................... 35
5.2.4 Department of Finance ................................................................. 35
5.2.5 Supporting EM&V Players ............................................................ 36
5.3 Governance Structure: Overall Data Governance ........................... 36
5.4 EM&V Process ................................................................................ 37
5.4.1 EM&V Process Option A ............................................................... 37
5.4.2 EM&V Process Option B ............................................................... 38
5.4.3 EM&V Process Option C .............................................................. 40
5.4.4 EM&V Process Option D .............................................................. 41


5.4.5 EM&V Process Recommended Option ......................................... 42


5.4.6 EM&V Process Roles and Responsibilities ................................... 43
5.4.7 EM&V Process Timeline ............................................................... 45
5.5 EM&V Implementation .................................................................... 46
5.5.1 Third-Party Evaluation Frequency................................................. 46
5.5.2 EM&V Funding ............................................................................. 46
5.5.3 Technical Reference Manuals ...................................................... 48

6 DSM Programs EM&V Methodologies ........................................ 49


6.1 Program 2: Building Regulations ..................................................... 49
6.1.1 International Best Practice for Evaluation ..................................... 49
6.1.2 Recommended Savings Estimation Approach .............................. 50
6.1.3 Gross vs. Net Savings .................................................................. 52
6.1.4 Data Governance ......................................................................... 52
6.1.5 Overlaps ....................................................................................... 54
6.2 Program 3: Street & Public Realm Lighting ..................................... 55
6.2.1 International Best Practice for Evaluation ..................................... 55
6.2.2 Recommended Savings Estimation Approach .............................. 56
6.2.3 Gross vs. Net savings ................................................................... 57
6.2.4 Data Governance ......................................................................... 58
6.2.5 Overlaps ....................................................................................... 59
6.3 Program 4: Efficient Cooling............................................................ 60
6.3.1 District Cooling ............................................................................. 60
6.3.2 Improved Cooling System Management ....................................... 67
6.3.3 Data Governance ......................................................................... 68
6.3.4 Overlaps ....................................................................................... 69
6.4 Program 5: Energy Storage............................................................. 70
6.4.1 International Best Practice for Evaluation ..................................... 70
6.4.2 Recommended Savings Estimation Approach .............................. 70
6.4.3 Gross vs. Net Savings .................................................................. 71
6.4.4 Data Governance ......................................................................... 71
6.4.5 Overlaps ....................................................................................... 72
6.5 Program 6: Standards & Labels ...................................................... 73
6.5.1 International Best Practice for Evaluation ..................................... 73
6.5.2 Recommended Savings Estimation Approach .............................. 74
6.5.3 Gross vs. Net Savings .................................................................. 78
6.5.4 Data Governance ......................................................................... 79
6.5.5 Overlaps ....................................................................................... 81
6.6 Program 7: Building Retrofits .......................................................... 82


6.6.1 International Best Practice for Evaluation ..................................... 82


6.6.2 Abu Dhabi Measurement & Verification Protocol Guidance
Document ................................................................................................ 84
6.6.3 Recommended Savings Estimation Approach .............................. 84
6.6.4 Gross vs. Net Savings .................................................................. 85
6.6.5 Data Governance ......................................................................... 85
6.6.6 Overlaps ....................................................................................... 86
6.7 Program 8: Efficient Water Use & Re-Use ...................................... 88
6.7.1 International Best Practice for Evaluation ..................................... 88
6.7.2 Recommended Savings Estimation Approach .............................. 89
6.7.3 Gross vs. Net Savings .................................................................. 91
6.7.4 Data Governance ......................................................................... 92
6.7.5 Overlaps ....................................................................................... 93
6.8 Program 9: Demand Response ....................................................... 94
6.8.1 International Best Practice for Evaluation ..................................... 94
6.8.2 Recommended Savings Estimation Approach .............................. 95
6.8.3 Gross vs. Net Savings .................................................................. 97
6.8.4 Data Governance ......................................................................... 97
6.8.5 Overlaps ....................................................................................... 98
6.9 Program 10: Rebates & Behavioral Change ................................... 99
6.9.1 International Best Practice for Evaluation ..................................... 99
6.9.2 Recommended Savings Estimation Approach ............................ 100
6.9.3 Gross vs. Net Savings ................................................................ 104
6.9.4 Data Governance ....................................................................... 104
6.9.5 Overlaps ..................................................................................... 105

7 EM&V Model............................................................................. 106


1 Introduction

1.1 Purpose
The Abu Dhabi Evaluation, Measurement & Verification (EM&V) Protocol
(“Protocol”) is underpinned by the need for a comprehensive monitoring
framework and tool to evaluate, measure and verify the energy and water
savings achieved by all DSM Programs under the Abu Dhabi Demand Side
Management and Energy Rationalization Strategy 2030 (“DSM Strategy”).
Consequently, this Protocol is developed to serve as a guiding framework that
supports the Program Owners (as more particularly identified in Figure 1 at
Section 3 in this Protocol) in their EM&V activities.

1.2 Scope
This Protocol discusses the fundamental EM&V concepts that are critical to reliably establishing the energy and water savings from a DSM Program, such as Net Savings versus Gross Savings calculations, the difference between Estimated Savings and Evaluated Savings, peak demand savings, and how to deal with Overlaps. The Protocol also sets out the governance framework for the EM&V ecosystem, including roles and responsibilities, annual process timelines and reporting requirements.

For each of the DSM Programs (as more particularly identified in Figure 1 at
Section 3 in this Protocol), the methodology to evaluate the impact of the DSM
activities is outlined in this Protocol. This includes international best practice,
local context considerations and recommendations, data flows within the EM&V
ecosystem, Net Savings calculation guidelines and relevant Overlaps with other
DSM Programs.


1.3 References
The following references have been used to prepare this Protocol:
Energy Efficiency Program Impact Evaluation Guide, EM&V Working Group, SEEAction (2012).
Evaluation, Measurement & Verification – ACEEE (2020).
National Renewable Energy Laboratory (NREL), 2014, "Estimating Net Savings: Common Practices".
NREL, 2018, Chapter 10: Peak Demand and Time-Differentiated Energy Savings Cross-Cutting Protocol.
Evaluation Framework for Pennsylvania Act 129 Phase III Energy Efficiency and Conservation Programs (2016).
National Survey of State Policies and Practices for Energy Efficiency Program Evaluation – ACEEE (2020).
Avista Utilities Evaluation Framework (2016).
Efficiency Manitoba Evaluation Framework and Plan – Appendix A (2019).
EM&V Plan 2018-2020 Version 9 – California Public Utility Commission.
Evaluation Measurement and Verification (EM&V) Guidance for Demand-Side Energy Efficiency (EE) – US EPA (2015).
California eTRM: https://www.caetrm.com/
Allcott, Hunt (2011). Rethinking real-time electricity pricing. Resource and Energy Economics, Vol. 33, Issue 4, pp. 820-842.
ENERGY STAR. Multifamily New Construction Certification Process.
BC Hydro, 2018, New Construction Program's Energy Modelling Guideline.
Efficiency Nova Scotia. New Construction flow chart of requirements.
US DoE (PNNL), 2015, "Energy Modeling Building Codes Assistance Project".
US DoE. EnergyPlus.
UNFCCC, 2013, "Demand-side Activities for Efficient Outdoor and Street Lighting Technologies".
AEEE, 2015, "Preparation of Monitoring & Verification Protocols for Street Lighting".
ICP, 2018, "Streetlighting Protocol".
All CMUA TRM-relevant documents: https://www.cmua.org/energy-efficiency-technical-reference-manual
All Texas PUC TRMs: http://www.texasefficiency.com/index.php/emv
NY Joint Utilities TRM: https://www3.dps.ny.gov
Arkansas PUC TRM: http://www.apscservices.info/EEInfo/TRMV8.1.pdf
NREL Sample Design Cross-Cutting Protocol.


DTE / Navigant, 2017, "The Reliability of Behavioral Demand Response".
EnerNOC, 2011, "The Demand Response Baseline".
CAISO, 2009, "Baselines for Retail Demand Response Programs".
AEIC, 2009, "Demand Response Measurement & Verification".
Clean Energy Ministerial, 2014, "Measurement & Verification Process for Calculating and Reporting on Energy and Demand Performance – General Guidance".

1.4 Distribution
This Protocol has been distributed to the following entities:

Al Ain Distribution Company (AADC)

Abu Dhabi Distribution Company (ADDC)

Abu Dhabi Energy Services (ADES)

Emirates Water & Electricity Company (EWEC)

Department of Municipalities and Transport (DMT)

Abu Dhabi Municipality (ADM)

Al Ain Municipality (AAM)

Al Dhafrah Region Municipality (DRM)

Abu Dhabi Quality & Conformity Council (QCC)


2 Definitions & Abbreviations

The following terms have been used throughout this Protocol:

Actual Savings - Savings that are realized in practice due to the implementation of DSM activities.

Claimed Savings - Estimated savings based on actual DSM activities. Incorporates some degree of QA and verification activities by the DSM Program Owner.

Cooling Degree Days - The cumulative number of degrees in a month or year by which the mean temperature is above a set temperature.

Compliance Rate(s) - EM&V adjustment factor determined via a statistically significant analysis that attempts to capture the fraction of initiatives or projects that adhere to the set standards.

Deemed Savings - An estimate of energy, water or demand savings for a single unit of an installed energy or water efficiency measure that (1) has been developed from data sources and analytical methods that are widely considered acceptable for the measure and purpose, and (2) is applicable to the situation being evaluated.

DSM Evaluation - Process of conducting a wide range of assessment studies and other activities aimed at determining the impacts of energy and water efficiency and DSM Programs (and portfolios) and identifying opportunities for improvement.

Estimated Savings - Savings based on actual DSM activities, calculated via commonly accepted savings methodologies and/or Deemed Savings from prior evaluations and technical reference manuals, and improved via a number of EM&V adjustment factors.

EM&V - The term EM&V is commonly used as an overarching term for determining impacts at DSM Strategy, DSM Program and DSM Initiative level.

Evaluated Savings - Verified savings reported by an independent, third-party evaluator after DSM activities have concluded. The evaluator develops an EM&V plan and conducts an impact evaluation.


Free Riders - Participants that would have participated in energy and water conservation behavior even if the DSM program did not exist.

Gross Savings - Changes in energy and water consumption related to DSM activities.

International Performance Measurement and Verification Protocol (IPMVP) - Guidance document with a framework and definitions describing the four M&V approaches; a product of the Efficiency Valuation Organization.

Measurement & Verification - Combination of data collection, monitoring and analysis activities (which could be a standalone activity, or part of the program/portfolio impact evaluation) that aim at determining the energy, water and/or demand savings at individual sites or projects using one or more options defined in the IPMVP.

Net Savings - Changes in energy and water consumption directly attributable to DSM activities, excluding natural changes that would have happened even in the absence of the DSM Strategy.

Net-to-Gross Ratio - A net-to-gross (NTG) ratio is used to calculate Net Savings from Gross Savings by adjusting for Free Riders and Spill-over (e.g., unintended energy and water conservation actions due to the program).

Overlaps - Savings that can be attributed to more than one DSM Program are classified as Overlaps.

Realisation Rate - EM&V adjustment factor determined via a statistically significant analysis of Actual Savings vs. Reported Savings.

Reported Savings - Savings that are reported by the DSM activity implementing body (e.g., ESCOs) to the Program Owner and eventually the Strategy Owner.

Spill-over - Out-of-program energy and water savings that were initiated as a result of the DSM program but are outside its direct scope.


The following abbreviations have been used throughout this Protocol:

AADC - Al Ain Distribution Company
AAM - Al Ain City Municipality
A/C - Air Conditioning
ACEEE - American Council for an Energy-Efficient Economy
ADAFSA - Abu Dhabi Agriculture and Food Safety Authority
ADEO - Abu Dhabi Executive Office
ADES - Abu Dhabi Energy Services Company
ADHA - Abu Dhabi Housing Authority
ADM - Abu Dhabi City Municipality
ADDC - Abu Dhabi Distribution Company
AEC - Annual Electricity Consumption
B2B - Business-to-business
BESS - Battery Energy Storage Systems
CAP - Capacity
C&I - Commercial & Industrial
CHP - Combined Heat and Power
CMVP - Certified Measurement and Verification Professional
DC - District Cooling
DCD - Department of Community Development
DMT - Department of Municipalities and Transport
DoE - Abu Dhabi Department of Energy
DoF - Department of Finance
DR - Demand Response
DRM - Al Dhafra Region Municipality
DSM - Demand Side Management
DW - Dishwashers
EAD - Environment Agency Abu Dhabi
EEI - Energy Efficiency Index
EER - Energy Efficiency Ratio
EFLH - Equivalent Full Load Hours
EM&V - Evaluation, Measurement and Verification
ESCO - Energy Services Company
ESMA - Emirates Authority for Standardization and Metrology
EUI - Energy Use Intensity
EWEC - Emirates Water and Electricity Company
HEMS - Home Energy Management System
GFA - Gross Floor Area
IPMVP - International Performance Measurement & Verification Protocol
LED - Light Emitting Diode
M&V - Measurement and Verification
MEPS - Minimum Energy Performance Standards
NTG - Net-to-Gross Ratio
Ps - Place Settings
PoS - Point-of-sale
QCC - Abu Dhabi Quality and Conformity Council
REF - Refrigerators & Freezers
RCT - Randomized Controlled Trial
TRANSCO - Abu Dhabi Transmission & Despatch Company
TRM - Technical Reference Manual
TSD - Technical Support Document
TSE - Treated Sewage Effluent
SAEC - Standard Annual Energy Consumption
Wc - Water Consumption per Cycle
Wt - Water Consumption per Cycle Under Test Conditions
WF - Water Fixtures
WH - Water Heaters
WMD - Washing Machines & Dryers
WUI - Water Use Intensity

3 Background

3.1 Abu Dhabi DSM Strategy


The Abu Dhabi Demand Side Management and Energy Rationalization Strategy 2030 (herein referred to as the "DSM Strategy") aims to provide economic, system reliability and environmental benefits. Unveiled in 2019, the DSM Strategy addresses supply and demand issues through a ten-program, multi-stakeholder approach (as shown in Figure 1) ("DSM Program(s)"). The DSM Strategy targets a 22% reduction in electricity consumption and a 32% reduction in water consumption by 2030, relative to the 2013 baseline. These targets do not consider the impact of the Tariff Reform program.

Figure 1: Abu Dhabi DSM Strategy, Programs and Key Stakeholders

3.2 Evaluation, Measurement & Verification


Evaluation (E) is the process of conducting a wide range of assessment studies
and other activities aimed at determining the impacts of energy and water
efficiency and demand side management programs (and strategies) and
identifying opportunities for improvement. In contrast, Measurement &
Verification (M&V) is a combination of data collection, monitoring and analysis
activities (which could be a standalone activity, or part of the program/portfolio
impact evaluation) that aim at determining the energy, water and/or demand
savings at individual sites or projects using one or more options defined in the


International Performance Measurement & Verification Protocol (IPMVP). The term EM&V is commonly used as an umbrella term for determining impacts at both a program and a project level1.

In the context of DSM Programs, DSM Evaluation activities should not be considered the end-objective; rather, they should be part of a continuous process of program planning, implementation and evaluation. The results of the DSM Evaluation should feed back to the start of the DSM lifecycle and be used as an input into planning and the continuous improvement of existing and future programs.

Figure 2: Typical demand side management programs’ lifecycle


The evaluation of DSM activities has the following main objectives2:

1. Accountability of the impacts: Estimate the DSM Program impacts and determine whether the programs (and the overall DSM Strategy) have met their targets;

2. Continuous improvement of DSM Programs: Provide the necessary information and analyses to identify ways to improve existing and future DSM programs; and

3. Risk management to support energy and water resource planning and demand forecasting: Provide an understanding of the historical and future effects of energy and water efficiency, compared to other supply and demand side resources.

1 Energy Efficiency Program Impact Evaluation Guide, EM&V Working Group, SEEAction (2012)
2 Evaluation, Measurement & Verification - ACEEE (2020)


To help facilitate the overall evaluation process, regulators, utilities and/or third-party
program administrators develop EM&V frameworks or protocols. These documents
outline general EM&V principles, metrics, approaches, Net Savings versus Gross
Savings calculations, reporting requirements and overall governance. Such a document tends to remain fixed given its overarching nature, but can be updated periodically as the DSM Programs progress and mature.

3.3 Abu Dhabi EM&V Protocol


Based on the above, this Protocol is underpinned by the need for a comprehensive
monitoring framework and tool to evaluate, measure and verify the energy and water
savings achieved by all DSM Programs under the DSM Strategy. Consequently, this
Protocol is developed to serve as a guiding framework that supports the Program
Owners in their EM&V activities.


4 Key EM&V Concepts


4.1 Gross and Net Savings
In the context of DSM Evaluation, the estimation of Net Savings is fundamental. Net
Savings represent savings that can be attributed directly to DSM activities. The
objective of estimating Net Savings is to appropriately attribute energy, water and
monetary savings to assess the effectiveness of DSM programs and their cost-
effectiveness.3

In order to accurately estimate Net Savings, a baseline must be defined. The baseline
represents the counterfactual scenario of the DSM intervention, in other words, what
would have happened in the absence of the DSM intervention. By establishing a
baseline that correctly adjusts for natural changes in energy or water consumption
occurring over time (e.g., an “adjusted baseline”), Net Savings can accurately be
isolated and quantified. If these natural changes in consumption are not excluded from
the baseline, then these savings are defined as Gross Savings. These concepts are
illustrated by the graph below.

Figure 3: Baseline definition, Net Savings and Gross Savings in the context of DSM EM&V
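Expressed symbolically (notation introduced here for illustration only, consistent with Figure 3), the two quantities differ only in the baseline used:

\[
\text{Gross Savings} = E_{\text{baseline (unadjusted)}} - E_{\text{actual}}
\qquad
\text{Net Savings} = E_{\text{baseline (adjusted)}} - E_{\text{actual}}
\]

where \(E_{\text{actual}}\) is the consumption measured after the DSM intervention and the adjusted baseline removes natural, non-DSM changes in consumption.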

3 National Renewable Energy Laboratory (NREL). 2014. "Estimating Net Savings: Common Practices". Available here: https://www.energy.gov/sites/prod/files/2015/01/f19/UMPChapter17-Estimating-Net-Savings.pdf


As shown above, Gross Savings are generally greater than Net Savings because these natural changes in consumption reflect organic savings in consumption rather than organic increases, for example from improvements in end-use technology, adoption of higher-efficiency equipment, or more efficient use of energy and water.

As part of EM&V activities, utilities calculate both Gross Savings and Net Savings.
Generally, Gross Savings are calculated first, followed by a calculation of Net Savings that incorporates an adjusted baseline to account for natural changes. Nevertheless,
in the context of EM&V and for the reporting and communication of DSM savings, best
practice involves reporting of Net Savings. Gross Savings may also be reported, but
Net Savings should feature more prominently as those are the savings that can be
directly attributed to a portfolio of DSM programs.

A more complete picture of the intricacies of demand changes that may occur from
one year to the next is shown below. In the Abu Dhabi context, it is important to
consider deterioration/depreciation of air-conditioning equipment and systems given
the dominant cooling load.

Figure 4: Demand changes in the context of DSM


This example shows how annual consumption changes from one year to the next
(2020 to 2021). The left-hand side of the graph illustrates non-DSM related changes
in consumption, while the right-hand side shows the impact of DSM.

• Beginning from the left, the 2020 baseline consumption may experience some demand increase from new customers and some demand reduction due to departing customers.

• There is also an increase in consumption due to natural growth in energy use (e.g., higher penetration of electronic devices) and some natural conservation in energy (e.g., higher efficiency appliances become more cost-effective).

• Finally, to arrive at the 2021 consumption, savings attributed directly to the DSM
Strategy are accounted for.

In this context, the baseline for Net Savings is adjusted to reflect natural changes in
consumption (natural growth and natural conservation), whereas Gross Savings do
not reflect these natural changes.
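The accounting above can be made concrete with a small numeric sketch (Python). The figures are hypothetical placeholders for illustration only; they do not reflect Abu Dhabi data or the Protocol's EM&V model.

# Illustrative decomposition (hypothetical numbers) following Figures 3 and 4:
# the Net Savings baseline is adjusted for natural changes, while the Gross
# Savings baseline is not.
baseline_prior_year = 100.0   # prior-year consumption (arbitrary units)
natural_growth = 1.0          # e.g. higher penetration of electronic devices
natural_conservation = -3.0   # e.g. organic uptake of higher-efficiency equipment
dsm_savings = 5.0             # reduction attributed directly to DSM activities

adjusted_baseline = baseline_prior_year + natural_growth + natural_conservation  # 98.0
actual_consumption = adjusted_baseline - dsm_savings                             # 93.0

net_savings = adjusted_baseline - actual_consumption      # 5.0 (DSM only)
gross_savings = baseline_prior_year - actual_consumption  # 7.0 (includes natural changes)

print(net_savings, gross_savings)

In this sketch Gross Savings exceed Net Savings because the assumed natural changes are, on balance, conserving, which is the situation described above.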

The following graphs provide two examples to illustrate the difference between Gross
Savings and Net Savings for two DSM Programs: P6 Standards & Labels and P4
Efficient Cooling. For each example, two scenarios are shown:

• The first scenario assumes some level of natural changes in consumption, which results in an adjusted baseline and, in turn, different Gross Savings and Net Savings; and

• The second scenario assumes no natural changes in consumption, which results in a fixed baseline and, in turn, equivalent Gross Savings and Net Savings.


Example A: How do we attribute savings to P6 Standards & Labels?

Assumption: Natural changes (adjusted baseline)

Assumption: No natural changes (fixed baseline)


Example B: How do we attribute savings to P4 Efficient Cooling?

Assumption: Natural changes (adjusted baseline)

Assumption: No natural changes (fixed baseline)


4.1.1 Calculating Net Savings


Since the nature and context of each DSM Program is different, the approaches used
to estimate Net Savings are also different. In general, there are three approaches:

1. Using a Net-to-Gross Ratio | A net-to-gross ratio (%) can be applied to Gross Savings to estimate Net Savings. A net-to-gross ratio captures the impact of Free Riders and Spill-over:

a. Free Riders are participants that would have participated in energy and
water conservation behavior even if the DSM Program did not exist;
whereas

b. Spill-over captures out-of-program energy and water savings (e.g., spill-over activities) that were initiated as a result of the DSM Program but that are out of the direct scope of the DSM Program.

2. Based on an adjusted baseline | An adjusted baseline is applied in the context of DSM Programs where the baseline measure has historically shown changes
over time. For example, if there is evidence that the energy performance of A/C
units has experienced a gradual increase in efficiency, before the DSM Strategy
even existed, then the baseline should be adjusted in future years to continue
historical trends in efficiencies. If there is no evidence of natural improvements,
then the baseline should be fixed.

3. Savings are inherently net | For some DSM Programs, Gross Savings are the
same as Net Savings because no natural change in consumption would have
taken place in absence of the DSM Program.

The approach followed to calculate Net Savings for each DSM Program is outlined within its corresponding part in Section 6.
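As a minimal illustration of approach 1 above, the sketch below (Python) assumes the common formulation NTG = 1 - free-ridership rate + spill-over rate. The function names and example figures are hypothetical; the exact formulation used for any given DSM Program would be set in its evaluation plan.

def net_to_gross_ratio(free_ridership, spillover):
    """Assumed common NTG formulation: NTG = 1 - free-ridership + spill-over.

    free_ridership: share of Gross Savings that would have occurred anyway (0-1).
    spillover: out-of-program savings induced by the program, expressed as a
    share of Gross Savings (0-1).
    """
    return 1.0 - free_ridership + spillover

def net_savings(gross_savings, free_ridership, spillover):
    """Approach 1 in Section 4.1.1: Net Savings = Gross Savings x NTG ratio."""
    return gross_savings * net_to_gross_ratio(free_ridership, spillover)

# Hypothetical example: 10 GWh of Gross Savings, 15% Free Riders, 5% Spill-over.
print(net_savings(10.0, 0.15, 0.05))  # 9.0 GWh of Net Savings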


4.2 Program Overlaps


Overlaps, large and small, exist across many of the DSM Program initiatives. For
example, savings from enforcing A/C standards under the Standards & Labels
program overlap with savings generated from retrofitting A/C units under the Building
Retrofits program. To avoid double counting of savings, it is crucial that the DSM
Evaluation strategy either procedurally draws clear boundaries between the different program initiatives through data collection, or estimates the size of the Overlap and allocates it among the overlapping programs. As for the latter approach, it is
generally recommended that Overlaps are dealt with pragmatically by making either a
full allocation of savings to one program, or a 50/50 allocation to each program.
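The allocation rule described above can be expressed as a small helper function (Python). This is an illustrative sketch only; the function name and figures are hypothetical and not part of the Protocol's EM&V model.

def allocate_overlap(overlap_savings, programs, full_allocation_to=None):
    """Allocate savings claimed by more than one DSM Program (Section 4.2):
    either allocate the full overlap to a single named program, or split it
    equally (50/50 for two programs) among the overlapping programs.
    """
    if full_allocation_to is not None:
        return {p: (overlap_savings if p == full_allocation_to else 0.0)
                for p in programs}
    share = overlap_savings / len(programs)
    return {p: share for p in programs}

# Hypothetical example: 2 GWh of A/C savings claimed by both P6 and P7.
print(allocate_overlap(2.0, ["P6", "P7"]))                           # 50/50 split
print(allocate_overlap(2.0, ["P6", "P7"], full_allocation_to="P7"))  # full allocation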

Generally, most Overlaps within the ten DSM Programs can be managed procedurally
by defining clear boundaries between the data collection and reporting of savings in
different program initiatives. Overlaps associated with P6 are the only areas that
require an estimation of the overlap and an allocation of savings between P6 and the
other DSM Programs.

The proposed approach for dealing with Overlaps under each of the Abu Dhabi DSM Programs is described within its corresponding part in Section 6. An overall summary of the approaches is presented in Figure 5.

Note: P4-P7 Overlaps: Savings from DC installations in new construction buildings (in other words non-retrofit DC projects)
should be allocated to P4; while Savings from DC retrofit projects (reported directly by ADES) should be allocated to P7.

Figure 5: Summary of the proposed overlap allocation approach per program


4.3 Estimated vs. Evaluated Savings


The effective communication and reporting of DSM savings is as important as the
evaluation of savings themselves. Using the right terminology to distinguish savings
as ‘net’ or ‘gross’ can have a significant impact on the way savings are understood by
stakeholders. Some stakeholders may be concerned with the magnitude of net
savings for the purposes of understanding the cost-effectiveness of DSM Programs,
while other stakeholders may be more concerned with the magnitude of gross energy
and water savings, without concerns of whether savings are attributed to DSM
Programs or natural changes in consumption.

The use of different savings terminology in the context of EM&V maturity is just as
important. For example, Net Savings could refer to savings determined based on
engineering assumptions and Deemed Savings figures, just as they could refer to
savings determined based on metered data and complemented by insights from an
extensive survey of participants and interviews. The level of EM&V maturity and
certainty of the latter (measured data) is materially greater than the former
(engineering assumptions).

In the context of EM&V maturity, savings are generally reported as “projected”, “claimed”, or “evaluated” savings.

• Projected Savings refers to savings that have been estimated for the purpose
of program design and DSM forecasting (e.g., high-level engineering
assumptions).

• Claimed Savings refers to savings that have been estimated based on actual
DSM activity data (e.g., data collected by Program Owners).

• Evaluated Savings refers to savings that have been evaluated by an independent third-party evaluator.

The diagram below provides additional detail and context for each of these terms. It is
also important to consider at what point in the DSM implementation/evaluation life
cycle these terms are generally used (e.g., before implementation, after
implementation, after DSM Evaluation).


Based on the Abu Dhabi landscape of DSM implementation and EM&V, the use of
“projected” and “claimed” savings may not be the most accurate. Instead of “projected”
or “claimed” savings, referring to savings as “estimated” savings may be most
appropriate for several reasons:

• Inclusion of EM&V adjustment factors | The EM&V model, and the underlying DSM savings methodologies, incorporate a large selection of EM&V
adjustment factors that improve the calculation of DSM savings. These factors
include Compliance Rates, Realisation Rates and Net-to-Gross Ratios. Initially
most of these EM&V factors may only be placeholder values.

• Continuous Improvement of EM&V | Once DSM Evaluation is complete, those results can be incorporated into the model to improve the quality of future
results. Over time, the level of certainty and maturity of Estimated Savings will
increase.
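To make the role of these adjustment factors concrete, the sketch below (Python) assumes they are applied multiplicatively to Reported Savings. This is an assumption for illustration only; the Protocol's EM&V model may combine the factors differently, and the figures are placeholders.

def estimated_savings(reported_savings, compliance_rate=1.0,
                      realisation_rate=1.0, ntg_ratio=1.0):
    """Assumed multiplicative application of EM&V adjustment factors.

    reported_savings: savings reported by the implementing body (e.g., ESCOs).
    compliance_rate: fraction of initiatives/projects adhering to set standards.
    realisation_rate: ratio of Actual to Reported Savings from prior evaluations.
    ntg_ratio: net-to-gross adjustment for Free Riders and Spill-over.
    """
    return reported_savings * compliance_rate * realisation_rate * ntg_ratio

# Hypothetical example with placeholder factor values.
print(estimated_savings(12.0, compliance_rate=0.90,
                        realisation_rate=0.95, ntg_ratio=0.85))  # approx. 8.7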

4.4 Calculating Demand Savings


In the evaluation of DSM activities, the calculation of demand savings can be
performed using a variety of methods. As with energy savings, methods that use
measurements or hourly metering data are more accurate than estimation methods
using engineering assumptions. In calculating demand savings, it is especially
important to consider how the type of end-use targeted by each DSM activity (e.g.,
lighting, space cooling, etc.) contributes to the system’s peak demand.


For example, DSM initiatives targeting the replacement of luminaires will likely have a lower impact on peak demand than DSM activities targeting space cooling equipment. This is because space cooling demand is highly coincident with the electricity system peak. In contrast, lighting demand is less coincident with the system peak because luminaires are commonly operated in the evening, when electricity demand is not at its highest.

The US National Renewable Energy Laboratory (NREL) describes a variety of different methods that can be employed to calculate demand savings from DSM activities.4 These include:

• Engineering algorithms;
• Calibrated hourly building simulation modelling;
• Billing data analysis;
• Interval metered data analysis (may also be end-use disaggregated); and
• Survey data on operating hours of end-use equipment.
The approach recommended for the Abu Dhabi DSM Strategy is to follow a consistent
engineering algorithm for all DSM Programs, while accounting for the different end-
uses targeted. The engineering algorithm described by NREL is appropriate for this
purpose. NREL employs a diversity factor (%) and a coincidence factor (%) to convert energy savings (kWh) into demand savings (kW). A variation of this is to consolidate both parameters into a "peak demand coincidence factor". A peak demand coincidence factor is a measure of how coincident the energy consumption associated with a particular end-use is with the system peak. This is illustrated by the graph below.
The left axis shows the system-wide energy demand, and the right axis shows space
cooling energy demand. As shown, the peaks of the system demand and space
cooling demand shapes are very coincident.

4 NREL, 2018, Chapter 10: Peak Demand and Time-Differentiated Energy Savings Cross-Cutting Protocol. Available here: https://www.nrel.gov/docs/fy17osti/68566.pdf


Figure 6: Illustration of the concept of peak demand coincidence factor

The peak demand coincidence factor for space cooling would be calculated as shown by the equation below: space cooling energy consumption during the system peak is divided by the total annual space cooling consumption. In other words, the peak demand coincidence factor is the share (%) of annual space cooling consumption occurring during peak periods.
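The equation referred to above is not reproduced in this extract; based on the description, it takes the following form (notation introduced here for illustration):

\[
CF_{\text{peak, cooling}} = \frac{E_{\text{cooling, peak period}}}{E_{\text{cooling, annual}}}
\]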

Each end-use (e.g., space cooling, lighting, etc.) should have a unique "peak demand coincidence factor". The equation below shows how the peak demand coincidence factor is used to calculate demand savings.
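This equation is likewise not reproduced in this extract. One plausible reconstruction, consistent with the NREL engineering-algorithm approach and the factor defined above, is the following (an assumption, with \(h_{\text{peak}}\) denoting the number of hours in the defined peak period):

\[
\Delta kW_{\text{peak}} \approx \frac{\Delta kWh_{\text{annual}} \times CF_{\text{peak}}}{h_{\text{peak}}}
\]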


5 EM&V Governance Framework


This section covers the governance structure of Abu Dhabi's EM&V activities. The section starts by providing insight into international best practices for EM&V governance. Then, a deeper dive into the EM&V ecosystem in Abu Dhabi is presented, highlighting the roles and responsibilities of each of the players within the ecosystem. Next, the data governance and data flows for each of the DSM Programs are detailed and visualized. The annual EM&V process, with its roles, responsibilities and timelines, is then presented. Finally, the section covers some remaining EM&V topics, including EM&V funding and Technical Reference Manuals.

5.1 International Best Practice


5.1.1 Who Develops the EM&V Protocol?
Typically, in North America, EM&V protocols and frameworks are developed by
regulatory bodies to guide activities in their jurisdiction. For example, the evaluation
framework for the energy and water efficiency and conservation programs in the U.S.
State of Pennsylvania was published by the Pennsylvania Public Utility Commission,
which is responsible for regulating the public utility industries in the state. The EM&V
framework document is used to guide the efforts associated with conducting
evaluations of the energy efficiency programs and portfolios of the seven utilities in
Pennsylvania5. Given that the DoE, as the regulator, is leading the efforts to develop
this Protocol for Abu Dhabi, this element of the governance structure aligns with
international best practices.

Nevertheless, other governance structures do exist in North America, particularly in jurisdictions where evaluations are not regarded as a core focus of the public utility commission. A study conducted by ACEEE in 2020 highlighted that 21% of the states had no state-wide documented evaluation protocols or guidelines6. In such cases, some utilities develop and adopt their own EM&V protocols and frameworks, such as Avista Utilities in Washington & Idaho7. The utility developed its

5 Evaluation Framework for Pennsylvania Act 129 Phase III Energy Efficiency and Conservation Programs (2016)
6 National Survey of State Policies and Practices for Energy Efficiency Program Evaluation - ACEEE (2020)
7 Avista Utilities Evaluation Framework (2016)


EM&V Framework in 2016 to define the methods used to perform EM&V activities for
their DSM programs.

Similarly, some government agencies and third-party program administrators have developed their own EM&V protocols. Efficiency Manitoba, a standalone corporation
dedicated to energy efficiency and conservation in the Province of Manitoba, Canada,
adopts its own EM&V protocol8. The document provides guidance for Efficiency
Manitoba’s Energy Efficiency evaluation planning and implementation.

5.1.2 Role of the Regulator in Program Evaluation


The role of the regulatory commission in the evaluation of the programs differs from
one state to another, but generally has diminished over time. In 2012, 75% of the
regulators had a substantial role within the evaluation ecosystem, either by directly
administering the evaluation, or at least by formally approving evaluation plans and
reports. In 2020, the results indicate that only 59% of the state regulators have a strong
role in the overall evaluation process6.

Figure 7: Role of the U.S. state regulatory commissions in EM&V activities in 2012 (Left) and 2020
(Right)7
5.1.3 Third-Party Evaluations
Although the “Estimated Savings” are calculated using commonly accepted savings
methodologies and are coupled with a large selection of EM&V adjustment factors that
improve the accuracy of the savings, an independent third-party evaluation is
necessary to provide an unbiased evaluation and verification of the Estimated

8 Efficiency Manitoba Evaluation Framework and Plan – Appendix A (2019)


Savings to obtain "Evaluated Savings". This is done by conducting impact evaluations through a variety of research methods that are tailored to the different DSM programs and activities within a portfolio. Third-party evaluations can be conducted by independent IPMVP-certified personnel and/or independent entities with experience in carrying out such evaluations. A larger number of programs within the portfolio, and more complex programs, require a proportionally bigger team to conduct the evaluation.

In terms of conducting the evaluations in practice, a Program Owner typically generates its own Estimated Savings using its own staff and/or consultants. For example, this
is done for the purpose of confirming incentive payments to participants or contractors,
for internal documentation or external reporting. A third-party evaluator will then
conduct some level of evaluation on the program. This could either be a verification
exercise only, or verification coupled with its own data collection and analysis. As part
of this process, the third-party evaluator calculates EM&V adjustment factors, such as
in-service rates, Realisation Rates, Net-to-Gross Ratios, Compliance Rates, etc., and
reports the overall Evaluated Savings results alongside the calculated factor in an
“Impact Evaluation Report” (see below for further information on evaluation reporting).

The role of the third-party evaluator(s) is not limited to only conducting the evaluation
activities. Rather, it is recommended that the evaluator(s) also develops the EM&V
planning documents alongside the Program Owners, given the evaluator’s main role
in adopting and implementing the plan (see below for further information on evaluation
planning). Consequently, it is highly recommended that the Program Owners engage
the third-party evaluator at an early stage to assist with the planning, provide general
feedback, endorse M&V plans if required, setup the data collection process and
streamline the overall evaluation activity. Nonetheless, the DoE will take on this responsibility for the first three years of this Protocol's implementation to assist Program Owners in achieving the required level of maturity. Further details are outlined in
Section 5.4.3.


Who Hires the Third-Party Evaluator(s)?


There is a great diversity in North America in terms of who is responsible for hiring the
third-party evaluator(s) within the EM&V ecosystem. As Figure 8 illustrates, utilities
tend to have a predominant role in administering the evaluation activities in the U.S.
Furthermore, the role of the regulatory commission in administering evaluations has
diminished over time, with a greater share being captured by the utilities in 2020
compared to 20126.

Figure 8: Evaluation administration responsibility amongst U.S. States in 2012 (Left) and 2020 (Right) 7

For example, Commonwealth Edison (commonly known as ComEd) directly hires and
administers a third-party evaluator to carry out the evaluation activities for its energy
efficiency program in the State of Illinois9. On the other hand, in the State of California,
the California Public Utility Commission (CPUC) hires and administers the third-party
evaluation activities on behalf of the investor-owned utilities in its jurisdiction10, with
multiple firms hired to address one or more programs11.

Different EM&V process options are presented in Section 5.4, including the
recommended option for Abu Dhabi.

9 Section 8-103B(f)(6) of the Illinois Public Utilities Act (PUA)


10 EM&V Plan 2018-2020 Version 9 – California Public Utility Commission
11 Generally, for larger portfolios, there could be multiple firms hired separately under separate contracts with one or more Program Owners, hence the use of the plural term “Third-Party Evaluator(s)” in this document.


What is the Scope and the Frequency of the Third-Party Evaluations?


The scope of energy efficiency evaluation activities in North America depends on the
programmatic and regulatory context in each jurisdiction:

• A state (or province) with limited goals for energy efficiency and no mandated
(via codes and standards) energy efficiency activity → Limited level of EM&V.

• A state (or province) with aggressive long-term energy efficiency targets in
legislation, mandated energy efficiency activity, performance incentives for
Program Owners, and a need for solid data for resource planning purposes →
Rigorous EM&V.

The choice of evaluation scope and the associated level of certainty for Abu Dhabi
should strike a balance between the desired level of certainty, the cost of planning
and conducting EM&V activities and, most importantly, the value of the information
generated by the evaluation efforts and how it will be used. Note that achieving
the desired level of certainty for some programs may initially be limited by data
availability issues; nevertheless, this can be improved over time.

Figure 9: Illustrative chart demonstrating the impact of scope and frequency on EM&V budgets
As for the frequency of third-party evaluations, less frequent evaluation is typically
required for programs with relatively small savings, little variability in savings, and/or
less year-on-year measurement uncertainty. Conversely, more frequent evaluation is
usually required for larger and/or more complex programs with greater year-on-year
variability and uncertainty12.

12 Evaluation Measurement and Verification (EM&V) Guidance for Demand-Side Energy Efficiency (EE) – US EPA (2015)


It should be noted that retaining the flexibility to adjust the EM&V plan and its
implementation as the programs evolve in Abu Dhabi is critical. Based on the current
maturity and data availability, as a general guideline, a more limited third-party
evaluation scope at a higher frequency may be more appropriate for Abu Dhabi. As
implementation progresses, the evaluation scope could become more rigorous at a
lower frequency. The recommended frequencies of third-party evaluations for Abu
Dhabi’s DSM programs are summarized in Section 6.

5.1.4 Evaluation Levels


EM&V activities typically take place at three levels, each with a different degree of
detail, as illustrated in the figure below.

Note: The Strategy Cycle EM&V Plan is not necessarily a direct input into the strategy level evaluation. In
essence, it is a forward-looking overview (three/four years) of evaluation activities followed by the third-
party evaluator. Alternatively, the strategy level EM&V plan could be a compilation of the impact evaluation
plans for all the DSM programs.
Figure 10: EM&V levels and the associated inputs and outputs

All evaluation planning documents should clearly present the evaluation efforts, the
level of detail required, and the timeline to be followed. While the planning documents
are mainly prepared by the third-party evaluator and/or the program owner(s), the
regulator can set out the minimum requirements and what should be included in these
documents through the regulatory framework, such as through a standard template
that is adopted by the evaluator and the program owner.


Table 1: Typical evaluation planning documents 13

Initiative Level – M&V Plan(s)
• Description: Individual M&V Plans are required for initiatives that are part of a program and are selected for analysis and inspection. The plan describes the specific activities that would be conducted at a single site, the timeline, the relevant IPMVP option and verification method.
• Prepared by: An in-house or external CMVP, reviewed by the Third-Party Evaluator (occasionally).
• Requires DoE review and approval: No, except for projects or initiatives that contribute to significant (e.g. >5%) program savings. In such cases, the M&V Plans shall be endorsed by the third-party evaluator.
• Frequency: As required, per the Impact Evaluation Plan agreed with the Program Owners.
• Feeds into: M&V Report(s).

Program Level – Impact Evaluation Plan
• Description: Program-specific impact evaluation plans that detail the evaluation approach, timeline, budgetary requirements, data collected and results reported.
• Prepared by: Third-Party Evaluator with the Program Owner.
• Requires DoE review and approval: Yes.
• Frequency: As required (e.g., annual), per the Strategy Cycle EM&V Plan.
• Feeds into: Impact Evaluation Report.

Strategy Level – Strategy Cycle EM&V Plan
• Description: In case the third-party evaluator is hired by the strategy owner, this document is a forward-looking long-term planning document that indicates when major evaluation activities will be conducted during the evaluation cycle (one, two, or three years). The document describes the approach at a high level, budgets, which programs would have impact evaluations each year, and which programs only require verification. If there are multiple third-party evaluators contracted by the different program owners, this document could be a compilation of all the impact evaluation plans for all programs.
• Prepared by: Single Third-Party Evaluator at a strategy level: Third-Party Evaluator with Program Owners. Multiple Third-Party Evaluators: compilation of Impact Evaluation Plans by the Regulator.
• Requires DoE review and approval: Yes.
• Frequency: Annual / two years / three years, depending on the level of maturity of the EM&V activities.
• Feeds into: DSM Annual Report.

13 Energy Efficiency Program Impact Evaluation Guide, EM&V Working Group, SEEAction (2012)


DSM Evaluation results are typically reported through structured M&V reports at a
project or initiative level, which feed into an Impact Evaluation Report on a program
level. Once the entire program portfolio is evaluated, the overall results are typically
reported in the annual DSM Strategy report. Furthermore, an overall DSM Strategy
level evaluation shall be performed every two to three years to analyse successful
programs, identify opportunities for improvement and plan for future programs.

Table 2: Typical evaluation reporting documents 14

Initiative Level – M&V Report
• Description: Documents the methodology used and the impacts associated for a specific initiative or site.
• Reported by: An in-house or external CMVP, reviewed by the Third-Party Evaluator (occasionally).
• Reported to: Program Owner and/or Strategy Owner.
• Public vs Limited: Limited.
• Frequency: As needed at an initiative level, according to the EM&V Plan.

Program Level – Impact Evaluation Report
• Description: Documents the methodology and results of the impact evaluation conducted by the third-party evaluator, and possibly the cost-effectiveness of the program.
• Reported by: Third-Party Evaluator.
• Reported to: Program Owner and Strategy Owner.
• Public vs Limited: Public.
• Frequency: Upon third-party evaluation.

Strategy Level – DSM Strategy Report
• Description: Provides an overview of the DSM Programs and documents the impact evaluation results and the cost-effectiveness for the strategy.
• Reported by: Strategy Owner.
• Reported to: Stakeholders / Public.
• Public vs Limited: Public.
• Frequency: Annual.

Strategy Level – Strategy Evaluation Report
• Description: Documents the results of an overall evaluation at a strategy level that evaluates the results to date and the cost-effectiveness of the strategy, and identifies opportunities for improvement and/or discontinuation of programs or projects.
• Reported by: Third-Party Evaluator.
• Reported to: Strategy Owner.
• Public vs Limited: Public.
• Frequency: Every two to three years.

14 Energy Efficiency Program Impact Evaluation Guide, EM&V Working Group, SEEAction (2012)


5.2 Governance Structure: Roles & Responsibilities


There are multiple players in the Abu Dhabi EM&V ecosystem, each playing a part in
the success of the DSM Evaluation activities.

Figure 11: Abu Dhabi EM&V Ecosystem


The roles and responsibilities for each of the EM&V players are outlined as follows:

5.2.1 DSM Strategy Owner: Department of Energy


As the strategy owner, the DoE oversees the DSM Programs’ implementation, and
facilitates and administers the overall EM&V process. The roles and responsibilities
for the DoE are the following:

1. Oversee the implementation of Abu Dhabi DSM Programs aiming at the


rationalization of energy and water consumption.

2. Establish and enforce a robust EM&V ecosystem that facilitates the smooth and
successful implementation of the annual EM&V process.

3. Enforce the regulatory requirements that the Program Owners and third-party
evaluators need to fulfil in terms of EM&V planning and reporting, as well as the
eligibility criteria for performing the third-party evaluations.

4. Hire an independent, third-party evaluation contractor through a competitive


bidding process to conduct the evaluation on behalf of the Program Owners 15.

5. Review and approve DSM Program EM&V Plans.

15Recommended option for the transition period (3 years). Accordingly, this Protocol covers the scope of activity
during the 3-year transition period.


6. Send out data request sheets to Program Owners to collect relevant data from
the ten DSM Programs.

7. Manage the EM&V model through importing relevant data from Program
Owners and performing quality checks of the data and the generated DSM
Evaluation results.

8. Study the possibility of introducing an incentive/penalty scheme for relevant


Program Owners to realize the DSM targets set for their respective program(s).

9. Lead the EM&V Working Group.

10. Submit periodic reports to ADEO in relation to the DSM Strategy.

5.2.2 DSM Program Owners


The Program Owners carry out the DSM activities in accordance with the DSM
Strategy, including developing and implementing a tailored EM&V plan with the
assistance of an independent, third-party evaluator(s). The roles and responsibilities
for the Program Owners are the following:

1. Carry out DSM activities in accordance with the DSM Strategy.

2. Develop tailored EM&V plans at the program level that are aligned with the
strategy’s EM&V timeline with the contracted third-party evaluator(s) in
accordance with the regulatory requirements enforced by the DoE, and submit
plans for review and approval by the DoE.

3. Develop tailored M&V plans at the initiative level that are aligned with strategy’s
EM&V timeline.

4. Collaborate with the Program Stakeholders on collecting the relevant M&V data
following the M&V plan, and report back to the DoE through the data collection
sheets provided as part of this Protocol.

5. Subject to the regulatory requirements enforced by the DoE, collaborate and


coordinate with the independent, third-party evaluation contractor(s) to conduct
the DSM Evaluation.

6. Following the 3-year transition period outlined above, and subject to agreement
of the post-transition action plan, the most efficient arrangement for continuing
with independent, third-party evaluation shall be determined in consultation with
the Strategy Owner.

7. Perform quality checks of the third-party evaluation outputs.

8. Report the collected EM&V data through the data collection sheet back to the
DoE in a timely manner.

9. Communicate any process improvements around the DSM Programs and


EM&V process to the DoE.

10. Participate in the EM&V Working Group.

5.2.3 DSM Program Stakeholders

The Program Stakeholders also carry out the DSM activities in accordance with the
DSM Strategy, as well as assist the Program Owners and the third-party evaluators
with the EM&V activities. The roles and responsibilities for the program stakeholders
are the following:

1. Carry out DSM activities in accordance with the DSM Strategy.

2. Collaborate with the Program Owners to collect the relevant data in accordance
with the developed savings methodologies and data collection sheets in this
Protocol.

3. Collaborate with the third-party evaluators in providing the necessary data to


conduct the DSM Evaluation.

4. Communicate any process improvements around the DSM Programs and


EM&V process to the Program Owners.

5.2.4 Department of Finance

The main responsibility of the Department of Finance is to allocate an annual budget
for DSM and EM&V activities, including budgeting for independent third-party
evaluators. During the third year of the transition period, the proposed options for
allocating resources for third-party evaluators will be evaluated. Subject to the
agreement of the Department of Finance, various options to ensure the most efficient
allocation process will be studied, and the preferred option for post-transition
allocation will be included in the action plan.


5.2.5 Supporting EM&V Players

There are other players in the EM&V ecosystem with critical roles and responsibilities:

• Consultants: As required, advise Strategy Owner and/or Program Owners on


DSM Strategy implementation and EM&V process.

• Third-Party Evaluators: Conduct third-party impact evaluation of DSM


Program activities to verify the savings reported by the Program Owners.

• ESCOs: Report relevant data required to estimate the savings from ESCO-
related activities back to the Program Owners in a timely manner.

5.3 Governance Structure: Overall Data Governance

The EM&V model collects data from all Program Owners and uses agreed savings
methodologies to calculate savings from the DSM Strategy. Section 4 of this Protocol
provides a per program view of the data governance that includes the data
requirements and collection frequency, data reporting frequency by the Program
Owners to the DoE and the data flow.
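As a minimal, hypothetical sketch of this aggregation step (the program names, factors and figures below are illustrative placeholders, not values from the EM&V model), program-level data reported by the Program Owners can be combined into strategy-level results as follows:

```python
# Illustrative sketch only: aggregating program-level savings in an EM&V model.
# Program names, adjustment factors and figures are hypothetical placeholders.

def evaluated_savings(estimated_kwh: float, realisation_rate: float, net_to_gross: float) -> float:
    """Apply EM&V adjustment factors to a program's Estimated Savings (kWh)."""
    return estimated_kwh * realisation_rate * net_to_gross

programs = {
    "P6 Standards & Labels": {"estimated_kwh": 120e6, "realisation_rate": 0.92, "net_to_gross": 0.85},
    "P7 Building Retrofits":  {"estimated_kwh": 80e6,  "realisation_rate": 0.88, "net_to_gross": 0.90},
}

# Sum the adjusted (Evaluated) savings across programs to a strategy-level total.
strategy_total_kwh = sum(evaluated_savings(**p) for p in programs.values())
print(f"Strategy-level Evaluated Savings: {strategy_total_kwh / 1e6:.1f} GWh")
```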

Figure 12: Overall Abu Dhabi EM&V data flow


5.4 EM&V Process


Based on international best practice, there are multiple viable options for Abu Dhabi’s
annual EM&V process. Figure 13 illustrates those options.

Figure 13: EM&V process flow options summary

5.4.1 EM&V Process Option A


Option A assumes that the Program Owners will contract the third-party evaluator(s)
and report Evaluated Savings annually, post evaluation.

1. Program Owners contract third-party evaluator(s).
2. Third-party evaluator(s) develop Impact Evaluation Plans with Program Owners for review and approval by the DoE.
3. Program Owners develop M&V Plans with occasional review from Third-Party Evaluator(s) and start data collection.
4. Third-party evaluator(s) verifies savings data, calculates EM&V adjustment factors and generates evaluation report.
5. DoE sends data collection sheet to the Program Owners.
6. Program Owners fill in data collection sheet in collaboration with Program Stakeholders.
7. Program Owners report “Evaluated Savings” data and Impact Evaluation Report back to the DoE (Annually).
8. DoE integrates evaluation results within EM&V Model to attain “Evaluated Savings”.
9. DoE reviews Impact Evaluation Report and “Evaluated Savings”.
10. DoE produces Annual DSM Report and reports results to the Executive Council.


Table 3: EM&V process Option A pros and cons

DoE – Pros:
▪ Simple, streamlined process with minimal touchpoints.
▪ Updates to the EM&V Model are only required once a year with the final Evaluated Savings.
▪ Management, oversight and coordination with third-party contractor is the Program Owner’s responsibility.
DoE – Cons:
▪ No visibility into the savings accrued throughout the year.
▪ Potentially multiple different third-party evaluators requiring greater resources for oversight and reviews.
Program Owner(s) – Pros:
▪ Simple, streamlined process with minimal touchpoints.
▪ Relatively lower effort as data is reported once a year.
Program Owner(s) – Cons:
▪ Management, oversight and coordination with third-party contractor is the Program Owner’s responsibility.

5.4.2 EM&V Process Option B


Option B also assumes that the Program Owners will contract the third-party
evaluator(s); however, Estimated Savings data are reported as per the assigned
frequency per program, and Evaluated Savings are reported annually, post DSM
Evaluation.


1. Program Owners contract third-party evaluator(s).
2. Third-party evaluator(s) develop Impact Evaluation Plans with Program Owners for review and approval by the DoE.
3. Program Owners develop M&V Plans with occasional review from Third-Party Evaluator(s).
4. DoE sends data collection sheet to the Program Owners.
5. Program Owners collect data and fill in data collection sheet in collaboration with Program Stakeholders.
6. Program Owners report “Estimated Savings” data back to the DoE as per the assigned reporting frequency per program.
7. DoE integrates “Estimated Savings” data in EM&V model.
8. Third-party evaluator(s) verifies savings data, calculates EM&V adjustment factors and generates Impact Evaluation Report.
9. Program Owners report “Evaluated Savings” data and Impact Evaluation Report back to the DoE (Annually).
10. DoE integrates evaluation results within EM&V Model to attain “Evaluated Savings”.
11. DoE reviews Impact Evaluation Report and “Evaluated Savings”.
12. DoE produces Annual DSM Report and reports results to the Executive Council.

Table 4: EM&V process Option B pros and cons

DoE – Pros:
▪ Visibility into the savings accrued throughout the year, allowing for adjustments and improvements.
▪ Management and coordination with third-party contractor(s) are the Program Owner’s responsibility.
▪ DoE could retain a third-party evaluator to carry out additional evaluations for all programs.
DoE – Cons:
▪ Updates to the EM&V model occur multiple times a year with “Estimated Savings” data, and once a year with “Evaluated Savings” data.
▪ Potentially multiple different third-party evaluators requiring greater resources for oversight and reviews.
Program Owner(s) – Pros:
▪ Continuous data tracking and reporting, allowing for adjustments and improvements.
Program Owner(s) – Cons:
▪ More complex process with multiple data collection reporting requirements.
▪ Management, oversight and coordination with third-party contractor is the Program Owner’s responsibility.


5.4.3 EM&V Process Option C


Option C assumes that the DoE will contract the third-party evaluator(s) and
Program Owners only report Estimated Savings data annually.

1. DoE contracts third-party evaluator(s).
2. Third-party evaluator(s) develops strategy EM&V Plan and Impact Evaluation Plans with Program Owners for review and approval by the DoE.
3. Program Owners develop M&V Plans with occasional review from Third-Party Evaluator(s).
4. DoE sends data collection sheet to the Program Owners.
5. Program Owners collect data and fill in data collection sheet in collaboration with Program Stakeholders.
6. Program Owners report “Estimated Savings” data back to the DoE (Annually).
7. Third-party evaluator(s) verifies savings data, calculates EM&V adjustment factors and generates Impact Evaluation Report.
8. DoE integrates evaluation results within EM&V Model to attain “Evaluated Savings”.
9. DoE and Program Owners review Impact Evaluation Report and “Evaluated Savings”.
10. DoE produces Annual DSM Report and reports results to the Executive Council.

Table 5: EM&V process Option C pros and cons

DoE – Pros:
▪ Simple, streamlined process with minimal touchpoints.
▪ Updates to the EM&V Model are only required once a year with the final Evaluated Savings.
▪ Direct control over evaluation process, scope and research methods, particularly for non-utility Program Owner(s).
▪ Complete independence from Program Delivery.
▪ Potentially a single third-party evaluator for all programs.
DoE – Cons:
▪ No visibility into the savings accrued throughout the year.
▪ Management, oversight and coordination with third-party contractor is the DoE’s responsibility.
Program Owner(s) – Pros:
▪ Relatively lower effort as data is reported once a year.
▪ Simple, streamlined process with minimal touchpoints.
▪ Management, oversight and coordination with third-party contractor is the DoE’s responsibility.
Program Owner(s) – Cons:
▪ Coordination with third-party contractor is the Program Owner’s responsibility.


5.4.4 EM&V Process Option D


Option D assumes that the DoE will contract the third-party evaluator(s), with
Estimated Savings data reported by the Program Owners as per the assigned
frequency per program, and verified savings are reported annually, post DSM
Evaluation.

1. DoE contracts third-party evaluator(s).
2. Third-party evaluator(s) develops strategy EM&V Plan and Impact Evaluation Plans with Program Owners for review and approval by the DoE.
3. Program Owners develop M&V Plans with occasional review from Third-Party Evaluator(s).
4. DoE sends data collection sheet to the Program Owners.
5. Program Owners collect data and fill in data collection sheet in collaboration with Program Stakeholders.
6. Program Owners report “Estimated Savings” data back to the DoE as per the assigned reporting frequency per program.
7. DoE integrates “Estimated Savings” data in EM&V model.
8. Third-party evaluator(s) verifies savings data, calculates EM&V adjustment factors and generates Impact Evaluation Report.
9. DoE integrates evaluation results within EM&V Model to attain “Evaluated Savings”.
10. DoE and Program Owners review Impact Evaluation Report and “Evaluated Savings”.
11. DoE produces Annual DSM Report and reports results to the Executive Council.

Table 6: EM&V process Option D pros and cons

DoE – Pros:
▪ Visibility into the savings accrued throughout the year, allowing for adjustments and improvements.
▪ Direct control over evaluation process, scope and research methods, particularly for non-utility Program Owner(s).
▪ Complete independence from Program Delivery.
DoE – Cons:
▪ DoE updates the EM&V model with “Estimated Savings” data.
▪ Management, oversight and coordination with third-party contractor is the DoE’s responsibility.
Program Owner(s) – Pros:
▪ Management, oversight and coordination with third-party contractor is the DoE’s responsibility.
▪ Continuous data tracking and reporting, allowing for adjustments and improvements.
▪ Potentially a single third-party evaluator for all programs.
Program Owner(s) – Cons:
▪ More complex process with multiple data collection reporting requirements.


5.4.5 EM&V Process Recommended Option


Ideally for Abu Dhabi, an EM&V process where the Program Owners contract the third-
party evaluator(s) with frequent reporting would be adopted (i.e., Option B), in order to:

1. Create ownership and accountability of the results for the Program Owners;
2. Enhance Program Owners’ knowledge of their programs;
3. Provide the DoE with the necessary oversight of the programs; and
4. Ultimately create better programs.

However, given the infancy of EM&V in the Emirate, the recommendation is for Abu
Dhabi to go through a transition period of three years, during which the DoE guides the
EM&V efforts and cements them within the DSM Strategy by hiring the third-party
evaluator(s) on behalf of the Program Owners, with frequent reporting of the data by
the latter (i.e., Option D). This recommendation also reflects the fact that there are
non-utility Program Owners who would potentially require additional energy expertise
to oversee the EM&V activities. Beyond this transition period, the Program Owners
would contract the third-party evaluator(s) with frequent reporting (i.e., Option B).

In order to facilitate the implementation of the EM&V process beyond the transition
period, it is recommended that in the final year of the transition period (i.e. year 3), the
Strategy owner in consultation with the Program Owners, the appointed transition
period third-party evaluator(s) and other key stakeholders, develop and align on an
action plan for EM&V activities beyond the transition period. This will allow the Abu
Dhabi EM&V stakeholder ecosystem to adopt lessons learnt from the transition period,
assess the readiness of Program Owners to take third-party contracting
responsibilities and ensure the robustness of EM&V activities going forward.

The action plan will also be accompanied by a Terms of Reference (TOR) outlining
the responsibilities and resourcing implications (if applicable) for activities beyond the
transition period. This action plan and TOR will require sign off from all participating
Program Owners in order to demonstrate and acknowledge awareness of the various
responsibilities of the Program Owners beyond the transition period.


Additionally, it is recommended that the DoE carries out strategy-level evaluations
for the entire DSM Strategy at an appropriate frequency (every 3/4/5 years). These
evaluations examine the overall cost-effectiveness of the programs, analyse successful
programs, identify opportunities for improvement and provide recommendations on
which programs or initiatives should be continued or discontinued. During the
transition period, the DoE will fund the EM&V activities for some or all Program
Owners.

5.4.6 EM&V Process Roles and Responsibilities


The EM&V process roles and responsibilities during the transition period are outlined
in Table 7.


Table 7 EM&V process roles and responsibilities for the Transition Period

Parties: Strategy Owner (DoE), Program Owners, Program Stakeholders, Third-Party Evaluator.

Step 1: DoE contracts third-party evaluator(s). Responsible: DoE, Third-Party Evaluator.
Step 2: Third-party evaluator(s) develops strategy EM&V Plan and Impact Evaluation Plans with Program Owners for review and approval by the DoE. Responsible: DoE, Program Owners, Third-Party Evaluator.
Step 3: Program Owners develop M&V Plans with occasional review from Third-Party Evaluator(s). Responsible: Program Owners, Program Stakeholders, Third-Party Evaluator.
Step 4: DoE sends data collection sheet to the Program Owners. Responsible: DoE.
Step 5: Program Owners collect data and fill in data collection sheet in collaboration with Program Stakeholders. Responsible: Program Owners, Program Stakeholders.
Step 6: Program Owners report “Estimated Savings” data back to the DoE as per the assigned reporting frequency per program. Responsible: DoE, Program Owners.
Step 7: DoE integrates “Estimated Savings” data in EM&V model. Responsible: DoE.
Step 8: Third-party evaluator(s) verifies savings data, calculates EM&V adjustment factors and generates Impact Evaluation Report. Responsible: Third-Party Evaluator.
Step 9: DoE integrates evaluation results within EM&V Model to attain “Evaluated Savings”. Responsible: DoE.
Step 10: DoE and Program Owners review Evaluation Report and “Evaluated Savings”. Responsible: DoE, Program Owners.
Step 11: DoE produces Annual DSM Report and reports results to the Executive Council. Responsible: DoE.


5.4.7 EM&V Process Timeline


Given that EM&V is in its early stages in Abu Dhabi, the recommendation is to only
focus on generating “Estimated Savings” in the first year of Protocol implementation
(Table 8). Beyond that, the EM&V Process timeline is outlined in Table 9.

Table 8: EM&V Protocol timeline – First year of implementation


Timeline: Q1 2021 to Q2 2022.
1. DoE sends data collection sheet to the Program Owners.
2. Program Owners collect data and fill in data collection sheet in collaboration with Program Stakeholders.
3. Program Owners report “Estimated Savings” data back to the DoE as per the assigned reporting frequency per program.
4. DoE integrates “Estimated Savings” data in EM&V model.
5. DoE produces Annual DSM Report and reports results to ADEO.

Table 9: Protocol timeline – Subsequent years of implementation


Timeline: Q1 of Year n to Q2 of Year n+1.
1. DoE contracts third-party evaluator (procurement process starts in Q3 of Year n-1).
2. Third-party evaluator(s) develop Strategy Level EM&V Plans (as per EM&V Cycle) and Impact Evaluation Plans with Program Owners for review and approval by the DoE.
3. Program Owners develop M&V Plans with occasional review from Third-Party Evaluator(s).
4. DoE sends data collection sheets to the Program Owners.
5. Program Owners collect data and fill in data collection sheets in collaboration with Program Stakeholders.
6. Program Owners report “Estimated Savings” data back to the DoE as per the assigned reporting frequency per program.
7. DoE integrates “Estimated Savings” data in EM&V model.
8. Third-party evaluator(s) verifies savings data, calculates EM&V adjustment factors, and generates Impact Evaluation Report, which is reviewed and reported back to DoE.
9. DoE integrates evaluation results within EM&V Model to attain “Evaluated Savings”.
10. DoE and Program Owners review Impact Evaluation Report and “Evaluated Savings”.
11. DoE produces Annual DSM Report and reports results to ADEO.
Legend: standard steps; semi-annual data as per reporting frequency; full-year data for all programs.


5.5 EM&V Implementation


5.5.1 Third-Party Evaluation Frequency
The table below outlines the recommended frequencies of conducting third-party
evaluations for each of the DSM Programs.

Table 10: Summary of third-party evaluation frequency per program

• P2 Building Regulations: Compliance Rates generally do not vary year on year, so third-party evaluation is required after building codes are updated. Frequency: post building codes update, then as needed.
• P3 Street & Public Realm Lighting: P3 has low variability and contributes relatively low savings. Frequency: five years.
• P4 Efficient Cooling: EM&V is primarily required for non-DC cooling activities, and these activities are yet to be formalized into a program. Frequency: two years.
• P5 Energy Storage: EM&V is not required given that meter data is directly provided. Frequency: not required.
• P6 Standards & Labels: P6 contributes a significant share of savings, which justifies an annual EM&V exercise. Frequency: annual.
• P7 Building Retrofits: P7 contributes a relatively large share of savings, which justifies an EM&V exercise every two years. Frequency: two years.
• P8 Efficient Water Use & Re-Use: P8 contributes a significant share of water savings, which justifies an annual EM&V exercise. Frequency: annual.
• P9 Demand Response: Demand response activities are yet to be formalized. An EM&V plan should be developed in the first year; beyond that, EM&V activities every two years may be sufficient. Frequency: two years.
• P10 Rebates & Behavioral Change: Behavioral change is an area that requires more effort to build credibility and verify savings. Moreover, given that each program area is inherently different, an annual EM&V is justified. Frequency: annual.

5.5.2 EM&V Funding


EM&V activities are generally considered a part of the DSM Programs’ implementation
and administration costs. Therefore, for utility-administered programs, the program
funding (including EM&V costs) flows through the existing price control mechanism
that is currently adopted in Abu Dhabi. Although the EM&V budgets are part of the
program’s costs, there should be a clear separation of duties between the program
implementers/administrators and the program evaluators so that savings are accounted
for without bias, hence the need for an independent third-party evaluator.

As mentioned previously, the third-party evaluator could be hired either by the


regulator or the Program Owners, depending on the regulatory context and
requirements. In both cases, the third-party evaluation activities are typically funded
by the Program Owners through the allocated budget. If the regulator funds the EM&V
activities, this is considered a form of subsidy.

A reasonable and acceptable M&V budget can be defined as follows16:

• The cost of M&V should (preferably) be between 2-15% of the implementation


cost. This rule applies to the initial set-up cost of M&V in year 1 (acquisition of
meters, development of M&V Plan, link-up to M&V center for data-warehousing,
etc.).

• In addition, the subsequent annual M&V costs should ideally not exceed
roughly 10% of the expected annual saving.

The overall reasonable cost for performing M&V is usually deemed to be 5% or less
of the total building retrofit project costs.
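For illustration only, and assuming the guideline percentages above (all monetary figures below are hypothetical), such a budget check might look like this:

```python
# Illustrative check of the M&V budget guidelines above; all figures are hypothetical.

def mv_budget_is_reasonable(setup_cost: float, implementation_cost: float,
                            annual_mv_cost: float, expected_annual_saving_value: float) -> bool:
    """Return True if the M&V budget falls within the guideline ranges."""
    setup_share = setup_cost / implementation_cost                  # guideline: roughly 2-15%
    annual_share = annual_mv_cost / expected_annual_saving_value    # guideline: about 10% or less
    return 0.02 <= setup_share <= 0.15 and annual_share <= 0.10

# Example: AED 60,000 M&V set-up on an AED 1,000,000 retrofit,
# with AED 25,000/year M&V costs against AED 300,000/year of expected savings.
print(mv_budget_is_reasonable(60_000, 1_000_000, 25_000, 300_000))  # True
```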

The performance incentive for DSM savings does not have to be based on kWh or
m3 savings goals in Abu Dhabi, at least during the initial stages of the DSM Programs’
implementation. Instead, the performance incentives could be driven by operational
targets (e.g., the number of ACs sold under rebates for P10, rather than the savings
generated by the A/C systems). Once data is more readily available and evaluation
activities are well established, the targets could shift to being based on kWh or m3.

At present, the DoE is developing the RC2 requirements in consultation with the
Distribution Companies, which include financial incentives linked to the successful
implementation of DSM initiatives. Once completed, the agreed funding mechanisms
and cost recovery mechanisms will be included in a future version of this Protocol.

16 Measurement & Verification Process for Calculating and Reporting on Energy and Demand Performance – General Guidance. Clean Energy Ministerial. August 2014.


5.5.3 Technical Reference Manuals

A Technical Reference Manual (TRM) is a document that outlines energy and water
efficiency measures, their expected savings as well as the associated savings
baselines and assumptions (either through Deemed Savings or engineering
algorithms). The TRMs are used by program administrators and implementers to
reduce EM&V costs and uncertainty when determining the “Estimated Savings”.
Evaluators will also use the information provided in TRMs to determine “Evaluated
Savings” values.
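As a purely illustrative sketch (the measure names and per-unit values below are hypothetical placeholders, not figures from any TRM), deemed-savings entries are typically applied by multiplying a per-unit value by the number of units installed:

```python
# Hypothetical deemed-savings lookup; values are placeholders, not TRM figures.
DEEMED_SAVINGS_KWH_PER_UNIT = {
    "LED lamp (residential)": 40,       # kWh/year per lamp, hypothetical
    "5-star split AC (1.5 ton)": 600,   # kWh/year per unit, hypothetical
}

def estimated_savings_kwh(measure: str, units_installed: int) -> float:
    """Estimated annual savings = deemed per-unit value x number of units installed."""
    return DEEMED_SAVINGS_KWH_PER_UNIT[measure] * units_installed

print(estimated_savings_kwh("5-star split AC (1.5 ton)", 1_000))  # 600000 kWh/year
```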

Many of the U.S. states have their own tailored TRMs in a PDF format. Some states
(e.g., California) have moved to an electronic TRM (eTRM), which offers better
documentation, is easier to manage and update, and is more user-friendly17.

The recommendation is that Abu Dhabi develops an eTRM, which will provide a
rigorous and credible estimate of the impact of a variety of energy and water
conservation measures, to support the EM&V efforts and cost-effectiveness estimates
going forward.

17 California eTRM: https://www.caetrm.com/


6 DSM Programs EM&V Methodologies


6.1 Program 2: Building Regulations
The scope of the Building Regulations program is the enforcement of Estidama’s Pearl
Rating System (PRS) by integrating it within the building permit process. The program
has been implemented since 2010, and an update to the PRS is planned for the future.

6.1.1 International Best Practice for Evaluation


There are two main approaches for evaluating building regulation programs, whether
new construction and/or whole-building retrofit programs:

• Calibrated Building Simulation Modelling (IPMVP Option D). Most new


construction regulation-based utility-funded programs in North America require
building energy models to be submitted for each new project. An evaluator then
verifies the savings for a sample of these projects to estimate the program-level
Realisation Rate. For example:

o In the US, the process for new construction projects to be granted


ENERGY STAR certification requires energy building simulation
modelling to determine energy savings.18

o In Canada, BC Hydro’s Commercial New Construction program has


established guidelines for project developers to submit energy building
simulation models in order to receive partial funding. 19 Similarly,
Efficiency Nova Scotia, the program administrator of energy efficiency
programs in Nova Scotia, has also created strict guidelines for building
simulation modelling.20

18 ENERGY STAR. Multifamily New Construction Certification Process. Available here:


https://www.energystar.gov/partner_resources/residential_new/program_reqs/mfnc_cert_process
19 BC Hydro, 2018. New Construction Program’s Energy modelling guideline. Available here:

https://www.bchydro.com/content/dam/BCHydro/customer-portal/documents/power-smart/builders-
developers/energy-modeling-guidelines.pdf
20 Efficiency Nova Scotia. New Construction flow chart of requirements. Available here: https://efficiencyns.ca/wp-

content/uploads/2016/09/New-Construction-Flow-Chart-VF.pdf


• Simulation-Based Savings Calculation (i.e., Prototype Buildings). For


macro analysis of newly developed building codes, energy and water savings
are determined by comparing two cases, one for the baseline (previous code)
and one for the comparison case (new code) using prototype buildings. 21 For
example, the US Department of Energy evaluates the impact of newly
developed building codes using prototype building models (developed with the
use of the EnergyPlus™ simulation model) and national weighting factors –
used to develop a breakdown of building types for each state.22
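A minimal sketch of this prototype-based comparison follows; the building types, energy intensities and floor-area weights below are hypothetical placeholders, not values from any code evaluation:

```python
# Hypothetical prototype-building comparison of an old vs. new building code.
# EUIs (kWh/m2/year) and floor-area weights (m2) are illustrative placeholders.
prototypes = [
    {"type": "Office",              "eui_old_code": 220, "eui_new_code": 180, "weight_m2": 1.5e6},
    {"type": "High-rise apartment", "eui_old_code": 190, "eui_new_code": 160, "weight_m2": 3.0e6},
]

# Savings = (baseline-code intensity - new-code intensity) x weighted floor area, summed over types.
savings_kwh = sum((p["eui_old_code"] - p["eui_new_code"]) * p["weight_m2"] for p in prototypes)
print(f"Estimated code-update savings: {savings_kwh / 1e6:.0f} GWh/year")
```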

6.1.2 Recommended Savings Estimation Approach

The recommended approach is to use simulation-based savings calculations for


prototype buildings to estimate savings from the Building Regulations program. This
recommendation is based on the difficulty of collecting measured energy and water
consumption for all new construction buildings, in addition to the challenge of requiring
project developers to submit project-specific building energy models.

The use of simulated consumption and savings for prototype buildings is an interim
solution that can be complemented in the future when, and if, measured consumption
data linked to individual buildings becomes available. Incorporating measured
consumption data into the analysis would require some level of calibration to be
performed on the simulated energy consumption data to better match measured
consumption.

While the recommendation is to use simulation-based savings, the Abu Dhabi


Benchmark Study collected energy and water consumption data for several hundred
buildings in Abu Dhabi. The result of this project will be a set of energy and water
intensity figures (kWh/m2/year and L/m2/year) by building type. The intention is to use
these measured consumption intensities in lieu of simulation-based consumption.

The recommended savings equation is presented below. The first part of the equation
is used to estimate energy and water savings from all new construction projects, and
the second part incorporates several EM&V adjustment factors. The energy and water

21 US DoE (PNNL), 2015, “Energy Modeling Building Codes Assistance Project”. Available here:
https://www.pnnl.gov/main/publications/external/technical_reports/PNNL-24269.pdf
22 US DoE. Energy Plus. Available here: https://www.energy.gov/eere/buildings/downloads/energyplus-0


intensities being collected are part of the first part and will be used to set the baseline
consumption and Estidama savings. Each parameter in the equation is described
below23.

• Baseline consumption (kWh/m2/year or L/m2/year): Baseline electricity
and water use intensity, by building type.

• Estidama Savings (%): Electricity and water savings from the Estidama PRS
regulations vs. the baseline, by building type.

• New GFA (m2): Total new gross-floor area (GFA) from new construction
projects, by building type.

• Occupancy Impact Factor (%): This parameter is used to account for the
impact building occupancy can have on energy and water use. For example, a
building at 50% occupancy will not have 50% of the energy and water
consumption of a building at 100% occupancy; rather, this may be closer to 70-
80%. Occupancy is made up of two parameters: macro occupancy and
occupancy ramp-up. The combined impact of these two parameters is used as
the basis for determining the impact on savings:

o Macro Occupancy (%): Overall building occupancy in Abu Dhabi as a


whole (e.g., on average, buildings in Abu Dhabi are 90% occupied).

o Occupancy Ramp up (%): Ramp up assumptions by building type (e.g.,


in month 1 (M1) after construction completion, high-rise apartments are
5% occupied, in M2 this increases to 10%, in M3 15%, etc.). Ramp-up
varies by building type. For example, villas may be fully occupied in a
matter of months, whereas high-rise apartment buildings may take up to
two years.

23Note: A depreciation factor/savings cut-off is not included in the calculation given the long lifetime of buildings
(50+ years) relative to the time horizon of the evaluation activity.


Occupancy ramp-up assumptions reach 100% for all building types.
These ramp-up schedules are multiplied by the overall “macro-
occupancy” factor.

• Weather Scaling Impact Factor (%): Parameter used to adjust energy savings
to account for the impact weather (temperature) can have on energy
consumption.

• Compliance Rate (%): Parameter used to reflect the adherence of project
developers to Estidama PRS regulations.
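Because the equation itself appears graphically in the Protocol, the following is only an indicative reconstruction from the parameter definitions above and may differ in detail from the official formula:

```latex
\text{Savings} \;=\;
\underbrace{\sum_{b\,\in\,\text{building types}} \text{Baseline}_{b} \times \text{Estidama Savings}_{b} \times \text{New GFA}_{b}}_{\text{Part 1: new-construction savings}}
\;\times\;
\underbrace{F_{\text{occupancy}} \times F_{\text{weather}} \times F_{\text{compliance}}}_{\text{Part 2: EM\&V adjustment factors}}
```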

6.1.3 Gross vs. Net Savings

The calculation of Net Savings will initially be equivalent to Gross Savings: both will
have a pre-Estidama (“business as usual”) baseline. In the future, when a new set of
new construction building regulations is adopted (“Estidama v2”), the current set of
Estidama PRS regulations (“Estidama v1”) will become the baseline for Net Savings,
whereas Gross Savings will continue to be calculated against a “business as usual”
baseline.

This changing baseline is illustrated in the table below. As shown, the baseline for Gross
Savings will always be “business as usual”, whereas the baseline for Net Savings will
be adjusted over time.

Table 11: Building regulations net and gross baseline evolution with Estidama v2

• Gross Savings: today baseline (with Estidama v1) = BAU; future baseline (with Estidama v2) = BAU.
• Net Savings: today baseline (with Estidama v1) = BAU; future baseline (with Estidama v2) = Estidama v1.

6.1.4 Data Governance


The data requirements and collection frequency are shown in the table below. Data
reporting frequency to DoE is defined as semi-annual to provide increased oversight
over this program, which contributes a significant share of total DSM Strategy
savings.


Table 12: Building Regulations Program - Data requirements and collection frequency

DSM Activity Data:
• Buildings Database (completion date, typology, GFA per building); Source: DMT; Collection Frequency: Bi-Annual.
• Baseline EUI & WUI and Estidama Savings (kWh/m2, L/m2, %); Source: DoE; Collection Frequency: As Needed.
EM&V Adjustments:
• Macro Occupancy Factor; Source: DoE; Collection Frequency: Annual.
• Occupancy Ramp Up; Source: DoE; Collection Frequency: Five Years.
• Cooling Degree Days; Source: DoE; Collection Frequency: Annual.
EM&V Outputs:
• Compliance Rates; Source: Third-Party Evaluation; Collection Frequency: Post building codes update, then as needed.

Data Flow

Figure 14: Building Regulations Program - Data Flowchart


6.1.5 Overlaps
The largest Overlaps occur with P6 Standards & Labels where, as a result of enforcing
the Estidama PRS, savings from installing an appliance/product (e.g., air-conditioning
units) under the labelling scheme would also contribute to P2 savings. P2 also has
Overlaps with P8 Efficient Water Use and Reuse and P4 Efficient Cooling in terms of
mandating TSE use and district cooling connections in new buildings respectively.

Table 13: Description of Overlaps in P2 Building Regulations

P4 Efficient Cooling:
• Overlap: New building regulations may mandate connection to a district cooling system.
• Approach: Full allocation; all district cooling savings will be counted in P4. In the future, EUIs may need to be adjusted to discount DC consumption.
• Sizing: Procedurally.
P6 Standards & Labels:
• Overlap: Savings from appliance/product standards (tracked under P6) may contribute to P2 savings.
• Approach: 50/50 allocation; appliances installed in new buildings will be allocated 50/50 across P2 and P6.
• Sizing: Manual allocation (50/50).
P8 Efficient Water Use & Re-Use:
• Overlap: New building regulations may mandate the use of TSE for landscaping.
• Approach: Full allocation; all TSE use in landscaping will be counted in P8.
• Sizing: Procedurally.


6.2 Program 3: Street & Public Realm Lighting

The scope of the Street & Public Realm Lighting program is the installation of new, and retrofitting of existing, public lighting in roadways, parks and other spaces across Abu Dhabi with high-efficiency LED bulbs.

6.2.1 International Best Practice for Evaluation

In North America, the evaluation scrutiny placed on street lighting projects is much lower than for other energy efficiency programs. Street lighting projects are less prominent and high-profile than customer-focused energy efficiency programs, and the magnitude of savings from street lighting projects is significantly lower. Nevertheless, general EM&V guidelines defined by North American regulators and electric utilities remain valid and appropriate for street lighting projects.

Several international organisations have developed very specific guidelines and recommendations for street lighting evaluations. For example, the UNFCCC defined evaluation guidelines for street lighting projects used as part of Clean Development Mechanism (CDM) projects.24 Similarly, the Alliance for an Energy Efficient Economy (AEEE) and the Investor Confidence Project (ICP) – two organisations promoting energy efficiency projects – have also created evaluation guidelines.25, 26

In general, best practice for the evaluation of street lighting projects is dependent on
the availability of existing metering infrastructure for street lighting luminaires.

• IPMVP Option C (Whole Facility) is recommended when entire street lighting cluster loads are connected to a single meter (with limited non-street lighting loads). In these cases, measured metering data can be retrieved and analysed easily.

24 UNFCCC, 2013, “Demand-side activities for efficient outdoor and street lighting technologies”. Available here:
https://cdm.unfccc.int/methodologies/DB/JXH8OI21V4PIQTL2WJLG6KJP5BTY3H
25 AEEE, 2015, “Preparation of Monitoring & Verification Protocols for Street Lighting”. Available here:
https://aeee.in/wp-content/uploads/2020/07/2015-Preparation-of-Monitoring-Verification-Protocols-for-Street-Lighting.pdf
26 ICP, 2018, “Streetlighting Protocol”. Available here:
https://europe.eeperformance.org/uploads/8/6/5/0/8650231/icp_street_lighting_v1.0.pdf


• Retrofit Isolation (IPMVP Options A/B) involves the use of power meters to take measurements of individual street lighting luminaires, or the installation of data-loggers to measure consumption over an extended period. This is recommended in situations where utilities do not have metering in place and the billing of street lighting loads relies on estimates rather than metered data.

6.2.2 Recommended Savings Estimation Approach

The recommended savings equation is presented below. The first part of the equation estimates energy savings based on the baseline and efficient wattage of the luminaire, the number of luminaires installed and their annual operating hours. The second part of the equation incorporates two EM&V adjustment factors: a Realisation Rate and a Net-to-Gross Ratio. An illustrative calculation is sketched after the variable definitions below.

The calculation of a Realisation Rate would involve the evaluation of a sample of street
lighting projects, beginning with desktop reviews of the database of projects, followed
by a review of engineering assumptions and interviews with project implementers. For
a small sample of projects, this would be complemented by on-site verification and
analyses of metering data, based on the availability of existing metering infrastructure.

• Baseline wattage (W): Wattage of the luminaire being replaced, or, if a new installation, the baseline wattage assumption.

• Efficient wattage (W): Wattage of the luminaire being installed.

• Number of lamps installed (units): Number of luminaires being replaced/installed.

• Operating Hours (hours): Number of operating hours in the year.

• Project Lifetime (years): Lifetime of the luminaire being installed; not explicitly used in the equation, but collected for the purpose of capping savings beyond lifetime.
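
As an illustration only, the sketch below combines the variables above with the two EM&V adjustment factors. The function name, the unit conversion to kWh and the example values (including the 0.95 Realisation Rate and 0.90 Net-to-Gross Ratio) are assumptions for illustration, not values defined by this Protocol.

```python
# Illustrative sketch of the street lighting savings calculation (assumed form).

def street_lighting_savings_kwh(baseline_wattage_w, efficient_wattage_w,
                                units_installed, operating_hours,
                                realisation_rate=1.0, net_to_gross_ratio=1.0):
    """Annual savings (kWh): engineering estimate adjusted by EM&V factors."""
    gross_kwh = ((baseline_wattage_w - efficient_wattage_w)
                 * units_installed * operating_hours) / 1000.0
    return gross_kwh * realisation_rate * net_to_gross_ratio

# Example: 1,000 luminaires retrofitted from 80 W to 20 W, operating 4,100 hours/year,
# with assumed EM&V adjustments of 95% (Realisation Rate) and 90% (Net-to-Gross Ratio).
print(street_lighting_savings_kwh(80, 20, 1000, 4100, 0.95, 0.90))  # 210330.0 kWh
```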


6.2.3 Gross vs. Net savings


To calculate Net Savings, a program-level Net-To-Gross Ratio may be applied. Alternatively, a project-level adjustment to the baseline wattage may be applied. For example, an 80W incandescent bulb may be replaced with a 20W LED bulb. To calculate Gross Savings, the 80W incandescent is taken as the fixed baseline to calculate annual savings. To calculate Net Savings, the baseline is adjusted upwards every few years, from an 80W incandescent to a 50W CFL and finally to a 20W LED. As a result, Net Savings decrease over time.

Table 14: Street lighting adjusted baseline general principle

Period | 2020 → 2026+
Gross Savings | Baseline: Incandescent (80W) throughout | Savings: Yes (80W – 20W = 60W) throughout
Net Savings | Baseline: Incandescent (80W), later adjusted to CFL (50W) and finally to LED (20W) | Savings: Yes (80 – 20 = 60W), then Yes (50 – 20 = 30W), then No (20 – 20 = 0W)

At a program level, the baseline may be adjusted much more gradually. Since Net Savings are intended to capture natural increases in efficiency (absent a DSM intervention), the magnitude of the adjusted baseline factors below should capture the efficiency increase actually observed year-on-year as low-efficiency luminaires reach the end of their life and are replaced, whether with a higher-efficiency luminaire or with a like-for-like model.

Table 15: Program level street lighting net and gross adjusted baseline

Year | 2020 | 2021 | 2022 | 2023 | 2024 | 2025 | 2026
Gross Baseline | 100% | 100% | 100% | 100% | 100% | 100% | 100%
Net Baseline | 100% | 95% | 90% | 85% | 80% | 75% | 70%
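
A minimal sketch of one possible reading of Tables 14 and 15, in which the program-level net baseline factor scales the baseline wattage downwards each year while the gross baseline stays fixed; the wattages are the example values from Table 14 and the factors are those in Table 15, but how the factor is applied in practice would be confirmed during evaluation.

```python
# Illustrative only: per-luminaire gross vs. net demand savings under an
# annually adjusted (program-level) net baseline.

NET_BASELINE_FACTORS = {2020: 1.00, 2021: 0.95, 2022: 0.90, 2023: 0.85,
                        2024: 0.80, 2025: 0.75, 2026: 0.70}

def yearly_savings_w(year, baseline_w=80.0, efficient_w=20.0):
    """Return (gross_w, net_w) demand savings for a single luminaire."""
    gross = baseline_w - efficient_w                                       # fixed baseline
    net = max(baseline_w * NET_BASELINE_FACTORS[year] - efficient_w, 0.0)  # adjusted baseline
    return gross, net

for year in sorted(NET_BASELINE_FACTORS):
    print(year, yearly_savings_w(year))
```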


6.2.4 Data Governance


The data requirements and collection frequency are shown in the table below. Data
reporting frequency to DoE is defined as annual.

Table 16: Street & Public Realm Lighting Program - Data requirements and collection frequency

Category | Data | Data Source | Collection Frequency
DSM Activity Data | Street Lighting Database (completion date, reporting organisation, baseline wattage, efficient wattage, units installed, operating hours, operating lifetime) | DMT (ADM, AAM, DRM) | Annual
EM&V Outputs | Realisation Rate; Net-To-Gross Ratio | Third-Party Evaluation | Five Years

Data Flow

Figure 15: Street & Public Realm Lighting Program - Data Flowchart


6.2.5 Overlaps
On rare occasions, there might be Overlaps between P3 and P7 Building Retrofits, where retrofit activities extend to cover street lighting upgrades or installations.

Table 17: Description of Overlaps in Street & Public Realm Lighting Program

P7 Building Retrofits
Overlaps | While rare, retrofit projects may include street lighting installations
Approach | Full Allocation: Street lighting retrofits will be allocated 100% to P3 and not to P7
Sizing | Procedurally


6.3 Program 4: Efficient Cooling


The scope of the Efficient Cooling program is to encourage the adoption of efficient cooling technologies and practices in Abu Dhabi. The scope of the program was initially focused exclusively on the adoption of District Cooling (DC) through the implementation of a regulatory framework promoting DC infrastructure. More recently, the program has expanded to reflect a more holistic view of all cooling systems and incorporates other efficient cooling initiatives, such as:

• Cooling appliances (also captured as part of P6 Standards & Labels);

• Cooling retrofit projects (also captured as part of P7 Building Retrofits); and

• Improved cooling system management.

This section focuses exclusively on two initiatives, District Cooling and Improved Cooling System Management. The other two initiatives, cooling appliances and cooling retrofit projects, are covered in their respective sections of this report (the Program 6 and Program 7 sections, respectively).

6.3.1 District Cooling


International Best Practice for Evaluation

In the context of DSM, district energy projects – and in the GCC context, DC projects – stand out compared to more traditional DSM energy conservation measures like efficient appliances or lighting retrofit projects. The nature of DC projects as large infrastructure projects with relatively long development lead times means that project developers justify them in their own right as net-positive commercial business cases. As such, there is no involvement from energy utilities, no need for utility-sponsored DSM incentives to justify the projects, and in turn no requirement for EM&V activities and the evaluation of energy savings.

Nevertheless, the core concepts of EM&V practices are still valid and should be applied in the context of DC projects in order to estimate electricity savings. For example, the “baseline” measure can be defined as the mix-market average of distributed A/C systems, while the “efficient” measure should be defined as the DC system.


Incentive-Based Programs

The closest comparable DSM projects to DC in North America are combined heat and power (CHP) systems. While different in nature and scale, CHP systems are also facility-level measures. A key difference, however, is that CHP programs in North America are often utility-sponsored, incentive-based programs. CHP developers receive a financial incentive (e.g., $/kWh or $/kW) in return for the electricity and gas savings resulting from the installation of CHP systems. As a result, there is a direct relationship between the financial incentive offered by the program and a CHP system installed by the business.

As illustrated by the figure below, the EM&V approach for CHP systems is at the facility level. Savings are reported to the program owner by individual facilities/buildings. A third-party evaluator will then conduct an evaluation consisting of multiple levels of review, beginning with an engineering desk review, interviews and surveys, assessment of pre- and post-installation metering and on-site M&V audits. This multi-layer evaluation results in an estimated Realisation Rate (%) and a Net-to-Gross Ratio (%).

Market-Transformation Programs

Abu Dhabi’s DC initiative is not an incentive-based program. Rather, the DC initiative is designed to influence overall market adoption of DC systems (e.g., the share of technically feasible use cases that adopt DC systems) and the average efficiency of DC systems. This type of DSM initiative is characterised as a market transformation program and, as a result, the estimation of savings is not performed at the facility level but rather at the cooling market level, using a top-down savings approach. This is illustrated by the equation below, with two sources of savings: savings from growth (e.g., growth in the cooling energy demand met by DC systems) and savings from efficiency (e.g., increases in the efficiency of DC systems).

Recommended Savings Estimation Approach


Electricity Savings Estimation Approach
The recommended methodology is to attribute savings to the DC initiative from two components: growth and efficiency. These two components are illustrated graphically by the figure below, over two periods: before the launch of the initiative (denoted as “business as usual”) and after the launch (denoted “DSM Strategy”).

• Savings from Growth: Since the initiative was launched, has DC cooling
output increased above historical baseline levels?

• Savings from Efficiency: Since the initiative was launched, has the
efficiency of DC cooling systems improved compared to historical baseline
efficiencies?

Each of these savings components requires its own baseline definition. Figure 16 provides an example of how the growth baseline of the DC initiative is defined: an 11% market share for DC cooling output compared to the overall cooling output from all cooling systems in Abu Dhabi. If the market share increases above 11%, this increase would be attributed as savings to the DC initiative. If the market share does not rise above 11%, no savings would be attributed.

An equivalent graph could be drawn to illustrate the efficiency baseline. If the overall
efficiency of all DC systems (both from the baseline output and above the baseline)
increased above the baseline, this increase would be attributed as savings to the DC
initiative.


Figure 16: Illustration of the district cooling’s DC output growth baseline


The detailed savings equation for growth and efficiency savings is presented below, along with descriptions of each variable27. Each variable is categorised as either ‘actual data’ or ‘baselines’. Actual data would be collected annually from DC operators, while baselines are calculated based on historical data. An illustrative calculation is sketched after the variable definitions below.

• Actual DC Output (TRh): Actual DC cooling output from all DC operators (yearly data).

• Baseline DC Output (TRh): Projection of baseline DC output calculated based on the historical DC market share baseline (e.g., 11%) and the actual total cooling output from DC and non-DC cooling systems (TRh). As Abu Dhabi continues to grow, actual total cooling output (from DC and non-DC systems) grows as well. This in turn means the baseline DC output projections will also grow over time.

27Note: A depreciation factor/savings cut-off is not included in the calculation given the long lifetime of DC projects (equivalent to
building lifetimes of 50+ years) relative to the time horizon of the evaluation activity.


• Baseline DC Efficiency (kWh/TRh): Projection of baseline DC efficiency calculated by dividing historical electricity usage from DC operators (kWh) by historical DC cooling output (TRh) supplied to customers. If historical data shows DC efficiency improving over time, then baseline projections will also show an improving efficiency.

• Baseline Non-DC Efficiency (kWh/TRh): Projection of baseline non-DC efficiency determined based on the mix-market average efficiency of A/C systems in Abu Dhabi. If historical data shows A/C system efficiency improving over time, then baseline projections will also show an improving baseline. The baseline non-DC efficiency should be consistent with the baseline defined for A/C units as part of Program 6: Standards & Labels.

• Actual DC Output (TRh): Same as above.

• Actual DC Efficiency (kWh/TRh): Determined by dividing actual electricity usage from DC operators (kWh) by actual DC cooling output (TRh) supplied to customers.

• Baseline DC Efficiency (kWh/TRh): Same as above.
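
The Protocol presents the detailed equation as a figure, which is not reproduced here; the sketch below is an assumed reconstruction of the growth and efficiency components from the variable definitions above, with purely illustrative input values.

```python
# Assumed reconstruction of the district cooling savings components (illustrative).

def dc_growth_savings_kwh(actual_dc_output_trh, baseline_dc_output_trh,
                          baseline_non_dc_eff_kwh_per_trh, baseline_dc_eff_kwh_per_trh):
    """Electricity saved because cooling output above the baseline DC market share
    is served by DC rather than by baseline (non-DC) A/C systems."""
    extra_dc_output = max(actual_dc_output_trh - baseline_dc_output_trh, 0.0)
    return extra_dc_output * (baseline_non_dc_eff_kwh_per_trh - baseline_dc_eff_kwh_per_trh)

def dc_efficiency_savings_kwh(actual_dc_output_trh,
                              baseline_dc_eff_kwh_per_trh, actual_dc_eff_kwh_per_trh):
    """Electricity saved because DC systems operate more efficiently than the
    historical DC efficiency baseline."""
    return actual_dc_output_trh * (baseline_dc_eff_kwh_per_trh - actual_dc_eff_kwh_per_trh)

# Example with illustrative values: 11% baseline share of 100 million TRh total cooling output.
baseline_dc_output = 0.11 * 100e6   # TRh
print(dc_growth_savings_kwh(13e6, baseline_dc_output, 1.6, 1.0))   # kWh from growth
print(dc_efficiency_savings_kwh(13e6, 1.0, 0.95))                  # kWh from efficiency
```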

District Cooling Water Demand Change Estimation Approach

Alongside the electricity savings attributed to DC initiatives, whether growth or efficiency related, there is an associated change in water demand. For example, as DC cooling output increases above historical baseline levels (i.e., growth), an increase in water demand will also be observed, given the increased water-cooled heat rejection requirements. On the other hand, there could also be a reduction in water demand in existing DC plants as a result of more efficient water use (e.g., less make-up water). The recommended methodology to quantify the water demand change attributed to the DC program follows the same principles as, and is linked with, the electricity savings estimation approach described above.

Figure 17 illustrates the link between DC cooling output in TRh and the associated water demand in m3. This link is established through a baseline Water Demand Factor (m3/TRh) using historical TRh and m3 data reported by DC operators. As previously mentioned, if there is an overall increase in the DC market share above the baseline (11%) – i.e., savings can be attributed to the DC initiative – this baseline Water Demand Factor can then be used to calculate the annual change in water demand that can be directly attributed to the program. If the market share does not rise above 11%, no water demand change would be attributed.

Figure 17: Illustration of the link between district cooling’s DC output growth and the DC water demand

The detailed equations to estimate the change in water demand from DC growth and water use efficiency are presented below, with descriptions of each variable28. Each variable is categorised as either ‘actual data’ or ‘baselines’. Actual data would be collected annually from DC operators, while baselines are calculated based on historical data. An illustrative calculation is sketched after the variable definitions below.

28Note: A depreciation factor/savings cut-off is not included in the calculation given the long lifetime of DC projects (equivalent to
building lifetimes of 50+ years) relative to the time horizon of the evaluation activity.


• Actual DC Output (TRh): Actual DC cooling output from all DC operators (yearly data).

• Baseline DC Output (TRh): Projection of baseline DC output calculated based on the historical DC market share baseline (e.g., 11%) and the actual total cooling output from DC and non-DC cooling systems (TRh). As Abu Dhabi continues to grow, actual total cooling output (from DC and non-DC systems) grows as well. This in turn means the baseline DC output projections will also grow over time.

• Baseline Water Demand Factor (m3/TRh): Projection of baseline water demand factor calculated by dividing historical water consumption from DC operators (m3) by historical DC cooling output (TRh) supplied to customers.

• Actual DC Output (TRh): Same as above.

• Actual Water Demand Factor (m3/TRh): Determined by dividing actual water consumption from DC operators (m3) by actual DC cooling output (TRh) supplied to customers.

• Baseline Water Demand Factor (m3/TRh): Same as above.
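
As with the electricity equations, the detailed water demand equations are presented in the Protocol as figures; the sketch below is an assumed reconstruction from the variable definitions above.

```python
# Assumed reconstruction of the district cooling water demand change components.

def dc_water_demand_increase_m3(actual_dc_output_trh, baseline_dc_output_trh,
                                baseline_water_demand_factor_m3_per_trh):
    """Additional water demand attributed to DC output growth above the baseline."""
    extra_output = max(actual_dc_output_trh - baseline_dc_output_trh, 0.0)
    return extra_output * baseline_water_demand_factor_m3_per_trh

def dc_water_efficiency_change_m3(actual_dc_output_trh,
                                  baseline_water_demand_factor_m3_per_trh,
                                  actual_water_demand_factor_m3_per_trh):
    """Water demand reduction (positive) from more efficient water use in DC plants."""
    return actual_dc_output_trh * (baseline_water_demand_factor_m3_per_trh
                                   - actual_water_demand_factor_m3_per_trh)
```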

Gross vs. Net Savings

The description of the savings methodology and baseline variables above is framed in
the context of calculating Net Savings. For example, the baseline DC efficiency and
the baseline non-DC efficiency may both be moving baselines that improve over time.
Table 18 describes the differences in the definition of the baseline variables in the
context of Gross Savings and Net Savings. In general, Gross Savings are determined
when baselines are fixed, while Net Savings reflect moving or adjusted baselines.


Table 18: Definition of district cooling baseline variables in the context of Net Savings and Gross Savings

Variable | Gross Savings | Net Savings
Baseline DC Output (TRh), calculated by multiplying the DC market share (%) by actual total cooling output (TRh) | Based on a fixed DC market share (e.g., fixed at 11%) | Based on an adjusted DC market share (e.g., if historical data shows the DC market share increasing by 1% per year: 11% in 2013, 12% in 2014, etc.)
Baseline DC Efficiency (kWh/TRh) | Based on a fixed baseline of DC efficiency | Based on a moving baseline of DC efficiency, assuming historical data reflects a changing baseline (e.g., a historical increase in DC efficiency over time)
Baseline Water Demand Factor (m3/TRh) | Based on a fixed baseline of water demand factor | Based on a moving baseline of water demand factor, assuming historical data reflects a changing baseline (e.g., a historical increase in the water demand factor over time due to growth in DC market share)
Baseline Non-DC Efficiency (kWh/TRh) | Based on a fixed baseline of non-DC efficiency | Based on a moving baseline of non-DC efficiency, assuming historical data reflects a changing baseline (e.g., a historical increase in non-DC efficiency over time)

6.3.2 Improved Cooling System Management


Recommended Savings Estimation Approach
The scope and structure of the improved cooling system management initiative of the Efficient Cooling program is yet to be designed. For the purposes of defining a savings methodology, a simplified structure is recommended, as shown below, with two forms of savings calculation (a brief sketch follows the list):

• Estimated Savings: Savings estimated based on a Deemed Savings value (kWh/m2). Savings would be estimated based on the floor area with improved cooling system management (m2) and the expected savings (kWh/m2).

• Reported Savings: Alternatively, savings may be reported directly by commercial and/or industrial facilities with improved cooling system management. A Realisation Rate (%) and a Net-to-Gross Ratio (%) would have to be determined for this set of projects.
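
A minimal sketch of the two calculation routes, assuming a deemed value and EM&V factors that would only be set once the initiative is designed (all names and values are illustrative):

```python
# Illustrative sketch of the two savings calculation routes described above.

def estimated_savings_kwh(floor_area_m2, deemed_kwh_per_m2):
    """Estimated Savings: deemed value applied to the participating floor area."""
    return floor_area_m2 * deemed_kwh_per_m2

def adjusted_reported_savings_kwh(reported_kwh, realisation_rate, net_to_gross_ratio):
    """Reported Savings: facility-reported savings adjusted by EM&V factors."""
    return reported_kwh * realisation_rate * net_to_gross_ratio

print(estimated_savings_kwh(50_000, 12.0))               # e.g., 50,000 m2 at 12 kWh/m2
print(adjusted_reported_savings_kwh(600_000, 0.9, 0.8))  # illustrative adjustment factors
```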


6.3.3 Data Governance


The data requirements and collection frequency are shown in the table below. Data
reporting frequency to DoE is defined as annual.
Table 19: Efficient Cooling Program - Data requirements and collection frequency

Category | Data | Data Source | Collection Frequency
DSM Activity Data | Baseline DC Data (baseline DC & non-DC efficiency, baseline DC market share, baseline DC water demand factor) | DC Developers | Annual
DSM Activity Data | Actual DC Data (actual DC efficiency, actual DC market share, actual cooling demand, actual DC water demand factor) | DC Developers | Annual
DSM Activity Data | Efficient Cooling Floor Space (m2 with efficient cooling practices, or Reported Savings) | Facility Management | Annual
DSM Activity Data | Efficient Cooling Electricity Savings (kWh/m2) | DoE | As Needed
EM&V Outputs | Efficient Cooling Net-to-Gross Ratio; Efficient Cooling Realisation Rate | Third-Party Evaluation | Two Years

Data Flow

Figure 18: Efficient Cooling Program - Data flowchart


6.3.4 Overlaps
P4 has Overlaps with four other DSM Programs given the cooling-based focus of the program. First, savings from efficient A/C units under P6 Standards & Labels also contribute to P4 savings. These savings also Overlap with P10 Rebates & Behavioral Change if the upgrades were purchased under the A/C rebate scheme. New construction or retrofit projects may also include connections to a DC system, thus creating Overlaps with P2 Building Regulations and P7 Building Retrofits, respectively.

In the Overlaps between P4 and P7, savings from DC installations in new construction
buildings (in other words non-retrofit DC projects) should be allocated to P4 while
savings from DC retrofit projects (reported directly by ADES) should be allocated to
P7.

Table 20: Description of Overlaps in the Efficient Cooling Program

P2 Building Regulations | P6 Standards & Labels | P7 Building Retrofits | P10 Rebates & Behavioral Change
Overlaps | New building regulations may mandate connection to a district cooling system | Savings from A/C standards (tracked under P6) may contribute to P4 savings | Retrofit projects may be district cooling projects | Savings from A/C rebates (tracked under P10) contribute to P4 savings
Approach | Full Allocation: All district cooling savings will be counted in P4. In the future, EUIs from P2 may need to be adjusted to discount DC consumption. | Full Allocation: All savings from A/C standards will be allocated to and counted in P6 | Full Allocation: All DC savings from retrofits will be allocated to and counted in P7. For clarity, DC projects in new construction buildings are not allocated to P7 but to P4. | Full Allocation: All savings from A/C rebates will be allocated to and counted in P10
Sizing | Procedurally | Manual Allocation | Procedurally | Procedurally


6.4 Program 5: Energy Storage

The scope of the Energy Storage program includes developing energy storage infrastructure to support the reliability and sustainability of the network. In 2019, the DoE inaugurated the world’s largest virtual power plant, comprising 12x 4 MW and 3x 20 MW battery energy storage systems, for a total of 108 MW.

6.4.1 International Best Practice for Evaluation

Generally, utility-sited or customer-sited storage initiatives do not undergo the same level of evaluation scrutiny as traditional DSM programs. The evaluation of demand and energy impacts (due to round-trip efficiency losses) of storage initiatives requires an analysis of hourly meter data to assess the hourly charge / discharge cycles of the storage systems. This is the main reason why storage programs may not require any third-party evaluation or verification of savings: demand and energy impacts are acquired directly by analysing meter data. Additionally, unlike many other DSM programs, determining the demand and energy impacts of a storage system does not require any estimation, nor the use of engineering assumptions, since the actual meter data is readily available. The main steps in determining demand and energy impacts are the following:

1. Collect hourly/sub-hourly meter data from each battery energy storage system (BESS) facility;

2. Analyse hourly dispatch energy flows (charge and discharge cycles) to determine demand savings;

3. Cumulative energy discharge and charge cycles will determine energy losses.

6.4.2 Recommended Savings Estimation Approach

The energy storage program in Abu Dhabi has the following characteristics:

1. A limited number of batteries make up the virtual power plant (15 in total);

2. The batteries are all localised and are connected to the distribution network; and


3. The charging and discharging operations follow a pre-defined charge/discharge weekday schedule, with small variations between summer and winter, with the objective of shifting demand between day and night.

Based on the above, the recommended approach is to calculate demand savings by performing an hourly dispatch analysis. This can be performed by following these general guidelines (a brief sketch follows the list):

1. Collect and aggregate hourly metering data for each storage facility;

2. Identify system peak hours based on the current-year system demand profile. Ensure consistency with AADC / ADDC on the definition of system peak periods (e.g., top 5, 10 or 20 demand-hours of the year, a pre-defined period during summer weekdays, etc.);

3. Determine demand savings (kW) by calculating the average of the storage capacity dispatched during peak periods; and

4. Calculate energy losses (kWh) by subtracting the total annual electricity fed back into the grid from the electricity drawn from the grid.
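
A minimal sketch of the dispatch analysis under stated assumptions: `dispatch_mw` is an hourly series in which positive values are discharge to the grid and negative values are charging, and the peak-hour definition (top 10 system demand-hours) is illustrative and would need to be agreed with AADC / ADDC.

```python
# Illustrative sketch of the hourly dispatch analysis (assumed data structures).
from statistics import mean

def storage_demand_and_losses(system_load_mw, dispatch_mw, peak_hours=10):
    # Steps 1-2: identify system peak hours from the current-year demand profile.
    peak_idx = sorted(range(len(system_load_mw)),
                      key=lambda h: system_load_mw[h], reverse=True)[:peak_hours]
    # Step 3: demand savings = average storage capacity dispatched during peak hours.
    demand_savings_mw = mean(max(dispatch_mw[h], 0.0) for h in peak_idx)
    # Step 4: energy losses = electricity drawn from the grid minus electricity fed back.
    charged_mwh = sum(-d for d in dispatch_mw if d < 0)
    discharged_mwh = sum(d for d in dispatch_mw if d > 0)
    return demand_savings_mw, charged_mwh - discharged_mwh
```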

6.4.3 Gross vs. Net Savings

The savings from the energy storage program are inherently net. This is because
absent the energy storage program, there would have been no organic storage
adoption.

6.4.4 Data Governance

No EM&V is required for the Energy Storage Program given that the storage data from the 15 sites is measured, metered data and no activities are taking place behind the meter. Data reporting frequency to DoE is defined as annual.

Table 21: Energy Storage Program - Data requirements and collection frequency

Category | Data | Data Source | Collection Frequency
DSM Activity Data | Hourly System Load (MW); Hourly Storage Dispatch (MW) | EWEC | Annual


Data Flow

Figure 19: Energy Storage Program - Data flowchart


6.4.5 Overlaps
No Overlaps exist between the Energy Storage Program and the nine other Abu Dhabi
DSM Programs.


6.5 Program 6: Standards & Labels

The scope of the Standards & Labels program is the implementation of ESMA’s
Minimum Energy Performance Standards (MEPS) for a number of appliances and
products including ACs, refrigerators and freezers, washing machines and dryers,
water heaters, dishwashers and water fixtures.

6.5.1 International Best Practice for Evaluation

Similar international standards and labels programs exist in many jurisdictions, with varying scopes in terms of the customer sectors targeted (e.g., residential, commercial and industrial equipment) and the number of appliances / products covered. For example, the ENERGY STAR program in the US covers over 70 different product types across all customer segments, while the European Commission and India’s Bureau of Energy Efficiency standards programs cover a narrower set of appliances / products, with approximately 25 different product types each.

In the US, the Department of Energy (US DOE) prepares Technical Support Documents (TSDs) supporting each appliance-specific standards ruling. TSDs are stand-alone reports providing technical analyses and results for each appliance, including market and technology assessments, screening and efficiency analyses, and detailed energy consumption methodologies.

North American jurisdictions are the most mature in terms of savings evaluation activities associated with these programs. Many US states and Canadian provinces have established TRMs to create a consistent set of methodologies and assumptions employed in the estimation of savings from related energy and water efficiency programs. TRMs provide guidance on electricity and water savings calculations; they assist in planning, reporting and cost-effectiveness analysis; and, perhaps most importantly, they help create consistency across energy regulators, implementers, utilities and evaluators in the characterisation of appliance/product savings.

In general, savings methodologies are largely consistent from one TRM to another, with only assumptions changing across jurisdictions. For example, the savings equation for A/C units is consistent across TRMs; however, the assumption of how many hours an A/C system operates throughout the year changes.


6.5.2 Recommended Savings Estimation Approach

The recommended savings methodology for appliance products is made up of two parts. The first part is based on the standard methodology in TRMs, and the second part is made up of four parameters, two of which are EM&V adjustment parameters used to more accurately estimate savings (e.g., the lag from label issuance to installation of an appliance, and the Compliance Rate).

The following sub-sections describe these two parts of the savings methodology in detail.

TRM Savings Methodologies

The savings methodologies were determined based on a review of several North American TRMs:

• California Municipal Utilities Association (CMUA), 2017, Savings Estimation TRM29;

• Texas Public Utilities Commission (PUC), 2021, TRM v530;

• New York State Joint Utilities, 2020, Standard Approach for Estimating Energy Savings v731; and

• Arkansas Public Utilities Commission (PUC), 2018, TRM v8.132.

29 All CMUA TRM-relevant documents available here: https://www.cmua.org/energy-efficiency-technical-reference-manual
30 All Texas PUC TRMs are available here: http://www.texasefficiency.com/index.php/emv
31 NY Joint Utilities TRM available here: https://www3.dps.ny.gov/W/PSCWeb.nsf/96f0fec0b45a3c6485257688006a701a/72c23decff52920a85257f1100671bdd/$FILE/TRM%20Version%207%20-%20April%202019.pdf
32 Arkansas PUC TRM available here: http://www.apscservices.info/EEInfo/TRMV8.1.pdf


The electricity savings equations for each appliance product alongside their assumed
lifetime are presented in Table 22. The variables used in each savings equation are
also described below. The assumed lifetime for each appliance is used in the savings
estimations to cap the savings from the appliance beyond its lifetime.

Table 22: Savings equations for electricity appliances and equipment

Product | Electricity Savings Equation | Consumption Indicator | Lifetime (Years)
ACs (Split / Window) | Elec Savings = (Cooling Capacity × EFLH) ÷ ∆EER | EER | 10
Washing Machines / Dryers | Elec Savings = Capacity × ∆EER × Cycles | EER | 11
Refrigerators / Freezers | Elec Savings = SAEC × ∆EEI | EEI | 14
Dishwashers | Elec Savings = Ps × ∆EEI × Cycles | EEI | 10
Water Heaters | Elec Savings = (AEC/CapBaseline × Cap) – AEC | AEC | 15

A/C (Split / Window)

• Cooling Capacity (Btu/hour): The cooling capacity of the efficient A/C unit.

• EFLH (hours): The equivalent full load hours (EFLH) represents the number of
hours of operation of an A/C unit in a year if the unit was operating at full output.

• ∆EER (Btu-h/W): The Energy Efficiency Ratio (EER) represents the efficiency
level of A/C units. This parameter is the difference between the baseline EER
(the mix-market average EER) and the EER of the efficient A/C unit.

Washing Machines / Dryers

• Capacity (kg): The capacity of the efficient washer / dryer unit in kilograms.

• ∆EER (Wh/kg): The Energy Efficiency Ratio (EER) represents the efficiency level of the washer/dryer unit. This parameter is the difference between the baseline EER (the mix-market average EER) and the EER of the efficient washer/dryer unit.

• Cycles: Number of washing or drying cycles per year.


Refrigerators / Freezers

• SAEC: Standard Annual Energy Consumption (SAEC). A standard electricity consumption amount assigned to each unique type of refrigerator or freezer.

• ∆EEI: Energy Efficiency Index (EEI). Ratio of the electricity consumption of the refrigerator/freezer unit to the SAEC. A measure of how efficient the refrigerator/freezer unit is relative to the standardised SAEC amount.

Dishwashers

• Ps: Number of place settings of the efficient dishwasher unit (e.g., dishwasher capacity).

• ∆EEI: Energy Efficiency Index (EEI) relative to the standard electricity consumption of a specific type of dishwasher.

• Cycles: Number of dishwashing cycles per year.

Water Heaters

• AEC/CapBaseline (kWh/l): Annual electricity consumption per litre of capacity for the baseline water heater unit.

• Cap (L): Capacity of the efficient water heater in litres.

• AEC (kWh): Annual electricity consumption for the efficient water heater unit.

The water savings equations for each appliance product alongside their assumed
lifetime are presented in Table 23. The variables used in each savings equation are
also described below. The assumed lifetime for each appliance is used in the savings
estimations to cap the savings from the appliance beyond its lifetime.

Table 23: Savings equation for water appliances and equipment

Product | Water Savings Equation | Consumption Indicator | Lifetime (Years)
Washing Machines | Water Savings = (WC – WCBaseline) × Cycles | AWC | 11
Dishwashers | Water Savings = (Wt – Wt/PsBaseline × Ps) × Cycles | Wt | 10
Water Fixtures | Water Savings = ∆Consumption × Usage | Consumption | 10 (Faucets & Showerheads); 30 (Toilets)


Washing Machines

• WCBaseline (l/cycle): Water consumption per cycle for the baseline washing machine. WCBaseline is calculated differently for front- and top-loading washing machines, as follows:

  - Front-loading: WCBaseline (l/cycle) = 5 × Capacity + 35

  - Top-loading: WCBaseline (l/cycle) = 20 × Capacity + 35

• WC (l/cycle): Water consumption per washing cycle.

• Capacity (kg): The capacity of the efficient washing machine.

• Cycles: Number of washing cycles per year.

Dishwashers

• Wt (l/cycle): Water consumption (under test conditions) of the efficient dishwasher unit.

• Wt/PsBaseline (l/cycle per place setting): Water consumption (under test conditions) per cycle per place setting for the baseline dishwasher.

• Ps: Number of place settings of the efficient dishwasher unit (e.g., dishwasher capacity).

• Cycles: Number of dishwashing cycles per year.

Water Fixtures

• ∆Consumption (l/use): Difference in water consumption per use between the baseline and efficient water fixture.

• Usage (uses): Number of uses per year.


EM&V Adjustment Parameters

The second part of the savings equations is made up of four parameters, two of which are EM&V adjustment factors used to more accurately simulate the actual accrual of electricity and water savings.


EM&V Adjustment Factors:

• Installation Lag (months): A 9-month delay applied to the accrual of savings to account for the time between when a label is first issued (the ‘timestamp’ of ESMA data) and when the appliance/product is installed and begins operating.

• Compliance Rate (%): Fraction of appliances that meet standards. May include the impact of sub-standard equipment, mislabelling, among other factors.

Other Parameters:

• Labels Issued (labels): Number of labels issued for an appliance model.

• Abu Dhabi Population Share (%): Share of the UAE population living in Abu Dhabi. Used to prorate ESMA data (which reflects labels for the whole UAE) to Abu Dhabi.

Population data is used in lieu of appliance point-of-sale (PoS) data, since PoS data is currently unavailable. Going forward, however, PoS and business-to-business (B2B) data would need to be collected in order to accurately estimate the share (%) applied across appliance types. A sketch of how these parameters combine is shown below.
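
A hedged sketch of how these parameters could be combined with a per-unit TRM savings value; the per-unit value, dates, shares and the way the 9-month installation lag is applied (savings accruing from 9 months after the label issue date) are illustrative assumptions, not the Protocol's official savings model.

```python
# Illustrative application of the Standards & Labels adjustment parameters.
from datetime import date

INSTALLATION_LAG_MONTHS = 9  # delay from label issuance to installation/operation

def adjusted_annual_savings_kwh(per_unit_savings_kwh, labels_issued,
                                ad_population_share, compliance_rate):
    """Scale a per-unit TRM savings estimate to Abu Dhabi program-level savings."""
    return per_unit_savings_kwh * labels_issued * ad_population_share * compliance_rate

def savings_start_date(label_issue_date):
    """Savings accrue from 9 months after the label issue date (assumed treatment)."""
    month = label_issue_date.month - 1 + INSTALLATION_LAG_MONTHS
    return date(label_issue_date.year + month // 12, month % 12 + 1, 1)

# Example: 10,000 UAE labels issued in March 2021, an assumed 30% Abu Dhabi population
# share, 95% compliance and 500 kWh/year saved per unit (all values illustrative).
print(adjusted_annual_savings_kwh(500, 10_000, 0.30, 0.95))
print(savings_start_date(date(2021, 3, 15)))   # savings begin accruing December 2021
```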

6.5.3 Gross vs. Net Savings

The calculation of Net Savings can be made by applying an adjusted baseline to the
consumption indicator of each appliance type. If there is evidence that the average
efficiency of an appliance experienced a gradual increase over time (prior to the DSM
Program), then the baseline efficiency should be adjusted in future years. If there is
no evidence that appliance efficiencies increased in the absence of the DSM Program,
then the baseline should remain fixed. In this case, Gross Savings and Net Savings
would be equivalent.

Figure 20 below illustrates this with an example for an A/C unit. In this example, Gross Savings are estimated based on a fixed baseline of EER = 7.0. By comparison, Net Savings are calculated by incorporating an adjustment to the fixed baseline to reflect the continuation of the historical trend of increasing efficiency observed before the DSM Program launched.


Figure 20: Illustration of the adjusted baseline for electricity and water appliances

The table below provides a more detailed example focused on savings attributed over a 3-year period. This example assumes a split A/C unit with an EER of 8.0. Gross Savings are calculated based on a fixed baseline of 7.0: in every year of the analysis (2020 through 2022), savings are calculated based on an EER difference of 1.0.

To calculate Net Savings, the baseline is no longer fixed at 7.0. Rather, the baseline increases by 0.2 every year, to 7.2 in 2021 and 7.4 in 2022. As a result, savings are calculated based on an EER difference of 1.0 in 2020 (just as in the gross case), 0.8 in 2021 and 0.6 in 2022.

Example: Split A/C (EER=8.0)
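
The example table itself is not reproduced in this extract; the short sketch below renders the values described in the text above (efficient EER of 8.0, fixed gross baseline of 7.0, net baseline rising by 0.2 per year).

```python
# Worked rendering of the split A/C gross vs. net baseline example.

EFFICIENT_EER = 8.0
GROSS_BASELINE_EER = 7.0
NET_BASELINE_EER = {2020: 7.0, 2021: 7.2, 2022: 7.4}

for year, net_baseline in NET_BASELINE_EER.items():
    gross_delta = EFFICIENT_EER - GROSS_BASELINE_EER   # 1.0 in every year
    net_delta = EFFICIENT_EER - net_baseline           # 1.0, 0.8, 0.6
    print(year, gross_delta, round(net_delta, 1))
```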

6.5.4 Data Governance


The data requirements and collection frequency are shown in the table below. Data
reporting frequency to DoE is defined as semi-annual.


Table 24: Standards & Labels Program - Data requirements and collection frequency

Category | Data | Data Source | Collection Frequency
DSM Activity Data | Labels Data (model, issue date, no. of labels, stars, product category, model energy/water data) | ESMA & QCC | Semi-Annual
DSM Activity Data | Appliance Baseline Data (per-appliance baseline efficiency and consumption data) | ESMA & QCC | As Needed
DSM Activity Data | Global Parameters (label-to-market lag, AD population %, equipment lifetime) | ESMA, QCC & DoE | As Needed
EM&V Outputs | Compliance Rates (per appliance) | Third-Party Evaluation | Annual

Data Flow

Figure 21: Standards & Labels Program - Data Flowchart


6.5.5 Overlaps
P6 has Overlaps with four other DSM Programs. The largest Overlaps occur with P2
Building Regulations where, as a result of enforcing the Estidama standards, savings
from installing an appliance/product (e.g., A/C units) under P6 would also contribute
to P2 savings. Additionally, Overlaps exist between P6 and P4 Efficient Cooling, where
savings from efficient A/C units would also contribute to P4 savings. Moreover, with
P7 Building Retrofits, efficient appliances from P6 might also be part of retrofit projects.
Finally, since A/C units are part of P6, Overlaps exist with the A/C rebate program in
P10 Rebates and Behavioral Change.

Table 25: Description of Overlaps in P6 Standards & Labels

P2 Building Regulations | P4 Efficient Cooling | P7 Building Retrofits | P10 Rebates & Behavioral Change
Overlaps | Savings from appliance / product standards (tracked under P6) may contribute to P2 savings | Savings from A/C standards (tracked under P6) contribute to P4 savings | Retrofits may include appliances / products tracked and reported under P6 | Savings from A/C rebates are also tracked under P6
Approach | 50/50 Allocation: Appliances installed in new buildings will be allocated 50/50 across P2 and P6 | Full Allocation: All savings from A/C standards will be allocated to P6 | 50/50 Allocation: Allocate overlap savings 50/50 | Full Allocation: Savings from A/C rebates will be allocated to P10
Sizing | Manual Allocation | Manual Allocation | Manual Allocation | Manual Allocation


6.6 Program 7: Building Retrofits

The objective of the Building Retrofits program is to retrofit government, commercial and residential buildings to improve the energy efficiency of their cooling, lighting and water systems.

This is achieved through an energy performance contracting model, whereby an ESCO funds the efficiency improvement and recovers the implementation costs from utility bill savings.

6.6.1 International Best Practice for Evaluation

Best practice for the evaluation of building retrofit programs involves conducting a
multi-stage, nested-sampling assessment of a statistically significant sample of retrofit
projects. The objective of this assessment is to verify the accuracy of ESCO Reported
Savings.

Individual reporting of project savings is generally performed by the ESCOs. Those savings are commonly determined based on engineering calculations of installed electricity and water conservation measures and defined in an M&V Plan. These calculations may be calibrated / adjusted to account for changes in weather, occupancy and other relevant variables which may impact savings.

ESCO Reported Savings are adjusted with a Realisation Rate – to quantify the difference between Actual Savings and Reported Savings – and a Net-to-Gross Ratio – to calculate Net Savings from reported Gross Savings. This is shown by the equation below.
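
In general terms, the adjustment can be written as:

Net Savings = ESCO Reported (Gross) Savings × Realisation Rate (%) × Net-to-Gross Ratio (%)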

Realisation Rates are typically determined through a multi-stage, nested sampling assessment of ESCO projects. This assessment aims to increase sampling efficiency without compromising site-level evaluation rigour. The methodology begins with desktop reviews of the entire database of projects to perform project documentation checks. This activity is followed by a more in-depth review of


engineering assumptions (where appropriate) and a set of interviews with project implementers. This review is generally not performed for the entire population of projects but, nevertheless, for a larger set of projects. Finally, a smaller sample of projects would be screened and selected for on-site verification and analyses of metering data.

In general, this sort of sampling approach aims to select a statistically significant sample of projects targeting a relatively high level of confidence and precision, such as 90/10, 90/20 or 80/20. With a 90/10 target, the evaluation should aim to provide 90% confidence that the Evaluated Savings fall within 10% of the true savings. Striving for higher levels of confidence, although it can provide more credible results, comes at a cost: the level of confidence drives the required sample size, which in turn drives the level of analysis required and, ultimately, the cost. This cost may be justified for programs that are important to the portfolio (typically reflecting the level of spending and/or savings involved, the risk of not achieving program targets, or future program plans), but not for all programs. The choice of confidence level should balance the value of the information gained against the cost of the research needed to meet the desired confidence and precision levels. Tailoring the precision level to the programs and their importance within the portfolio can substantially reduce overall evaluation costs.
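For illustration only, the sketch below applies a standard simple-random-sampling size formula with a finite-population correction to show how the confidence/precision target drives the number of projects to evaluate; the assumed coefficient of variation of 0.5 is a common planning default, not a value prescribed by this Protocol.

    import math

    def required_sample_size(population: int, cv: float = 0.5,
                             z: float = 1.645, precision: float = 0.10) -> int:
        """Projects to sample for a confidence/precision target, e.g. 90/10.
        cv is the assumed coefficient of variation of project-level savings."""
        n0 = (z * cv / precision) ** 2        # infinite-population sample size
        n = n0 / (1 + n0 / population)        # finite-population correction
        return math.ceil(n)

    # Illustrative targets for a hypothetical portfolio of 200 retrofit projects
    for label, z, d in [("90/10", 1.645, 0.10), ("90/20", 1.645, 0.20), ("80/20", 1.282, 0.20)]:
        print(label, required_sample_size(200, z=z, precision=d))

Under these assumptions, the 90/10 target requires roughly three times as many on-site evaluations as 90/20 or 80/20, which is the cost trade-off described above.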

As part of this assessment, a Net-to-Gross Ratio is also determined based on feedback from facility managers collected during interviews and surveys to understand whether the ESCO project led to additional indirect energy efficiency activities.

Table 26: Nested sampling approach to estimate Realisation Rates33

| Method | # of Projects Covered | Key Questions Answered | Example Methods |
|---|---|---|---|
| Tracking System Review | Population | Is the tracking database sufficient to verify the savings? | Review database for comprehensiveness |
| Desktop Reviews | Large Sample | Are the high-level assumptions correct? Can outliers be identified? | Review documentation, conduct phone interviews |
| Onsite Verification | Small Sample | Are the site-level savings correct? | Measure actual equipment for hours of use, power, etc. |

33 NREL Sample Design Cross-Cutting Protocol (Link)


6.6.2 Abu Dhabi Measurement & Verification Protocol Guidance Document


In line with international best practice, the DoE issued the Abu Dhabi Measurement & Verification Protocol Guidance Document to support ESCOs in performing M&V for EPC projects34. This guideline outlines the fundamental principles and methodologies aligned with the IPMVP and provides templates to facilitate the preparation of M&V Plans and Reports. All building retrofit projects shall be implemented in adherence with this guidance document, and verified savings shall be reported to the DoE.

6.6.3 Recommended Savings Estimation Approach


The recommended approach to estimate electricity and water savings from the
Building Retrofits program requires a methodology to verify savings from the project
implementers. This is done by following the nested sampling approach outlined above
to calculate a Realisation Rate that is applied to the entire portfolio of building retrofit
projects. Additionally, the proposed methodology incorporates a degradation factor
that considers the impact of natural decay in savings after the ESCO contract duration.
Finally, a Net-to-Gross Ratio is applied to adjust reported Gross Savings to Net
Savings. The proposed methodology and the equation variables are presented below.

• Reported Energy and Water Savings (kWh or m3): Annual reported electricity and/or water savings for each retrofit project.
• Annual Degradation Factor (%): The estimated degradation in the baseline consumption, as well as in the electricity and water savings, due to natural decay post ESCO contract duration.
• Realisation Rate (%): EM&V adjustment factor determined via a statistically significant analysis of Actual Savings vs. Reported Savings.
• Net-to-Gross Ratio (%): EM&V adjustment factor used to calculate Net Savings from Gross Savings.

34 The Abu Dhabi M&V Protocol Guidance Document can be downloaded from:
https://www.doe.gov.ae/en/Publications


An additional parameter incorporated into the calculation of savings is a truncation period of 10 years. A truncation period stops the accrual of savings from retrofit projects after a defined number of years; otherwise, savings would accrue in perpetuity because the annual degradation factor only reduces savings by a small amount each year. Applying a truncation period of 10 years to energy and water savings means that retrofit projects continue to accrue savings for only 10 years after the termination of their respective contract period.
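A minimal sketch of how these parameters could be combined for a single project is shown below. The function, its inputs and the example values are illustrative assumptions for demonstration; in particular, the degradation factor is applied only to the savings stream here, which is a simplification of the description above.

    def annual_net_savings(reported_savings, realisation_rate, ntg_ratio,
                           degradation, years_after_contract, truncation_years=10):
        """Net savings credited in a given year after the ESCO contract ends (kWh or m3).
        Savings decay by the annual degradation factor and stop accruing once the
        truncation period is exceeded."""
        if years_after_contract > truncation_years:
            return 0.0  # truncation: no further savings are credited
        gross = reported_savings * (1 - degradation) ** years_after_contract
        return gross * realisation_rate * ntg_ratio

    # Example: 1 GWh/yr reported, 95% Realisation Rate, 105% NTG, 2% annual decay
    print(annual_net_savings(1_000_000, 0.95, 1.05, 0.02, years_after_contract=3))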

6.6.4 Gross vs. Net Savings

For this program, a Net-to-Gross Ratio (NTG) is used to calculate Net Savings from
the Gross Savings reported by the ESCOs to take into account Spill-over effects
resulting from the retrofit.

6.6.5 Data Governance

The data requirements and collection frequency are shown in the table below. Data
reporting frequency to DoE is defined as annual.

Table 27: Building Retrofits Program - Data requirements and collection frequency

| Category | Data | Data Source | Collection Frequency |
|---|---|---|---|
| DSM Activity Data | Retrofitted Building Project Data (completion date, reporting ESCO, customer sector, GFA, contract duration, reported electricity/water savings) | ESCOs | Annual |
| EM&V Outputs | Realisation Rate, Degradation Factor, Net-To-Gross Ratio | Third-Party Evaluation | Two Years |


Data Flow

Figure 22: Building Retrofits Program - Data Flowchart


6.6.6 Overlaps
P7 Overlaps with four other DSM Programs. First, appliances covered under the P6 Standards & Labels program might be included in retrofit projects. Furthermore, retrofit projects may also involve connecting buildings to a DC system, creating Overlaps with P4 Efficient Cooling. In the Overlaps between P4 and P7, savings from DC installations in new construction buildings (in other words, non-retrofit DC projects) should be allocated to P4, while savings from DC retrofit projects (reported directly by ADES) should be allocated to P7.

Moreover, if the retrofit project also involves retrofitting a building’s irrigation system,
Overlaps occur with P8 Efficient Water Use & Reuse. Finally, in rare occasions, there
might be Overlaps between P3 Street & Public Realm Lighting and P7 in case retrofit
activities extend to street lighting upgrades or new installations.


Table 28: Description of Overlaps in the Building Retrofits Program

| | P3 Street Lighting | P4 Efficient Cooling | P6 Standards & Labels | P8 Efficient Water Use / Re-Use |
|---|---|---|---|---|
| Overlaps | While rare, retrofit projects may include street lighting installations | Retrofit projects may be connected to DC | Retrofits may include appliances / products tracked under P6 | While rare, retrofit projects associated with irrigation applications |
| Approach | Full Allocation: Street lighting retrofits will be allocated 100% to P3 and not to P7 | Full Allocation: All DC savings from retrofit projects will be allocated to P7. For clarity, DC projects from new construction buildings are not allocated to P7 but to P4 | 50/50 Allocation: Allocate overlap savings 50/50 | Full Allocation: Irrigation retrofits will be allocated 100% to P7 |
| Sizing | Procedurally | Procedurally | Manual Allocation | Procedurally |


6.7 Program 8: Efficient Water Use & Re-Use

The objective of the Efficient Water Use & Re-Use program is to reduce desalinated
water use for outdoor irrigation systems and to reuse water more efficiently through
TSE use. These objectives are supported by several initiatives including the optimised
operation of irrigation systems, the upgrading / automation of systems and the
application of soil additives.

Today, DSM Program 8 focuses primarily on initiatives that create desalinated water
savings via:

• Efficient desalinated water use.

• TSE replacing desalinated water.

In the future, the scope of savings may be expanded to also capture savings from
initiatives focused on the efficient use of groundwater and TSE, as well as TSE
replacing groundwater.

6.7.1 International Best Practice for Evaluation

Globally, the water sector does not conduct as much EM&V as the energy sector. In general, this reflects the more challenging nature of evaluating water efficiency initiatives (e.g., less advanced metering infrastructure and the higher cost of submetering or spot-measuring equipment), subsidised and/or low water rates, the lack of regulatory requirements to evaluate such programs, and the perceived abundance of water resources. Nevertheless, when EM&V is applied to evaluate water conservation programs, those practices are largely consistent with EM&V practices in the energy sector.

In the context of irrigation and landscape initiatives, best practice involves submetering of indoor and/or outdoor water consumption, as appropriate. Submetering pre- and post-project implementation enables a comparison of water consumption before and after implementation. These analyses are calibrated by adjusting for weather, occupancy, changes in the landscape, and other relevant variables. Once a regression model is built, it can also be employed to estimate future water savings.


Since submetering can become an expensive evaluation approach, a lower-cost alternative to the installation of water meters is the use of acoustic meters, which can be wrapped around water pipes to log water flow data.

When submetering is technically unfeasible, a second-best method is regression modelling on the master water meter. Master-meter regression modelling is an appropriate evaluation approach when the vast majority of (or all) water use is outdoors – e.g., higher than 80-90%. If indoor water use accounts for a significant share of water consumption, master-meter modelling may not be appropriate. Variables relevant for regression modelling of outdoor water consumption include weather, rainfall and evapotranspiration. One way to control for changes in indoor water use is to log average daily occupancy (people per day).
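The sketch below shows one possible form of such a master-meter regression, fitted on pre-implementation data and used to predict a counterfactual baseline for the post-implementation period. The column names, the choice of ordinary least squares and the statsmodels dependency are illustrative assumptions, not a model specification required by this Protocol.

    import pandas as pd
    import statsmodels.api as sm

    PREDICTORS = ["evapotranspiration_mm", "rainfall_mm", "occupancy"]

    def fit_baseline_model(pre: pd.DataFrame):
        """Fit a weather-normalised baseline of daily water use (m3/day)
        from pre-implementation master-meter data."""
        X = sm.add_constant(pre[PREDICTORS])
        return sm.OLS(pre["daily_m3"], X).fit()

    def estimated_savings_m3(model, post: pd.DataFrame) -> float:
        """Savings = counterfactual baseline predicted under post-period
        conditions minus metered post-period consumption, summed over days."""
        baseline = model.predict(sm.add_constant(post[PREDICTORS]))
        return float((baseline - post["daily_m3"]).sum())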

6.7.2 Recommended Savings Estimation Approach

The recommended approach for estimating savings requires a methodology to verify savings reported by project implementers. This requires the calculation of a Realisation Rate (%) that can be applied to the portfolio of DSM Program 8 projects.


The savings estimation methodology for the two types of initiatives under DSM Program 8 – efficient water use and TSE replacing desalinated water or groundwater – is based on a water supply and demand balance. Project implementers report the following:

• Supply | Baseline water demand and its source(s) (desal, TSE, groundwater or a mix)
• Supply | Efficient water demand as a result of implementing water efficiency measures and its source(s) (desal, TSE, groundwater or a mix)
• Demand | Water demand end-use mix (forest, fruit trees and landscape)
• Project lifetime

For some projects, a desalinated water-to-TSE equivalency factor may be required to account for the higher volume of TSE needed, compared with desalinated water, for the same purpose. For most applications, such as landscaping and irrigation, this factor may be 100%. As with other programs, an evaluation of a sample of projects will be conducted to estimate a Realisation Rate, and that Realisation Rate will be applied to the full portfolio of projects.
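Under the supply and demand balance described above, the reported desalinated water savings for a project can be expressed in the simplified form below; the notation and the direction of the equivalency factor (converting substituted TSE volumes into the desalinated water volume they displace) are illustrative assumptions rather than a prescribed formulation:

\[
\text{Desal Savings} = V^{desal}_{baseline} - V^{desal}_{efficient},
\qquad
V^{desal}_{avoided\ by\ TSE} = f_{equiv} \times V^{TSE}_{used}
\]

At the portfolio level, these reported savings are then adjusted by the Realisation Rate described below.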

Realisation Rates are typically determined through a multi-stage, nested sampling assessment. This assessment aims to increase sampling efficiency without compromising site-level evaluation rigor. The methodology begins with desktop reviews of the entire database of projects to perform project documentation checks. This activity is followed by a more in-depth review of engineering assumptions (where appropriate) and a set of interviews with project implementers. This review is not generally performed for the entire population of projects, but rather for a larger subset of projects. Finally, a smaller sample of projects is screened and selected for on-site verification and analysis of metering data.


In general, this sort of sampling approach aims to select a statistically significant sample of projects targeting a relatively high level of confidence and precision, such as 90/10, 90/20 or 80/20. With a 90/10 target, the evaluation should aim to provide 90% confidence that the Evaluated Savings fall within 10% of the true savings. Striving for higher levels of confidence, although it can provide more credible results, comes at a cost: the level of confidence drives the required sample size, which in turn drives the level of analysis required and, ultimately, the cost. This cost may be justified for programs that are important to the portfolio (typically reflecting the level of spending and/or savings involved, the risk of not achieving program targets, or future program plans), but not for all DSM initiatives. The choice of confidence level should balance the value of the information gained against the cost of the research needed to meet the desired confidence and precision levels. Tailoring the precision level to the DSM initiatives and their importance within the portfolio can substantially reduce overall evaluation costs.

Table 29: Nested sampling approach to estimate Realisation Rates35

| Method | # of Projects Covered | Key Questions Answered | Example Methods |
|---|---|---|---|
| Tracking System Review | Population | Is the tracking database enough to verify the savings? | Review database for comprehensiveness |
| Desk Reviews | Large Sample | Are the high-level assumptions correct? Can outliers be identified? | Review documentation, conduct phone interviews |
| Onsite Verification | Small Sample | Are the site-level savings correct? | Measure actual equipment for hours of use, power, etc. |

6.7.3 Gross vs. Net Savings

In the context of DSM Program 8, Gross Savings and Net Savings are equivalent.
DSM Program 8 is directly responsible for the initiatives within its scope, hence, in the
absence of DSM Program 8, these water savings would not have materialised. Unlike
with other DSM Programs, a Net-to-Gross Ratio (%) is not applied.

35 NREL Sample Design Cross-Cutting Protocol (Link)


6.7.4 Data Governance

The data requirements and collection frequency are shown in the table below. Data
reporting frequency to DoE is defined as semi-annual, given that the program is
expected to contribute a significant share of total DSM strategy water savings.

Table 30: Efficient Water Use and Re-Use Program - Data requirements and collection frequency

| Category | Data | Data Source | Collection Frequency |
|---|---|---|---|
| DSM Activity Data | Project Baseline Water Demand and Source (TSE vs Desal vs Groundwater); Project Efficient Water Demand and Source (TSE vs Desal vs Groundwater); Project Water Demand End-Use Mix (forest, fruit trees and landscape); Project Lifetime | ADDC/AADC | Semi-annual |
| EM&V Outputs | Realisation Rate, Net-to-Gross Ratio | Third-Party Evaluator | Semi-annual |

Data Flow

Figure 23: Efficient Water Use and Re-Use Program - Data Flowchart


6.7.5 Overlaps
P8 primarily Overlaps with P2 in the case where new building regulations mandate the
use of TSE for landscaping. On rare occasions, if a building retrofit project also
involves retrofitting a building’s irrigation system, Overlaps would occur with P7
Building Retrofits.

Table 31: Description of Overlaps in the Efficient Water Use and Reuse Program

| | P2 Building Regulations | P7 Building Retrofits |
|---|---|---|
| Overlaps | New building regulations may mandate the use of TSE for landscaping | While rare, retrofit projects associated with irrigation applications |
| Approach | Full Allocation: All TSE use in landscaping will be counted in P8 | Full Allocation: Irrigation retrofits will be allocated 100% to P7 |
| Sizing | Procedurally | Procedurally |


6.8 Program 9: Demand Response


The objective of the Demand Response (DR) program is to induce customers to
temporarily reduce their energy use in response to signals from the utility. The DR
program is under development and is yet to be implemented. However, projections of
energy and demand savings have been developed based on the hypothetical rollout
of DR measures in industrial, large commercial and agricultural sectors.

6.8.1 International Best Practice for Evaluation

Best practice for the evaluation of DR programs varies based on the type of customer sector targeted. DR programs targeting residential and small commercial and industrial (C&I) customers do not follow the same impact evaluation plans as DR programs for large C&I customers.

• Residential & Small Commercial DR Programs: Impact evaluations generally involve an empirical analysis to calculate energy and peak savings using econometric regression methods applied to hourly AMI data. Commonly, a randomised control trial (RCT) experimental design is used to compare a treatment group of DR program participants with a control group of non-participants.36

• Large C&I DR Programs: Impact evaluations generally use a "day matching" approach to establish the baseline of energy consumption for each individual DR program participant. This approach is often also referred to as an X-in-Y approach. For example, a 5-in-10 approach sets the baseline based on the average demand of the 5 highest energy consumption days among the immediately preceding 10 similar days (a simplified illustration is provided after the sub-bullets below). The objective is to establish a baseline level of energy consumption by analysing consumption during similar non-DR event days. Energy and peak impacts are determined by comparing a participant's event-day energy consumption against the participant's baseline.37, 38, 39

36 DTE / Navigant, 2017, "The Reliability of Behavioral Demand Response". Available here: http://www.oracle.com/us/industries/utilities/reliability-bdr-5225436.pdf
37 EnerNOC, 2011, "The Demand Response Baseline". Available here: https://library.cee1.org/sites/default/files/library/10774/CEE_EvalDRBaseline_2011.pdf
38 CAISO, 2009, "Baselines for Retail Demand Response Programs". Available here: https://www.caiso.com/Documents/Presentation-Baselines_RetailDemandResponsePrograms.pdf
39 AEIC, 2009, "Demand Response Measurement & Verification". Available here: https://www.smartgrid.gov/files/documents/demand_response.pdf


o Alternative X-in-Y configurations: Different jurisdictions employ different X-in-Y configurations. A common set of configurations includes 3-, 5-, 7- or 10-in-10.

o "Day of Adjustment": Adjustments to the baseline can sometimes be required because weather conditions on the day of a DR event can differ significantly from conditions in the prior days used to set the baseline. These baseline adjustments are particularly important for participants with highly weather-sensitive demand.
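As a simplified illustration of the day-matching approach, the sketch below computes a 5-in-10 baseline from hourly interval data held in a pandas Series; the day-eligibility rules (e.g., exclusion of prior event days, weekends or holidays) and any day-of adjustment are deliberately omitted, so this is not the configuration this Protocol mandates.

    import pandas as pd

    def five_in_ten_baseline(hourly_kw: pd.Series, event_date) -> pd.Series:
        """Simplified 5-in-10 baseline: of the 10 most recent days before the
        event day, average the hourly profiles of the 5 with the highest
        daily energy consumption. hourly_kw must have a DatetimeIndex."""
        by_day = hourly_kw.groupby(hourly_kw.index.date)
        prior_days = [d for d in sorted(by_day.groups) if d < event_date][-10:]
        top_five = sorted(prior_days, key=lambda d: by_day.get_group(d).sum())[-5:]
        profiles = []
        for d in top_five:
            day = by_day.get_group(d)
            profiles.append(day.groupby(day.index.hour).mean())
        return pd.concat(profiles, axis=1).mean(axis=1)  # average kW by hour of day

    def event_impact_kw(hourly_kw: pd.Series, event_date, event_hours) -> float:
        """Average demand reduction over the event window: baseline minus actual."""
        baseline = five_in_ten_baseline(hourly_kw, event_date)
        actual = hourly_kw[hourly_kw.index.date == event_date]
        actual_by_hour = actual.groupby(actual.index.hour).mean()
        return float((baseline - actual_by_hour).loc[list(event_hours)].mean())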

Different approaches are followed for different DR programs based on the degree of heterogeneity of participants (e.g., how different the DR participants are) and the number of participants in each program (e.g., how big the sample of participants is). For example, residential and small C&I programs can be made up of thousands of participants and can have a significant degree of homogeneity – in other words, their energy consumption and demand composition can be quite similar. These characteristics lend themselves well to a regression analysis with a large number of participants in the treatment group.

In comparison, DR programs for large C&I customers are much more limited in terms of the number of participants and can also be largely heterogeneous (very different energy consumption patterns and demand composition). These characteristics make it more appropriate to follow a participant-specific approach.

6.8.2 Recommended Savings Estimation Approach

Since the DR program has not been implemented and is still being scoped, the
recommended approach to estimate savings is to initially follow a simplified top-down
approach until an initial pilot is launched, completed and evaluated. Once a DR pilot
is evaluated, those evaluated results may be used to estimate savings in future years
as the pilot is scaled to a full rollout.

The recommended top-down estimation approach is presented below. The approach is to estimate savings based on a set of scope and design parameters of a future DR program. The equation variables are also described below.


• Program participants (number): Number of DR participants expected to take part in the initial DR pilot or program.
• Coincident Peak Demand (kW/participant): Estimate of average participant demand (kW) during a defined 'system peak period'. For example, if the system peak is defined as summer weekdays from 1pm to 4pm, the coincident peak demand is the average participant demand during this period.
• Unit-Impact (%): The fraction of the coincident peak demand that a participant can be expected to reduce if a DR event is called.
• Opt-Out Factor (%): The fraction of DR participants expected not to respond to a DR event and not reduce their demand.
• Event-day duration (hours): Expected duration of a DR event.
• Snapback Demand Increase (kW): Average demand increase prior to or after a DR event. This demand increase may reflect pre- or post-cooling, or a ramp-up in production to return to normal commercial/industrial operations.
• Snapback Event-day duration (hours): Average duration of the snapback.
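One plausible combination of these variables, for a single DR event, is sketched below; whether snapback applies only to responding participants, and how it nets off against event savings, are assumptions made for illustration rather than rules defined by this Protocol.

    def event_demand_reduction_kw(participants, coincident_peak_kw, unit_impact, opt_out):
        """Expected demand reduction (kW) while a DR event is active."""
        return participants * (1 - opt_out) * coincident_peak_kw * unit_impact

    def event_energy_savings_kwh(participants, coincident_peak_kw, unit_impact, opt_out,
                                 event_hours, snapback_kw, snapback_hours):
        """Energy savings (kWh) for one event, net of the snapback demand
        increase occurring before or after the event."""
        reduction_kw = event_demand_reduction_kw(participants, coincident_peak_kw,
                                                 unit_impact, opt_out)
        snapback_kwh = participants * (1 - opt_out) * snapback_kw * snapback_hours
        return reduction_kw * event_hours - snapback_kwh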

The proposed process for initially estimating savings based on the top-down approach described above is illustrated by the diagram below. During the first year of the program (Y1), the methodology described above would be employed to estimate DR savings. Once the DR pilot has concluded, an evaluation would be performed, and the evaluated impacts would then be used to estimate DR savings for the following year of the program. Evaluated impacts from the pilot would continue to be used until a new evaluation is conducted.


6.8.3 Gross vs. Net Savings

In the context of DSM Program 9, Gross Savings and Net Savings are equivalent.
Absent the DR program, there would be no natural adoption of DR measures.

6.8.4 Data Governance

Data requirements and data collection frequency are outlined in the table below.

Table 32: Demand Response Program - Data requirements and collection frequency

| Category | Data | Data Source | Collection Frequency |
|---|---|---|---|
| DSM Activity Data | Demand Response Event Data. Before first EM&V plan is conducted: Participants, coincident peak demand, unit impact, opt-out rate, event duration. After EM&V is conducted: Participants, metering data | EWEC | Annual |
| EM&V Outputs | Snapback, Demand Reduction (by participant class) | Third-Party Evaluator | Two Years |


Data Flow

Figure 24: Demand Response Program - Data flowchart


6.8.5 Overlaps
No Overlaps exist between the Demand Response Program and the nine other Abu
Dhabi DSM Programs.


6.9 Program 10: Rebates & Behavioral Change

The scope of this program is divided into two major sub-programs as follows:

• Rebates: Offer purchase rebates to promote the replacement of existing A/C units at their end-of-life with high-efficiency units.

• Behavioral Change: Launch several behavioral change initiatives to nudge participants towards the efficient use of electricity and water. This includes several DSM initiatives:

o The Base + Program Area A - Enabling Mechanisms40: Awareness, Education and Engagement activities, which also initially include two types of pilot projects:

▪ Home Maintenance with HEMS pilot; and
▪ Irrigation Audit pilot.

o Program Area B - Efficient Equipment & Appliances: Efficiency kits offered to consumers;

o Program Area C - Efficiency Advisory Services: Energy efficiency advisors providing on-demand feedback and input to consumers; and

o Program Area D - Energy Feedback Programs: Home energy reports provided to consumers regularly.

6.9.1 International Best Practice for Evaluation

The table below outlines international best practice for evaluating each initiative.

40 Awareness, Education and Engagement activities (The Base + Program Area A), apart from the pilots, do not directly generate savings; rather, they reinforce the other initiatives in the portfolio (e.g., Efficiency Kits initiatives, Informational Billing, Home Energy Reports). Consequently, qualitative evaluations are envisioned for such activities.


Table 33: Description of evaluation best practice for rebates and behavioral change program

| DSM Initiative | Description of Best Practice |
|---|---|
| Rebates | Best practice for evaluating rebate programs requires an analysis of point-of-sale data from a sample of retailers, and verification through customer surveys / interviews. Retail data is used to estimate the number of A/C units purchased during a rebates campaign that can be attributed to the campaign itself. This involves determining the number of sales before, during and after the campaign. |
| Program Area A: HEMS pilot | Best practice for the evaluation of HEMS programs is a regression analysis comparing treatment and control groups through a Randomized Control Trial (RCT). Traditionally, utilities will conduct a small pilot, conduct an evaluation to estimate energy savings and employ those evaluated impacts as "Deemed Savings" to estimate savings from expanding the program. |
| Program Area A: Irrigation Audit pilot | While submetering pre- and post-intervention is best practice for water DSM, with irrigation projects regression modelling on a master water meter is generally sufficient. These results may be used to inform expected savings per action (or recommendation). Metering may not be possible at farms using groundwater; in this case, water use may be estimated, for example based on comparable farms with water meters. |
| Program Area B: Efficient Equipment & Appliances / Program Area C: Efficiency Advisory Services | Participant surveys and interviews with a complex battery of questions to: (i) measure participants' exposure to the initiative, (ii) identify savings actions taken and actions taken prior [baseline condition] and (iii) compare which actions were taken with high recall vs. low recall. |
| Program Area D: Energy Feedback Programs | Same as Program Area A: HEMS pilot. |

6.9.2 Recommended Savings Estimation Approach


Rebates
The recommended savings equation for the Rebates program is shown below.
Savings are calculated based on the savings estimated for A/C units in DSM
Program 6 Standards & Labels and the application of an attribution factor and Net-
to-Gross Ratio.

The attribution factor represents the share of A/C units purchased as a result of the
Rebates initiative. The attribution factor would be calculated based on an analysis
of point-of-sale data obtained from retailers. A/C unit sales figures would be
analysed over time to develop a comparison of sales before, during and after the
Rebates campaign. The attribution factor would be calculated as illustrated by the
diagram below. Additionally, a net-to-gross factor would also be calculated to
account for any Free Riders and Spill-over effects.


• Attribution Factor (%): Fraction of A/C units purchased as a result of the rebates initiative, expressed as a percentage of all A/C units purchased in Abu Dhabi through DSM Program 6 Standards & Labels.
• Net-to-Gross Ratio (%): To capture the impact of Free Riders and Spill-over effects.

Figure 25: Illustration of the application of an attribution factor to quantify savings from the rebate program
* The number of P6 units is determined as part of the P6 analysis based on data from ESMA/QCC
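Consistent with the variables above, the calculation can be expressed in the following general form; this is an illustrative reconstruction that assumes the Attribution Factor and Net-to-Gross Ratio apply multiplicatively to the A/C unit savings estimated under P6:

\[
\text{Net Savings}_{Rebates} = \text{Savings of A/C units estimated under P6} \times \text{Attribution Factor} \times \text{Net-to-Gross Ratio}
\]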
Program Area A: HEMS pilot
The recommended methodology for the HEMS pilot is consistent with best practice.
The HEMS pilot should be evaluated through an econometric analysis (e.g., RCT).
The results of this pilot should be used to establish “Deemed Savings” figures (e.g.,
participants achieve 1% savings in electricity consumption) and these should be
applied in the equation below in order to estimate future savings.

• Deemed Savings (%): Expected level of savings per participant, expressed as a percent of energy consumption.
• Average Participant Energy Use (kWh/participant): Average annual energy consumption per participant.
• Number of Participants: Number of annual participants in DSM activity.
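Consistent with these variables, the estimate can be written in the following general form (an illustrative reconstruction, not the Protocol's exact equation):

\[
\text{Savings (kWh)} = \text{Deemed Savings (\%)} \times \text{Average Participant Energy Use} \times \text{Number of Participants}
\]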


Program Area A: Irrigation Audit pilot

The recommended methodology for the Irrigation Audit pilot is consistent with best practice. Regression modelling should be performed using master meter consumption data. These results may be used to establish 'Deemed Savings' for participants. In this situation, the same methodology as for the HEMS pilot may be followed. Alternatively, "measure characterization" calculations based on an estimate of water savings per action taken – as shown below – may be applied.

• Baseline & Efficient Measure: Qualitative description of the baseline and efficient measure actions.
• Savings per Measure (m3/action): Estimated water savings per action.
• Action Frequency (actions per year): Estimated number of actions per year.
• Number of Participants: Number of annual participants in DSM initiative.

Program Area B: Efficient Equipment & Appliances

The recommended approach is to use "measure characterization" calculations based on engineering assumptions associated with the efficiency kit and responses to a participant survey. A net-to-gross factor, calculated from the participant survey, may also be appropriate to quantify potential Spill-over effects.

• Baseline & Efficient Measure: Estimate of baseline and efficient measure consumption.
• In-Service Rate (%): Fraction of efficiency kits installed and in operation.
• Use per Year (min/year): Number of uses per year.
• Number of Participants: Number of annual participants in DSM initiative.
• Net-to-Gross Ratio (%): Net-to-Gross Ratio for efficiency kits.
• Kit Equipment Lifetime (years): Lifetime of the equipment within the kit, for the purpose of capping savings beyond the lifetime.
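One plausible combination of these variables for an item in the efficiency kit is shown below; the per-minute formulation follows from the Use per Year units and is an illustrative assumption, with savings credited only for years within the Kit Equipment Lifetime:

\[
\text{Net Savings} = (\text{Baseline} - \text{Efficient})\ \text{consumption per minute} \times \text{Use per Year} \times \text{In-Service Rate} \times \text{Number of Participants} \times \text{NTG}
\]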


Program Area C: Efficiency Advisory Services

The recommended approach for Efficiency Advisory Services is consistent with Program Area B: Efficient Equipment & Appliances.

• Baseline & Efficient Measure: Qualitative description of the baseline and efficiency measure actions.
• Savings per Action (kWh/action): Estimated electricity savings per action.
• Action Frequency (actions per year): Number of actions per year.
• Number of Participants: Number of annual participants in DSM initiative.
• Net-to-Gross Ratio (%): Net-to-Gross Ratio for Efficiency Advisors.
• Persistence Factor: An EM&V adjustment factor determined via a statistically significant sample to take into account participants' persistence in taking action since receiving savings advice.

Program Area D: Energy Feedback Programs


The recommended approach for Energy Feedback Programs is consistent with
Program Area A: HEMS pilot.

• Deemed Savings (%): Expected level of savings per participant, expressed as a percent of energy or water consumption.
• Average Participant Energy or Water Use (kWh or L/participant): Average annual energy or water consumption per participant.
• Number of Participants: Number of annual participants in DSM initiative.
• Net-to-Gross Ratio (%): Net-to-Gross Ratio for HER.


6.9.3 Gross vs. Net Savings


For P10 Rebates & Program Areas B, C and D, a Net-to-Gross (NTG) Ratio is used
to calculate Net Savings from Gross Savings to adjust for Free Riders (e.g.,
participants that would have changed their behavior even without the program) and
Spill-over (e.g., unintended energy and water conservation actions due to the
program). For Program Area A, Gross Savings and Net Savings are equivalent given
that the initiatives are in the pilot phase. Beyond the pilot, a Net-to-Gross Ratio should
be used to adjust for Free Riders and Spill-over effects.

6.9.4 Data Governance


The data requirements and collection frequency are shown in the table below. Data
reporting frequency to DoE is defined as annual.

Table 34: Rebates & Behavioral Change Program - Data requirements and collection frequency

| Category | Program Area | Data | Data Source | Collection Frequency |
|---|---|---|---|---|
| DSM Activity Data | Rebates | A/C sales data (number of A/C units purchased under rebates initiative) | ADDC/AADC | Annual |
| DSM Activity Data | A | Participants, Tracking Data of Actions Taken, Participant Energy and Water Use | ADDC/AADC | Annual |
| DSM Activity Data | B | Participants, Tracking Data of Actions Taken, Equipment Lifetime | ADDC/AADC | Annual |
| DSM Activity Data | C | Participants, Tracking Data of Actions Taken | ADDC/AADC | Annual |
| DSM Activity Data | D | Participants, Participant Energy and Water Use | ADDC/AADC | Annual |
| EM&V Outputs | Rebates + B, C, D | Net-to-Gross Ratio | Third-Party Evaluator | Annual |
| EM&V Outputs | Rebates | Attribution Factor | Third-Party Evaluator | Annual |
| EM&V Outputs | A | Deemed Savings per participant (for each pilot) | Third-Party Evaluator | Annual |
| EM&V Outputs | B | Deemed Savings per participant (from Engineering Estimate of savings) | Third-Party Evaluator | Annual |
| EM&V Outputs | C | Deemed Savings per participant (from Engineering Estimate of savings), Persistence Factor | Third-Party Evaluator | Annual |
| EM&V Outputs | D | Deemed Savings per participant (from econometric analysis) | Third-Party Evaluator | Annual |


Data Flow

Figure 26: Rebates & Behavioral Change Program - Data Flowchart


6.9.5 Overlaps
The Overlaps with P10 are mainly related to the purchase of efficient A/C units under
the A/C rebate initiative, which could occur under both P4 Efficient Cooling and P6
Standards & Labels.

Table 35: Description of Overlaps in the Rebates & Behavioral Change Program

| | P4 Efficient Cooling | P6 Standards & Labels |
|---|---|---|
| Overlaps | Savings from A/C rebates (tracked under P10) contribute to P4 savings | Savings from A/C rebates are also tracked under P6 |
| Approach | Full Allocation: All savings from A/C rebates will be allocated to P10 | Full Allocation: Savings from A/C rebates will be allocated to P10 |
| Sizing | Procedurally | Manual Allocation |


7 EM&V Model
This Protocol is accompanied by an EM&V Model that takes in the data from the
Program Owners and calculates the net Estimated Savings and/or Evaluated Savings.
A user guide for the model is also provided.

The EM&V model is divided into three (3) sections:


Data Collection Workbooks will be prepared for each Program Owner, and those
workbooks will feed directly into the EM&V model:

Key model inputs and assumptions are assigned a ‘maturity’ level. This allows the
tracking of the model’s overall input maturity status over time:
