
A concise manual on the generation of a flood forecasting
model in the Chindwin River Basin, Myanmar


RIMES
Contents
Acquisition of Data
    Hydrological and meteorological data
    Evapotranspiration data
    Acquisition of spatial data
Basin Delineation
    Setting up HEC-GeoHMS in your system
    Getting started
    GeoHMS preprocessing
    HEC-GeoHMS Project Setup
    Basin Processing
    Extracting Basin Characteristics
    Hydrologic Parameters
    HMS
Quality control of meteo-hydro data
Generation of sub-basin average meteorological data
    Creating DSSVue files to store data
Sub-basin average WRF extraction
    Extracting WRF using R
    Extracting WRF using GrADS
WRF data verification and bias correction
    Forecast Verification
    Bias correction
Hydrological modeling
    Hydrologic Elements
    Basin Model
    Meteorologic Model
    Control Specifications
    Executing HMS Model
    Viewing HMS Results
    Model Calibration
ARIMA Error Correction
Hydraulic modeling
    Geo-RAS Preprocessing
    HEC-RAS processing
    Geo-RAS post-processing
Decision Support System
1. Acquisition of Data

Hydrological and meteorological data


Rainfall stations lying near the Chindwin River, with daily data from 2000 to 2014, are collected
first. Available discharge data, or time series of gauge heights with rating equations, are also
collected from stations on the Chindwin River. All these stations lie on the Chindwin River or its
tributaries. The observed data are obtained from the Department of Meteorology and Hydrology,
Myanmar.
(For this exercise:
Rainfall stations = 12 (Falam, Gangaw, Hakha, Hkamti, Homalin, Kalemyo, Kalewa, Mohnyin,
Pinlebu, Ramtholo, Tamu and Ye U)
Discharge stations = 6 (Hkamti, Homalin, Mawlaik, Kalewa, Monywa, Minkin))

Evapotranspiration data
Evapotranspiration accounts for the loss of water from the basin. You can use observed
evapotranspiration data if gauges are installed in the basin. If not, you can turn to other sources
such as FAO. Monthly evapotranspiration computed by the Penman-Monteith method for available
locations/stations within the basin can be downloaded from the FAO Water Development website,
http://www.fao.org/nr/water/infores_databases_climwat.html. You need to download and install
CLIMWAT 2.0 for this purpose.

Open CLIMWAT 2.0. Specify the latitude and longitude, or select Myanmar from the country
option, and click OK.
After selecting the desired stations, click Export Selected Stations, which will export two files
for each station (with extensions .cli and .pen) to a specified folder.
Select your destination folder and click OK.

Opening the file with the .cli extension will show you the reference potential evapotranspiration
for that station, computed from long-term data. Note that the unit is mm/day.

A detailed description of all the other climatic data in these two files can be found at
http://www.fao.org/nr/water/infores_databases_climwat.html.
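For reference, converting the .cli monthly rates (mm/day) into monthly totals is a simple multiplication by the number of days in each month. A minimal Python sketch (the function name is ours, not part of CLIMWAT):

```python
import calendar

def monthly_et_totals(et_mm_per_day, year=2014):
    """Convert 12 monthly reference-ET rates (mm/day), January to
    December, into monthly totals (mm) for the given year."""
    return [rate * calendar.monthrange(year, month + 1)[1]
            for month, rate in enumerate(et_mm_per_day)]

# A constant 3.0 mm/day gives 93.0 mm in January (31 days).
totals = monthly_et_totals([3.0] * 12)
```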

Acquisition of spatial data:


1. Digital Elevation Model (DEM):
One arc-second (approx. 30 m) and three arc-second (approx. 90 m) resolution digital elevation
models are freely available. For this basin, because of its large size, the 90 m resolution can be
used to save computing resources. There are many sources of free digital elevation models, some
popular ones being listed below:
a. The 30 m DEM from ASTER GDEM can be downloaded from the website
http://gdem.ersdac.jspacesystems.or.jp/download.jsp. The tiles covering the latitude and
longitude range of the area (make sure you give enough buffer distance) can be downloaded
separately and mosaicked later to form a single DEM.

(For this exercise, the area covering 20-25 N and 90-95 E should be downloaded.)

b. HydroSHEDS (Hydrological data and maps based on SHuttle Elevation Derivatives at
multiple Scales) is more accurate than ASTER GDEM for delineating the basin and river network.
3 sec GRID: void-filled DEM tiles covering the area are downloaded and unzipped from
http://hydrosheds.cr.usgs.gov/dataavail.php.

(For this exercise, the void-filled DEM tiles n25e095_dem_grid.zip, n25e090_dem_grid.zip,
n20e095_dem_grid.zip and n20e090_dem_grid.zip should be downloaded and unzipped.)

c. SRTM: Shuttle Radar Topography Mission (SRTM) DEMs, with a resolution of 3 arc-seconds
(90 m), can be downloaded from
http://viewfinderpanoramas.org/Coverage%20map%20viewfinderpanoramas_org3.htm.
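The HydroSHEDS tile names above encode the lower-left corner of each 5-degree tile. Assuming that naming convention, a short Python sketch can list the tiles needed for a bounding box (the function is illustrative, not part of any HydroSHEDS tool):

```python
def hydrosheds_tiles(lat_min, lat_max, lon_min, lon_max, step=5):
    """List the void-filled DEM tiles covering a bounding box.

    Assumes step x step degree tiles named by the latitude/longitude of
    their lower-left corner (northern/eastern hemisphere only), e.g.
    n20e090_dem_grid.zip.
    """
    names = []
    for lat in range(lat_min - lat_min % step, lat_max, step):
        for lon in range(lon_min - lon_min % step, lon_max, step):
            names.append("n%02de%03d_dem_grid.zip" % (lat, lon))
    return sorted(names)

# The Chindwin basin (roughly 21-27 N, 93-97 E) needs the four tiles
# listed in the text.
tiles = hydrosheds_tiles(21, 27, 93, 97)
```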

2. Soil characteristics

3. Land use map


A 900 m resolution land use and land cover map for the whole world can be downloaded from the
FAO site http://www.fao.org/geonetwork/srv/en/metadata.show?id=12749&currTab=distribution .
A 300 m resolution land cover map of the globe for 2006 and 2009 can also be downloaded from
http://due.esrin.esa.int/page_globcover.php. The land use map can be used to infer different
parameters of the model.
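For orientation, the arc-second resolutions quoted for the DEMs above translate into ground distances that shrink east-west with latitude. A rough Python sketch, assuming a spherical Earth:

```python
import math

def arcsec_to_metres(arcsec, lat_deg):
    """Approximate ground size of an angular cell, assuming a spherical
    Earth (1 degree of latitude is about 111.32 km)."""
    deg = arcsec / 3600.0
    ns = deg * 111_320.0                       # north-south spacing
    ew = ns * math.cos(math.radians(lat_deg))  # east-west shrinks with latitude
    return ns, ew

# Near the middle of the Chindwin basin (about 24 N), a 3 arc-second cell
# is roughly 93 m north-south and 85 m east-west.
ns, ew = arcsec_to_metres(3, 24)
```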
Getting the DEM into ArcMap
Open ArcMap. Create a new empty map. Click on the Add icon to add the raster data.
Since there are many tiles, we need to combine them with a mosaic operation.
Go to Data Management Tools => Raster => Raster Dataset => Mosaic To New Raster and
select the different TIFF tiles from ASTER (after unzipping the zip files) or SRTM.
Do not specify the coordinate system here. If you specify the coordinate system during this step,
you will end up with artefacts because you cannot specify a resampling technique here. You will
specify the coordinate system later.
Pixel type: 16_BIT_UNSIGNED
Mosaic Method: BLEND
Mosaic Colormap Mode: MATCH, and click OK.

(IMPORTANT NOTE: You might notice that during this operation the default "pixel type" is
set to "8_BIT_UNSIGNED"; this seems to be the cause of problems we have seen, such
as numerous "No Data" pixels in the topography, typically at a given elevation. If the DEM
can be assumed to have no values below sea level, it should be mosaicked as
16_BIT_UNSIGNED; otherwise use 16_BIT_SIGNED. Assuming the units of height are metres,
16 bits will accommodate all values (something 8 bits cannot). So, if you want to avoid problems,
use 16 bit signed.)
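The pixel-type choice in the note above can be expressed as a tiny rule. A hypothetical helper (not an Esri API), for illustration:

```python
def dem_pixel_type(min_elev, max_elev):
    """Pick a mosaic pixel type that can hold every DEM value (metres).

    8_BIT_UNSIGNED tops out at 255, which clips most terrain and produces
    the "No Data" artefacts described above; 16 bits cover -32768..32767
    (signed) or 0..65535 (unsigned).
    """
    if min_elev < 0:
        return "16_BIT_SIGNED"
    if max_elev <= 255:
        return "8_BIT_UNSIGNED"  # fits, but risky if the mosaic grows
    return "16_BIT_UNSIGNED"
```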
A new mosaicked raster, which will be seamless (with no sign of joining/patching of the several
raster files), is added to the document. You can check the elevation distribution across the raster
file, just to get an idea of the basin we are going to delineate afterwards.
Projecting the DEM
Once done with mosaicking the raster DEMs, we can remove the original individual tiles from the
document. The most important and fundamental thing to understand is that all the files we are
going to work with should be in a projected coordinate system (not a geographic one such as
WGS 1984!) and should be consistent within the document (every file in the same projection
system in the ArcMap document). The raster files and shapefiles that we are going to project
should use metres. There are several ways of projecting a file to a coordinate system. We discuss
two options here:
Option 1: Right-click on the data frame Layers, go to Properties => Coordinate System and see
which coordinate system the data frame originally has. (Normally, the data frame has the same
coordinate system as the file that was loaded first into the document.) Select Projected
Coordinate Systems => UTM => WGS 1984 => Northern Hemisphere => WGS 1984
UTM Zone 46N. You can click Add to Favorites once you have selected this coordinate system
to save time later. Click Apply and click OK.
Now, right-click the mosaicked DEM file and click Data => Export Data. Then select Data frame
(current) in the Spatial Reference option. You will notice that the cell size will be shown as 90 or
similar (depending on the resolution of your DEM; in this case it means a 90 m cell size). You can
select the location, name and format for the projected file. Save it.
Option 2: Go to Data Management Tools => Projections and Transformations => Raster =>
Project Raster.
Specify the input raster, specify the output raster dataset as Chindwin_dem, and then click on the
button next to the "output coordinate system" box. Select Projected Coordinate Systems =>
UTM => WGS 1984 => select Zone 46N.
Specify the Resampling Technique as CUBIC and click OK. Right-click on the map, click
Data Frame Properties, and change the Coordinate System to WGS_1984_UTM_Zone_46N so that
the coordinates are displayed in the new coordinate system.
(Once the file is projected to UTM 46N, remove the other file from the data frame, so that
consistency is maintained in the data frame in terms of coordinate system.)
(Note: If you have other shapefiles of discharge or rainfall stations in other coordinate systems,
just add them to the ArcMap document and export them similarly by selecting Use Same
Coordinate System as the data frame.)
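Whichever option you use, every layer must end up in the same projected coordinate system; WGS 1984 UTM Zone 46N corresponds to EPSG:32646. A small illustrative consistency check (the layer names and dict layout are made up, not an ArcMap API):

```python
def check_consistent_projection(layers):
    """Verify that every layer in the data frame uses the same CRS.

    layers: dict mapping layer name to a CRS label (e.g. an EPSG code).
    Returns the shared CRS, or raises if the layers are mixed.
    """
    crs_values = set(layers.values())
    if len(crs_values) != 1:
        raise ValueError("Mixed coordinate systems: %s" % sorted(crs_values))
    return crs_values.pop()

# WGS 1984 UTM Zone 46N is EPSG:32646.
crs = check_consistent_projection({
    "Chindwin_dem": "EPSG:32646",
    "Chindwin_river": "EPSG:32646",
})
```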
2. Basin Delineation
To delineate a basin, software that can interpret the topography and hydrological sequence of
an area must be employed. ArcMap will be used for this purpose, since it has its own inbuilt
Hydrology features, and other extensions such as ArcHydro Tools and HEC-GeoHMS can also
be added.
(Note: An outlet must be selected to delineate a basin. In this model, Monywa is identified as the
outlet of the Chindwin river basin.)

Setting up HEC-GeoHMS in your system


HEC-GeoHMS is a GIS extension that provides the user with a set of procedures, tools and utilities
for preparing GIS data for basin delineation, importing it into HEC-HMS, and generating GIS data
from HMS output. Depending on the version of GIS installed on your system, download the
corresponding version of HEC-GeoHMS (e.g. if you have ArcMap 10.1 installed, download and
install HEC-GeoHMS 10.1). It can be downloaded from the official HEC website,
http://www.hec.usace.army.mil/software/hec-geohms/downloads.aspx.
If Arc Hydro (another GIS extension) has not been previously installed on your system, download
a version suitable for your installed version of ArcGIS from
http://downloads.esri.com/archydro/ArcHydro/Setup/ and install it.
(Note: Try installing HEC-GeoHMS first. If your system does not have ArcHydro Tools, it will
automatically install the correct version of ArcHydro Tools, provided your system has internet
access.)

Getting started
Open ArcMap. Select a new empty map. Check the HEC-GeoHMS menu. You should now see
the HEC-GeoHMS toolbar added to ArcMap as shown below. You can leave it floating or you
may dock it in ArcMap.

Loading data into ArcMap: Click on the Add button to add the raster file Chin_dem_pro
containing the DEM for the Chindwin river basin, and a standard river shapefile for the Chindwin
River named Chindwin_river.shp.
(Note: You might have noticed that the DEM we loaded originally had 4 tiles. Thus, a large area
would need to be processed. However, our basin might be smaller, depending on the outlet we
define. So we will clip our DEM to around our study area only, to save computation time and
resources.)
Clipping the DEM: Go to Catalog, select the default ArcGIS location, right-click and select
New => Shapefile.

Select the feature type as Polygon, provide a name for the shapefile and define the projection for
the new shapefile. It should be in the same projection system as the DEM to be clipped. Click OK,
and OK again.
A new feature class with the specified name will be added to the data frame.
Right-click the recently added shapefile, click Edit Features => Start Editing. Click on Create
Features and select Polygon in the Create Features window.

Now add the river shapefile to the document and start digitizing around it. After the shapefile is
digitized as shown below, go to Spatial Analyst Tools => Extraction => Extract by Mask.

Select Chin_dem_pro.tif as the raster and the recently digitized shapefile as the feature mask data,
and specify the output raster. Click OK.

This will add a clipped DEM, with the same areal extent as the digitized shapefile, to the ArcMap
document. Remove the shapefile, as it is no longer needed. Now remove the other raster files from
the document and check that the data frame and all the files in it are in a projected coordinate
system. Save the document as ChindwinHMS.mxd.
Terrain Preprocessing
Terrain preprocessing involves using the DEM to create a stream network and catchments. The
Terrain Preprocessing menu (shown below) in Arc Hydro Tools will be used to delineate the
catchments and streams of the added DEM.

All the steps in the Terrain Preprocessing menu should be performed in sequential order, from top
to bottom. Use Chindwin_river.shp (if available) for DEM Reconditioning. This helps to burn the
river into the DEM in case the river has changed its course since the DEM survey. If it is not
available, proceed from Fill Sinks. All preprocessing must be completed to delineate the watershed
for the HEC-HMS model. Towards the end, you will have the following datasets:
Raster Data
1. chin_dem_proj (Raw DEM)
2. AgreeDEM (DEM after reconditioning)
3. Fil (DEM after filling sinks)
4. Fdr (Flow Direction Grid)
5. Fac (Flow Accumulation Grid)
6. Str (Stream Grid)
7. StrLnk (Stream Link Grid)
8. Cat (Catchment Grid)
9. WshSlope (Slope)

Vector Data
1. Catchment (Catchment Polygons)
2. DrainageLine (Drainage Line Polygons)
3. AdjointCatchment (Adjoint Catchment Polygons)

Save the file after each step so that you do not lose anything. This concludes the terrain
preprocessing part. What you have produced is a hydrologic skeleton that can now be used to
delineate watersheds or sub-watersheds for any given point on the delineated stream network. The
next part of this tutorial involves delineating a watershed to create a HEC-HMS model using HMS
Project Setup. Save your map document.
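Of the grids produced above, Fdr is the one everything else builds on: each cell drains to the neighbour offering the steepest descent. A conceptual Python sketch of this D8 logic, using the standard Esri direction codes (an illustration only, not Arc Hydro's implementation; the tiny window is made up):

```python
# Esri D8 codes: E=1, SE=2, S=4, SW=8, W=16, NW=32, N=64, NE=128
D8_CODES = {(0, 1): 1, (1, 1): 2, (1, 0): 4, (1, -1): 8,
            (0, -1): 16, (-1, -1): 32, (-1, 0): 64, (-1, 1): 128}

def d8_direction(window):
    """Flow direction of the centre cell of a 3x3 elevation window:
    the neighbour offering the steepest descent (drop / distance)."""
    centre = window[1][1]
    best_code, best_slope = 0, 0.0
    for (dr, dc), code in D8_CODES.items():
        distance = 2 ** 0.5 if dr and dc else 1.0  # diagonal vs cardinal
        slope = (centre - window[1 + dr][1 + dc]) / distance
        if slope > best_slope:
            best_code, best_slope = code, slope
    return best_code  # 0 means a sink: exactly what Fill Sinks removes

window = [[105, 104, 103],
          [105, 100,  96],
          [105, 104, 103]]
direction = d8_direction(window)  # steepest descent is due east (code 1)
```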

GeoHMS preprocessing


Before you continue, please make sure you have the following datasets in the map document from
the previous part.
Rasters
1. Chin_dem_proj (Raw DEM)
2. AgreeDEM (DEM after reconditioning)
3. Fil (DEM after filling sinks)
4. Fdr (Flow Direction Grid)
5. Fac (Flow Accumulation Grid)
6. Str (Stream Grid)
7. StrLnk (Stream Link Grid)
8. Cat (Catchment Grid)
9. WshSlope (Slope)

Vectors
1. Catchment (Catchment Polygons)
2. DrainageLine (Drainage Line Polygons)
3. AdjointCatchment (Adjoint Catchment Polygons)

Save the map document.

HEC-GeoHMS Project Setup


The HEC-GeoHMS Project Setup menu has tools for defining the outlet for the watershed and
delineating the watershed for the HEC-HMS project. As multiple HMS basin models can be
developed from the same spatial data, these models are managed through two feature classes:
ProjectPoint and ProjectArea. Managing models through ProjectPoint and ProjectArea lets users
see the areas for which HMS basin models have already been created, and allows them to
re-create models with a different stream network threshold. It is also convenient to delete projects
and their associated HMS files through the ProjectPoint and ProjectArea option.

Dataset Setup
Select HMS Project Setup => Data Management on the HEC-GeoHMS Main View toolbar.
Confirm/define the corresponding map layers in the Data Management window as shown below.
Click OK.
Creating a new HMS project:
Click on Project Setup => Start New Project. Confirm ProjectArea for ProjectArea and
ProjectPoint for ProjectPoint, and click OK.

This will create the ProjectPoint and ProjectArea feature classes in the data frame as shown below.
Fill in the information to Define a New Project. If you click on the Extraction Method drop-down
menu, you will see another option, "A new threshold", which will delineate streams based on this
new threshold for the new project. For now, accept the default original stream definition option.
You can write some metadata if you wish; finally, choose Outside MainView Geodatabase for
Project Data Location and browse to the working directory where ChindwinHMS.mxd is stored.
Click OK.

Click OK on the message regarding the successful creation of the project. Add the shapefile of
the hydrological station network of the Chindwin River (it should be in the same coordinate
system). Zoom in to the downstream section of the Chindwin River to define the watershed outlet
at the downstream hydrological station. Select the Add Project Points tool on the HEC-GeoHMS
toolbar and click on the downstream outlet area of the Chindwin River to define the outlet point,
shown below as a red dot. After clicking the outlet, a window is automatically displayed, as shown
in the figure below.

Accept the default Point Name and Description and click OK. This will add a point for the
watershed outlet to the ProjectPoint feature class. Save the map document.
Next, select HMS Project Setup => Generate Project. This will create a mesh (by delineating the
watershed for the outlet in ProjectPoint) and display a message box asking if you want to create
a project for this hatched area, as shown below:

Click Yes on the message box. Next, confirm the layer names for the new project (leave the default
names for Subbasin, Project Point and River), and click OK.
This will create a new folder inside your working folder with the name of the project,
ChindwinHMS, and store all the relevant raster, vector and tabular data inside this folder. The
raster data are stored in a sub-folder with the project name (ChindwinHMS) inside the
ChindwinHMS folder. All vector and tabular data are stored in ChindwinHMS.gdb. You will also
notice that a new data frame (ChindwinHMS) is added in ArcMap containing the data for the
Chindwin river basin. Save the map document.

Basin Processing
The Basin Processing menu has features for revising sub-basin delineations, dividing basins,
and merging streams. We will use the hydrological stations and water control structures
(reservoirs, diversions) to revise the sub-basin delineations. Add Hydro_chindwin_proj.shp (a
shapefile with the locations of the hydrological stations, in the same projected coordinate system)
for this purpose.
Merge Basins
You might notice that there are several sub-basins in the resulting basin, many of which might be
very small. (It is good modeling practice for all the sub-basins to be roughly similar in area.) It
will also be easier to assign sub-basin parameters later if the number of sub-basins is smaller. For
this purpose, we can use the Merge Basins option.
This process merges two or more adjacent basins into one. Select the two adjacent basins (shown
below) using the standard Select tool. Click on Basin Processing => Basin Merge. You will get
a message asking you to confirm the merging of the selected basins (with the basins hatched in
the background); click Yes.
After the merging, we now have two rivers inside one sub-basin (one basin, one river!). If two
tributaries meet at the sub-basin outlet, the unnecessary tributary needs to be deleted.

We will now delete the unnecessary river segment and merge the two river segments that are left
within the same sub-basin after the Basin Merge process. Click Editor => Start Editing and select
the source having the River layer, then click OK. Open the attribute table of River1, select the
river segment to delete and press Delete. Click Editor => Save Edits => Stop Editing.
Merge rivers:
We sometimes need to merge rivers when there are two reaches of the same river inside one
sub-basin. Select the two river segments to merge using the standard Select tool. Click on Basin
Processing => River Merge. Click Yes to confirm.
We will now divide the sub-basins at the gauging points using the Subbasin Divide tool. Click the
Subbasin Divide tool and click at the Str or StrLnk cell nearest to the gauging point to divide the
sub-basin. Save the map document.

The River Profile tool displays the profile of the selected river reach(es). If the river slope
changes significantly over the reach length, it may be useful to split the river/watershed at such a
slope change. Select the River Profile tool and click on any river segment that you are interested
in inspecting. Confirm the layers in the next window, and click OK. This will invoke a dockable
window at the bottom of ArcGIS that displays the profile of the selected reach.
Repeat the same procedure for the other hydrological stations as well. You can use Merge Basins
and River Merge to merge the basins and an equal number of rivers.

Finally, you can inspect the attribute tables of the sub-basin and river layers. They should have an
equal number of elements. The finalised basins, along with their attributes and the corresponding
river attributes, are shown below:

Extracting Basin Characteristics


The Basin Characteristics menu in the HEC-GeoHMS Project View provides tools for extracting
the physical characteristics of streams and sub-basins into attribute tables.

River Length
This tool computes the length of the river segments and stores it in the RivLen field. Select Basin
Characteristics => River Length. Confirm the input River name, and click OK.

You can check that the RivLen field in the input River1 feature class is populated. Save the map
document.
River Slope
This tool computes the slope of the river segments and stores it in the Slp field. Select Basin
Characteristics => River Slope. Confirm the inputs for RawDEM and River, and click OK.

You can check that the Slp field in the input River155 (or whatever name you have for your input
river) feature class is populated. The fields ElevUP and ElevDS are also populated during this
process:
Slp = (ElevUP - ElevDS) / RivLen
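The same computation, as a one-line Python function (all three fields assumed to be in metres):

```python
def river_slope(elev_up, elev_ds, riv_len):
    """Dimensionless slope as stored in the Slp field:
    (ElevUP - ElevDS) / RivLen, with all three values in metres."""
    return (elev_up - elev_ds) / riv_len

# A reach dropping 100 m over 50 km has a slope of 0.002.
slp = river_slope(elev_up=250.0, elev_ds=150.0, riv_len=50_000.0)
```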

Basin Slope
This tool computes the average slope for the sub-basins using the slope grid and the sub-basin
polygons. Add the WshSlope raster file that you created in the terrain preprocessing part and
select WshSlope in the Slope Grid option of Data Management.
Select Basin Characteristics => Basin Slope. Confirm the inputs for Subbasin and Slope Grid,
and click OK.

After the computations are complete, the BasinSlope field in the input Subbasin feature class is
populated.

Longest Flow Path


This will create a feature class with polyline features storing the longest flow path for each
sub-basin. Select Basin Characteristics => Longest Flow Path. Confirm the inputs, and leave the
default output name Longest Flow Path unchanged. Click OK.
A new feature class storing the longest flow path for each sub-basin is created and added to the
data frame. Open the attribute table of Longest Flow Path and examine its attributes. Close the
attribute table, and save the map document.

Basin Centroid
This will create a Centroid point feature class to store the centroid of each sub-basin. Select Basin
Characteristics => Basin Centroid. Choose the default Center of Gravity Method and the input
Subbasin, and leave the default name for Centroid. Click OK.
(Note: The Center of Gravity Method computes the centroid as the center of gravity of the
sub-basin if it is located within the sub-basin. If the center of gravity falls outside, it is snapped to
the closest boundary. The Longest Flow Path Method computes the centroid as the center of the
longest flow path within the sub-basin. The quality of the results of the two methods is a function
of the shape of the sub-basin and should be evaluated after they are generated.)
A point feature class showing the centroid of each sub-basin is added to the map document. As
the centroid locations look reasonable, we will accept the center of gravity method results and
proceed. Save the map document.
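The center of gravity of a sub-basin polygon is the standard area-weighted centroid. A Python sketch using the shoelace formula (it omits the boundary-snapping step GeoHMS applies when the centroid falls outside the polygon):

```python
def polygon_centroid(points):
    """Centre of gravity of a simple polygon from its vertex ring
    (shoelace formula); points is an open or closed list of (x, y)."""
    if points[0] != points[-1]:
        points = points + [points[0]]  # close the ring
    area2 = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        cross = x0 * y1 - x1 * y0
        area2 += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    area = area2 / 2.0
    return cx / (6 * area), cy / (6 * area)

# A unit square has its centre of gravity at (0.5, 0.5).
cx, cy = polygon_centroid([(0, 0), (1, 0), (1, 1), (0, 1)])
```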

Centroid Elevation
This will compute the elevation of each centroid point from the underlying DEM. Select Basin
Characteristics => Centroid Elevation. Confirm the input DEM and centroid feature class, and
click OK.

After the computations are complete, open the attribute table of Centroid to examine the Elevation
field. A centroid update may be needed when neither of the basin centroid methods (center of
gravity or longest flow path) gives satisfactory results and it becomes necessary to edit the
Centroid feature class and manually move the centroids to a more reasonable location.

Centroidal Longest Flow Path


Select Basin Characteristics => Centroidal Longest Flow Path. Confirm the inputs, leave the
default name for the output Centroidal Longest Flow Path, and click OK.
This creates a new polyline feature class showing the flow path for each centroid point along the
longest flow path. Save the map document.
The computation of the longest flow path, centroid and centroidal longest flow path for the
Chindwin river basin is presented graphically below:

Hydrologic Parameters
The Hydrologic Parameters menu in HEC-GeoHMS provides tools to estimate and assign several
watershed and stream parameters for use in HMS. These parameters include the SCS curve
number, time of concentration, channel routing coefficients, etc.
Go to Parameters, select Data Management and check that every file is selected for the right
option.
Select HMS Processes
You can specify the methods that HMS should use for transform (rainfall to runoff) and routing
(channel routing) using this function. Of course, these can be modified and/or assigned inside
HMS. Select Hydrologic Parameters => Select HMS Processes. Confirm the input feature classes
for Subbasin and River, and click OK. Choose SCS for the Loss Method (getting excess rainfall
from total rainfall), Clark Unit Hydrograph for the Transform Method (converting excess rainfall
to direct runoff), Linear Monthly Constant for the Baseflow Type, and Muskingum Cunge for the
Route Method (channel routing). Click OK.

You can open the attribute table of the subbasin feature class to see that the sub-basin methods
have been added to the LossMet, TransMet and BaseMet fields, respectively. The Muskingum
Cunge method is added to the RouteMet field in the River feature class. You can treat these
methods as tentative; they can be changed in the HMS model. Save the map document.
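The SCS loss method selected here estimates excess rainfall with the curve-number equation. A minimal metric-form sketch, using the conventional initial-abstraction ratio of 0.2:

```python
def scs_runoff_mm(p_mm, cn, ia_ratio=0.2):
    """Excess rainfall (mm) by the SCS curve-number method, metric form.

    S = 25400/CN - 254 is the potential maximum retention (mm);
    the initial abstraction is Ia = ia_ratio * S (0.2 by convention).
    """
    s = 25400.0 / cn - 254.0
    ia = ia_ratio * s
    if p_mm <= ia:
        return 0.0  # all rainfall abstracted; no excess
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# 100 mm of rain on a CN = 75 basin yields roughly 41 mm of excess rainfall.
excess = scs_runoff_mm(p_mm=100.0, cn=75)
```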

River Auto Name


This function assigns names to river segments. Select Hydrologic Parameters => River Auto
Name. Confirm the input feature class for River, and click OK.

The Name field in the input River feature class is populated with names in "R###" format,
where "R" stands for river/reach and "###" is an integer.

Basin Auto Name


This function assigns names to sub-basins. Select Hydrologic Parameters => Basin Auto Name.
Confirm the input feature class for sub-basin, and click OK.

Like the river names, the Name field in the input Subbasin feature class is populated with names
in "W###" format, where "W" stands for watershed and "###" is an integer. Save the map
document.

HMS
The HMS menu has tools for creating input files for HEC-HMS. Before proceeding, go to Data
Management and check that every file is under the proper field.
Map to HMS Units
This tool is used to convert units. Click on HMS => Map to HMS Units. Confirm the input files,
and click OK.

In the next window, choose SI units from the drop-down menu, and click OK.

After this process is complete, you will see new fields in both the River and Subbasin feature
classes ending with "_HMS", indicating that these fields store attributes in the specified HMS
units (SI in this case). All fields that store lengths and areas will have corresponding "_HMS"
fields as a result of this conversion.
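The conversion itself is just scaling map units (metres) to HMS SI units (km, km2). An illustrative sketch (the field names and the list of converted fields are assumptions, not the exact GeoHMS behaviour):

```python
# Hypothetical field names; GeoHMS appends "_HMS" to converted fields.
SI_CONVERSIONS = {
    "RivLen":    ("RivLen_HMS",    1e-3),  # m  -> km
    "Area":      ("Area_HMS",      1e-6),  # m2 -> km2
    "LongestFL": ("LongestFL_HMS", 1e-3),  # m  -> km
}

def to_hms_units(record):
    """Add *_HMS fields (SI: km, km2) alongside the map-unit (metre)
    fields, mimicking what Map to HMS Units does to attribute tables."""
    out = dict(record)
    for field, (hms_field, factor) in SI_CONVERSIONS.items():
        if field in record:
            out[hms_field] = record[field] * factor
    return out

row = to_hms_units({"RivLen": 52_000.0, "Area": 3.4e9})
```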

Check Data
This tool will verify all the input datasets. Select HMS => Check Data. Confirm the input
datasets to be checked, and click OK. You should get a message after the data check saying that
the check completed successfully, as shown below.
CHECKING SUMMARY
****************
Unique names - no problems.
River containment - no problems.
Center containment - no problems.
River connectivity - no problems.
VIP relevance - no problems.

If there are problems in any of the four categories (names, containment, connectivity and
relevance), you can look at the log file to identify them and fix them yourself.

HMS Schematic
This tool creates a GIS representation of the hydrologic system using a schematic network with
basin elements (nodes/links, or junctions/edges) and their connectivity. Select HMS => HMS
Schematic. Confirm the inputs, and click OK.
Two new feature classes, HMSLink1 and HMSNode1, will be added to the map document.
After the schematic is created, you can get a feel for how this model will look in HEC-HMS by
toggling between the regular and HMS legends. Select HMS => Toggle HMS Legend => HMS
Legend.

You can keep whichever legend you like. Save the map document.
Add Coordinates
This tool attaches geographic coordinates to features in HMSLink1 and HMSNode1 feature classes.
This is useful for exporting the schematic to other models or programs without losing the
geospatial information. Select HMS → Add Coordinates. Confirm the input files, and click OK.
The geographic coordinates including the “z” coordinate for nodes are stored as attributes
(CanvasX, CanvasY, and Elevation) in HMSLink1 and HMSNode1 feature classes.

Prepare Data for Model Export


Select HMS Prepare Data for Model Export. Confirm the input Subbasin and River files, and
click OK.

This function allows preparing subbasin and river features for export.

Background Shape File


Select HMSBackground Shape File. This function captures the geographic information (x,y)
of the subbasin boundaries and stream alignments in a text file that can be read and displayed
within HMS. Two shapefiles: one for river and one for sub-basin are created in the project folder.
Click OK on the process completion message box.

Basin Model
Select HMSBasin Model File. This function will export the information on hydrologic
elements (nodes and links), their connectivity and related geographic information to a text file with
.basin extension. The output file ChindwinHMS.basin (project name with .basin extension) is
created in the project folder. Click OK on the process completion message box.
You can also open the .basin file using Notepad and examine its contents.

Meteorologic Model
We do not have any meteorologic data (temperature, rainfall etc.) at this point. We will only create
an empty file that we can populate inside HMS. Select HMS → Met Model File → Specified
Hyetograph. The output file ChindwinHMS.met (project name with .met extension) is created in
the project folder. Click OK on the process completion message box.

You can also open the .met file using Notepad and examine its contents.

HMS Project
This function copies all the project specific files that you have created (.basin, .map, and .met) to
a specified directory, and creates a .hms file that will contain information on other files for input
to HMS. Select HMS → Create HMS Project. Provide locations for all files (you can create a
new folder called Model in your working directory). Even though we did not create a gage file,
one will be created when the met model file is generated. Give a name for the Run, and leave
the default information for time and time interval unchanged; this can be changed in HEC-HMS
based on the event you will simulate. Click OK.
If a .hms file already exists in that folder, you may get a message asking whether to overwrite it.
After you respond to that message a project file report will be displayed as shown below:
The set of files displayed in the project report defines the HMS project that you can open and
manipulate in HMS directly, without interacting with GIS. Typically, you should modify the
meteorologic and basin files to reflect field conditions before running the HMS model. Close the
report and save the map document. That completes the HEC-GeoHMS work.

You can carry on from HEC HMS directly now after this step. But before that, we have some
other steps to follow.
3. Quality control of meteo-hydro data
The observed data of hydrology (discharge/water level/rating equation) and meteorology (rainfall,
temperature, etc.) are the primary requirements of any hydrological model. Based on the observed
meteorological conditions, the hydrological processes are simulated by a hydrological model and
output in the form of discharge and/or water level and/or soil moisture is generated. Thus, the
accuracy of these data is the most crucial requirement for setting up a robust model capable of
capturing the hydrological processes in that watershed.
Quality control of observed meteorological and hydrological data lets us feed correct, realistic
input to the model, generate valid output, and compare it with gridded meteorological and
hydrological products for the same time series.

Steps of Quality control:


1. Missing Data Filling:
In HEC-HMS, the meteorological input must not contain any missing data, but ground
observations often have gaps in rainfall, temperature and discharge/water level records. Missing
discharge data does not pose serious issues, so it can be left as is for this part. For rainfall and
temperature, however, the gaps must be filled, using techniques such as those discussed below:
a. Single Best Estimator (SIB) method
The single best estimator is simple: it fills the gap with an estimate from a neighboring station.
The neighboring station with the highest correlation to the desired station is used to fill the data.
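The SIB estimate can be sketched in a few lines. The snippet below is an illustrative Python sketch (the manual's own scripts are in R); the function name `fill_sib` and the use of Pearson correlation to rank neighbors are our assumptions, not part of the manual:

```python
import numpy as np

def fill_sib(target, neighbors):
    """Fill NaN gaps in `target` using the neighbor series with the
    highest correlation over the jointly observed period."""
    target = np.asarray(target, dtype=float)
    best, best_r = None, -np.inf
    for nb in neighbors:
        nb = np.asarray(nb, dtype=float)
        ok = ~np.isnan(target) & ~np.isnan(nb)   # jointly observed days
        if ok.sum() < 2:
            continue
        r = np.corrcoef(target[ok], nb[ok])[0, 1]
        if r > best_r:
            best, best_r = nb, r
    filled = target.copy()
    gaps = np.isnan(filled)
    filled[gaps] = best[gaps]                    # copy from best-correlated station
    return filled
```

With a target of [1, 2, NaN, 4], the perfectly correlated neighbor is chosen and the gap is filled from its value on the missing day.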

b. Multiple Regression Analysis with Least Absolute Deviation Criteria:

It is a traditional interpolation approach in which the missing value (Vo) is estimated as

Vo = ao + a1 V1 + a2 V2 + … + an Vn

where ao, a1, a2, …, an are regression coefficients (fitted with the least absolute deviation
criterion) and Vi is the value of the ith weather station.
c. UK traditional method (UK)
Traditionally used by the UK Met Office to estimate missing temperature and sunshine data, this
method compares the desired station with a single neighboring station, from which a delta is
calculated indicating the average difference over the overlapping period. Using that delta, the
missing data of the desired station is estimated from the neighboring station's data for the same dates.
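The delta-based filling can be sketched directly from that description (an illustrative Python sketch, not the manual's code; the helper name `fill_uk_delta` is ours):

```python
import numpy as np

def fill_uk_delta(target, neighbor):
    """Compute the mean difference (delta) between the target and one
    neighboring station over the overlapping period, then fill the
    target's gaps as neighbor + delta."""
    target = np.asarray(target, dtype=float)
    neighbor = np.asarray(neighbor, dtype=float)
    ok = ~np.isnan(target) & ~np.isnan(neighbor)   # overlapping period
    delta = np.mean(target[ok] - neighbor[ok])     # average station offset
    filled = target.copy()
    gaps = np.isnan(filled)
    filled[gaps] = neighbor[gaps] + delta
    return filled
```

For example, if the target station runs on average 2 degrees warmer than the neighbor, a gap is filled as the neighbor's same-day value plus 2.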
d. Simple Arithmetic Average (AA)
An arithmetic average of, say, the five stations closest to the station is taken for the day with
missing data, and the resulting mean is used as the missing value.

e. Inverse distance method (ID)


Based upon the proximity of each station, its contribution to filling the missing value is
determined (nearer stations are weighted more heavily).

Vo is the missing value estimated from the surrounding stations' proximity, Vi is the value at the
same time at the ith of the n stations, and di is the distance between the desired station and the
ith station.
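As a sketch of the idea, the snippet below implements inverse-distance weighting in Python, assuming inverse-distance-squared weights (the exponent is a common choice but is our assumption; the manual does not specify it):

```python
import numpy as np

def fill_idw(values, distances, power=2):
    """Estimate a missing value V0 from surrounding stations as
    V0 = sum(Vi / di**p) / sum(1 / di**p), i.e. nearer stations
    contribute more. `power` (p) is assumed to be 2 here."""
    v = np.asarray(values, dtype=float)
    d = np.asarray(distances, dtype=float)
    w = 1.0 / d**power              # proximity-based weights
    return float(np.sum(w * v) / np.sum(w))
```

With two stations reporting 10 and 20 at distances 1 and 2, the nearer station dominates and the estimate is 12.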
All these methods are discussed in a paper by Y. Xia et al. in the journal Agricultural and Forest
Meteorology (1999).
You may choose other methods, like the normal ratio method, as well.
f. Normal Ratio method (NR):
Rainfall PA at station A is estimated as a function of the normal monthly or annual rainfall of the
station whose data is missing and of the neighboring stations with available data:

PA = (NRA / n) × Σ (Pi / NRi)

where Pi is the rainfall at the surrounding stations, NRA is the normal monthly or seasonal rainfall
at Station A, NRi is the normal monthly or seasonal rainfall at Station i, and n is the number of
neighboring stations.
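The normal-ratio estimate PA = (NRA / n) Σ (Pi / NRi) can be written down directly; a minimal Python sketch (the helper name is ours):

```python
def fill_normal_ratio(p_neighbors, nr_target, nr_neighbors):
    """Normal-ratio estimate of missing rainfall at station A:
    PA = (NRA / n) * sum(Pi / NRi), where Pi are neighbor rainfalls,
    NRA is the target station's normal rainfall, and NRi are the
    neighbors' normal rainfalls."""
    n = len(p_neighbors)
    return (nr_target / n) * sum(p / nr for p, nr in zip(p_neighbors, nr_neighbors))
```

For two neighbors with rainfalls 10 and 20 and normals 100 and 200, a target station with normal 100 gets the estimate (100/2) × (10/100 + 20/200) = 10.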
Depending on your choice, you can use any of the methods (there are more also).

2. Outlier Verification
Most of the times, the field data is transferred into computer by manual operations. While entering
the data, it is possible that the data being input is wrong. For temperature data, the data lying
outside of Mean ± 3* Standard Deviation may be taken as the range. Any data exceeding/below
this range might be considered as outlier.
Verification of this outlier by comparing the same day temperature of neighboring station can be
done. If it is consistent with other stations, then it is likely that the temperature data is correct. If
this shows inconsistency, removal of outliers can be done and they can be treated as missing data
which can be filled with procedures as discussed above.

For rainfall, this range definition is not suitable; however, 3 or 4 times the standard deviation
can be used to define outliers. An outlier can then be verified against the available timeline of
flooding/discharge, and other records of possible heavy rainfall from archives or newspapers can
also be taken as references. Nearby rainfall stations can be assessed for the same date; if the value
is inconsistent, it might need to be removed and treated as missing data. Unlike temperature,
rainfall can be distinctly different from one station to another, but a date of heavy rainfall can at
least be checked against whether it fell in a rainy month.
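The Mean ± k × Standard Deviation screening (k = 3 for temperature, 3 or 4 for rainfall) can be sketched as follows (illustrative Python; the function name is ours, and flagged values still need the manual verification described above before removal):

```python
import numpy as np

def flag_outliers(series, k=3):
    """Flag values lying outside mean +/- k * standard deviation.
    Returns a boolean array: True marks a candidate outlier."""
    x = np.asarray(series, dtype=float)
    mu, sd = np.nanmean(x), np.nanstd(x)   # NaN-aware statistics
    return np.abs(x - mu) > k * sd
```

A single 100-degree entry in a run of 20-degree days is flagged, while the ordinary values are not.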

3. Inconsistency of data
The cases when minimum temperature is recorded with higher/same value than maximum
temperature, or consecutive five or more days with same data or a very small gap in between
maximum and minimum temperature data can be taken as inconsistent data. These data also need
to be rectified before proceeding to another step.

4. Homogeneity of data
Data of longer time series (30 years or so) require checking of homogeneity. The
rainfall/temperature/discharge measuring stations themselves might introduce error because of
station shifting/ method of measurement/ instrument wear, etc.

Double mass curve method to test the homogeneity of rainfall and discharge data can be used. It
also adjusts the possible deviation of rainfall/discharge for the past time series. In this flood project,
data from 2000 to 2013 has been used. So, stations are expected to be homogeneous for this period.
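A minimal sketch of the double mass curve idea: accumulate the station's totals against the accumulated totals of a reference (e.g. the mean of surrounding stations); a stable slope suggests a homogeneous record, while a break in slope suggests a change at the station. Illustrative Python, not part of the manual's workflow:

```python
import numpy as np

def double_mass(station, reference):
    """Return the running slope of the double mass curve: cumulative
    station totals divided by cumulative reference totals. A roughly
    constant ratio indicates a homogeneous record; a drift or break
    indicates station shifting, instrument change, etc."""
    cs = np.cumsum(np.asarray(station, dtype=float))
    cr = np.cumsum(np.asarray(reference, dtype=float))
    return cs / cr
```

For a station consistently recording half of the reference, the ratio stays flat at 0.5; a jump in that ratio would mark the year to investigate.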
4. Generation of sub basin average meteorological data

1. Open new ArcMap document → Add layers using the “Subbasin155.shp” (Chindwin
river basin shape file) and “Rainfall_chindwin.shp” from the folder
“…/Training/Thiessen_Raw data”
2. Under ArcToolbox, select Analysis Tools → Proximity → double-click Create Thiessen
Polygon

 Define input features “Rainfall_chindwin.shp”


 Define Output feature class “Thiessen” in your working folder
 Go to Environments…. → Processing Extent
 In the Extent, change Default into Same as layer Subbasin 155
 Click Ok and Ok again.

A layer “Thiessen” will be added into the map automatically after the process gets finished.

3. Under ArcToolbox, select Analysis Tools → Extract → double-click Clip

 Define input features “Thiessen”


 Define Clip features “Subbasin155”
 Define Output feature class “Clip_thiessen” and Click OK.
A layer “Clip_thiessen” will be added into the map automatically after the process gets finished.

4. Open the attribute table of Clip_thiessen, right-click, go to the Joins and Relates option and
select Join.
 Choose the field in the layer that the join will be based on: Input FID
 Choose the table to join to this layer, or load the table from your folder: "Rainfall_chindwin"
 Choose the field in the table to base the join on: FID
 In Join options, keep only matching records
 Click OK

“Clip_thiessen.shp” will be updated with attributes joined from the defined file.

5. Select Analysis Tools → Overlay → Intersect and input the clipped thiessen and basin file to
intersect. Save it as “Int_Clip_thiessen”.

This will create an intersected file of basin shape file and thiessen polygon. Next step is to add
the area in table of intersected file and calculating the gage weights.
The images for calculation of thiessen, clipping the generated thiessen and intersection are
presented in figure below:

6. Right click the feature “Int_Clip_thiessen” and select the option Open Attribute Table. You
can see the recently added information from the station shape file and intersected polygons.

Click the Options button at the bottom right of the window and select Add Field. Define the
parameters as shown in the figure below and click OK. This step creates an attribute column
Area. Right-click the recently added Area column and choose Calculate Geometry. This
calculates the area of each intersected segment.
Note that calculating area requires a projected coordinate system. If your shapefiles are in a
Geographic Coordinate System (e.g. GCS WGS 1984), you can switch the data frame to a
projected one just to calculate the area (though this is not good practice).

In the attribute table of "Int_Clip_thiessen.shp" there is another field, Area_HMS, which is the
area of each subbasin computed during our HEC-GeoHMS exercise. The recently added
Area_poly column is now populated as well: inside each subbasin, the polygons carry their
corresponding areas in square kilometers, as specified during the area calculation.
7. Repeat the similar step for creating another attribute Weight in the attribute table as shown in
figure below (right)

Select the recently created Weight column, right-click, and open the Field Calculator. To
calculate the percentage contribution of each station within each subbasin, divide the area of the
polygon generated by that station within the subbasin by the whole area of the subbasin,
i.e., Percentage weight = Column 2 / Column 1 × 100%.
8. Export the attribute table and save it as a text file. Open it with Excel for further calculations
and analysis, then save the document and exit.
Open the .csv file stored by the above process in the specified folder.

Note that for each subbasin (column 1), the contribution of mean areal rainfall by stations (column
2) in terms of percentage is given by their respective weights (column 3).
(e.g. for subbasin W180, only two rainfall stations contribute: Mohnyin (4%) and Hkamti
(96%), so rainfall of W180 = 0.04 × Mohnyin + 0.96 × Hkamti)
Compute mean subbasin rainfall for all subbasins using this technique and save it in an excel
format.
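The weighted-average computation from the W180 example (rainfall of W180 = 0.04 × Mohnyin + 0.96 × Hkamti) can be sketched as follows (illustrative Python; the helper name is ours, and in practice this is done in Excel per the step above):

```python
def subbasin_rainfall(station_rain, weights):
    """Mean areal rainfall for one subbasin: the weighted sum of the
    Thiessen-polygon station rainfalls. `weights` maps station name to
    its fractional area weight (the weights should sum to 1)."""
    return sum(weights[s] * station_rain[s] for s in weights)
```

Using the manual's W180 weights with hypothetical daily rainfalls of 10 mm (Mohnyin) and 20 mm (Hkamti), the subbasin mean is 0.04 × 10 + 0.96 × 20 = 19.6 mm.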

Creating DSSVue files to Store data


For HEC products like HEC-HMS, HEC-RAS, HEC-ResSim, etc., DSS files can be created for
storing input and output data. They are easy to create, handle and update, and automated Linux
machines can also access and update these DSS files to refresh the inputs of our hydrologic models.
For this, we need to download and install HEC-DSSVue software from HEC official website,
http://www.hec.usace.army.mil/software/hec-dssvue/downloads.aspx . Install the HEC DSSVue
and open it by double clicking. The window based DSSVue looks like this:
Create a new DSS file with name Chindwin_timeseries.

All your time series data of rainfall, temperature, discharge, etc. should be in Excel format.
(Remember: it should be the Excel 97-2003 version, so if your file is saved as .xlsx, convert it
to .xls.) The format in the Excel file should be like:
Date W180 W230 W250 W330 W340 W380 W390 W430 W480 W530 W540
1/1/2000 0 0 0 0 0 0 0 0 0 0 0
1/2/2000 0 0 0 0 0 0 0 0 0 0 0
1/3/2000 0 0 0 0 0 0 0 0 0 0 0
1/4/2000 0 0 0 0 0 0 0 0 0 0 0
1/5/2000 0 0 0 0 0 0 0 0 0 0 0
1/6/2000 0 0 0 0 0 0 0 0 0 0 0

Go to the Data Entry tab in the HEC-DSSVue file you just created and click Import → Excel.

This will open the subbasin mean rainfall file which we computed for all subbasins.

Select individual basin column with date column (Circle 1) and click Import (Circle 2).
This will open a window like the one shown below. Fill in the blank parts A, B, C, F, Units and Type.
A refers to the group name (it can be a station name or subbasin name), B represents the location,
and C represents the parameter (for rainfall write PRECIP-INC, for discharge FLOW, for stage
STAGE, and so on). F denotes the version and is optional; however, we will write a brief
description of the data, like OBS for observed or WRF for WRF-generated rainfall.
The Units must also be entered (capital MM for rainfall, CMS for discharge in cumecs, FEET for
stage, etc.). Finally, the Type option distinguishes instantaneous and cumulative values (for daily
rainfall data PER-CUM is selected, and for discharge INST-VAL is selected). After all this is
entered, you can visualize the data with the Plot option if you want. Save the file and create
another subbasin record similarly within the same DSS file but with a different path name.
Similarly, save all available discharge data, subbasin average rainfall data, stage data and others
in this DSS file with the name Chindwin_timeseries.dss.
After updating all our data into DSS, our finalized file will look something like this:

Similarly, add time series of discharge also. (A separate DSS can also be made for discharges)
5. Sub basin average WRF extraction
The RIMES customized WRF (Weather Research and Forecasting) model is run at a grid resolution
of 9 km x 9 km for a domain covering 30°E to 160°E and 15°S to 45°N. For hydrological
applications, the numerical model data must be transformed from gridded data to basin average,
or Mean Areal Precipitation (MAP), outputs.
The model data is stored in two different formats: the direct model output in netCDF format and
a post-processed binary (GrADS-readable) format. For this exercise, we will use "R" to extract
WRF data from the netCDF file. Both methods use the basin shape file in lat-long (WGS 84)
projection for generating the areal (sub-basin) average.

Extracting WRF using R


The first step is to install R, a freely available tool (https://cran.r-project.org), choosing the build
for your operating system (Windows, Linux or Mac). For this process, we will use RStudio: go to
https://www.rstudio.com/products/rstudio/download/, download the free version and install it.
Remember to install R first, and then RStudio.
(For us, R is already installed, so we just need the following dependent packages.)
For the generation of the basin average we need several dependent packages to be installed in R
Studio, which include
 ncdf/ncdf4 – enables R to read netCDF file formats
 raster – raster function rasterizes the data from grids
 rgdal – interpolation, visualization and processing of spatial data

All these packages can be installed from the R Studio as shown below.
Open RStudio. The first screen looks like the left figure above. On the right-hand side there is a
gripper that you can click and drag to reveal two other windows, as seen in the right figure above.
In the bottom-right window there is a Packages tab with an option to install new packages.

Type the package ncdf4 and click Install. It will start installing the required package ncdf4 along
with dependencies required by the package. Similarly, install other required dependencies as
mentioned above.
Then open the script provided to you, "Process_WRF.r", which is also in the annex. Alternatively,
you can create a new R script, type in the code, and run it afterwards.
Change the highlighted text in the code to point to the working directory, the model file (.nc
format) and the location of the shape file in geographic projection (lat-long/WGS 1984) that will
be used for processing. The script will then automatically extract the data and present the WRF
forecast output for our subbasins in a usable format.
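The manual's Process_WRF.r performs this extraction with the ncdf4/raster/rgdal packages. Purely as an illustration of the underlying step (selecting the grid cells that fall inside a subbasin and averaging them), here is a Python sketch using a synthetic rainfall grid and a hypothetical subbasin extent in place of the real WRF file and shapefile:

```python
import numpy as np

# Synthetic stand-in for one WRF time step (the real data comes from the .nc file)
lons = np.arange(93.0, 97.0, 0.1)      # grid cell center longitudes
lats = np.arange(21.0, 27.0, 0.1)      # grid cell center latitudes
rain = np.random.default_rng(0).gamma(2.0, 5.0, size=(lats.size, lons.size))

# Hypothetical subbasin bounding box (xmin, xmax, ymin, ymax) in lat-long (WGS 84);
# the real script uses the actual subbasin polygon, not just its extent.
xmin, xmax, ymin, ymax = 94.5, 95.5, 24.0, 25.0
lon2d, lat2d = np.meshgrid(lons, lats)
mask = (lon2d >= xmin) & (lon2d <= xmax) & (lat2d >= ymin) & (lat2d <= ymax)

# Mean Areal Precipitation: average of the grid cells inside the subbasin
map_rain = rain[mask].mean()
```

The polygon-based masking in the R script is more precise than this bounding-box sketch, but the averaging step is the same.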
Extracting WRF using GrADS
The GrADS-based approach is used for the binary datasets, which are post-processed outputs
from the original WRF output files. This approach is also effective because most WRF outputs
are stored in GrADS binary format, which consumes less storage space than the wrfout nc files,
and GrADS is well known in the meteorological community. Each dataset has one descriptor file
(.ctl) and one binary data file (.dat).

Steps:
 Installation of Grads binary package.

The OpenGrADS version is freely available, and can be accessed via
https://sourceforge.net/projects/opengrads/files/grads2/2.0.2.oga.2/Windows/grads-2.0.2.oga.2-
win32_superpack.exe . Installing GrADS on Windows is as easy as running the .exe file. Once
installed, GrADS can be opened by double-clicking the file.

 Preparing the Shape file of Ayerawaddy in WGS 84 projection

GrADS does not support projected coordinate systems, so the shape file for which we want to
extract the WRF output MUST be in the lat-long projection system. We then extract the extents
of the X and Y bounds for each subbasin of the Ayerawaddy basin shape file, as listed below.
This can be done from the GrADS prompt using the following commands:
 q dbf name_shapefile
 q shp name_shapefile
(Still to be updated)
6. WRF data verification and bias correction

Forecast Verification
Verification is the process of comparing forecasts to relevant observations; it is a measure of the
quality of the forecasts. A paper by Finley (1884) on tornado forecasting is generally regarded as
the starting point for the science of forecast verification; Finley used the phrase 'verification of
predictions', which was then adopted by later authors.
The importance of forecast verification is as follows:
1. To improve model forecast
2. To improve decision making
3. To understand model biases
4. To make choice of a better model or better model configuration

Types of forecasts
a. Deterministic (Non-probabilistic):
1. Continuous variables (Forecast is a specific value of the variable)
2. Dichotomous: Yes/No (Binary) e.g. rain vs. no rain
3. Multi-Category: e.g. Precipitation type: light, moderate, and heavy
4. Visual
5. Spatial
b. Probabilistic: Forecast is the probability of occurrence of ranges of values of the variable

Verification Measures of Deterministic Forecasts


1. Continuous:
1. Mean Error, Bias (ME)
2. Mean Absolute Error (MAE)
3. Root Mean Squared Error (RMSE)
4. Skill Score (SS)

2. Dichotomous
1. Frequency Bias or Bias Score
2. Percent Correct (Accuracy)
3. Probability of Detection (POD) or Hit Rate (HR)
4. False Alarm Ratio (FAR)
5. Probability of False Detection (POFD) or False Alarm Rate
6. Threat Score (TS)
7. Equitable Threat Score (ETS)
8. Heidke Skill Score (HSS)

3. Multi Category
1. Histograms
2. Accuracy (Percent Correct)
3. Equitable Threat Score (ETS)
4. Hanssen-Kuipers Score
5. Gerrity Score (GS)
6. Heidke Skill Score(HSS)

4. Visual
1. Mapped forecasts and observations
2. Time series of forecasts and observations at selected sites
3. Scatter plots
4. Quantile-Quantile plots

5. Spatial
1. Scale decomposition methods
2. Neighborhood (fuzzy) methods

Some Basic Definitions of Verification Methods of Continuous Forecast Variables:


1. Mean Error, Bias (ME)
- ME = (1/n) Σ(fi – oi)
- Perfect score = 0
- ME > 0 indicates over-forecasting; ME < 0 indicates under-forecasting
- Does not provide magnitude of errors so it’s not an accuracy measure.
- Range: -∞ to + ∞
- It measures bias

2. Mean Absolute Error (MAE)


- MAE = (1/n) Σ|fi – oi|
- Range: 0 to ∞
- Perfect score – 0
- Small is better
- Gives average magnitude of errors in each set of forecasts.
- It measures accuracy

3. Root Mean Squared Error (RMSE)


- MSE = (1/n) Σ(fi – oi)²
- RMSE = square root of MSE
- Perfect Score = 0
- Range: 0 to ∞
- Smaller is better
- It measures accuracy.
- The comparison of MAE and RMSE gives error variance.
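The three continuous measures above can be computed together; a minimal Python sketch of the definitions (f = forecast, o = observation; the manual's verification is done with the VerifyNWP.R script, so this is illustrative only):

```python
import numpy as np

def verify_continuous(fcst, obs):
    """Compute ME, MAE and RMSE from paired forecast/observation series."""
    f, o = np.asarray(fcst, dtype=float), np.asarray(obs, dtype=float)
    err = f - o
    me = err.mean()                    # bias: > 0 means over-forecasting
    mae = np.abs(err).mean()           # average error magnitude
    rmse = np.sqrt((err**2).mean())    # penalizes large errors more than MAE
    return me, mae, rmse
```

For fcst = [1, 2, 3] against obs = [1, 1, 5], the errors are [0, 1, -2], giving ME = -1/3, MAE = 1 and RMSE = sqrt(5/3).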

Some basic Definitions of Verification Methods of Categorical Forecasts:


                          Event Observed
                     Yes        No         Marginal
Event      Yes        a          b          a+b
Forecast   No         c          d          c+d
Marginal             a+c        b+d        a+b+c+d

where, a = Hits, b = False Alarms, c = Misses, d = Correct Negatives

1. Bias Score or Frequency Bias


- BIAS = (Hits + False Alarms) / (Hits + Misses)
- BIAS > 1, indicates over forecasting
- BIAS < 1, indicates under forecasting
- Perfect Score = 1
- Range: 0 to ∞

2. Accuracy (Proportion Correct)


- Accuracy = (Hits + Correct Negatives) / Total
- Overall what fraction of forecast were correct
- Perfect Score = 1
- Range: 0 to 1
- Strongly influenced by the common category

3. Probability of Detection (POD) or Hit Rate


- POD = Hits / (Hits + Misses)
- Gives the fraction of observed "yes" events that were correctly forecast
- Sensitive to misses
- Range: 0 to 1
- Perfect Score: 1

4. False Alarm Ratio


- FAR = False Alarms / (Hits + False Alarms)
- Gives fraction of predicted yes events that did not occur.
- Sensitive to false alarms not misses
- Range: 0 to 1
- Perfect Score: 0

5. Probability of False Detection (False Alarm Rate)


- POFD=false alarms/ (correct negatives + false alarms)
- Gives the fraction of observed "no" events that were incorrectly forecast as yes.
- Range: 0 to 1
- Perfect Score: 0
6. Threat Score (TS) or Critical Success Index (CSI):
- TS = Hits / (Hits + Misses + False Alarms)
- It’s a measure of forecast performance after removing correct simple ’no’ forecasts
from consideration
- Range: 0 to 1
- Perfect Score: 1
- It includes hits due to random chance (no adjustment is made for random hits).

7. Equitable Threat Score (ETS)


- ETS = (hits – hits random) / (hits + misses + false alarms – hits random)
- where hits random = (hits + misses)(hits + false alarms)/total is the number of hits
expected from random forecasts
- Range: -1/3 to 1
- Perfect Score: 1

8. Heidke Skill Score (HSS)


- HSS = 2(ad – bc)/[(a+c)(c+d)+(a+b)(b+d)]
- It measures skill.
- Perfect score = 1
- 0 means no skill, negative skills mean chance forecast is better or model has poor
skill and positive skill means better skill.
- Range: -∞ to 1
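All of the dichotomous scores above follow directly from the a, b, c, d counts of the contingency table; a Python sketch of the definitions (the function name is ours, and the manual's actual verification runs through VerifyNWP.R):

```python
def categorical_scores(a, b, c, d):
    """Dichotomous (yes/no) verification scores from the 2x2 table:
    a = hits, b = false alarms, c = misses, d = correct negatives."""
    total = a + b + c + d
    bias = (a + b) / (a + c)                 # frequency bias
    pod = a / (a + c)                        # probability of detection
    far = b / (a + b)                        # false alarm ratio
    pofd = b / (b + d)                       # probability of false detection
    ts = a / (a + b + c)                     # threat score / CSI
    a_rand = (a + c) * (a + b) / total       # hits expected by chance
    ets = (a - a_rand) / (a + b + c - a_rand)
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return {"BIAS": bias, "POD": pod, "FAR": far, "POFD": pofd,
            "TS": ts, "ETS": ets, "HSS": hss}
```

A perfect forecast (b = c = 0) yields BIAS = POD = TS = ETS = HSS = 1 and FAR = POFD = 0, matching the perfect scores listed above.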

The verification of the WRF forecast against relevant observations is done here with an R script,
attached in the annex with the name VerifyNWP.R. WRF hindcast data for 2008-2014 is compared
with daily observations to assess three-day forecast skill. The data needs to be prepared in the
format specified as:
Fcst Obs
5 0.00
5 0.00
4 0.60
3 4.75
5 5.35
It should be saved with the basin name (e.g. W180.csv in this case). Please note that the header
should have the names Fcst and Obs, under which you must put the time series of forecast and
observed data for the corresponding dates. Prepare data for all basins and put them inside separate
folders as shown below:
Open the Script VerifyNWP.R in R studio software. Change the working directory and filenames
as highlighted in the picture below:
Run the script by clicking on the source and you will notice that several plots and other files will
be generated in the folder you specified.

Upon inspecting the output performance recorded in perf_out.csv, you can see the statistics of the
performance as shown below:
The results show that the Mean Absolute Error, Mean Square Error, Mean Error, Skill Score and
PBIAS are 9.22, 184.84, 5.46 (the model is over-forecasting), -2.14 (a chance forecast is better,
i.e. the model has poor skill) and 92.29 (a high positive bias), respectively.
For detailed output, open W180_out.txt and you can see the statistics of performance in detail.
The multi-category (4 categories) verification shows a PC (Percent Correct) of 0.56, a fair chance
of detecting the weather conditions. The HSS (Heidke Skill Score) of 0.03 means the model is not
good at simulating the weather. The PSS (Peirce Skill Score) also shows low accuracy (0.03), as
does the GS (Gerrity Score) (0.03).
The categories can be selected in the R script according to the rainfall information for the area.
In this case, the categories <0.25 mm rainfall (no rain), 0.1-15 mm (light), 15-50 mm (medium)
and >50 mm (heavy) are selected.
The bias, Threat Score (TS), Percentage Correct (PC), Hit Rate (POD), False Alarm Ratio (FAR)
are also accessible from the output files for four different categories.
The binary verification is also done which decides the performance of model in terms of binary
forecasts (rain/ no rain), Hit Rate, False Alarm Rate (FAR), Percent Correct (PC), Bias Score
(BIAS), Heidke Skill Score (HSS), etc.
Graphical comparison of observed and forecast data is also automatically generated in the same
folder for visual inspection.
The figures suggest that the forecast predicts higher rainfall than observed and underestimates
the frequency of lower rainfalls; moreover, the forecast often simulates rainfall when the
observations show none.
Understanding of the forecast is a must before application of the forecast for further modeling.
Knowledge of performance of forecast of different lead times (1 day, 2 day and 3 day) for the
selected basin helps to increase the confidence in forecasts.
Bias correction
The WRF data we receive is at a coarse resolution (generally 9 km x 9 km, or sometimes
downscaled to 3 km x 3 km). The coarse data needs some adjustment so that it represents the
weather of our subbasins precisely. This process of adjusting the raw WRF outputs for our
subbasins to generate a more accurate weather forecast is called bias correction. It is done by
comparing the raw WRF output against observed meteorological data for a given subbasin.
A WRF hindcast was run for 2008-2014, and the 3-day forecasts for this period were extracted
with the procedure discussed in Chapter 4. Comparing basin average rainfall data with this WRF
hindcast gives a mechanism to correct the bias present in the WRF data.
The basin average rainfall data calculated in Chapter 3 and the WRF data extracted for each basin
in Chapter 4 are now arranged in a specific format as shown in the figure below:

For the 3-day forecasts, make three folders, Day1, Day2 and Day3, each containing the same set
of subbasins. Each subbasin within the Chindwin river basin must have a separate folder.

Inside each subbasin folder, create three .csv files whose names start with the basin name (which
is also the folder name) followed by cal, val and future, as shown in the figure above, to indicate
the calibration file, validation file and future forecast file.
The calibration file contains a longer period of data than the validation file, consisting of the WRF
hindcast data and the observed data for the subbasin; the format should be strictly as shown in the
figure above. Based on the data given in the calibration file, a suitable bias correction method is
automatically selected from among seven methods and then tested on the independent dataset
provided in the validation file. (Note: do not use the same data in the calibration and validation files.)
In the future file, put the raw forecast of WRF for coming 3 days. Then for all subbasins, 3 day
WRF forecast data is prepared similarly inside respective folders.
There are seven different methods of bias correction employed in the R script provided to you with
name Biascorrection_chindwin.R, which are:
 Scale fit
 Linear fit
 Power fit
 Exponential Asymptotic Fit
 Bernoulli-Gamma distribution
 Tricub method Quantile Mapping
 Smooth Splines method
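The seven methods are implemented in the provided Biascorrection_chindwin.R. Purely as an illustration of the simplest one, a scale fit rescales the raw model values so that their calibration-period mean matches the observed mean (a Python sketch under that assumption, not the script's actual code):

```python
import numpy as np

def scale_fit(obs_cal, mod_cal, mod_future):
    """Scale-fit bias correction sketch: multiply the raw model values
    by the ratio of observed to modeled means over the calibration
    period, then apply the same factor to the future forecast."""
    factor = np.mean(obs_cal) / np.mean(mod_cal)
    return np.asarray(mod_future, dtype=float) * factor
```

If the model rained twice as much as observed during calibration, the factor is 0.5 and a raw future forecast of [10, 20] becomes [5, 10]. The distribution-based and quantile-mapping methods in the list correct the full distribution rather than just the mean.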

Change the working directory name and the names of the basins that you want to process for bias
correction, as specified in the image above. Any bias correction method can be selected by defining
the selection criteria within the script; as of now, the method that yields the minimum absolute
percentage bias for a basin is selected automatically. Then click Source.
Graphs and statistical results are automatically generated for both calibration and validation files
and the forecast file is corrected and saved as W###_correctedforecast.csv in the same folder.
These two graphs compare the quantiles of observed and bias-corrected model values during the
calibration period, and the quantiles of the independent observed and model values corrected by
the best method. A file with the name W180perf_cal will also be generated in the working folder,
showing the statistics of the bias correction. The method with the best value of the criterion defined
in the script will be automatically selected for testing on the independent dataset defined in the
validation file, as shown below.
A separate file with name W180future_corrected.csv will be generated in the working folder at
the end of the process where bias corrected WRF forecast for the basin W180 will be stored. This
file must be used in our hydrological model as an input meteorological file.
7. Hydrological modeling
Open HEC-HMS, and select File → Open. Browse to the ChindwinHMS.hms file, and click Open.
You will see that two folders: Basin Models and Meteorologic Models will be added to the
Watershed Explorer (window on top-left) in HEC-HMS. Expand the Basin Models folder and
click on ChindwinHMS. This will display the Chindwin schematic. Click View → Background
Map, and then add the river and basin shape files to see the watershed along with its sub-basins,
streams, links and junctions as shown below.

If you expand the ChindwinHMS basin in watershed explorer, you will see the list of junctions,
reaches and subbasins. You can click on any reach and see its associated methods. For example,
when you click on a Reach (R##), you will see that Muskingum-Cunge routing method is
associated with it. Similarly, if you click on a Watershed (W##), you will see SCS (for abstractions)
and Clark Unit Hydrograph (for runoff calculations) are associated with it. All this information,
which is now independent of GIS, is extracted from attributes that we created in HEC-GeoHMS.

Navigating the HMS Desktop


You can use the following three tools in the tool bar to navigate through the HMS desktop:
The arrow tool lets you select any hydrologic element in the basin. You can use the zoom-in tool
to zoom-in to a smaller area in the desktop, and right click to zoom out. The pan tool can be used
to move the display in the desktop.
Now let’s explore the basin information.

Hydrologic Elements
The ChindwinHMS basin contains different hydrologic elements. The following descriptions give
brief information on the symbol used to represent each hydrologic element.

Subbasin – Used for rainfall-runoff computation on a watershed.

Reach – Used to convey (route) streamflow downstream in the basin model.

Reservoir – Used to model the detention and attenuation of a hydrograph caused by a
reservoir or detention pond.

Junction – Used to combine flows from upstream reaches and sub-basins.

Diversion – Used to model abstraction of flow from the main channel.

Source – Used to introduce flow into the basin model (from a stream crossing the boundary
of the modeled region). Source has no inflow.

Sink – Used to represent the outlet of the physical watershed. Sink has no outflow.
The Chindwin model contains only four of these element types. There are 30 hydrologic
elements in the ChindwinHMS model: 11 subbasins, 9 reaches, 9 junctions, and 1 outlet
at Monywa. Notice that when a stream flows through a watershed, the additional local runoff from
the drainage area around the stream is not accounted for until the downstream end of the reach,
where its flow is combined at a junction with the flow coming from the upstream reach.

Basin Model
Make sure the ChindwinHMS basin is expanded in the watershed explorer to see all the hydrologic
elements in the basin. Select the Arrow tool from the tool bar, and click on one of the subbasins
(e.g., sub-basin W180) icon in the watershed explorer. After this sub-basin is highlighted,
information related to this sub-basin will appear in the component editor window as below.
Remember the sub-basin element is used to convert rainfall to runoff. So, the information on
methods used to compute loss rates, hydrograph transformation and baseflow is required for each
sub-basin element. A canopy component could also be included to represent interception and
evapotranspiration.
Similarly, a surface component could also be added to represent water caught in surface depression
storages. In this case, we are using Simple Canopy and Simple Surface methods.
The loss method allows you to choose the process which calculates the rainfall losses absorbed by
the ground. In this case, we are using the SCS method to compute losses and get excess rainfall
from the total rainfall. Click on the drop-down menu to see your choices. Some options are
Exponential, Green and Ampt, Initial and Constant, SCS Curve Number and Soil Moisture
Accounting.
The Transform method allows you to specify how to convert excess rainfall to direct runoff. Again,
click on the drop-down menu to view your options. This model employs the Clark Unit Hydrograph
method, which uses the specified time of concentration and storage coefficient to transform the
excess rainfall into a direct-runoff hydrograph.
A constant monthly baseflow is specified for this model. The specified baseflow is added to the
resulting direct-runoff hydrograph to produce the total streamflow hydrograph.
Once the loss, transform and baseflow methods are chosen for the sub-basin, the next step is to
specify the parameters for these methods.
Select the Canopy tab in the component editor to look at the parameters for the canopy method.
Parameters marked with an asterisk (*) on their left are mandatory and must be given initial
values. Similarly, in the Surface tab, fill in the initial storage and maximum storage.
For the SCS loss method, each basin requires a Curve Number and Percent Imperviousness. If the
percent impervious value differs from 0, that percentage of the land area is assumed to have no
losses and the loss method is applied only to the remainder of the drainage area. The value of the
Curve Number depends on the land use and soil types in the area.
Similarly, select the Transform tab to look at the parameters for the transform method. The time of
concentration Tc can be calculated using the Kirpich equation, which employs the average basin
slope and the length of the principal waterway. Fill in the storage coefficient.
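A quick sketch of the Kirpich estimate of Tc (metric form; the constant 0.0195 is the commonly quoted metric coefficient):

```python
def kirpich_tc(length_m, slope):
    """Time of concentration (hours) by the Kirpich formula, metric form:
    Tc [min] = 0.0195 * L**0.77 * S**-0.385, where L is the length of the
    principal waterway (m) and S the average slope (m/m)."""
    return 0.0195 * length_m ** 0.77 * slope ** -0.385 / 60.0
```

For example, a 1 km waterway with a 1% average slope gives a Tc of roughly 0.4 hours.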

Similarly select the Baseflow and provide mean monthly base flow values. To provide suitable
mean monthly base flow values, you can look at the discharge data of the station downstream and
give a rough estimate of contribution of mean monthly base flows.
After the sub-basin element, let’s look at a reach element. Click on any reach (e.g., R350 in this
case), and look at its parameters in the component editor.

Since the reach element routes flows, only one method (routing) is associated with it. Click on the
drop-down menu to look at choices available for routing flows. The Muskingum-Cunge method is
specified here, which is the routing technique used for the reaches in this model. The Muskingum-
Cunge routing method is based on conservation of mass and momentum equations. This method
is suitable to represent attenuation and translation of flood waves for river reaches with a small
slope. Select the Routing tab to look at the parameters for the routing method (Muskingum-Cunge).
The length and slope of the reach are estimated during preprocessing with GeoHMS. These values
are retrieved from the attribute table of River layer. Fill in Manning’s n, bottom width and side
slope.
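To illustrate the routing step, the sketch below implements the classic Muskingum recursion; Muskingum-Cunge, as used in this model, additionally derives the coefficients at each time step from the reach length, slope, Manning's n and cross-section geometry rather than taking K and X as fixed inputs:

```python
def muskingum_route(inflow, k, x, dt):
    """Route an inflow hydrograph through a reach with the classic Muskingum
    recursion O2 = C0*I2 + C1*I1 + C2*O1 (K and dt in the same time units).
    This is a simplification: Muskingum-Cunge recomputes the coefficients
    from channel properties at each step instead of using constant K, X."""
    denom = 2.0 * k * (1.0 - x) + dt
    c0 = (dt - 2.0 * k * x) / denom
    c1 = (dt + 2.0 * k * x) / denom
    c2 = (2.0 * k * (1.0 - x) - dt) / denom
    outflow = [inflow[0]]                 # assume initial outflow equals inflow
    for i in range(1, len(inflow)):
        outflow.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[-1])
    return outflow
```

Routing a flood pulse through the reach attenuates and delays its peak, which is exactly the behavior the text describes for mild-slope rivers.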
Similarly, for other subbasins and reaches and remaining hydrological elements (if any), fill in
appropriate values for transform, losses, base flows and routing parameters. You can look at the
parameters for all hydrologic elements by selecting Parameters in the menu bar and choosing a
method. For example, selecting Parameters  Canopy  Simple Canopy gives a list of
parameters for all the sub-basins in the model as shown below:

Similarly, fill in all parameters for Surface, Transform, Loss, Baseflow and Routing methods as
below.
Meteorologic Model
Meteorologic models provide meteorologic boundary conditions for the subbasins such as
precipitation, evapotranspiration and snowmelt. A new meteorologic model can be created
using the Meteorologic Model Manager in the Components menu. In this exercise, a
meteorologic model named ChindwinHMS has already been created, generated from HEC-GeoHMS as below.

Specify the precipitation method and the evapotranspiration method, and in the Replace Missing
tab select Set to Default, so that the model can still run if any rainfall values are missing.
Precipitation Method
There are different precipitation methods such as gage weights, inverse distance, specified
hyetograph, gridded precipitation etc. Selection of the method depends upon the purpose of the
model. In this exercise, we will select Specified Hyetograph method. We will import subbasin
average precipitation data computed externally to the program. Let’s create Precipitation Gages
for each subbasin.
Click on Components  Time-Series Data Manager, and create new Precipitation Gages
called W180, W330 and so on. Close the Time-Series Data Manager.
Associate each gage with its corresponding subbasin by clicking on Specified Hyetograph in the
meteorologic model, as below.

Next, in the Watershed Explorer, expand the Time Series Data folder to provide data for the rain
gauge. We will use observed subbasin rainfall data stored in DSS file to populate the information
for the rain gauge. This is the same DSS file we generated in Chapter 3.
Go to Time window and specify the time series.

You can view the rainfall data for this subbasin through the Table and Graph tabs, where the
subbasin data is tabulated and plotted respectively.
Similarly, provide data for all subbasin mean rainfall gages.
Evapotranspiration Method
Evapotranspiration is responsible for returning about 50-60% of precipitation back to the
atmosphere. Hence, it is an important component for continuous modeling. It is required to
compute the potential evapotranspiration over land surface. For event modeling, it may be omitted.
The Monthly Average method is a simple way to represent evapotranspiration if pan evaporation
data are available. However, it can also be used with potential evapotranspiration computed from
other climate data. A pan coefficient for each month is required if pan evaporation data are used.
FAO-derived monthly evapotranspiration data can also be used, as extracted earlier in
Chapter 1. Enter the monthly evapotranspiration data for each subbasin (e.g., subbasin W180 below).
Similarly, provide this data for the other subbasins as well.
We will not consider snowmelt as there is no snowmelt in Chindwin basin. This completes the
meteorologic model. After the meteorologic data are provided, the model is ready for simulation.
One final step before executing the model is to specify the time step information and the duration
of the simulation. This is done by using the Control Specifications Manager.

Control Specifications
We will create two control specifications: Calibration for the calibration run and Validation
for the validation run.
Select Components  Control Specifications Manager. Delete the existing Run1_control. Select
New in the control specifications manager and type the following information:

Click Create. This will add Calibration control to Control Specifications folder in the watershed
explorer. Similarly, Select New in the control specifications manager and type the following
information:
Click Create. This will add Validation control to Control Specifications folder in the watershed
explorer. Close the control specifications manager.
To see the control specifications file, expand the folder and select Calibration. This will open
the control specifications tab in the component editor. Specify the duration of the simulation in
date and time, and the time interval of the calculations as shown below.

Similarly, select Validation and specify the duration of the simulation in date and time, and the
time interval of the calculations as shown below.

Executing HMS Model


Before creating simulation run for calibration and validation, we must assign observed discharge,
water level and rating table data for the junctions having gaging site and at the outlet. Let’s create
Discharge Gages and Stage Gages for the junctions renamed as Hkamti, Homalin, Kalewa, Minkin,
Mawlaik and Monywa.
Click on ComponentsTime Series Data Manager, and create new Discharge Gages called
Hkamti, Homalin, Kalewa, Minkin, Mawlaik and Monywa. Close the Time-Series Data Manager.
Similarly, Click on ComponentsTime Series Data Manager, and create new Stage Gages
called Hkamti, Homalin, Kalewa, Minkin, Mawlaik and Monywa. Close the Time-Series Data
Manager.
In the watershed explorer, under Time-Series Data, you will see 6 discharge gages and 6 stage
gages added.

We will use observed discharge and stage data stored in DSS file to populate the information for
the gauging sites. The data in this case has been stored in the same file Chindwin_flows.DSS.
Click one of the gauging sites (e.g. Hkamti). Assign Data Source, DSS Filename and DSS
Pathname for both discharge and stage as follows:
Populate the information for Time Window as below.

You can check the data by clicking on Table and Graph.

Similarly, populate data for other gauging sites also.


Elevation-Discharge Functions (Rating Tables) are needed to convert simulated discharges into
simulated stages. Let’s create Elevation-Discharge functions for the gauging sites.
Click on ComponentsPaired Data Manager, and create new Elevation-Discharge
Function called Hkamti, Homalin, Kalewa, Minkin, Mawlaik and Monywa. Close the Paired
Data Manager.
We will use rating table data stored in DSS file to populate the information for the gauging sites.
Assign the Data source, DSS filename and pathname for this accordingly. You can check the data
by clicking on Table and Graph.

Click Monywa (Outlet/Junction name, which we renamed earlier) in the watershed explorer. Click
Options and assign Observed Flow, Observed Stage and Elev-Discharge data. Similarly, assign
data for other gauging sites also.
Finally, we have finished entering the data needed to build the Chindwin HMS model. The
last step is to run the model. Select Compute  Create Simulation Run. Accept the default name
for the run (Run 1), click Next through all the steps, selecting Calibration as the control, and
finally click Finish to complete the setup. Now, to run the model, select Compute  Select
Run  Run 1, and then go to Compute  Compute Run [Run 1] to see the following window
(alternatively, you can click the compute run tool in the tool bar):

Click Close. You will see entries in the message log as the program executes the model. If there
are errors in the model, they will be shown in red. For this model, there are no errors. You can
create a new run (Run 2) for validation by going to Compute  Create Simulation Run and
selecting the Validation control while creating Run 2.

Viewing HMS Results


HMS allows you to view results in tabular or graphical form. To view a global results table,
select Results  Global Summary Table (alternatively, you can click the Global Summary tool
in the tool bar). You will get a window like the one shown below, which summarizes the peak
discharge and its time, the total volume of storm runoff, and the drainage area from which it came.
In addition to viewing global results, you may also view results for each element within the model.
Again, there are a couple of options to do this, and each option provides output in different ways.
One option is to use the watershed explorer and component editor tab. To view results, you select
the Results tab in the watershed explorer, expand the Simulation Runs folder, and expand Run 1.
To see results for any element, expand that element as seen below:

To see the graph of outflow and observed flow from any element, you can select Graph and
see the hydrograph as shown above.
Similarly, you can look at other graphs in the component editor by selecting the variable in the
watershed explorer. You can select a reach element and see the attenuation in the inflow and
outflow hydrograph by selecting the combined inflow and outflow option in the watershed
explorer. Each element also has a summary option that gives the results from the global summary
table (a single row of the table) for that element.
If you click on a sub-basin just upstream of the outlet, you see the rainfall at the top and the runoff
at the bottom as shown below:
Unlike a single graph in the component editor, you get to see all graphs (input precipitation,
outflow hydrograph, baseflow, precipitation losses) in a single window using this option. You can
also see the results in tabular form by using the view time series table tool in the toolbar. These
functions are also accessed through the Results menu on the menu bar.
If you click the Summary Table of any element that has observed discharge data, you will see the
summary of results and the model efficiency in terms of the Nash-Sutcliffe coefficient, as follows:

Results can also be accessed from the basin model map. Move the mouse on top of the basin model
element and click the right mouse button. In the popup menu, select the View Results option and
choose Graph, Summary Table or Time-Series Table.
Model Calibration
The model-simulated discharges and observed discharges show a significant amount of bias, or
difference. This is because the various watershed parameters have not yet been estimated correctly.
We need to find appropriate values for the parameters that influence the hydrology of the watershed.
This process is called model calibration. There are three ways of optimizing model parameters.
Manual Calibration with Trial and Error:
Assign parameter values, run the model, compare the simulated and observed hydrographs, and
check performance indicators (NSE, PBIAS, etc.). If the performance is satisfactory, accept the
parameters; otherwise repeat the process.
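The Nash-Sutcliffe efficiency used as a performance indicator can be computed as below (a plain-Python illustration; PBIAS is simply the percentage difference between the total observed and simulated volumes):

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is no
    better than the mean of the observations, negative is worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - err / var
```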
Automatic Calibration with Optimization Trials:
The optimization process begins with initial parameter estimates and adjusts them so that the
simulated flows match the observed flows as closely as possible. Most parameters for methods
included in subbasin and reach elements can be estimated automatically using optimization trials.
Observed discharge must be available for at least one element before optimization can begin.
Parameters at any element upstream of the observed flow location can be estimated. Seven
different objective functions are available to estimate the goodness-of-fit between the computed
results and observed discharge. Two different search methods can be used to minimize the
objective function. Constraints can be imposed to restrict the parameter space of the search method.
This could be done by creating Optimization Trial in HEC-HMS as follows.
STEP 1: Click Compute  Create Optimization Trial
STEP 2: Provide Name or accept default, Click Next
STEP 3: Choose the correct Basin Model and Click Next
STEP 4: Choose the correct element (subbasin or reach) for which parameters need optimization
and Click Next
STEP 5: Choose the correct Meteorologic Model and Click Finish
STEP 6: Under the Compute tab, expand the Optimization Trials tree
STEP 7: Expand the Trial tree and click on Correct_Hkamti; fill in Start Date, Start Time, End
Date and End Time, and select the correct Time Interval as below. We will use the
calibration period to optimize the model parameters.

STEP 8: Click on Objective Function and select the Method (Nash-Sutcliffe), Location (Hkamti in
this case) and Missing Flow (10%) as below.

STEP 9: Right click on the Correct_Hkamti you just created and click Add Parameter (repeat this
step until you have as many parameters as needed)
STEP 10: Click on Parameter 1
STEP 11: Under the Parameter 1 tab, choose your basin under the element drop down menu.
STEP 12: Under the Parameter 1 tab, choose the parameter you want to optimize under the
Parameter drop down menu
STEP 13: Under the Parameter 1 tab, choose “No” under the Locked drop down menu, and choose
appropriate values for the Initial Value, Minimum, and Maximum
STEP 14: Repeat Steps 10-13 for all the parameters added
STEP 15: Right click on the trial Correct_Hkamti and click Compute. When the computation is
finished, Click Close.
STEP 16: Click the Results tab
STEP 17: Expand the Optimization Trials tree
STEP 18: Expand the Correct_Hkamti tree and select Hydrograph comparison

STEP 19: Click on Objective Function to view the improvement in the value of the objective
function (Nash-Sutcliffe in this case).
(Note the improvement in the NSE value for Hkamti: it has gone up from -3.1 to -2.65 within 20
iterations.)
STEP 20: Click Optimized parameters to see the values set before and after optimization. You can
see the sensitivity of parameters as well.

(Note that the storage coefficient is negatively sensitive with respect to the objective function.)
STEP 21: Check the results for the watershed and junction elements. Repeat the process to get the
best parameter values.
Calibration Aids:
Elements with observed data in the basin model can be designated as computation points so that
manual calibration can be done using slider bars to adjust parameter values upstream of the
computation points.
There are two ways to select an element as computation point. The first way is to click on the
element in the Basin Model Map window using the right mouse button and choose the Select
Computation Point command. An element selected as a computation point shows a small red
circle added to the icon in the Basin Model Map window and in the Watershed Explorer.

The second way to select an element as a computation point is by using the Computation Point
Manager window. Click on the Parameters  Computation Point Manager command. The
Manager shows all the elements that have been selected as computation points. Press Select
Elements button, click on the row in the table and press the Select button.
Press the Close button when all elements are selected. Similarly, you can unselect an element
that is no longer a computation point. Next, select a computation point and press the Parameters
button. Click on an element in the list on the left side of the window, then select parameters on
the right side of the window by holding the Control key and clicking on several parameters. Press
the Select button. When you are finished selecting the parameters, press the Close button to return
to the Computation Point Manager window. Press the Parameter Settings button, specify the
minimum and maximum values, then press Apply and Close.
Press the Results button to configure the results. Select the Element and Time-Series and press
Select button.

Right-click on a computation point in the map and select the Calibration Aids command. The
command can only be selected when there is a current simulation run. Selecting the command will
open the customizable editor and the customized result graphs.
Adjust a parameter value for an element using the slider bar for that parameter. The results for the
computation point and all elements upstream of it are recomputed immediately after changing a
parameter value with a slider bar. The result graphs will automatically update to reflect changes.
Press Apply button and Close.

Right-click on a computation point in the map and select the View Results command. Then select
Graph, Summary Table or Time-Series Table to view results.
8. ARIMA Error Correction
ARIMA error correction model
 Auto-Regressive Integrated Moving Average model
 Also known as the "Box and Jenkins model"
 The I in ARIMA stands for "Integrated": the series is differenced to make it stationary.
AR means "Auto-Regressive": lagged values of the stationarized series enter the forecast
equation. MA means "Moving Average": lagged forecast errors enter the forecast equation.

Construction of an ARIMA model


 First, stationarize the series by any means (differencing, logging, deflating, etc.)
 Study the patterns of autocorrelation and partial autocorrelation to determine whether lags of
the stationarized series and lags of the forecast errors should be included in the forecast equation
 Fit the suggested model and check the residuals (ACF and PACF plots) to see whether all
coefficients are significant and all patterns are explained
 Additional auto-regressive or moving-average terms may be needed
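The differencing and autocorrelation-inspection steps above can be sketched as follows (illustrative plain Python; the provided R script handles all of this automatically):

```python
def difference(series, d=1):
    """Difference a series d times to help make it stationary (the 'I' step)."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

def acf(series, max_lag):
    """Sample autocorrelation at lags 1..max_lag.  Strong positive values at
    low lags suggest AR terms; negative values suggest MA terms."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    return [sum((series[i] - mean) * (series[i - lag] - mean)
                for i in range(lag, n)) / var
            for lag in range(1, max_lag + 1)]
```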

Provided R script for ARIMA


 ARIMA is a statistical error-correction model that is applied to the model-simulated
discharge to improve the accuracy of the simulated and forecasted discharge.
 It is an R-based script, ready to run.
 Provide the input data in the predefined format and change the name and location of the
input data in the script; the script will then compute the correction and apply it to the
simulated discharge.
Steps:
1. Prepare the input files and folder location.
 Create a new folder named after the station (e.g., Hkamti in this case).
 Inside the folder, create StationnameQfit.csv (e.g., HkamtiQfit.csv), in which the
model-simulated discharge and observed discharge for the station are arranged as a time series,
as shown below:
Date Qout Qobs
1/1/2000 97 345
2/1/2000 85.9 349
3/1/2000 85 344.1
4/1/2000 85 339.2
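The manual's correction scripts are in R; purely for illustration, a Python sketch of assembling a StationnameQfit.csv file in the expected layout (file name and values are illustrative) could look like:

```python
# Assemble a Date,Qout,Qobs file in the layout the R script expects.
import csv, os, tempfile

def write_qfit(path, dates, qout, qobs):
    """Write a header row followed by one Date,Qout,Qobs row per time step."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Date", "Qout", "Qobs"])
        writer.writerows(zip(dates, qout, qobs))

# build a tiny example file and read it back
path = os.path.join(tempfile.mkdtemp(), "HkamtiQfit.csv")
write_qfit(path, ["1/1/2000", "2/1/2000"], [97, 85.9], [345, 349])
with open(path, newline="") as f:
    rows = list(csv.reader(f))
```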
2. Open the R script (FitModelCorrection.R). (Note: You may need to install libraries specified
in the script, if not previously installed).

3. Change the name of the station (in this case, the name of the folder Hkamti) and the working
directory inside the script.

4. Run the script; CTRL+ALT+R can be used to run all the commands at once.

5. This process splits the data into two periods, one for model calibration and one for validation.
The time series of model and observed data are then compared. The Autocorrelation Function
(ACF) shows the correlation of the series with itself at different lags, and the Partial
Autocorrelation Function (PACF) shows the amount of correlation at any lag that is not
explained by lower-order correlations.
A positive autocorrelation has an "AR" signature and a negative autocorrelation has an "MA"
signature. If there is already a zero or negative autocorrelation at lag 1, there is no need to
difference again. In this case, the script itself takes care of everything.

This generates an error model which serves as a workspace for the FitModelCorrection.R model
to fit the simulated and forecasted data to the observed data series. (Note: errorMod.RData
is a workspace which is essential to correct the forecast data.)
The effectiveness of the correction during the calibration and validation periods (adjustable)
can be inspected through goodness-of-fit statistics. In this case, the goodness of fit is
calculated and displayed in the same plot, saved as stationnameFitGOF.png.

6. To obtain a time series of corrected forecasted discharge data, generate a new csv file named
StationnameQfor.csv (e.g., HkamtiQfor.csv in this case) and arrange the data in the same fashion
as StationnameQfit.csv. You also need to provide a rating table file, HkamtiRT.csv, in
the same folder.
 Open the script CorrectModel.R and change the station name and working directory, the
same as for FitModelCorrection.R.
 Run the script; a new file named StationnameQHfcst2.csv is generated, in which the time
series of corrected discharge is recorded.

7. For correction of forecasted discharge, follow the same steps as in step 6, keeping the time
series of forecasted discharge in the Qout column and NA in the Qobs column where observations
are unavailable, as below:
Date,Qout,Qobs
01/12/2013,52.57371191,495
02/12/2013,49.2749692,498
03/12/2013,46.18320643,473
04/12/2013,43.28543661,473
05/12/2013,40.56948765,462
06/12/2013,38.02395117,457
07/12/2013,35.63813463,445
08/12/2013,33.40201638,450
09/12/2013,31.30620358,453
10/12/2013,29.34189277,440
11/12/2013,27.50083283,428
12/12/2013,25.77529038,423
13/12/2013,24.15801726,420
14/12/2013,22.6422201,403
15/12/2013,21.22153178,387
16/12/2013,19.88998468,398
17/12/2013,18.64198565,392
18/12/2013,17.47229243,393
19/12/2013,16.37599173,402
20/12/2013,15.34847852,364
21/12/2013,14.38543673,387
22/12/2013,13.48282109,374
23/12/2013,12.63684016,388
24/12/2013,11.84394039,348
25/12/2013,11.10079119,349
26/12/2013,10.40427095,345
27/12/2013,9.751453954,344
28/12/2013,9.139598019,347
29/12/2013,8.566133046, NA
30/12/2013,8.028650188, NA
31/12/2013,7.524891745, NA

 The rating table file should be like:


H Q
1 0
1.1 15
1.2 25
1.3 35
1.4 45
1.5 55
 Then run the same script CorrectModel.R following the procedure of step 6; the corrected
forecast discharge can be obtained from the resulting file StationnameQHfcst2.csv.
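The rating table is used to convert discharge to stage by interpolation; a minimal sketch of that lookup, using the example table above, is:

```python
def stage_from_discharge(q, table):
    """Convert a discharge to a stage by linear interpolation in a rating
    table given as (stage, discharge) pairs sorted by discharge; values
    outside the table are clamped to its ends."""
    if q <= table[0][1]:
        return table[0][0]
    for (h1, q1), (h2, q2) in zip(table, table[1:]):
        if q1 <= q <= q2:
            return h1 + (h2 - h1) * (q - q1) / (q2 - q1)
    return table[-1][0]

# the example rating table from the text, as (H, Q) pairs
rating = [(1.0, 0.0), (1.1, 15.0), (1.2, 25.0),
          (1.3, 35.0), (1.4, 45.0), (1.5, 55.0)]
```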
9. Hydraulic modeling
The basic purpose of a hydraulic model is to provide information on the width and depth of
inundation for a given magnitude of flow/flood in a river. HEC-River Analysis System (HEC-RAS),
like HEC-HMS, is free software; it performs the inundation mapping for steady flows (flow
conditions such as velocity, pressure and cross-section may change with space, but not with time)
and unsteady flows (the flow conditions change with time as well).
The objective of this part of modeling is to learn the use of HEC-GeoRAS (similar GIS platform
preprocessing tool as HEC-GeoHMS for HEC-HMS) to export the river/reach system to HEC-
RAS, run the steady/unsteady conditions of flow and import them again for inundation mapping.
(Note: for hydrological modeling, a 90 m or 30 m resolution DEM is sufficient for the
hydrological analysis. For hydraulic modeling, however, we need a high-resolution DEM
(5 m or 10 m resolution).)

Requirements:
Software:
Your computer must have a Windows operating system with ArcGIS or ArcMap (10.x) installed,
along with matching versions of HEC-GeoRAS and HEC-RAS. You can download the latter
two from the HEC official website http://www.hec.usace.army.mil/software/hec-ras/downloads.aspx.
For this tutorial, we have selected HEC-RAS version 4.1. The HEC-GeoRAS version should be
compatible with the GIS platform you have: if you have ArcMap 10.1, HEC-GeoRAS 10.1 must be installed.
You may download it from http://www.hec.usace.army.mil/software/hec-georas/downloads.aspx.
Install HEC-GeoRAS and HEC-RAS on your system first before proceeding with any other steps.

Data requirement:
The essential data to run HEC-GeoRAS is a DEM, from which a Triangulated Irregular Network
(TIN, or terrain data) can be obtained. The other river components are defined within HEC-GeoRAS.
Land use information is also required; it can be downloaded from FAO or other sources such as
the ESA land cover map at http://due.esrin.esa.int/page_globcover.php.
If available, aerial photographs of the area of interest are useful for identifying features, and
cross-section survey data can be used to correct or validate the cross-sections derived from the
TIN data. Information regarding bridges and existing structures along the river will also help to
set up the model correctly.
For this example, we will use a 30m DEM acquired from USGS (ASTER DEM). (It is provided
to you along with the training materials.)
Geo-RAS PreProcessing
Open a new window of ArcMap and load the required DEM (new 30m resolution DEM), basin
shape file of Chindwin and Chindwin river (Subbasin155.shp and River155.shp) and Hydrological
stations (Hydro_chindwin_proj.shp).
Save the document as georas_chin.mxd. You need to activate the HEC-GeoRAS toolbar (just like
for HEC-GeoHMS): right-click on the menu bar and click HEC-GeoRAS in the list of options.
The bar looks like this:

If some options in this window are not active, the reason might be that you have not enabled the
required extensions. Go to Customize and then Extensions. Enable the 3D Analyst and
Spatial Analyst extensions by checking their boxes, then close the dialog.
Notice that HEC-GeoRAS has four menu options: RAS Geometry, RAS Mapping, ApUtilities
and Help. There are also buttons for Assign ReachCode, Assign FromStation/ToStation, Assign
LineType, Construct XS Cutlines, Plot Cross Section and Assign Levee Elevation.
The RAS Geometry menu contains the preprocessing part of HEC-RAS, while RAS Mapping is a
post-processing step used to generate an inundation map. The ApUtilities menu contains data
management options, and the Help option provides help for each section.
Steps:
Before starting any HEC-GeoRAS steps, we need to ensure that the DEM we are using has no
voids, so we fill it first, as we did in Chapter 1.
Now, notice that there are patches of the DEM which are not required (unlike hydrological
modeling, we only need the DEM around the river and its buffer zone). Making the DEM smaller
saves computing resources, so we mask the DEM. (We will carry out the river analysis for the
reach from below Minkin down to the confluence of the Ayerawaddy and Chindwin rivers, so the
DEM below the confluence and other unnecessary areas can be masked with a proper shape file.)

Masking DEM:
Before clipping the DEM, we need a shape file which will cover the river to be analysed along
with its potential flood plain. For that, we will create a new shape file and digitize it around our
area of interest.
To create a new shape file, go to Arc Catalog, Right Click any folder location and click New,
Shapefile. Specify that it is a polygon and its coordinate system. Click ok.
Right Click the shape file added to the document, Click Edit Features and Start Editing. Click
on Create Features on Editor tool bar. Start digitizing.
Click Editor, Save Edits and Stop Editing.
Go to Spatial Analyst Tools in ArcToolBox and click Extraction. Then, click Extract by Mask.

Select the DEM as the input raster and the shapefile we just created as the masking feature. Go to
Environments, click Processing Extent and set the extent to the mask layer. Save the output raster
in your destination folder.
(Note: the smaller the area, the faster the processing and the lower the chance of errors. In our
case Minkin is the upstream boundary condition, so we can eliminate the DEM upstream of
Minkin and downstream of the confluence. The resulting DEM will accordingly be smaller, say
ras_dem.tif.)
1. Generating a TIN
A TIN (Triangulated Irregular Network) represents a surface as a set of contiguous, non-overlapping
triangles; within each triangle, the surface is represented by a plane. Unlike gridded data, a TIN
can describe a surface at varying levels of resolution.
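To make the "one plane per triangle" idea concrete, the sketch below interpolates an elevation inside a single triangle from its three vertices using barycentric weights (a generic illustration in Python, not part of the ArcGIS workflow; the vertex coordinates are made up):

```python
def tin_interpolate(p, a, b, c):
    """Interpolate elevation at point p inside triangle abc.
    a, b, c are (x, y, z) vertices; p is (x, y).
    Barycentric weights give the plane through the three vertices."""
    (xa, ya, za), (xb, yb, zb), (xc, yc, zc) = a, b, c
    x, y = p
    det = (yb - yc) * (xa - xc) + (xc - xb) * (ya - yc)
    wa = ((yb - yc) * (x - xc) + (xc - xb) * (y - yc)) / det
    wb = ((yc - ya) * (x - xc) + (xa - xc) * (y - yc)) / det
    wc = 1.0 - wa - wb
    return wa * za + wb * zb + wc * zc

# At the centroid the three weights are equal, so the result is the
# mean of the vertex elevations
print(tin_interpolate((1.0, 1.0), (0, 0, 90.0), (3, 0, 96.0), (0, 3, 99.0)))
```

At the centroid of the triangle the interpolated elevation is simply the average of the three vertex elevations, which is a quick sanity check on any TIN interpolation.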
Raster To TIN:
Go to 3D Analyst Tools, Click on Conversion, From Raster and Raster To TIN.
Specify the Input Raster and the Output TIN location and name. The other fields can be left as they are.

Click OK. The output will be added automatically to the ArcMap window. If the resulting map
does not show distinct elevations, or a notification says that the maximum number of points has
been reached, you need to increase the maximum number of points.
To distinguish elevations further, you can either run the process again with a larger Maximum
Number of Points, or change the classification bands and colors. Note that the first option entails
a much more detailed analysis and will take considerable time, so we can try the second option.
Go to the Properties of the TIN layer, which can be accessed by right-clicking the TIN file. Go to
Symbology and uncheck Edge Types.

You can change the number of Classes and the break interval through Classify.
Check the Elevation and Edge Types boxes again, Apply the classification and press OK.
(Note: it will take some time to appear, as the elevations are now drawn in distinctly different
colors. If the TIN takes very long to display, or does not display properly, we can turn the TIN
layer off and digitize using the DEM itself; what matters is a proper identification of the river
network and its floodways.)
2. Setting Up Analysis Environment
(Before proceeding, one thing you must take care is the consistency of projection system.)
Go to RAS Geometry, click Layer Setup, set the Required Surface to Single, select TIN as the
Terrain Type and specify the tin. Click OK.
3. Creating RAS Layers
The geometry files in HEC-RAS contain information on river features such as reaches, river banks,
cross-sections, bridges and levees. All this information is created and populated during the
preprocessing stage of HEC-GeoRAS and stored as feature classes, so we need to create blank
feature classes for each river feature. There are two approaches to creating them.
Option 1: You may go to RAS Geometry, Create RAS Layers and Click ALL. This will open a
window Create All Layers. Accept the default names and Click OK.
Option 2: For convenience, we will add the features one at a time, so that we only have the
feature classes we will actually digitize for the selected reach. (For example, if we need the river
file, we go to RAS Geometry, Create RAS Layers and click Stream Centerline.) This way, only
one file is added to the data frame.
HEC-GeoRAS creates a geodatabase in the folder where the ArcMap document has been saved
and gives it the same name as the map document (georas_chin.mdb, in this case). All these
feature classes will be saved in this geodatabase.
(Note: up to this point, the feature classes we added contain no information, so we will edit each
feature and populate it with the proper information.)

4. Editing RAS Layers

4.1. River Centerline


A river centerline is used to represent the river reach network in HEC-RAS. We will start
digitizing the river centerline along the middle of the river, starting from Minkin (which will be
our upstream boundary). It must be digitized from upstream to downstream, i.e., in the direction
of flow. Zoom in and out for convenience.
Add the river centerline by clicking RAS Geometry, Create RAS layers and selecting Stream
Centerline. Accept the default names and note that a feature file named River will be added to
the document. To start editing, right click the River class and click Edit Features and Start
Editing. In the Editor toolbar, click on Create Features and Select Line from the Create Features
window.

Start digitizing by selecting straight segment tool on Editor and proceed in the direction of flow
along the centerline of river. Double click to terminate that reach.
(Note: if there is a junction, stop digitizing the reach at the junction and start digitizing the
tributary. Snap the tributary's end to the endpoint of the major reach; the starting point of the
lower reach will be this snapped junction. To enable snapping, open Editor and click Snapping
to open the Snapping toolbar. In ArcGIS 10.2 or higher there is no need to do this, as the straight
segment tool snaps automatically. Snapping is necessary for reaches that have junctions.)

After all reaches are digitized in a similar fashion, click Editor, Save Edits and Stop Editing.
We now need to name the reaches: each river in HEC-RAS must have a unique river name, and
each reach within a river must have a unique reach name. To assign names to the reaches, click
the Assign RiverCode/ReachCode button to activate it as shown below:

Assign the river name and reach name respectively and click OK. Now right-click the River
feature class and check the attribute table; you will notice that some of the information has been
populated.
Some other attributes are still empty. Before we proceed, let us save the document and populate
the remaining attributes of this class: click RAS Geometry, Stream Centerline Attributes and
click All.

Confirm the terrain surface and stream centerline; the stream profiles will be generated on the
basis of this information. Click OK.
Click the attribute table and check the information which is populated.

Note that a new feature class named River3D is added to the map; it is a 3D version of the reach
we just created.

4.2. River Banks


Bank lines are used to separate the main channel from the floodplain. They generally lie at a
higher elevation than the river centerline and have different properties (e.g., higher values of
Manning's n). They are created in a similar fashion to the river centerline.
Add a feature class for the bank lines just as we added the River feature.
Right-click the Banks feature class, then Edit Features and Start Editing. Create new features,
select Line and use the straight segment tool to delineate the left bank first and the right bank
later. Some general guidelines for delineating river banks:
 Start upstream and digitize towards downstream, i.e., in the direction of flow
 Digitize the left bank first and then the right bank
 Even though some continuous bank lines are possible, follow the guidelines used for the
river centerline

4.3. Flowpaths (Floodplain)


The flow path layer contains three types of lines: the centerline, the left overbank and the right
overbank. The flow path lines are used to determine the downstream reach lengths between
cross-sections in the main channel and the overbank areas. We can reuse the river centerline
created earlier as the flow path centerline: click RAS Geometry, Create RAS Layers and Flow
Path Centerlines.
Click Yes, choose the stream centerline as the existing river layer and Flowpaths as the flow path
layer, and click OK. To create the right and left flow paths, digitize them in the same fashion,
digitizing the left flow path first for each reach. Save and stop editing after you complete the left
and right flow paths.

Now, label the flow paths using the Assign LineType button: click the button, click the left flow
line and select Left as the line type in the window.
Repeat this process for the right flow path as well. You can confirm this in the attribute table of
Flowpaths; all the features should be populated.

4.4. Cross-sections
Cross-sections are among the most important inputs to HEC-RAS. GeoRAS extracts elevation
data from the provided TIN to create a ground profile across the channel. The intersections of
the cross-section cut lines with other RAS layers, such as the centerline and the flow path lines,
are used to compute HEC-RAS attributes such as bank stations (which separate the channel from
the floodplain), downstream reach lengths (the distance between cross-sections) and Manning's n.
Rules to create cross-sections:
 They should be digitized roughly perpendicular to the direction of flow
 Cross-sections cannot crisscross each other
 Each must span the entire floodplain extent
 They are always digitized from left to right (looking in the direction of flow)
 A good practice is to space the cross-sections at roughly equal distances
 Provide cross-sections immediately upstream and downstream of existing hydraulic structures
 A cross-section cannot cross the same feature twice (e.g., in a meander, be careful not to draw
a cross-section that crosses a flow path more than once)
To create cross-sections, use the same technique of digitizing, but with the rules defined above.
Right click the Cross-sections feature, Edit Features and Start Editing. Select the Line and straight
segment and start digitizing from the left side.
As discussed above, the cross-section should be wide enough to cover the floodplain. To check
whether a constructed cross-section covers the entire floodplain, click the Plot Cross Section
button on the HEC-GeoRAS toolbar and click the desired cross-section.
Check that either side extends far enough; if a cross-section is not long enough, you need to edit
it again. Normally, it can be kept four or five times wider than the channel banks.
The next step is to add the HEC-RAS attributes to these cut lines: the river/reach name, the
station number along the centerline, the bank stations and the downstream reach lengths. All this
information is extracted where the cross-sections intersect the other layers. Click RAS
Geometry, XS Cutline Attributes, All.
You will notice that an XSCutLines3D feature class is added to the document; note that this
feature class is of type PolylineZ.
(Note: check the attribute table of XSCutLines3D, and if you find any attribute unpopulated or
negative, edit those cross-sections first. Chances are the rules above were not followed
properly.)

4.5. Bridges and Culverts


After creating cross-sections, the next step is to define bridges, culverts and other structures
along the river. Aerial photographs help in digitizing the bridges and hydraulic structures
existing along the river system. Additional information about each bridge, such as its length,
width and the number and types of piers, is also required. However, in this example we have not
considered any bridges.
Ineffective flow areas and obstructions/blocked objects are also required where there are
still-water areas or buildings in the flow way. In this case, no such features exist.

4.6. Assigning Manning’s N value to the cross-sections:


Before exporting the GIS data to a HEC-RAS geometry file, we need to assign a Manning's n
value to each cross-section. For this purpose, we need a land use map and the respective n value
for each land use.
If you have a land use map generated by a local agency for your area of interest, you may use it.
If you do not have a high-resolution land use map, you can use the freely downloadable global
land cover maps for 2005 and 2009 from the GlobCover website, as discussed in Chapter 1; their
300 m resolution is the highest among freely available gridded land use maps. You can also
check the FAO website for the availability of land use or land cover maps of your area of
interest; they generally provide free land use maps in shapefile format.
(Note: if you have a gridded land use dataset, first clip the raster to your area of interest only.
Afterwards, convert the raster into a shapefile (reclassify if required). Add a field to the shapefile
and enter the Manning's n values there. The Manning's n values for different land use types can
be inferred from the following table:

Using this technique, we can generate a land use map with the required Manning's n value for
each land use class.)
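As a minimal sketch of this lookup, a dictionary can map each land use class to an n value before the field is filled in GIS. The class names and values below are illustrative, drawn from typical published ranges; use the table above and your own classification in practice:

```python
# Illustrative Manning's n values per land use class (typical literature
# ranges); replace with the values adopted for your study area.
MANNING_N = {
    "water": 0.035,
    "agriculture": 0.040,
    "grassland": 0.035,
    "shrubland": 0.050,
    "forest": 0.120,
    "urban": 0.015,
}

def n_for_landuse(lu_class, default=0.045):
    """Return Manning's n for a land use class, with a fallback default
    for classes missing from the table."""
    return MANNING_N.get(lu_class.lower(), default)

print(n_for_landuse("Forest"))   # 0.12
```

The same idea, applied as a field calculation or table join in ArcMap, is what populates the Manning's n field of the land use shapefile.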
For this example, we will use the land use map provided by UNEP; it is provided to you in the
folder “Shapefiles”. Use the land use map of Myanmar in the same projected coordinate system
(UTM 46N). (Note: all the maps and layers in the data frame must be in the same projected
coordinate system.)

Right-click the land use shapefile and check whether it has a Manning's n value for each land use
type in its attribute table. If it does not, you need to add the values: based on a descriptive field
identifying the land use type, join a Manning's n table, which can be derived from the table above
or from the U.S. Department of Transportation Federal Highway Administration, Bridges and
Structures website
https://www.fhwa.dot.gov/engineering/hydraulics/library_arc.cfm?pub_number=47&id=138 .
For this example, we have a land use shapefile with Manning's n already assigned, as shown in
the figure.
Go to RAS Geometry, Click on Manning’s n Values and Extract N Values.

Select landuse_myanmar (the land use map of Myanmar) as the Land Use layer and Manning_s
as the Manning field in the land use layer. Leave the default XS Cutlines and XS Manning Table
names. It will take some time, and the XS cut lines will be assigned Manning's n values at each
intersection of a cross-section with the land use polygons. A dialog box will appear as shown
below:

Open the Manning table and see how the values are stored.

Close the table. We are almost done with the GeoRAS preprocessing. The last step is to export
the GeoRAS data so that the river geometry can be imported into HEC-RAS for analysis. Go to
RAS Geometry and click Layer Setup. Verify the layers in each tab; make sure the Single
Terrain option is selected and the proper TIN file is set.
Click OK. Go to RAS Geometry, Export RAS Data.

After the export is complete, close the window and notice that some files are created.

The two files GIS2RAS.xml and GIS2RAS.RASImport.sdf will be used to import the geometry
into HEC-RAS. The final layers in ArcMap will look like this:
Save the map document and close it.
HEC-RAS processing
Install and open HEC-RAS 4.1.

Save the project as Chin_hecras.prj in your working folder. Click OK.

Let's import the GIS data we recently created in HEC-GeoRAS. Open the Geometric Data
editor by clicking its tool, then click File, Import Geometry Data and click GIS Format.
Browse to the file we created recently and click OK.

A new window will open in which you can specify the units. You can select US Customary
Units or SI (metric) Units; then click Next.
This leads to a window where you can select the reaches to import into HEC-RAS. Click Next.

Check all the boxes for the stream lines, as we want to import them all, and click Next.
Confirm the cross-section data as well, make sure all boxes are checked, leave the defaults and
click OK. Since we do not have any other features to import, the Geometric Data editor window
will open. Save the geometry file by clicking File, Save Geometry Data. It is always advisable to
check the information imported from HEC-GeoRAS before moving forward. For this, the
Graphical Cross Section Editor can be used: go to Tools, Graphical Cross Section Edit…

You can adjust the bank stations if they seem to need adjustment. Use the editor tools shown at
the bottom right of the figure above to change the bank stationing or Manning's n, or to delete
ground points or structures.
Generally, when cross-sections are created, they contain many redundant points, which can be
removed using the cross-section points filter. In the Geometric Data editor, click Tools, Cross
Section Points Filter.
In the Cross Section Point Filter, select the Multiple Locations tab. From the River drop-down,
select the river (only one in this case) and the reach (also only one) and click the arrow button to
select all cross-sections. Click the Minimize Area Change tab and limit the number of points per
cross-section; this method hardly changes the area of a cross-section even as points are removed.
Click OK. You will get a summary of the points removed from the filtered cross-sections; close
the summary box.
Next, click on the geometry editor and select any cross section. You should be able to see the cross
section in the window along with bank stations, manning’s n for the cross-section and other
information.

Once you are happy with this, save the geometry file: go to File, Save Geometry Data, fill in the
information in the window and save it by clicking OK.
(Note: sometimes, even though you selected SI units, values may still be shown in US customary
units; you can always go to Options, Unit System to change it to SI.)
Flow data and Boundary Conditions:
Flows are defined at the upstream end of each reach/river and at junctions. Sometimes,
depending on the situation, we must define additional flow change locations. In this example,
Minkin will be the upstream boundary condition and Monywa will be a flow change location.
Let us create some hypothetical profiles; a profile is just a flow value in HEC-RAS. We can
choose the 5-, 10-, 25-, 50- or 100-year return period flood depending on the availability of
long-term maximum discharge data. Assume Minkin's 20-year flood is 21,000 cumecs and
Monywa's is 25,000 cumecs (just a guess: in practice you must compute it by fitting the best
probability distribution function). We will now run the model for a steady simulation using
these flow values.
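As a sketch of how such a design flood could be computed, the snippet below fits a Gumbel (EV1) distribution to an annual maximum series by the method of moments. The distribution choice and the flow values are illustrative assumptions; in practice several candidate distributions should be compared and the best fit selected:

```python
import math
from statistics import mean, stdev

def gumbel_flood(annual_maxima, return_period):
    """Return-period flood from a Gumbel (EV1) fit by the method of moments."""
    xbar, s = mean(annual_maxima), stdev(annual_maxima)
    alpha = math.sqrt(6.0) * s / math.pi        # scale parameter
    u = xbar - 0.5772 * alpha                   # location (Euler's constant)
    p = 1.0 - 1.0 / return_period               # non-exceedance probability
    return u - alpha * math.log(-math.log(p))

# Hypothetical annual maximum discharges (cumecs), for illustration only
flows = [9800, 12500, 11000, 15200, 13400, 10100, 14800, 12900, 16000, 11700]
print(round(gumbel_flood(flows, 20)))
```

The 20-year estimate from a series like this would replace the guessed values above; a longer record gives a more reliable fit.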

Go to View/Edit Steady Flow Data and a window like this will open. You can enter as many
profiles as you like, but in this example let us proceed with just one profile.

This flow condition is the upstream condition. Click Add a Flow Change Location and select
the cross-section next to the Monywa station.
To define the downstream boundary, click Reach Boundary Conditions, select Downstream,
click Normal Depth and assign the downstream slope. You can provide a rating curve or a
known water surface when you have real data and you know the upstream flow and the
corresponding downstream water surface.
For now, assign some value for the downstream slope. (Note that getting the slope right is an
iterative process.)

Click OK and Save the flow data.

Close the Flow window.


Now we will run the model. Go to Run, Steady Flow Analysis; a Steady Flow Analysis window
will open. Set the Flow Regime to Subcritical. You can save the plan.
Click Compute; a series of messages will appear, ending with the completion of the steady
analysis as shown below. Save it.
Close the window. Now we need to map the results back in GeoRAS: go to File, Export GIS
Data in the main HEC-RAS window.
Select all profiles (only one in this case). Click Export Data and notice that it creates
Chin_hecras.RASexport.sdf in the defined folder.
Geo-RAS post processing

Open ArcMap (if it was closed earlier) and click the Import RAS SDF File button, which
converts the SDF file into an XML file. In the Convert RAS Export SDF to XML dialog, browse
to the .sdf file we just created and convert it to the readable XML format for post-processing.

Now click RAS Mapping, Layer Setup to open the post-processing menu as shown below.
Give the new analysis a name and specify the TIN file and output directory. Specify the .xml file
we just created. Click OK, then go to RAS Mapping and click Import RAS Data. This results in
a series of messages, as when we exported from GIS to RAS during preprocessing, and creates a
bounding polygon which defines the analysis extent for flood mapping, delimited by the end
points of the cross-section lines.

Now that the extent of analysis is defined, we are ready to map the inundation extent. Click
RAS Mapping, Inundation Mapping and Water Surface Generation. This generates a TIN of
the water surface elevation.

Click RAS Mapping, Inundation Mapping, Floodplain Delineation Using Rasters. Select
PF1 and click OK.
Several messages are generated during this process: the water surface TIN is first converted into
a grid, and the DEM grid of the Chindwin is then subtracted from it. A negative value means the
area is not inundated (the ground is higher than the water surface); a positive value is the depth
of inundation.
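This subtraction can be pictured with a toy grid in pure Python (a stand-in illustration of the grid operation GeoRAS performs; the numbers are made up):

```python
def inundation_depth(wse, dem, nodata=-9999.0):
    """Depth grid = water surface elevation minus ground elevation.
    Cells where the ground is at or above the water surface are dry
    and flagged with the nodata value."""
    return [[w - d if w - d > 0 else nodata for w, d in zip(wrow, drow)]
            for wrow, drow in zip(wse, dem)]

wse = [[101.0, 101.0], [101.0, 101.0]]   # water surface grid
dem = [[100.0, 102.5], [99.0, 101.0]]    # ground grid
print(inundation_depth(wse, dem))        # [[1.0, -9999.0], [2.0, -9999.0]]
```

The real grids have millions of cells, but the per-cell rule is exactly this: positive difference is depth, non-positive difference is dry ground.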

You can export this inundation map or use background base maps to get an idea which locations
are going to get flooded.
10. Decision Support System
Linux version of HEC-HMS
Download Link - http://www.hec.usace.army.mil/software/hec-hms/downloads.aspx
File - hec-hms41-linux.tar.gz
1. Installation
Extract the file using the command:
tar xvzf hec-hms41-linux.tar.gz

Hec-Dssvue
Download Link - http://www.hec.usace.army.mil/software/hec-dssvue/downloads.aspx
File - hec-dssvue201-linux.bin.zip
1. Installation
Extract the file using the command:
unzip hec-dssvue201-linux.bin.zip
Then run the hec-dssvue201.bin file:
./hec-dssvue201.bin
Automated process
WRF data extraction and bias correction
The process is automated with scripts written in Python. The Java libraries for HEC-HMS and
HEC-DSSVue must be included for the scripts to run.
1. Get subbasin-wise WRF forecast data for 3 days from
http://www.rimes.int/services/10DAYSFCST/BASIN_DATA/MYANMAR/
For the Ayeyarwady basin, append AYE, i.e. /MYANMAR/AYE; for the Sittaung basin, append
SIT.
The forecast data sit inside dated folders: for example, the folder Apr0117 contains data for 2
April, 3 April and 4 April.
The data are generated subbasin-wise and are available by subbasin name, such as W590 or
W700.
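A small helper can assemble the download URL for one subbasin file from the pieces above. The trailing "&lt;subbasin&gt;.txt" file name is an assumption for illustration; check the actual listing on the server:

```python
BASE = "http://www.rimes.int/services/10DAYSFCST/BASIN_DATA/MYANMAR"

def forecast_url(basin_code, issue_folder, subbasin):
    """Build the URL of one subbasin forecast file.
    basin_code: 'AYE' or 'SIT'; issue_folder: e.g. 'Apr0117';
    subbasin: e.g. 'W590'. The '<subbasin>.txt' file name is an
    assumption; verify it against the server listing."""
    return "{}/{}/{}/{}.txt".format(BASE, basin_code, issue_folder, subbasin)

print(forecast_url("AYE", "Apr0117", "W590"))
```

Looping this over all subbasin names and dated folders is what the automated download script does.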

2. Bias Correction Rainfall


Bias correction is done with an R script that uses the qmap library to find the best-fit
bias-corrected rainfall. The script calibrates and validates against the subbasin-wise data and
outputs the best-fit bias-corrected result.
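The production script relies on the R qmap package; the core idea, empirical quantile mapping, can be sketched in plain Python (a simplified stand-in, not the RIMES script; the climatology values are made up):

```python
def quantile_map(value, model_clim, obs_clim):
    """Empirical quantile mapping: locate the forecast value's rank in the
    model climatology and return the observed value at the same rank."""
    m, o = sorted(model_clim), sorted(obs_clim)
    rank = sum(1 for x in m if x <= value)          # empirical rank in model
    idx = min(rank * len(o) // len(m), len(o) - 1)  # same quantile in obs
    return o[idx]

model = [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]   # model tends to rain too much
obs = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]          # observed climatology
print(quantile_map(10, model, obs))           # 6
```

The forecast value is placed at its empirical quantile in the model climatology and replaced by the observed value at the same quantile, which removes a systematic wet or dry bias.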
Feeding the corrected WRF data in the calibrated HEC-HMS model
1. Update the HEC-DSS database file
The input file for HEC-HMS model needs to be updated with precipitation data. This is done using
python scripts with HEC-DSSVUE library. The bias corrected rainfall data is added to the input
file of HEC-HMS model.
For Ayeyarwady model
Input File - Ayeyar_HMS_Timeseries.dss
Rainfall path - /AYEYARWADY/LOCATION/PRECIP-INC//1DAY/OBS/

For Sittaung model


Input File - SittoungTimeSeries.dss
Rainfall path - /SITTOUNG/LOCATION/PRECIP-INC//1DAY/OBS/
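The actual update runs inside HEC-DSSVue's scripting environment, so it is not reproduced here; the helper below only shows how DSS pathnames like those above (the six /A/B/C/D/E/F/ parts) are assembled, with the basin and station names as placeholders:

```python
def dss_path(basin, location, parameter="PRECIP-INC",
             interval="1DAY", version="OBS"):
    """Assemble a HEC-DSS pathname /A/B/C/D/E/F/.
    Part D (the date window) is left blank, matching the rainfall
    paths above."""
    return "/{}/{}/{}//{}/{}/".format(basin, location, parameter,
                                      interval, version)

print(dss_path("AYEYARWADY", "W590"))  # /AYEYARWADY/W590/PRECIP-INC//1DAY/OBS/
```

Each subbasin's corrected rainfall series is written to the record identified by its pathname, one path per location.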
2. Run the HEC-HMS model in batch mode
command – hec-hms -s $PATH_OF_MODEL_FILE

Simulation of discharges at predefined outlets

1. Read output HEC-DSS output file


Read the output file and extract the discharge/flow data simulated by the model. This is done
with a Python script: the HEC-DSSVue script exports the simulated data station-wise to an
Excel sheet, and a Python script then reads the sheet and prepares the data for ARIMA error
correction and for stage calculation with respect to the rating table.
The output file of Ayeyarwady model
File - Run_6.dss
Flow-combine path - //STATION/FLOW-COMBINE/DATE/1DAY/RUN:RUN 6/
where STATION is the station name and DATE is the HEC date, for example 01SEP2015

Application of error correction model and generating corrected discharge


1. ARIMA error correction
The simulated discharge from the model is corrected using an ARIMA error-correction script,
written in R using the hydroGOF and forecast libraries.
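The R script fits a full ARIMA model; the essence of error correction can be illustrated with a one-parameter AR(1) sketch in Python (a simplified stand-in for the actual script, with made-up numbers):

```python
def ar1_corrected_forecast(sim_hist, obs_hist, sim_next):
    """Correct the next simulated value with an AR(1) model of past errors:
    e_t = obs_t - sim_t, with phi estimated from the lag-1 autocorrelation."""
    e = [o - s for o, s in zip(obs_hist, sim_hist)]
    ebar = sum(e) / len(e)
    num = sum((e[t] - ebar) * (e[t - 1] - ebar) for t in range(1, len(e)))
    den = sum((x - ebar) ** 2 for x in e)
    phi = num / den if den else 0.0
    e_next = ebar + phi * (e[-1] - ebar)     # one-step-ahead error forecast
    return sim_next + e_next

sim = [1000.0, 1100.0, 1200.0]
obs = [1100.0, 1200.0, 1300.0]               # model consistently 100 low
print(ar1_corrected_forecast(sim, obs, 1250.0))   # 1350.0
```

With a persistent bias, as in the example, the correction simply adds the mean error back; a full ARIMA model also captures more complex error dynamics.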

Generation of forecasted water levels and corrected discharges for predefined stations
An R script is used here to generate water level data from discharge using the station rating
table. The R script generates the output, which is then inserted into or updated in the database.
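The conversion itself is straightforward interpolation in the rating table, as the Python sketch below shows (the table values are made up; the operational script is in R):

```python
def stage_from_discharge(q, rating):
    """Linear interpolation of water level H from discharge Q, given a
    rating table as a list of (Q, H) pairs sorted by Q."""
    if q <= rating[0][0]:
        return rating[0][1]
    for (q0, h0), (q1, h1) in zip(rating, rating[1:]):
        if q0 <= q <= q1:
            return h0 + (h1 - h0) * (q - q0) / (q1 - q0)
    return rating[-1][1]   # above the table: clamp to the highest level

table = [(500, 2.0), (1000, 3.5), (2000, 5.0), (4000, 7.0)]
print(stage_from_discharge(1500, table))   # 4.25
```

The same table, read in the opposite direction, converts observed stage back to discharge, as the data panel's stage upload does.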
Access of respective hazard map from the archive corresponding to the forecasted discharge
and water level
The hazard maps generated for the 50-year or 100-year return period can be uploaded and
viewed in the web-based decision support system.

Generation of advisory and advisory dissemination


An advisory is generated from the station water-level thresholds for the warning and danger
levels. If a station's water level crosses the threshold that has been set, advisories are generated
and can be sent to a group or to an individual person. This is available in the web-based decision
support system.
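The threshold logic reduces to a simple comparison, sketched here in Python (the threshold values are placeholders; in the system they are set per station in the Data panel):

```python
def advisory_status(level, warning, danger):
    """Classify a station's water level against its two thresholds."""
    if level >= danger:
        return "danger"
    if level >= warning:
        return "warning"
    return "below warning"

# Hypothetical station with warning at 6.5 m and danger at 7.2 m
print(advisory_status(6.8, warning=6.5, danger=7.2))   # warning
```

A "warning" or "danger" result is what triggers the e-mail dispatch to the configured receiver groups.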

Updating/Editing the Threshold Level


In the Data panel of the web-based decision support system, the user can set threshold levels for
the stations of the HEC-HMS model.
Data process shown in a flow chart
Web-based Flood decision support system
It is a web-based visualization tool.
Outline of the system

It consists of an overview page, observation page, advisory page, data panel page, climate
monitoring page, river monitoring page, flood hazard map page and archive page.
The Data panel section consists of many subsections; only the administrative user can access
these sections, while a normal user can only view the list of stations available in the system.
In the Advisory section, only the administrative user can send advisories to groups and
individual persons; a normal user can only see the advisories.
In the Archive section, only the administrative user can download rainfall, discharge and water
level data; a normal user cannot download these data.

a. Overview Page
This panel contains the hydrological forecast section, covering the Chindwin, Ayeyarwady and
Sittaung river basins. It shows 3-day forecasts of discharge and water level for the hydrological
stations available in the river basins. In the future, 10-day ECMWF and GloFAS forecast data
will be added to extend the lead time to 10 days.
The panel also contains a bias-corrected rainfall forecast tab and a WRF rainfall forecast tab.
The bias-corrected rainfall is a 3-day forecast; the discharge and water level data are likewise
3-day forecasts from the HEC-HMS model.
b. Observation page
This panel shows the past week of simulated, observed and corrected discharge Q (m3/s) and
water level H (m), as well as the same quantities for the next three days.
It also shows graphs comparing simulated, observed and corrected discharge Q (m3/s) and water
level H (m), selectable by basin and then by station.
c. Advisory Page
This panel shows warning-level notifications, i.e. whether a station is below warning level, at
warning level or at danger level. From this page, we can send an email to a certain group or to
any person.
d. Data Panel
This panel allows adding, editing, updating and deleting data.
 The Stations panel consists of a list of all stations, including rainfall and hydrological
stations.
 The Stations List panel consists of a list of all available stations, but with no editing
option.
 In the Upload Rainfall Data panel you can upload observed rainfall data.
 In the Upload Stage & Flow Data panel you can upload observed discharge data.
 In the Upload Stage Data panel you can upload stage data; flow is generated
automatically.
 In the Upload/Update Rating Table panel you can upload or update a rating table via a
CSV file.
 In the Information Receivers panel you can add users who receive e-mail notifications.
 In the Receiver Group panel you can add groups of receivers.
 In the Water Level panel you can set threshold levels for all stations.

e. Data Panel information


Stations panel – station information can be added, edited, updated and deleted.
Upload Rainfall Data Panel
The observed rainfall data of a rainfall station is uploaded or updated in this panel. Only CSV
files with the required fields and format are allowed: the file must contain the datadate and
dataRF fields, the date format must be yyyy-mm-dd, and the filename must be the station name
ending with RF, for example AunglanRF.csv for station Aunglan.
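A quick Python check of the expected filename and format described above can catch malformed uploads before they reach the database (a sketch of the validation idea, not the system's actual code):

```python
import csv
import io

def validate_rf_csv(filename, text):
    """Check an observed-rainfall upload: filename <station>RF.csv,
    columns datadate (yyyy-mm-dd) and dataRF, as the panel requires."""
    if not filename.endswith("RF.csv"):
        return False
    reader = csv.DictReader(io.StringIO(text))
    if set(reader.fieldnames or []) != {"datadate", "dataRF"}:
        return False
    for row in reader:
        parts = row["datadate"].split("-")
        if len(parts) != 3 or not all(p.isdigit() for p in parts):
            return False
    return True

sample = "datadate,dataRF\n2017-04-01,12.5\n2017-04-02,0\n"
print(validate_rf_csv("AunglanRF.csv", sample))   # True
```

The same pattern, with the field sets and suffixes swapped, applies to the QHobs, QH and RT uploads described below.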

Upload Stage & Flow Data Panel

The observed stage and flow data of a station is uploaded or updated in this panel. Only CSV
files with the required fields and format are allowed: the file must contain the datadate, dataQ
and dataH fields, the date format must be yyyy-mm-dd, and the filename must be the station
name ending with QHobs, for example AunglanQHobs.csv for station Aunglan.
Upload Stage Data Panel
The stage data of a station is uploaded or updated in this panel; discharge is generated
automatically from the station's rating table. Only CSV files with the required fields and format
are allowed: the file must contain the datadate, dataH and dataQ fields, and the date format must
be yyyy-mm-dd. The dataQ field should contain NA, since the discharge value will be
generated. The filename must be the station name ending with QH, for example AunglanQH.csv
for station Aunglan.

Upload/Update Rating Table

The rating table of a station is uploaded or updated in this panel. Only CSV files with the
required fields and format are allowed: the file must contain the H and Q fields, and the filename
must be the station name ending with RT, for example AunglanRT.csv for station Aunglan.
f. Climate monitoring
In this panel, you can see station-wise observed rainfall data.

g. River monitoring
In this panel, you can check the HEC-HMS model performance by comparing simulated,
observed and corrected discharge and water level data.
h. Flood hazard maps
This panel holds uploaded hazard maps, satellite images and photos taken during floods. Users
can upload images with metadata and hazard maps with metadata, and can view the images and
hazard maps that have been uploaded to the system.

i. Archive
In this panel, you can download station-wise simulated, observed and corrected discharge and
water level data. Users can view the simulated, observed and corrected data and
download/export the data for a particular station.
Users select a river basin and then one of its hydrological stations to visualize the water level or
discharge data and to export the data to a CSV file.
