Calculating the volume and surface of a reservoir using ArcGIS

Today we are going to see how to calculate the volume of the reservoir that will be created by the construction of the Villafria Dam in northern Spain, using ArcGIS and its 3D Analyst extension.

We start by activating the 3D Analyst Extension: Customize menu > Extensions…

The data required for this exercise is a contour map with lines placed at 5-meter intervals (contours.shp).

The first step is converting the contour map into a Triangulated Irregular Network (TIN):

Open ArcToolbox > 3D Analyst Tools > TIN Management > Create TIN

Select “contours.shp” as the Input Feature Class and name the Output TIN “terrain_tin”:

The process of creating a new TIN may take several minutes, depending on the accuracy of the output surface.

We now have to select the contour that marks the reservoir's maximum level, which according to the project is set at 1115 meters above sea level. Select the 1115 contour using the Select By Attributes tool (Selection menu > Select By Attributes…). In the query builder, enter the data as shown in the next picture:

Please note that, in order to perform surface and volume calculations, the contour must be closed, so deselect any other contours that are not linked to the actual reservoir surface.

This contour gives us an idea of the maximum flood extent of the reservoir once filled at its full capacity.

The next step is converting the selected contour (Polyline) into a Polygon: ArcToolbox > Data Management Tools > Features > Feature To Polygon.

And finally, to calculate the total volume of the reservoir, we use the tool “Polygon Volume”: ArcToolbox > Terrain and TIN Surface > Polygon Volume.

Select “terrain_tin” as the Input Surface; “1115_contour_polygon” as the Input Feature Class; and Height Field as “CONTOUR”. In “Reference Plane”, make sure that BELOW is selected. The volume and surface area will be calculated below the reference plane height of the polygon. Leave the rest as default:

Click on OK. The calculation process may take several minutes.

As the final result, two new fields are added to the “1115_contour_polygon.shp” attribute table: “Volume” and “SArea”. Since we are working in International System (SI) units, the volume is 11,834,796.1198 cubic meters (equivalent to 11.835 cubic hectometers), and the total flooded surface is 1,110,140.5082 square meters (equivalent to 111.014 hectares).
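As a quick sanity check, the unit conversions for the reported figures can be verified with a few lines of Python, using the standard definitions (1 hm³ = 1,000,000 m³; 1 hectare = 10,000 m²):

```python
# Figures from the "Volume" and "SArea" fields of the attribute table.
volume_m3 = 11834796.1198   # cubic meters
area_m2 = 1110140.5082      # square meters

volume_hm3 = volume_m3 / 1e6  # cubic hectometers
area_ha = area_m2 / 1e4       # hectares

print(round(volume_hm3, 3), round(area_ha, 3))
```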


Tuolumne Meadows floodplain modeling using HEC-GeoRAS (Part 2)

After a long break, today we will see how to import the preprocessing data generated in part 1 of this tutorial into the HEC-RAS hydraulic model.

HEC-RAS is a program that models the hydraulics of water flow through natural rivers and other channels. It is one-dimensional, meaning that there is no direct modeling of the hydraulic effect of cross section shape changes, bends, and other two- and three-dimensional aspects of flow.

Launch the HEC-RAS program (Start > Programs > HEC > HEC-RAS > HEC-RAS 4.1.0). The latest version of HEC-RAS can be downloaded for free from the Hydraulic Engineering Center of the U.S. Army Corps of Engineers website:

Save the new project. Go to File > Save Project As… Save it as “TuolumneMeadows.prj”

To import the GIS data into HEC-RAS, first go to the geometric data editor (Edit > Geometric Data…)

Next, click on File > Import Geometry Data > GIS Format. Browse to the “GIS2RAS.RASImport.sdf” file previously created in HEC-GeoRAS, and click OK. An import wizard will appear. In the first window (Intro), confirm SI (metric) units.

In the second window (River Reach Stream Lines), make sure all import stream lines boxes are checked, and click Next. Finally, in the last window (Cross Sections and IB Nodes), make sure all Import Data boxes are checked for cross-sections and click OK.

Click on Finished-Import Data. Data will be imported into the HEC-RAS geometry editor.

Save the geometry file as “TuolumneGeometry” (File > Save Geometry Data). Note that the geometry data has a .g01 extension.

To check the quality of the imported data, we can use the graphical cross-section editor (Tools > Graphical Cross-section Edit) in the geometric data editor.

We can use the editor to move bank stations, change the distribution of Manning’s n, add or delete ground points, etc.

The next step is to edit the structure data (bridges). Click on the “Bridge/Culvert” editing button. In our case, we have two bridges, at river stations RS=5360.882 and RS=4123.291.

Pacific Crest trail bridge (RS=4123.291)

We see that, in both cases, the deck elevation is not accurate, so we need to edit this information. Click on the Deck/Roadway editor and delete all the information. Enter the deck high chord and low chord elevations as shown in the next picture:

Click on the “Copy US to DS” button to copy the information from upstream to downstream, and click OK. The result should look something like this:

Now, we will enter the pier information. The first bridge (RS=5360.882) has three piers and the second (RS=4123.291) has only one. Click on the “Bridge Design” button and enter the information as shown below:

Close the bridge/culvert editor. Save the geometry data and close the geometric editor.

The next step is entering Flow Data and Boundary Conditions. In HEC-RAS, every flow to be simulated is called a profile. In our case, we are going to define one profile that represents the peak discharge at the most upstream location of the Tuolumne River for a 100-year return period.

100-year flows were assessed from a frequency analysis of the nearest gauging station data (1127479 Tuolumne River at Grand Canyon of Tuolumne), with the maximum discharge being 200 cubic meters per second (7060 cubic feet per second). Enter 1 for number of profiles, and click on “Apply Data”.
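As a side check, the metric-to-imperial conversion quoted above is easy to verify (1 m³/s is approximately 35.3147 ft³/s):

```python
# Check the conversion of the 100-year peak discharge.
q_cms = 200.0                  # cubic meters per second
q_cfs = q_cms * 35.3147        # 1 m3/s is about 35.3147 ft3/s
print(round(q_cfs))            # close to the 7060 cfs quoted above
```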

We now define the upstream flow conditions for steady flow (Edit > Steady Flow Data). To define the downstream conditions, click on “Reach Boundary Conditions”, select “Downstream” for the Tuolumne River, click on Normal Depth, and enter 0.001.
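The “Normal Depth” boundary condition asks HEC-RAS to compute a depth from Manning's equation using the supplied friction slope (0.001 here). The idea can be sketched with a bisection solver for a hypothetical rectangular channel; the 60 m width and n = 0.04 below are illustrative values, not taken from the model:

```python
def normal_depth(Q, b, n, S, tol=1e-6):
    """Normal depth (m) of a rectangular channel of width b, by bisection.

    Manning's equation (SI): Q = (1/n) * A * R**(2/3) * S**0.5
    """
    lo, hi = tol, 100.0
    while hi - lo > tol:
        y = 0.5 * (lo + hi)
        A = b * y               # flow area
        R = A / (b + 2.0 * y)   # hydraulic radius
        Qy = (1.0 / n) * A * R ** (2.0 / 3.0) * S ** 0.5
        if Qy < Q:
            lo = y              # depth too small to carry Q
        else:
            hi = y
    return 0.5 * (lo + hi)

# 100-year flow of 200 m3/s and friction slope 0.001 (from the tutorial);
# channel width and roughness are made up for illustration.
nd = normal_depth(200.0, 60.0, 0.04, 0.001)
print(round(nd, 2))
```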

The basic computational procedure of HEC-RAS for steady flow is based on the solution of the one-dimensional energy equation. Energy losses are evaluated by friction and by contraction/expansion coefficients.

To learn more about boundary conditions in HEC-RAS, check chapter 3 of the Hydraulic Reference Manual.

Click OK. Save the flow data as “TuolumneMeadows.f01”. Now we are ready to run the model.

In the HEC-RAS main window, click on Run > Steady Flow Analysis… Click on File > New Plan. Name it “TuolumneMeadows.p01”, with “100year” as the short ID. Select the Subcritical Flow Regime, and click on “Compute”.

Note that the maximum number of station-elevation points per cross section is 500. The number of points can be reduced with the XS Points Filter tool (geometric data window > Tools > Cross section point filter…).

Please note that this is a very simple example for educational purposes only. An actual hydraulic simulation requires the verification of results, identification of possible errors in the data and, if necessary, repetition of calculations. Assessment of the quality of the simulation comes with the knowledge of the study area and experience.

After the computation is finished, we will export the results to ArcGIS to delineate the inundation extent. To export the data to ArcGIS click on File > Export GIS Data… in the HEC-RAS main window. Leave the rest of the parameters as default.

Click on the “Export Data” button. A “TuolumneMeadows.RASexport.sdf” file will be created in our working directory. Save the HEC-RAS project and exit. Now we will return to HEC-GeoRAS to delineate the extent of the flooding.

Open ArcMap and load the “Tuolumne.mxd” file previously created. In the HEC-GeoRAS toolbar, click on “Import RAS SDF file” button to convert the SDF file into an XML file. Browse to “TuolumneMeadows.RASexport.sdf” and click OK.

Next, click on RAS Mapping > Layer Setup. In the Layer Setup window, select “New Analysis” and name it “Tuolumne Steady”. Browse to TuolumneMeadows.RASexport.xml for the RAS GIS Export File. Select “Single” for Terrain Type, and browse to the “tm_dem_flood” GRID. HEC-GeoRAS will create a geodatabase with the analysis name (Tuolumne Steady) in the output directory.

A new data frame named “Tuolumne Steady” will be added to ArcMap.

Click on RAS Mapping > Import RAS Data. This will create a bounding polygon by connecting the endpoints of the XS Cut Lines.

Click on RAS Mapping > Inundation Mapping > Water Surface Generation. Select PF1 as the profile and click OK. A TIN named “t PF 1” is created. This TIN defines a water surface zone that connects the outer points of the bounding polygon.

And now the final step. Click on RAS Mapping > Inundation Mapping > Floodplain Delineation Using Rasters. Select PF1 (100-year flow), and click OK. At this stage the water surface TIN (t PF 1) is first converted to a GRID, and then the terrain surface (tm_dem_flood) is subtracted from the water surface grid. The area with positive results (meaning water surface is higher than the terrain) is flood area, and the area with negative results is dry.
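Conceptually, the subtraction in this step works like the toy example below; the grids are invented two-row arrays, whereas HEC-GeoRAS performs the same cell-by-cell operation on the actual GRIDs:

```python
# Water surface grid minus terrain grid: positive cells are flooded,
# negative cells are dry (all values are made up for illustration).
water_surface = [[8.0, 8.0, 8.0],
                 [8.0, 8.0, 8.0]]
terrain       = [[9.2, 6.5, 4.4],
                 [7.1, 5.0, 9.9]]

depth = [[w - t for w, t in zip(w_row, t_row)]
         for w_row, t_row in zip(water_surface, terrain)]
flooded = [[d if d > 0 else None for d in row] for row in depth]  # None = dry
```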

The final result is a polygon (b PF1), the final flood inundation polygon, and a GRID (d PF1) that represents the final water depth map. We can see that the maximum water depth is 3.68 meters along the Tuolumne River main channel.

Ok, we are now done with delineating the 100-year flow floodplain in the Tuolumne Meadows area of Yosemite National Park.

Tuolumne Meadows floodplain modeling using HEC-GeoRAS (Part 1)

Tuolumne Meadows is a gentle, dome-studded sub-alpine meadowy section of the Tuolumne River, in the eastern section of Yosemite National Park (California). Yosemite National Park’s hydrologic resources include the headwaters of two magnificent rivers—the Merced and the Tuolumne. All of the creeks, streams, and lakes in Yosemite will eventually join with one of these two rivers.

Tuolumne Meadows (Yosemite National Park)

The headwaters of the Tuolumne River arise on the slopes of Mount Dana near Tioga Pass and at the base of Lyell and Maclure glaciers, the two largest glaciers remaining on the west slope of the Sierra Nevada Mountains. The river and its tributaries drain the northern half of the park, passing through deep canyons like Matterhorn, Virginia, Lyell, and the Grand Canyon of the Tuolumne. The Tuolumne watershed is also home to two important California water sources: Lake Eleanor and Hetch Hetchy reservoirs.

We are going to use the HEC-GeoRAS extension of ArcGIS 10 for pre-processing of GIS data for flood inundation mapping. The only dataset required by HEC-GeoRAS is the terrain data (DEM). In our case, we are using a digital elevation model (DEM) of the Tuolumne Meadows area of Yosemite National Park, derived from LIDAR flown in September 2006. All above-ground features, such as vegetation, buildings, and constructed infrastructure, have been removed through processing algorithms to leave a bare-earth model of the ground surface.

This dataset can be obtained and used for non-commercial purposes from the U.S. National Park Service Integrated Resource Management Applications website:

Start ArcMap 10. Save the ArcMap document (File > Save As…) as “Tuolumne.mxd” in your working folder.

Enable ArcGIS’s Spatial Analyst and 3D Analyst extensions by clicking on Customize > Extensions…

Load the HEC-GeoRAS toolbar into ArcGIS (Customize > Toolbars > HEC-GeoRAS).

Create empty GIS layers using the RAS Geometry menu on the HEC-GeoRAS toolbar. Click on RAS Geometry > Create RAS Layers > All.

In the “Create All Layers” window, accept the default names and click OK. HEC-GeoRAS creates a geodatabase with the same name in the same folder where the map document is saved; in this case, Tuolumne.gdb (you may have to set pathnames in ApUtilities > Set Target Locations).

After creation, the RAS layers are added to the map document with a pre-assigned symbology.

Now we use the Editor toolbar (Editor > Start Editing) to digitize the main stream centerline in the River layer (IMPORTANT: always digitize from upstream to downstream).

Digitize the main channel of the Tuolumne River.

Click on the ID icon to assign River/Reach codes and type in the name of the river (in our case, we only have one river segment):

Click on RAS Geometry > Stream Centerline Attributes > Topology. This creates the Topology and Lengths/Stations, and assigns Elevations to the Rivers/Reaches (creating the “River3D” layer).

Again, we use the Editor toolbar (Editor > Start Editing) to digitize stream banks in Banks layer. Bank lines are used to distinguish the main channel from the overbank floodplain areas. Always start from the upstream end and, looking downstream, digitize the left bank first and then the right bank.

Click on RAS Geometry > Create RAS Layers > Flow Path Centerlines. Click “Yes” on the message box that asks if you want to use the stream centerline to create the flow path centerline.

Use the Editor toolbar (Editor > Start Editing) to digitize the left and right overbank center flowlines in Flowpaths layer. The left and right flowpaths must be digitized within the floodplain in the downstream direction.

Label the flowpaths using the “Select Flowpaths and Assign LineType Attributes” button. Click on one of the flow paths (left or right, looking downstream), and name each flow path (Left, Channel, Right).

Use the Editor toolbar (Editor > Start Editing) to digitize the cross-section locations in the XSCutlines layer (IMPORTANT: always digitize from left to right, looking downstream).

Go to RAS Geometry > XS Cut Line Attributes to populate River/Reach Names, Stationing, Bank Stations, Downstream Reach Lengths and Elevations data for XSCutlines layer (this creates “XSCutlines3D” layer).

After creating cross-sections, the next step is to define bridges and other structures along the river.

Use the Editor toolbar (Editor > Start Editing) to digitize the bridge locations in the Bridges layer. In our map, the Tuolumne River is crossed by California State Route 120 (Tioga Pass Rd) and by the Pacific Crest Trail, as shown in the next picture:

Next, go to RAS Geometry > Bridge/Culverts to assign river/reach names and station numbers and elevations (this creates “Bridges3D” layer).

Ineffective flow area

Use the Editor toolbar (Editor > Start Editing) to digitize ineffective flow areas. These are areas within the floodplain that hold water but have no velocity. Areas behind bridge abutments, for example, can be considered ineffective flow areas.

Go to RAS Geometry > Ineffective Flow Areas > Position, to calculate the position of ineffective areas.

The final step before exporting the GIS data to HEC-RAS is assigning Manning’s n value to individual cross-sections.

We delete the LandUse layer (as we created it empty and it has no data), and add the “Manning.shp” shapefile. This file stores the Manning's n values in its attribute table (N_VALUE field).

The data come from the Soil Resources Inventory (SRI) for Yosemite National Park area, which meets the geospatial requirements of the Soil Survey Geographic (SSURGO) database.

Click on RAS Geometry > Manning’s n Values > Extract n Values. Select “Manning” for Land Use, choose “N_Value” for Manning Field, “XSCutLines” for XS Cut Lines, name “ManningTable” for XS Manning Table, and click OK.
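The idea behind this extraction step can be sketched as a simple lookup: each portion of a cross section picks up the n value of the land-use polygon it crosses. The classes and values below are hypothetical, not taken from Manning.shp:

```python
# Hypothetical land-use classes with Manning's n values (like the N_VALUE field).
landuse_n = {"forest": 0.10, "meadow": 0.035, "channel": 0.03}

# Land-use classes crossed by one cross section, from left to right bank.
xs_segments = ["meadow", "channel", "meadow", "forest"]

n_values = [landuse_n[lu] for lu in xs_segments]
```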

Finally, to create an import file for HEC-RAS, go to RAS Geometry > Layer Setup > Required Layers tab. The “Required Layers” tab should have River, XSCutLines and XSCutLines3D for Stream Centerline, XS Cut Lines and XS Cut Lines Profiles, respectively. In the “Optional Layers” and “Optional Tables” tabs, make sure the layers and tables that are empty are set to Null, as shown in the picture below:

Click on RAS Geometry > Export GIS Data. This process will create two files: “GIS2RAS.xml” and “GIS2RAS.RASImport.sdf”.

That’s it. We are done for now. In later posts I will explain how to import these data into the HEC-RAS hydraulic model.

Geostatistical modelling of Arsenic in groundwater

Arsenic in groundwater is largely the result of minerals dissolving from weathered rocks and soils. Arsenic is found naturally in rocks in the earth's crust, and it is recognized as a poison and a carcinogen. It occurs in organic compounds (combined with hydrogen and carbon) and in inorganic compounds (combined with sulphur, chlorine, or oxygen). In water, arsenic has no smell or taste and can only be detected through chemical analysis.

The concentration of arsenic in groundwater in the Duero river basin is relatively low, usually ranging from 0.01 to 0.02 milligrams per litre (mg/L). The sample data (arsenic_sample.shp) for this example come from the central region of the Duero basin, where a recent investigation has shown increasing concentrations of arsenic, linked primarily to illegal agricultural practices.

Using ArcGIS Geostatistical Analyst, we are going to build a prediction model for the data. In this case, the geostatistical method that best suits the spatial variability of the data is kriging, a stochastic method.
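Kriging rests on the semivariogram, which describes how the dissimilarity between sample values grows with the distance separating them. A bare-bones sketch of the empirical semivariogram follows; the coordinates and arsenic values are invented for illustration:

```python
import math

# (x, y) coordinates with arsenic concentrations in mg/L (made-up values).
samples = [((0, 0), 0.011), ((1, 0), 0.012), ((0, 1), 0.013),
           ((2, 1), 0.018), ((3, 3), 0.021)]

def empirical_semivariance(samples, lag, tol):
    """Mean of 0.5*(z_i - z_j)**2 over pairs separated by about `lag`."""
    total, count = 0.0, 0
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            (xi, yi), zi = samples[i]
            (xj, yj), zj = samples[j]
            h = math.hypot(xi - xj, yi - yj)
            if abs(h - lag) <= tol:
                total += 0.5 * (zi - zj) ** 2
                count += 1
    return total / count if count else None
```

Repeating this for a range of lags gives the experimental points to which a model (spherical, exponential, etc.) is fitted in the wizard.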

A comparison of the different methods (deterministic and stochastic) can be found in the following document:

Open ArcMap. In the menu bar, click on Customize > Extensions… and activate the Geostatistical Analyst extension. Click Close.

In the Geostatistical Analyst toolbar, click on “Launch Geostatistical Analyst Wizard” button. This will prompt the following window:

Methods: we must select the method with which we want to analyze the data; in this case, Kriging.
Source dataset: the shapefile on which we are going to carry out the modelling. In this case, we select “arsenic_sample”.
Data Field: the field on which you want to perform the analysis. In this case, the concentration of arsenic in mg per litre (“AS_”).

Click Next. If you get a message saying that two or more sample points exist at the same location, select the option “Use Mean”. The following window will appear:

In Kriging type, select “Ordinary Kriging”. Select “Prediction” as the output type.
In Transformation, select “Log”, since the data require a logarithmic transformation.
In Order of trend removal, select “First”. To find out which order of trend removal is appropriate for a given dataset, check this link:

Click Next. The following window allows us to find out whether the data show directional anisotropy. If the surface resembles a circle, there is no directional anisotropy; if it resembles an ellipse, as in the figure, the data have directional anisotropy.

In the next window, we choose the following options:

Model: 1. Type = “Spherical”.
Anisotropy = “True”.
Show Search Direction = “True”.

Now we must change the angle until the blue line that comes from the center of the Semivariogram Map matches the direction of the major axis of the ellipse, as shown in the figure:

Click Next. In the next window, change the Maximum Neighbours to 12 and the Minimum Neighbours to 6. As sampled locations get farther from the prediction location, their influence on the predicted value decreases, so to speed up the interpolation calculations we can ignore those far-off sampled points.
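The neighbourhood limiting works roughly like the sketch below, a plain nearest-neighbour selection; the search ellipse and sector logic of Geostatistical Analyst are omitted, and all numbers are illustrative:

```python
import math

def neighbors(points, target, k_max=12, k_min=6, radius=5000.0):
    """Up to k_max points within `radius` of target, but at least the k_min nearest."""
    def dist(p):
        return math.hypot(p[0] - target[0], p[1] - target[1])
    ranked = sorted(points, key=dist)
    within = [p for p in ranked if dist(p) <= radius]
    if len(within) < k_min:
        within = ranked[:k_min]  # fall back to the nearest k_min points
    return within[:k_max]

# Example: points along a line, searching near the origin.
pts = [(x, 0) for x in range(8)]
near = neighbors(pts, (0, 0), k_max=3, k_min=2, radius=2.5)
```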

Click Next. The Cross Validation window will appear. You can see a table showing the model's predicted values compared with the measured values, along with the standard error.

On the lower-right side of the window you can see the prediction errors:

  • Root-Mean-Square: 7.356
  • Average Standard Error: 10,836.42
  • Mean Standardized: 0.002047
  • Root-Mean-Square Standardized: 0.0161
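These summary statistics come straight from the cross-validation table; for example, the Root-Mean-Square and mean errors are computed as below (the measured/predicted pairs are made up for illustration):

```python
# Cross-validation style error summaries over invented value pairs.
measured  = [0.011, 0.012, 0.013, 0.018, 0.021]
predicted = [0.010, 0.014, 0.012, 0.017, 0.024]

n = len(measured)
errors = [p - m for p, m in zip(predicted, measured)]
rms = (sum(e ** 2 for e in errors) / n) ** 0.5   # Root-Mean-Square error
mean_error = sum(errors) / n                     # Mean error
```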

Click Finish. A report summary of the selected method will appear.

The final result is as shown below:

In the final result, we can see a growing gradient in arsenic concentration from east to west. Note the area colored in red, where the predicted concentration of arsenic is greater than 15.3 mg/L, which is about 300 times higher than the legal limit.

We can also try to refine the model by changing the model parameters. The lower the Root-Mean-Square (RMS) error, the better our prediction map is. To compare two models, right-click on the model in the table of contents and go to “Compare…”

Additionally, we can calculate the Standard Error Map. A Standard Error Map quantifies the uncertainty of our predictions. The larger the standard errors, the more uncertain are our predictions.

The IDE_DUERO Spatial Data Infrastructure

After a long process, version 4.1.3 of the MIRAME IDE_DUERO Spatial Data Infrastructure (SDI) has been released. This is a tool designed to support decision-making in the development of the Duero River Basin Management Plan. Undoubtedly, this is a complex task that requires the management of diverse datasets from many sources, along with the application of different analytical methods to obtain objective results.

To facilitate this task, my colleagues at the Duero River Basin Authority and I have created this Spatial Data Infrastructure, which provides access, through the viewer, to the basic information used in the development of the Basin Management Plan, both raw data and maps.


This is an instrument created with the support of the General Direction for Water of the Spanish Ministry of the Environment. It is in constant development, following the technical and methodological requirements established by Directive 2007/2/EC, establishing an Infrastructure for Spatial Information in the European Community (INSPIRE).

The data has been set-up using the following technologies:

Geodatabase: Oracle Spatial version 11g (Proprietary License)

WMS and WFS Server: MapServer 5.2 (BSD License)

Catalog DB: GeoNetwork 2.2 (GNU GPL)

Glossary DB: Deegree 2.1 (GNU LGPL)


  • J2EE 1.4 – Apache Struts version 1.3.8 (Apache 2.0 License)
  • JDK 1.5 Compiler
  • Tomcat 6.0.18 (Apache 2.0 License)


  • GeoTools 2.4.4 (GNU LGPL)
  • Open Layers 2.7 (BSD License)
  • JQuery 1.2.8 (GNU GPL and MIT)

SDI Architecture

Additionally, it is possible to access the mapping server in ArcMap or to download shapefiles through the Web Feature Service (WFS). To add this service, open ArcMap 10. In the Standard toolbar, click on Add Data… > GIS Servers > Add WMS Server. Type in the service URL and click on “Get Layers”. A list of the available layers and their capabilities will appear. Click OK.

Creating script tools with Python

As I previously said, we can export any ModelBuilder model to Python. Python is an open-source language that is used to automate computing tasks through programs called scripts. It is a high-level language, meaning you don’t have to understand the “nuts and bolts” of how computers work in order to use it. Python syntax (how the code statements are constructed) is relatively simple to read and understand. Finally, Python requires very little overhead to get a program up and running.

ArcGIS 10 only works with Python 2.6 (although higher versions of Python are available). Python comes with a simple default editor called IDLE; however, I highly recommend using the PythonWin Integrated Development Environment (IDE) to help you write code. PythonWin is free, has basic debugging capabilities, and is included with ArcGIS, but it is not installed by default. Alternatively, you can download PythonWin (32-bit) from the following link:

Once we have exported our previous Watershed Delimitation model to Python, we can open it using PythonWin. It should look something like this:

# Import arcpy module
import arcpy

# Define parameters
Output_surface_raster = arcpy.GetParameterAsText(0)
Output_flow_direction_raster = arcpy.GetParameterAsText(1)
Output_accumulation_raster = arcpy.GetParameterAsText(2)
Output_raster_1 = arcpy.GetParameterAsText(3)
Output_raster_2 = arcpy.GetParameterAsText(4)
Output_raster_3 = arcpy.GetParameterAsText(5)
Output_polyline_features = arcpy.GetParameterAsText(6)
StreamT_raster1_FeatureVerti = arcpy.GetParameterAsText(7)
Output_raster_4 = arcpy.GetParameterAsText(8)

# Local variables
dem = "C:\\Users\\Sergio\\Desktop\\BarrancoAras\\dem3"
Output_drop_raster = Output_surface_raster

# Process: Fill
tempEnvironment0 = arcpy.env.scratchWorkspace
arcpy.env.scratchWorkspace = "C:\\Users\\Sergio\\Desktop\\BarrancoAras\\modelbuilder.gdb"
tempEnvironment1 = arcpy.env.workspace
arcpy.env.workspace = "C:\\Users\\Sergio\\Desktop\\BarrancoAras\\modelbuilder.gdb"
arcpy.gp.Fill_sa(dem, Output_surface_raster, "")
arcpy.env.scratchWorkspace = tempEnvironment0
arcpy.env.workspace = tempEnvironment1

# Process: Flow Direction
arcpy.gp.FlowDirection_sa(Output_surface_raster, Output_flow_direction_raster, "NORMAL", Output_drop_raster)

# Process: Flow Accumulation
arcpy.gp.FlowAccumulation_sa(Output_flow_direction_raster, Output_accumulation_raster, "", "FLOAT")

# Process: Raster Calculator
arcpy.gp.RasterCalculator_sa("Con(\"%Output accumulation raster%\" > 20000,1)", Output_raster_1)

# Process: Stream Link
arcpy.gp.StreamLink_sa(Output_raster_1, Output_flow_direction_raster, Output_raster_3)

# Process: Stream to Feature
arcpy.gp.StreamToFeature_sa(Output_raster_3, Output_flow_direction_raster, Output_polyline_features, "SIMPLIFY")

# Process: Feature Vertices To Points
arcpy.FeatureVerticesToPoints_management(Output_polyline_features, StreamT_raster1_FeatureVerti, "END")

# Process: Watershed
arcpy.gp.Watershed_sa(Output_flow_direction_raster, StreamT_raster1_FeatureVerti, Output_raster_4, "ORIG_FID")

# Process: Stream Order
arcpy.gp.StreamOrder_sa(Output_raster_1, Output_flow_direction_raster, Output_raster_2, "STRAHLER")

Take a look at line 2 of the code. Here we use the “import” command to import the ArcPy module into Python. ArcPy is a site package that is a successor to the arcgisscripting module of ArcGIS 9.2 and 9.3. This package provides a wide range of special scripting abilities included with ArcGIS 10.

From line 5 to line 13, we define the script parameters with the “arcpy.GetParameterAsText()” method. This allows the user to enter the paths of the output datasets.

In line 16, we define the input dataset. In this case it has been hard-coded, meaning that it cannot be changed by the user.

From line 20 to 26, the script “environments” are set, using the “arcpy.env.scratchWorkspace” and “arcpy.env.workspace” properties.

Finally, from line 29 to 50, the geoprocessing tools are executed. For more information on a specific tool's syntax, you can open it through “System Toolboxes” in the ArcMap 10 Catalog window and click on “Tool Help” in the lower-right side of the window.

Once you have checked the code for errors, save it on your local disk with a .py extension.

Now we are ready to implement the script code as a tool into ArcMap.

Open ArcMap 10. In the Catalog window, right-click on “My Toolboxes” > New > Toolbox.

Right-click on the newly created Toolbox > Add > Script… The Add Script wizard will appear:

Give the script a name (no spaces), a label, and a description (optional), and click Next.

In the next window, browse to the .py file you previously saved and click Open.

The next window is the most important. Here we have to define the nine script parameters, in the same order provided in the “arcpy.GetParameterAsText()” code block. You see two columns: “Display Name” (the name that will be shown on the tool window) and “Data Type” (the type of data we have to provide as outputs). We leave “Parameter Properties” as default.

The window should look like this:

Click Finish.

Open the Script tool. It should look like this:


Watershed delineation using ArcGIS Model Builder

In previous posts we took an overview of geoprocessing in ArcGIS and discussed delineating watersheds using the Hydrology tools of the ArcGIS Spatial Analyst toolset. Now we are going to automate this task using Model Builder.

We will create an automated process to delineate watersheds from a Digital Elevation Model. In this example, I will be using as input the same DEM as in the Watershed delineation using ArcGIS Hydrology tools post.

1.- In the ArcMap 10 Catalog window, browse to Toolboxes > My Toolboxes. Right-click on My Toolboxes and select New > Toolbox to add a new toolbox, which we name ModelWatershed.

Right-click on the ModelWatershed toolbox > New > Model… This opens the Model Builder window.

2.- Right-click on the Model Builder window > Create Variable… In the list, select “Raster Layer”. A blank oval is then added to the Model Builder window.

3.- In the Catalog window, browse to System Toolboxes > Spatial Analyst Tools > Hydrology. Drag and drop the “Fill”, “Flow Direction”, “Flow Accumulation”, “Stream Link” and “Watershed” tools into the Model Builder window.

Next, go to System Toolboxes > Data Management Tools > Features and drag the “Feature Vertices To Points” tool into the Model Builder window.

Finally, go to System Toolboxes > Spatial Analyst Tools > Map Algebra and drag the “Raster Calculator” tool into the Model Builder window.

4.- In the Model Builder window standard toolbar, select the “Connect” tool and link the previously added tools. The result will look like this:

5.- Right-click on each of the inputs and outputs (blank ovals) and select the options “Model Parameter” and “Add To Display”. The result is shown in the figure below:

As you can see, a “P” appears next to each input and output, indicating that they are “parameters”.

6.- Right-click on “Raster Layer” (the variable we created in step 2) and select Open… A window appears in which we have to select the DEM we want to use as input. Once the input DEM is selected, you can see how some of the tools become activated, changing their color as shown below:

7.- Right-click on “Raster Calculator” and select Open… A window will appear in which we must specify the minimum number of cells draining to a stream. In this case we write the following expression: “Con (“%Output accumulation raster%”> 20000, 1)”. Here, “Output accumulation raster” is the output of the “Flow Accumulation” tool, and 20000 is the minimum size in pixels for a given watershed. In this example, the pixel size of the DEM is 5×5 m, so the minimum surface of the resulting watersheds will be 500,000 m2 (this procedure avoids the creation of “microbasins”).
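The arithmetic behind the threshold can be checked in a couple of lines:

```python
# Drainage-area equivalent of a 20000-cell flow accumulation threshold
# on a DEM with 5 m x 5 m cells.
cell_size_m = 5.0
threshold_cells = 20000
min_area_m2 = threshold_cells * cell_size_m ** 2
print(min_area_m2)
```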

The Model Builder window will look like this:

8.- Right-click on “Feature Vertices To Points” and select Open… In the field “Point Type (optional)”, specify “END”.

9.- We have to indicate the path in which the outputs will be stored.

Right-click on the “Fill” tool and select Properties… In the “Environments” tab, activate “Workspace” and click on “Values…”. Fill in “Current Workspace” and “Scratch Workspace” with the location of the folder where you want to save the outputs.

10.- Finally, we run the model by clicking on the “Run” tool on the standard toolbar. After a few minutes, this is the final result:

The main advantage of this model is that we can run it as many times as we want using different datasets.

Additionally, Model Builder allows you to export any model to Python code, where you can refine your model by adding advanced conditional and iterative logic. We will see in later posts how to do this.