Turn any ICM or SWMM model into a forecasting system in 60 minutes with ICMLive/SWMMLive (Includes scripts – for US users)

Models no longer need to sit on the shelf going out of date between master plans. Improvements in computing resources, software, and weather forecasts make it possible to turn any model into a real-time forecasting system.

ICMLive lets you connect your ICM model to live data streams (SWMMLive does the same for SWMM models) and then run it automatically on a server to produce forecasts and alerts. Once you have a model, the rest is quite easy. If you’re an InfoSWMM user, you’ll need to export your desired scenario as a SWMM INP text file and import it into SWMMLive.

Prerequisites:
• InfoWorks ICM or InfoSWMM with a calibrated model
• Install ICMLive/SWMMLive, Workgroup Data Server, Live Server, and Data Loader (refer to our Getting Started Document)

The first requirement is a stream of forecast rainfall (though not all ICMLive users use forecasts). In the continental US, the National Weather Service provides free, high-resolution forecast data that can be downloaded regularly. In this example, we will use data from the High-Resolution Rapid Refresh (HRRR) model.

The following PowerShell script can be modified with local coordinates to auto-pull the latest forecast data. You can set ICMLive/SWMMLive to call this script whenever it pulls data on its set schedule.

#-----------------------------------------------------------
# Run this script ~20 min after each hour of the day.
# Edited Dec 5, 2017 by Nathan.Gerdts@Innovyze.com
#-----------------------------------------------------------
#-----------------------------------------------------------
# Specify Local Inputs -
# - time zone, Lat-Lon window, and Local file path
#-----------------------------------------------------------
$TimeZoneUTC = -7 # e.g. PST=-8, EST=-5 ...
$Lon_min = -124
$Lon_max = -123
$Lat_min = 37
$Lat_max = 38
$Local_Path = "D:\temp\"

#-----------------------------------------------------------
# Get current UTC date and hour in string format
#-----------------------------------------------------------
# Shift to UTC with DateTime arithmetic so the date rolls over
# correctly at month and year boundaries; subtract one more hour
# since files are usually posted about an hour late.
$UTC_Time = (Get-Date).AddHours(-$TimeZoneUTC - 1)
$today = $UTC_Time.ToString("yyyyMMdd")
$HR_char = $UTC_Time.ToString("HH")
#-----------------------------------------------------------
# Prepare other inputs
#-----------------------------------------------------------
$str1 = "http://nomads.ncep.noaa.gov/cgi-bin/filter_hrrr_sub.pl?file=hrrr.t"
$str2 = "z.wrfsubhf"
$str3 = ".grib2&var_PRATE=on&subregion=&leftlon="+$Lon_min+"&rightlon="+$Lon_max+`
 "&toplat="+$Lat_max+"&bottomlat="+$Lat_min+"&dir=%2Fhrrr."+$today
$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
#-----------------------------------------------------------
# Loop for each forecast hour
#-----------------------------------------------------------
For ($i = 0; $i -le 18; $i++) {
 $fhr = $i.ToString("00") # zero-padded forecast hour
 $url = $str1 + $HR_char + $str2 + $fhr + $str3
 $output = $Local_Path + "HRRR_" + $today + "_" + $HR_char + "_" + $fhr + "_.grib"
 #write-output $url # uncomment to inspect the request URL
 Invoke-WebRequest $url -WebSession $session -TimeoutSec 900 -OutFile $output
}
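For reference, the URL the script assembles can be sanity-checked outside PowerShell. The following Python sketch rebuilds the same NOMADS filter URL for a single forecast hour; the function name and the sample date and bounding box are illustrative placeholders, not values from a live run:

```python
def hrrr_url(date, cycle_hr, fcst_hr, lon_min, lon_max, lat_min, lat_max):
    """Rebuild the HRRR grib-filter URL the PowerShell script assembles,
    so the pieces (cycle hour, forecast hour, bounding box) can be checked."""
    base = "http://nomads.ncep.noaa.gov/cgi-bin/filter_hrrr_sub.pl"
    return (f"{base}?file=hrrr.t{cycle_hr:02d}z.wrfsubhf{fcst_hr:02d}.grib2"
            f"&var_PRATE=on&subregion=&leftlon={lon_min}&rightlon={lon_max}"
            f"&toplat={lat_max}&bottomlat={lat_min}&dir=%2Fhrrr.{date}")

# Cycle hour 14, forecast hour 3, same window as the script above
url = hrrr_url("20171205", 14, 3, -124, -123, 37, 38)
```

Note that the cycle and forecast hours are zero-padded to two digits, matching the `$HR_char` and `$fhr` strings built in the script.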

We can also freely pull NEXRAD data for almost any large city in the US. ICMLive can combine numerous rainfall sources with varying timesteps and spatial resolutions, and apply them in ranked priority to the model.

NOAA provides an FTP site containing the latest NEXRAD data for each site. The easiest way that I have found for auto-retrieving data from an FTP site is to install the free software WinSCP. With this installed, the following batch file will auto-pull any files from the site that have been uploaded within the last two hours (or desired interval) for the specified NEXRAD station.

@echo off
rem ----------------------------------------------------------
rem - Script for pulling the latest radar data from an FTP site
rem - Requires local install of WinSCP
rem - https://winscp.net/eng/index.php
rem - NEXRAD site codes:
rem - https://www.roc.noaa.gov/wsr88d/Images/WSR-88DCONUSCoverage1000.jpg
rem ----------------------------------------------------------
rem - Define local paths and desired window for pulling files.
set "localpath=C:\Real_Time_Site\NEXRAD\"
set "sitecode=kftg" & rem NEXRAD station site code
set "window=2h" & rem files added in this time period will be imported
set "ftp_UN=anonymous" & rem user name login for FTP (if applicable)
set "ftp_PW=pw" & rem password for FTP (arbitrary for public sites)
rem - Assemble FTP site path
set "ftpsite=ftp://%ftp_UN%:%ftp_PW%@tgftp.nws.noaa.gov"
set "ftploc=/SL.us008001/DF.of/DC.radar/DS.176pr/SI.%sitecode%/"

rem - Call WinSCP to pull requested window of files from FTP to local folder
"C:\Program Files (x86)\WinSCP\WinSCP.com" /log="%localpath%log\WinSCP.log" ^
 /ini=nul /command ^
 "open %ftpsite%%ftploc%" ^
 "get *>%window% %localpath%*" ^
 "exit"
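The `*>%window%` mask tells WinSCP to fetch only files modified within the trailing window (two hours here). The same filter can be sketched in plain Python; the function name and timestamps are hypothetical, purely to illustrate the selection logic:

```python
from datetime import datetime, timedelta

def within_window(mod_times, now, hours=2):
    """Keep only modification times inside the trailing window,
    mimicking WinSCP's time-constraint file mask."""
    cutoff = now - timedelta(hours=hours)
    return [t for t in mod_times if t >= cutoff]

now = datetime(2017, 12, 5, 14, 30)
times = [datetime(2017, 12, 5, 14, 0),   # 30 min old  -> kept
         datetime(2017, 12, 5, 11, 0)]   # 3.5 hrs old -> skipped
recent = within_window(times, now)
```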

Spatial time series database configuration example

Once we run each of these scripts to retrieve sample files, we can set up spatial TSDBs in ICMLive/SWMMLive to auto-import them. Below is the configuration I use for my HRRR forecast data. In the screenshot you’ll see I use 18 GRIB files per forecast, corresponding to 18 hours ahead of each current time. I have also highlighted the optional Script box. Here I call my PowerShell script above, so that every time ICMLive/SWMMLive is told to upload data for model runs, the script is first called to download the data. I don’t need to crop the HRRR data here since my script already does so, but you will most likely want to crop the NEXRAD data in this configuration.

Once imported, you can verify that the rainfall grid overlays on your model. Simply drag the Spatial time series database onto your network and ensure that Rainfall is turned on in the Properties and Themes and that Show radar cell boundaries is checked on the Visual tab. The following screenshot shows the NEXRAD data cropped over my network.

 

Sample view of NEXRAD data over an ICM model

Simple Run configuration using the new rainfall data

Of course, we could also connect the model to live telemetry feeds at this point for real-time control and model validation, but we’ll move on since the title of this post promises 60 minutes.

We next create a run that uses the new radar data along with any other settings we want for our ongoing live runs. The order in which you drag the TSDBs onto the run determines which rainfall source takes priority when both are available, so be sure your measured rainfall appears above the forecast rainfall.

 

 

Basic Manifest configuration for hourly runs

The final step is to create a Manifest, an object that defines how the model is deployed for regular runs on the server. Simply drag the previously created model run into the Default run box and set up a Run schedule. Since the HRRR forecast data arrives hourly, around 20 minutes after the hour, I’ve set the schedule below to run hourly with a 30-minute offset so that each run catches the latest data. More involved settings can make the model watch only rainfall and runoff until it detects a threshold amount of rain, then run full simulations frequently during those wet events. Similarly, if your model takes a long time to run, a simplified model could run on a regular basis and trigger the full simulation during wet events.
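To see why the 30-minute offset lines up with the data, consider which HRRR cycle is available when a run fires. The sketch below assumes the same roughly one-hour posting lag that the download script allows for; the function name is illustrative only:

```python
from datetime import datetime, timedelta

def latest_hrrr_cycle(run_time_utc):
    """Latest HRRR cycle hour expected to be fully posted at run time,
    assuming cycles appear roughly an hour after their valid hour
    (the same lag the download script subtracts)."""
    return (run_time_utc - timedelta(hours=1)).hour

# An hourly run at 14:30 UTC picks up the 13z cycle, posted around 14:20.
cycle = latest_hrrr_cycle(datetime(2017, 12, 5, 14, 30))
```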

 

 

Deploy Manifest

Once we drag this Manifest into a Manifest Deployment and click Deploy, the model will run on the server at the set schedule. One powerful feature of the live runs as configured above is that each simulation saves its state just before the end of the period with measured data, and the next run simply picks up where that state left off. This means the runs are stitched together hydrologically and the run time is roughly cut in half.


    About Nathan Gerdts

    Nathan is a Client Service Manager and has helped utilities with training, implementation, and data review.
This entry was posted in ICMLive, InfoSWMM, InfoWorks ICM, SWMMLive.