InfoNet 18.5 is now out in the wild, and along with a wealth of new functionality we’ve included some changes that will hopefully make day-to-day usage more pleasant. One of these is the SQL dialog box, which is now resizable. So for those of you who indulge in SQL, and specifically in writing long and complex queries, the following should bring a smile to your face.
We all know the classic SQL Dialog box…
Classic SQL Dialog Box
InfoWorks ICM version 8.0 was released with several new features. One new feature is the “Averaged Spatial Rainfall on TVD connectors” and this post is an example of the flexibility that it offers you.
What is it for?
Have you ever wondered what the rainfall over an area is? This question can be answered simply with the “Averaged Spatial Rainfall on TVD connectors” feature.
The increasing resolution of spatial rainfall from radar data, with spatial resolutions as detailed as 50 metres, means the number of radar cells over an area can be significant. Instead of analysing each cell, you might want to analyse the average rainfall intensity over the area as a whole. This new feature enables you to calculate the averaged spatial rainfall on a TVD connector polygon, which can then be used to plot rainfall averages and also to trigger alerts in a live system with InfoWorks ICMLive.
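The averaging idea can be sketched in a few lines of Python. This is not InfoWorks ICM code; it simply shows an area-weighted average over radar cells that overlap a polygon, with made-up intensities and overlap areas:

```python
# Hypothetical sketch: area-weighted average rainfall intensity over a
# polygon, given radar cells that each overlap the polygon by some area.
# (intensity_mm_per_hr, overlap_area_m2) pairs are illustrative only.
cells = [
    (12.0, 2500.0),   # 50 m x 50 m cell fully inside the polygon
    (8.0, 1200.0),    # cell partially overlapping the polygon
    (15.5, 2500.0),
]

total_area = sum(area for _, area in cells)
avg_intensity = sum(i * area for i, area in cells) / total_area

print(round(avg_intensity, 2))  # average intensity in mm/hr
```

Weighting by overlap area means a cell that barely clips the polygon contributes proportionally less than one fully inside it.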
This is the fourth instalment in a series of blogs I’ve written summarising run times for 2D simulations conducted on PC hardware and NVIDIA GPU cards. The first was in December 2012, I then followed up with a second in September 2016 and a third in June 2017. Each blog shows how InfoWorks ICM has been designed to leverage technological improvements as soon as new hardware platforms and new GPU cards come onto the market.
The tables below give simulation runtimes for two of our standard 2D testing models, run on a high-end AMD-based workstation containing an NVIDIA GeForce GTX 1070Ti GPU card and a high-end Intel Xeon server fitted with an NVIDIA Quadro GP100 GPU card.
At the end of running a rehab flowchart, the rehab summary report is generated which shows the assigned rehab action for each pipe. However, many utilities and engineering consultants do not perform rehab actions on an individual pipe basis, but on many pipes grouped into individual projects. This blog addresses how InfoMaster users can bundle pipes into work orders and projects.
Many times, our clients will have custom inspection data for assets and will want to bring that information into InfoMaster. Because InfoMaster is extremely flexible with both GIS and tabular data, there are lots of ways of doing this, but the technique below is one suggested method. The following steps describe how custom inspection data can be brought into InfoMaster through the Work Manager tool.
1. First, convert your survey data to an attribute table and import this attribute table into your InfoMaster project database. Below is my example:
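The conversion in step 1 can be sketched outside of ArcMap as well. The snippet below is only an illustration, using hypothetical field names (AssetID, InspectionDate, DefectCode), of flattening raw survey records into a simple attribute table ready for import:

```python
# Illustrative sketch: read raw survey records and build a flat
# attribute table (a list of row dictionaries) for import.
# All field names and values here are hypothetical.
import csv
import io

raw_survey = io.StringIO(
    "AssetID,InspectionDate,DefectCode\n"
    "P-101,2018-03-02,CC\n"
    "P-102,2018-03-02,FL\n"
)

attribute_table = list(csv.DictReader(raw_survey))
for row in attribute_table:
    print(row["AssetID"], row["DefectCode"])
```

In practice the import itself would be done with the geodatabase tools described above; this only shows the shape of the data the Work Manager tool expects, one record per inspection.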
In InfoWorks ICM, it is possible to assign initial node levels or inflows, as well as initial depths and flows along a river reach. If you have attempted this by creating an Initial Conditions 1D object and applying it to your Hydraulic Run, only to find that the initial conditions specified were not applied, you might have been missing a few crucial steps. This only occurs if the initial conditions have been partially applied, i.e. not all nodes/links have had initial conditions specified.
Simply applying your Initial Conditions 1D object to your run may not ensure that the initial conditions are applied if some of the adjacent nodes/sections do not have their initial conditions set. In that case, there will be a message in the log file:
Warning 1124: Some nodes and links in the network are not in the state. SIM will fill in levels from known state nodes.
There will then be an initialization phase which propagates any defined levels from the 1D initial conditions object throughout the network.
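To make the propagation idea concrete, here is a minimal sketch of one way such a fill could work. This is not the actual SIM algorithm, just a breadth-first fill in which each node without a known state inherits the level of its nearest known-state neighbour, over a hypothetical four-node network:

```python
# Minimal sketch (not InfoWorks ICM's actual algorithm): propagate
# known levels outward from the nodes that have an initial state.
from collections import deque

links = [("A", "B"), ("B", "C"), ("C", "D")]   # hypothetical network
levels = {"A": 10.5, "D": 9.8}                 # nodes with a known state

# Build an undirected adjacency map from the link list.
adj = {}
for u, v in links:
    adj.setdefault(u, []).append(v)
    adj.setdefault(v, []).append(u)

# Breadth-first fill: unset nodes take the level of the nearest known node.
queue = deque(levels)
while queue:
    node = queue.popleft()
    for nbr in adj.get(node, []):
        if nbr not in levels:
            levels[nbr] = levels[node]
            queue.append(nbr)

print(levels)
```

Here B inherits its level from A and C inherits from D, since each is closest to that known-state node. The real initialization phase is of course hydraulic rather than purely topological, but this is the spirit of "filling in levels from known state nodes".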
One of the main benefits of InfoMaster is the ability to leverage all of the different applications already within ArcMap. Because InfoMaster is completely built within the geodatabase structure, all InfoMaster data is also ArcGIS data. This makes it very easy for users to generate data and results with InfoMaster and then transform that data into a raw attribute table, feature class, or shapefile, so that anyone with ArcMap can view the data.
Below is a quick example of how seamless and user-friendly this process can be. Say, for example, that you want to export your risk results from your gravity main facility type as a map display. First, you would generate a map display by right-clicking on the desired risk analysis and selecting Map Display…
In the Thematic Mapping dialog, select the display preferences and then click Update. Users can adjust weights, colors, labels, etc. and save the theme so that it can easily be recalled in the future. Also note in the screenshot below that InfoMaster’s Thematic Mapping window has been simplified for a more user-friendly experience in InfoMaster Update 8.1.
Ever wonder how the LOF and COF Contribution Pie Charts are generated in InfoMaster? If so, read on!
The percentages are calculated when InfoMaster sums the total COFs and LOFs in the risk report.
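The calculation itself is straightforward: each factor's contribution is its summed score divided by the grand total. As a sketch (with made-up factor names and scores, not InfoMaster output):

```python
# Sketch of the pie-chart percentage calculation: each LOF factor's
# slice is its summed score over the grand total. Values are made up.
lof_scores = {"Age": 120.0, "Material": 80.0, "Defects": 200.0}

total = sum(lof_scores.values())
percentages = {name: 100.0 * score / total for name, score in lof_scores.items()}

print(percentages)  # Age 30%, Material 20%, Defects 50%
```

The COF chart works the same way, just summing the consequence-of-failure scores instead.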
Note: This blog applies to all versions of InfoMaster before 8.5. In 8.5, the basic relationships still apply (Rehab Costs -> Rehab Methods, Rehab Methods -> Rehab Actions), but the interface has been completely changed. For information on this same topic in InfoMaster 8.5, see 8.5: Rehab Actions and Costs Interface.
In InfoMaster, there are five special tables to help users go from defect codes to costs for completing rehab actions. These five tables can be found under the main InfoMaster dropdown:
Along with many other, more subtle updates in InfoMaster 8.0, what used to be called Reliability Analysis has been upgraded and renamed Failure and Deterioration Modeling. The Failure/Deterioration tools in InfoMaster allow users to select from a multitude of industry-adopted deterioration models. These models estimate failure probability for pipes as a function of time. Advanced knowledge of statistics is NOT required to run them! Rather, the user must simply have an understanding of which pipes have already failed (and the criteria that define a failure), and of course, the data to back it up.
Getting started with these models is as simple as working your way down through the four steps in the Failure/Deterioration Modeling module.