Yesterday evening I jumped into a Twitter thread started by an old acquaintance of mine; it was the first I’d heard about the record-breaking floods in Nebraska and the surrounding areas. I offered up my services: pulling river data for a closer analysis of the situation.
Dredging USGS water data sources and whipping the results into a usable format is something I already enjoy doing for the Little Colorado River, so why not apply the same technique to current events? Getting it all done in one evening, between bottle-feeds, seemed like a good challenge.
I don’t know the area, so my first step was to query the USGS data service for a list of all water stations in Nebraska that:
- Weren’t knocked offline by the floods
- Measured streamflow or gauge height (I don’t care about water temperature, clarity, etc.)
- Offered “instantaneous values” (a lot of stations only provide once-daily reports)
The USGS has an excellent water services application available. The web UI doesn’t look cutting-edge, but the results are reliable and the data sources are vast.
There were hundreds of stations in Nebraska that met my criteria, so I had the list of station information encoded in Google’s KML format. That let me plot every candidate station in Google Earth, which made it MUCH easier to narrow down the right ones to investigate.
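If you’d like to reproduce that step in code rather than through the web UI, here’s a rough sketch of the station query in Python using the requests library. The parameter codes (00060 for streamflow, 00065 for gauge height) are standard USGS codes; the format=ge option for KML output is my best recollection of the Site Service and worth double-checking against the USGS documentation.

```python
import requests

# USGS Site Service: metadata for monitoring stations matching a filter.
SITE_URL = "https://waterservices.usgs.gov/nwis/site/"

params = {
    "stateCd": "NE",               # every station in Nebraska
    "siteStatus": "active",        # skip stations that are currently offline
    "parameterCd": "00060,00065",  # 00060 = streamflow, 00065 = gauge height
    "hasDataTypeCd": "iv",         # only stations offering instantaneous values
    "format": "ge",                # Google Earth KML (assumed option; verify)
}

resp = requests.get(SITE_URL, params=params, timeout=30)
resp.raise_for_status()

# Save the KML so it can be opened directly in Google Earth.
with open("nebraska_stations.kml", "wb") as f:
    f.write(resp.content)
```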
My first few attempts were a bust. I found stations that were online and in ideal locations, but as I combed through the data I discovered they had all been knocked offline during the peak of the action. Not good.
My third or fourth try was a success: humble USGS station #06467500, along the Missouri River on the border between South Dakota and Nebraska and just downstream from a dam, survived with all of its data intact!
I used a Chrome extension called RESTlet to craft the initial query for that data station. This isn’t the actual data retrieval mechanism I’d use; it’s just a tool that helps me craft the request.
You can run the query yourself: clicking this link will hit that small USGS water station on the Missouri River and pull the last 10 days of data (at 15-minute intervals), right in the browser. However, the results are not exactly human-readable.
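If you’d rather script the pull than click a link, the same request is easy to reproduce against the USGS Instantaneous Values service. This is a sketch in Python, not the exact URL from my Flow, and the loop at the bottom shows how to dig a few readable values out of that JSON:

```python
import requests

# USGS Instantaneous Values service: time-series readings for a station.
IV_URL = "https://waterservices.usgs.gov/nwis/iv/"

params = {
    "sites": "06467500",           # the Missouri River station that survived
    "period": "P10D",              # the last 10 days (ISO 8601 duration)
    "parameterCd": "00060,00065",  # streamflow and gauge height
    "format": "json",
}

data = requests.get(IV_URL, params=params, timeout=30).json()

# Each parameter comes back as its own time series inside the JSON.
for series in data["value"]["timeSeries"]:
    name = series["variable"]["variableName"]
    readings = series["values"][0]["value"]
    print(f"{name}: {len(readings)} readings")
    for r in readings[-3:]:  # the three most recent 15-minute readings
        print(f"  {r['dateTime']}  {r['value']}")
```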
I turned to Microsoft Flow to help automate and format the retrieval. Flow would perform the data pull I needed and tuck the results, neatly organized, into a custom SharePoint list.
But first, Flow needed a little human help to make proper use of the data. I followed some excellent advice from Microsoft MVP John Liu about whipping the JSON result schema into workable shape, and it wasn’t long before my Flow was off and running.
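Flow handles this with a Parse JSON action and a loop, but the transformation itself is simple to picture. Here’s a rough Python equivalent of what the Flow does to each result before filing it away; the column names on the rows are my own invention, not the actual SharePoint fields:

```python
def flatten_timeseries(data):
    """Flatten the nested USGS JSON into one simple row per reading,
    roughly what the Flow does before writing to the SharePoint list."""
    rows = []
    for series in data["value"]["timeSeries"]:
        site = series["sourceInfo"]["siteName"]
        variable = series["variable"]["variableName"]
        for reading in series["values"][0]["value"]:
            rows.append({
                "Site": site,          # hypothetical column names
                "Variable": variable,
                "Timestamp": reading["dateTime"],
                "Value": float(reading["value"]),
            })
    return rows
```

Feed it the `data` object from the earlier query and you get a flat list of rows that maps neatly onto list columns.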
Once I had it dialed in, it took my Flow about 10 minutes to complete its tasks. With the data in SharePoint, it’s off to the races with PowerBI, Microsoft’s enterprise-grade data analytics and reporting tool. In the right hands, PowerBI can work wonders. With a fussy baby to contend with, a shaded line graph was about the best I could muster last night!
The best part about work done in Flow and PowerBI is that it can be automated. What I did last night can be run on a schedule, and one could build a sophisticated dashboard of river data graphed in near real-time!
Even a simple line graph shows us cool things!
Because the sensor providing this data sits immediately downstream from a dam, you can see every increase and decrease in water flow as the engineers battled the tremendous flood. The river swelled over ten feet in 24 hours, and that’s with a staffed, controlled dam immediately upstream!
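Out of curiosity, that ten-foot figure is easy to sanity-check outside of PowerBI. A quick sketch, assuming the flattened rows from the earlier snippet and that the USGS labels the variable with “Gage height”, finds the largest rise across any 24-hour window:

```python
from datetime import datetime, timedelta

def max_rise_24h(rows):
    """Largest gauge-height increase across any 24-hour window."""
    # fromisoformat (Python 3.7+) copes with the UTC offset USGS includes.
    gage = sorted(
        (datetime.fromisoformat(r["Timestamp"]), r["Value"])
        for r in rows
        if "Gage height" in r["Variable"]
    )
    best = 0.0
    for i, (t0, v0) in enumerate(gage):
        for t1, v1 in gage[i:]:
            if t1 - t0 > timedelta(hours=24):
                break  # past the 24-hour window; move to the next start
            best = max(best, v1 - v0)
    return best  # in feet
```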
It must have been one hell of a night at the floodgates.