Sharing rivers of data

Contributed by: 
Peter Taylor,
 Research Engineer,
 Digital Productivity Flagship

Measuring freshwater resources in our rivers and catchments is a tricky business. Even estimating how much water is flowing past a point in a river raises many questions: how fast is the water moving, what is the shape of the river channel, are there weeds growing on the bank, and how does the speed of the water vary across the river section? Field hydrologists spend their days measuring and updating all this information to provide accurate estimates of how water moves across the landscape. These data inform many decision-making processes. 
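The questions above hint at how discharge is estimated in practice: the river cross-section is divided into vertical panels, and the velocity measured in each panel is multiplied by that panel's area. The sketch below shows a simplified velocity-area calculation in Python; the panel widths, depths, and velocities are illustrative values, not data from the experiment.

```python
# Simplified velocity-area discharge estimate: divide the river
# cross-section into vertical panels and sum velocity x area per panel.
# All numbers here are illustrative, not real gauging data.

def discharge(widths, depths, velocities):
    """Total discharge (m^3/s) from per-panel width (m), depth (m),
    and mean velocity (m/s)."""
    return sum(w * d * v for w, d, v in zip(widths, depths, velocities))

# Three panels across a small stream: shallower and slower near the banks.
q = discharge(widths=[1.0, 1.0, 1.0],
              depths=[0.5, 1.0, 0.5],
              velocities=[0.2, 0.4, 0.2])
print(q)  # total flow past the section, in m^3/s
```

In practice many more panels are used, and the per-panel velocities are themselves averages of readings at several depths, which is why gaugings are labour-intensive to collect.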

Figure: Rating table visualization in the WaterML 2.0 IE. A web-based visualization client was developed that made use of WaterML 2.0's open interfaces and information objects. As each service was validated, it was made available for visualization in the web client, which allows ratings and gaugings to be viewed across a selection of sites for each service implemented within the IE.

Increasingly, other parties interested in freshwater resources are using the data collected by hydrologists. Flood warning agencies use the data to keep flood predictions and warnings as up to date and accurate as possible. Engineers designing structures close to rivers or water storages analyze flood frequency and inundation levels. The uses continue to grow as fresh water becomes scarcer and more valuable to communities. 

Members of the joint OGC/WMO Hydrology Domain Working Group (HydroDWG) have recently concluded an Interoperability Experiment (IE) that investigated sharing data about rivers and water storages. The data types vary from direct observations (water velocity, levels, river cross-sections) to derived data that capture relationships between closely related properties (e.g. river level and flow). As part of this experiment the group has implemented prototype web services and visualization tools to refine the exchange information model and formats. The experiment tested existing OGC standards, while also investigating the use of RESTful web services and JSON encodings, which are on the radar for future OGC standards activities. 
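The derived data mentioned above are typified by a rating curve, which converts an easily measured river level (stage) into a flow estimate. A common functional form is the power law Q = C(h - h0)^b; the sketch below uses invented coefficients for illustration, not values from the experiment.

```python
# A power-law rating curve converting river stage to discharge:
#   Q = C * (h - h0) ** b
# where h is the observed stage, h0 is the stage of zero flow, and
# C and b are coefficients fitted from gaugings. The coefficient
# values below are invented for illustration.

def rating_discharge(stage, c=1.5, h0=0.2, b=2.0):
    """Discharge (m^3/s) for a given stage (m) via a fitted rating."""
    if stage <= h0:
        return 0.0  # at or below the point of zero flow
    return c * (stage - h0) ** b

print(rating_discharge(1.2))  # discharge for a stage of 1.2 m
```

Because channels change over time (erosion, weed growth, sediment), ratings are periodically re-fitted against new gaugings, which is one reason the IE exchanged ratings and gaugings together.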

Key findings included:

  1. The use of JSON encodings and RESTful services was considered desirable by all participants in the IE, given their consistency with existing and planned data services, which resulted in lower development times. The group wanted to maximize testing of the standard's core concepts while minimizing development time (given that most work is done in-kind). 

  2. There was a noticeable lack of common JSON patterns that could be reused across OGC in areas such as metadata headers, service capability descriptions, structuring of collections, handling of links, mapping of common GML types, and mechanisms for handling large responses (e.g. paging). 

  3. Generated documentation provided an up-to-date description of the services and an easy way for developers to interact with and test them. Swagger was used to document, and provide interactive examples for, the reference implementation of a RESTful API. Linking web-based specifications more closely to reference implementations may be a good way to lower barriers for developers using standards-based services. 
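To make the first two findings concrete, the sketch below shows the kind of JSON response such a RESTful service might return for a collection of gaugings, including link-based paging for large responses. The field names, URLs, and structure here are hypothetical illustrations, not the encoding defined by the IE.

```python
import json

# Hypothetical JSON encoding of a paged collection of gaugings, of the
# general kind the IE participants prototyped. All field names and URLs
# are invented for illustration.
response = {
    "collection": "gaugings",
    "site": "example-site-001",
    "items": [
        {"time": "2014-03-01T10:00:00Z", "stage_m": 1.2, "discharge_m3s": 1.5},
        {"time": "2014-03-08T10:15:00Z", "stage_m": 0.9, "discharge_m3s": 0.7},
    ],
    # A common pattern for handling large responses: paging links.
    "links": {
        "self": "https://example.org/api/sites/example-site-001/gaugings?page=1",
        "next": "https://example.org/api/sites/example-site-001/gaugings?page=2",
    },
}

body = json.dumps(response, indent=2)  # what the service would send
parsed = json.loads(body)              # what a client would do with it
print(len(parsed["items"]), parsed["links"]["next"])
```

Agreeing on shared conventions for the "links" and collection structure shown here, rather than reinventing them per standard, is exactly the gap the second finding identifies.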

Some software vendors involved are already looking to use the services as a data publishing mechanism for their clients.

The full details of the experiment have been published in an OGC public engineering report, WaterML 2.0 Part 2 – IE Engineering Report.