Maritime Forum

1st EMODnet Technical Working Group Meeting

Published on: Wed, 15/05/2019 - 13:33
    The EMODnet Technical Working Group consists of the data portal managers and developers involved in EMODnet projects. The group meets at least twice per year to discuss common challenges and solutions in order to provide better services to EMODnet users.

    Date: 23 October 2015 - 9h00 to 13h00

    Meeting Venue: InnovOcean site, Wandelaarkaai 7, Ostend, Belgium



    Participants

    Nathalie LEIDINGER – SHOM, France – Coastal Mapping
    Christophe LAMPIN – WORLDLINE, France – Coastal Mapping
    Joseph COLIN – WORLDLINE, France – Coastal Mapping
    Matteo VINCI – OGS, Italy
    Dick SCHAAP – MARIS, Netherlands – Bathymetry
    Francisco SOUZA DIAS – VLIZ, Belgium – Central Portal / Biology
    Simon CLAUS – VLIZ, Belgium – Central Portal / Biology
    Graeme DUNCAN – JNCC, UK – Seabed Habitats
    Francisco HERNANDEZ – VLIZ, Belgium
    Jonathan LOWNDES – BGS, UK
    Human Activities
    Antonio NOVELLINO – ETT, Italy
    Thomas LOUBRIEU – IFREMER, France – Physics / Chemistry
    Gerrit HENDRIKSEN – DELTARES, Netherlands – Chemistry / Biology
    Alexander BARTH – ULG, Belgium – Chemistry / Biology
    Lennert TYBERGHEIN – VLIZ, Belgium
    VLIZ, Belgium – Central Portal
    Bruno VITORINO – VLIZ, Belgium
    Filip WAUMANS – VLIZ, Belgium – Central Portal / Biology
    Oonagh McMEEL – Seascape Consultants, UK – EMODnet Secretariat
    Seascape Consultants, UK – EMODnet Secretariat
    Jan-Bart CALEWAERT – Seascape Consultants, UK – EMODnet Secretariat


    Meeting Minutes


    Currently the Query Tool allows users to retrieve information from a limited selection of web services and append it to coordinate points. Users may either use a set of coordinate points already available in the Query Tool (a demo file) or upload their own, provided that they register and log in. Two outputs are generated: a table consisting of the original data uploaded to the Query Tool plus the value of each web service at each coordinate point, and a web-page report with summarised information and a map showing the location of the coordinate points.
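    The table output described above amounts to a join of the user's coordinate points with one value per web service. A minimal sketch follows; the service callables and column names are illustrative placeholders, not the Query Tool's actual interface:

```python
import csv
import io

def append_service_values(points, services):
    """Append one column per web service to a table of coordinate points.

    points:   list of dicts, each with at least 'lat' and 'lon' keys
    services: mapping of column name -> callable(lat, lon) returning a value
    Returns the combined table as CSV text.
    """
    fieldnames = list(points[0]) + list(services)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    for point in points:
        row = dict(point)
        for name, lookup in services.items():
            row[name] = lookup(point["lat"], point["lon"])
        writer.writerow(row)
    return buf.getvalue()
```

    In the real tool each callable would wrap a request to one of the thematic web services.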

    The selection of web services used in the Query Tool is not fully interoperable. This is due not only to incomplete compliance with OGC standards but also to additional issues preventing the services’ interoperability within the context of the Query Tool. During development, developers at VLIZ wrote dedicated scripts for each web service so that its output could be fully interoperable. The issues with each web service were identified and a set of recommendations was discussed. Ideally, if all recommendations are followed when producing new web services (or adapting existing ones), the services can be “plugged” into the Query Tool without further adaptation.


    Meeting Summary & Action Points

    The first Technical Working Group meeting took place in Ostend on 23 October 2015. The meeting was part of a week of partner meetings during the first EMODnet Conference/Jamboree.

    Francisco Souza Dias chaired the meeting, opening with a short presentation on the status of the Query Tool and its current functionality.

    During this presentation it was stressed that the Central Portal does not hold data and only provides access to data products. Similarly, the Central Portal does not produce data products; these are made by each thematic lot and then used within the Central Portal.

    Recommendations presented during the meeting:

    • OGC does not define output formats for the WFS GetFeature and WMS GetFeatureInfo requests. Provide an output format suitable for data exchange; GeoJSON is recommended (output example added to the annex).

    Filip Waumans (Central Portal) stresses the importance of using the same type of service (WFS) everywhere rather than a mix of formats (e.g. REST).

    Dick Schaap (Bathymetry) confirms that the bathymetry WMS returns an RGB colour code instead of a depth value, and points out that bathymetry users prefer WMS to WFS.
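    As a sketch of the GeoJSON recommendation above, a WFS 2.0 GetFeature request asking for GeoJSON output in WGS 84 could be built as follows; the endpoint and layer name are hypothetical, not actual EMODnet services:

```python
from urllib.parse import urlencode

# Hypothetical service endpoint; real EMODnet endpoints differ per thematic lot.
BASE_URL = "https://example.org/geoserver/ows"

def getfeature_url(type_name, bbox, srs="EPSG:4326"):
    """Build a WFS 2.0 GetFeature URL requesting GeoJSON output."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
        "srsName": srs,                       # WGS 84, as recommended
        "bbox": ",".join(map(str, bbox)) + "," + srs,
        "outputFormat": "application/json",   # GeoJSON, as recommended
    }
    return BASE_URL + "?" + urlencode(params)
```

    A client could then call, for example, `getfeature_url("emodnet:mean_depth", (2.8, 51.1, 3.0, 51.3))` and parse the GeoJSON response directly, with no service-specific conversion script.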

    • Each web service should provide only the layers relevant for the Central Portal, or at least these should be identified.
    • Use WGS 84 (EPSG:4326) as the coordinate system for the different layers. (It was pointed out during the meeting that INSPIRE recommends the European Terrestrial Reference System 1989 – EPSG:4258; further discussion is needed to ensure INSPIRE/OGC/Central Portal interoperability.)
    • Use the elevation and time parameters of the WMS service for elevation-dependent or time-dependent data.
    • If parameters other than latitude, longitude, depth, and time have to be included in the request, the WFS call must be able to include a customised filter parameter.
    • Don’t use different layers for different geographical regions. If different layers are used, the portal has to send several requests to retrieve the data for one coordinate.

    At least for the purpose of the Query Tool, the layers must be structured in such a way that a single request is sufficient to retrieve a parameter value for any point in Europe. Francisco Hernandez asked for products to be unified: a given location should return the same value, not different values depending on the calculation type (different dataset, sea basin, or season/time), and not all underlying information needs to be exposed.

    Graeme Duncan (Seabed Habitats) agreed to produce layers covering all sea basins for the purpose of the Query Tool. These will return the classification system and its value for a given coordinate point.

    • For discrete data (e.g. stations) the Query Tool searches for data from the nearest station.

    Searching for the nearest station is not possible via WFS. In that case a REST service returning the nearest station and the distance between the requested coordinate and the station is desirable.
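    Such a nearest-station lookup amounts to a great-circle distance search over the station catalogue. A minimal sketch, with invented station IDs and coordinates purely for illustration:

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two WGS 84 points (haversine formula)."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def nearest_station(lat, lon, stations):
    """Return (station_id, distance_km) for the station closest to (lat, lon).

    stations: mapping of station_id -> (lat, lon)
    """
    return min(
        ((sid, haversine_km(lat, lon, s_lat, s_lon))
         for sid, (s_lat, s_lon) in stations.items()),
        key=lambda item: item[1],
    )
```

    A REST endpoint wrapping this function could return both fields of the tuple, so the user knows how far the reported measurement is from the requested point.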



    New use case: select an area around a vessel, aquaculture cage, or wind farm site, and retrieve summarised information from each service:

    • Depth (min, max, avg)
    • Seabed Habitats – all occurring habitats given in % and km²
    • Seabed Geology – all occurring substrate types given in % and km²
    • Species
    • Nearest physics station and its values (min, max, avg) for sea level, temperature, etc.
    • Chemical contaminants
    • Human activities (submarine cables, Natura 2000 sites, MPAs)


    The web services’ output will be recalculated to give %, min, max, and avg values for the area selected by the user.
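    That recalculation can be sketched for two of the products, depth values and habitat areas; the input shapes are assumptions for illustration, not the services’ actual output formats:

```python
def summarise_depths(values):
    """Min / max / average of the depth values falling inside the selected area."""
    return {
        "min": min(values),
        "max": max(values),
        "avg": sum(values) / len(values),
    }

def habitat_percentages(cells):
    """Share of the selected area per habitat class.

    cells: iterable of (habitat_class, area_km2) pairs
    Returns {habitat_class: percentage_of_total_area}.
    """
    totals = {}
    for habitat, area in cells:
        totals[habitat] = totals.get(habitat, 0.0) + area
    grand_total = sum(totals.values())
    return {habitat: 100.0 * area / grand_total
            for habitat, area in totals.items()}
```

    The same pattern (aggregate per class, then normalise by the total) applies to the seabed geology substrate types.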

    Dick Schaap commented that the summarised information needs to be paired with a map viewer. The current WMS layers can easily be integrated into the map viewer.

    Gerrit Hendriksen comments that WFS is not suitable for gridded data or time series; some layers have up to 50 million records. WCS should be used instead. More information on possible ways to develop the use case presented above was provided by Gerrit by email afterwards (in annex).

    Francisco Hernandez suggests that the recalculation of the service output values within the user-specified area may be done centrally at the Central Portal level rather than each lot providing gridded products.

    The usage of OPeNDAP was suggested as a means to transfer data across the web (more in annex).


    Single Sign-On

    Thomas Loubrieu introduces the single sign-on – Marine-ID – used by IFREMER and SeaDataNet.

    Marine-ID makes use of pwm project implementation.

    The members agree on using Marine-ID and Copernicus as additional access systems for the data portal.

    VLIZ will discuss implementation directly with Thomas Loubrieu.

    Antonio Novellino notes that not all user databases are linked yet and that the Central Portal should also allow other login platforms for the time being. Solutions and services developed in one field may not be easily adopted by other systems or services (e.g. Marine-ID is a valuable initiative and service, but it is not straightforward to adopt it in Physics, Biology, etc.).

    More information is available in T. Loubrieu’s presentation – Sextant Product Catalogue.


    Data Catalogue

    Presentation on the data catalogue by Thomas Loubrieu (IFREMER).

    The data catalogue covers products:

    • Synthetic maps (geospatial analyses, rasters, vectors)
    • Compilations of observations with homogeneous content (observed properties) and processing level (e.g. quality control)

    Bathymetry, Chemistry, Physics, Biology, and the MedSea Checkpoint are already included.

    Data catalogue functions include:

    • Browse/administration
    • Product description editing:
        ◦ product-manager oriented, tailor-made
        ◦ controlled vocabularies and contact directories
        ◦ online resource URLs
    • Validation workflow
    • INSPIRE and OGC compliance
    • Data publication and citation (DOI)
    • Business intelligence
        ◦ web service monitoring via NAGIOS

    More information is available in the presentation Sextant Product Catalogue.



    EMODnet – INSPIRE

    The Secretariat gave an update on the joint EMODnet–INSPIRE meeting held last June at the JRC:

    • INSPIRE will develop a marine pilot.
    • A contact point should be appointed by each lot.

    The Secretariat suggested that the Technical Working Group be used to disseminate information within the lots.

    The Secretariat suggested holding a joint Technical WG–INSPIRE meeting in November, but Simon Claus noted that more information is needed ahead of the meeting. A meeting will be set after the INSPIRE documentation has circulated among the lots.


    End of the meeting.

    The next Technical WG meeting will be set for June 2016.



    Annex 1 - Agenda


    09:00h   Welcome participants

    09:00h – 10:30h               EMODnet Central Portal Webservices – Recommendations

    10:30h – 11:30h               New Use Cases combining data products                                                           

    12:00h – 12:30h               Single Sign On - Discussion led by Antonio Novellino, ETT

    12:30h – 13:00h               Data Catalogue – Discussion led by Thomas Loubrieu, Ifremer
