Intern:MOSAiC raw data

Page content (please note)


This page and the information provided here are addressed to PANGAEA curators and PANGAEA employees only.

MOSAiC project background
Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC)



External links:
 * Project website
 * MOSAiC data services
 * MOSAiC data policy at O2A wiki
 * MOSAiC data publication guide at O2A wiki: one page summary
 * detailed MOSAiC data publication guide at O2A wiki

Authors
The authorship of datasets should follow the principles of good scientific practice (Guidelines for Safeguarding Good Research Practice of the German Research Foundation).

According to the MOSAiC data policy, co-authorship (on any kind of publication) must generally be offered to those who have made a substantial contribution to a) the intellectual conception or design of research; b) the acquisition, analysis, or interpretation of the data (i.e., including the data provider or data PI); or c) the drafting or significant revision of the work.

For data set authorship, there are additional considerations:
 * Data authors can, but do not have to, be the same as paper authors
 * Acknowledge the contributions of scientists, technicians, and students who generated the data but did not contribute to the interpretation or manuscript writing
 * Authors of datasets are those who contributed to the collection and processing of the data
 * Follow general rules of good scientific practice

Standardized Title
To ensure the reusability (R) and interoperability (I) of data sets, the metadata of similar data sets, mostly gathered during specific research missions on research vessels or other research platforms, should follow a common template and use a common vocabulary. This also includes the title.

Therefore, the title of raw data entrusted to PANGAEA shall follow a template (“construction kit”) that incorporates the key elements of each data set title, such as data type, data level, device, platform type, platform, and cruise (& area).

PANGAEA will follow this template in order to make the data as FAIR as possible.

Example: Multibeam bathymetry raw data (Kongsberg EM122 working area dataset) of RV METEOR during cruise M127, Fictional Bank canyon, North Atlantic Ocean

The “construction kit” is shown below:

Construction kit:

Further information for the entries can be found here:
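As an illustration, the key elements of the construction kit can be assembled programmatically. The following Python sketch is not part of any official PANGAEA tooling; the function and its parameter names are hypothetical, and it simply reproduces the example title given above.

```python
# Hypothetical sketch: assemble a dataset title from the key elements of the
# "construction kit" (data type, data level, device, dataset type, platform,
# cruise, area). Field names are illustrative, not an official schema.

def build_title(data_type, data_level, device, dataset_type,
                platform, cruise, area):
    return (f"{data_type} {data_level} data "
            f"({device} {dataset_type}) "
            f"of {platform} during cruise {cruise}, {area}")

title = build_title(
    data_type="Multibeam bathymetry",
    data_level="raw",
    device="Kongsberg EM122",
    dataset_type="working area dataset",
    platform="RV METEOR",
    cruise="M127",
    area="Fictional Bank canyon, North Atlantic Ocean",
)
```

Keeping the elements as separate fields, rather than free text, makes it easier to check that every title of a similar data set follows the same template.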

Standardized Abstract
The abstract should include all necessary information, as described here: https://wiki.pangaea.de/wiki/Abstract

Instructions for the user can be found here: Intern:MOSAiC_raw_data

Raw data (example)
''Swath sonar bathymetry data using a Kongsberg EM 122 multibeam echosounder were recorded during RV METEOR cruise M127 (WHAT, HOW). The cruise took place between 25.05.2016 and 28.06.2016 in the North Atlantic Ocean (WHEN). The survey data contain MBES measurements of the Fictional Bank canyon. No bathymetric measurements of Fictional Bank canyon were found in international databases prior to the cruise (WHY).

''To enhance MBES data accuracy, two sound velocity profile casts were conducted in the vicinity of the working area prior to the survey, using an AML Oceanographic sound velocity profiler (WHAT, WHEN & WHY). After processing, these data were directly imported into the MBES acquisition software Kongsberg SIS (Seafloor Information System).

(more information will be provided as soon as possible)

Standardized usage of Reference
For data from German research vessels, the cruise report should be listed under References as "Related to".

Datasets of processed data and raw data from one entire cruise should be linked to each other as "Related to". Currently, all bathymetry datasets already archived in PANGAEA are being reviewed and checked.

Standardized Coverage
Automatic entry in PANGAEA, calculated from Events or the PANGAEA Geocode (see below)

https://wiki.pangaea.de/wiki/Metaheader

Please note that Minimum Elevation and Maximum Elevation do not represent the min/max depth values from the MBES data contained in this dataset.

Note: Currently, the Coverage is visible on the PANGAEA website as points on a Google Maps image in the background (upper right corner). It does not show the actual coverage of the surveyed area; the points represent the Events.

Underway event
Each German campaign (since ~2016) should include so-called "Underway" Events that are created on board. An "Underway" Event is named like the following example: CruiseID_0_Underway-sequential number, e.g. M127_0_Underway-1 (this schema was established during the MANIDA project). If this Event was not created on board the research vessel, please check the station list for the next free number and create an "Underway" Event. Please do not use the track Event anymore!

This Event has to include the Device used to record the data; the latitude/longitude and date/time for the start and end of file recording need to be updated to reflect the first and last data records acquired by the MBES device during the cruise. This information can be derived from the parameters table that is uploaded together with the data file links (see Intern:Bathymetry).

If available, an additional link to the Sensor description hosted on https://sensor.awi.de should be included.
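The "Underway" Event naming schema described above can be sketched as a small helper. This is an illustrative sketch only: the function names and the character set assumed for the cruise ID are assumptions, not part of the D-Ship or PANGAEA systems.

```python
import re

# Hypothetical sketch: build and validate "Underway" Event labels following
# the schema CruiseID_0_Underway-<sequential number>, e.g. M127_0_Underway-1.
# The characters allowed in the cruise ID here are an assumption.
UNDERWAY_PATTERN = re.compile(r"^(?P<cruise>[A-Za-z0-9/]+)_0_Underway-(?P<seq>\d+)$")

def underway_label(cruise_id: str, seq: int) -> str:
    """Compose an Underway Event label from cruise ID and sequential number."""
    return f"{cruise_id}_0_Underway-{seq}"

def is_valid_underway_label(label: str) -> bool:
    """Check whether a label follows the Underway schema."""
    return UNDERWAY_PATTERN.match(label) is not None
```

For example, `underway_label("M127", 1)` yields `M127_0_Underway-1`, while an old-style track Event label would be rejected by the validator.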

SVP event(s)
If SVP casts (sound velocity profile measurements using an external device, e.g. an AML Oceanographic Plus X sound velocity probe) were made during the cruise, the data files which were acquired (and used for import into the acquisition software, e.g. the Kongsberg Seafloor Information System, SIS) during the cruise will be stored within the same raw data set. The corresponding event of the Event list of the cruise (e.g. https://www.pangaea.de/expeditions/events/M160) will be linked to these files. This Event will contain similar information as described above, e.g. coordinates, methods/devices, etc. If SVPs are used for data post-processing (sound velocity correction), these data files can also be submitted to PANGAEA. If possible, these SVPs can also be linked to the corresponding Event.

Standardized usage of Parameter
Please note, that a script must be executed on AWI-Server to get the parameters (the script is currently under development). Please contact Daniel Damaske.

Raw data curation
Standard parameters for bathymetry raw data also include PANGAEA's so-called Geocodes (https://wiki.pangaea.de/wiki/Geocode). These are important standard parameters, which are mandatory to keep the data georeferenced. These metadata for each file will be created automatically during the data curation process.

(more information (and more additional parameters) will be provided as soon as possible)

Standardized usage of Comment
Always include "These data should not be used for navigational purposes" as dataset comment for each bathymetry dataset.

For further information see Bathymetry.

Licence
PANGAEA and DAM encourage data submitters to submit bathymetry data under CC-BY 4.0 license terms, which allow sharing (copying and redistributing the material in any medium or format) and adapting (remixing, transforming, and building upon the material) the data (https://creativecommons.org/licenses/by/4.0/). This also allows PANGAEA / DAM in future to make the data coverage and the trackline visible at https://marine-data.de/ with a direct link to PANGAEA. Within the DAM efforts, coverage and trackline shall also be made available to the community via OGC Web Map Services (WMS). Avoid licenses like CC-BY-NC-ND if possible.

Which CC license should be chosen for the data? Check also https://wiki.pangaea.de/wiki/License



Keywords/Project
If the data is direct DAM data (from a DAM member, a DAM cruise, or from BSH), add "DAM" as project label. If the data is not directly linked to the project but this standard workflow was used, please add "DAMUnderwayResearchData" as a technical keyword.

Export table file name
The filename of the export table (not the data itself) should follow a certain structure, and all letters should be lower case. This structure is: campaign_mbes_datatype_device_datasettype. Example: msm88_1_mbes_raw_em122_ed.


 * campaign: cruise ID in lower case, e.g. msm88_1
 * datatype: raw or processed
 * device: em122 or em712 or ...
 * datasettype: ed (entire dataset) or wad (working area dataset)
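The filename structure above can be composed mechanically; the following sketch is illustrative only (the helper name is an assumption, not an existing PANGAEA script) and simply joins the components in lower case.

```python
# Hypothetical sketch: compose the export table filename from its components,
# following the structure campaign_mbes_datatype_device_datasettype,
# with all letters forced to lower case.

def export_table_name(campaign: str, datatype: str,
                      device: str, datasettype: str) -> str:
    parts = [campaign, "mbes", datatype, device, datasettype]
    return "_".join(p.lower() for p in parts)

name = export_table_name("MSM88_1", "raw", "EM122", "ed")
# matches the example filename given above: msm88_1_mbes_raw_em122_ed
```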

Raw data
https://doi.pangaea.de/10.1594/PANGAEA.918716 (Please note that the Abstract is currently still missing, dataset currently in review, therefore not fully accessible)

Processed data
https://doi.pangaea.de/10.1594/PANGAEA.912259 (Please note, this is a test dataset, dataset currently in review status, therefore not fully accessible)

Standardized usage of Data model extensions (for raw and processed data curation)
The usage of so-called data model extensions, which are further attributes, is currently being developed. This is a new feature for PANGAEA in order to include more standardized metadata. This new feature shall ensure further findability (F) and interoperability (I) of the data sets (more information will be provided as soon as possible).

Standard operating procedure for uploading MOSAiC raw data
Here is a step-by-step workflow for the upload of MOSAiC raw data.

Following steps are currently done (workflow status Nov. 2020):
 * When a data submission comes in via ticket, provide an upload link
 * When all data is uploaded, a sub-task is created and assigned to Daniel Damaske
 * Create a "dummy" dataset with the standardized title and authors and write the dataset ID into the sub-task
 * Check if an "Underway" event exists; if not, create one and write the event into the sub-task
 * Set the moratorium to 2023-01-01
 * xxx will now produce an import file with the standardized parameters.
 * As soon as the import file is provided, the dataset can be uploaded/overwritten with the real data. You might need to add event information, file content, latitude, longitude, and date/time for SVP or other additional files prior to the upload.
 * Add the "MOSAIC" project
 * Add the standardized export table name and the standardized comment

Workflow for adding / changing / deleting Events
This workflow allows adding missing Events (device operation IDs) which, for some reason, weren't or couldn't be registered during the expedition through the D-Ship system. It also makes it possible to correct entries which are wrong or incomplete (e.g. end of operation missing, wrong comment, wrong position, etc.) or to delete entries which were entered wrongly.

The current workflow has the following steps:


 * 1) Submitting a request through PANGAEA, in the form of a data table plus a description (reason for change), by the researcher, ideally the device PI.
 * 2) Alternatively, it could be discovered during publication of a dataset that changes / additions are needed.
 * 3) Check by the PANGAEA editor for plausibility -> either get back with additional questions, contact the PI, or, if everything is plausible, perform and document the change.
 * 4) Associate the request ticket with the expedition station list ticket.
 * 5) Prepare an import file (incl. sensor links) and import the changes into the Events table.
 * 6) When new Events need to be added: for PS legs, use a label with "scientific activity" 99, PS122/LEG_99-#, where LEG = MOSAiC leg and # = a continuous number (check which # was assigned last in the 4D Event table). The numbering is continuous and independent of the leg. Example: the last added label was for leg 1: PS122/1_99-18, so the next label for leg 4 should be PS122/4_99-19.
 * 7) Document the summary of the changes in the expedition station list ticket.
 * 8) Request the sensor.awi colleagues via o2a-support@awi.de to perform the same changes in sensor and to update the folder structure at the MCS accordingly, if needed (e.g. new empty folders). This enables scientists to upload newly generated data to the right location at the MCS with the support of ingest. Provide the sensor.awi colleagues with the upload list.
 * 9) Provide feedback to the researcher when the changes are finished in all systems.
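The continuous, leg-independent numbering rule for PS122/LEG_99-# labels described in step 6 can be sketched as follows. The helper name and the input format (a list of already assigned labels) are assumptions for illustration; in practice the last assigned number is checked in the 4D Event table.

```python
import re

# Hypothetical sketch: derive the next "scientific activity 99" Event label
# PS122/<leg>_99-<n>. The running number <n> is continuous across all legs,
# so it is taken from the highest number already assigned, regardless of leg.

LABEL_RE = re.compile(r"^PS122/(\d+)_99-(\d+)$")

def next_label(existing_labels, leg):
    last = max((int(m.group(2)) for lbl in existing_labels
                if (m := LABEL_RE.match(lbl))), default=0)
    return f"PS122/{leg}_99-{last + 1}"

# The example from step 6: the last added label was PS122/1_99-18 (leg 1),
# so the next label for leg 4 continues the sequence with 19.
label = next_label(["PS122/1_99-18"], leg=4)
```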

For the user guideline see: https://spaces.awi.de/x/lIFrF