This chapter describes the workflow for a data publication, from submission to publication, which is similar to the submission > review > editorial > publication flow established in scientific literature. The editorial process is coordinated by the editor-in-chief in close cooperation with the data librarian, the data curators of projects/institutes, the Pangaea editors, and data scientists. Since 2011, the workflow and communication of each data submission have been documented through a Ticket System.
The workflow is primarily an interaction between the (corresponding) author and the curator/editor and consists of six steps:
- Data submission - the author submits datasets with a description (metadata) through the Ticket System, following the submission guidelines and the policy of his/her project/institute.
- Editorial review - the submission is checked by the editor-in-chief for completeness of metadata and validity/format of the data. A request is sent to the author if mandatory information is missing. If the submission is complete, the editor-in-chief assigns the ticket to the curator/editor in charge.
- Data import - as a technical review, the metadata are checked via the editorial system (4D) and new information is added by the curator/editor. The data are converted to the import format. With the editorial system, the relations between metadata and data are set. After import, the validity of the dataset(s) is checked in browser view by the curator/editor. This includes a validity check of all external links, which is partly automated.
- Dataset proof - the curator sends the DOI link to the author(s) with a request to proofread the dataset.
- Corrections - through an iterative process between author and curator, the dataset is edited until final approval by the author.
- Publication - the dataset status is set to published; the DOI becomes valid 4 weeks after the final edit and is then part of the official dataset citation. On request of the author, a password may be set for a moratorium period or until publication of the related paper. For inclusion of the data in the peer review of the paper, the reviewers can access the data via the DOI link and the author's password.
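The six steps above amount to a small state progression for each ticket. As a minimal sketch, the states and allowed transitions could be modelled as follows; all names here are illustrative and not part of the actual Pangaea editorial system:

```python
from enum import Enum, auto

class SubmissionStatus(Enum):
    """Illustrative states of a data submission ticket."""
    SUBMITTED = auto()            # author submits data + metadata
    IN_EDITORIAL_REVIEW = auto()  # editor-in-chief checks completeness
    IMPORTED = auto()             # curator/editor imports and validates
    IN_PROOF = auto()             # DOI link sent to author for proofreading
    IN_CORRECTION = auto()        # iterative edits between author and curator
    PUBLISHED = auto()            # DOI becomes part of the citation

# Allowed transitions; an incomplete submission goes back to the author,
# and corrections loop back to proofing until final approval.
TRANSITIONS = {
    SubmissionStatus.SUBMITTED: {SubmissionStatus.IN_EDITORIAL_REVIEW},
    SubmissionStatus.IN_EDITORIAL_REVIEW: {SubmissionStatus.SUBMITTED,
                                           SubmissionStatus.IMPORTED},
    SubmissionStatus.IMPORTED: {SubmissionStatus.IN_PROOF},
    SubmissionStatus.IN_PROOF: {SubmissionStatus.IN_CORRECTION,
                                SubmissionStatus.PUBLISHED},
    SubmissionStatus.IN_CORRECTION: {SubmissionStatus.IN_PROOF},
    SubmissionStatus.PUBLISHED: set(),
}

def advance(current, target):
    """Move a ticket to `target` if that transition is allowed."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition: {current.name} -> {target.name}")
    return target
```

The correction loop (proof -> correction -> proof) captures the iterative editing described above, and only the proof state can lead to publication.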
Author(s), projects, or institutes have the choice to
- archive a supplement related to a publication,
- make results available to the scientific community, e.g. within a research project, or
- publish a data report.
In all cases, the following requirements apply:
- the granularity of the data submission is defined,
- parameters are defined with units, as close to international standards as possible,
- metadata are on hand, and
- any information necessary to understand the generation and content of the dataset is available.
Data supplements with status supplementary data can be linked automatically to the publication's splash page in the journal's/publisher's catalog through a web service (e.g. doi:10.1016/0025-3227(89)90086-8).
The resources required to establish this workflow need to become a mandatory part of each research project. The amount is estimated in the business model.