Talk:Workflow

Combined version - Discussion closed, was added to Data submission
The workflow for a data publication from source to publication is similar to the submission > review > editorial > publication flow established in scientific literature. The editorial process is coordinated by the editor-in-chief and the data editors. The workflow and communication of each data submission is documented through a Ticket System.

The workflow is primarily an interaction between the (corresponding) author and the editor and consists of 6 steps:
 * 1) Data submission - the author submits data sets with description (metadata) through the Ticket System, following the submission guidelines and the data policy of their project/institute.
 * 2) Editorial review - the submission is checked by the editor-in-chief for completeness of metadata and validity/format of the data. A request will be sent to the author if mandatory information is missing. Once the submission is complete and the data set is accepted for publication in PANGAEA, the author is informed. Once the publication process can be started, the editor-in-chief assigns the ticket to the editor in charge.
 * 3) Data import - during a technical review, existing metadata is checked and, if necessary, additional metadata is added by the editor. Data are reformatted to fit the [[PANGAEA Data model]]. During this step, if necessary, tables are transposed, combined or divided, columns with metadata (e.g. official [[event labels]]) are added, etc. Through the editorial system, the relations between metadata and data are established. After import, the data set is checked in browser view by the editor. This includes, among other things, a validity check of all external links.
 * 4) Data set proof - the editor sends the data set link to the author(s), asking them to proofread it. The DOI is assigned, but not yet registered ("activated"). The data set status is "in review" at this stage.
 * 5) Corrections - Through an iterative process between author and editor the data set is edited until the final approval by the author.
 * 6) Publication - the data set status is set to "published"; the DOI is activated 4 weeks after the final editing and is then part of the official data set citation. On request of the author, a password may be set for a moratorium period or until publication of the related paper. A temporary access link with an expiry date can also be granted on request of the author. Such a link can be used to share the data with individuals or groups, for example co-authors or anonymous reviewers.
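The six steps above imply a simple dataset lifecycle: a DOI is assigned at the proof stage but only registered ("activated") 4 weeks after the final edit, and the status moves from "in review" to "published". A minimal Python sketch of that lifecycle follows; the class and field names are hypothetical illustrations, not part of the actual PANGAEA editorial system.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Hypothetical sketch of the dataset lifecycle described above.
DOI_ACTIVATION_DELAY = timedelta(weeks=4)  # DOI registered 4 weeks after final edit

@dataclass
class DataSet:
    doi: str                        # assigned at proof stage, not yet registered
    status: str = "in review"       # "in review" -> "published"
    final_edit: Optional[date] = None
    password: Optional[str] = None  # optional moratorium protection

    def publish(self, on: date) -> None:
        """Set status to published; the DOI activates after the delay."""
        self.status = "published"
        self.final_edit = on

    def doi_active(self, today: date) -> bool:
        """Once activated, the DOI is part of the official citation."""
        return (self.final_edit is not None
                and today >= self.final_edit + DOI_ACTIVATION_DELAY)

ds = DataSet(doi="10.1594/PANGAEA.XXXXXX")  # placeholder DOI
ds.publish(on=date(2024, 1, 1))
print(ds.doi_active(date(2024, 1, 15)))  # False: within the 4-week delay
print(ds.doi_active(date(2024, 2, 1)))   # True: delay has elapsed
```

This is only meant to make the timing of DOI assignment versus activation explicit; the real process is handled by the editorial system.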

From Workflow
This chapter describes the workflow for a data publication from source to publication, which is similar to the submission > review > editorial > publication flow established in scientific literature. The editorial process is coordinated by the editor-in-chief and the data editors. The workflow and communication of each data submission is documented through a Ticket System.

The workflow is primarily an interaction between the (corresponding) author and the editor and consists of 6 steps:
 * 1) Data submission - the author submits datasets with description (metadata) through the Ticket System, following the submission guidelines and the data policy of their project/institute.
 * 2) Editorial review - the submission is checked by the editor-in-chief for completeness of metadata and validity/format of the data. A request will be sent to the author if mandatory information is missing. Once the submission is complete and the dataset is accepted for publication in PANGAEA, the author is informed. Once the publication process can be started, the editor-in-chief assigns the ticket to the editor in charge.
 * 3) Data import - during a technical review, metadata is checked via the editorial system and additional metadata is added by the editor. Data are converted to the import format. Within the editorial system, the relations between metadata and data are established. After import, the validity of the dataset(s) is checked in browser view by the editor. This includes a validity check of all external links, which is partly automated.
 * 4) Dataset proof - the editor sends the dataset link to the author(s), asking them to proofread it. The DOI is assigned, but not yet registered ("activated"). The dataset status is "in review" at this stage.
 * 5) Corrections - Through an iterative process between author and editor the dataset is edited until the final approval by the author.
 * 6) Publication - the dataset status is set to "published"; the DOI is activated 4 weeks after the final editing and is then part of the official dataset citation. On request of the author, a password may be set for a moratorium period or until publication of the related paper. A temporary access link with an expiry date can also be granted on request of the author. Such a link can be used to share the data with individuals or groups, for example co-authors or anonymous reviewers.

Data publication workflow - RESPONSIBILITIES
For documentation of the data publishing process, the workflow and all related communication should go through the Ticket System only.

 * 1) Submission of data + metadata via the Ticket System - AUTHOR
 * 2) Assignment to editor - EDITOR-IN-CHIEF
 * 3) Quality control for consistency, completeness, standards - EDITOR
 * 4) Parameter definitions (on request of editor; if applicable) - EDITOR-IN-CHIEF
 * 5) Adjustment of formats, storage of files in related systems (ePIC, hs), import of tables with editorial system - EDITOR
 * 6) Password protection on request of the author - EDITOR
 * 7) Inform author about availability of data by sending the DOI and ask for proofreading - EDITOR
 * 8) Proofread, reply with approval or corrections - AUTHOR
 * 9) Iteration of format/content of data publication until approval of author - EDITOR
 * 10) Add dataset DOI to publication - AUTHOR
 * 11) After final publication inform editor by sending the paper-DOI and ask for removal of protection - AUTHOR
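The responsibilities list above pairs each workflow step with a role. As a small illustration (not part of any PANGAEA tooling), the pairing can be expressed as a lookup table from which per-role checklists are derived; the step descriptions are taken from the list, everything else is a hypothetical sketch.

```python
# Illustrative mapping of workflow steps to the responsible role,
# following the responsibilities list above.
RESPONSIBILITIES = {
    1: ("Submission of data + metadata via the Ticket System", "AUTHOR"),
    2: ("Assignment to editor", "EDITOR-IN-CHIEF"),
    3: ("Quality control for consistency, completeness, standards", "EDITOR"),
    4: ("Parameter definitions (on request of editor)", "EDITOR-IN-CHIEF"),
    5: ("Adjustment of formats, storage of files, import of tables", "EDITOR"),
    6: ("Password protection on request of the author", "EDITOR"),
    7: ("Inform author, send DOI, ask for proofreading", "EDITOR"),
    8: ("Proofread, reply with approval or corrections", "AUTHOR"),
    9: ("Iteration of format/content until approval of author", "EDITOR"),
    10: ("Add dataset DOI to publication", "AUTHOR"),
    11: ("Send paper DOI, ask for removal of protection", "AUTHOR"),
}

def steps_for(role: str) -> list:
    """Return the step numbers a given role is responsible for."""
    return [n for n, (_, r) in sorted(RESPONSIBILITIES.items()) if r == role]

print(steps_for("AUTHOR"))           # -> [1, 8, 10, 11]
print(steps_for("EDITOR-IN-CHIEF"))  # -> [2, 4]
```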

Supervision of workflow - EDITOR-IN-CHIEF