doc/sphinx-guides/source/user/appendix.rst (+14)
@@ -8,13 +8,18 @@ Additional documentation complementary to the User Guide.
 .. contents:: |toctitle|
    :local:
 
+.. _metadata-references:
+
 Metadata References
 ======================
 
 The Dataverse Project is committed to using standard-compliant metadata to ensure that a Dataverse installation's
 metadata can be mapped easily to standard metadata schemas and be exported into JSON
 format (XML for tabular file metadata) for preservation and interoperability.
 
+Supported Metadata
+~~~~~~~~~~~~~~~~~~
+
 Detailed below are what metadata schemas we support for Citation and Domain Specific Metadata in the Dataverse Project:
 
 - `Citation Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=0>`__: compliant with `DDI Lite <http://www.ddialliance.org/specification/ddi2.1/lite/index.html>`_, `DDI 2.5 Codebook <http://www.ddialliance.org/>`__, `DataCite 3.1 <http://schema.datacite.org/meta/kernel-3.1/doc/DataCite-MetadataKernel_v3.1.pdf>`__, and Dublin Core's `DCMI Metadata Terms <http://dublincore.org/documents/dcmi-terms/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/citation.tsv>`__). Language field uses `ISO 639-1 <https://www.loc.gov/standards/iso639-2/php/English_list.php>`__ controlled vocabulary.
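As a rough illustration of the export behavior described above, the sketch below pulls a dataset's metadata through the Dataverse native API in both JSON and DDI XML form. The base URL and persistent identifier are placeholders, and the exporter names are assumptions to be confirmed against the API Guide::

    import requests

    BASE_URL = "https://demo.dataverse.org"   # placeholder installation URL
    PID = "doi:10.5072/FK2/EXAMPLE"           # placeholder persistent identifier

    # JSON export of the dataset metadata (exporter name assumed to be "dataverse_json").
    json_resp = requests.get(
        f"{BASE_URL}/api/datasets/export",
        params={"exporter": "dataverse_json", "persistentId": PID},
    )
    json_resp.raise_for_status()
    print(json_resp.json().keys())

    # XML export of the same dataset (exporter name assumed to be "ddi").
    ddi_resp = requests.get(
        f"{BASE_URL}/api/datasets/export",
        params={"exporter": "ddi", "persistentId": PID},
    )
    ddi_resp.raise_for_status()
    print(ddi_resp.text[:200])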
@@ -26,6 +31,15 @@ Detailed below are what metadata schemas we support for Citation and Domain Spec
   `Virtual Observatory (VO) Discovery and Provenance Metadata <http://perma.cc/H5ZJ-4KKY>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/astrophysics.tsv>`__).
 - `Life Sciences Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=2>`__: based on `ISA-Tab Specification <https://isa-specs.readthedocs.io/en/latest/isamodel.html>`__, along with controlled vocabulary from subsets of the `OBI Ontology <http://bioportal.bioontology.org/ontologies/OBI>`__ and the `NCBI Taxonomy for Organisms <http://www.ncbi.nlm.nih.gov/Taxonomy/taxonomyhome.html/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/biomedical.tsv>`__).
 - `Journal Metadata <https://docs.google.com/spreadsheets/d/13HP-jI_cwLDHBetn9UKTREPJ_F4iHdAvhjmlvmYdSSw/edit#gid=8>`__: based on the `Journal Archiving and Interchange Tag Set, version 1.2 <https://jats.nlm.nih.gov/archiving/tag-library/1.2/chapter/how-to-read.html>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/journals.tsv>`__).
+
+Experimental Metadata
+~~~~~~~~~~~~~~~~~~~~~
+
+Unlike supported metadata, experimental metadata is not enabled by default in a new Dataverse installation. Feedback via any `channel <https://dataverse.org/contact>`_ is welcome!
+
 - `Computational Workflow Metadata <https://docs.google.com/spreadsheets/d/13HP-jI_cwLDHBetn9UKTREPJ_F4iHdAvhjmlvmYdSSw/edit#gid=447508596>`__: adapted from `Bioschemas Computational Workflow Profile, version 1.0 <https://bioschemas.org/profiles/ComputationalWorkflow/1.0-RELEASE>`__ and `Codemeta <https://codemeta.github.io/terms/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/computationalworkflow.tsv>`__).
 
+See Also
+~~~~~~~~
+
 See also the `Dataverse Software 4.0 Metadata Crosswalk: DDI, DataCite, DC, DCTerms, VO, ISA-Tab <https://docs.google.com/spreadsheets/d/10Luzti7svVTVKTA-px27oq3RxCUM-QbiTkm8iMd5C54/edit?usp=sharing>`__ document and the :doc:`/admin/metadatacustomization` section of the Admin Guide.
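Because experimental blocks such as Computational Workflow Metadata are not enabled by default, an administrator loads the block's .tsv definition before its fields appear. A minimal sketch, assuming the admin API endpoint described in the :doc:`/admin/metadatacustomization` section (the endpoint path, host, and any follow-up steps such as updating Solr fields should be confirmed there)::

    import requests

    TSV_PATH = "scripts/api/data/metadatablocks/computationalworkflow.tsv"

    # Post the TSV to the metadata block loading endpoint (path assumed from the Admin Guide).
    with open(TSV_PATH, "rb") as tsv:
        resp = requests.post(
            "http://localhost:8080/api/admin/datasetfield/load",
            data=tsv,
            headers={"Content-type": "text/tab-separated-values"},
        )
    resp.raise_for_status()
    print(resp.json())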
doc/sphinx-guides/source/user/dataset-management.rst (+18-13)
@@ -162,13 +162,13 @@ BagIt Support
 
 BagIt is a set of hierarchical file system conventions designed to support disk-based storage and network transfer of arbitrary digital content. It offers several benefits such as integration with digital libraries, easy implementation, and transfer validation. See `the Wikipedia article <https://en.wikipedia.org/wiki/BagIt>`__ for more information.
 
-If the repository you are using has enabled BagIt file handling, when uploading BagIt files the repository will validate the checksum values listed in each BagIt’s manifest file against the uploaded files and generate errors about any mismatches. The repository will identify a certain number of errors, such as the first five errors in each BagIt file, before reporting the errors.
+If the Dataverse installation you are using has enabled BagIt file handling, when uploading BagIt files the repository will validate the checksum values listed in each BagIt’s manifest file against the uploaded files and generate errors about any mismatches. The repository will identify a certain number of errors, such as the first five errors in each BagIt file, before reporting the errors.
 
 |bagit-image1|
 
 You can fix the errors and reupload the BagIt files.
 
-For information on how to enable and configure the BagIt file handler see the :ref:`installation guide<BagIt File Handler>`
+More information on how your admin can enable and configure the BagIt file handler can be found in the :ref:`Installation Guide<BagIt File Handler>`.
 
 .. _file-handling:
 
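For depositors who want to catch mismatches before uploading, the manifest check described above can be approximated locally. A minimal sketch, assuming an already-extracted bag whose manifest-sha256.txt lists one checksum and file path per line; the five-error cutoff simply mirrors the behavior described above::

    import hashlib
    from pathlib import Path

    def validate_bag(bag_dir, max_errors=5):
        """Compare files in an extracted BagIt bag against its SHA-256 manifest."""
        bag = Path(bag_dir)
        errors = []
        for line in (bag / "manifest-sha256.txt").read_text().splitlines():
            if not line.strip():
                continue
            expected, rel_path = line.split(maxsplit=1)
            actual = hashlib.sha256((bag / rel_path).read_bytes()).hexdigest()
            if actual != expected:
                errors.append(f"{rel_path}: expected {expected}, got {actual}")
            if len(errors) >= max_errors:
                break  # stop early, as the repository does after the first few mismatches
        return errors

    # Example: report up to five mismatches for a bag unpacked at ./mybag
    for problem in validate_bag("mybag"):
        print(problem)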
@@ -238,10 +238,11 @@ Computational workflows precisely describe a multi-step process to coordinate mu
 
 |cw-image1|
 
+
 FAIR Computational Workflow
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-FAIR principles (Findable, Accessible, Interoperable, Reusable) also apply to computational workflows. The FAIR Principles (https://doi.org/10.1162/dint_a_00033) apply to workflows in two areas as FAIR data and FAIR criteria for workflows as digital objects. In the FAIR data area, "*properly designed workflows contribute to FAIR data principles since they provide the metadata and provenance necessary to describe their data products, and they describe the involved data in a formalized, completely traceable way*" (https://doi.org/10.1162/dint_a_00033). Regarding the FAIR criteria for workflows as digital objects, "*workflows are research products in their own right, encapsulating methodological know-how that is to be found and published, accessed and cited, exchanged and combined with others, and reused as well as adapted*" (https://doi.org/10.1162/dint_a_00033).
+The FAIR Principles (Findable, Accessible, Interoperable, Reusable) apply to computational workflows (https://doi.org/10.1162/dint_a_00033) in two areas: as FAIR data and as FAIR criteria for workflows as digital objects. In the FAIR data area, "*properly designed workflows contribute to FAIR data principles since they provide the metadata and provenance necessary to describe their data products, and they describe the involved data in a formalized, completely traceable way*" (https://doi.org/10.1162/dint_a_00033). Regarding the FAIR criteria for workflows as digital objects, "*workflows are research products in their own right, encapsulating methodological know-how that is to be found and published, accessed and cited, exchanged and combined with others, and reused as well as adapted*" (https://doi.org/10.1162/dint_a_00033).
 
 How to Create a Computational Workflow
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -257,31 +258,35 @@ You are encouraged to review these examples when creating a computational workfl
 
 At https://workflows.community, the Workflows Community Initiative offers resources for computational workflows, such as a list of workflow systems (https://workflows.community/systems) and other workflow registries (https://workflows.community/registries). The initiative also helps organize working groups related to workflows research, development and application.
 
-How to Upload your Computational Workflow
+How to Upload Your Computational Workflow
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-When you :ref:`add a new dataset <adding-new-dataset>`, the Dataverse repository you are using may provide additional support for describing computational workflows, including Computational Workflow Metadata fields for describing your workflow and a "Workflow" tag you can apply to your workflow files.
+After you :ref:`upload your files <dataset-file-upload>`, you can apply a "Workflow" tag to your workflow files, such as your Snakemake or R Notebooks files, so that you and others can find them more easily among your deposit’s other files.
+
+|cw-image3|
+
+|cw-image4|
+
+How to Describe Your Computational Workflow
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The Dataverse installation you are using may have enabled Computational Workflow metadata fields for your use. If so, when :ref:`editing your dataset metadata <adding-new-dataset>`, you will see the fields described below.
 
 |cw-image2|
 
-The three fields are adapted from `Bioschemas Computational Workflow Profile, version 1.0 <https://bioschemas.org/profiles/ComputationalWorkflow/1.0-RELEASE>`__ and `Codemeta <https://codemeta.github.io/terms/>`__:
+As described in the :ref:`metadata-references` section of the :doc:`/user/appendix`, the three fields are adapted from `Bioschemas Computational Workflow Profile, version 1.0 <https://bioschemas.org/profiles/ComputationalWorkflow/1.0-RELEASE>`__ and `Codemeta <https://codemeta.github.io/terms/>`__:
 
 - **Workflow Type**: The kind of Computational Workflow, which is designed to compose and execute a series of computational or data manipulation steps in a scientific application
-- **External Code Repository URL**: A link to another public repository where the un-compiled, human-readable code and related code is also located (e.g., SVN, GitHub, GitLab, CodePlex)
+- **External Code Repository URL**: A link to another public repository where the un-compiled, human-readable code and related code is also located (e.g., GitHub, GitLab, SVN)
 - **Documentation**: A link (URL) to the documentation or text describing the Computational Workflow and its use
 
-After you :ref:`upload your files <dataset-file-upload>`, you can apply a "Workflow" tag to your workflow files, such as your Snakemake or R Notebooks files, so that you and others can find them more easily among your deposit’s other files.
-
-|cw-image3|
-
-|cw-image4|
 
 How to Search for Computational Workflows
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 If the search page of the Dataverse repository you are using includes a "Dataset Feature" facet with a Computational Workflows link, you can follow that link to find only datasets that contain computational workflows.
 
-You can also use the "Workflow Type" facet, if the Dataverse repository uses it, to find datasets that contain certain types of computational workflows, such as workflows written in Common Workflow Language files or Jupyter Notebooks.
+You can also search on the "Workflow Type" facet, if the Dataverse installation has the field enabled, to find datasets that contain certain types of computational workflows, such as workflows written in Common Workflow Language files or Jupyter Notebooks.
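For orientation, the three fields above correspond to the workflowType, workflowCodeRepository, and workflowDocumentation entries in the computationalworkflow block shown in the .tsv below. The fragment in this sketch is illustrative only: the typeName values and the fact that each field allows multiple values come from the .tsv, but the surrounding layout (typeClass and the example values) is an assumption about how the block might appear in a dataset's native JSON metadata::

    # Illustrative only: typeName values are from computationalworkflow.tsv; the
    # envelope (typeClass) and the example values are assumptions.
    computationalworkflow_block = {
        "computationalworkflow": {
            "fields": [
                {
                    "typeName": "workflowType",            # Workflow Type (controlled vocabulary)
                    "multiple": True,
                    "typeClass": "controlledVocabulary",
                    "value": ["Common Workflow Language (CWL)"],
                },
                {
                    "typeName": "workflowCodeRepository",  # External Code Repository URL
                    "multiple": True,
                    "typeClass": "primitive",
                    "value": ["https://github.com/example/workflow"],
                },
                {
                    "typeName": "workflowDocumentation",   # Documentation
                    "multiple": True,
                    "typeClass": "primitive",
                    "value": ["https://example.org/workflow-docs"],
                },
            ],
        }
    }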
scripts/api/data/metadatablocks/computationalworkflow.tsv (+1-1)

 #datasetField	name	title	description	watermark	fieldType	displayOrder	displayFormat	advancedSearchField	allowControlledVocabulary	allowmultiples	facetable	displayoncreate	required	parent	metadatablock_id	termURI
 	workflowType	Computational Workflow Type	The kind of Computational Workflow, which is designed to compose and execute a series of computational or data manipulation steps in a scientific application		text	0		TRUE	TRUE	TRUE	TRUE	TRUE	FALSE		computationalworkflow	
-	workflowCodeRepository	External Code Repository URL	A link to the repository where the un-compiled, human readable code and related code is located (e.g. SVN, GitHub, CodePlex, institutional GitLab instance)	https://...	url	1		FALSE	FALSE	TRUE	FALSE	TRUE	FALSE		computationalworkflow	
+	workflowCodeRepository	External Code Repository URL	A link to the repository where the un-compiled, human readable code and related code is located (e.g. GitHub, GitLab, SVN)	https://...	url	1		FALSE	FALSE	TRUE	FALSE	TRUE	FALSE		computationalworkflow	
 	workflowDocumentation	Documentation	A link (URL) to the documentation or text describing the Computational Workflow and its use		textbox	2		FALSE	FALSE	TRUE	FALSE	TRUE	FALSE		computationalworkflow	
 #controlledVocabulary	DatasetField	Value	identifier	displayOrder
 	workflowType	Common Workflow Language (CWL)	workflowtype_cwl	1
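Because the flags in these rows are easy to misread, the sketch below pairs the #datasetField header with the workflowType row (values copied from this diff) so each column name lines up with its value::

    import csv
    import io

    # Header and one data row from the block above; empty strings stand for
    # columns the row leaves blank (watermark, displayFormat, parent, termURI).
    tsv_snippet = (
        "#datasetField\tname\ttitle\tdescription\twatermark\tfieldType\tdisplayOrder\t"
        "displayFormat\tadvancedSearchField\tallowControlledVocabulary\tallowmultiples\t"
        "facetable\tdisplayoncreate\trequired\tparent\tmetadatablock_id\ttermURI\n"
        "\tworkflowType\tComputational Workflow Type\tThe kind of Computational Workflow, "
        "which is designed to compose and execute a series of computational or data "
        "manipulation steps in a scientific application\t\ttext\t0\t\tTRUE\tTRUE\tTRUE\t"
        "TRUE\tTRUE\tFALSE\t\tcomputationalworkflow\t\n"
    )

    reader = csv.reader(io.StringIO(tsv_snippet), delimiter="\t")
    header = next(reader)
    row = next(reader)
    for column, value in zip(header, row):
        print(f"{column}: {value!r}")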
src/main/java/propertyFiles/computationalworkflow.properties (+1-1)
@@ -5,7 +5,7 @@ datasetfieldtype.workflowType.title=Workflow Type
 datasetfieldtype.workflowType.description=The kind of Computational Workflow, which is designed to compose and execute a series of computational or data manipulation steps in a scientific application
-datasetfieldtype.workflowCodeRepository.description=A link to another public repository where the un-compiled, human-readable code and related code is also located (e.g., SVN, GitHub, GitLab, CodePlex)
+datasetfieldtype.workflowCodeRepository.description=A link to another public repository where the un-compiled, human-readable code and related code is also located (e.g., GitHub, GitLab, SVN)