Commit 070b3c0

Merge branch 'develop' into 10554-avoid-solr-join-guest #10554

2 parents: 6baef6a + da3dd95

28 files changed: +482 -177 lines
New release note file (+1):

@@ -0,0 +1 @@
+Fixed a bug where the ``incomplete metadata`` label was shown for published datasets with incomplete metadata in certain scenarios. The label is now shown for draft versions of such datasets and for published datasets that the user can edit. It can also be hidden for published datasets (regardless of edit rights) by setting the new option ``dataverse.ui.show-validity-label-when-published`` to ``false``.
New release note file (+3):

@@ -0,0 +1,3 @@
+Changed ``api/dataverses/{id}/metadatablocks`` so that setting the query parameter ``onlyDisplayedOnCreate=true`` also returns metadata blocks whose dataset field type input levels are configured as required on the General Information page of the collection, in addition to the metadata blocks and fields with the property ``displayOnCreate=true`` (the original behavior).
+
+A new endpoint, ``api/dataverses/{id}/inputLevels``, has been created for updating the dataset field type input levels of a collection via API.
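For orientation, a minimal sketch of the changed call; ``$SERVER_URL``, ``$API_TOKEN``, and the ``root`` alias are placeholders, not part of the commit:

```bash
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org

# Lists the blocks/fields flagged displayOnCreate=true, and now also any block
# whose fields the collection's input levels mark as required
curl -H "X-Dataverse-key: $API_TOKEN" "$SERVER_URL/api/dataverses/root/metadatablocks?onlyDisplayedOnCreate=true"
```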

doc/release-notes/6.2-release-notes.md (+8 -4)
@@ -417,12 +417,16 @@ In the following commands we assume that Payara 6 is installed in `/usr/local/pa

 As noted above, deployment of the war file might take several minutes due to a database migration script required for the new storage quotas feature.

-6\. Restart Payara
+6\. For installations with internationalization:
+
+- Please remember to update translations via [Dataverse language packs](https://github.com/GlobalDataverseCommunityConsortium/dataverse-language-packs).
+
+7\. Restart Payara

 - `service payara stop`
 - `service payara start`

-7\. Update the following Metadata Blocks to reflect the incremental improvements made to the handling of core metadata fields:
+8\. Update the following Metadata Blocks to reflect the incremental improvements made to the handling of core metadata fields:

 ```
 wget https://github.com/IQSS/dataverse/releases/download/v6.2/geospatial.tsv
@@ -442,7 +446,7 @@ wget https://github.com/IQSS/dataverse/releases/download/v6.2/biomedical.tsv
 curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/tab-separated-values" -X POST --upload-file scripts/api/data/metadatablocks/biomedical.tsv
 ```

-8\. For installations with custom or experimental metadata blocks:
+9\. For installations with custom or experimental metadata blocks:

 - Stop Solr instance (usually `service solr stop`, depending on Solr installation/OS; see the [Installation Guide](https://guides.dataverse.org/en/6.2/installation/prerequisites.html#solr-init-script))

@@ -455,7 +459,7 @@ curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/ta

 - Restart Solr instance (usually `service solr restart`, depending on Solr/OS)

-9\. Reindex Solr:
+10\. Reindex Solr:

 For details, see https://guides.dataverse.org/en/6.2/admin/solr-search-index.html but here is the reindex command:
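For convenience, the reindex command that guide documents (reproduced here as a sketch; assumes a standard install listening on localhost:8080):

```bash
curl http://localhost:8080/api/admin/index
```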

doc/sphinx-guides/source/api/native-api.rst (+40 -1)
@@ -898,7 +898,46 @@ The following attributes are supported:

 * ``filePIDsEnabled`` ("true" or "false") Restricted to use by superusers and only when the :ref:`:AllowEnablingFilePIDsPerCollection <:AllowEnablingFilePIDsPerCollection>` setting is true. Enables or disables registration of file-level PIDs in datasets within the collection (overriding the instance-wide setting).

 .. _collection-storage-quotas:
+
+Update Collection Input Levels
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Updates the dataset field type input levels of a collection.
+
+Please note that this endpoint overwrites all of the collection's input levels, so if you want to keep the existing ones, you will need to include them in the JSON request body.
+
+If one of the input levels corresponds to a dataset field type belonging to a metadata block that is not yet assigned to the collection, that metadata block will be added to the collection.
+
+This endpoint expects a JSON array in the following format::
+
+  [
+    {
+      "datasetFieldTypeName": "datasetFieldTypeName1",
+      "required": true,
+      "include": true
+    },
+    {
+      "datasetFieldTypeName": "datasetFieldTypeName2",
+      "required": true,
+      "include": true
+    }
+  ]
+
+.. code-block:: bash
+
+  export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+  export SERVER_URL=https://demo.dataverse.org
+  export ID=root
+  export JSON='[{"datasetFieldTypeName":"geographicCoverage", "required":true, "include":true}, {"datasetFieldTypeName":"country", "required":true, "include":true}]'
+
+  curl -X PUT -H "X-Dataverse-key: $API_TOKEN" -H "Content-Type:application/json" "$SERVER_URL/api/dataverses/$ID/inputLevels" -d "$JSON"
+
+The fully expanded example above (without environment variables) looks like this:
+
+.. code-block:: bash
+
+  curl -X PUT -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -H "Content-Type:application/json" "https://demo.dataverse.org/api/dataverses/root/inputLevels" -d '[{"datasetFieldTypeName":"geographicCoverage", "required":true, "include":true}, {"datasetFieldTypeName":"country", "required":true, "include":true}]'
+
 Collection Storage Quotas
 ~~~~~~~~~~~~~~~~~~~~~~~~~

doc/sphinx-guides/source/installation/config.rst (+20 -0)
@@ -2945,6 +2945,24 @@ Defaults to ``false``.

 Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
 ``DATAVERSE_API_ALLOW_INCOMPLETE_METADATA``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.

+.. _dataverse.ui.show-validity-label-when-published:
+
+dataverse.ui.show-validity-label-when-published
++++++++++++++++++++++++++++++++++++++++++++++++
+
+Even when you do not allow incomplete metadata to be saved in Dataverse, some metadata may end up incomplete, e.g., after a metadata field is made mandatory. Datasets in which that field is not filled out become incomplete and can therefore be labeled with the ``incomplete metadata`` label.
+
+When this option is enabled (the default), published datasets with incomplete metadata carry that label, but only for users who can edit them; drafts always show it. You can list the affected datasets, for example, with the metadata validity filter on the "My Data" page, which can be turned on via the :ref:`dataverse.ui.show-validity-filter` option.
+
+When this option is set to ``false``, only draft datasets with incomplete metadata are labeled; published datasets never are, regardless of edit rights. Note that you need to reindex the datasets after changing metadata definitions: reindexing updates the labels and other dataset information to match the new rules.
+
+Defaults to ``true``.
+
+Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
+``DATAVERSE_UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.

 .. _dataverse.signposting.level1-author-limit:

 dataverse.signposting.level1-author-limit

@@ -3142,6 +3160,8 @@ Defaults to ``false``.

 Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
 ``DATAVERSE_UI_ALLOW_REVIEW_FOR_INCOMPLETE``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.

+.. _dataverse.ui.show-validity-filter:
+
 dataverse.ui.show-validity-filter
 +++++++++++++++++++++++++++++++++
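A minimal sketch of applying the new option, assuming a standard Payara install (the asadmin path and the env-var route both vary by deployment):

```bash
# As a JVM option on Payara, the usual pattern for dataverse.* settings:
./asadmin create-jvm-options "-Ddataverse.ui.show-validity-label-when-published=false"

# Or as the MicroProfile environment variable, e.g. in a container:
export DATAVERSE_UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED=false
```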

src/main/java/edu/harvard/iq/dataverse/DatasetPage.java (+3 -5)
@@ -2296,13 +2296,11 @@ private void displayPublishMessage(){

     public boolean isValid() {
         if (valid == null) {
-            DatasetVersion version = dataset.getLatestVersion();
-            if (!version.isDraft()) {
+            if (workingVersion.isDraft() || (canUpdateDataset() && JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class).orElse(true))) {
+                valid = workingVersion.isValid();
+            } else {
                 valid = true;
             }
-            DatasetVersion newVersion = version.cloneDatasetVersion();
-            newVersion.setDatasetFields(newVersion.initDatasetFields());
-            valid = newVersion.isValid();
         }
         return valid;
     }
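Distilled, the page beans now make a single decision; a hypothetical standalone helper (names are illustrative, not part of the commit):

```java
/**
 * Sketch of the new DatasetPage/FilePage logic: real validity is surfaced
 * (and the "incomplete metadata" label can appear) for drafts, or for
 * published versions the viewer can edit while the
 * show-validity-label-when-published option (default true) is on.
 */
static boolean effectiveValidity(boolean isDraft, boolean canEdit,
                                 boolean showLabelWhenPublished,
                                 boolean versionIsValid) {
    if (isDraft || (canEdit && showLabelWhenPublished)) {
        return versionIsValid; // the label appears when this is false
    }
    return true; // reported valid, so no label for other viewers
}
```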

src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java (+30 -1)
@@ -1728,7 +1728,36 @@ public List<ConstraintViolation<DatasetField>> validateRequired() {
 }

 public boolean isValid() {
-    return validate().isEmpty();
+    // first clone to leave the original untouched
+    final DatasetVersion newVersion = this.cloneDatasetVersion();
+    // initDatasetFields
+    newVersion.setDatasetFields(newVersion.initDatasetFields());
+    // remove special "N/A" values and empty values
+    newVersion.removeEmptyValues();
+    // check validity of present fields and detect missing mandatory fields
+    return newVersion.validate().isEmpty();
+}
+
+private void removeEmptyValues() {
+    if (this.getDatasetFields() != null) {
+        for (DatasetField dsf : this.getDatasetFields()) {
+            removeEmptyValues(dsf);
+        }
+    }
+}
+
+private void removeEmptyValues(DatasetField dsf) {
+    if (dsf.getDatasetFieldType().isPrimitive()) { // primitive
+        final Iterator<DatasetFieldValue> i = dsf.getDatasetFieldValues().iterator();
+        while (i.hasNext()) {
+            final String v = i.next().getValue();
+            if (StringUtils.isBlank(v) || DatasetField.NA_VALUE.equals(v)) {
+                i.remove();
+            }
+        }
+    } else {
+        dsf.getDatasetFieldCompoundValues().forEach(cv -> cv.getChildDatasetFields().forEach(v -> removeEmptyValues(v)));
+    }
 }

 public Set<ConstraintViolation> validate() {
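The stripping rule is the behavioral core of this change; a self-contained sketch of it (plain strings stand in for DatasetFieldValue, and NA_VALUE mirrors DatasetField.NA_VALUE):

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class StripDemo {
    static final String NA_VALUE = "N/A"; // mirrors DatasetField.NA_VALUE

    public static void main(String[] args) {
        // Values as a user might leave them: one real value, one "N/A", blanks
        List<String> values = new ArrayList<>(List.of("Canada", "N/A", "  ", ""));
        // The same pruning removeEmptyValues() applies to primitive field
        // values; StringUtils.isBlank(v) is approximated by null/isBlank
        for (Iterator<String> i = values.iterator(); i.hasNext(); ) {
            String v = i.next();
            if (v == null || v.isBlank() || NA_VALUE.equals(v)) {
                i.remove();
            }
        }
        // Prints [Canada]: a field holding only "N/A" or blanks ends up empty,
        // so a required field filled with "N/A" now fails validation
        System.out.println(values);
    }
}
```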

src/main/java/edu/harvard/iq/dataverse/Dataverse.java (+8 -0)
@@ -411,6 +411,14 @@ public List<DataverseFieldTypeInputLevel> getDataverseFieldTypeInputLevels() {
     return dataverseFieldTypeInputLevels;
 }

+public boolean isDatasetFieldTypeRequiredAsInputLevel(Long datasetFieldTypeId) {
+    for (DataverseFieldTypeInputLevel dataverseFieldTypeInputLevel : dataverseFieldTypeInputLevels) {
+        if (dataverseFieldTypeInputLevel.getDatasetFieldType().getId().equals(datasetFieldTypeId) && dataverseFieldTypeInputLevel.isRequired()) {
+            return true;
+        }
+    }
+    return false;
+}

 public Template getDefaultTemplate() {
     return defaultTemplate;
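The same check reads a little tighter as a stream; a sketch of the alternative using the entity types from the diff, not part of the commit:

```java
// Is any of this collection's input levels both for the given field type and
// marked required? Equivalent to the loop above.
public boolean isDatasetFieldTypeRequiredAsInputLevel(Long datasetFieldTypeId) {
    return dataverseFieldTypeInputLevels.stream()
            .anyMatch(level -> level.getDatasetFieldType().getId().equals(datasetFieldTypeId)
                            && level.isRequired());
}
```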

src/main/java/edu/harvard/iq/dataverse/FilePage.java (+11 -5)
@@ -34,6 +34,7 @@
 import edu.harvard.iq.dataverse.makedatacount.MakeDataCountLoggingServiceBean;
 import edu.harvard.iq.dataverse.makedatacount.MakeDataCountLoggingServiceBean.MakeDataCountEntry;
 import edu.harvard.iq.dataverse.privateurl.PrivateUrlServiceBean;
+import edu.harvard.iq.dataverse.settings.JvmSettings;
 import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
 import edu.harvard.iq.dataverse.util.BundleUtil;
 import edu.harvard.iq.dataverse.util.FileUtil;

@@ -314,13 +315,18 @@ private void displayPublishMessage(){
     }
 }

+Boolean valid = null;
+
 public boolean isValid() {
-    if (!fileMetadata.getDatasetVersion().isDraft()) {
-        return true;
+    if (valid == null) {
+        final DatasetVersion workingVersion = fileMetadata.getDatasetVersion();
+        if (workingVersion.isDraft() || (canUpdateDataset() && JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class).orElse(true))) {
+            valid = workingVersion.isValid();
+        } else {
+            valid = true;
+        }
     }
-    DatasetVersion newVersion = fileMetadata.getDatasetVersion().cloneDatasetVersion();
-    newVersion.setDatasetFields(newVersion.initDatasetFields());
-    return newVersion.isValid();
+    return valid;
 }

 private boolean canViewUnpublishedDataset() {

src/main/java/edu/harvard/iq/dataverse/MetadataBlockServiceBean.java (+9 -1)
@@ -58,10 +58,18 @@ public List<MetadataBlock> listMetadataBlocksDisplayedOnCreate(Dataverse ownerDa

 if (ownerDataverse != null) {
     Root<Dataverse> dataverseRoot = criteriaQuery.from(Dataverse.class);
+    Join<Dataverse, DataverseFieldTypeInputLevel> datasetFieldTypeInputLevelJoin = dataverseRoot.join("dataverseFieldTypeInputLevels", JoinType.LEFT);
+
+    Predicate requiredPredicate = criteriaBuilder.and(
+            datasetFieldTypeInputLevelJoin.get("datasetFieldType").in(metadataBlockRoot.get("datasetFieldTypes")),
+            criteriaBuilder.isTrue(datasetFieldTypeInputLevelJoin.get("required")));
+
+    Predicate unionPredicate = criteriaBuilder.or(displayOnCreatePredicate, requiredPredicate);
+
     criteriaQuery.where(criteriaBuilder.and(
             criteriaBuilder.equal(dataverseRoot.get("id"), ownerDataverse.getId()),
             metadataBlockRoot.in(dataverseRoot.get("metadataBlocks")),
-            displayOnCreatePredicate
+            unionPredicate
     ));
 } else {
     criteriaQuery.where(displayOnCreatePredicate);
src/main/java/edu/harvard/iq/dataverse/Shib.java (+3 -0)
@@ -59,6 +59,8 @@ public class Shib implements java.io.Serializable {
     SettingsServiceBean settingsService;
     @EJB
     SystemConfig systemConfig;
+    @EJB
+    UserServiceBean userService;

     HttpServletRequest request;

@@ -259,6 +261,7 @@ else if (ShibAffiliationOrder.equals("firstAffiliation")) {
     state = State.REGULAR_LOGIN_INTO_EXISTING_SHIB_ACCOUNT;
     logger.fine("Found user based on " + userPersistentId + ". Logging in.");
     logger.fine("Updating display info for " + au.getName());
+    userService.updateLastLogin(au);
     authSvc.updateAuthenticatedUser(au, displayInfo);
     logInUserAndSetShibAttributes(au);
     String prettyFacesHomePageString = getPrettyFacesHomePageString(false);
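Presumably the new ``userService.updateLastLogin(au)`` call refreshes the user's last-login timestamp when an existing Shibboleth account signs in, a step the previous code path skipped.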
