Commit 30f560e

Merge branch 'develop' into 10341-croissant #10341
2 parents: c7a7057 + 23a4d9b
11 files changed: +60 -54 lines
@@ -0,0 +1 @@
+A bug that prevented the Ingest option in the File page Edit File menu from working has been fixed.
@@ -0,0 +1 @@
+Dataverse will use the dataset thumbnail, if one is defined, rather than the generic Dataverse logo in the Open Graph metadata header. This means the thumbnail will be shown when, for example, the dataset is referenced on Facebook.
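
For context, a sketch of the header this produces for a published dataset with a thumbnail (hypothetical host and DOI; the URL pattern comes from the dataset.xhtml change below):

    <meta property="og:image" content="https://demo.dataverse.org/api/datasets/:persistentId/thumbnail?persistentId=doi:10.5072/FK2/J8SJZB" />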

doc/sphinx-guides/source/admin/harvestclients.rst (+5)

@@ -47,3 +47,8 @@ What if a Run Fails?
 Each harvesting client run logs a separate file per run to the app server's default logging directory (``/usr/local/payara6/glassfish/domains/domain1/logs/`` unless you've changed it). Look for filenames in the format ``harvest_TARGET_YYYY_MM_DD_timestamp.log`` to get a better idea of what's going wrong.

 Note that you'll want to run a minimum of Dataverse Software 4.6, optimally 4.18 or beyond, for the best OAI-PMH interoperability.
+
+Harvesting Non-OAI-PMH
+~~~~~~~~~~~~~~~~~~~~~~
+
+`DOI2PMH <https://github.com/IQSS/doi2pmh-server>`__ is a community-driven project intended to allow OAI-PMH harvesting from non-OAI-PMH sources.
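
(Aside: a quick way to pull up the newest run log, assuming the default logging directory named above: ``ls -t /usr/local/payara6/glassfish/domains/domain1/logs/harvest_*.log | head -n 1``.)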

doc/sphinx-guides/source/api/apps.rst (+7)

@@ -133,6 +133,13 @@ https://github.com/libis/rdm-integration
 PHP
 ---

+DOI2PMH
+~~~~~~~
+
+The DOI2PMH server allows Dataverse instances to harvest DOIs through OAI-PMH from otherwise unharvestable sources.
+
+https://github.com/IQSS/doi2pmh-server
+
 OJS
 ~~~

doc/sphinx-guides/source/api/native-api.rst (+1 -1)

@@ -1179,7 +1179,7 @@ See also :ref:`batch-exports-through-the-api` and the note below:
   export PERSISTENT_IDENTIFIER=doi:10.5072/FK2/J8SJZB
   export METADATA_FORMAT=ddi

-  curl "$SERVER_URL/api/datasets/export?exporter=$METADATA_FORMAT&persistentId=PERSISTENT_IDENTIFIER"
+  curl "$SERVER_URL/api/datasets/export?exporter=$METADATA_FORMAT&persistentId=$PERSISTENT_IDENTIFIER"

 The fully expanded example above (without environment variables) looks like this:
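
(The expanded example itself falls outside this hunk. With the missing ``$`` restored, and assuming ``SERVER_URL=https://demo.dataverse.org`` is exported earlier in the guide, it would read:)

    curl "https://demo.dataverse.org/api/datasets/export?exporter=ddi&persistentId=doi:10.5072/FK2/J8SJZB"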

doc/sphinx-guides/source/developers/deployment.rst (+3 -9)

@@ -91,17 +91,11 @@ Download `ec2-create-instance.sh`_ and put it somewhere reasonable. For the purp

 .. _ec2-create-instance.sh: https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/ec2/ec2-create-instance.sh

-To run it with default values you just need the script, but you may also want a current copy of the ansible `group vars <https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/defaults/main.yml>`_ file.
+To run the script, you can make it executable (``chmod 755 ec2-create-instance.sh``) or run it with bash, like this with ``-h`` as an argument to print the help:

-ec2-create-instance accepts a number of command-line switches, including:
+``bash ~/Downloads/ec2-create-instance.sh -h``

-* -r: GitHub Repository URL (defaults to https://github.com/IQSS/dataverse.git)
-* -b: branch to build (defaults to develop)
-* -p: pemfile directory (defaults to $HOME)
-* -g: Ansible GroupVars file (if you wish to override role defaults)
-* -h: help (displays usage for each available option)
-
-``bash ~/Downloads/ec2-create-instance.sh -b develop -r https://github.com/scholarsportal/dataverse.git -g main.yml``
+If you run the script without any arguments, it should spin up the latest version of Dataverse.

 You will need to wait for 15 minutes or so until the deployment is finished, longer if you've enabled sample data and/or the API test suite. Eventually, the output should tell you how to access the Dataverse installation in a web browser or via SSH. It will also provide instructions on how to delete the instance when you are finished with it. Please be aware that AWS charges per minute for a running instance. You may also delete your instance from https://console.aws.amazon.com/console/home?region=us-east-1 .
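
(Aside: per the switch list removed above, a non-default run would still look something like ``bash ~/Downloads/ec2-create-instance.sh -b develop -r https://github.com/IQSS/dataverse.git`` — shown only as illustration; the docs now steer readers to ``-h`` for the current options.)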

src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java (+27 -26)

@@ -19,8 +19,6 @@
 import edu.harvard.iq.dataverse.export.ExportService;
 import edu.harvard.iq.dataverse.globus.GlobusServiceBean;
 import edu.harvard.iq.dataverse.harvest.server.OAIRecordServiceBean;
-import edu.harvard.iq.dataverse.pidproviders.PidProvider;
-import edu.harvard.iq.dataverse.pidproviders.PidUtil;
 import edu.harvard.iq.dataverse.search.IndexServiceBean;
 import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
 import edu.harvard.iq.dataverse.util.BundleUtil;

@@ -41,11 +39,10 @@
 import jakarta.ejb.TransactionAttributeType;
 import jakarta.inject.Named;
 import jakarta.persistence.EntityManager;
-import jakarta.persistence.LockModeType;
 import jakarta.persistence.NoResultException;
+import jakarta.persistence.NonUniqueResultException;
 import jakarta.persistence.PersistenceContext;
 import jakarta.persistence.Query;
-import jakarta.persistence.StoredProcedureQuery;
 import jakarta.persistence.TypedQuery;
 import org.apache.commons.lang3.StringUtils;

@@ -115,28 +112,32 @@ public Dataset find(Object pk) {
      * @return a dataset with pre-fetched file objects
      */
     public Dataset findDeep(Object pk) {
-        return (Dataset) em.createNamedQuery("Dataset.findById")
-            .setParameter("id", pk)
-            // Optimization hints: retrieve all data in one query; this prevents point queries when iterating over the files
-            .setHint("eclipselink.left-join-fetch", "o.files.ingestRequest")
-            .setHint("eclipselink.left-join-fetch", "o.files.thumbnailForDataset")
-            .setHint("eclipselink.left-join-fetch", "o.files.dataTables")
-            .setHint("eclipselink.left-join-fetch", "o.files.auxiliaryFiles")
-            .setHint("eclipselink.left-join-fetch", "o.files.ingestReports")
-            .setHint("eclipselink.left-join-fetch", "o.files.dataFileTags")
-            .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas")
-            .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.fileCategories")
-            .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.varGroups")
-            //.setHint("eclipselink.left-join-fetch", "o.files.guestbookResponses
-            .setHint("eclipselink.left-join-fetch", "o.files.embargo")
-            .setHint("eclipselink.left-join-fetch", "o.files.retention")
-            .setHint("eclipselink.left-join-fetch", "o.files.fileAccessRequests")
-            .setHint("eclipselink.left-join-fetch", "o.files.owner")
-            .setHint("eclipselink.left-join-fetch", "o.files.releaseUser")
-            .setHint("eclipselink.left-join-fetch", "o.files.creator")
-            .setHint("eclipselink.left-join-fetch", "o.files.alternativePersistentIndentifiers")
-            .setHint("eclipselink.left-join-fetch", "o.files.roleAssignments")
-            .getSingleResult();
+        try {
+            return (Dataset) em.createNamedQuery("Dataset.findById")
+                .setParameter("id", pk)
+                // Optimization hints: retrieve all data in one query; this prevents point queries when iterating over the files
+                .setHint("eclipselink.left-join-fetch", "o.files.ingestRequest")
+                .setHint("eclipselink.left-join-fetch", "o.files.thumbnailForDataset")
+                .setHint("eclipselink.left-join-fetch", "o.files.dataTables")
+                .setHint("eclipselink.left-join-fetch", "o.files.auxiliaryFiles")
+                .setHint("eclipselink.left-join-fetch", "o.files.ingestReports")
+                .setHint("eclipselink.left-join-fetch", "o.files.dataFileTags")
+                .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas")
+                .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.fileCategories")
+                .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.varGroups")
+                //.setHint("eclipselink.left-join-fetch", "o.files.guestbookResponses
+                .setHint("eclipselink.left-join-fetch", "o.files.embargo")
+                .setHint("eclipselink.left-join-fetch", "o.files.retention")
+                .setHint("eclipselink.left-join-fetch", "o.files.fileAccessRequests")
+                .setHint("eclipselink.left-join-fetch", "o.files.owner")
+                .setHint("eclipselink.left-join-fetch", "o.files.releaseUser")
+                .setHint("eclipselink.left-join-fetch", "o.files.creator")
+                .setHint("eclipselink.left-join-fetch", "o.files.alternativePersistentIndentifiers")
+                .setHint("eclipselink.left-join-fetch", "o.files.roleAssignments")
+                .getSingleResult();
+        } catch (NoResultException | NonUniqueResultException ex) {
+            return null;
+        }
     }

 public List<Dataset> findByOwnerId(Long ownerId) {
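
Since ``findDeep`` now returns null instead of throwing when the id does not resolve to exactly one dataset, callers need a null check. A minimal caller sketch (hypothetical names, assuming an injected bean):

    @EJB
    DatasetServiceBean datasetService;

    void showDataset(Long id) {
        Dataset dataset = datasetService.findDeep(id);   // null if missing or ambiguous
        if (dataset == null) {
            // e.g. the dataset was deleted between page load and this request
            throw new IllegalStateException("Dataset " + id + " not found");
        }
        // files and their metadata are pre-fetched; safe to iterate without extra queries
    }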

src/main/java/edu/harvard/iq/dataverse/FilePage.java (+10 -11)

@@ -522,10 +522,9 @@ public String ingestFile() throws CommandException{
             return null;
         }

-        DataFile dataFile = fileMetadata.getDataFile();
-        editDataset = dataFile.getOwner();
+        editDataset = file.getOwner();

-        if (dataFile.isTabularData()) {
+        if (file.isTabularData()) {
             JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("file.ingest.alreadyIngestedWarning"));
             return null;
         }

@@ -537,25 +536,25 @@ public String ingestFile() throws CommandException{
             return null;
         }

-        if (!FileUtil.canIngestAsTabular(dataFile)) {
+        if (!FileUtil.canIngestAsTabular(file)) {
             JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("file.ingest.cantIngestFileWarning"));
             return null;

         }

-        dataFile.SetIngestScheduled();
+        file.SetIngestScheduled();

-        if (dataFile.getIngestRequest() == null) {
-            dataFile.setIngestRequest(new IngestRequest(dataFile));
+        if (file.getIngestRequest() == null) {
+            file.setIngestRequest(new IngestRequest(file));
         }

-        dataFile.getIngestRequest().setForceTypeCheck(true);
+        file.getIngestRequest().setForceTypeCheck(true);

         // update the datafile, to save the newIngest request in the database:
         datafileService.save(file);

         // queue the data ingest job for asynchronous execution:
-        String status = ingestService.startIngestJobs(editDataset.getId(), new ArrayList<>(Arrays.asList(dataFile)), (AuthenticatedUser) session.getUser());
+        String status = ingestService.startIngestJobs(editDataset.getId(), new ArrayList<>(Arrays.asList(file)), (AuthenticatedUser) session.getUser());

         if (!StringUtil.isEmpty(status)) {
             // This most likely indicates some sort of a problem (for example,

@@ -565,9 +564,9 @@ public String ingestFile() throws CommandException{
             // successfully gone through the process of trying to schedule the
             // ingest job...

-            logger.warning("Ingest Status for file: " + dataFile.getId() + " : " + status);
+            logger.warning("Ingest Status for file: " + file.getId() + " : " + status);
         }
-        logger.fine("File: " + dataFile.getId() + " ingest queued");
+        logger.fine("File: " + file.getId() + " ingest queued");

         init();
         JsfHelper.addInfoMessage(BundleUtil.getStringFromBundle("file.ingest.ingestQueued"));

src/main/java/edu/harvard/iq/dataverse/engine/command/impl/MergeInAccountCommand.java (+1 -2)

@@ -14,7 +14,6 @@
 import edu.harvard.iq.dataverse.UserNotification;
 import edu.harvard.iq.dataverse.authorization.AuthenticatedUserLookup;
 import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUser;
-import edu.harvard.iq.dataverse.authorization.providers.oauth2.OAuth2TokenData;
 import edu.harvard.iq.dataverse.authorization.users.ApiToken;
 import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
 import edu.harvard.iq.dataverse.batch.util.LoggingUtil;

@@ -25,7 +24,6 @@
 import edu.harvard.iq.dataverse.engine.command.RequiredPermissions;
 import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
 import edu.harvard.iq.dataverse.engine.command.exception.IllegalCommandException;
-import edu.harvard.iq.dataverse.passwordreset.PasswordResetData;
 import edu.harvard.iq.dataverse.search.IndexResponse;
 import edu.harvard.iq.dataverse.search.savedsearch.SavedSearch;
 import edu.harvard.iq.dataverse.workflows.WorkflowComment;

@@ -177,6 +175,7 @@ protected void executeImpl(CommandContext ctxt) throws CommandException {
         ctxt.em().createNativeQuery("Delete from OAuth2TokenData where user_id ="+consumedAU.getId()).executeUpdate();

+        ctxt.em().createNativeQuery("DELETE FROM explicitgroup_authenticateduser consumed USING explicitgroup_authenticateduser ongoing WHERE consumed.containedauthenticatedusers_id="+ongoingAU.getId()+" AND ongoing.containedauthenticatedusers_id="+consumedAU.getId()).executeUpdate();
         ctxt.em().createNativeQuery("UPDATE explicitgroup_authenticateduser SET containedauthenticatedusers_id="+ongoingAU.getId()+" WHERE containedauthenticatedusers_id="+consumedAU.getId()).executeUpdate();

         ctxt.actionLog().changeUserIdentifierInHistory(consumedAU.getIdentifier(), ongoingAU.getIdentifier());
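
Read as plain SQL with hypothetical ids (ongoingAU.getId() = 1, consumedAU.getId() = 2), the new DELETE clears membership rows that would collide once the consumed user's rows are renamed, which appears intended to prevent duplicate-membership conflicts when both accounts belong to the same explicit group (the scenario the UsersIT change below exercises):

    -- new statement: remove rows that the UPDATE would duplicate
    DELETE FROM explicitgroup_authenticateduser consumed
        USING explicitgroup_authenticateduser ongoing
        WHERE consumed.containedauthenticatedusers_id = 1
          AND ongoing.containedauthenticatedusers_id = 2;

    -- existing statement: transfer the consumed user's memberships
    UPDATE explicitgroup_authenticateduser
        SET containedauthenticatedusers_id = 1
        WHERE containedauthenticatedusers_id = 2;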

src/main/webapp/dataset.xhtml (+1 -1)

@@ -86,7 +86,7 @@
 <meta property="og:title" content="#{DatasetPage.title}" />
 <meta property="og:type" content="article" />
 <meta property="og:url" content="#{DatasetPage.dataverseSiteUrl}/dataset.xhtml?persistentId=#{dataset.globalId}" />
-<meta property="og:image" content="#{DatasetPage.dataverseSiteUrl.concat(resource['images/dataverse-icon-1200.png'])}" />
+<meta property="og:image" content="#{DatasetPage.dataset.getDatasetThumbnail(ImageThumbConverter.DEFAULT_PREVIEW_SIZE) == null ? DatasetPage.dataverseSiteUrl.concat(resource['images/dataverse-icon-1200.png']): DatasetPage.dataverseSiteUrl.concat('/api/datasets/:persistentId/thumbnail?persistentId=').concat(DatasetPage.dataset.getGlobalId().asString())}" />
 <meta property="og:site_name" content="#{DatasetPage.publisher}" />
 <meta property="og:description" content="#{(DatasetPage.description.length()>150 ? DatasetPage.description.substring(0,147).concat('...') : DatasetPage.description)}" />
 <ui:repeat var="author" value="#{DatasetPage.datasetAuthors}">

src/test/java/edu/harvard/iq/dataverse/api/UsersIT.java (+3 -4)

@@ -8,6 +8,7 @@
 import edu.harvard.iq.dataverse.authorization.DataverseRole;
 import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
 import java.util.ArrayList;
+import java.util.Arrays;
 import java.util.List;
 import java.util.UUID;
 import jakarta.json.Json;

@@ -206,15 +207,13 @@ public void testMergeAccounts(){
         String aliasInOwner = "groupFor" + dataverseAlias;
         String displayName = "Group for " + dataverseAlias;
         String user2identifier = "@" + usernameConsumed;
+        String target2identifier = "@" + targetname;
         Response createGroup = UtilIT.createGroup(dataverseAlias, aliasInOwner, displayName, superuserApiToken);
         createGroup.prettyPrint();
         createGroup.then().assertThat()
                 .statusCode(CREATED.getStatusCode());

-        String groupIdentifier = JsonPath.from(createGroup.asString()).getString("data.identifier");
-
-        List<String> roleAssigneesToAdd = new ArrayList<>();
-        roleAssigneesToAdd.add(user2identifier);
+        List<String> roleAssigneesToAdd = Arrays.asList(user2identifier, target2identifier);
         Response addToGroup = UtilIT.addToGroup(dataverseAlias, aliasInOwner, roleAssigneesToAdd, superuserApiToken);
         addToGroup.prettyPrint();
         addToGroup.then().assertThat()
