Commit 2d7bd80

Merge remote-tracking branch 'IQSS/develop' into IQSS/12020-Payara7_Java21_update

2 parents 7c47505 + 89cf927

8 files changed

Lines changed: 44 additions & 26 deletions


.github/workflows/container_base_push.yml

Lines changed: 1 addition & 1 deletion
@@ -50,7 +50,7 @@ jobs:
       # In case this is a push to develop, we care about buildtime.
       # Configure a remote ARM64 build host in addition to the local AMD64 in two steps.
       - name: Setup SSH agent
-        uses: webfactory/ssh-agent@v0.9.1
+        uses: webfactory/ssh-agent@v0.10.0
         with:
           ssh-private-key: ${{ secrets.BUILDER_ARM64_SSH_PRIVATE_KEY }}
       - name: Provide the known hosts key and the builder config

doc/sphinx-guides/source/admin/metadatacustomization.rst

Lines changed: 0 additions & 4 deletions
@@ -666,10 +666,6 @@ When creating new metadata blocks, please review the :doc:`/style/text` section

 If there are tips that you feel are omitted from this document, please open an issue at https://github.com/IQSS/dataverse/issues and consider making a pull request to make improvements. You can find this document at https://github.com/IQSS/dataverse/blob/develop/doc/sphinx-guides/source/admin/metadatacustomization.rst

-Alternatively, you are welcome to request "edit" access to this "Tips for Dataverse Software metadata blocks from the community" Google doc: https://docs.google.com/document/d/1XpblRw0v0SvV-Bq6njlN96WyHJ7tqG0WWejqBdl7hE0/edit?usp=sharing
-
-The thinking is that the tips can become issues and the issues can eventually be worked on as features to improve the Dataverse Software metadata system.
-
 Development Tasks Specific to Changing Fields in Core Metadata Blocks
 ---------------------------------------------------------------------

doc/sphinx-guides/source/api/native-api.rst

Lines changed: 1 addition & 1 deletion
@@ -6424,7 +6424,7 @@ The fully expanded example above (without environment variables) looks like this
 Show Disclaimer for Publishing Datasets
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-The setting "PublishDatasetDisclaimerText", when set, will prevent a draft dataset from being published through the UI without the user acknowledging the disclaimer.
+The setting :ref:`:PublishDatasetDisclaimerText`, when set, will prevent a draft dataset from being published through the UI without the user acknowledging the disclaimer.

 .. note:: See :ref:`show-custom-popup-for-publishing-datasets` if the user acknowledgment is not required but you want the message to be displayed in the UI.

 .. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of export below.

doc/sphinx-guides/source/installation/config.rst

Lines changed: 13 additions & 0 deletions
@@ -4075,7 +4075,12 @@ dataverse.feature.only-update-datacite-when-needed

 Only contact DataCite to update a DOI after checking to see if DataCite has outdated information (for efficiency, lighter load on DataCite, especially when using file DOIs).

+.. _dataverse.feature.require-embargo-reason:
+
+dataverse.feature.require-embargo-reason
+++++++++++++++++++++++++++++++++++++++++
+
+Require an embargo reason when a user creates an embargo on one or more files. See :ref:`embargoes`.

 .. _:ApplicationServerSettings:

@@ -4826,6 +4831,10 @@ If you have a long text string, you can upload it as a file as in the example be

 ``curl -X PUT --upload-file /tmp/long.txt http://localhost:8080/api/admin/settings/:DatasetPublishPopupCustomText``

+There is a related setting called :ref:`:PublishDatasetDisclaimerText` that also makes text appear on the popup when publishing, but it requires a checkbox to be clicked.
+
+See also :ref:`show-custom-popup-for-publishing-datasets` in the API Guide.
+
 :DatasetPublishPopupCustomTextOnAllVersions
 +++++++++++++++++++++++++++++++++++++++++++

@@ -5356,6 +5365,10 @@ The text displayed to the user that must be acknowledged prior to publishing a D

 ``curl -X PUT -d "By publishing this dataset, I fully accept all legal responsibility for ensuring that the deposited content is: anonymized, free of copyright violations, and contains data that is computationally reusable. I understand and agree that any violation of these conditions may result in the immediate removal of the dataset by the repository without prior notice." http://localhost:8080/api/admin/settings/:PublishDatasetDisclaimerText``

+There is a similar setting called :ref:`:DatasetPublishPopupCustomText` that also makes text appear on the popup when publishing, but it is only informational. There is no checkbox to click.
+
+See also :ref:`show-disclaimer-for-publishing-datasets` in the API Guide.
+
 .. _:BagItHandlerEnabled:

 :BagItHandlerEnabled
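The curl examples above map directly onto any HTTP client. Here is a minimal, hedged sketch in Java that builds the same request (the class and method names are ours for illustration; only the endpoint path and the PUT verb come from the documentation above):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class DisclaimerSettingRequest {

    // Builds a PUT request equivalent to:
    // curl -X PUT -d "<text>" <base>/api/admin/settings/:PublishDatasetDisclaimerText
    public static HttpRequest buildPut(String baseUrl, String disclaimerText) {
        return HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/admin/settings/:PublishDatasetDisclaimerText"))
                .PUT(HttpRequest.BodyPublishers.ofString(disclaimerText))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildPut("http://localhost:8080", "I accept responsibility for this dataset.");
        // Actually sending it (HttpClient.newHttpClient().send(...)) requires a running Dataverse instance.
        System.out.println(req.method() + " " + req.uri());
    }
}
```

Note that the admin settings API is typically reachable only from localhost (or via an admin-enabled endpoint), so this is a sketch of the request shape, not a remote-administration recipe.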

doc/sphinx-guides/source/user/dataset-management.rst

Lines changed: 3 additions & 1 deletion
@@ -750,7 +750,7 @@ Note that only one Preview URL (normal or with anonymized access) can be configu
 Embargoes
 =========

-A Dataverse instance may be configured to support file-level embargoes. Embargoes make file content inaccessible after a dataset version is published - until the embargo end date. A reason for the embargo may be supplied when creating the embargo. A reason may be required in some Dataverse instances.
+A Dataverse instance may be configured to support file-level embargoes. Embargoes make file content inaccessible after a dataset version is published - until the embargo end date. A reason for the embargo may be supplied when creating the embargo. A reason may be :ref:`required <dataverse.feature.require-embargo-reason>` in some Dataverse instances.
 This means that file previews and the ability to download files will be blocked. The effect is similar to when a file is restricted except that the embargo will end at the specified date without further action and during the embargo, requests for file access cannot be made.
 Embargoes of files in a version 1.0 dataset may also affect the date shown in the dataset and file citations. The recommended practice is for the citation to reflect the date on which all embargoes on files in version 1.0 end. (Since Dataverse creates one persistent identifier per dataset and doesn't create new ones for each version, the publication of later versions, with or without embargoed files, does not affect the citation date.)

@@ -872,6 +872,8 @@ If your installation is configured to use DataCite as a persistent ID (PID) prov
 Review Datasets
 ---------------

+.. note:: This is an experimental feature.
+
 .. _review-datasets-overview:

 Review Dataset Overview

src/main/java/edu/harvard/iq/dataverse/DvObjectServiceBean.java

Lines changed: 4 additions & 0 deletions
@@ -223,6 +223,10 @@ public DvObject updateContentIndexTime(DvObject dvObject) {
      * updateContentIndexTime method.
      */
     @TransactionAttribute(REQUIRES_NEW)
+    public DvObject updatePermissionIndexTimeInNewTransaction(DvObject dvObject) {
+        return updatePermissionIndexTime(dvObject);
+    }
+
     public DvObject updatePermissionIndexTime(DvObject dvObject) {
         /**
          * @todo to avoid a possible OptimisticLockException, should we merge

src/main/java/edu/harvard/iq/dataverse/search/SolrIndexServiceBean.java

Lines changed: 6 additions & 1 deletion
@@ -300,7 +300,12 @@ public IndexResponse indexPermissionsForOneDvObject(DvObject dvObject) {
         persistToSolr(docs);
         boolean updatePermissionTimeSuccessful = false;
         if (dvObject != null) {
-            DvObject savedDvObject = dvObjectService.updatePermissionIndexTime(dvObject);
+            DvObject savedDvObject = null;
+            if (dvObject.isInstanceofDataset()) {
+                savedDvObject = dvObjectService.updatePermissionIndexTimeInNewTransaction(dvObject);
+            } else {
+                savedDvObject = dvObjectService.updatePermissionIndexTime(dvObject);
+            }
             if (savedDvObject != null) {
                 updatePermissionTimeSuccessful = true;
             }
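Taken together with the `DvObjectServiceBean` change above, this routes datasets through a `REQUIRES_NEW` wrapper, presumably so the permission-index timestamp commits in its own transaction rather than inside the larger indexing one. A toy sketch of just the routing (plain Java, stand-in types and return labels of our own, no container or Solr involved):

```java
public class PermissionIndexRouting {

    // Hypothetical stand-in for DvObject; only the check the diff uses.
    interface DvObject {
        boolean isInstanceofDataset();
    }

    // Stand-ins for the two service methods; labels make the routing observable.
    static String updatePermissionIndexTime(DvObject o) {
        return "current-transaction";
    }

    static String updatePermissionIndexTimeInNewTransaction(DvObject o) {
        return "new-transaction";
    }

    // Mirrors the branch added to indexPermissionsForOneDvObject:
    // datasets get a fresh transaction, everything else stays in the caller's.
    static String route(DvObject dvObject) {
        if (dvObject.isInstanceofDataset()) {
            return updatePermissionIndexTimeInNewTransaction(dvObject);
        }
        return updatePermissionIndexTime(dvObject);
    }

    public static void main(String[] args) {
        System.out.println(route(() -> true));   // dataset path
        System.out.println(route(() -> false));  // non-dataset path
    }
}
```

In the real code, the new-transaction behavior comes from the container honoring `@TransactionAttribute(REQUIRES_NEW)` on the bean method; a direct `this.`-call inside the same bean would bypass that interceptor, which is why the wrapper lives on the injected `dvObjectService`.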

src/main/java/edu/harvard/iq/dataverse/util/bagit/BagGenerator.java

Lines changed: 16 additions & 18 deletions
@@ -123,7 +123,6 @@ public class BagGenerator {
     private PoolingHttpClientConnectionManager cm = null;

     private ChecksumType hashtype = null;
-    private boolean ignorehashes = false;

     private long dataCount = 0l;
     private long totalDataSize = 0l;

@@ -158,6 +157,8 @@ public class BagGenerator {
     private boolean usingFetchFile = false;
     private boolean createHoleyBag = false;
     private List<FileEntry> oversizedFiles = new ArrayList<>();
+
+    private ChecksumType defaultHashtype;

     // Bag-info.txt field labels
     private static final String CONTACT_NAME = "Contact-Name: ";

@@ -244,6 +245,13 @@ public BagGenerator(jakarta.json.JsonObject oremapObject, String dataciteXml, Ma
             e.printStackTrace();
         }
         initializeHoleyBagLimits();
+        try {
+            // Use the current type if we can retrieve it
+            defaultHashtype = CDI.current().select(SystemConfig.class).get().getFileFixityChecksumAlgorithm();
+        } catch (Exception e) {
+            // Default to MD5 if we can't
+            defaultHashtype = DataFile.ChecksumType.MD5;
+        }
     }

     private void initializeHoleyBagLimits() {

@@ -255,10 +263,6 @@ private void initializeHoleyBagLimits() {
                 ", createHoleyBag: " + createHoleyBag);
     }

-    public void setIgnoreHashes(boolean val) {
-        ignorehashes = val;
-    }
-
     public static void println(String s) {
         System.out.println(s);
         System.out.flush();

@@ -353,13 +357,8 @@ public boolean generateBag(OutputStream outputStream) throws Exception {
             sha1StringBuffer.append(sha1Entry.getValue() + " " + path);
         }
         if(hashtype == null) { // No files - still want to send an empty manifest to nominally comply with BagIT specification requirement.
-            try {
-                // Use the current type if we can retrieve it
-                hashtype = CDI.current().select(SystemConfig.class).get().getFileFixityChecksumAlgorithm();
-            } catch (Exception e) {
-                // Default to MD5 if we can't
-                hashtype = DataFile.ChecksumType.MD5;
-            }
+            // Use the default
+            hashtype = defaultHashtype;
         }
         if (!(hashtype == null)) {
             String manifestName = "manifest-";

@@ -644,10 +643,6 @@ private void processAllFiles(List<FileEntry> sortedFiles)
         // Track titles to detect duplicates
         Set<String> titles = new HashSet<>();

-        if ((hashtype == null) | ignorehashes) {
-            hashtype = DataFile.ChecksumType.SHA512;
-        }
-
         for (FileEntry entry : sortedFiles) {
             // Extract all needed information from the JsonObject reference
             JsonObject child = entry.jsonObject;

@@ -679,12 +674,15 @@ private void processAllFiles(List<FileEntry> sortedFiles)
                 childHash = child.getAsJsonObject(JsonLDTerm.checksum.getLabel()).get("@value").getAsString();
             }
         }
-
+        // Pick a hashtype if we encounter a file that doesn't have one
+        if (hashtype == null) {
+            hashtype = defaultHashtype;
+        }
         resourceUsed[entry.resourceIndex] = true;
         String dataUrl = entry.getDataUrl();

         try {
-            if ((childHash == null) | ignorehashes) {
+            if (childHash == null) {
                 // Generate missing hash

                 try (InputStream inputStream = getInputStreamSupplier(dataUrl).get()){
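The core of this refactor is resolving the configured fixity algorithm once, in the constructor, with an MD5 fallback, instead of repeating the lookup (and the removed `ignorehashes` special case) at each use site. A hedged sketch of that pattern with stand-in names (the real code resolves `SystemConfig` via CDI; here a `Supplier` stands in for that lookup):

```java
import java.util.function.Supplier;

public class ChecksumDefaultSketch {

    enum ChecksumType { MD5, SHA1, SHA256, SHA512 }

    private final ChecksumType defaultHashtype;

    // Mirrors the BagGenerator constructor change: resolve the configured
    // algorithm once, falling back to MD5 if the lookup throws.
    ChecksumDefaultSketch(Supplier<ChecksumType> configLookup) {
        ChecksumType resolved;
        try {
            resolved = configLookup.get();
        } catch (Exception e) {
            resolved = ChecksumType.MD5;
        }
        this.defaultHashtype = resolved;
    }

    ChecksumType getDefaultHashtype() {
        return defaultHashtype;
    }

    public static void main(String[] args) {
        ChecksumDefaultSketch configured =
                new ChecksumDefaultSketch(() -> ChecksumType.SHA256);
        ChecksumDefaultSketch failing =
                new ChecksumDefaultSketch(() -> { throw new IllegalStateException("no config"); });
        System.out.println(configured.getDefaultHashtype()); // SHA256
        System.out.println(failing.getDefaultHashtype());    // MD5
    }
}
```

Caching the value in a `final`-style field keeps later decisions (`hashtype == null` fallbacks in `generateBag` and `processAllFiles`) consistent even if configuration becomes unavailable mid-run.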
