diff --git a/doc/release-notes/12319-LocallyFAIRdata b/doc/release-notes/12319-LocallyFAIRdata new file mode 100644 index 00000000000..afbd0e26302 --- /dev/null +++ b/doc/release-notes/12319-LocallyFAIRdata @@ -0,0 +1,9 @@ +This release includes experimental support for "Locally FAIR" data. +This feature allows publication of content that will only be visible to authorized users or groups within a Dataverse installation. +Users without authorization will not see Locally FAIR collections, datasets, or files in search results and cannot visit their +pages or access them via the Dataverse API. + +For more information, see the [Locally FAIR Data](https://guides.dataverse.org/en/latest/user/locally-fair.html) guide. + +New Config Option: +Whether Locally FAIR content can be created is controlled by the new `dataverse.feature.allow-locally-fair-data` feature flag. \ No newline at end of file diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst index d8da7b8df26..4103b95f875 100644 --- a/doc/sphinx-guides/source/api/native-api.rst +++ b/doc/sphinx-guides/source/api/native-api.rst @@ -929,6 +929,116 @@ In particular, the user permissions that this API call checks, returned as boole curl -H "X-Dataverse-key: $API_TOKEN" -X GET "$SERVER_URL/api/dataverses/$ID/userPermissions" + List Locally FAIR Role Assignees for a Dataverse Collection + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + + Lists the Locally FAIR role assignee identifiers configured for a Dataverse collection identified by ``id``. + For more about the concept, see the :doc:`/user/locally-fair` section of the User Guide. + + This API is superuser-only. + + ..
code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export SERVER_URL=https://demo.dataverse.org + export ID=root + + curl -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/dataverses/$ID/locallyFairRoleAssignees" + + The fully expanded example above (without environment variables) looks like this: + + .. code-block:: bash + + curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" "https://demo.dataverse.org/api/dataverses/root/locallyFairRoleAssignees" + + The response is a JSON array of role assignee identifiers. For example: + + .. code-block:: json + + [ + "@TestUser", + "&maildomain/harvard.edu" + ] + + Set Locally FAIR Role Assignees for a Dataverse Collection + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + + Replaces the full set of locally FAIR role assignee identifiers for a Dataverse collection identified by ``id``. + + This API is superuser-only. + + .. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export SERVER_URL=https://demo.dataverse.org + export ID=root + export JSON='["@TestUser","&maildomain/harvard.edu"]' + + curl -H "X-Dataverse-key:$API_TOKEN" -X PUT -H "Content-Type: application/json" "$SERVER_URL/api/dataverses/$ID/locallyFairRoleAssignees" -d "$JSON" + + The fully expanded example above (without environment variables) looks like this: + + .. code-block:: bash + + curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT -H "Content-Type: application/json" "https://demo.dataverse.org/api/dataverses/root/locallyFairRoleAssignees" -d '["@TestUser","&maildomain/harvard.edu"]' + + Pass an empty array to clear all locally FAIR role assignees from the collection: + + .. code-block:: bash + + curl -H "X-Dataverse-key:$API_TOKEN" -X PUT -H "Content-Type: application/json" "$SERVER_URL/api/dataverses/$ID/locallyFairRoleAssignees" -d '[]' + + All supplied identifiers must be valid existing role assignee identifiers. 
Invalid identifiers will result in ``400 Bad Request``. + + Add a Locally FAIR Role Assignee to a Dataverse Collection + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + + Adds a single locally FAIR role assignee identifier to a Dataverse collection identified by ``id``. + + This API is superuser-only. + + .. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export SERVER_URL=https://demo.dataverse.org + export ID=root + export ROLE_ASSIGNEE='&shib/1' + + curl -H "X-Dataverse-key:$API_TOKEN" -X PUT "$SERVER_URL/api/dataverses/$ID/locallyFairRoleAssignees/$ROLE_ASSIGNEE" + + The fully expanded example above (without environment variables) looks like this: + + .. code-block:: bash + + curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT "https://demo.dataverse.org/api/dataverses/root/locallyFairRoleAssignees/&shib/1" + + The response includes the updated set of locally FAIR role assignee identifiers. + + Delete a Locally FAIR Role Assignee from a Dataverse Collection + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + + Removes a single locally FAIR role assignee identifier from a Dataverse collection identified by ``id``. + + This API is superuser-only. + + .. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export SERVER_URL=https://demo.dataverse.org + export ID=root + export ROLE_ASSIGNEE=:authenticated-users + + curl -H "X-Dataverse-key:$API_TOKEN" -X DELETE "$SERVER_URL/api/dataverses/$ID/locallyFairRoleAssignees/$ROLE_ASSIGNEE" + + The fully expanded example above (without environment variables) looks like this: + + .. code-block:: bash + + curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X DELETE "https://demo.dataverse.org/api/dataverses/root/locallyFairRoleAssignees/:authenticated-users" + + The response includes the updated set of locally FAIR role assignee identifiers.
Removing an identifier that is blank or not currently assigned will result in ``400 Bad Request``. + + .. _create-dataset-command: Create a Dataset in a Dataverse Collection diff --git a/doc/sphinx-guides/source/installation/config.rst b/doc/sphinx-guides/source/installation/config.rst index e5ed52acb83..7ebf7c89a27 100644 --- a/doc/sphinx-guides/source/installation/config.rst +++ b/doc/sphinx-guides/source/installation/config.rst @@ -4082,6 +4082,15 @@ dataverse.feature.require-embargo-reason Require an embargo reason when a user creates an embargo on one or more files. See :ref:`embargoes`. +.. _dataverse.feature.allow-locally-fair-data: + +dataverse.feature.allow-locally-fair-data ++++++++++++++++++++++++++++++++++++++++++ + +Allows support for Locally FAIR collections and datasets. +When enabled, selected content can remain visible only to authorized users or groups within a Dataverse installation. +See :doc:`/user/locally-fair` for more information. + .. _:ApplicationServerSettings: Application Server Settings diff --git a/doc/sphinx-guides/source/user/index.rst b/doc/sphinx-guides/source/user/index.rst index 7a196afe476..e7602f0ef9d 100755 --- a/doc/sphinx-guides/source/user/index.rst +++ b/doc/sphinx-guides/source/user/index.rst @@ -21,6 +21,7 @@ User Guide dataverse-management dataset-management tabulardataingest/index + locally-fair appendix .. |what-is-dataverse| image:: ./img/what-is-dataverse.svg diff --git a/doc/sphinx-guides/source/user/locally-fair.rst b/doc/sphinx-guides/source/user/locally-fair.rst new file mode 100644 index 00000000000..35231b85fed --- /dev/null +++ b/doc/sphinx-guides/source/user/locally-fair.rst @@ -0,0 +1,106 @@ +Locally FAIR +++++++++++++ + +Locally FAIR describes content that is managed according to FAIR principles +(Findable, Accessible, Interoperable, and Reusable) within a defined local or +organizational community rather than for the public internet as a whole. 
+ +Dataverse now has optional, experimental support for managing Locally FAIR collections. + +In a typical public Dataverse installation, published dataset metadata is visible +to everyone, even if the dataset's files themselves are embargoed or restricted. Locally FAIR support +extends this model by allowing some collections, and the published datasets within them, to remain +visible only to designated users or groups. This makes it possible for a single +Dataverse installation to support both: + +- public, globally discoverable content; and +- organizational content whose existence and metadata are only visible to + authorized users. + +The rationale for making some content Locally FAIR can vary. +Locally FAIR content can include: + +- sensitive research collections; +- institution-only datasets; +- datasets that should not be accessible to bots that may not adhere to the dataset license and terms; and +- projects under contractual or policy restrictions. + +Dataverse's Locally FAIR mechanism is appropriate for repositories that will house at least some data +whose metadata should only be visible to organizational members. The decision to make data Locally FAIR +is managed at the collection level, and repositories can have both FAIR and Locally FAIR content. + +.. contents:: |toctitle| + :local: + +What Locally FAIR Means +======================= + +Locally FAIR content is intended to be FAIR within a particular community. + +That means: + +- **Findable** Data is easy to locate for both humans and machines, when authorized. Locally FAIR datasets (and files if configured) have persistent identifiers, but do not use DOIs, which are publicly searchable. + +- **Accessible** Data is retrievable through standardized protocols. Authorized users can use Dataverse's standard user interface and API calls to access Locally FAIR content in the same way they do with any published data. + +- **Interoperable** Data should be compatible with other datasets and systems.
Locally FAIR datasets in Dataverse use the same standard metadata blocks as public content, and files undergo the same ingest process, use the same previewers and tools, etc. + +- **Reusable** Data should be well-described and licensed in a way that allows others to use it for future research. The licenses and terms on locally FAIR content make it clear how and when the data can be re-used. + +Why Repositories Use It +======================= + +Without Locally FAIR support, repositories may need to run separate Dataverse +installations to keep public and organization-only content apart. + +How It Differs from Restricted Files +==================================== + +Restricting or embargoing files limits access to the file contents, but in a standard public +repository the dataset's published metadata, including the list of files, would still be visible. +If a dataset allows requests for file access, anyone can request access, even if the dataset's +license or terms limit access to specific groups. + +Locally FAIR goes further. Locally FAIR collections and datasets do not appear in content listings or +search results for unauthorized users, nor can the collection, dataset, or file pages be viewed. API access +is also blocked for unauthorized users. + +Who Can See Locally FAIR Content +================================ + +Visibility is determined by superusers and is managed at the collection level. +Access can be granted to any group(s) or user(s) defined in Dataverse - the same groups/users +available when assigning roles on collections, datasets, and files. + +How Can You Tell When Content is Locally FAIR? +============================================== + +The Dataverse UI adds a "Locally FAIR" tag to all collections, datasets, and files whose visibility +is limited by the locally FAIR mechanism.
+ + +Why Is Locally FAIR Support "Experimental"? +=========================================== + +The word "experimental" is used when functionality is new, may evolve significantly in future releases, +and generally may require more effort to configure, manage, and support than more +mature functionality. + +With the current Locally FAIR implementation, managers need to be aware that they are responsible for +choosing collection settings compatible with Locally FAIR content, i.e. not using DOIs (whose metadata +is publicly accessible) or publicly visible stores, etc. Users and managers should also be aware that +some functionality that might expose Locally FAIR content, e.g. linking, may not be prohibited programmatically +but should still be avoided. Similarly, users should be aware that functionality such as metrics and quotas +may expose the existence of Locally FAIR content. If your Dataverse instance supports Locally FAIR data, +you are encouraged to report any issues and suggest further improvements. + +Things to Keep in Mind +====================== + +If your repository supports Locally FAIR content: + +- published does not always mean public; +- search and browse results may vary depending on who is logged in; +- colleagues outside your authorized group may not be able to see the same + datasets you can see; +- you should not share Locally FAIR content with others who don't have access themselves; and +- this functionality is experimental.
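The visibility rule this guide describes (content stays hidden unless the requesting user, or one of their groups, is among the collection's configured role assignees, with a superuser override) can be sketched in isolation. The following is a hypothetical, simplified illustration of that behavior; the class and method names are invented for the example and are not part of the Dataverse codebase.

```java
import java.util.Set;

// Hypothetical sketch of the Locally FAIR visibility rule described above.
// A collection is Locally FAIR when it has at least one role assignee
// identifier configured; such content is visible only to superusers and to
// users whose own identifier, or one of their group identifiers, is listed.
class LocallyFairVisibility {

    static boolean canSee(Set<String> locallyFairAssignees,
                          Set<String> userAndGroupIdentifiers,
                          boolean isSuperuser) {
        if (locallyFairAssignees.isEmpty()) {
            return true; // not Locally FAIR: publicly visible
        }
        if (isSuperuser) {
            return true; // superusers always have access
        }
        // Any match between the user's identifiers and the configured
        // assignees grants visibility.
        for (String id : userAndGroupIdentifiers) {
            if (locallyFairAssignees.contains(id)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Set<String> assignees = Set.of("@TestUser", "&maildomain/harvard.edu");
        System.out.println(canSee(assignees, Set.of("@TestUser"), false));    // true
        System.out.println(canSee(assignees, Set.of("@SomeoneElse"), false)); // false
        System.out.println(canSee(assignees, Set.of(), true));                // true (superuser)
        System.out.println(canSee(Set.of(), Set.of("@SomeoneElse"), false));  // true (not Locally FAIR)
    }
}
```

Note the asymmetry this produces for unauthorized users: rather than a "not authorized" error, Locally FAIR content simply behaves as if it does not exist (not-found responses, no search results).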
diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java index 3b1c8c4f3c4..fafbc67673e 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java +++ b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java @@ -2139,9 +2139,15 @@ private String init(boolean initFull) { return permissionsWrapper.notFound(); } - // Check permisisons - if (!(workingVersion.isReleased() || workingVersion.isDeaccessioned()) && !this.canViewUnpublishedDataset()) { - return permissionsWrapper.notAuthorized(); + // Check permissions + boolean releasedAndCanView = workingVersion.isReleased() && (!dataset.isLocallyFAIR() || permissionsWrapper + .hasLocallyFAIRAccess(dvRequestService.getDataverseRequest(), dataset)); + if (!(releasedAndCanView || workingVersion.isDeaccessioned()) && !this.canViewUnpublishedDataset()) { + if (dataset.isLocallyFAIR()) { + return permissionsWrapper.notFound(); + } else { + return permissionsWrapper.notAuthorized(); + } } if (retrieveDatasetVersionResponse != null && !retrieveDatasetVersionResponse.wasRequestedVersionRetrieved()) { diff --git a/src/main/java/edu/harvard/iq/dataverse/Dataverse.java b/src/main/java/edu/harvard/iq/dataverse/Dataverse.java index a719e32aa78..31919398530 100644 --- a/src/main/java/edu/harvard/iq/dataverse/Dataverse.java +++ b/src/main/java/edu/harvard/iq/dataverse/Dataverse.java @@ -16,7 +16,9 @@ import java.util.Objects; import java.util.Set; import jakarta.persistence.CascadeType; +import jakarta.persistence.CollectionTable; import jakarta.persistence.Column; +import jakarta.persistence.ElementCollection; import jakarta.persistence.Entity; import jakarta.persistence.EnumType; import jakarta.persistence.Enumerated; @@ -105,7 +107,40 @@ public enum DataverseType { @NotNull(message = "{dataverse.category}") @Column( nullable = false ) private DataverseType dataverseType; - + + + @ElementCollection + @CollectionTable(name = 
"dataverse_locallyfairassignees", + joinColumns = @JoinColumn(name = "dataverse_id")) + @Column(name = "assigneeidentifier") + private Set<String> locallyFAIRRoleAssigneeIdentifiers = new HashSet<>(); + + @Override + public Set<String> getLocallyFAIRRoleAssigneeIdentifiers() { + return locallyFAIRRoleAssigneeIdentifiers; + } + + public void setLocallyFAIRRoleAssigneeIdentifiers(Set<String> roleAssigneeIdentifiers) { + this.locallyFAIRRoleAssigneeIdentifiers = roleAssigneeIdentifiers; + } + + public void addLocallyFAIRRoleAssignee(String assigneeIdentifier) { + if (locallyFAIRRoleAssigneeIdentifiers == null) { + locallyFAIRRoleAssigneeIdentifiers = new HashSet<>(); + } + locallyFAIRRoleAssigneeIdentifiers.add(assigneeIdentifier); + } + + public void removeLocallyFAIRRoleAssignee(String assigneeIdentifier) { + if (locallyFAIRRoleAssigneeIdentifiers != null) { + locallyFAIRRoleAssigneeIdentifiers.remove(assigneeIdentifier); + } + } + + public boolean hasLocallyFAIRRoleAssignee(String assigneeIdentifier) { + return locallyFAIRRoleAssigneeIdentifiers != null && locallyFAIRRoleAssigneeIdentifiers.contains(assigneeIdentifier); + } + /** * When {@code true}, users are not granted permissions they got for parent * dataverses.
@@ -907,7 +942,7 @@ public boolean isAncestorOf( DvObject other ) { } return false; } - + public String getLocalURL() { return SystemConfig.getDataverseSiteUrlStatic() + "/dataverse/" + this.getAlias(); } @@ -924,4 +959,10 @@ public void addInputLevelsMetadataBlocksIfNotPresent(List block.getId().equals(metadataBlock.getId())); } + + @Override + public boolean isLocallyFAIR() { + return !locallyFAIRRoleAssigneeIdentifiers.isEmpty(); + } + } diff --git a/src/main/java/edu/harvard/iq/dataverse/DataversePage.java b/src/main/java/edu/harvard/iq/dataverse/DataversePage.java index d9cafbf421a..a8fca620fbe 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DataversePage.java +++ b/src/main/java/edu/harvard/iq/dataverse/DataversePage.java @@ -2,6 +2,7 @@ import edu.harvard.iq.dataverse.UserNotification.Type; import edu.harvard.iq.dataverse.authorization.Permission; +import edu.harvard.iq.dataverse.authorization.RoleAssignee; import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser; import edu.harvard.iq.dataverse.authorization.users.User; import edu.harvard.iq.dataverse.dataaccess.DataAccess; @@ -43,13 +44,17 @@ import java.util.AbstractMap; import java.util.ArrayList; import java.util.Arrays; +import java.util.Collections; import java.util.HashMap; import java.util.HashSet; import java.util.Map; import java.util.Map.Entry; +import java.util.Objects; import java.util.Set; import java.util.logging.Level; import java.util.logging.Logger; +import java.util.stream.Collectors; + import jakarta.faces.component.UIComponent; import jakarta.faces.component.UIInput; import org.primefaces.model.DualListModel; @@ -122,6 +127,8 @@ public enum LinkMode { PidProviderFactoryBean pidProviderFactoryBean; @EJB CacheFactoryBean cacheFactory; + @EJB + RoleAssigneeServiceBean roleAssigneeService; private Dataverse dataverse = new Dataverse(); @@ -141,6 +148,7 @@ public enum LinkMode { private List linkingDVSelectItems; private Dataverse linkingDataverse; private List 
selectedSubjects; + private List<RoleAssignee> locallyFAIRRoleAssigneesList; public List getSelectedSubjects() { return selectedSubjects; @@ -340,13 +348,21 @@ public String init() { } } - // check if dv exists and user has permission - if (dataverse == null) { - return permissionsWrapper.notFound(); - } - if (!dataverse.isReleased() && !permissionService.on(dataverse).has(Permission.ViewUnpublishedDataverse)) { - // the permission lookup above should probably be moved into the permissionsWrapper -- L.A. 5.7 - return permissionsWrapper.notAuthorized(); + // check if dv exists + if (dataverse == null) { + return permissionsWrapper.notFound(); + } + // Check permissions for unreleased dataverse and Locally FAIR permissions for released dataverses + boolean releasedAndCanView = dataverse.isReleased() && (!dataverse.isLocallyFAIR() || permissionsWrapper + .hasLocallyFAIRAccess(dvRequestService.getDataverseRequest(), dataverse)); + + if (!releasedAndCanView && !permissionService.on(dataverse).has(Permission.ViewUnpublishedDataverse)) { + // Return notFound for FAIR-restricted content, notAuthorized otherwise + if (dataverse.isLocallyFAIR()) { + return permissionsWrapper.notFound(); + } else { + return permissionsWrapper.notAuthorized(); + } } ownerId = dataverse.getOwner() != null ? dataverse.getOwner().getId() : null; @@ -1346,6 +1358,20 @@ public void updateDisplayOnCreate(Long mdbId, Long dsftId, boolean currentValue) } } } + /** + * Returns role assignees matching the search query, while excluding any assignees + * that are already associated with this dataverse through locally FAIR role assignment.
+ * + * @param query search text used to filter possible role assignees + * @return matching role assignees that can still be added to the dataverse + */ + public List<RoleAssignee> completeRoleAssignee( String query ) { + List<RoleAssignee> existingAssignees = dataverse.getLocallyFAIRRoleAssigneeIdentifiers().stream() + .map(id -> roleAssigneeService.getRoleAssignee(id)) + .filter(Objects::nonNull) + .collect(Collectors.toList()); + return roleAssigneeService.filterRoleAssignees(query, dataverse, existingAssignees); + } private void saveInputLevels(List listDFTIL, DatasetFieldType dsft, Dataverse dataverse) { // If the field already has an input level, update it @@ -1368,4 +1394,22 @@ private void saveInputLevels(List listDFTIL, Datas )); } } + + /* Get/set methods to keep the local locallyFAIRRoleAssigneesList in sync with the Dataverse's locallyFAIRRoleAssigneeIdentifiers set. + */ + public List<RoleAssignee> getLocallyFAIRRoleAssigneesList() { + if (locallyFAIRRoleAssigneesList == null) { + locallyFAIRRoleAssigneesList = dataverse.getLocallyFAIRRoleAssigneeIdentifiers().stream() + .map(roleAssigneeService::getRoleAssignee) + .filter(Objects::nonNull) + .collect(Collectors.toList()); + } + return locallyFAIRRoleAssigneesList; + } + + public void setLocallyFAIRRoleAssigneesList(List<RoleAssignee> assignees) { + locallyFAIRRoleAssigneesList = (assignees == null) ?
Collections.emptyList() : assignees; + dataverse.setLocallyFAIRRoleAssigneeIdentifiers( + locallyFAIRRoleAssigneesList.stream().map(RoleAssignee::getIdentifier).collect(Collectors.toSet())); + } } diff --git a/src/main/java/edu/harvard/iq/dataverse/DvObject.java b/src/main/java/edu/harvard/iq/dataverse/DvObject.java index 68ff739a77f..2b13e2a5cc7 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DvObject.java +++ b/src/main/java/edu/harvard/iq/dataverse/DvObject.java @@ -515,5 +515,23 @@ public void setStorageQuota(StorageQuota storageQuota) { @OneToMany(mappedBy = "definitionPoint",cascade={ CascadeType.REMOVE, CascadeType.MERGE,CascadeType.PERSIST}, orphanRemoval=true) List<RoleAssignment> roleAssignments; - + + /** Whether this object is locally FAIR, which is determined by whether it is in a Locally FAIR collection. + * @return {@code true} if this object is locally FAIR and not publicly visible, {@code false} otherwise. + */ + public boolean isLocallyFAIR() { + if (getOwner() != null) { + return getOwner().isLocallyFAIR(); + } else { + return false; + } + } + + public Set<String> getLocallyFAIRRoleAssigneeIdentifiers() { + if (getOwner() != null) { + return getOwner().getLocallyFAIRRoleAssigneeIdentifiers(); + } else { + return Collections.emptySet(); + } + } } diff --git a/src/main/java/edu/harvard/iq/dataverse/FilePage.java b/src/main/java/edu/harvard/iq/dataverse/FilePage.java index b08598b2fb8..09dc360e7be 100644 --- a/src/main/java/edu/harvard/iq/dataverse/FilePage.java +++ b/src/main/java/edu/harvard/iq/dataverse/FilePage.java @@ -221,15 +221,21 @@ public String init() { } } - // If this DatasetVersion is unpublished and permission is doesn't have permissions: - // > Go to the Login page - // // Check permissions - Boolean authorized = (fileMetadata.getDatasetVersion().isReleased()) - || (!fileMetadata.getDatasetVersion().isReleased() && this.canViewUnpublishedDataset()); - - if (!authorized) { - return permissionsWrapper.notAuthorized(); + DatasetVersion datasetVersion =
fileMetadata.getDatasetVersion(); + Dataset dataset = datasetVersion.getDataset(); + + // Check Locally FAIR permissions for released datasets + boolean releasedAndCanView = datasetVersion.isReleased() && (!file.isLocallyFAIR() || + permissionsWrapper.hasLocallyFAIRAccess(dvRequestService.getDataverseRequest(), file)); + + if (!releasedAndCanView && !canViewUnpublishedDataset()) { + // Return notFound for FAIR-restricted content, notAuthorized otherwise + if (file.isLocallyFAIR()) { + return permissionsWrapper.notFound(); + } else { + return permissionsWrapper.notAuthorized(); + } } //termsOfAccess = fileMetadata.getDatasetVersion().getTermsOfUseAndAccess().getTermsOfAccess(); diff --git a/src/main/java/edu/harvard/iq/dataverse/PermissionServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/PermissionServiceBean.java index 402a1b06e3c..bd91363d2bb 100644 --- a/src/main/java/edu/harvard/iq/dataverse/PermissionServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/PermissionServiceBean.java @@ -1064,5 +1064,39 @@ public List getEffectiveRoleAssignments(AuthenticatedUser user, return Stream.concat(directAssignments, groupAssignments) .collect(Collectors.toList()); } + + /** + * Determines if a user can view a dataset version based on its release status + * and the supplied Locally FAIR role assignees. 
+ * + * @param req The request containing the user and IP info (for IP groups) + * @param dvObject the dvObject that may have locallyFairAssignees + * @return true if the user has locally FAIR access + */ + public boolean hasLocallyFAIRAccess(DataverseRequest req, DvObject dvObject) { + Set<String> locallyFairAssignees = dvObject.getLocallyFAIRRoleAssigneeIdentifiers(); + // If there are no locally FAIR restrictions, the object is publicly viewable + if (locallyFairAssignees.isEmpty()) { + return true; + } + + // Check if user is in the locally FAIR assignee list + Set<RoleAssignee> userAndGroups = new HashSet<>(groupService.groupsFor(req)); + User user = req.getUser(); + if (user.isAuthenticated()) { + if (user.isSuperuser()) { + return true; + } + userAndGroups.add(user); + } + + for (RoleAssignee ra : userAndGroups) { + if (locallyFairAssignees.contains(ra.getIdentifier())) { + return true; + } + } + + return false; + } } diff --git a/src/main/java/edu/harvard/iq/dataverse/PermissionsWrapper.java b/src/main/java/edu/harvard/iq/dataverse/PermissionsWrapper.java index 2c6f8ff2fb1..9e358b92aa4 100644 --- a/src/main/java/edu/harvard/iq/dataverse/PermissionsWrapper.java +++ b/src/main/java/edu/harvard/iq/dataverse/PermissionsWrapper.java @@ -13,6 +13,7 @@ import edu.harvard.iq.dataverse.engine.command.impl.*; import java.util.HashMap; import java.util.Map; +import java.util.Set; import java.util.logging.Logger; import jakarta.ejb.EJB; import jakarta.faces.view.ViewScoped; @@ -55,6 +56,7 @@ public class PermissionsWrapper implements java.io.Serializable { private final Map<Long, Boolean> fileDownloadPermissionMap = new HashMap<>(); // { DvObject.id : Boolean } private final Map<String, Boolean> datasetPermissionMap = new HashMap<>(); // { Permission human_name : Boolean } + Boolean hasLocallyFAIRAccess; /** * Check if the current Dataset can Issue Commands * @@ -297,4 +299,12 @@ public String notAuthorized(){ public String notFound() { return navigationWrapper.notFound(); } + + // The locallyFAIRraIds should not change within a given view
(they are set in the parent Dataverse of whatever object the view is for) + public boolean hasLocallyFAIRAccess(DataverseRequest req, DvObject dvo) { + if (hasLocallyFAIRAccess == null) { + hasLocallyFAIRAccess = permissionService.hasLocallyFAIRAccess(req, dvo); + } + return hasLocallyFAIRAccess; + } } diff --git a/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java b/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java index 1c8984f47d3..2cf4155ab62 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java @@ -370,6 +370,20 @@ protected Dataverse findDataverseOrDie( String dvIdtf ) throws WrappedResponse { } return dv; } + /** Find a dataverse but filter according to the visibility from the locallyFAIRRoleAssignments + * + * @param dvIdtf - the dataverse identifier + * @param req - the DataverseRequest + * @return the dataverse if found and visible, otherwise throws WrappedResponse + * @throws WrappedResponse if dataverse is not found (in findDataverseOrDie()) or not visible + */ + protected Dataverse findDataverseUserCanSeeOrDie(String dvIdtf, DataverseRequest req) throws WrappedResponse { + Dataverse dataverse = findDataverseOrDie(dvIdtf); + if (dataverse.isLocallyFAIR() && !permissionSvc.hasLocallyFAIRAccess(req, dataverse)) { + throw new WrappedResponse(error(Response.Status.NOT_FOUND, "Can't find dataverse with identifier='" + dvIdtf + "'")); + } + return dataverse; + } protected Template findTemplateOrDie(Long templateId, Dataverse dataverse) throws WrappedResponse { @@ -461,6 +475,22 @@ protected Dataset findDatasetOrDie(String id, boolean deep) throws WrappedRespon return dataset; } + /** Find a dataset but filter according to the visibility from the locallyFAIRRoleAssignments + * + * @param id - the dataset identifier + * @param req - the DataverseRequest + * @param deep - whether to fetch the dataset deeply + * @return the dataset if found and
visible, otherwise throws WrappedResponse + * @throws WrappedResponse if dataset is not found (in findDatasetOrDie()) or not visible + */ + protected Dataset findDatasetUserCanSeeOrDie(String id, DataverseRequest req, boolean deep) throws WrappedResponse { + Dataset dataset = findDatasetOrDie(id, deep); + if (dataset.isLocallyFAIR() && !permissionSvc.hasLocallyFAIRAccess(req, dataset)) { + throw new WrappedResponse(notFound(BundleUtil.getStringFromBundle("find.dataset.error.dataset.not.found.id", Collections.singletonList(id)))); + } + return dataset; + } + protected DatasetVersion findDatasetVersionOrDie(final DataverseRequest req, String versionNumber, final Dataset ds, boolean includeDeaccessioned, boolean checkPermsWhenDeaccessioned) throws WrappedResponse { DatasetVersion dsv = execCommand(handleVersion(versionNumber, new Datasets.DsVersionHandler<Command<DatasetVersion>>() { @@ -538,6 +568,21 @@ protected DataFile findDataFileOrDie(String id) throws WrappedResponse { } } } + + /** Find a datafile but filter according to the visibility from the locallyFAIRRoleAssignments + * + * @param id - the datafile identifier + * @param req - the DataverseRequest + * @return the datafile if found and visible, otherwise throws WrappedResponse + * @throws WrappedResponse if datafile is not found (in findDataFileOrDie()) or not visible + */ + protected DataFile findDataFileUserCanSeeOrDie(String id, DataverseRequest req) throws WrappedResponse { + DataFile dataFile = findDataFileOrDie(id); + if (dataFile.isLocallyFAIR() && !permissionSvc.hasLocallyFAIRAccess(req, dataFile)) { + throw new WrappedResponse(notFound(BundleUtil.getStringFromBundle("find.datafile.error.datafile.not.found.id", Collections.singletonList(id)))); + } + return dataFile; + } protected DataverseRole findRoleOrDie(String id) throws WrappedResponse { DataverseRole role; diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Access.java b/src/main/java/edu/harvard/iq/dataverse/api/Access.java index f7654720b71..249f8eaf7a7 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Access.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Access.java @@ -140,19 +140,19 @@ public class Access extends AbstractApiBean { public BundleDownloadInstance datafileBundle(@Context ContainerRequestContext crc, @PathParam("fileId") String fileId, @QueryParam("fileMetadataId") Long fileMetadataId, @QueryParam("gbrecs") boolean gbrecs, @QueryParam("gbrids") String gbrids, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ { - - DataFile df = findDataFileOrDieWrapper(fileId); + GuestbookResponse gbr = null; + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + DataFile df = findDataFileUserCanSeeOrDieWrapper(fileId, req); // This will throw a ForbiddenException if access isn't authorized: - checkAuthorization(crc, df); - User requestor = getRequestor(crc); - if (checkGuestbookRequiredResponse(crc, uriInfo, df, gbrids)) { + checkAuthorization(req.getUser(), df); + if (checkGuestbookRequiredResponse(req.getUser(), uriInfo, df, gbrids)) { throw new BadRequestException(BundleUtil.getStringFromBundle("access.api.download.failure.guestbookResponseMissing", getGuestbookIdFromDatafile(df))); } if (gbrecs != true && df.isReleased()) { // Write Guestbook record if not done previously and file is released - GuestbookResponse gbr = guestbookResponseService.initAPIGuestbookResponse(df.getOwner(), df, session, requestor); + gbr = guestbookResponseService.initAPIGuestbookResponse(df.getOwner(), df, session, getRequestor(req.getUser())); guestbookResponseService.save(gbr); MakeDataCountEntry entry = new MakeDataCountEntry(uriInfo, headers, dvRequestService, df); mdcLogService.logEntry(entry); @@ -212,15 +212,15 @@ public BundleDownloadInstance datafileBundleWithGuestbookResponse(@Context Conta return datafileBundle(crc, fileId, fileMetadataId, 
gbrecs, gbrids, uriInfo, headers, response); } - //Added a wrapper method since the original method throws a wrapped response + //Added a wrapper method since the original method throws a wrapped response //the access methods return files instead of responses so we convert to a WebApplicationException - private DataFile findDataFileOrDieWrapper(String fileId){ + private DataFile findDataFileUserCanSeeOrDieWrapper(String fileId, DataverseRequest req){ DataFile df = null; try { - df = findDataFileOrDie(fileId); + df = findDataFileUserCanSeeOrDie(fileId, req); } catch (WrappedResponse ex) { logger.warning("Access: datafile service could not locate a DataFile object for id "+fileId+"!"); logger.warning(ex.getWrappedMessageWhenJson()); @@ -238,8 +238,9 @@ public Response datafile(@Context ContainerRequestContext crc, @PathParam("fileI @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ { fileId = normalizeFileId(fileId); - - DataFile df = findDataFileOrDieWrapper(fileId); + + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + DataFile df = findDataFileUserCanSeeOrDieWrapper(fileId, req); GuestbookResponse gbr = null; if (df.isHarvested()) { @@ -249,15 +250,14 @@ public Response datafile(@Context ContainerRequestContext crc, @PathParam("fileI } // This will throw a ForbiddenException if access isn't authorized: - checkAuthorization(crc, df); - User requestor = getRequestor(crc); - if (checkGuestbookRequiredResponse(crc, uriInfo, df, gbrids)) { + checkAuthorization(req.getUser(), df); + if (checkGuestbookRequiredResponse(req.getUser(), uriInfo, df, gbrids)) { return error(BAD_REQUEST, BundleUtil.getStringFromBundle("access.api.download.failure.guestbookResponseMissing", getGuestbookIdFromDatafile(df))); } if (gbrecs != true && df.isReleased()){ // Write Guestbook record if not done previously and 
file is released - gbr = guestbookResponseService.initAPIGuestbookResponse(df.getOwner(), df, session, requestor); + gbr = guestbookResponseService.initAPIGuestbookResponse(df.getOwner(), df, session, getRequestor(req.getUser())); } DownloadInfo dInfo = new DownloadInfo(df); @@ -324,7 +324,7 @@ public Response datafile(@Context ContainerRequestContext crc, @PathParam("fileI String token = variableIdParams[i].replaceFirst("^v", ""); Long variableId = null; try { - variableId = new Long(token); + variableId = Long.parseLong(token); } catch (NumberFormatException nfe) { variableId = null; } @@ -428,7 +428,7 @@ private Response processDatafileWithGuestbookResponse(ContainerRequestContext cr // since all files must be in the same Dataset we can generate a Guestbook Response once and just replace the DataFile for each file in the list DataFile firstDatafile = datafilesMap.values().size() > 0 ? (DataFile) Arrays.stream(datafilesMap.values().toArray()).findFirst().get() : null; GuestbookResponse gbr = getGuestbookResponseFromBody(firstDatafile, GuestbookResponse.DOWNLOAD, jsonBody, user); - boolean guestbookResponseRequired = checkGuestbookRequiredResponse(crc, uriInfo, firstDatafile, null); + boolean guestbookResponseRequired = checkGuestbookRequiredResponse(user, uriInfo, firstDatafile, null); for (DataFile df : datafilesMap.values()) { displayName = df.getDisplayName(); datasetId = df.getOwner().getId(); @@ -458,12 +458,13 @@ private Response processDatafileWithGuestbookResponse(ContainerRequestContext cr } private Map getDatafilesMap(ContainerRequestContext crc, String fileIds) { + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); String fileIdParams[] = getFileIdsCSV(fileIds); Map datafilesMap = new HashMap<>(); // Get and validate all the DataFiles first if (fileIdParams != null && fileIdParams.length > 0) { for (int i = 0; i < fileIdParams.length; i++) { - DataFile df = findDataFileOrDieWrapper(fileIdParams[i]); + DataFile df = 
findDataFileUserCanSeeOrDieWrapper(fileIdParams[i], req); if (df.isHarvested()) { String errorMessage = "Datafile " + df.getId() + " is a harvested file that cannot be accessed in this Dataverse"; @@ -472,7 +473,7 @@ private Map getDatafilesMap(ContainerRequestContext crc, String } // This will throw a ForbiddenException if access isn't authorized: - checkAuthorization(crc, df); + checkAuthorization(req.getUser(), df); datafilesMap.put(df.getId(), df); } @@ -536,10 +537,10 @@ public String tabularDatafileMetadata(@Context ContainerRequestContext crc, @Pat public String tabularDatafileMetadataDDI(@Context ContainerRequestContext crc, @PathParam("fileId") String fileId, @QueryParam("fileMetadataId") Long fileMetadataId, @QueryParam("exclude") String exclude, @QueryParam("include") String include, @Context HttpHeaders header, @Context HttpServletResponse response) throws NotFoundException, ServiceUnavailableException /*, PermissionDeniedException, AuthorizationRequiredException*/ { String retValue = ""; - DataFile dataFile = null; + DataFile dataFile = null; - - dataFile = findDataFileOrDieWrapper(fileId); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + dataFile = findDataFileUserCanSeeOrDieWrapper(fileId, req); if (!dataFile.isTabularData()) { throw new BadRequestException("tabular data required"); @@ -549,13 +550,12 @@ public String tabularDatafileMetadataDDI(@Context ContainerRequestContext crc, @ } if (dataFile.isRestricted() || FileUtil.isActivelyEmbargoed(dataFile)) { boolean hasPermissionToDownloadFile = false; - DataverseRequest dataverseRequest; - dataverseRequest = createDataverseRequest(getRequestUser(crc)); - if (dataverseRequest != null && dataverseRequest.getUser() instanceof GuestUser) { + + if (req != null && req.getUser() instanceof GuestUser) { // We must be in the UI. Try to get a non-GuestUser from the session. 
- dataverseRequest = dvRequestService.getDataverseRequest(); + req = dvRequestService.getDataverseRequest(); } - hasPermissionToDownloadFile = permissionService.requestOn(dataverseRequest, dataFile).has(Permission.DownloadFile); + hasPermissionToDownloadFile = permissionService.requestOn(req, dataFile).has(Permission.DownloadFile); if (!hasPermissionToDownloadFile) { throw new BadRequestException("no permission to download file"); } @@ -632,7 +632,9 @@ public Response listDatafileMetadataAuxByOrigin(@Context ContainerRequestContext } private Response listAuxiliaryFiles(User user, String fileId, String origin, UriInfo uriInfo, HttpHeaders headers, HttpServletResponse response) { - DataFile df = findDataFileOrDieWrapper(fileId); + + DataverseRequest req = createDataverseRequest(user); + DataFile df = findDataFileUserCanSeeOrDieWrapper(fileId, req); List auxFileList = auxiliaryFileService.findAuxiliaryFiles(df, origin); @@ -672,8 +674,9 @@ public DownloadInstance downloadAuxiliaryFile(@Context ContainerRequestContext c @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) throws ServiceUnavailableException { - - DataFile df = findDataFileOrDieWrapper(fileId); + + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + DataFile df = findDataFileUserCanSeeOrDieWrapper(fileId, req); DownloadInfo dInfo = new DownloadInfo(df); boolean publiclyAvailable = false; @@ -724,7 +727,7 @@ public DownloadInstance downloadAuxiliaryFile(@Context ContainerRequestContext c // as defined for the DataFile itself), and will throw a ForbiddenException // if access is denied: if (!publiclyAvailable) { - checkAuthorization(crc, df); + checkAuthorization(req.getUser(), df); } return downloadInstance; @@ -763,7 +766,7 @@ public Response downloadAllFromLatest(@Context ContainerRequestContext crc, @Pat try { User user = getRequestUser(crc); DataverseRequest req = createDataverseRequest(user); - final Dataset retrieved = 
findDatasetOrDie(datasetIdOrPersistentId); + final Dataset retrieved = findDatasetUserCanSeeOrDie(datasetIdOrPersistentId, req, false); if (!(user instanceof GuestUser)) { // The reason we are only looking up a draft version for a NON-guest user // is that we know that guest never has the Permission.ViewUnpublishedDataset. @@ -839,15 +842,15 @@ public Response downloadAllFromVersion(@Context ContainerRequestContext crc, @Pa try { DatasetVersion dsv = getDatasetVersionFromVersion(crc, datasetIdOrPersistentId, versionId); if (dsv == null) { - // (A "Not Found" would be more appropriate here, I believe, than a "Bad Request". - // But we've been using the latter for a while, and it's a popular API... - // and this return code is expected by our tests - so I'm choosing it to keep + // (A "Not Found" would be more appropriate here, I believe, than a "Bad Request". + // But we've been using the latter for a while, and it's a popular API... + // and this return code is expected by our tests - so I'm choosing it to keep // -- L.A.) 
return error(BAD_REQUEST, BundleUtil.getStringFromBundle("access.api.exception.version.not.found")); } String fileIds = getFileIdsAsCommaSeparated(dsv.getFileMetadatas()); - // We don't want downloads from Draft versions to be counted, - // so we are setting the gbrecs (aka "do not write guestbook response") + // We don't want downloads from Draft versions to be counted, + // so we are setting the gbrecs (aka "do not write guestbook response") // variable accordingly: if (dsv.isDraft()) { gbrecs = true; @@ -877,7 +880,7 @@ public Response downloadAllFromVersionWithGuestbookResponse(@Context ContainerRe private DatasetVersion getDatasetVersionFromVersion(ContainerRequestContext crc, String datasetIdOrPersistentId, String versionId) throws WrappedResponse { DataverseRequest req = createDataverseRequest(getRequestUser(crc)); - final Dataset ds = execCommand(new GetDatasetCommand(req, findDatasetOrDie(datasetIdOrPersistentId))); + final Dataset ds = execCommand(new GetDatasetCommand(req, findDatasetUserCanSeeOrDie(datasetIdOrPersistentId, req, false))); return execCommand(handleVersion(versionId, new Datasets.DsVersionHandler<>() { @Override @@ -995,7 +998,8 @@ private Response downloadDatafiles(ContainerRequestContext crc, String body, boo String customZipServiceUrl = settingsService.getValueForKey(SettingsServiceBean.Key.CustomZipDownloadServiceUrl); boolean useCustomZipService = customZipServiceUrl != null; - User user = getRequestor(crc); + User user = getRequestor(getRequestUser(crc)); + DataverseRequest req = createDataverseRequest(user); Boolean getOrig = false; for (String key : uriInfo.getQueryParameters().keySet()) { @@ -1012,10 +1016,10 @@ private Response downloadDatafiles(ContainerRequestContext crc, String body, boo Set datasetIds = new HashSet<>(); Boolean guestbookResponseRequired = null; for (int i = 0; i < fileIdParams.length; i++) { - DataFile df = findDataFileOrDieWrapper(fileIdParams[i]); + DataFile df = 
findDataFileUserCanSeeOrDieWrapper(fileIdParams[i], req); if (guestbookResponseRequired == null) { // Only need to check this on the first file - guestbookResponseRequired = checkGuestbookRequiredResponse(crc, uriInfo, df, gbrids); + guestbookResponseRequired = checkGuestbookRequiredResponse(user, uriInfo, df, gbrids); } datafilesMap.put(df.getId(), df); datasetIds.add(df.getOwner() != null ? df.getOwner().getId() : 0L); @@ -1048,6 +1052,7 @@ private Response downloadDatafiles(ContainerRequestContext crc, String body, boo if (useCustomZipService) { URI redirect_uri = null; try { + //ToDo - make external Zipper LocallyFAIR aware redirect_uri = handleCustomZipDownload(user, customZipServiceUrl, fileIdParams, uriInfo, headers, donotwriteGBResponse, true); } catch (WebApplicationException wae) { throw wae; } @@ -1192,13 +1197,14 @@ public InputStream tempPreview(@PathParam("fileSystemId") String fileSystemId, @ @Path("fileCardImage/{fileId}") @GET @Produces({ "image/png" }) - public InputStream fileCardImage(@PathParam("fileId") Long fileId, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ { + @AuthRequired + public InputStream fileCardImage(@Context ContainerRequestContext crc, @PathParam("fileId") Long fileId, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ { DataFile df = dataFileService.find(fileId); - - if (df == null) { + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + if (df == null || (df.isLocallyFAIR() && !permissionSvc.hasLocallyFAIRAccess(req, df))) { logger.warning("Preview: datafile service could not locate a DataFile object for id "+fileId+"!"); return null; } @@ -1226,71 +1232,17 @@ public InputStream
fileCardImage(@PathParam("fileId") Long fileId, @Context UriI return null; } - - // Note: - // the Dataverse page is no longer using this method. - @Path("dsCardImage/{versionId}") - @GET - @Produces({ "image/png" }) - public InputStream dsCardImage(@PathParam("versionId") Long versionId, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ { - - - DatasetVersion datasetVersion = versionService.find(versionId); - - if (datasetVersion == null) { - logger.warning("Preview: Version service could not locate a DatasetVersion object for id "+versionId+"!"); - return null; - } - - //String imageThumbFileName = null; - StorageIO thumbnailDataAccess = null; - - // First, check if this dataset has a designated thumbnail image: - - if (datasetVersion.getDataset() != null) { - - DataFile logoDataFile = datasetVersion.getDataset().getThumbnailFile(); - if (logoDataFile != null) { - - try { - StorageIO dataAccess = logoDataFile.getStorageIO(); - if (dataAccess != null) { // && dataAccess.isLocalFile()) { - dataAccess.open(); - thumbnailDataAccess = ImageThumbConverter.getImageThumbnailAsInputStream(dataAccess, 48); - } - if (thumbnailDataAccess != null && thumbnailDataAccess.getInputStream() != null) { - return thumbnailDataAccess.getInputStream(); - } - } catch (IOException ioEx) { - thumbnailDataAccess = null; - } - } - - - - // If not, we'll try to use one of the files in this dataset version: - /* - if (thumbnailDataAccess == null) { - - if (!datasetVersion.getDataset().isHarvested()) { - thumbnailDataAccess = getThumbnailForDatasetVersion(datasetVersion); - } - }*/ - - } - return null; - } - @Path("dvCardImage/{dataverseId}") @GET @Produces({ "image/png" }) - public InputStream dvCardImage(@PathParam("dataverseId") Long dataverseId, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse 
response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ { + @AuthRequired + public InputStream dvCardImage(@Context ContainerRequestContext crc, @PathParam("dataverseId") Long dataverseId, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ { logger.fine("entering dvCardImage"); Dataverse dataverse = dataverseService.find(dataverseId); - - if (dataverse == null) { + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + if (dataverse == null || (dataverse.isLocallyFAIR() && !permissionService.hasLocallyFAIRAccess(req, dataverse))) { logger.warning("Preview: Version service could not locate a DatasetVersion object for id "+dataverseId+"!"); return null; } @@ -1442,7 +1394,6 @@ private String getWebappImageResource(String imageName) { * @param isPublic * @param type * @param fileInputStream - * @param contentDispositionHeader * @param formDataBodyPart * @return * @@ -1511,12 +1462,7 @@ public Response saveAuxiliaryFileWithVersion(@Context ContainerRequestContext cr * @param fileId * @param formatTag * @param formatVersion - * @param origin - * @param isPublic - * @param fileInputStream - * @param contentDispositionHeader - * @param formDataBodyPart - * @return + * @return */ @DELETE @AuthRequired @@ -1688,21 +1634,22 @@ public Response requestFileAccess(@Context ContainerRequestContext crc public Response listFileAccessRequests(@Context ContainerRequestContext crc, @PathParam("id") String fileToRequestAccessId, @Context HttpHeaders headers) { DataverseRequest dataverseRequest; + try { + dataverseRequest = createDataverseRequest(getRequestAuthenticatedUserOrDie(crc)); + } catch (WrappedResponse wr) { + List args = Arrays.asList(wr.getLocalizedMessage()); + return error(UNAUTHORIZED, 
BundleUtil.getStringFromBundle("access.api.fileAccess.failure.noUser", args)); + } DataFile dataFile; try { - dataFile = findDataFileOrDie(fileToRequestAccessId); + dataFile = findDataFileUserCanSeeOrDie(fileToRequestAccessId, dataverseRequest); } catch (WrappedResponse ex) { List args = Arrays.asList(fileToRequestAccessId); return error(BAD_REQUEST, BundleUtil.getStringFromBundle("access.api.requestList.fileNotFound", args)); } - try { - dataverseRequest = createDataverseRequest(getRequestAuthenticatedUserOrDie(crc)); - } catch (WrappedResponse wr) { - List args = Arrays.asList(wr.getLocalizedMessage()); - return error(UNAUTHORIZED, BundleUtil.getStringFromBundle("access.api.fileAccess.failure.noUser", args)); - } + if (!(dataverseRequest.getAuthenticatedUser().isSuperuser() || permissionService.requestOn(dataverseRequest, dataFile).has(Permission.ManageFilePermissions))) { return error(FORBIDDEN, BundleUtil.getStringFromBundle("access.api.rejectAccess.failure.noPermissions")); } @@ -1918,8 +1865,9 @@ public Response getUserFileAccessRequested(@Context ContainerRequestContext crc, DataFile dataFile; AuthenticatedUser requestAuthenticatedUser; try { - dataFile = findDataFileOrDie(dataFileId); requestAuthenticatedUser = getRequestAuthenticatedUserOrDie(crc); + DataverseRequest req = createDataverseRequest(requestAuthenticatedUser); + dataFile = findDataFileUserCanSeeOrDie(dataFileId, req); } catch (WrappedResponse wr) { return wr.getResponse(); } @@ -1940,7 +1888,8 @@ public Response getUserFileAccessRequested(@Context ContainerRequestContext crc, public Response getUserPermissionsOnFile(@Context ContainerRequestContext crc, @PathParam("id") String dataFileId) { DataFile dataFile; try { - dataFile = findDataFileOrDie(dataFileId); + DataverseRequest req = createDataverseRequest(getRequestAuthenticatedUserOrDie(crc)); + dataFile = findDataFileUserCanSeeOrDie(dataFileId, req); } catch (WrappedResponse wr) { return wr.getResponse(); } @@ -1952,12 +1901,12 @@ public 
Response getUserPermissionsOnFile(@Context ContainerRequestContext crc, @ return ok(jsonObjectBuilder); } - private boolean checkGuestbookRequiredResponse(ContainerRequestContext crc, UriInfo uriInfo, DataFile df, String gbrids) throws WebApplicationException { + private boolean checkGuestbookRequiredResponse(User user, UriInfo uriInfo, DataFile df, String gbrids) throws WebApplicationException { // Check if guestbook response is required boolean required = df.getOwner().hasEnabledGuestbook(); boolean wasWrittenInPost = false; if (required) { - User requestor = getRequestor(crc); + User requestor = getRequestor(user); if (requestor instanceof AuthenticatedUser && permissionService.userOn(requestor, df.getOwner()).has(Permission.EditDataset)) { required = false; } @@ -2000,14 +1949,13 @@ private GuestbookResponse getGuestbookResponseFromBody(DataFile dataFile, String // checkAuthorization is a convenience method; it calls the boolean method // isAccessAuthorized(), the actual workhorse, and throws a 403 exception if not. - private void checkAuthorization(ContainerRequestContext crc, DataFile df) throws WebApplicationException { - User user = getRequestor(crc); + private void checkAuthorization(User initialUser, DataFile df) throws WebApplicationException { + User user = getRequestor(initialUser); if (!isAccessAuthorized(user, df)) { throw new ForbiddenException(); } } - private User getRequestor(ContainerRequestContext crc) { - User user = getRequestUser(crc); + private User getRequestor(User user) { // CompoundAuthMechanism should find the user by API Key/Token, Workflow, etc. 
And for SPA the Bearer Token // For JSF check if CompoundAuthMechanism couldn't find the user then try to get it from the session if (session!=null && user instanceof GuestUser) { diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java b/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java index 136b6dbb69b..a6cfe50af91 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java @@ -221,7 +221,7 @@ public interface DsVersionHandler { @Path("{id}") public Response getDataset(@Context ContainerRequestContext crc, @PathParam("id") String id, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response, @QueryParam("returnOwners") boolean returnOwners) { return response( req -> { - final Dataset retrieved = execCommand(new GetDatasetCommand(req, findDatasetOrDie(id, true))); + final Dataset retrieved = execCommand(new GetDatasetCommand(req, findDatasetUserCanSeeOrDie(id, req, true))); final DatasetVersion latest = execCommand(new GetLatestAccessibleDatasetVersionCommand(req, retrieved)); final JsonObjectBuilder jsonbuilder = json(retrieved, returnOwners); //Report MDC if this is a released version (could be draft if user has access, or user may not have access at all and is not getting metadata beyond the minimum) @@ -262,7 +262,7 @@ public Response exportDataset(@Context ContainerRequestContext crc, @QueryParam( // Trying to get version 1.0 for a dataset that's already at 3.0, for example, is not supported. 
if (!datasetVersion.isDraft() && versionId != null) { - Command cmd = new GetLatestPublishedDatasetVersionCommand(dvRequestService.getDataverseRequest(), dataset); + Command cmd = new GetLatestPublishedDatasetVersionCommand(req, dataset); DatasetVersion latestPublishedVersion = commandEngine.submit(cmd); if (latestPublishedVersion == null) { return error(BAD_REQUEST, "Non-draft version requested but for published versions only the latest (" + DS_VERSION_LATEST_PUBLISHED + ") is supported."); @@ -462,7 +462,7 @@ public Response useDefaultCitationDate(@Context ContainerRequestContext crc, @Pa public Response listVersions(@Context ContainerRequestContext crc, @PathParam("id") String id, @QueryParam("excludeFiles") Boolean excludeFiles,@QueryParam("excludeMetadataBlocks") Boolean excludeMetadataBlocks, @QueryParam("limit") Integer limit, @QueryParam("offset") Integer offset) { return response( req -> { - Dataset dataset = findDatasetOrDie(id); + Dataset dataset = findDatasetUserCanSeeOrDie(id, req, false); Boolean deepLookup = excludeFiles == null ? true : !excludeFiles; Boolean includeMetadataBlocks = excludeMetadataBlocks == null ? true : !excludeMetadataBlocks; @@ -494,7 +494,7 @@ public Response getVersion(@Context ContainerRequestContext crc, //If excludeFiles is null the default is to provide the files and because of this we need to check permissions. 
boolean checkPerms = includeFiles; - Dataset dataset = findDatasetOrDie(datasetId); + Dataset dataset = findDatasetUserCanSeeOrDie(datasetId, req, false); DatasetVersion requestedDatasetVersion = getDatasetVersionOrDie(req, versionId, dataset, @@ -542,7 +542,7 @@ public Response getVersionFiles(@Context ContainerRequestContext crc, @Context UriInfo uriInfo, @Context HttpHeaders headers) { return response(req -> { - DatasetVersion datasetVersion = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId, false), uriInfo, headers, includeDeaccessioned); + DatasetVersion datasetVersion = getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(datasetId, req,false), uriInfo, headers, includeDeaccessioned); DatasetVersionFilesServiceBean.FileOrderCriteria fileOrderCriteria; try { fileOrderCriteria = orderCriteria != null ? DatasetVersionFilesServiceBean.FileOrderCriteria.valueOf(orderCriteria) : DatasetVersionFilesServiceBean.FileOrderCriteria.NameAZ; @@ -593,7 +593,7 @@ public Response getVersionFileCounts(@Context ContainerRequestContext crc, } catch (IllegalArgumentException e) { return badRequest(BundleUtil.getStringFromBundle("datasets.api.version.files.invalid.access.status", List.of(accessStatus))); } - DatasetVersion datasetVersion = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId), uriInfo, headers, includeDeaccessioned, false); + DatasetVersion datasetVersion = getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers, includeDeaccessioned, false); JsonObjectBuilder jsonObjectBuilder = Json.createObjectBuilder(); jsonObjectBuilder.add("total", datasetVersionFilesServiceBean.getFileMetadataCount(datasetVersion, fileSearchCriteria)); jsonObjectBuilder.add("perContentType", json(datasetVersionFilesServiceBean.getFileMetadataCountPerContentType(datasetVersion, fileSearchCriteria))); @@ -614,7 +614,8 @@ public Response getDownloadCountByDatasetId(@Context ContainerRequestContext 
crc Long count; LocalDate date = includeMDC == null || !includeMDC ? getMDCStartDate() : null; try { - Dataset ds = findDatasetOrDie(datasetId); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataset ds = findDatasetUserCanSeeOrDie(datasetId, req, false); id = ds.getId(); count = guestbookResponseService.getDownloadCountByDatasetId(id, date); } catch (WrappedResponse wr) { @@ -651,7 +652,7 @@ public Response getFileAccessFolderView(@Context ContainerRequestContext crc, @P DatasetVersion version; try { DataverseRequest req = createDataverseRequest(getRequestUser(crc)); - version = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId), uriInfo, headers); + version = getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers); } catch (WrappedResponse wr) { return wr.getResponse(); } @@ -682,7 +683,7 @@ public Response getFileAccessFolderView(@Context ContainerRequestContext crc, @P public Response getVersionMetadata(@Context ContainerRequestContext crc, @PathParam("id") String datasetId, @PathParam("versionId") String versionId, @Context UriInfo uriInfo, @Context HttpHeaders headers) { return response( req -> ok( jsonByBlocks( - getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId), uriInfo, headers ) + getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers ) .getDatasetFields())), getRequestUser(crc)); } @@ -697,7 +698,7 @@ public Response getVersionMetadataBlock(@Context ContainerRequestContext crc, @Context HttpHeaders headers) { return response( req -> { - DatasetVersion dsv = getDatasetVersionOrDie(req, versionNumber, findDatasetOrDie(datasetId), uriInfo, headers ); + DatasetVersion dsv = getDatasetVersionOrDie(req, versionNumber, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers ); Map> fieldsByBlock = DatasetField.groupByBlock(dsv.getDatasetFields()); for ( Map.Entry> p : 
fieldsByBlock.entrySet() ) { @@ -727,7 +728,7 @@ public Response getLinkset(@Context ContainerRequestContext crc, @PathParam("id" } DataverseRequest req = createDataverseRequest(getRequestUser(crc)); try { - DatasetVersion dsv = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId), uriInfo, headers); + DatasetVersion dsv = getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers); return Response .ok(Json.createObjectBuilder() .add("linkset", @@ -882,7 +883,7 @@ public Response updateDraftVersion(@Context ContainerRequestContext crc, String public Response getVersionJsonLDMetadata(@Context ContainerRequestContext crc, @PathParam("id") String id, @PathParam("versionId") String versionId, @Context UriInfo uriInfo, @Context HttpHeaders headers) { try { DataverseRequest req = createDataverseRequest(getRequestUser(crc)); - DatasetVersion dsv = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(id), uriInfo, headers); + DatasetVersion dsv = getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(id, req, false), uriInfo, headers); OREMap ore = new OREMap(dsv, settingsService.isTrueForKey(SettingsServiceBean.Key.ExcludeEmailFromExport, false)); return ok(ore.getOREMapBuilder(true)); @@ -2128,10 +2129,12 @@ public Response linkDataset(@Context ContainerRequestContext crc, @PathParam("li @Path("{id}/versions/{versionId}/customlicense") public Response getCustomTermsTab(@PathParam("id") String id, @PathParam("versionId") String versionId, @Context UriInfo uriInfo, @Context HttpHeaders headers) { + //ToDo - should this use @AuthRequired and get the user from the crc? 
User user = session.getUser(); String persistentId; try { - if (DatasetUtil.getLicense(getDatasetVersionOrDie(createDataverseRequest(user), versionId, findDatasetOrDie(id), uriInfo, headers)) != null) { + DataverseRequest req = createDataverseRequest(user); + if (DatasetUtil.getLicense(getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(id, req, false), uriInfo, headers)) != null) { return error(Status.NOT_FOUND, "This Dataset has no custom license"); } persistentId = getRequestParameter(":persistentId".substring(1)); @@ -2152,7 +2155,8 @@ public Response getCustomTermsTab(@PathParam("id") String id, @PathParam("versio public Response getLinks(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied ) { try { User u = getRequestUser(crc); - Dataset dataset = findDatasetOrDie(idSupplied); + DataverseRequest req = createDataverseRequest(u); + Dataset dataset = findDatasetUserCanSeeOrDie(idSupplied, req, false); if (!dataset.isReleased() && !permissionService.hasPermissionsFor(u, dataset, EnumSet.of(Permission.ViewUnpublishedDataset))) { return error(Response.Status.FORBIDDEN, "User is not allowed to list the link(s) of this dataset"); @@ -2257,7 +2261,7 @@ public Response deleteAssignment(@Context ContainerRequestContext crc, @PathPara public Response getAssignments(@Context ContainerRequestContext crc, @PathParam("identifier") String id) { return response(req -> ok(execCommand( - new ListRoleAssignments(req, findDatasetOrDie(id))) + new ListRoleAssignments(req, findDatasetUserCanSeeOrDie(id, req, false))) .stream().map(ra -> json(ra)).collect(toJsonArray())), getRequestUser(crc)); } @@ -2290,7 +2294,7 @@ public Response deletePrivateUrl(@Context ContainerRequestContext crc, @PathPara @Path("{id}/previewUrl") public Response getPreviewUrlData(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied) { return response( req -> { - PrivateUrl privateUrl = execCommand(new GetPrivateUrlCommand(req,
findDatasetOrDie(idSupplied))); + PrivateUrl privateUrl = execCommand(new GetPrivateUrlCommand(req, findDatasetUserCanSeeOrDie(idSupplied, req, false))); return (privateUrl != null) ? ok(json(privateUrl)) : error(Response.Status.NOT_FOUND, "Private URL not found."); }, getRequestUser(crc)); @@ -2330,7 +2334,8 @@ public Response deletePreviewUrl(@Context ContainerRequestContext crc, @PathPara @Path("{id}/thumbnail/candidates") public Response getDatasetThumbnailCandidates(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied) { try { - Dataset dataset = findDatasetOrDie(idSupplied); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataset dataset = findDatasetUserCanSeeOrDie(idSupplied, req, false); boolean canUpdateThumbnail = false; canUpdateThumbnail = permissionSvc.requestOn(createDataverseRequest(getRequestUser(crc)), dataset).canIssue(UpdateDatasetThumbnailCommand.class); if (!canUpdateThumbnail) { @@ -2360,9 +2365,11 @@ public Response getDatasetThumbnailCandidates(@Context ContainerRequestContext c @GET @Produces({"image/png"}) @Path("{id}/thumbnail") - public Response getDatasetThumbnail(@PathParam("id") String idSupplied) { + @AuthRequired + public Response getDatasetThumbnail(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied) { try { - Dataset dataset = findDatasetOrDie(idSupplied); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataset dataset = findDatasetUserCanSeeOrDie(idSupplied, req, false); InputStream is = DatasetUtil.getThumbnailAsInputStream(dataset, ImageThumbConverter.DEFAULT_CARDIMAGE_SIZE); if(is == null) { return notFound("Thumbnail not available"); @@ -2376,9 +2383,11 @@ public Response getDatasetThumbnail(@PathParam("id") String idSupplied) { @GET @Produces({ "image/png" }) @Path("{id}/logo") - public Response getDatasetLogo(@PathParam("id") String idSupplied) { + @AuthRequired + public Response getDatasetLogo(@Context ContainerRequestContext crc, 
@PathParam("id") String idSupplied) { try { - Dataset dataset = findDatasetOrDie(idSupplied); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataset dataset = findDatasetUserCanSeeOrDie(idSupplied, req, false); InputStream is = DatasetUtil.getLogoAsInputStream(dataset); if (is == null) { return notFound("Logo not available"); @@ -2446,8 +2455,10 @@ public Response getRsync(@Context ContainerRequestContext crc, @PathParam("ident } Dataset dataset = null; try { - dataset = findDatasetOrDie(id); + AuthenticatedUser user = getRequestAuthenticatedUserOrDie(crc); + DataverseRequest req = createDataverseRequest(user); + dataset = findDatasetUserCanSeeOrDie(id, req, false); ScriptRequestResponse scriptRequestResponse = execCommand(new RequestRsyncScriptCommand(createDataverseRequest(user), dataset)); DatasetLock lock = datasetService.addDatasetLock(dataset.getId(), DatasetLock.Reason.DcmUpload, user.getId(), "script downloaded"); @@ -2639,7 +2650,8 @@ public Response getAvailableFileCategories(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied) { try { - Dataset ds = findDatasetOrDie(idSupplied); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataset ds = findDatasetUserCanSeeOrDie(idSupplied, req, false); List<String> datasetFileCategories = dataFileCategoryService.mergeDatasetFileCategories(ds.getCategories()); JsonArrayBuilder fileCategoriesArrayBuilder = Json.createArrayBuilder(); for (String fieldName : datasetFileCategories) { @@ -2661,9 +2673,11 @@ public Response getCurationStatus(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, @QueryParam("includeHistory") boolean includeHistory) { try { - Dataset ds = findDatasetOrDie(idSupplied); - DatasetVersion dsv = ds.getLatestVersion(); User user = getRequestUser(crc); + DataverseRequest req = createDataverseRequest(user); + Dataset ds = findDatasetUserCanSeeOrDie(idSupplied, req, false); + DatasetVersion dsv = ds.getLatestVersion(); + 
boolean canSeeStatus = false; // Check if curation labels should be shown to all users @@ -2753,11 +2767,12 @@ public Response deleteCurationStatus(@Context ContainerRequestContext crc, @Path @Path("{id}/uploadurls") public Response getMPUploadUrls(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, @QueryParam("size") long fileSize) { try { - Dataset dataset = findDatasetOrDie(idSupplied); + User user = getRequestUser(crc); + DataverseRequest req = createDataverseRequest(user); + Dataset dataset = findDatasetUserCanSeeOrDie(idSupplied, req, false); boolean canUpdateDataset = false; - User user = getRequestUser(crc); - canUpdateDataset = permissionSvc.requestOn(createDataverseRequest(user), dataset) + canUpdateDataset = permissionSvc.requestOn(req, dataset) .canIssue(UpdateDatasetVersionCommand.class); if (!canUpdateDataset) { return error(Response.Status.FORBIDDEN, "You are not permitted to upload files to this dataset."); @@ -3147,16 +3162,16 @@ public Response addFileToDataset(@Context ContainerRequestContext crc, public Response cleanStorage(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, @QueryParam("dryrun") Boolean dryrun) { // get user and dataset User authUser = getRequestUser(crc); - + DataverseRequest req = createDataverseRequest(authUser); Dataset dataset; try { - dataset = findDatasetOrDie(idSupplied); + dataset = findDatasetUserCanSeeOrDie(idSupplied, req, false); } catch (WrappedResponse wr) { return wr.getResponse(); } // check permissions - if (!permissionSvc.permissionsFor(createDataverseRequest(authUser), dataset).contains(Permission.EditDataset)) { + if (!permissionSvc.permissionsFor(req, dataset).contains(Permission.EditDataset)) { return error(Response.Status.INTERNAL_SERVER_ERROR, "Access denied!"); } @@ -3192,8 +3207,9 @@ public Response getCompareVersions(@Context ContainerRequestContext crc, @PathPa @Context UriInfo uriInfo, @Context HttpHeaders headers) { try { DataverseRequest req = 
createDataverseRequest(getRequestUser(crc)); - DatasetVersion dsv1 = getDatasetVersionOrDie(req, versionId1, findDatasetOrDie(id), uriInfo, headers, includeDeaccessioned); - DatasetVersion dsv2 = getDatasetVersionOrDie(req, versionId2, findDatasetOrDie(id), uriInfo, headers, includeDeaccessioned); + Dataset ds = findDatasetUserCanSeeOrDie(id, req, false); + DatasetVersion dsv1 = getDatasetVersionOrDie(req, versionId1, ds, uriInfo, headers, includeDeaccessioned); + DatasetVersion dsv2 = getDatasetVersionOrDie(req, versionId2, ds, uriInfo, headers, includeDeaccessioned); if (dsv1.getCreateTime().getTime() > dsv2.getCreateTime().getTime()) { return error(BAD_REQUEST, BundleUtil.getStringFromBundle("dataset.version.compare.incorrect.order")); } @@ -3212,7 +3228,7 @@ public Response getCompareVersionsSummary(@Context ContainerRequestContext crc, @QueryParam("offset") Integer offset) { return response(req -> { try { - Dataset dataset = findDatasetOrDie(id); + Dataset dataset = findDatasetUserCanSeeOrDie(id, req, false); List versionSummaries = execCommand(new GetDatasetVersionSummariesCommand(req, dataset, limit, offset)); JsonArrayBuilder versionSummariesArrayBuilder = jsonDatasetVersionSummaries(versionSummaries); long datasetVersionTotalCount = execCommand(new GetDatasetVersionCountCommand(req, dataset)); @@ -3328,11 +3344,13 @@ private DatasetVersion getDatasetVersionOrDie(final DataverseRequest req, String @GET @Path("{identifier}/locks") - public Response getLocksForDataset(@PathParam("identifier") String id, @QueryParam("type") DatasetLock.Reason lockType) { + @AuthRequired + public Response getLocksForDataset(@Context ContainerRequestContext crc, @PathParam("identifier") String id, @QueryParam("type") DatasetLock.Reason lockType) { Dataset dataset = null; try { - dataset = findDatasetOrDie(id); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + dataset = findDatasetUserCanSeeOrDie(id, req, false); Set<DatasetLock> locks; if (lockType == null) { locks = 
dataset.getLocks(); @@ -3494,10 +3512,12 @@ public Response listLocks(@Context ContainerRequestContext crc, @QueryParam("typ @GET @Path("{id}/makeDataCount/citations") - public Response getMakeDataCountCitations(@PathParam("id") String idSupplied) { + @AuthRequired + public Response getMakeDataCountCitations(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied) { try { - Dataset dataset = findDatasetOrDie(idSupplied); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataset dataset = findDatasetUserCanSeeOrDie(idSupplied, req, false); JsonArrayBuilder datasetsCitations = Json.createArrayBuilder(); List<DatasetExternalCitations> externalCitations = datasetExternalCitationsService.getDatasetExternalCitationsByDataset(dataset); for (DatasetExternalCitations citation : externalCitations) { @@ -3522,9 +3542,10 @@ public Response getMakeDataCountCitations(@PathParam("id") String idSupplied) { @GET @Path("{id}/makeDataCount/{metric}") - public Response getMakeDataCountMetricCurrentMonth(@PathParam("id") String idSupplied, @PathParam("metric") String metricSupplied, @QueryParam("country") String country) { + @AuthRequired + public Response getMakeDataCountMetricCurrentMonth(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, @PathParam("metric") String metricSupplied, @QueryParam("country") String country) { String nullCurrentMonth = null; - return getMakeDataCountMetric(idSupplied, metricSupplied, nullCurrentMonth, country); + return getMakeDataCountMetric(crc, idSupplied, metricSupplied, nullCurrentMonth, country); } @GET @@ -3532,7 +3553,7 @@ public Response getMakeDataCountMetricCurrentMonth(@PathParam("id") String idSup @Path("{identifier}/storagesize") public Response getStorageSize(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf, @QueryParam("includeCached") boolean includeCached) { return response(req -> ok(MessageFormat.format(BundleUtil.getStringFromBundle("datasets.api.datasize.storage"), - 
execCommand(new GetDatasetStorageSizeCommand(req, findDatasetOrDie(dvIdtf), includeCached, GetDatasetStorageSizeCommand.Mode.STORAGE, null)))), getRequestUser(crc)); + execCommand(new GetDatasetStorageSizeCommand(req, findDatasetUserCanSeeOrDie(dvIdtf, req, false), includeCached, GetDatasetStorageSizeCommand.Mode.STORAGE, null)))), getRequestUser(crc)); } @GET @@ -3570,7 +3591,7 @@ public Response getDownloadSize(@Context ContainerRequestContext crc, } catch (IllegalArgumentException e) { return error(Response.Status.BAD_REQUEST, "Invalid mode: " + mode); } - DatasetVersion datasetVersion = getDatasetVersionOrDie(req, version, findDatasetOrDie(dvIdtf), uriInfo, headers, includeDeaccessioned, false); + DatasetVersion datasetVersion = getDatasetVersionOrDie(req, version, findDatasetUserCanSeeOrDie(dvIdtf, req, false), uriInfo, headers, includeDeaccessioned, false); long datasetStorageSize = datasetVersionFilesServiceBean.getFilesDownloadSize(datasetVersion, fileSearchCriteria, fileDownloadSizeMode); String message = MessageFormat.format(BundleUtil.getStringFromBundle("datasets.api.datasize.download"), datasetStorageSize); JsonObjectBuilder jsonObjectBuilder = Json.createObjectBuilder(); @@ -3582,9 +3603,11 @@ public Response getDownloadSize(@Context ContainerRequestContext crc, @GET @Path("{id}/makeDataCount/{metric}/{yyyymm}") - public Response getMakeDataCountMetric(@PathParam("id") String idSupplied, @PathParam("metric") String metricSupplied, @PathParam("yyyymm") String yyyymm, @QueryParam("country") String country) { + @AuthRequired + public Response getMakeDataCountMetric(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, @PathParam("metric") String metricSupplied, @PathParam("yyyymm") String yyyymm, @QueryParam("country") String country) { try { - Dataset dataset = findDatasetOrDie(idSupplied); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataset dataset = findDatasetUserCanSeeOrDie(idSupplied, req, false); 
NullSafeJsonBuilder jsonObjectBuilder = jsonObjectBuilder(); MakeDataCountUtil.MetricType metricType = null; try { @@ -3698,7 +3721,8 @@ public Response getFileStore(@Context ContainerRequestContext crc, @PathParam("i Dataset dataset; try { - dataset = findDatasetOrDie(dvIdtf); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + dataset = findDatasetUserCanSeeOrDie(dvIdtf, req, false); } catch (WrappedResponse ex) { return error(Response.Status.NOT_FOUND, "No such dataset"); } @@ -3779,9 +3803,9 @@ public Response resetFileStore(@Context ContainerRequestContext crc, @PathParam( @Path("{identifier}/curationLabelSet") public Response getCurationLabelSet(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf, @Context UriInfo uriInfo, @Context HttpHeaders headers) throws WrappedResponse { - + AuthenticatedUser user = null; try { - AuthenticatedUser user = getRequestAuthenticatedUserOrDie(crc); + user = getRequestAuthenticatedUserOrDie(crc); if (!user.isSuperuser()) { return error(Response.Status.FORBIDDEN, "Superusers only."); } @@ -3792,7 +3816,8 @@ public Response getCurationLabelSet(@Context ContainerRequestContext crc, @PathP Dataset dataset; try { - dataset = findDatasetOrDie(dvIdtf); + DataverseRequest req = createDataverseRequest(user); + dataset = findDatasetUserCanSeeOrDie(dvIdtf, req,false); } catch (WrappedResponse ex) { return ex.getResponse(); } @@ -3891,7 +3916,8 @@ public Response getAllowedCurationLabels(@Context ContainerRequestContext crc, Dataset dataset; try { - dataset = findDatasetOrDie(dvIdtf); + DataverseRequest req = createDataverseRequest(user); + dataset = findDatasetUserCanSeeOrDie(dvIdtf, req, false); } catch (WrappedResponse ex) { return ex.getResponse(); } @@ -3912,8 +3938,10 @@ public Response getTimestamps(@Context ContainerRequestContext crc, @PathParam(" Dataset dataset = null; DateTimeFormatter formatter = DateTimeFormatter.ISO_LOCAL_DATE_TIME; try { - dataset = findDatasetOrDie(id); User u 
= getRequestUser(crc); + DataverseRequest req = createDataverseRequest(u); + dataset = findDatasetUserCanSeeOrDie(id, req, false); + Set<Permission> perms = new HashSet<>(); perms.add(Permission.ViewUnpublishedDataset); boolean canSeeDraft = permissionSvc.hasPermissionsFor(u, dataset, perms); @@ -4037,7 +4065,8 @@ public Response getGlobusUploadParams(@Context ContainerRequestContext crc, @Pat Dataset dataset; try { - dataset = findDatasetOrDie(datasetId); + DataverseRequest req = createDataverseRequest(authUser); + dataset = findDatasetUserCanSeeOrDie(datasetId, req, false); } catch (WrappedResponse wr) { return wr.getResponse(); } @@ -4368,7 +4397,8 @@ public Response getGlobusDownloadParams(@Context ContainerRequestContext crc, @P Dataset dataset; try { - dataset = findDatasetOrDie(datasetId); + DataverseRequest req = createDataverseRequest(authUser); + dataset = findDatasetUserCanSeeOrDie(datasetId, req, false); } catch (WrappedResponse wr) { return wr.getResponse(); } @@ -5046,7 +5076,7 @@ public Response getDatasetVersionArchivalStatus(@Context ContainerRequestContext return error(Response.Status.FORBIDDEN, "Superusers only."); } DataverseRequest req = createDataverseRequest(au); - DatasetVersion dsv = getDatasetVersionOrDie(req, versionNumber, findDatasetOrDie(datasetId), uriInfo, + DatasetVersion dsv = getDatasetVersionOrDie(req, versionNumber, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers, true); if (dsv.getArchivalCopyLocation() == null) { @@ -5305,7 +5335,7 @@ public Response getExternalToolDVParams(@Context ContainerRequestContext crc, @QueryParam(value = "locale") String locale) { try { DataverseRequest req = createDataverseRequest(getRequestUser(crc)); - DatasetVersion target = getDatasetVersionOrDie(req, version, findDatasetOrDie(datasetId), null, null); + DatasetVersion target = getDatasetVersionOrDie(req, version, findDatasetUserCanSeeOrDie(datasetId, req, false), null, null); if (target == null) { return error(BAD_REQUEST, "DatasetVersion 
not found."); } @@ -5475,7 +5505,7 @@ public String getDatasetVersionCitationAsString(ContainerRequestContext crc, Str boolean checkFilePerms = false; DataverseRequest req = createDataverseRequest(getRequestUser(crc)); - DatasetVersion dsv = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId), uriInfo, headers, + DatasetVersion dsv = getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers, includeDeaccessioned, checkFilePerms); return dsv.getCitation(format, true, false); } @@ -5520,7 +5550,8 @@ public Response getGuestbookEntryOption(@Context ContainerRequestContext crc, @P Dataset dataset; try { - dataset = findDatasetOrDie(dvIdtf); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + dataset = findDatasetUserCanSeeOrDie(dvIdtf, req, false); } catch (WrappedResponse ex) { return error(Response.Status.NOT_FOUND, "No such dataset"); } @@ -5601,12 +5632,14 @@ public Response resetGuestbookEntryAtRequest(@Context ContainerRequestContext cr @Path("{id}/userPermissions") public Response getUserPermissionsOnDataset(@Context ContainerRequestContext crc, @PathParam("id") String datasetId) { Dataset dataset; + User requestUser = getRequestUser(crc); try { - dataset = findDatasetOrDie(datasetId); + DataverseRequest req = createDataverseRequest(requestUser); + dataset = findDatasetUserCanSeeOrDie(datasetId, req, false); } catch (WrappedResponse wr) { return wr.getResponse(); } - User requestUser = getRequestUser(crc); + JsonObjectBuilder jsonObjectBuilder = Json.createObjectBuilder(); jsonObjectBuilder.add("canViewUnpublishedDataset", permissionService.userOn(requestUser, dataset).has(Permission.ViewUnpublishedDataset)); jsonObjectBuilder.add("canEditDataset", permissionService.userOn(requestUser, dataset).has(Permission.EditDataset)); @@ -5626,7 +5659,7 @@ public Response getCanDownloadAtLeastOneFile(@Context ContainerRequestContext cr @Context UriInfo uriInfo, @Context HttpHeaders headers) 
{ return response(req -> { - DatasetVersion datasetVersion = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId), uriInfo, headers, includeDeaccessioned); + DatasetVersion datasetVersion = getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers, includeDeaccessioned); return ok(permissionService.canDownloadAtLeastOneFile(req, datasetVersion)); }, getRequestUser(crc)); } @@ -5675,7 +5708,8 @@ public Response getPidGenerator(@Context ContainerRequestContext crc, @PathParam Dataset dataset; try { - dataset = findDatasetOrDie(dvIdtf); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + dataset = findDatasetUserCanSeeOrDie(dvIdtf, req, false); } catch (WrappedResponse ex) { return error(Response.Status.NOT_FOUND, "No such dataset"); } @@ -6140,7 +6174,7 @@ public Response deleteDatasetFiles(@Context ContainerRequestContext crc, @PathPa public Response getVersionCreationNote(@Context ContainerRequestContext crc, @PathParam("id") String datasetId, @PathParam("versionId") String versionId, @Context UriInfo uriInfo, @Context HttpHeaders headers) throws WrappedResponse { return response(req -> { - DatasetVersion datasetVersion = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId), uriInfo, headers); + DatasetVersion datasetVersion = getDatasetVersionOrDie(req, versionId, findDatasetUserCanSeeOrDie(datasetId, req, false), uriInfo, headers); String note = datasetVersion.getVersionNote(); if(note == null) { return ok(Json.createObjectBuilder()); @@ -6210,7 +6244,7 @@ public Response deleteVersionNote(@Context ContainerRequestContext crc, @PathPar @Produces({ MediaType.APPLICATION_JSON, "text/csv" }) public Response getRoleAssignmentHistory(@Context ContainerRequestContext crc, @PathParam("identifier") String id, @Context HttpHeaders headers) { return response(req -> { - Dataset dataset = findDatasetOrDie(id); + Dataset dataset = findDatasetUserCanSeeOrDie(id, req, false); // 
user is authenticated AuthenticatedUser authenticatedUser = getRequestAuthenticatedUserOrDie(crc); @@ -6227,7 +6261,7 @@ public Response getFilesRoleAssignmentHistory(@Context ContainerRequestContext c @PathParam("identifier") String id, @Context HttpHeaders headers) { return response(req -> { - Dataset dataset = findDatasetOrDie(id); + Dataset dataset = findDatasetUserCanSeeOrDie(id, req, false); // user is authenticated AuthenticatedUser authenticatedUser = getRequestAuthenticatedUserOrDie(crc); @@ -6275,7 +6309,8 @@ public Response updateLicense(@Context ContainerRequestContext crc, @Path("{identifier}/storage/quota") public Response getDatasetQuota(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf, @QueryParam("showInherited") boolean showInherited) throws WrappedResponse { try { - Long bytesAllocated = execCommand(new GetDatasetQuotaCommand(createDataverseRequest(getRequestUser(crc)), findDatasetOrDie(dvIdtf), showInherited)); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Long bytesAllocated = execCommand(new GetDatasetQuotaCommand(req, findDatasetUserCanSeeOrDie(dvIdtf, req, false), showInherited)); if (bytesAllocated != null) { return ok(MessageFormat.format(BundleUtil.getStringFromBundle("dataset.storage.quota.allocation"),bytesAllocated)); } @@ -6329,7 +6364,7 @@ public Response deleteDatasetQuota(@Context ContainerRequestContext crc, @PathPa @Path("{identifier}/storage/use") public Response getDatasetStorageUse(@Context ContainerRequestContext crc, @PathParam("identifier") String identifier) throws WrappedResponse { return response(req -> ok(MessageFormat.format(BundleUtil.getStringFromBundle("dataset.storage.use"), - execCommand(new GetDatasetStorageUseCommand(req, findDatasetOrDie(identifier))))), getRequestUser(crc)); + execCommand(new GetDatasetStorageUseCommand(req, findDatasetUserCanSeeOrDie(identifier, req, false))))), getRequestUser(crc)); } @GET @@ -6340,20 +6375,21 @@ public Response 
getUploadLimits(@Context ContainerRequestContext crc, @PathParam @Context HttpHeaders headers) throws WrappedResponse { Dataset dataset; - - try { - dataset = findDatasetOrDie(dvIdtf); - } catch (WrappedResponse ex) { - return error(Response.Status.NOT_FOUND, "No such dataset"); - } - AuthenticatedUser user; try { user = getRequestAuthenticatedUserOrDie(crc); } catch (WrappedResponse ex) { return error(Response.Status.BAD_REQUEST, "This API call requires authentication."); } - if (!permissionSvc.requestOn(createDataverseRequest(user), dataset).has(Permission.EditDataset)) { + DataverseRequest req = createDataverseRequest(user); + try { + dataset = findDatasetUserCanSeeOrDie(dvIdtf, req, false); + } catch (WrappedResponse ex) { + return error(Response.Status.NOT_FOUND, "No such dataset"); + } + + + if (!permissionSvc.requestOn(req, dataset).has(Permission.EditDataset)) { return error(Response.Status.FORBIDDEN, "This API call requires EditDataset permission."); } diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java b/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java index 083f8a8af52..56548d89cd4 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java @@ -3,7 +3,6 @@ import com.google.common.collect.Lists; import com.google.api.client.util.ArrayMap; import edu.harvard.iq.dataverse.*; -import static edu.harvard.iq.dataverse.api.AbstractApiBean.error; import edu.harvard.iq.dataverse.api.auth.AuthRequired; import edu.harvard.iq.dataverse.api.datadeposit.SwordServiceBean; import edu.harvard.iq.dataverse.api.dto.*; @@ -16,7 +15,6 @@ import edu.harvard.iq.dataverse.authorization.groups.impl.explicit.ExplicitGroup; import edu.harvard.iq.dataverse.authorization.groups.impl.explicit.ExplicitGroupProvider; import edu.harvard.iq.dataverse.authorization.groups.impl.explicit.ExplicitGroupServiceBean; -import 
edu.harvard.iq.dataverse.authorization.groups.impl.ipaddress.ip.IpAddress; import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser; import edu.harvard.iq.dataverse.authorization.users.User; import edu.harvard.iq.dataverse.dataset.DatasetType; @@ -24,7 +22,6 @@ import edu.harvard.iq.dataverse.dataverse.featured.DataverseFeaturedItem; import edu.harvard.iq.dataverse.dataverse.featured.DataverseFeaturedItemServiceBean; import edu.harvard.iq.dataverse.engine.command.DataverseRequest; -import edu.harvard.iq.dataverse.engine.command.exception.CommandException; import edu.harvard.iq.dataverse.engine.command.impl.*; import edu.harvard.iq.dataverse.pidproviders.PidProvider; import edu.harvard.iq.dataverse.pidproviders.PidUtil; @@ -73,6 +70,7 @@ import jakarta.ws.rs.core.Context; import jakarta.ws.rs.core.HttpHeaders; import jakarta.ws.rs.core.StreamingOutput; +import org.apache.commons.lang3.StringUtils; import org.glassfish.jersey.media.multipart.FormDataBodyPart; import org.glassfish.jersey.media.multipart.FormDataContentDisposition; import org.glassfish.jersey.media.multipart.FormDataParam; @@ -357,10 +355,10 @@ public Response validateDatasetJson(@Context ContainerRequestContext crc, String @Path("{identifier}/datasetSchema") @Produces(MediaType.APPLICATION_JSON) public Response getDatasetSchema(@Context ContainerRequestContext crc, @PathParam("identifier") String idtf) { - User u = getRequestUser(crc); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); try { - String datasetSchema = execCommand(new GetDatasetSchemaCommand(createDataverseRequest(u), findDataverseOrDie(idtf))); + String datasetSchema = execCommand(new GetDatasetSchemaCommand(req, findDataverseUserCanSeeOrDie(idtf, req))); JsonObject jsonObject = JsonUtil.getJsonObject(datasetSchema); return Response.ok(jsonObject).build(); } catch (WrappedResponse ex) { @@ -719,7 +717,7 @@ private Dataset parseDataset(String datasetJson) throws WrappedResponse { @Path("{identifier}") 
public Response getDataverse(@Context ContainerRequestContext crc, @PathParam("identifier") String idtf, @QueryParam("returnOwners") boolean returnOwners, @QueryParam("returnChildCount") boolean returnChildCount) { return response(req -> { - Dataverse dataverse = execCommand(new GetDataverseCommand(req, findDataverseOrDie(idtf))); + Dataverse dataverse = execCommand(new GetDataverseCommand(req, findDataverseUserCanSeeOrDie(idtf, req))); boolean hideEmail = settingsService.isTrueForKey(SettingsServiceBean.Key.ExcludeEmailFromExport, false); return ok(json(dataverse, hideEmail, returnOwners, false, returnChildCount ? dataverseService.getChildCount(dataverse) : null)); }, getRequestUser(crc)); @@ -769,8 +767,9 @@ private Object formatAttributeValue(String attribute, String value) throws Wrapp @Path("{identifier}/inputLevels") public Response getInputLevels(@Context ContainerRequestContext crc, @PathParam("identifier") String identifier) { try { - Dataverse dataverse = findDataverseOrDie(identifier); - List<DataverseFieldTypeInputLevel> inputLevels = execCommand(new ListDataverseInputLevelsCommand(createDataverseRequest(getRequestUser(crc)), dataverse)); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataverse dataverse = findDataverseUserCanSeeOrDie(identifier, req); + List<DataverseFieldTypeInputLevel> inputLevels = execCommand(new ListDataverseInputLevelsCommand(req, dataverse)); return ok(jsonDataverseInputLevels(inputLevels)); } catch (WrappedResponse e) { return e.getResponse(); @@ -861,11 +860,12 @@ public Response listMetadataBlocks(@Context ContainerRequestContext crc, @QueryParam("returnDatasetFieldTypes") boolean returnDatasetFieldTypes, @QueryParam("datasetType") String datasetTypeIn) { try { - Dataverse dataverse = findDataverseOrDie(dvIdtf); + DataverseRequest req = createDataverseRequest(getRequestUser(crc)); + Dataverse dataverse = findDataverseUserCanSeeOrDie(dvIdtf, req); DatasetType datasetType = datasetTypeSvc.getByName(datasetTypeIn); final List<MetadataBlock> metadataBlocks = execCommand( new 
ListMetadataBlocksCommand( - createDataverseRequest(getRequestUser(crc)), + req, dataverse, onlyDisplayedOnCreate, datasetType @@ -920,7 +920,7 @@ public Response getMetadataRoot_legacy(@Context ContainerRequestContext crc, @Pa @Produces(MediaType.APPLICATION_JSON) public Response getMetadataRoot(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf) { return response(req -> { - final Dataverse dataverse = findDataverseOrDie(dvIdtf); + final Dataverse dataverse = findDataverseUserCanSeeOrDie(dvIdtf, req); if (permissionSvc.request(req) .on(dataverse) .has(Permission.EditDataverse)) { @@ -966,7 +966,7 @@ public Response listFacets(@Context ContainerRequestContext crc, try { User user = getRequestUser(crc); DataverseRequest request = createDataverseRequest(user); - Dataverse dataverse = findDataverseOrDie(dvIdtf); + Dataverse dataverse = findDataverseUserCanSeeOrDie(dvIdtf, request); List<DataverseFacet> dataverseFacets = execCommand(new ListFacetsCommand(request, dataverse)); if (returnDetails) { @@ -995,7 +995,7 @@ public Response getFeaturedDataverses(@Context ContainerRequestContext crc, @Pat try { User u = getRequestUser(crc); DataverseRequest r = createDataverseRequest(u); - Dataverse dataverse = findDataverseOrDie(dvIdtf); + Dataverse dataverse = findDataverseUserCanSeeOrDie(dvIdtf, r); JsonArrayBuilder fs = Json.createArrayBuilder(); for (Dataverse f : execCommand(new ListFeaturedCollectionsCommand(r, dataverse))) { fs.add(f.getAlias()); @@ -1126,7 +1126,7 @@ public Response listMetadataBlockFacets(@Context ContainerRequestContext crc, @P try { User u = getRequestUser(crc); DataverseRequest request = createDataverseRequest(u); - Dataverse dataverse = findDataverseOrDie(dvIdtf); + Dataverse dataverse = findDataverseUserCanSeeOrDie(dvIdtf, request); List<DataverseMetadataBlockFacet> metadataBlockFacets = Optional.ofNullable(execCommand(new ListMetadataBlockFacetsCommand(request, dataverse))).orElse(Collections.emptyList()); List<DataverseMetadataBlockFacetDTO.MetadataBlockDTO> metadataBlocksDTOs = metadataBlockFacets.stream() 
.map(item -> new DataverseMetadataBlockFacetDTO.MetadataBlockDTO(item.getMetadataBlock().getName(), item.getMetadataBlock().getLocaleDisplayFacet())) @@ -1222,7 +1222,7 @@ public JsonObjectBuilder visit(DataFile df) { }; return response(req -> ok( - execCommand(new ListDataverseContentCommand(req, findDataverseOrDie(dvIdtf))) + execCommand(new ListDataverseContentCommand(req, findDataverseUserCanSeeOrDie(dvIdtf, req))) .stream() .map(dvo -> (JsonObjectBuilder) dvo.accept(ser)) .collect(toJsonArray()) @@ -1235,7 +1235,7 @@ public JsonObjectBuilder visit(DataFile df) { public Response getStorageSize(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf, @QueryParam("includeCached") boolean includeCached) throws WrappedResponse { return response(req -> ok(MessageFormat.format(BundleUtil.getStringFromBundle("dataverse.datasize"), - execCommand(new GetDataverseStorageSizeCommand(req, findDataverseOrDie(dvIdtf), includeCached)))), getRequestUser(crc)); + execCommand(new GetDataverseStorageSizeCommand(req, findDataverseUserCanSeeOrDie(dvIdtf, req), includeCached)))), getRequestUser(crc)); } @GET @@ -1243,7 +1243,8 @@ public Response getStorageSize(@Context ContainerRequestContext crc, @PathParam( @Path("{identifier}/storage/quota") public Response getCollectionQuota(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf, @QueryParam("showInherited") boolean showInherited) throws WrappedResponse { try { - Long bytesAllocated = execCommand(new GetCollectionQuotaCommand(createDataverseRequest(getRequestUser(crc)), findDataverseOrDie(dvIdtf), showInherited)); + DataverseRequest request = createDataverseRequest(getRequestUser(crc)); + Long bytesAllocated = execCommand(new GetCollectionQuotaCommand(request, findDataverseUserCanSeeOrDie(dvIdtf, request), showInherited)); if (bytesAllocated != null) { return ok(MessageFormat.format(BundleUtil.getStringFromBundle("dataverse.storage.quota.allocation"),bytesAllocated)); } @@ -1297,7 
+1298,7 @@ public Response deleteCollectionQuota(@Context ContainerRequestContext crc, @Pat @Path("{identifier}/storage/use") public Response getCollectionStorageUse(@Context ContainerRequestContext crc, @PathParam("identifier") String identifier) throws WrappedResponse { return response(req -> ok(MessageFormat.format(BundleUtil.getStringFromBundle("dataverse.storage.use"), - execCommand(new GetCollectionStorageUseCommand(req, findDataverseOrDie(identifier))))), getRequestUser(crc)); + execCommand(new GetCollectionStorageUseCommand(req, findDataverseUserCanSeeOrDie(identifier, req))))), getRequestUser(crc)); } @GET @@ -1305,7 +1306,7 @@ public Response getCollectionStorageUse(@Context ContainerRequestContext crc, @P @Path("{identifier}/roles") public Response listRoles(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf) { return response(req -> ok( - execCommand(new ListRolesCommand(req, findDataverseOrDie(dvIdtf))) + execCommand(new ListRolesCommand(req, findDataverseUserCanSeeOrDie(dvIdtf, req))) .stream().map(r -> json(r)) .collect(toJsonArray()) ), getRequestUser(crc)); @@ -1323,7 +1324,7 @@ public Response createRole(@Context ContainerRequestContext crc, RoleDTO roleDto @Path("{identifier}/assignments") public Response listAssignments(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf) { return response(req -> ok( - execCommand(new ListRoleAssignments(req, findDataverseOrDie(dvIdtf))) + execCommand(new ListRoleAssignments(req, findDataverseUserCanSeeOrDie(dvIdtf, req))) .stream() .map(a -> json(a)) .collect(toJsonArray()) @@ -1513,7 +1514,7 @@ public Response createExplicitGroup(@Context ContainerRequestContext crc, Explic @Path("{identifier}/groups/") public Response listGroups(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf, @QueryParam("key") String apiKey) { return response(req -> ok( - execCommand(new ListExplicitGroupsCommand(req, findDataverseOrDie(dvIdtf))) + 
execCommand(new ListExplicitGroupsCommand(req, findDataverseUserCanSeeOrDie(dvIdtf, req))) .stream().map(eg -> json(eg)) .collect(toJsonArray()) ), getRequestUser(crc)); @@ -1525,7 +1526,7 @@ public Response listGroups(@Context ContainerRequestContext crc, @PathParam("ide public Response getGroupByOwnerAndAliasInOwner(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf, @PathParam("aliasInOwner") String grpAliasInOwner) { - return response(req -> ok(json(findExplicitGroupOrDie(findDataverseOrDie(dvIdtf), + return response(req -> ok(json(findExplicitGroupOrDie(findDataverseUserCanSeeOrDie(dvIdtf, req), req, grpAliasInOwner))), getRequestUser(crc)); } @@ -1538,9 +1539,10 @@ public Response getGuestbookResponsesByDataverse(@Context ContainerRequestContex Dataverse dv; try { - dv = findDataverseOrDie(dvIdtf); User u = getRequestUser(crc); DataverseRequest req = createDataverseRequest(u); + dv = findDataverseUserCanSeeOrDie(dvIdtf, req); + if (permissionSvc.request(req) .on(dv) .has(Permission.EditDataverse)) { @@ -1712,7 +1714,8 @@ private ExplicitGroup findExplicitGroupOrDie(DvObject dv, DataverseRequest req, public Response listLinks(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf) { try { User u = getRequestUser(crc); - Dataverse dv = findDataverseOrDie(dvIdtf); + DataverseRequest req = createDataverseRequest(u); + Dataverse dv = findDataverseUserCanSeeOrDie(dvIdtf, req); if (!u.isSuperuser()) { return error(Status.FORBIDDEN, "Not a superuser"); } @@ -2155,4 +2158,123 @@ public Response getRoleAssignmentHistory(@Context ContainerRequestContext crc, return getRoleAssignmentHistoryResponse(dataverse, authenticatedUser, false, headers); }, getRequestUser(crc)); } + + @GET + @AuthRequired + @Path("{identifier}/locallyFairRoleAssignees") + @Produces(MediaType.APPLICATION_JSON) + public Response listLocallyFairRoleAssignees(@Context ContainerRequestContext crc, @PathParam("identifier") String dvIdtf) { + try { + User 
user = getRequestUser(crc); + if (!user.isSuperuser()) { + return error(Status.FORBIDDEN, "Not a superuser"); + } + + Dataverse dataverse = findDataverseOrDie(dvIdtf); + JsonArrayBuilder assignees = Json.createArrayBuilder(); + dataverse.getLocallyFAIRRoleAssigneeIdentifiers().stream() + .sorted() + .forEach(assignees::add); + return ok(assignees); + } catch (WrappedResponse ex) { + return ex.getResponse(); + } + } + + @PUT + @AuthRequired + @Path("{identifier}/locallyFairRoleAssignees") + @Consumes(MediaType.APPLICATION_JSON) + @Produces(MediaType.APPLICATION_JSON) + public Response setLocallyFairRoleAssignees(@Context ContainerRequestContext crc, + @PathParam("identifier") String dvIdtf, + List<String> roleAssigneeIdentifiers) { + try { + User user = getRequestUser(crc); + if (!user.isSuperuser()) { + return error(Status.FORBIDDEN, "Not a superuser"); + } + + Dataverse dataverse = findDataverseOrDie(dvIdtf); + Set<String> validatedIdentifiers = validateLocallyFairRoleAssigneeIdentifiers(roleAssigneeIdentifiers); + dataverse.setLocallyFAIRRoleAssigneeIdentifiers(validatedIdentifiers); + dataverseService.save(dataverse); + dataverseService.index(dataverse, true); + + return ok(String.format("Locally FAIR role assignees updated for dataverse %s.", dvIdtf), jsonLocallyFairRoleAssignees(dataverse)); + } catch (WrappedResponse ex) { + return ex.getResponse(); + } + } + + @PUT + @AuthRequired + @Path("{identifier}/locallyFairRoleAssignees/{roleAssigneeIdentifier: .*}") + @Produces(MediaType.APPLICATION_JSON) + public Response addLocallyFairRoleAssignee(@Context ContainerRequestContext crc, + @PathParam("identifier") String dvIdtf, + @PathParam("roleAssigneeIdentifier") String roleAssigneeIdentifier) { + try { + User user = getRequestUser(crc); + if (!user.isSuperuser()) { + return error(Status.FORBIDDEN, "Not a superuser"); + } + + Dataverse dataverse = findDataverseOrDie(dvIdtf); + if (findAssignee(roleAssigneeIdentifier) == null) { + return badRequest("Invalid role assignee
identifier: " + roleAssigneeIdentifier); + } + + dataverse.addLocallyFAIRRoleAssignee(roleAssigneeIdentifier); + dataverseService.save(dataverse); + dataverseService.index(dataverse, true); + + return ok(String.format("Locally FAIR role assignee added to dataverse %s.", dvIdtf), jsonLocallyFairRoleAssignees(dataverse)); + } catch (WrappedResponse ex) { + return ex.getResponse(); + } + } + + @DELETE + @AuthRequired + @Path("{identifier}/locallyFairRoleAssignees/{roleAssigneeIdentifier: .*}") + @Produces(MediaType.APPLICATION_JSON) + public Response deleteLocallyFairRoleAssignee(@Context ContainerRequestContext crc, + @PathParam("identifier") String dvIdtf, + @PathParam("roleAssigneeIdentifier") String roleAssigneeIdentifier) { + try { + User user = getRequestUser(crc); + if (!user.isSuperuser()) { + return error(Status.FORBIDDEN, "Not a superuser"); + } + + Dataverse dataverse = findDataverseOrDie(dvIdtf); + if (StringUtils.isBlank(roleAssigneeIdentifier) || !dataverse.getLocallyFAIRRoleAssigneeIdentifiers().contains(roleAssigneeIdentifier)) { + return badRequest("Invalid role assignee identifier: " + roleAssigneeIdentifier); + } + dataverse.removeLocallyFAIRRoleAssignee(roleAssigneeIdentifier); + dataverseService.save(dataverse); + dataverseService.index(dataverse, true); + + return ok(String.format("Locally FAIR role assignee removed from dataverse %s.", dvIdtf), jsonLocallyFairRoleAssignees(dataverse)); + } catch (WrappedResponse ex) { + return ex.getResponse(); + } + } + + private Set<String> validateLocallyFairRoleAssigneeIdentifiers(List<String> roleAssigneeIdentifiers) throws WrappedResponse { + if (roleAssigneeIdentifiers == null) { + return Collections.emptySet(); + } + + Set<String> validatedIdentifiers = new TreeSet<>(); + for (String identifier : roleAssigneeIdentifiers) { + if (findAssignee(identifier) == null) { + throw new WrappedResponse(badRequest("Invalid role assignee identifier: " + identifier)); + } + validatedIdentifiers.add(identifier); + } + return
validatedIdentifiers; + } + } diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Files.java b/src/main/java/edu/harvard/iq/dataverse/api/Files.java index 0a1b19985a4..a745e851532 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Files.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Files.java @@ -573,7 +573,7 @@ private Response getFileDataResponse(final DataverseRequest req, boolean returnOwners, UriInfo uriInfo, HttpHeaders headers) throws WrappedResponse { - final DataFile dataFile = execCommand(new GetDataFileCommand(req, findDataFileOrDie(fileIdOrPersistentId))); + final DataFile dataFile = execCommand(new GetDataFileCommand(req, findDataFileUserCanSeeOrDie(fileIdOrPersistentId, req))); FileMetadata fileMetadata = execCommand(handleVersion(datasetVersionId, new Datasets.DsVersionHandler<>() { @Override public Command handleLatest() { @@ -626,7 +626,7 @@ public Response getFileMetadata(@Context ContainerRequestContext crc, @PathParam } final DataFile df; try { - df = execCommand(new GetDataFileCommand(req, findDataFileOrDie(fileIdOrPersistentId))); + df = execCommand(new GetDataFileCommand(req, findDataFileUserCanSeeOrDie(fileIdOrPersistentId, req))); } catch (Exception e) { return error(BAD_REQUEST, "Error attempting get the requested data file."); } @@ -634,7 +634,7 @@ public Response getFileMetadata(@Context ContainerRequestContext crc, @PathParam if(null != getDraft && getDraft) { try { - fm = execCommand(new GetDraftFileMetadataIfAvailableCommand(req, findDataFileOrDie(fileIdOrPersistentId))); + fm = execCommand(new GetDraftFileMetadataIfAvailableCommand(req, findDataFileUserCanSeeOrDie(fileIdOrPersistentId, req))); } catch (WrappedResponse w) { return error(BAD_REQUEST, "An error occurred getting a draft version, you may not have permission to access unpublished data on this dataset." 
); } @@ -754,37 +754,37 @@ public Response reingest(@Context ContainerRequestContext crc, @PathParam("id") } boolean ingestLock = dataset.isLockedFor(DatasetLock.Reason.Ingest); - + if (ingestLock) { return error(FORBIDDEN, "Dataset already locked with an Ingest lock"); } - + if (!FileUtil.canIngestAsTabular(dataFile)) { return error(BAD_REQUEST, "Tabular ingest is not supported for this file type (id: "+id+", type: "+dataFile.getContentType()+")"); } - + dataFile.SetIngestScheduled(); - + if (dataFile.getIngestRequest() == null) { dataFile.setIngestRequest(new IngestRequest(dataFile)); } dataFile.getIngestRequest().setForceTypeCheck(true); - + // update the datafile, to save the newIngest request in the database: dataFile = fileService.save(dataFile); - - // queue the data ingest job for asynchronous execution: + + // queue the data ingest job for asynchronous execution: String status = ingestService.startIngestJobs(dataset.getId(), new ArrayList<>(Arrays.asList(dataFile)), u); - + if (!StringUtil.isEmpty(status)) { - // This most likely indicates some sort of a problem (for example, + // This most likely indicates some sort of a problem (for example, // the ingest job was not put on the JMS queue because of the size // of the file). But we are still returning the OK status - because - // from the point of view of the API, it's a success - we have - // successfully gone through the process of trying to schedule the + // from the point of view of the API, it's a success - we have + // successfully gone through the process of trying to schedule the // ingest job... - + return ok(status); } return ok("Datafile " + id + " queued for ingest"); @@ -854,28 +854,28 @@ private void exportDatasetMetadata(SettingsServiceBean settingsServiceBean, Data logger.log(Level.WARNING, "Dataset publication finalization: exception while exporting:{0}", ex.getMessage()); } } - + /** * API endpoint to retrieve a URL for a file-level external tool. 
- * + * * This endpoint allows clients to get a URL for accessing an external tool * that operates at the file level. The URL includes necessary authentication tokens and * parameters based on the user's permissions and the tool's configuration. - * + * * The endpoint accepts JSON input with optional parameters: * - preview: boolean flag to indicate if the tool should run in preview mode (suppressing header metadata like name/PID that would already be on the file page) * - locale: string specifying the locale for internationalization - * + * * The response includes: * - toolUrl: the URL to access the external tool * - toolName: the display name of the external tool * - fileId: the ID of the file * - preview: whether the URL is for preview mode - * + * * Authentication is required, and appropriate permissions are checked before generating the URL. * For restricted files (including files in draft/deaccessioned datasets, embargoed files, or * files with expired retention periods), the user must have DownloadFile permission. 
- * + * * @param crc The container request context for authentication * @param fileId The ID of the file * @param externalToolId The ID of the external tool @@ -908,7 +908,7 @@ public Response getExternalToolUrl(@Context ContainerRequestContext crc, @PathPa return error(Response.Status.BAD_REQUEST, "Invalid JSON format in request body."); } } - + try { // Find the file DataFile dataFile; @@ -932,13 +932,13 @@ public Response getExternalToolUrl(@Context ContainerRequestContext crc, @PathPa // Check if the tool's content type matches the file's content type String toolContentType = externalTool.getContentType(); String fileContentType = dataFile.getContentType(); - if (toolContentType != null && !toolContentType.isEmpty() && + if (toolContentType != null && !toolContentType.isEmpty() && !toolContentType.equals(fileContentType)) { - return error(BAD_REQUEST, - "External tool content type (" + toolContentType + + return error(BAD_REQUEST, + "External tool content type (" + toolContentType + ") does not match file content type (" + fileContentType + ")."); } - + if (!externalToolService.meetsRequirements(externalTool, dataFile)) { return error(BAD_REQUEST, "External tool requirements not met for this file."); } @@ -995,8 +995,8 @@ public Response getExternalToolUrl(@Context ContainerRequestContext crc, @PathPa "An error occurred while generating the external tool URL."); } } - - // This method provides a callback for an external tool to retrieve it's + + // This method provides a callback for an external tool to retrieve its // parameters/api URLs. If the request is authenticated, e.g. by it being // signed, the api URLs will be signed. If a guest request is made, the URLs // will be plain/unsigned. 
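The javadoc above enumerates the request fields (preview, locale) and response fields (toolUrl, toolName, fileId, preview) of the file-level external tool URL endpoint. As a hedged sketch only — the class name and manual string formatting below are illustrative, not from this changeset; the field names are taken from the javadoc — the two payload shapes might be modeled like this:

```java
// Hedged model of the JSON shapes described in the getExternalToolUrl javadoc.
// Only the field names come from the source; everything else is illustrative.
public class ExternalToolUrlShapes {

    // Optional request parameters: preview (boolean) and locale (string).
    static String requestBody(boolean preview, String locale) {
        return String.format("{\"preview\": %b, \"locale\": \"%s\"}", preview, locale);
    }

    // The response carries toolUrl, toolName, fileId, and preview per the javadoc.
    static String responseBody(String toolUrl, String toolName, long fileId, boolean preview) {
        return String.format(
                "{\"toolUrl\": \"%s\", \"toolName\": \"%s\", \"fileId\": %d, \"preview\": %b}",
                toolUrl, toolName, fileId, preview);
    }

    public static void main(String[] args) {
        System.out.println(requestBody(true, "en"));
        System.out.println(responseBody("https://example.com/tool", "Previewer", 42L, true));
    }
}
```

A real client would send the request body with Content-Type application/json; for restricted, embargoed, or retention-expired files, the javadoc notes that DownloadFile permission is checked before a URL is returned.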
@@ -1027,7 +1027,7 @@ public Response getExternalToolFMParams(@Context ContainerRequestContext crc, @P eth = new ExternalToolHandler(externalTool, target.getDataFile(), apiToken, target, locale); return ok(eth.createPostBody(eth.getParams(JsonUtil.getJsonObject(externalTool.getToolParameters())), JsonUtil.getJsonArray(externalTool.getAllowedApiCalls()))); } - + @GET @Path("fixityAlgorithm") public Response getFixityAlgorithm() { @@ -1039,7 +1039,7 @@ public Response getFixityAlgorithm() { @Path("{id}/downloadCount") public Response getFileDownloadCount(@Context ContainerRequestContext crc, @PathParam("id") String dataFileId) { return response(req -> { - DataFile dataFile = execCommand(new GetDataFileCommand(req, findDataFileOrDie(dataFileId))); + DataFile dataFile = execCommand(new GetDataFileCommand(req, findDataFileUserCanSeeOrDie(dataFileId, req))); return ok(guestbookResponseService.getDownloadCountByDataFileId(dataFile.getId()).toString()); }, getRequestUser(crc)); } @@ -1049,13 +1049,13 @@ public Response getFileDownloadCount(@Context ContainerRequestContext crc, @Path @Path("{id}/dataTables") public Response getFileDataTables(@Context ContainerRequestContext crc, @PathParam("id") String dataFileId) { DataFile dataFile; + DataverseRequest dataverseRequest = createDataverseRequest(getRequestUser(crc)); try { - dataFile = findDataFileOrDie(dataFileId); + dataFile = findDataFileUserCanSeeOrDie(dataFileId, dataverseRequest); } catch (WrappedResponse e) { return notFound("File not found for given id."); } if (dataFile.isRestricted() || FileUtil.isActivelyEmbargoed(dataFile)) { - DataverseRequest dataverseRequest = createDataverseRequest(getRequestUser(crc)); boolean hasPermissionToDownloadFile = permissionSvc.requestOn(dataverseRequest, dataFile).has(Permission.DownloadFile); if (!hasPermissionToDownloadFile) { return forbidden("Insufficient permissions to access the requested information."); @@ -1132,7 +1132,7 @@ public Response setFileTabularTags(@Context 
ContainerRequestContext crc, @PathPa @Path("{id}/hasBeenDeleted") public Response getHasBeenDeleted(@Context ContainerRequestContext crc, @PathParam("id") String dataFileId) { return response(req -> { - DataFile dataFile = execCommand(new GetDataFileCommand(req, findDataFileOrDie(dataFileId))); + DataFile dataFile = execCommand(new GetDataFileCommand(req, findDataFileUserCanSeeOrDie(dataFileId, req))); return ok(dataFileServiceBean.hasBeenDeleted(dataFile)); }, getRequestUser(crc)); } @@ -1149,7 +1149,7 @@ public Response getHasBeenDeleted(@Context ContainerRequestContext crc, @PathPar public Response getFileCitationByVersion(@Context ContainerRequestContext crc, @PathParam("id") String fileIdOrPersistentId, @PathParam("dsVersionString") String versionNumber, @QueryParam("includeDeaccessioned") boolean includeDeaccessioned) { try { DataverseRequest req = createDataverseRequest(getRequestUser(crc)); - final DataFile df = execCommand(new GetDataFileCommand(req, findDataFileOrDie(fileIdOrPersistentId))); + final DataFile df = execCommand(new GetDataFileCommand(req, findDataFileUserCanSeeOrDie(fileIdOrPersistentId, req))); Dataset ds = df.getOwner(); DatasetVersion dsv = findDatasetVersionOrDie(req, versionNumber, ds, includeDeaccessioned, true); if (dsv == null) { @@ -1179,7 +1179,7 @@ public Response getFileVersionsList(@Context ContainerRequestContext crc, @QueryParam("offset") Integer offset) { try { DataverseRequest req = createDataverseRequest(getRequestUser(crc)); - final DataFile df = execCommand(new GetDataFileCommand(req, findDataFileOrDie(fileIdOrPersistentId))); + final DataFile df = execCommand(new GetDataFileCommand(req, findDataFileUserCanSeeOrDie(fileIdOrPersistentId, req))); FileMetadata fm = df.getFileMetadata(); if (fm == null) { return notFound(BundleUtil.getStringFromBundle("files.api.fileNotFound")); diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/UpdateDataverseCommand.java 
b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/UpdateDataverseCommand.java index 2a288e22dac..35c4d5bd049 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/UpdateDataverseCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/UpdateDataverseCommand.java @@ -87,7 +87,8 @@ protected Dataverse innerExecute(CommandContext ctxt) throws IllegalCommandExcep // This check is not recursive as all the values just report the immediate parent if (!oldDvType.equals(dataverse.getDataverseType()) || !oldDvName.equals(dataverse.getName()) - || !oldDvAlias.equals(dataverse.getAlias())) { + || !oldDvAlias.equals(dataverse.getAlias()) + || !oldDv.getLocallyFAIRRoleAssigneeIdentifiers().equals(dataverse.getLocallyFAIRRoleAssigneeIdentifiers())) { datasetsReindexRequired = true; } diff --git a/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java index 119a423f17a..e570b220e53 100644 --- a/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java @@ -69,7 +69,6 @@ import java.util.List; import java.util.Locale; import java.util.Map; -import java.util.Objects; import java.util.Set; import java.util.concurrent.ConcurrentHashMap; import java.util.concurrent.Future; @@ -112,8 +111,6 @@ import org.apache.tika.metadata.Metadata; import org.apache.tika.parser.ParseContext; import org.apache.tika.sax.BodyContentHandler; -import org.eclipse.microprofile.config.Config; -import org.eclipse.microprofile.config.ConfigProvider; import org.eclipse.microprofile.metrics.MetricUnits; import org.eclipse.microprofile.metrics.Timer; import org.eclipse.microprofile.metrics.annotation.Metric; @@ -124,7 +121,6 @@ public class IndexServiceBean { private static final Logger logger = Logger.getLogger(IndexServiceBean.class.getCanonicalName()); - private static final Config config = 
ConfigProvider.getConfig(); @PersistenceContext(unitName = "VDCNet-ejbPU") private EntityManager em; @@ -179,7 +175,6 @@ public class IndexServiceBean { public static final String discoverabilityPermissionSuffix = "_permission"; private static final String groupPrefix = "group_"; private static final String groupPerUserPrefix = "group_user"; - private static final String publicGroupIdString = "public"; private static final String publicGroupString = groupPrefix + "public"; public static final String PUBLISHED_STRING = "Published"; private static final String UNPUBLISHED_STRING = "Unpublished"; @@ -189,8 +184,6 @@ public class IndexServiceBean { public static final String HARVESTED = "Harvested"; private Dataverse rootDataverseCached; - private VariableMetadataUtil variableMetadataUtil; - @TransactionAttribute(REQUIRES_NEW) public Future indexDataverseInNewTransaction(Dataverse dataverse) throws SolrServerException, IOException{ return indexDataverse(dataverse, false); @@ -231,7 +224,11 @@ public Future indexDataverse(Dataverse dataverse, boolean processPaths) solrInputDocument.addField(SearchFields.DATAVERSE_CATEGORY, dataverse.getIndexableCategoryName()); if (dataverse.isReleased()) { solrInputDocument.addField(SearchFields.PUBLICATION_STATUS, PUBLISHED_STRING); - if (FeatureFlags.ADD_PUBLICOBJECT_SOLR_FIELD.enabled()) { + boolean isLocallyFAIR = dataverse.isLocallyFAIR(); + if(isLocallyFAIR) { + solrInputDocument.addField(SearchFields.LOCALLY_FAIR, isLocallyFAIR); + } + if (FeatureFlags.ADD_PUBLICOBJECT_SOLR_FIELD.enabled() && !isLocallyFAIR) { solrInputDocument.addField(SearchFields.PUBLIC_OBJECT, true); } solrInputDocument.addField(SearchFields.RELEASE_OR_CREATE_DATE, dataverse.getPublicationDate()); @@ -1030,7 +1027,11 @@ public SolrInputDocuments toSolrDocs(IndexableDataset indexableDataset, Set findDataversePerms(Dataverse dataverse) { List permStrings = new ArrayList<>(); if (hasBeenPublished(dataverse)) { - 
permStrings.add(IndexServiceBean.getPublicGroupString()); + Set<String> raIds = dataverse.getLocallyFAIRRoleAssigneeIdentifiers(); + if (raIds.isEmpty()) { + permStrings.add(IndexServiceBean.getPublicGroupString()); + } else { + raIds.stream() + .map(this::convertToIndexableString) + .filter(s -> s != null) + .forEach(permStrings::add); + } } + // And anyone who has permission to view the unpublished version permStrings.addAll(findDvObjectPerms(dataverse)); return permStrings; } - + public List<String> findDatasetVersionPerms(DatasetVersion version) { List<String> perms = new ArrayList<>(); if (version.isReleased()) { - perms.add(IndexServiceBean.getPublicGroupString()); - } + Set<String> raIds = version.getDataset().getOwner().getLocallyFAIRRoleAssigneeIdentifiers(); + if (raIds.isEmpty()) { + perms.add(IndexServiceBean.getPublicGroupString()); + } else { + raIds.stream() + .map(this::convertToIndexableString) + .filter(s -> s != null) + .forEach(perms::add); + } + } + // And anyone who has permission to view the unpublished version perms.addAll(findDvObjectPerms(version.getDataset())); return perms; } - public List<String> findDvObjectPerms(DvObject dvObject) { + private List<String> findDvObjectPerms(DvObject dvObject) { List<String> permStrings = new ArrayList<>(); Permission p = getRequiredSearchPermission(dvObject); @@ -135,5 +154,42 @@ private String getIndexableStringForUserOrGroup(RoleAssignee userOrGroup) { return null; } } + + +/** + * Converts a single role assignee identifier (e.g., "@john.doe", "&admins") to its + * indexable form for Solr (e.g., "group_user1", "group_admins") without any database lookup for groups.
+ * + * @param identifier Identifier prefixed with @ (user) or & (group) + * @return Indexable string for Solr, or null if conversion fails + */ +public String convertToIndexableString(String identifier) { + if (identifier == null || identifier.isEmpty()) { + return null; + } + + char prefix = identifier.charAt(0); + String value = identifier.substring(1); + + if (prefix == '@') { + // User identifier - need to extract the numeric ID + // Format: @userIdentifier -> group_user{databaseId} + AuthenticatedUser user = authSvc.getAuthenticatedUser(value); + if (user != null) { + return IndexServiceBean.getGroupPerUserPrefix() + user.getId(); + } else { + logger.fine("Could not find user for identifier: " + identifier); + return null; + } + } else if (prefix == '&') { + // Group alias - can use directly + // Format: &groupAlias -> group_groupAlias + return IndexServiceBean.getGroupPrefix() + value; + } else { + logger.warning("Unknown role assignee identifier format: " + identifier); + return null; + } +} + } diff --git a/src/main/java/edu/harvard/iq/dataverse/search/SolrIndexServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/search/SolrIndexServiceBean.java index af43493014a..90282712060 100644 --- a/src/main/java/edu/harvard/iq/dataverse/search/SolrIndexServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/search/SolrIndexServiceBean.java @@ -122,12 +122,7 @@ private List determineSolrDocsForFilesFromDataset(Map.Entry perms = new ArrayList<>(); - if (dataverse.isReleased()) { - perms.add(IndexServiceBean.getPublicGroupString()); - } else { - perms = searchPermissionsService.findDataversePerms(dataverse); - } + List<String> perms = searchPermissionsService.findDataversePerms(dataverse); Long noDatasetVersionForDataverses = null; DvObjectSolrDoc dvDoc = new DvObjectSolrDoc(dataverse.getId().toString(), IndexServiceBean.solrDocIdentifierDataverse + dataverse.getId(), noDatasetVersionForDataverses, dataverse.getName(), perms); return dvDoc; @@ -158,12 +153,7 @@ private
DvObjectSolrDoc constructDatafileSolrDoc(DataFileProxy fileProxy, List constructDatafileSolrDocsFromDataset(Dataset dataset) { List<DvObjectSolrDoc> datafileSolrDocs = new ArrayList<>(); for (DatasetVersion datasetVersionFileIsAttachedTo : datasetVersionsToBuildCardsFor(dataset)) { - List<String> perms = new ArrayList<>(); - if (datasetVersionFileIsAttachedTo.isReleased()) { - perms.add(IndexServiceBean.getPublicGroupString()); - } else { - perms = searchPermissionsService.findDatasetVersionPerms(datasetVersionFileIsAttachedTo); - } + List<String> perms = searchPermissionsService.findDatasetVersionPerms(datasetVersionFileIsAttachedTo); for (FileMetadata fileMetadata : datasetVersionFileIsAttachedTo.getFileMetadatas()) { Long fileId = fileMetadata.getDataFile().getId(); @@ -206,12 +196,8 @@ private DvObjectSolrDoc makeDatasetSolrDoc(DatasetVersion version) { String solrIdEnd = getDatasetOrDataFileSolrEnding(version.getVersionState()); String solrId = solrIdStart + solrIdEnd; String name = version.getTitle(); - List<String> perms = new ArrayList<>(); - if (version.isReleased()) { - perms.add(IndexServiceBean.getPublicGroupString()); - } else { - perms = searchPermissionsService.findDatasetVersionPerms(version); - } + List<String> perms = searchPermissionsService.findDatasetVersionPerms(version); + return new DvObjectSolrDoc(version.getDataset().getId().toString(), solrId, version.getId(), name, perms); } @@ -382,7 +368,7 @@ public IndexResponse indexPermissionsOnSelfAndChildren(DvObject definitionPoint) * them), the code below does a lightweight query to see how many fileMetadatas exist in it and, if it is equal to or below fileQueryMin, calls getFileMetadatas().size() to assure they are loaded * (before we pass the version into a new transaction where it will be detached and fileMetadatas can't be loaded). Calling getFileMetadatas().size() should be lightweight when the fileMetadatas are * loaded (first case) and done only when needed for the second case.
- * + * **/ List versionsToIndex = new ArrayList<>(); for (DatasetVersion version : datasetVersionsToBuildCardsFor(dataset)) { @@ -421,7 +407,7 @@ public void indexDatasetBatchInNewTransaction(List datasetIds, final int[] if(versions.size()>1) { Long releasedVersionId = null; Long draftVersionId = null; - + for (DatasetVersion version : versions) { if (version.isReleased()) { releasedVersionId = version.getId(); @@ -429,10 +415,10 @@ public void indexDatasetBatchInNewTransaction(List datasetIds, final int[] draftVersionId = version.getId(); } } - + populateChangedFileIds( - releasedVersionId, - draftVersionId, + releasedVersionId, + draftVersionId, changedFileIds ); } @@ -449,10 +435,10 @@ public void indexDatasetFilesInNewTransaction(List versions, fin if(versions.size()>1) { Long releasedVersionId = versions.get(versions.get(0).isReleased() ? 0 : 1).getId(); Long draftVersionId = versions.get(versions.get(0).isReleased() ? 1 : 0).getId(); - + populateChangedFileIds( - releasedVersionId, - draftVersionId, + releasedVersionId, + draftVersionId, changedFileIds ); } @@ -466,7 +452,7 @@ public void indexDatasetFilesInNewTransaction(List versions, fin /** * Retrieves the IDs of file metadatas that have changed between the released version * and the draft version of a dataset. - * + * * @param releasedVersionId the ID of the released dataset version * @param draftVersionId the ID of the draft dataset version * @param changedFileMetadataIds the list to populate with changed file metadata IDs @@ -477,8 +463,8 @@ public void populateChangedFileIds(Long releasedVersionId, Long draftVersionId, query.setParameter(2, draftVersionId); /* - * When the query was configured to return Long, it was returning Integer. - * The query has been changed to return Integer now. The code here is robust + * When the query was configured to return Long, it was returning Integer. + * The query has been changed to return Integer now. The code here is robust * if that changes in the future. 
*/ List queryResults = query.getResultList(); @@ -505,7 +491,7 @@ public void populateChangedFileIds(Long releasedVersionId, Long draftVersionId, } logger.fine("Found " + changedFileIds.size() + " datafiles whose metadata has changed between versions " + releasedVersionId + " and " + draftVersionId); } - + private void processDatasetVersionFiles(DatasetVersion version, final int[] fileCounter, int fileQueryMin, List changedFileIds) { List cachedPerms = searchPermissionsService.findDatasetVersionPerms(version); @@ -513,9 +499,9 @@ private void processDatasetVersionFiles(DatasetVersion version, Long versionId = version.getId(); List filesToReindexAsBatch = new ArrayList<>(); - // If the version is draft and there is a released version, + // If the version is draft and there is a released version, // we only need perm docs for the files with filemetadata changes == those in changedFileMetadataIds - + // Process files in batches of 100 int batchSize = 100; diff --git a/src/main/java/edu/harvard/iq/dataverse/search/SolrSearchResult.java b/src/main/java/edu/harvard/iq/dataverse/search/SolrSearchResult.java index a725b73ee70..11e2628411a 100644 --- a/src/main/java/edu/harvard/iq/dataverse/search/SolrSearchResult.java +++ b/src/main/java/edu/harvard/iq/dataverse/search/SolrSearchResult.java @@ -145,6 +145,8 @@ public class SolrSearchResult { private boolean datasetValid; + private boolean locallyFAIR; + public String getDvTree() { return dvTree; } @@ -1435,4 +1437,12 @@ public Long getDatasetCount() { public void setDatasetCount(Long datasetCount) { this.datasetCount = datasetCount; } + + public void setLocallyFAIR(Boolean locallyFAIR) { + this.locallyFAIR = Boolean.TRUE.equals(locallyFAIR); + } + public boolean isLocallyFAIR() { + return locallyFAIR; + } + } diff --git a/src/main/java/edu/harvard/iq/dataverse/search/SolrSearchServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/search/SolrSearchServiceBean.java index b265ad967d4..401bd58a70b 100644 --- 
a/src/main/java/edu/harvard/iq/dataverse/search/SolrSearchServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/search/SolrSearchServiceBean.java @@ -463,7 +463,9 @@ public SolrQueryResponse search( Boolean datasetValid = (Boolean) solrDocument.getFieldValue(SearchFields.DATASET_VALID); Long fileCount = (Long) solrDocument.getFieldValue(SearchFields.FILE_COUNT); Long datasetCount = (Long) solrDocument.getFieldValue(SearchFields.DATASET_COUNT); - + + Boolean locallyFAIR = (Boolean) solrDocument.getFieldValue(SearchFields.LOCALLY_FAIR); + List<String> matchedFields = new ArrayList<>(); SolrSearchResult solrSearchResult = new SolrSearchResult(query, name); @@ -546,6 +548,7 @@ public SolrQueryResponse search( solrSearchResult.setEmbargoEndDate(embargoEndDate); solrSearchResult.setRetentionEndDate(retentionEndDate); + solrSearchResult.setLocallyFAIR(locallyFAIR); /** * @todo start using SearchConstants class here */ diff --git a/src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java b/src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java index e1c7e69f7db..09e5cc61576 100644 --- a/src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java +++ b/src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java @@ -254,6 +254,25 @@ public enum FeatureFlags { * flag makes a reason required, both in the UI and API. */ REQUIRE_EMBARGO_REASON("require-embargo-reason"), + + /** + * Experimental: Allow Locally FAIR Data. With Locally FAIR, access to a + * collection and published data in it are restricted to the people/groups + * specified. For a non-privileged user, the collection, datasets, and files + * will not be returned in search results, requests to access the relevant pages + * will fail with 404 responses, etc. This functionality is explicitly + * experimental at present and will be confusing and/or ineffective if other + * settings for the collection, datasets, files are not appropriate for Locally
For example, using DataCite DOIs results in the datasets (and + * files is configured) being reported to DataCite and thus the fact of their + * existence and their metadata would be visible despite the Locally FAIR + * restriction. See the Guides for further considerations. + * + * @apiNote Raise flag by setting + * "dataverse.feature.allow-locally-fair-data" + * @since Dataverse 6.10 + */ + ALLOW_LOCALLY_FAIR_DATA("allow-locally-fair-data"), ; final String flag; diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java index 06ed300acdb..64679b169d1 100644 --- a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java +++ b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java @@ -1270,6 +1270,18 @@ public static JsonArrayBuilder getTabularFileTags(DataFile df) { return tabularTags; } + public static JsonObjectBuilder jsonLocallyFairRoleAssignees(Dataverse dataverse) { + JsonArrayBuilder assignees = Json.createArrayBuilder(); + dataverse.getLocallyFAIRRoleAssigneeIdentifiers().stream() + .sorted() + .forEach(assignees::add); + + return Json.createObjectBuilder() + .add("dataverseId", dataverse.getId()) + .add("dataverseAlias", dataverse.getAlias()) + .add("locallyFairRoleAssignees", assignees); + } + private static class DatasetFieldsToJson implements DatasetFieldWalker.Listener { Deque objectStack = new LinkedList<>(); diff --git a/src/main/java/propertyFiles/Bundle.properties b/src/main/java/propertyFiles/Bundle.properties index f636a346b6b..78e890a3a44 100644 --- a/src/main/java/propertyFiles/Bundle.properties +++ b/src/main/java/propertyFiles/Bundle.properties @@ -1047,6 +1047,8 @@ dataverse.delete.featuredItems.success=All featured items of this Dataverse have dataverse.createTemplate.error.jsonParseMetadataFields=Error parsing the POSTed template dataset fields: {0} dataverse.setDefaultTemplate.success=The default dataset template has been 
successfully set for this dataverse. dataverse.removeDefaultTemplate.success=The default dataset template has been successfully removed from this dataverse. +dataverse.locallyfair.label=Locally FAIR: Published access limited to the groups below: +dataverse.locallyfair.title=The published collection and its published datasets and their files are only visible to the users/groups listed. If the list is empty, standard Dataverse rules apply. # rolesAndPermissionsFragment.xhtml # advanced.xhtml @@ -1689,6 +1691,8 @@ dataset.versionUI.draft=Draft dataset.versionUI.inReview=In Review dataset.versionUI.unpublished=Unpublished dataset.versionUI.deaccessioned=Deaccessioned +dataset.versionUI.locallyFAIR=Locally FAIR +dataset.versionUI.locallyFAIR.tip=Only visible to the specified users/groups. See the Locally FAIR Data guide for more information. dataset.cite.title.released=DRAFT VERSION will be replaced in the citation with V1 once the dataset has been published. dataset.cite.title.draft=DRAFT VERSION will be replaced in the citation with the selected version once the dataset has been published. dataset.cite.title.deassessioned=DEACCESSIONED VERSION has been added to the citation for this version since it is no longer available. diff --git a/src/main/webapp/dataset.xhtml b/src/main/webapp/dataset.xhtml index f2f5b176dc7..1cd8e4f3cc7 100644 --- a/src/main/webapp/dataset.xhtml +++ b/src/main/webapp/dataset.xhtml @@ -173,6 +173,7 @@
+ + #{bundle['dataverse.locallyfair.label']} + + +
+ +
+
diff --git a/src/main/webapp/dataverse_header.xhtml b/src/main/webapp/dataverse_header.xhtml index 8b77b719917..7cc47435170 100644 --- a/src/main/webapp/dataverse_header.xhtml +++ b/src/main/webapp/dataverse_header.xhtml @@ -233,6 +233,7 @@

#{dataverse.name}

+
diff --git a/src/main/webapp/file.xhtml b/src/main/webapp/file.xhtml index 2292ebf4c45..2e38350807a 100644 --- a/src/main/webapp/file.xhtml +++ b/src/main/webapp/file.xhtml @@ -80,6 +80,7 @@ + diff --git a/src/main/webapp/resources/dataverse/userGroupSelect.xhtml b/src/main/webapp/resources/dataverse/userGroupSelect.xhtml new file mode 100644 index 00000000000..aa634afeb77 --- /dev/null +++ b/src/main/webapp/resources/dataverse/userGroupSelect.xhtml @@ -0,0 +1,52 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/src/main/webapp/roles-assign.xhtml b/src/main/webapp/roles-assign.xhtml index ef1b8a2e1a5..37c0e265a7e 100644 --- a/src/main/webapp/roles-assign.xhtml +++ b/src/main/webapp/roles-assign.xhtml @@ -4,7 +4,8 @@ xmlns:c="http://java.sun.com/jsp/jstl/core" xmlns:p="http://primefaces.org/ui" xmlns:jsf="http://xmlns.jcp.org/jsf" - xmlns:iqbs="http://xmlns.jcp.org/jsf/composite/iqbs"> + xmlns:iqbs="http://xmlns.jcp.org/jsf/composite/iqbs" + xmlns:dataverse="http://xmlns.jcp.org/jsf/composite/dataverse"> @@ -20,27 +21,11 @@ #{bundle['dataverse.permissions.usersOrGroups.assignDialog.userOrGroup']}
- - - - - - - - - - - - +
diff --git a/src/main/webapp/search-include-fragment.xhtml b/src/main/webapp/search-include-fragment.xhtml index 71c3775b833..735b586e447 100644 --- a/src/main/webapp/search-include-fragment.xhtml +++ b/src/main/webapp/search-include-fragment.xhtml @@ -520,6 +520,7 @@ +
@@ -584,6 +585,7 @@ +
diff --git a/src/test/java/edu/harvard/iq/dataverse/api/LocallyFairIT.java b/src/test/java/edu/harvard/iq/dataverse/api/LocallyFairIT.java new file mode 100644 index 00000000000..45d1e979e20 --- /dev/null +++ b/src/test/java/edu/harvard/iq/dataverse/api/LocallyFairIT.java @@ -0,0 +1,330 @@ +package edu.harvard.iq.dataverse.api; + +import edu.harvard.iq.dataverse.settings.SettingsServiceBean; +import io.restassured.RestAssured; +import io.restassured.path.json.JsonPath; +import io.restassured.response.Response; +import jakarta.json.Json; +import jakarta.json.JsonArrayBuilder; +import jakarta.ws.rs.core.Response.Status; +import org.junit.jupiter.api.AfterAll; +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.Test; + +import java.util.ArrayList; +import java.util.Arrays; +import java.util.List; +import java.util.logging.Logger; + +import static io.restassured.RestAssured.given; +import static org.hamcrest.CoreMatchers.*; +import static org.junit.jupiter.api.Assertions.assertTrue; + +/** + * Integration tests for the Locally FAIR mechanism. 
+ */ +public class LocallyFairIT { + + private static final Logger logger = Logger.getLogger(LocallyFairIT.class.getCanonicalName()); + + private List<String> dataverseAliases = new ArrayList<>(); + private List<String> datasetPids = new ArrayList<>(); + private List<String> usernames = new ArrayList<>(); + private String adminToken; + + @BeforeAll + public static void setUpClass() { + RestAssured.baseURI = UtilIT.getRestAssuredBaseUri(); + } + + @AfterAll + public static void tearDownClass() { + } + + @org.junit.jupiter.api.AfterEach + public void tearDown() { + if (adminToken == null) { + adminToken = getSuperuserToken(); + } + for (String datasetPid : datasetPids) { + UtilIT.destroyDataset(datasetPid, adminToken); + } + for (String dataverseAlias : dataverseAliases) { + UtilIT.deleteDataverse(dataverseAlias, adminToken); + } + for (String username : usernames) { + UtilIT.deleteUser(username); + } + dataverseAliases.clear(); + datasetPids.clear(); + usernames.clear(); + adminToken = null; + } + + private String getSuperuserToken() { + Response createResponse = UtilIT.createRandomUser(); + String adminApiToken = UtilIT.getApiTokenFromResponse(createResponse); + String username = UtilIT.getUsernameFromResponse(createResponse); + usernames.add(username); + UtilIT.setSuperuserStatus(username, true).then().assertThat().statusCode(Status.OK.getStatusCode()); + this.adminToken = adminApiToken; + return adminApiToken; + } + + /** + * Test CRUD of a collection's locally fair assignees. + * This checks that users can be added, listed, and removed from the locally FAIR list.
+ */ + @Test + public void testLocallyFairAssigneesCRUD() { + String superUserToken = getSuperuserToken(); + String dataverseAlias = UtilIT.createRandomCollectionGetAlias(superUserToken); + dataverseAliases.add(dataverseAlias); + Response userResponse = UtilIT.createRandomUser(); + String username = "@" + UtilIT.getUsernameFromResponse(userResponse); + usernames.add(UtilIT.getUsernameFromResponse(userResponse)); + String userToken = UtilIT.getApiTokenFromResponse(userResponse); + + // 1. Add locally fair assignee + addLocallyFairRoleAssignee(dataverseAlias, username, superUserToken) + .then().assertThat().statusCode(Status.OK.getStatusCode()) + .body("data.locallyFairRoleAssignees", hasItem(username)); + + // 2. List locally fair assignees + listLocallyFairRoleAssignees(dataverseAlias, superUserToken) + .then().assertThat().statusCode(Status.OK.getStatusCode()) + .body("data", hasItem(username)); + + // 3. Set locally fair assignees (replaces) + Response userResponse2 = UtilIT.createRandomUser(); + String userToken2 = UtilIT.getApiTokenFromResponse(userResponse2); + String username2 = "@" + UtilIT.getUsernameFromResponse(userResponse2); + usernames.add(UtilIT.getUsernameFromResponse(userResponse2)); + setLocallyFairRoleAssignees(dataverseAlias, Arrays.asList(username2), superUserToken) + .then().assertThat().statusCode(Status.OK.getStatusCode()) + .body("data.locallyFairRoleAssignees", hasItem(username2)) + .body("data.locallyFairRoleAssignees", not(hasItem(username))); + + // 4. Delete locally fair assignee + deleteLocallyFairRoleAssignee(dataverseAlias, username2, superUserToken) + .then().assertThat().statusCode(Status.OK.getStatusCode()) + .body("data.locallyFairRoleAssignees", not(hasItem(username2))); + + // 5. 
Test Forbidden for non-superuser + listLocallyFairRoleAssignees(dataverseAlias, userToken) + .then().assertThat().statusCode(Status.FORBIDDEN.getStatusCode()); + } + + /** + * Test that a user listed directly and via a group can access locally fair content. + * Also checks that a user NOT listed/in a group cannot access it. + */ + @Test + public void testLocallyFairAccessPermissions() { + String superUserToken = getSuperuserToken(); + String dvAlias = UtilIT.createRandomCollectionGetAlias(superUserToken); + dataverseAliases.add(dvAlias); + + Response dvResponse = UtilIT.exportDataverse(dvAlias, superUserToken); + Integer dataverseId = UtilIT.getDataverseIdFromResponse(dvResponse); + + // Create Users + Response directUserResponse = UtilIT.createRandomUser(); + String directUserToken = UtilIT.getApiTokenFromResponse(directUserResponse); + String directUsername = "@" + UtilIT.getUsernameFromResponse(directUserResponse); + usernames.add(UtilIT.getUsernameFromResponse(directUserResponse)); + + Response groupUserResponse = UtilIT.createRandomUser(); + String groupUserToken = UtilIT.getApiTokenFromResponse(groupUserResponse); + String groupUsername = "@" + UtilIT.getUsernameFromResponse(groupUserResponse); + usernames.add(UtilIT.getUsernameFromResponse(groupUserResponse)); + + Response unauthorizedUserResponse = UtilIT.createRandomUser(); + String unauthorizedUserToken = UtilIT.getApiTokenFromResponse(unauthorizedUserResponse); + usernames.add(UtilIT.getUsernameFromResponse(unauthorizedUserResponse)); + + // Create Group + String groupAlias = "testGroup" + UtilIT.getRandomString(4); + UtilIT.createGroup(dvAlias, groupAlias, "Test Group", superUserToken).then().assertThat().statusCode(Status.CREATED.getStatusCode()); + String groupIdentifier = "&explicit/" + dataverseId + "-" + groupAlias; + UtilIT.addToGroup(dvAlias, groupAlias, Arrays.asList(groupUsername),
superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Restrict Dataverse + setLocallyFairRoleAssignees(dvAlias, Arrays.asList(directUsername, groupIdentifier), superUserToken) + .then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Publish Dataverse + UtilIT.publishDataverseViaNativeApi(dvAlias, superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Verify Access + // Direct User + UtilIT.getDataverseWithOwners(dvAlias, directUserToken, false).then().assertThat().statusCode(Status.OK.getStatusCode()); + // Group User + UtilIT.getDataverseWithOwners(dvAlias, groupUserToken, false).then().assertThat().statusCode(Status.OK.getStatusCode()); + // Unauthorized User + UtilIT.getDataverseWithOwners(dvAlias, unauthorizedUserToken, false).then().assertThat().statusCode(Status.NOT_FOUND.getStatusCode()); + // Anonymous User + UtilIT.getDataverseWithOwners(dvAlias, null, false).then().assertThat().statusCode(Status.NOT_FOUND.getStatusCode()); + } + + /** + * Test that the Locally FAIR mechanism works with collections, datasets, and datafiles. + * Verifies 404 for unauthorized users on all object types. 
+ */ + @Test + public void testLocallyFairAcrossAllObjectTypes() { + String superUserToken = getSuperuserToken(); + String dvAlias = UtilIT.createRandomCollectionGetAlias(superUserToken); + dataverseAliases.add(dvAlias); + + // Create Dataset + Response createDatasetResponse = UtilIT.createRandomDatasetViaNativeApi(dvAlias, superUserToken); + String datasetPid = UtilIT.getDatasetPersistentIdFromResponse(createDatasetResponse); + datasetPids.add(datasetPid); + Integer datasetId = UtilIT.getDatasetIdFromResponse(createDatasetResponse); + + // Upload File + Response uploadFileResponse = UtilIT.uploadFileViaNative(Integer.toString(datasetId), "scripts/search/data/binary/trees.zip", superUserToken); + Integer fileId = UtilIT.getDataFileIdFromResponse(uploadFileResponse); + + // Publish all + UtilIT.publishDataverseViaNativeApi(dvAlias, superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + UtilIT.publishDatasetViaNativeApi(datasetPid, "major", superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Restrict Dataverse + Response authorizedUserResponse = UtilIT.createRandomUser(); + String authorizedUserToken = UtilIT.getApiTokenFromResponse(authorizedUserResponse); + String authorizedUsername = "@" + UtilIT.getUsernameFromResponse(authorizedUserResponse); + usernames.add(UtilIT.getUsernameFromResponse(authorizedUserResponse)); + addLocallyFairRoleAssignee(dvAlias, authorizedUsername, superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + + Response unauthorizedUserResponse = UtilIT.createRandomUser(); + String unauthorizedUserToken = UtilIT.getApiTokenFromResponse(unauthorizedUserResponse); + usernames.add(UtilIT.getUsernameFromResponse(unauthorizedUserResponse)); + + // 1. 
Check Dataverse + UtilIT.getDataverseWithOwners(dvAlias, authorizedUserToken, false).then().assertThat().statusCode(Status.OK.getStatusCode()); + UtilIT.getDataverseWithOwners(dvAlias, unauthorizedUserToken, false).then().assertThat().statusCode(Status.NOT_FOUND.getStatusCode()); + + // 2. Check Dataset + UtilIT.nativeGetUsingPersistentId(datasetPid, authorizedUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + UtilIT.nativeGetUsingPersistentId(datasetPid, unauthorizedUserToken).then().assertThat().statusCode(Status.NOT_FOUND.getStatusCode()); + + // 3. Check Datafile + UtilIT.getFileData(fileId.toString(), authorizedUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + UtilIT.getFileData(fileId.toString(), unauthorizedUserToken).then().assertThat().statusCode(Status.NOT_FOUND.getStatusCode()); + } + + /** + * Test that locally fair content doesn't appear in search results for non-authorized users and does for those who can see it. + */ + @Test + public void testLocallyFairSearchVisibility() { + String superUserToken = getSuperuserToken(); + String dvAlias = UtilIT.createRandomCollectionGetAlias(superUserToken); + dataverseAliases.add(dvAlias); + String dvName = JsonPath.from(UtilIT.getDataverseWithOwners(dvAlias, superUserToken, false).body().asString()).getString("data.name"); + + // Restrict Dataverse + Response authorizedUserResponse = UtilIT.createRandomUser(); + String authorizedUserToken = UtilIT.getApiTokenFromResponse(authorizedUserResponse); + String authorizedUsername = "@" + UtilIT.getUsernameFromResponse(authorizedUserResponse); + usernames.add(UtilIT.getUsernameFromResponse(authorizedUserResponse)); + addLocallyFairRoleAssignee(dvAlias, authorizedUsername, superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Publish + UtilIT.publishDataverseViaNativeApi(dvAlias, superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Wait for index + 
UtilIT.sleepForSearch(dvName, superUserToken, "", 1, 5); + + // Unauthorized search + Response unauthorizedUserResponse = UtilIT.createRandomUser(); + String unauthorizedUserToken = UtilIT.getApiTokenFromResponse(unauthorizedUserResponse); + usernames.add(UtilIT.getUsernameFromResponse(unauthorizedUserResponse)); + UtilIT.search("name:\"" + dvName + "\"", unauthorizedUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()) + .body("data.total_count", equalTo(0)); + + // Authorized search + UtilIT.search("name:\"" + dvName + "\"", authorizedUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()) + .body("data.total_count", equalTo(1)) + .body("data.items[0].name", equalTo(dvName)); + } + + /** + * Test reindexing behavior when a parent collection becomes restricted. + * A dataset published normally should become hidden after its parent is restricted and it is reindexed. + */ + @Test + public void testReindexingMakesDatasetLocallyFair() { + String superUserToken = getSuperuserToken(); + String parentDv = UtilIT.createRandomCollectionGetAlias(superUserToken); + dataverseAliases.add(parentDv); + Response createDatasetResponse = UtilIT.createRandomDatasetViaNativeApi(parentDv, superUserToken); + String datasetPid = UtilIT.getDatasetPersistentIdFromResponse(createDatasetResponse); + datasetPids.add(datasetPid); + + // Publish normally + UtilIT.publishDataverseViaNativeApi(parentDv, superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + UtilIT.publishDatasetViaNativeApi(datasetPid, "major", superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Wait for search + UtilIT.sleepForSearch("\"" + datasetPid + "\"", null, "", 1, 5); + + // Verify publicly visible + UtilIT.search("\"" + datasetPid + "\"", null).then().assertThat().statusCode(Status.OK.getStatusCode()) + .body("data.total_count", equalTo(1)); + + // Restrict parent + Response authorizedUserResponse = UtilIT.createRandomUser(); + String 
authorizedUserToken = UtilIT.getApiTokenFromResponse(authorizedUserResponse); + String authorizedUsername = "@" + UtilIT.getUsernameFromResponse(authorizedUserResponse); + usernames.add(UtilIT.getUsernameFromResponse(authorizedUserResponse)); + addLocallyFairRoleAssignee(parentDv, authorizedUsername, superUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Reindex dataset + UtilIT.reindexDataset(datasetPid).then().assertThat().statusCode(Status.OK.getStatusCode()); + + // Wait for reindex to propagate (should disappear for anonymous) + boolean disappeared = false; + for (int i = 0; i < 10; i++) { + Response searchResp = UtilIT.search("\"" + datasetPid + "\"", null); + if (searchResp.jsonPath().getInt("data.total_count") == 0) { + disappeared = true; + break; + } + try { + Thread.sleep(2000); + } catch (InterruptedException e) { + Thread.currentThread().interrupt(); + break; + } + } + assertTrue(disappeared, "Dataset should have disappeared from search for anonymous users"); + + // Verify authorized user can still see it in search + UtilIT.search("\"" + datasetPid + "\"", authorizedUserToken).then().assertThat().statusCode(Status.OK.getStatusCode()) + .body("data.total_count", equalTo(1)); + } + + private Response listLocallyFairRoleAssignees(String dvIdtf, String apiToken) { + return given().header("X-Dataverse-key", apiToken) + .get("/api/dataverses/" + dvIdtf + "/locallyFairRoleAssignees"); + } + + private Response setLocallyFairRoleAssignees(String dvIdtf, List<String> roleAssigneeIdentifiers, String apiToken) { + return given().header("X-Dataverse-key", apiToken) + .contentType("application/json") + .body(roleAssigneeIdentifiers) + .put("/api/dataverses/" + dvIdtf + "/locallyFairRoleAssignees"); + } + + private Response addLocallyFairRoleAssignee(String dvIdtf, String roleAssigneeIdentifier, String apiToken) { + return given().header("X-Dataverse-key", apiToken) + .put("/api/dataverses/" + dvIdtf + "/locallyFairRoleAssignees/" + roleAssigneeIdentifier); + } + + private Response
deleteLocallyFairRoleAssignee(String dvIdtf, String roleAssigneeIdentifier, String apiToken) { + return given().header("X-Dataverse-key", apiToken) + .delete("/api/dataverses/" + dvIdtf + "/locallyFairRoleAssignees/" + roleAssigneeIdentifier); + } +} diff --git a/src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java b/src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java index e335521c881..e63944c58c3 100644 --- a/src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java +++ b/src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java @@ -59,6 +59,7 @@ public class UtilIT { public static final String API_TOKEN_HTTP_HEADER = "X-Dataverse-key"; private static final String USERNAME_KEY = "userName"; + private static final String PERSISTENTUSERID_KEY = "persistentUserId"; private static final String EMAIL_KEY = "email"; private static final String API_TOKEN_KEY = "apiToken"; private static final String BUILTIN_USER_KEY = "burrito"; @@ -309,6 +310,11 @@ static String getUsernameFromResponse(Response createUserResponse) { JsonPath createdUser = JsonPath.from(createUserResponse.body().asString()); String username = createdUser.getString("data.user." + USERNAME_KEY); logger.info("Username found in create user response: " + username); + //Support for when user is created via a call to /api/users/:me which doesn't return username + if( username == null ) { + username = createdUser.getString("data." 
+ PERSISTENTUSERID_KEY); + logger.info("Username found via persistentUserId in create user response: " + username); + } return username; } @@ -1969,9 +1975,11 @@ static Response getFileWithOwners(String datafileId, String apiToken, boolean r } static Response getDataverseWithOwners(String alias, String apiToken, boolean returnOwners) { - return given() - .header(API_TOKEN_HTTP_HEADER, apiToken) - .get("/api/dataverses/" + RequestSpecification rs = given(); + if(apiToken != null) { + rs = rs.header(API_TOKEN_HTTP_HEADER, apiToken); + } + return rs.get("/api/dataverses/" + alias + (returnOwners ? "/?returnOwners=true" : "")); } diff --git a/tests/integration-tests.txt b/tests/integration-tests.txt index 51253928df9..3fa22e41f9c 100644 --- a/tests/integration-tests.txt +++ b/tests/integration-tests.txt @@ -1 +1 @@ -DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,HarvestingClientsIT,MoveIT,MakeDataCountApiIT,FileTypeDetectionIT,EditDDIIT,ExternalToolsIT,AccessIT,DuplicateFilesIT,DownloadFilesIT,LinkIT,DeleteUsersIT,DeactivateUsersIT,AuxiliaryFilesIT,InvalidCharactersIT,LicensesIT,NotificationsIT,BagIT,MetadataBlocksIT,NetcdfIT,SignpostingIT,FitsIT,LogoutIT,DataRetrieverApiIT,ProvIT,S3AccessIT,OpenApiIT,InfoIT,DatasetFieldsIT,SavedSearchIT,DatasetTypesIT,DataverseFeaturedItemsIT,SendFeedbackApiIT,CustomizationIT,JsonLDExportIT,WorkflowsIT,LDNInboxIT,LocalContextsIT 
+DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,HarvestingClientsIT,MoveIT,MakeDataCountApiIT,FileTypeDetectionIT,EditDDIIT,ExternalToolsIT,AccessIT,DuplicateFilesIT,DownloadFilesIT,LinkIT,DeleteUsersIT,DeactivateUsersIT,AuxiliaryFilesIT,InvalidCharactersIT,LicensesIT,NotificationsIT,BagIT,MetadataBlocksIT,NetcdfIT,SignpostingIT,FitsIT,LogoutIT,DataRetrieverApiIT,ProvIT,S3AccessIT,OpenApiIT,InfoIT,DatasetFieldsIT,SavedSearchIT,DatasetTypesIT,DataverseFeaturedItemsIT,SendFeedbackApiIT,CustomizationIT,JsonLDExportIT,WorkflowsIT,LDNInboxIT,LocalContextsIT,LocallyFairIT
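The new JsonPrinter.jsonLocallyFairRoleAssignees method pushes the assignee identifiers through .sorted(), so the array in the API response comes back in natural string order ("&group" identifiers sort before "@user" identifiers). A minimal, dependency-free sketch of that ordering contract; the class and method names here are hypothetical, not part of the patch:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.List;

// Hypothetical sketch: mirrors the .sorted() step in jsonLocallyFairRoleAssignees,
// which emits locally FAIR role assignee identifiers in natural string order so
// the API response is deterministic. "@..." marks users, "&..." marks groups.
public class AssigneeSketch {

    static List<String> sortedAssignees(Collection<String> identifiers) {
        List<String> sorted = new ArrayList<>(identifiers);
        Collections.sort(sorted);
        return sorted;
    }

    public static void main(String[] args) {
        // "&" (0x26) sorts before "@" (0x40), so group identifiers come first.
        System.out.println(sortedAssignees(
                List.of("@TestUser", "&explicit/1-testGroup", "&maildomain/harvard.edu")));
    }
}
```

The deterministic order is what lets LocallyFairIT assert on response bodies without worrying about set iteration order.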
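testReindexingMakesDatasetLocallyFair hand-rolls a loop that waits for the reindex to propagate before asserting. The same shape can be factored into a reusable poll-until helper; this sketch uses hypothetical names (PollSketch, await) and preserves the thread's interrupt flag rather than swallowing InterruptedException:

```java
import java.util.function.BooleanSupplier;

// Hypothetical sketch of the "wait for reindex to propagate" loop: retry a
// condition up to maxTries times, sleeping between attempts, and keep the
// thread's interrupt status visible instead of swallowing the exception.
public class PollSketch {

    static boolean await(BooleanSupplier condition, int maxTries, long sleepMillis) {
        for (int i = 0; i < maxTries; i++) {
            if (condition.getAsBoolean()) {
                return true;
            }
            try {
                Thread.sleep(sleepMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // restore the flag for callers
                return false;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        int[] attempts = {0};
        // Condition becomes true on the third check, as a reindex eventually would.
        boolean ok = await(() -> ++attempts[0] >= 3, 10, 1L);
        System.out.println(ok + " after " + attempts[0] + " attempts");
    }
}
```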
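The UtilIT.getDataverseWithOwners change attaches the X-Dataverse-key header only when an apiToken is supplied, so the same helper can drive both the authenticated and the anonymous 404 checks. A hypothetical sketch of that optional-header pattern, independent of REST Assured:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the optional-header pattern in the UtilIT change:
// the X-Dataverse-key header is added only when an API token is present,
// letting one helper exercise both authenticated and anonymous requests.
public class HeaderSketch {

    static Map<String, String> headersFor(String apiToken) {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("Accept", "application/json");
        if (apiToken != null) {
            headers.put("X-Dataverse-key", apiToken);
        }
        return headers;
    }

    public static void main(String[] args) {
        System.out.println(headersFor(null).keySet());        // anonymous request
        System.out.println(headersFor("xxxxxxxx").keySet());  // authenticated request
    }
}
```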