
Commit 607d3fe

Merge branch 'develop' into 11323-includeDeaccesioned-compare-ds-versions
2 parents 257c9a5 + efbbd18

38 files changed: 1335 additions & 316 deletions

.github/workflows/deploy_beta_testing.yml

Lines changed: 1 addition & 1 deletion

@@ -68,7 +68,7 @@ jobs:
          overwrite: true

      - name: Execute payara war deployment remotely
-       uses: appleboy/ssh-action@v1.2.1
+       uses: appleboy/ssh-action@v1.2.2
        env:
          INPUT_WAR_FILE: ${{ env.war_file }}
        with:
Lines changed: 1 addition & 0 deletions

@@ -0,0 +1 @@
The tutorial on running Dataverse in Docker has been updated to include [how to load a metadata block](https://dataverse-guide--11204.org.readthedocs.build/en/11204/container/running/demo.html#additional-metadata-blocks) and then update Solr to know about the new fields. See also #11004 and #11204.
Lines changed: 6 additions & 0 deletions

@@ -0,0 +1,6 @@
### Modify Assign Role API error message

The error message no longer specifies "User" and "Dataverse", since the endpoint is also used for groups and datasets.

Generic message: datasets.api.grant.role.assignee.has.role.error=Role has already been granted.

Note: re-translation is needed.
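
For reference, a minimal sketch of a call that can surface this message, using the dataset role-assignment endpoint (the dataset id, assignee, and role below are hypothetical placeholders):

# hypothetical placeholders: $DATASET_ID, assignee "@jsmith", role "curator"
curl -H "X-Dataverse-key:$API_TOKEN" -X POST -H "Content-Type: application/json" "$SERVER_URL/api/datasets/$DATASET_ID/assignments" -d '{"assignee": "@jsmith", "role": "curator"}'

Repeating the same assignment now returns the generic message above.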
Lines changed: 5 additions & 0 deletions

@@ -0,0 +1,5 @@
### Edit Dataset Metadata API extension

- This endpoint now allows removing fields (by sending empty values), as long as they are not required by the dataset.
- A new optional ``sourceInternalVersionNumber`` query parameter prevents inconsistencies from updates that other users may make while a dataset is being edited.
doc/sphinx-guides/source/_static/api/dataset-edit-metadata-delete-fields-sample.json

Lines changed: 50 additions & 0 deletions

@@ -0,0 +1,50 @@
{
  "fields": [
    {
      "typeName": "alternativeTitle",
      "multiple": true,
      "typeClass": "primitive",
      "value": []
    },
    {
      "typeName": "distributor",
      "multiple": true,
      "typeClass": "compound",
      "value": [
        {
          "distributorName": {
            "typeName": "distributorName",
            "multiple": false,
            "typeClass": "primitive",
            "value": ""
          },
          "distributorAffiliation": {
            "typeName": "distributorAffiliation",
            "multiple": false,
            "typeClass": "primitive",
            "value": ""
          }
        }
      ]
    },
    {
      "typeName": "author",
      "value": [
        {
          "authorName": {
            "typeName": "authorName",
            "value": "Belicheck, Bill"
          },
          "authorAffiliation": {
            "typeName": "authorAffiliation",
            "value": ""
          }
        }
      ]
    }
  ]
}

doc/sphinx-guides/source/_static/api/harvesting-client.json

Lines changed: 1 addition & 0 deletions

@@ -7,5 +7,6 @@
  "metadataFormat": "oai_dc",
  "customHeaders": "x-oai-api-key: xxxyyyzzz",
  "set": "user-lmops",
+ "schedule": "Weekly, Sat 5 AM",
  "allowHarvestingMissingCVV": true
}
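
For context, this sample JSON is the payload for creating a harvesting client via the native API. A minimal sketch of that call (the client nickname ``myClient`` is a hypothetical placeholder):

# nickname "myClient" is a hypothetical placeholder
curl -H "X-Dataverse-key:$API_TOKEN" -X POST -H "Content-Type: application/json" "$SERVER_URL/api/harvest/clients/myClient" --upload-file harvesting-client.json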

doc/sphinx-guides/source/api/native-api.rst

Lines changed: 24 additions & 0 deletions

@@ -2135,6 +2135,30 @@ The fully expanded example above (without environment variables) looks like this

For these edits your JSON file need only include those dataset fields which you would like to edit. A sample JSON file may be downloaded here: :download:`dataset-edit-metadata-sample.json <../_static/api/dataset-edit-metadata-sample.json>`

This endpoint also allows removing fields, as long as they are not required by the dataset. To remove an individual field, send an empty value (``""``); to remove a multiple field, send an empty array (``[]``). A sample JSON file for removing fields may be downloaded here: :download:`dataset-edit-metadata-delete-fields-sample.json <../_static/api/dataset-edit-metadata-delete-fields-sample.json>`
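
A minimal sketch of a removal request using that sample file (assuming ``replace=true`` is required so that the empty values overwrite, and thus remove, the existing ones; environment variables as in the examples above):

.. code-block:: bash

  # assumption: replace=true makes the empty values overwrite (remove) existing ones
  curl -H "X-Dataverse-key: $API_TOKEN" -X PUT "$SERVER_URL/api/datasets/:persistentId/editMetadata?persistentId=$PERSISTENT_IDENTIFIER&replace=true" --upload-file dataset-edit-metadata-delete-fields-sample.json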
If another user updates the dataset version metadata before you send your update request, data inconsistencies may occur. To prevent this, you can use the optional ``sourceInternalVersionNumber`` query parameter, which must contain the internal version number of the dataset version you are updating. Internal version numbers increase sequentially with each update.

If this parameter is provided, the update will proceed only if the internal version number is unchanged; otherwise, the request will fail with an error.

Example using ``sourceInternalVersionNumber``:

.. code-block:: bash

  export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
  export SERVER_URL=https://demo.dataverse.org
  export PERSISTENT_IDENTIFIER=doi:10.5072/FK2/BCCP9Z
  export SOURCE_INTERNAL_VERSION_NUMBER=5

  curl -H "X-Dataverse-key: $API_TOKEN" -X PUT "$SERVER_URL/api/datasets/:persistentId/editMetadata?persistentId=$PERSISTENT_IDENTIFIER&replace=true&sourceInternalVersionNumber=$SOURCE_INTERNAL_VERSION_NUMBER" --upload-file dataset-update-metadata.json

The fully expanded example above (without environment variables) looks like this:

.. code-block:: bash

  curl -H "X-Dataverse-key: xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X PUT "https://demo.dataverse.org/api/datasets/:persistentId/editMetadata?persistentId=doi:10.5072/FK2/BCCP9Z&replace=true&sourceInternalVersionNumber=5" --upload-file dataset-update-metadata.json

Delete Dataset Metadata
~~~~~~~~~~~~~~~~~~~~~~~

doc/sphinx-guides/source/container/running/demo.rst

Lines changed: 48 additions & 0 deletions

@@ -213,6 +213,54 @@ If you want to specify fewer previewers, you can edit the ``compose.yml`` file.

``INCLUDE_PREVIEWERS=text,html,pdf,csv``

.. _additional-metadata-blocks:

Additional Metadata Blocks
++++++++++++++++++++++++++

Metadata fields such as "Title" are part of a metadata block such as "Citation". See :ref:`metadata-references` in the User Guide for the metadata blocks that ship with Dataverse.

At a high level, we will load a metadata block and then adjust our Solr configuration to know about its fields.

Take care when adding metadata blocks: there is no way to `preview <https://github.com/IQSS/dataverse/issues/2551>`_ or `delete <https://github.com/IQSS/dataverse/issues/9628>`_ a metadata block, so please use a throwaway environment.

:ref:`metadata-references` lists some experimental metadata blocks. In the example below, we'll use the CodeMeta block.

First, download a metadata block or create one by following :doc:`/admin/metadatacustomization` in the Admin Guide.

Load the metadata block like this:

``curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/tab-separated-values" -X POST --upload-file codemeta.tsv``

Next, reconfigure Solr to know about the new metadata block.

You can back up your existing Solr schema like this:

``cp docker-dev-volumes/solr/data/data/collection1/conf/schema.xml docker-dev-volumes/solr/data/data/collection1/conf/schema.xml.orig``

You can see the fields Solr currently knows about like this:

``curl http://localhost:8983/solr/collection1/schema/fields``

Update your Solr schema with the following command:

``curl http://localhost:8080/api/admin/index/solr/schema | docker run -i --rm -v ./docker-dev-volumes/solr/data:/var/solr gdcc/configbaker:unstable update-fields.sh /var/solr/data/collection1/conf/schema.xml``

Then, reload Solr:

``curl "http://localhost:8983/solr/admin/cores?action=RELOAD&core=collection1"``

You can get a diff of your old and new Solr schema like this:

``diff docker-dev-volumes/solr/data/data/collection1/conf/schema.xml.orig docker-dev-volumes/solr/data/data/collection1/conf/schema.xml``

You should be able to see the new fields from the metadata block you added in the output of:

``curl http://localhost:8983/solr/collection1/schema/fields``

At this point you can proceed with testing the metadata block in the Dataverse UI. First you'll need to enable it for a collection (see :ref:`general-information` in the User Guide section about collections). Afterward, create a new dataset, save it, and then edit its metadata. Your metadata block should appear.
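
You can also confirm the block was loaded by listing the metadata blocks Dataverse knows about; the new block (e.g. ``codeMeta20`` for CodeMeta, though the exact name depends on the TSV you loaded) should appear in the list:

``curl http://localhost:8080/api/metadatablocks``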
Next Steps
----------

doc/sphinx-guides/source/developers/making-releases.rst

Lines changed: 5 additions & 1 deletion

@@ -331,15 +331,19 @@ Deploy Final Release on Demo

Above you already did the hard work of deploying a release candidate to https://demo.dataverse.org. It should be relatively straightforward to undeploy the release candidate and deploy the final release.

+.. _update-schemaspy:
+
Update SchemaSpy
----------------

-We maintain SchemaSpy at URLs like https://guides.dataverse.org/en/6.3/schemaspy/index.html
+We maintain SchemaSpy at URLs like https://guides.dataverse.org/en/latest/schemaspy/index.html and (for example) https://guides.dataverse.org/en/6.6/schemaspy/index.html

Get the attention of the core team and ask someone to update it for the new release.

Consider updating `the thread <https://groups.google.com/g/dataverse-community/c/f95DQU-wlVM/m/cvUp3E9OBgAJ>`_ on the mailing list once the update is in place.

+See also :ref:`schemaspy`.
+
Alert Translators About the New Release
---------------------------------------

doc/sphinx-guides/source/developers/testing.rst

Lines changed: 0 additions & 19 deletions

@@ -477,25 +477,6 @@ reduced anyway.

You will obviously have to utilize caching functionality of your CI service or do proper Docker layering.

-The Phoenix Server
-~~~~~~~~~~~~~~~~~~
-
-How the Phoenix Tests Work
-^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-A server at http://phoenix.dataverse.org has been set up to test the latest code from the develop branch. Testing is done using chained builds of Jenkins jobs:
-
-- A war file is built from the latest code in develop: https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-build-develop/
-- The resulting war file is deployed to the Phoenix server: https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-deploy-develop/
-- REST Assured tests are run across the wire from the Jenkins server to the Phoenix server: https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-apitest-develop/
-
-How to Run the Phoenix Tests
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-- Take a quick look at http://phoenix.dataverse.org to make sure the server is up and running Dataverse. If it's down, fix it.
-- Log into Jenkins and click "Build Now" at https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-build-develop/
-- Wait for all three chained Jenkins jobs to complete and note whether they passed or failed. If you see a failure, open a GitHub issue or at least get the attention of some developers.

Accessibility Testing
---------------------
