The tutorial on running Dataverse in Docker has been updated to include [how to load a metadata block](https://dataverse-guide--11204.org.readthedocs.build/en/11204/container/running/demo.html#additional-metadata-blocks) and then update Solr to know about the new fields. See also #11004 and #11204.
doc/sphinx-guides/source/api/native-api.rst (24 additions, 0 deletions)
@@ -2135,6 +2135,30 @@ The fully expanded example above (without environment variables) looks like this
For these edits your JSON file need only include those dataset fields which you would like to edit. A sample JSON file may be downloaded here: :download:`dataset-edit-metadata-sample.json <../_static/api/dataset-edit-metadata-sample.json>`
This endpoint also allows removing fields, as long as they are not required by the dataset. To remove a field, send an empty value (``""``) for individual fields. For multiple fields, send an empty array (``[]``). A sample JSON file for removing fields may be downloaded here: :download:`dataset-edit-metadata-delete-fields-sample.json <../_static/api/dataset-edit-metadata-delete-fields-sample.json>`
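For illustration, a minimal sketch of such a removal payload is shown below, written as a shell heredoc so it can be saved to a file and uploaded with the edit-metadata call described above. The field names used here (``alternativeURL``, ``keyword``) are assumed examples of non-required fields, not the contents of the downloadable sample:

.. code-block:: bash

  # Hypothetical removal payload: an empty string ("") clears a single-value field,
  # an empty array ([]) clears a multi-value field. Field names are examples only.
  cat > remove-fields.json <<'EOF'
  {
    "fields": [
      { "typeName": "alternativeURL", "value": "" },
      { "typeName": "keyword", "value": [] }
    ]
  }
  EOF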
If another user updates the dataset version metadata before you send the update request, data inconsistencies may occur. To prevent this, you can use the optional ``sourceInternalVersionNumber`` query parameter. This parameter must include the internal version number corresponding to the dataset version being updated. Note that internal version numbers increase sequentially with each version update.
If this parameter is provided, the update will proceed only if the internal version number remains unchanged. Otherwise, the request will fail with an error.
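As a rough sketch of how this might look (assuming the ``editMetadata`` endpoint documented in this section and the ``SERVER_URL``/``API_TOKEN``/``PERSISTENT_ID`` conventions used elsewhere in this guide; the version number ``3`` is just an example):

.. code-block:: bash

  export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
  export SERVER_URL=https://demo.dataverse.org
  export PERSISTENT_ID=doi:10.5072/FK2/J8SJZB

  # Apply the edit only if the dataset version is still at internal version number 3;
  # if another user has updated it in the meantime, the request fails with an error.
  curl -H "X-Dataverse-key:$API_TOKEN" -X PUT \
    "$SERVER_URL/api/datasets/:persistentId/editMetadata?persistentId=$PERSISTENT_ID&sourceInternalVersionNumber=3" \
    --upload-file dataset-edit-metadata-sample.json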
doc/sphinx-guides/source/container/running/demo.rst (48 additions, 0 deletions)
@@ -213,6 +213,54 @@ If you want to specify fewer previewers, you can edit the ``compose.yml`` file.
``INCLUDE_PREVIEWERS=text,html,pdf,csv``
.. _additional-metadata-blocks:
Additional Metadata Blocks
++++++++++++++++++++++++++
Metadata fields such as "Title" are part of a metadata block such as "Citation". See :ref:`metadata-references` in the User Guide for the metadata blocks that ship with Dataverse.
At a high level, we will be loading a metadata block and then adjusting our Solr config to know about it.
Care should be taken when adding additional metadata blocks. There is no way to `preview <https://github.com/IQSS/dataverse/issues/2551>`_ or `delete <https://github.com/IQSS/dataverse/issues/9628>`_ a metadata block, so please use a throwaway environment.
:ref:`metadata-references` lists some experimental metadata blocks. In the example below, we'll use the CodeMeta block.
First, download a metadata block or create one by following :doc:`/admin/metadatacustomization` in the Admin Guide.
Load the metadata block like this:
``curl http://localhost:8080/api/admin/datasetfield/load -H "Content-type: text/tab-separated-values" -X POST --upload-file codemeta.tsv``
Next, reconfigure Solr to know about the new metadata block.
You can back up your existing Solr schema like this:
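The exact command depends on how your containers are set up; as a minimal sketch, assuming the running Solr container is named ``solr`` and uses the stock ``collection1`` core layout:

.. code-block:: bash

  # Copy the current schema out of the running Solr container so you can restore it later.
  docker cp solr:/var/solr/data/collection1/conf/schema.xml ./schema.xml.backup

The tutorial then updates this schema so that Solr knows about the fields in the new block and reloads Solr before you continue.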
At this point you can proceed with testing the metadata block in the Dataverse UI. First you'll need to enable it for a collection (see :ref:`general-information` in the User Guide section on collections). Afterwards, create a new dataset, save it, and then edit the metadata for that dataset. Your metadata block should appear.
doc/sphinx-guides/source/developers/making-releases.rst (5 additions, 1 deletion)
@@ -331,15 +331,19 @@ Deploy Final Release on Demo
Above you already did the hard work of deploying a release candidate to https://demo.dataverse.org. It should be relatively straightforward to undeploy the release candidate and deploy the final release.
.. _update-schemaspy:
Update SchemaSpy
----------------
- We maintain SchemaSpy at URLs like https://guides.dataverse.org/en/6.3/schemaspy/index.html
+ We maintain SchemaSpy at URLs like https://guides.dataverse.org/en/latest/schemaspy/index.html and (for example) https://guides.dataverse.org/en/6.6/schemaspy/index.html
Get the attention of the core team and ask someone to update it for the new release.
Consider updating `the thread <https://groups.google.com/g/dataverse-community/c/f95DQU-wlVM/m/cvUp3E9OBgAJ>`_ on the mailing list once the update is in place.
doc/sphinx-guides/source/developers/testing.rst (0 additions, 19 deletions)
@@ -477,25 +477,6 @@ reduced anyway.
You will obviously have to utilize the caching functionality of your CI service or do proper Docker layering.
The Phoenix Server
~~~~~~~~~~~~~~~~~~
How the Phoenix Tests Work
^^^^^^^^^^^^^^^^^^^^^^^^^^
A server at http://phoenix.dataverse.org has been set up to test the latest code from the develop branch. Testing is done using chained builds of Jenkins jobs:
- A war file is built from the latest code in develop: https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-build-develop/
- The resulting war file is deployed to the Phoenix server: https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-deploy-develop/
- REST Assured Tests are run across the wire from the Jenkins server to the Phoenix server: https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-apitest-develop/
How to Run the Phoenix Tests
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Take a quick look at http://phoenix.dataverse.org to make sure the server is up and running Dataverse. If it's down, fix it.
- Log into Jenkins and click "Build Now" at https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-build-develop/
- Wait for all three chained Jenkins jobs to complete and note if they passed or failed. If you see a failure, open a GitHub issue or at least get the attention of some developers.