
Commit

Merge pull request #8146 from IQSS/develop
Merge v5.7 into master
kcondon authored Oct 13, 2021
2 parents 1c2d8d8 + 80fda20 commit 78c9a44
Showing 210 changed files with 7,085 additions and 3,084 deletions.
53 changes: 53 additions & 0 deletions .github/ISSUE_TEMPLATE/bug_report.md
@@ -0,0 +1,53 @@
---
name: Bug report
about: Did you encounter something unexpected or incorrect in the Dataverse software? We'd like to hear about it!
title: ''
labels: ''
assignees: ''

---
<!--
Thank you for contributing to the Dataverse Project through the creation of a bug report!
WARNING: If this is a security issue it should be reported privately to [email protected]
More information on bug issues and contributions can be found in the "Contributing to Dataverse" page:
https://github.com/IQSS/dataverse/blob/develop/CONTRIBUTING.md#bug-reportsissues
Please fill out as much of the template as you can.
Start below this comment section.
-->
**What steps does it take to reproduce the issue?**

* When does this issue occur?


* Which page(s) does it occur on?


* What happens?


* To whom does it occur (all users, curators, superusers)?


* What did you expect to happen?



**Which version of Dataverse are you using?**



**Any related open or closed issues to this bug report?**



**Screenshots:**

No matter the issue, screenshots are always welcome.

To add a screenshot, please use one of the following formats and/or methods described here:

* https://help.github.com/en/articles/file-attachments-on-issues-and-pull-requests
*
35 changes: 35 additions & 0 deletions .github/ISSUE_TEMPLATE/feature_request.md
@@ -0,0 +1,35 @@
---
name: Feature request
about: Suggest an idea or new feature for the Dataverse software!
title: 'Feature Request/Idea:'
labels: ''
assignees: ''

---
<!--
Thank you for contributing to the Dataverse Project through the creation of a feature request!
More information on ideas/feature requests and contributions can be found in the "Contributing to Dataverse" page:
https://github.com/IQSS/dataverse/blob/develop/CONTRIBUTING.md#ideasfeature-requests
Please fill out as much of the template as you can.
Start below this comment section.
-->

**Overview of the Feature Request**


**What kind of user is the feature intended for?**
(Example user roles: API User, Curator, Depositor, Guest, Superuser, Sysadmin)


**What inspired the request?**


**What existing behavior do you want changed?**


**Any brand new behavior you want to add to Dataverse?**


**Any related open or closed issues to this feature request?**
2 changes: 1 addition & 1 deletion .github/workflows/guides_build_sphinx.yml
@@ -9,6 +9,6 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: ammaraskar/sphinx-action@master
- uses: OdumInstitute/sphinx-action@master
with:
docs-folder: "doc/sphinx-guides/"
2 changes: 0 additions & 2 deletions CONTRIBUTING.md
@@ -28,8 +28,6 @@ If there is no pre-existing issue or it has been closed, please click on the "Ne

If you do not receive a reply to your new issue or comment in a timely manner, please email [email protected] with a link to the issue.

We are aware of the new issue templates described at https://help.github.com/articles/about-issue-and-pull-request-templates but haven't converted over yet.

### Writing an Issue

For the subject of an issue, please start it by writing the feature or functionality it relates to, i.e. "Create Account:..." or "Dataset Page:...". In the body of the issue, please outline the issue you are reporting with as much detail as possible. In order for the Dataverse development team to best respond to the issue, we need as much information about the issue as you can provide. Include steps to reproduce bugs. Indicate which version you're using, which is shown at the bottom of the page. We love screenshots!
1 change: 0 additions & 1 deletion ISSUE_TEMPLATE.md

This file was deleted.

2 changes: 1 addition & 1 deletion Vagrantfile
@@ -4,7 +4,7 @@
VAGRANTFILE_API_VERSION = "2"

Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
config.vm.box = "bento/centos-8.2"
config.vm.box = "bento/rockylinux-8.4"

config.vm.provider "virtualbox" do |vbox|
vbox.cpus = 4
1 change: 1 addition & 0 deletions conf/solr/8.8.1/schema.xml
@@ -162,6 +162,7 @@
<field name="dvSubject" type="string" stored="true" indexed="true" multiValued="true"/>

<field name="publicationStatus" type="string" stored="true" indexed="true" multiValued="true"/>
<field name="externalStatus" type="string" stored="true" indexed="true" multiValued="false"/>

<field name="subtreePaths" type="string" stored="true" indexed="true" multiValued="true"/>

12 changes: 12 additions & 0 deletions doc/release-notes/5.0-release-notes.md
@@ -213,6 +213,18 @@ If this is a new installation, please see our [Installation Guide](http://guides

## Upgrade Instructions

### Prerequisite: Retroactively store original file size

Starting with release 4.10, the size of the saved original file (for an ingested tabular datafile) is stored in the database. We provide the following API that retrieves and permanently stores the sizes for any already existing saved originals:

`/api/admin/datafiles/integrity/fixmissingoriginalsizes`

(See the documentation note in the Native API guide, under "[Datafile Integrity](https://guides.dataverse.org/en/5.0/api/native-api.html#datafile-integrity)").

To check your installation, issue this command:

`curl http://localhost:8080/api/admin/datafiles/integrity/fixmissingoriginalsizes`
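
For convenience, the check can be scripted. The snippet below is a minimal sketch, assuming the admin API is unblocked on localhost port 8080 (the default) and that `jq` is available for pretty-printing; the exact shape of the JSON response may vary between releases:

```shell
# Run the integrity check/fix for saved original sizes and pretty-print the response.
# Assumes the admin API is reachable on localhost:8080 and that jq is installed.
curl -s http://localhost:8080/api/admin/datafiles/integrity/fixmissingoriginalsizes | jq .
```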

### Upgrade from Glassfish 4.1 to Payara 5

The instructions below describe the upgrade procedure based on moving an existing glassfish4 domain directory under Payara. We recommend this method instead of setting up a brand-new Payara domain using the installer because it appears to be the easiest way to recreate your current configuration and preserve all your data.
2 changes: 2 additions & 0 deletions doc/release-notes/5.4-release-notes.md
@@ -274,6 +274,8 @@ Add the following JVM options to the `<config name="server-config"><java-config>`

`<jvm-options>[9|]--add-opens=java.logging/java.util.logging=ALL-UNNAMED</jvm-options>`

`<jvm-options>[1.8.0u251|]-Xbootclasspath/a:${com.sun.aas.installRoot}/lib/grizzly-npn-api.jar</jvm-options>`
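
If you prefer not to edit domain.xml by hand, the same option can usually be added with `asadmin` instead. The following is a sketch, assuming Payara is installed in `/usr/local/payara5`; note that `create-jvm-options` talks to a running domain, and that colons inside a JVM option must be escaped for `asadmin`:

```shell
# Sketch: add the grizzly-npn-api bootclasspath option via asadmin rather than editing domain.xml.
# Assumes Payara lives in /usr/local/payara5 and the domain is running; colons are escaped
# because asadmin treats unescaped colons as option separators. Restart Payara afterwards.
/usr/local/payara5/bin/asadmin create-jvm-options \
  '[1.8.0u251|]-Xbootclasspath/a\:${com.sun.aas.installRoot}/lib/grizzly-npn-api.jar'
```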

6\. Start Payara

- `service payara start`
22 changes: 14 additions & 8 deletions doc/release-notes/5.6-release-notes.md
@@ -20,7 +20,7 @@ This development was supported by the [Research Data Alliance](https://rd-allian

### Dataset Migration API (Experimental)

Datasets can now imported following the format of an OAI-ORE export (RDA-conformant Bags), allowing for easier migration from one Dataverse installation to another, and migration from other systems. This experimental, superuser only, endpoint also allows keeping the existing persistent identifier (where the authority and shoulder match those for which the software is configured) and publication dates.
Datasets can now be imported following the format of an OAI-ORE export (RDA-conformant Bags), allowing for easier migration from one Dataverse installation to another, and migration from other systems. This experimental, superuser only, endpoint also allows keeping the existing persistent identifier (where the authority and shoulder match those for which the software is configured) and publication dates.

This development was supported by DANS and the [Research Data Alliance](https://rd-alliance.org) and follows the recommendations from the [Research Data Repository Interoperability Working Group](http://dx.doi.org/10.15497/RDA00025).

@@ -36,20 +36,22 @@ Newly-supported major use cases in this release include:
- Users can download Guestbook records using a new API. (Issue #7767, PR #7931)
- Users can update terms metadata using the new semantic API. (Issue #5899, PR #7414)
- Users can retrieve, set, and update metadata using a new, flatter JSON-LD format (see the sketch after this list). (Issue #6497, PR #7414)
- Administrators can use the Traces API to retreive information about specific types of user activity (Issue #7952, PR #7953)
- Administrators can use the Traces API to retrieve information about specific types of user activity (Issue #7952, PR #7953)
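
A minimal sketch of retrieving the flatter JSON-LD metadata mentioned above. The path and header follow the Dataset Semantic Metadata API section of the API Guide as we understand it; treat the exact endpoint as an assumption and confirm it in the guide for your version:

```shell
# Sketch: retrieve a dataset's metadata in the flatter JSON-LD format.
# SERVER_URL, API_TOKEN, and the dataset database id (24 here) are placeholders;
# the endpoint path is based on the Dataset Semantic Metadata API docs (verify for your version).
export SERVER_URL=https://demo.dataverse.org
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
curl -s -H "X-Dataverse-key:$API_TOKEN" "$SERVER_URL/api/datasets/24/metadata" | jq .
```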

## Notes for Dataverse Installation Administrators

### New Database Constraint

A new DB Constraint has been added in this release. Full instructions on how to identify whether or not your database needs any cleanup before the upgrade can be found in the [Dataverse software GitHub repository](https://github.com/IQSS/dataverse/blob/develop/scripts/issues/7451/PRE-RELEASE-INFO.txt). This information was also emailed out to Dataverse installation contacts.

### Payara 5.2021.4 (or Higher) Required
### Payara 5.2021.5 (or Higher) Required

Some changes in this release require an upgrade to Payara 5.2021.4 or higher. (See the upgrade section).
Some changes in this release require an upgrade to Payara 5.2021.5 or higher. (See the upgrade section).

Instructions on how to update can be found in the [Payara documentation](https://docs.payara.fish/community/docs/5.2021.4/documentation/user-guides/upgrade-payara.html). We've included the necessary steps below, but we recommend that you review the Payara upgrade instructions, as they could be helpful during any troubleshooting.

Installations upgrading from a previous Payara version shouldn't encounter a logging configuration bug in Payara-5.2021.5, but if your server.log fills with repeated notes about logging configuration and WELD complaints about loading beans, see the paragraph on `logging.properties` in the [installation guide](https://guides.dataverse.org/en/5.6/installation/installation-main.html#running-the-dataverse-software-installer).
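
A quick way to check whether your installation is affected — a sketch, assuming the default Payara domain paths:

```shell
# Count logging-configuration and WELD messages in server.log (default domain path assumed).
# A count that keeps growing while the server is otherwise idle suggests the bug described above.
grep -c -E 'logging|WELD' /usr/local/payara5/glassfish/domains/domain1/logs/server.log
```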

### Enhancement to DDI Metadata Exports

To increase support for internationalization and to improve compliance with CESSDA requirements, DDI exports now have a holdings element with a URI attribute whose value is the URL form of the dataset PID.
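
To see the new element, you can fetch a DDI export and grep for it — a sketch using the standard metadata export API with an illustrative persistent identifier (substitute one from your installation):

```shell
# Sketch: pull the DDI export for a published dataset and show the new <holdings> element.
# The DOI below is illustrative; use a persistent identifier from your own installation.
curl -s "https://demo.dataverse.org/api/datasets/export?exporter=ddi&persistentId=doi:10.5072/FK2/EXAMPLE" \
  | grep -o '<holdings[^>]*>'
```
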
@@ -106,15 +108,19 @@ In the following commands we assume that Payara 5 is installed in `/usr/local/pa

5\. Replace the brand new payara/glassfish/domains/domain1 with your old, preserved `domain1`

6\. Start Payara
6\. In domain.xml, add/confirm the existence of the following JVM option to/in the `<config name="server-config"><java-config>` section:

`<jvm-options>[1.8.0u251|]-Xbootclasspath/a:${com.sun.aas.installRoot}/lib/grizzly-npn-api.jar</jvm-options>`

7\. Start Payara

- `service payara start`

7\. Deploy this version.
8\. Deploy this version.

- `$PAYARA/bin/asadmin deploy dataverse-5.6.war`

8\. Restart payara
9\. Restart payara

- `service payara stop`
- `service payara start`
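
Once Payara is back up, a quick way to confirm that the new release is the one deployed is the public info endpoint, for example:

```shell
# Confirm the deployed Dataverse version after the restart (endpoint is public, no token needed).
curl -s http://localhost:8080/api/info/version
```
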
@@ -125,6 +131,6 @@

Note that you can skip this step if your installation uses the default-style, randomly-generated six alphanumeric character-long identifiers for your datasets! This is the case with most Dataverse installations.

The underlying database framework has been modified in this release, to make it easier for installations to create custom procedures for generating identifier strings that suit their needs. Your current configuration will be automatically updated by the database upgrade (Flyway) script incorporated in the release. No manual configuration changes should be necessary. However, after the upgrade, we recommend that you confirm that your installation can still create new datasets, and that they are still assigned sequential numeric identifiers. In the unlikely chance that this is no longer working, please re-create the stored procedure following the steps described in the documentation for the `:IdentifierGenerationStyle` setting in the *Configuration* section of the Installation Guide for this release (v5.6).
The underlying database framework has been modified in this release, to make it easier for installations to create custom procedures for generating identifier strings that suit their needs. Your current configuration will be automatically updated by the database upgrade (Flyway) script incorporated in the release. No manual configuration changes should be necessary. However, after the upgrade, we recommend that you confirm that your installation can still create new datasets, and that they are still assigned sequential numeric identifiers. In the unlikely chance that this is no longer working, please re-create the stored procedure following the steps described in the documentation for the `:IdentifierGenerationStyle` setting in the *Configuration* section of the Installation Guide for this release (v5.6).

(Running the script supplied there will NOT overwrite the position on the sequence you are currently using!)