
Releases: IQSS/dataverse

v4.9.3

28 Sep 21:32
add6c3b

Note: We recommend upgrading to 4.9.4, which includes a fix for a bug that prevented users from logging in using an email address. Learn more in the 4.9.4 Release Notes.

This release is focused on expanded options for handling datasets and files in Dataverse. The dataset linking feature is now available to all users, not just superusers, and is accessible through the UI in addition to the API. Users now have the option of downloading all files in a dataset in their original file format via the Download All button, in addition to the already available "archival format" option. Installations can now configure whether or not PIDs will be minted for files. We have also made the application more stable by addressing memory leaks.

For the complete list of issues, see the 4.9.3 milestone on GitHub.

For help with upgrading, installing, or general questions, please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

  1. Undeploy the previous version.
  • <glassfish install path>/glassfish4/bin/asadmin list-applications
  • <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, then start it again.
  • service glassfish stop
  • remove the generated directory: rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
  • service glassfish start
  3. Deploy this version.
  • <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.3.war
  4. Run the database update script.
psql -U <db user> -d <db name> -f upgrade_v4.9.2_to_v4.9.3.sql
  5. Restart Glassfish.
  6. Replace the Solr schema.xml.
  • stop the Solr instance (service solr stop; depending on your Solr installation/OS, see http://guides.dataverse.org/en/4.9.3/installation/prerequisites.html#solr-init-script)
  • replace schema.xml:
cp /tmp/dvinstall/schema.xml /usr/local/solr/solr-7.3.0/server/solr/collection1/conf
  • start the Solr instance (service solr start; depending on your Solr installation/OS)
  7. Kick off an in-place reindex.
    http://guides.dataverse.org/en/4.9.3/admin/solr-search-index.html#reindex-in-place
curl -X DELETE http://localhost:8080/api/admin/index/timestamps
curl http://localhost:8080/api/admin/index/continue
  8. If you are running TwoRavens as part of your Dataverse installation, find the following line in your TwoRavens app_ddi.js file:
dataurl = dataurl+"?key="+apikey+"&gbrecs=true";

(or, if you are using an older version, it may look like this:)

dataurl = dataurl+"?key="+apikey;

and change it as follows:

dataurl = dataurl+"?key="+apikey+"%26gbrecs=true";

(This ensures that download counts are properly incremented for TwoRavens explore sessions, and eliminates the confusing "Warning: The request is not valid json. Check for special characters" messages that some users were seeing after the 4.8.6 upgrade.)
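The same edit can be scripted. A minimal sketch; the helper name is hypothetical, and the location of app_ddi.js depends on your TwoRavens installation:

```shell
# Hypothetical helper: rewrite the gbrecs separator in a TwoRavens app_ddi.js.
# The file path is supplied by the caller; a .bak copy of the original is kept.
fix_gbrecs() {
  local app_ddi="$1"
  cp "$app_ddi" "$app_ddi.bak"   # keep a backup of the original
  # "&" becomes the URL-encoded "%26" so the parameter survives intact
  sed -i 's/"?key="+apikey+"&gbrecs=true"/"?key="+apikey+"%26gbrecs=true"/' "$app_ddi"
}
```

Run it as, for example, `fix_gbrecs /path/to/dataexplore/app_ddi.js` (path is an example), then verify the change with `grep '%26gbrecs'` before restarting TwoRavens.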

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.

v4.9.2

08 Aug 21:06
4ad7bf5

This release is focused on ingest improvements, new import APIs, and infrastructure upgrades.
Stata 14, Stata 15, and .tsv files will now be ingested by Dataverse. New APIs allow datasets with existing DOIs to be imported into Dataverse. Bootstrap and PrimeFaces, which power the Dataverse front end, have been updated.

For the complete list of issues, see the 4.9.2 milestone on GitHub.

For help with upgrading, installing, or general questions, please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

  1. Undeploy the previous version.
  • <glassfish install path>/glassfish4/bin/asadmin list-applications
  • <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, then start it again.
  • service glassfish stop
  • remove the generated directory: rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
  • service glassfish start
  3. Deploy this version.
  • <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.2.war
  4. Run the database update script.
psql -U <db user> -d <db name> -f upgrade_v4.9.1_to_v4.9.2.sql
  5. Restart Glassfish.
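The five steps above can be sketched as one script. This is a sketch under stated assumptions: GLASSFISH_HOME, the war location, and the database names below are placeholders for your installation, and `service glassfish ...` assumes an init script named glassfish exists.

```shell
# Sketch of the v4.9.2 upgrade sequence; all paths and names below are
# installation-specific assumptions.
GLASSFISH_HOME="/usr/local/glassfish4"   # i.e. <glassfish install path>/glassfish4
WAR_FILE="dataverse-4.9.2.war"
DB_USER="dvnapp"
DB_NAME="dvndb"

upgrade_dataverse() {
  "$GLASSFISH_HOME/bin/asadmin" undeploy dataverse                    # step 1
  service glassfish stop                                              # step 2
  rm -rf "$GLASSFISH_HOME/glassfish/domains/domain1/generated"
  service glassfish start
  "$GLASSFISH_HOME/bin/asadmin" deploy "$WAR_FILE"                    # step 3
  psql -U "$DB_USER" -d "$DB_NAME" -f upgrade_v4.9.1_to_v4.9.2.sql    # step 4
  service glassfish restart                                           # step 5
}
```

Treat this as a checklist in code form rather than a turnkey tool: verify each variable before running it against production.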

If your Dataverse is configured to use R, you need to install one extra R library (haven); Dataverse now uses it to export tabular data files as RData. Install the package with the following R command (for example):

install.packages("haven", repos="https://cloud.r-project.org/", lib="/usr/lib64/R/library" )
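To confirm the package landed where you expect, you can check the library directory from the shell. A minimal sketch; the helper name is hypothetical and the library path matches the install.packages example above:

```shell
# Hypothetical helper: verify an R package directory exists in a library path.
check_r_package() {
  local lib_dir="$1" pkg="$2"
  if [ -d "$lib_dir/$pkg" ]; then
    echo "$pkg: installed in $lib_dir"
  else
    echo "$pkg: missing from $lib_dir" >&2
    return 1
  fi
}

# Usage (library path is the one from the example above):
# check_r_package /usr/lib64/R/library haven
```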

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.

v4.9.1

27 Jun 22:00
7237517

This release contains a patch for a permissions issue introduced in the last release, where a contributor's permission was not preserved when they added a file while creating a new dataset (#4783). It also includes an updated PostgreSQL driver in order to support PostgreSQL v9.6. Since this driver is universal, older drivers have been removed, but earlier versions such as v9.3 should continue to work.

For the complete list of issues, see the 4.9.1 milestone on GitHub.

For help with upgrading, installing, or general questions, please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

  1. Undeploy the previous version.
  • <glassfish install path>/glassfish4/bin/asadmin list-applications
  • <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, then start it again.
  • service glassfish stop
  • remove the generated directory: rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
  • service glassfish start
  3. Deploy this version.
  • <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.1.war
  4. Restart Glassfish.

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.

v4.9

06 Jun 18:56
d8451c5

Note: We recommend upgrading to 4.9.1, which includes a patch to address a high impact bug. Learn more in the 4.9.1 Release Notes.

This release introduces two new features: File PIDs and Provenance. A new Metrics API has been included. We have updated the Solr version used for search, improved error handling for file upload, fixed memory leaks in Export, and added several more useful APIs: move dataverse, link dataset and dataverse, and uningest a tabular data file. Numerous bug fixes and documentation improvements have been made.

  • File PIDs
  • Provenance
  • Metrics API
  • Update Solr to v7.3
  • Move Dataverse API
  • Link Dataset and Dataverse APIs
  • Uningest tabular file
  • Make file upload more robust by improving error handling
  • Fix memory leak in Export
  • Fix issues with the contact-us email: make the from address the Dataverse server and the reply-to address the requestor
  • Change the way DOIs and Handles are stored in the database to be more flexible with respect to format
  • Add the Mixtepec Mixtec language to the metadata list of languages
  • Make metadata URLs clickable, e.g. Alternative URL

For the complete list of issues, see the 4.9 milestone on GitHub.

For help with upgrading, installing, or general questions, please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version.
This release has a number of extra steps, most notably upgrading Solr, migrating DOIs to a new storage format, and reindexing. This will require a brief downtime and a period of incomplete search results while the index rebuilds after the Solr upgrade. We strongly recommend testing this upgrade in a test environment and backing up your database before deploying to production.

When upgrading from the previous version, you will need to do the following:

  1. Shut down access to the production service; you do not want users interacting with the site during the upgrade.
  2. Undeploy the current version of Dataverse from each web server.
  • <glassfish install path>/glassfish4/bin/asadmin list-applications
  • <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
  3. Stop Glassfish, remove the generated directory, then start it again.
  • service glassfish stop
  • remove the generated directory: rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
  • service glassfish start
  4. Back up the production database.
  5. Install and configure Solr v7.3.
    See http://guides.dataverse.org/en/4.9/installation/prerequisites.html#installing-solr
  6. Deploy v4.9 to the web servers.
  • <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.9.war
  7. Upgrade the database by running the update script.
    Once again, we STRONGLY RECOMMEND taking a full backup of the database before proceeding with the upgrade. Among other changes in this release, we are rearranging the way DOI identifiers are stored in the database. While your existing persistent identifiers stay the same (as the name suggests!), the update script will modify the database entries (this has to do with how the "authority" and "shoulder" suffix are stored). And since we are modifying something as important as the identifiers of your datasets, it's a great idea to have a handy way to restore your database as it was, in the unlikely event anything goes wrong.
    pg_dump --clean <db name> is a good way to save the entire database as an importable .sql file.
    Run the upgrade script:
  • psql -U <db user> -d <db name> -f upgrade_v4.8.6_to_v4.9.0.sql
  8. (Optionally) enable Provenance.
curl -X PUT -d 'true' http://localhost:8080/api/admin/settings/:ProvCollectionEnabled

  9. Update the metadata languages list.
curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"
  10. Restart Glassfish.
  11. Clear the index, then index all metadata.
curl http://localhost:8080/api/admin/index/clear
curl http://localhost:8080/api/admin/index

Please note: Do not run the registerDataFileAll command below if you do not plan to give your files persistent identifiers, which are no longer required in 4.9.3 or later (#4929).

  12. Run the retroactive file PID registration script, i.e. the register-all-files endpoint.
    Note: if you have a large number of files being registered, you may want to contact your DOI provider in advance to determine whether this level of traffic will cause a problem for their service.
curl http://localhost:8080/api/admin/registerDataFileAll?key=<super user api token>

This utility logs progress to server.log, along with a completion message that includes a total and any failures.
  13. When file registration completes, perform an in-place reindex.

curl -X DELETE http://localhost:8080/api/admin/index/timestamps
curl http://localhost:8080/api/admin/index/continue
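The last two steps can be wrapped in a small script so the reindex is not forgotten after registration. A sketch; the function name is an assumption, the endpoints are the ones shown in the steps above, and you should still watch server.log for the completion message between the two stages:

```shell
# Sketch: register file PIDs, then kick off the in-place reindex.
# The endpoint paths come from the upgrade steps above.
register_pids_and_reindex() {
  local base_url="$1" super_token="$2"
  # Retroactive file PID registration (superuser API token required)
  curl "$base_url/api/admin/registerDataFileAll?key=$super_token"

  # NOTE: wait for the completion message in server.log before this point.

  # In-place reindex: clear the index timestamps, then continue indexing
  curl -X DELETE "$base_url/api/admin/index/timestamps"
  curl "$base_url/api/admin/index/continue"
}

# Usage:
# register_pids_and_reindex http://localhost:8080 <super user api token>
```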

If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google Group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg

IMPORTANT: If you are running TwoRavens as part of your Dataverse installation:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the two files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.9/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R

If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:

cd [files directory]
rm -f `find . -name '*.prep'`

If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.
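The comparison and cleanup above can be combined into one helper. A sketch; the function name is hypothetical, and the `...` paths from the note must be filled in with your real TwoRavens and Dataverse locations:

```shell
# Sketch: compare the two preprocess.R copies and clean up stale .prep
# fragments only when they differ. All three paths are caller-supplied.
sync_preprocess() {
  local tworavens_copy="$1" dataverse_copy="$2" files_dir="$3"
  if diff -q "$tworavens_copy" "$dataverse_copy" >/dev/null; then
    echo "preprocess.R copies match; nothing to do"
  else
    # TwoRavens version wins: overwrite the Dataverse copy ...
    cp "$tworavens_copy" "$dataverse_copy"
    # ... and drop the already-generated pre-processed fragments
    find "$files_dir" -name '*.prep' -exec rm -f {} +
    echo "replaced Dataverse copy and removed stale .prep files"
  fi
}
```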

v4.8.6

02 Apr 20:46
874a944

This release introduces a modular Explore feature to support external tools. It also includes performance enhancements for S3 storage, a new API endpoint to move datasets, documentation improvements, other enhancements, and a number of bug fixes.

  • Modular Explore (enables tools such as Data Explorer and Two Ravens)
  • Redirect to the S3 location, instead of streaming.
  • New API: Add api end point to move datasets
  • Terms of use for Native API
  • Developer Guide: a single page of quick and easy instructions for installing Dataverse as a new developer
  • Fix for: Guestbook - Downloads via API are not counted (if your installation uses TwoRavens, please see the note below!)
  • Fix/Clean up handling of 403 and 404 exit codes in data access API
  • Fix for: Dataset page: Page fails to load when it cannot understand an image file to generate thumbnail
  • Fix for: When setting harvesting schedule to weekly, settings don't persist
  • Fix for a memory leak in IndexAll
  • Improvements for: Slow Page Load: Some dataset pages are slow to load, resulting in a timeout error.

For the complete list of issues, see the 4.8.6 milestone on GitHub.

ATTENTION: If you are running TwoRavens as part of your Dataverse installation, the bug fix for counting file downloads via the API has a side effect: users' data explore sessions are counted as two downloads. To avoid this, a change in the TwoRavens configuration is required. Find the following line in the TwoRavens app_ddi.js file:

dataurl = dataurl+"?key="+apikey;

and change it as follows:

dataurl = dataurl+"?key="+apikey+"&gbrecs=true";

For help with upgrading, installing, or general questions, please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:

  1. Undeploy the previous version.
    • <glassfish install path>/glassfish4/bin/asadmin list-applications
    • <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, then start it again.
    • service glassfish stop
    • remove the generated directory: rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
    • service glassfish start
  3. Deploy this version.
    • <glassfish install path>/glassfish4/bin/asadmin deploy <path>dataverse-4.8.6.war
  4. Run db update script
    psql -U <db user> -d <db name> -f upgrade_v4.8.5_to_v4.8.6.sql
  5. Restart glassfish
  6. Configure Two Ravens as an External Tool
    Because the Explore button is now modular, the previous way of enabling Two Ravens using settings will no longer work. To ensure Two Ravens Explore continues to be available, follow the instructions here:
    http://guides.dataverse.org/en/4.8.6/installation/r-rapache-tworavens.html#e-enable-tworavens-button-in-dataverse
  7. Reindex to update the DOI link format in citations:
    http://guides.dataverse.org/en/4.8.6/admin/solr-search-index.html#reindex-in-place
    Then re-export all datasets to update the DOI link format in exported citations:
    http://guides.dataverse.org/en/4.8.6/admin/metadataexport.html?highlight=export#batch-exports-through-the-api

If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google Group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg

IMPORTANT: If you are running TwoRavens as part of your Dataverse installation:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the two files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.6/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R

If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:

cd [files directory]
rm -f `find . -name '*.prep'`

If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.

v4.8.5

17 Jan 22:20
64cb1ef

This release fixes issues with IP Groups and the Guestbook. It improves the Download All behavior and introduces an experimental utility for backup to secondary storage. It also adds support for AWS IAM roles and multiple regions. Other changes support future functionality.

  • IP Group access permissions were not activated.
  • Guestbook entry form validation was not working, and overwritten prepopulated field values were not being saved.
  • Download All check box now downloads all files, regardless of scroll bar position.
  • An experimental backup to secondary storage utility is provided.
  • A docker container with a standalone Dataverse instance is provided to support automatic integration tests.
  • Support for AWS IAM role for S3 driver and multiple regions.
  • Modular Configure
  • Some infrastructure to support differential privacy.

For the complete list of issues, see the 4.8.5 milestone on GitHub.

For help with upgrading, installing, or general questions, please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:

  1. Undeploy the previous version.
    • /glassfish4/bin/asadmin list-applications
    • /glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, then start it again.
    • service glassfish stop
    • remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
    • service glassfish start
  3. Deploy this version.
    • /glassfish4/bin/asadmin deploy dataverse-4.8.5.war
  4. Restart glassfish

If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google Group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg

IMPORTANT: If you are running TwoRavens as part of your Dataverse installation:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the two files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.5/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R

If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:

cd [files directory]
rm -f `find . -name '*.prep'`

If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.

TROUBLESHOOTING NOTE:

Potential issue with PostgreSQL JDBC driver version incompatibility causing the Dataverse timer to malfunction.

Dataverse uses the EJB timer to perform scheduled tasks: harvesting from remote servers and keeping the local OAI sets and metadata exports updated. We've discovered that this timer may stop working if the version of the JDBC driver used by Glassfish is no longer in sync with the version of the PostgreSQL database. The symptoms: if your scheduled harvests are no longer running and there are error messages in the server.log with the following lines in them:

Internal Exception: java.io.StreamCorruptedException: invalid stream header ...
Exception Description: Could not deserialize object from byte array ...

it most likely means that your JDBC driver needs to be upgraded. For example, if you are running version 9.3 of PostgreSQL, make sure you have the driver postgresql-9.3-1104.jdbc4.jar in your glassfish/lib directory. The correct version of the driver for your version of PostgreSQL can be found at https://jdbc.postgresql.org/download.html. If you have an older driver in glassfish/lib, stop Glassfish, remove the old driver and replace it with the new version. We recommend that you remove the entire contents of glassfish/domains/domain1/generated/ before starting Glassfish again.
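A quick way to check the pairing is to derive the expected driver prefix from the database's version string. A sketch; the helper name is hypothetical, and the jar-naming pattern follows the 9.x example in the note above:

```shell
# Hypothetical helper: map a PostgreSQL version() string to the expected
# JDBC driver jar prefix, e.g. "PostgreSQL 9.3.25 on ..." -> "postgresql-9.3".
expected_jdbc_driver() {
  local version_string="$1"
  local major_minor
  major_minor=$(printf '%s\n' "$version_string" |
    sed -n 's/^PostgreSQL \([0-9][0-9]*\.[0-9][0-9]*\).*$/\1/p')
  echo "postgresql-$major_minor"
}

# Usage: feed it the output of `psql -tAc 'SELECT version()'`, then look for
# a jar with that prefix in your Glassfish lib directory, for example:
# ls <glassfish install path>/glassfish/lib/ | grep "$(expected_jdbc_driver "$(psql -tAc 'SELECT version()')")"
```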

v4.8.4

05 Dec 21:52
708fb95

Overview:

This release adds schema.org metadata to dataset pages for better indexing by search engines, allows downloading of dataset metadata in schema.org format, fixes a bug where a dataset could not be published when author affiliation was not set, and fixes several ORCID-related issues.

  • Add schema.org markup to dataset pages
  • Export dataset metadata in schema.org format and add it to the download list
  • Fix navbar search box that was throwing errors
  • Fix issue where a dataset without an author affiliation could not be published
  • Fix OAuth button labels: the connect button was statically labeled ORCID rather than using the authentication provider's name (thanks to Ruben Andreassen and Pete Meyer for the fix)
  • Fix internal server error when logging in using Google
  • Fix the URL provided in the sample orcid.json file; it was preventing user account info from being prepopulated on create
  • Remove default gray background in dataverse theme
  • Reword the failed-ingest error message to emphasize that the upload completed and only ingest failed

For the complete list of issues, see the 4.8.4 milestone on GitHub.

For help with upgrading, installing, or general questions, please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:

  1. Undeploy the previous version.
    • /glassfish4/bin/asadmin list-applications
    • /glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, then start it again.
    • service glassfish stop
    • remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
    • service glassfish start
  3. Deploy this version.
    • /glassfish4/bin/asadmin deploy dataverse-4.8.4.war
  4. Run db update script
    psql -U <db user> -d <db name> -f upgrade_v4.8.3_to_v4.8.4.sql
  5. Restart glassfish

Note: the URL provided in the v2.0 orcid.json file was slightly in error, requiring an update.
Using the v2.0 orcid.json file, http://guides.dataverse.org/en/4.8.4/_downloads/orcid.json , enter the client id and secret, and then update the provider information:

curl -X POST -H 'Content-type: application/json' --upload-file orcid.json http://localhost:8080/api/admin/authenticationProviders

After updating this information, restart glassfish.

If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google Group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg

IMPORTANT: If you are running TwoRavens as part of your Dataverse installation:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the two files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.4/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R

If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:

cd [files directory]
rm -f `find . -name '*.prep'`

If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.

TROUBLESHOOTING NOTE:

Potential issue with PostgreSQL JDBC driver version incompatibility causing the Dataverse timer to malfunction.

Dataverse uses the EJB timer to perform scheduled tasks: harvesting from remote servers and keeping the local OAI sets and metadata exports updated. We've discovered that this timer may stop working if the version of the JDBC driver used by Glassfish is no longer in sync with the version of the PostgreSQL database. The symptoms: if your scheduled harvests are no longer running and there are error messages in the server.log with the following lines in them:

Internal Exception: java.io.StreamCorruptedException: invalid stream header ...
Exception Description: Could not deserialize object from byte array ...

it most likely means that your JDBC driver needs to be upgraded. For example, if you are running version 9.3 of PostgreSQL, make sure you have the driver postgresql-9.3-1104.jdbc4.jar in your glassfish/lib directory. The correct version of the driver for your version of PostgreSQL can be found at https://jdbc.postgresql.org/download.html. If you have an older driver in glassfish/lib, stop Glassfish, remove the old driver and replace it with the new version. We recommend that you remove the entire contents of glassfish/domains/domain1/generated/ before starting Glassfish again.

v4.8.3

21 Nov 22:26
2c3f3a1

Overview:

This release supports the ORCID schema v2.0, in accordance with their roadmap: https://members.orcid.org/api/news/xsd-20-update. It also fixes a bug on the file landing page that prevented users from requesting access to restricted files.

  • Support the ORCID schema v2.0
  • Fix a bug on the file landing page preventing users from requesting access to restricted files from that page
  • Include additional PostgreSQL drivers to support newer versions of PostgreSQL with recommended drivers

For the complete list of issues, see the 4.8.3 milestone on GitHub.

For help with upgrading, installing, or general questions, please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:

  1. Undeploy the previous version.
    • /glassfish4/bin/asadmin list-applications
    • /glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, then start it again.
    • service glassfish stop
    • remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
    • service glassfish start
  3. Deploy this version.
    • /glassfish4/bin/asadmin deploy dataverse-4.8.3.war

If you previously had ORCID authentication configured, please note that we have upgraded to the v2.0 API in this release. To stay in sync with their upgrade policy, you must update the ORCID authentication provider information; see http://guides.dataverse.org/en/4.8.3/installation/oauth2.html#dataverse-side .

Using the v2.0 orcid.json file, http://guides.dataverse.org/en/4.8.3/_downloads/orcid.json , enter the client id and secret and then update the provider information:

curl -X POST -H 'Content-type: application/json' --upload-file orcid.json http://localhost:8080/api/admin/authenticationProviders

After updating this information, restart glassfish.

If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google Group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg

IMPORTANT: If you are running TwoRavens as part of your Dataverse installation:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the two files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.3/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R

If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse file directory, for example:

cd [files directory]
rm -f `find . -name '*.prep'`

If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will make the two applications resolve code version conflicts like this automatically.

TROUBLESHOOTING NOTE:

Potential issue with PostgreSQL JDBC driver version incompatibility causing the Dataverse timer to malfunction.

Dataverse uses the EJB timer to perform scheduled tasks: harvesting from remote servers and keeping the local OAI sets and metadata exports updated. We've discovered that this timer may stop working if the version of the JDBC driver used by Glassfish is no longer in sync with the version of the PostgreSQL database. The symptoms: if your scheduled harvests are no longer running and there are error messages in the server.log with the following lines in them:

Internal Exception: java.io.StreamCorruptedException: invalid stream header ...
Exception Description: Could not deserialize object from byte array ...

it most likely means that your JDBC driver needs to be upgraded. For example, if you are running version 9.3 of PostgreSQL, make sure you have the driver postgresql-9.3-1104.jdbc4.jar in your glassfish/lib directory. The correct version of the driver for your version of PostgreSQL can be found at https://jdbc.postgresql.org/download.html. If you have an older driver in glassfish/lib, stop Glassfish, remove the old driver and replace it with the new version. We recommend that you remove the entire contents of glassfish/domains/domain1/generated/ before starting Glassfish again.

v4.8.2

08 Nov 00:33
ab613a3

Overview:

In this release we have fixed an issue so curators may edit datasets while in review. We provide an OpenShift template for Dataverse and a Docker image for experimenting with this configuration. We've made several additional bug fixes. For a complete list of changes, see the closed issues in this milestone.

  • Allow curators to edit datasets while in review
  • Provide OpenShift template and Docker image
  • Document how one can experiment with Dataverse and OpenShift
  • Fix index exception when there are Null values in TextBox-type metadata fields
  • Fix missing thumbnail images for map files when stored on S3
  • When Terms of Use and Terms of Access are defined, expand all populated fields on Dataset Terms tab
  • Mitigate against password guessing by adding a math question to the form after 2 failed attempts
  • Fix displayed navigation URLs to use Pretty URLs format

For the complete list of issues, see the 4.8.2 milestone in Github.

For help with upgrading, installing, or general questions please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:

  1. Undeploy the previous version.
    • /glassfish4/bin/asadmin list-applications
    • /glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, and start Glassfish again.
    • service glassfish stop
    • remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
    • service glassfish start
  3. Deploy this version.
    • /glassfish4/bin/asadmin deploy dataverse-4.8.2.war
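As a convenience, the three steps above can be collected into one script. This is only a sketch: GLASSFISH_HOME and the location of the war file are assumptions to adjust for your installation, and the service commands may differ by OS.

```shell
# Sketch only: GLASSFISH_HOME and the war path below are assumptions, not
# paths from the release notes - adjust them to your setup.
cat > upgrade-to-4.8.2.sh <<'EOF'
#!/bin/sh
set -e                                   # stop on the first failed step
GLASSFISH_HOME=/usr/local/glassfish4

# 1. Undeploy the previous version
"$GLASSFISH_HOME"/bin/asadmin list-applications
"$GLASSFISH_HOME"/bin/asadmin undeploy dataverse

# 2. Stop Glassfish, clear the generated directory, start again
service glassfish stop
rm -rf "$GLASSFISH_HOME"/glassfish/domains/domain1/generated
service glassfish start

# 3. Deploy this version
"$GLASSFISH_HOME"/bin/asadmin deploy /path/to/dataverse-4.8.2.war
EOF
sh -n upgrade-to-4.8.2.sh && echo "script parses cleanly"
```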

If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg

IMPORTANT: If you are running TwoRavens with your Dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the two files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.2/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R

If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse files directory, for example:

cd [files directory]
find . -name '*.prep' -type f -delete

If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will let the two applications resolve code version conflicts like this automatically.
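The compare-and-replace rule above can be sketched as follows. Temporary files stand in for the two preprocess.R copies, since the real paths depend on your installation.

```shell
# Temporary files stand in for the TwoRavens and Dataverse copies of preprocess.R
tworavens_copy=$(mktemp)
dataverse_copy=$(mktemp)
echo "preprocess v2" > "$tworavens_copy"
echo "preprocess v1" > "$dataverse_copy"

if diff -q "$tworavens_copy" "$dataverse_copy" >/dev/null; then
  echo "copies match - nothing to do"
else
  # The TwoRavens version wins: overwrite the Dataverse copy
  cp "$tworavens_copy" "$dataverse_copy"
  echo "replaced the Dataverse copy"
fi
```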

v4.8.1

10 Oct 22:02

Overview:

This release improves performance for the dataset page, especially for datasets with many files. It also includes a usability improvement to the verify email link.

For help with upgrading, installing, or general questions please email [email protected].

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version. When upgrading from the previous version, you will need to do the following:

  1. Undeploy the previous version.
    • /glassfish4/bin/asadmin list-applications
    • /glassfish4/bin/asadmin undeploy dataverse
  2. Stop Glassfish, remove the generated directory, and start Glassfish again.
    • service glassfish stop
    • remove the generated directory: rm -rf /usr/local/glassfish4/glassfish/domains/domain1/generated
    • service glassfish start
  3. Deploy this version.
    • /glassfish4/bin/asadmin deploy dataverse-4.8.1.war

If you are upgrading from v3.x, you will need to perform a migration to v4.x, since our application was redesigned and the database schema is completely different. This is a significant undertaking. Please contact us (support at dataverse.org) before beginning. Also refer to our migration Google group for additional support and information: https://groups.google.com/forum/#!forum/dataverse-migration-wg

IMPORTANT: If you are running TwoRavens with your Dataverse:
Make sure the two applications are using the same version of the "pre-processed statistics" R code. Compare the two files:
On the TwoRavens side:
.../dataexplore/rook/preprocess/preprocess.R
On the Dataverse side:
.../applications/dataverse-4.8.1/WEB-INF/classes/edu/harvard/iq/dataverse/rserve/scripts/preprocess.R

If they are different, replace the Dataverse copy with the TwoRavens copy (i.e., the TwoRavens version wins!).
Also remove all the already-generated pre-processed fragments in your Dataverse files directory, for example:

cd [files directory]
find . -name '*.prep' -type f -delete

If the two copies are the same, you don't need to do any of this.
Please note that this is a temporary measure; we are working on a fix that will let the two applications resolve code version conflicts like this automatically.