Merge pull request #6390 from IQSS/develop
KC_Merge_4_18_1
kcondon authored Nov 20, 2019
2 parents 118aa71 + a89e033 commit a91d370
Showing 17 changed files with 173 additions and 54 deletions.
doc/release-notes/4.18-release-notes.md: 6 changes (5 additions, 1 deletion)
@@ -1,5 +1,7 @@
# Dataverse 4.18

**Note: There is an issue in 4.18 with the display of validation messages on the dataset page (#6380), and we recommend using 4.18.1 for any production environments.**

This release brings new features, enhancements, and bug fixes to Dataverse. Thank you to all of the community members who contributed code, suggestions, bug reports, and other assistance across the project.

## Release Highlights
@@ -8,7 +10,9 @@ This release brings new features, enhancements, and bug fixes to Dataverse. Than

File-level External Tools can now be configured to display in a "Preview Mode" designed for embedding within the file landing page.

While not technically part of this release, previewers have been made available for several common file types. The previewers support spreadsheet, image, text, document, audio, video, HTML files, and more. These previewers can be found in the <a href="https://github.com/QualitativeDataRepository/dataverse-previewers">Qualitative Data Repository GitHub Repository</a>. The spreadsheet viewer was contributed by the [Dataverse SSHOC][] project.

[Dataverse SSHOC]: https://www.sshopencloud.eu/news/developing-sshoc-dataverse

### Microsoft Login

doc/release-notes/4.18.1-release-notes.md: 45 changes (45 additions, 0 deletions)
@@ -0,0 +1,45 @@
# Dataverse 4.18.1

This release provides a fix for a regression introduced in 4.18 and implements a few other small changes.

## Release Highlights

### Proper Validation Messages

When creating or editing dataset metadata, users were not receiving field-level indications about what entries failed validation and were only receiving a message at the top of the page. This fix restores field-level indications.

## Major Use Cases

Use cases in this release include:

- Users will receive the proper messaging when dataset metadata entries are not valid.
- Users can now view the expiration date of an API token and revoke a token on the API Token tab of the account page.

## Complete List of Changes

For the complete list of code changes in this release, see the <a href="https://github.com/IQSS/dataverse/milestone/86?closed=1">4.18.1 milestone</a> on GitHub.

For help with upgrading, installing, or general questions, please post to the <a href="https://groups.google.com/forum/#!forum/dataverse-community">Dataverse Google Group</a> or email [email protected].

## Installation

If this is a new installation, please see our <a href="http://guides.dataverse.org/en/4.18.1/installation/">Installation Guide</a>.

## Upgrade

1. Undeploy the previous version.

- `<glassfish install path>/glassfish4/bin/asadmin list-applications`
- `<glassfish install path>/glassfish4/bin/asadmin undeploy dataverse`

2. Stop Glassfish, remove the generated directory, then start Glassfish again.

- `service glassfish stop`
- remove the generated directory: `rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated`
- `service glassfish start`

3. Deploy this version.

- `<glassfish install path>/glassfish4/bin/asadmin deploy <path>/dataverse-4.18.1.war`

4. Restart Glassfish.
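The four steps above can be sketched as a single script. `GLASSFISH_HOME`, the war file location, and the `service` name are assumptions to adjust for your installation; with `DRY_RUN=1` (the default) the script only prints the commands it would run instead of executing them.

```shell
#!/bin/sh
# Sketch of the 4.18.1 upgrade steps. GLASSFISH_HOME and WAR_PATH are
# assumptions -- adjust them for your installation. With DRY_RUN=1 (the
# default) the script only echoes each command rather than running it.
GLASSFISH_HOME="${GLASSFISH_HOME:-/usr/local/glassfish4}"
WAR_PATH="${WAR_PATH:-/tmp/dataverse-4.18.1.war}"
DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

# 1. Undeploy the previous version.
run "$GLASSFISH_HOME/bin/asadmin" list-applications
run "$GLASSFISH_HOME/bin/asadmin" undeploy dataverse

# 2. Stop Glassfish, clear the generated directory, and start it again.
run service glassfish stop
run rm -rf "$GLASSFISH_HOME/glassfish/domains/domain1/generated"
run service glassfish start

# 3. Deploy the new version.
run "$GLASSFISH_HOME/bin/asadmin" deploy "$WAR_PATH"

# 4. Restart Glassfish.
run service glassfish stop
run service glassfish start
```

Running with `DRY_RUN=0` executes the commands for real; leaving the default lets you review the exact command lines first.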
doc/sphinx-guides/source/admin/harvestclients.rst: 29 changes (10 additions, 19 deletions)
@@ -3,36 +3,27 @@ Managing Harvesting Clients

.. contents:: |toctitle|
:local:

Your Dataverse as a Metadata Harvester
--------------------------------------

Harvesting is a process of exchanging metadata with other repositories. As a harvesting *client*, your Dataverse can gather metadata records from remote sources. These can be other Dataverse instances or other archives that support OAI-PMH, the standard harvesting protocol. Harvested metadata records will be indexed and made searchable by your users. Clicking on a harvested dataset in the search results takes the user to the original repository. Harvested datasets cannot be edited in your Dataverse installation.

Harvested records can be kept in sync with the original repository through scheduled incremental updates, daily or weekly.
Alternatively, harvests can be run on demand, by the Admin.

Managing Harvesting Clients
---------------------------

To start harvesting metadata from a remote OAI repository, you first create and configure a *Harvesting Client*.

Clients are managed on the "Harvesting Clients" page accessible via the :doc:`dashboard`. Click on the *Add Client* button to get started.

The process of creating a new client, or editing an existing one, is largely self-explanatory. It is split into logical steps that allow the user to go back and correct earlier entries. The process is interactive, and guidance text is provided. For example, the user is required to enter the URL of the remote OAI server. When they click *Next*, the application tries to establish a connection to the server in order to verify that it is working, and to obtain information about the sets of metadata records and the metadata formats it supports. The choices offered on the next page are based on this information. If the application fails to connect to the remote archive at the specified address, or receives an invalid response, the user is given an opportunity to check and correct the URL they entered.

New in Dataverse 4, vs. DVN 3
-----------------------------


- Note that when creating a client you will need to select an existing local dataverse to host the datasets harvested. In DVN 3, a dedicated "harvesting dataverse" would be created specifically for each remote harvesting source. In Dataverse 4, harvested content can be added to *any dataverse*. This means that a dataverse can now contain datasets harvested from multiple sources and/or a mix of local and harvested datasets.


- An extra "Archive Type" pull-down menu has been added to the Create and Edit dialogs. This setting, selected from choices such as "Dataverse 4", "DVN, v2-3", "Generic OAI", etc., is used to properly format the harvested metadata as it is shown in the search results. It is **very important** to select the type that best describes the remote server, as failure to do so can result in information missing from the search results and a **failure to redirect the user to the archival source** of the data!

  It is, however, **very easy to correct** a mistake like this. For example, let's say you have created a client to harvest from the XYZ Institute and specified the archive type as "Dataverse 4". You have been able to harvest content, and the datasets appear in search results, but clicking on them results in a "Page Not Found" error on the remote site. At that point you realize that the XYZ Institute admins have not yet upgraded to Dataverse 4 and are still running DVN v3.1.2. All you need to do is go back to the Harvesting Clients page and change the setting to "DVN, v2-3". This will fix the redirects **without having to re-harvest** the datasets.

- Another extra entry, "Archive Description", is added to the *Edit Harvesting Client* dialog. This description appears at the bottom of each search result card for a harvested dataset or datafile. By default, this text reads "This Dataset is harvested from our partners. Clicking the link will take you directly to the archival source of the data." Here it can be customized to be more descriptive, for example, "This Dataset is harvested from our partners at the XYZ Institute..."
What if a Run Fails?
~~~~~~~~~~~~~~~~~~~~

Each harvesting client run logs a separate file per run to Glassfish's default logging directory (``/usr/local/glassfish4/glassfish/domains/domain1/logs/`` unless you've changed it). Look for filenames in the format ``harvest_TARGET_YYYY_MM_DD_timestamp.log`` to get a better idea of what's going wrong.

Note that you'll want to run a minimum Dataverse version of 4.6, optimally 4.18, for the best OAI-PMH interoperability.
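When a run fails, a quick way to pull up the relevant log is to sort the harvest logs by modification time and tail the newest one. This is a sketch assuming the default Glassfish log directory mentioned above; `LOG_DIR` and `TARGET` are adjustable placeholders, not fixed names.

```shell
# Print and tail the newest harvest log. TARGET is a placeholder for a
# harvesting client nickname (the default "*" matches any client), and
# LOG_DIR is Glassfish's default log location -- adjust if yours differs.
LOG_DIR="${LOG_DIR:-/usr/local/glassfish4/glassfish/domains/domain1/logs}"
TARGET="${TARGET:-*}"

# ls -t sorts newest-first by modification time; head keeps the first hit.
latest_log=$(ls -t "$LOG_DIR"/harvest_${TARGET}_*.log 2>/dev/null | head -n 1)
if [ -n "$latest_log" ]; then
    echo "Most recent harvest log: $latest_log"
    tail -n 20 "$latest_log"
else
    echo "No harvest logs found in $LOG_DIR"
fi
```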
doc/sphinx-guides/source/conf.py: 4 changes (2 additions, 2 deletions)
@@ -65,9 +65,9 @@
# built documents.
#
# The short X.Y version.
version = '4.18'
version = '4.18.1'
# The full version, including alpha/beta/rc tags.
release = '4.18'
release = '4.18.1'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
doc/sphinx-guides/source/versions.rst: 3 changes (2 additions, 1 deletion)
@@ -6,8 +6,9 @@ Dataverse Documentation Versions

This list provides a way to refer to the documentation for previous versions of Dataverse. In order to learn more about the updates delivered from one version to another, visit the `Releases <https://github.com/IQSS/dataverse/releases>`__ page in our GitHub repo.

- 4.18
- 4.18.1

- `4.18 </en/4.18/>`__
- `4.17 </en/4.17/>`__
- `4.16 </en/4.16/>`__
- `4.15.1 </en/4.15.1/>`__
pom.xml: 2 changes (1 addition, 1 deletion)
@@ -7,7 +7,7 @@
-->
<groupId>edu.harvard.iq</groupId>
<artifactId>dataverse</artifactId>
<version>4.18</version>
<version>4.18.1</version>
<packaging>war</packaging>
<name>dataverse</name>
<properties>
src/main/java/edu/harvard/iq/dataverse/ApiTokenPage.java: 31 changes (30 additions, 1 deletion)
@@ -4,9 +4,9 @@
import edu.harvard.iq.dataverse.authorization.users.ApiToken;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
import edu.harvard.iq.dataverse.util.BundleUtil;
import edu.harvard.iq.dataverse.api.Util;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.List;
import java.util.logging.Logger;
import javax.ejb.EJB;
@@ -75,4 +75,33 @@ public void generate() {

}
}

public String getApiTokenExpiration() {
if (session.getUser().isAuthenticated()) {
AuthenticatedUser au = (AuthenticatedUser) session.getUser();
apiToken = authSvc.findApiTokenByUser(au);
if (apiToken != null) {
return Util.getDateFormat().format(apiToken.getExpireTime());
} else {
return "";
}
} else {
// It should be impossible to get here from the UI.
return "";
}
}

public Boolean tokenIsExpired(){
return apiToken.getExpireTime().before(new Timestamp(System.currentTimeMillis()));
}

public void revoke() {
if (session.getUser().isAuthenticated()) {
AuthenticatedUser au = (AuthenticatedUser) session.getUser();
apiToken = authSvc.findApiTokenByUser(au);
if (apiToken != null) {
authSvc.removeApiToken(au);
}
}
}
}
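The `tokenIsExpired()` method above compares the token's expiration `Timestamp` against the current time. Below is a minimal standalone sketch of the same comparison with an added null guard; `TokenExpiryCheck` and its `isExpired` helper are hypothetical names for illustration, not part of the Dataverse codebase.

```java
import java.sql.Timestamp;

// Standalone sketch of the expiration check used in tokenIsExpired() above.
// Unlike the page bean, this version treats a missing expiration time as
// expired rather than risking a NullPointerException.
public class TokenExpiryCheck {

    static boolean isExpired(Timestamp expireTime) {
        if (expireTime == null) {
            return true; // no expiration recorded: treat as unusable
        }
        return expireTime.before(new Timestamp(System.currentTimeMillis()));
    }

    public static void main(String[] args) {
        Timestamp past = new Timestamp(System.currentTimeMillis() - 1000L);
        Timestamp future = new Timestamp(System.currentTimeMillis() + 86_400_000L);
        System.out.println("past expired:   " + isExpired(past));
        System.out.println("future expired: " + isExpired(future));
    }
}
```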
src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java: 10 changes (10 additions, 0 deletions)
@@ -729,6 +729,16 @@ public void setShowAccessPopup(boolean showAccessPopup) {} // dummy set method
//fails silently. This is because they are only granted editDataset permissions
//for that scope after the creation is completed. -Matthew 4.7.1
public void restrictFiles(boolean restricted) throws UnsupportedOperationException{

//First make sure they've even selected file(s) to restrict....
if (this.getSelectedFiles().isEmpty()) {
if (restricted) {
PrimeFaces.current().executeScript("PF('selectFilesForRestrictEditFilesPage').show()");
} else {
PrimeFaces.current().executeScript("PF('selectFilesForUnRestrictEditFilesPage').show()");
}
return;
}

        // Since we are restricting files, first set the previously restricted file list, so we can compare for
// determining whether to show the access popup
@@ -437,7 +437,7 @@ public ApiToken findApiTokenByUser(AuthenticatedUser au) {
try {
return typedQuery.getSingleResult();
} catch (NoResultException | NonUniqueResultException ex) {
logger.log(Level.INFO, "When looking up API token for {0} caught {1}", new Object[]{au, ex});
logger.log(Level.FINE, "When looking up API token for {0} caught {1}", new Object[]{au, ex});
return null;
}
}
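The change above drops the log level for failed token lookups from `INFO` to `FINE`, so routine lookup misses no longer flood the server log: under `java.util.logging`, `FINE` records are filtered out when the logger's effective level is the usual `INFO` default. A small sketch of that filtering behavior:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Demonstrates java.util.logging level filtering: FINE is below INFO,
// so FINE records are dropped unless the logger level is lowered.
public class LogLevelDemo {
    public static void main(String[] args) {
        Logger logger = Logger.getLogger(LogLevelDemo.class.getName());

        logger.setLevel(Level.INFO); // typical production default
        System.out.println("FINE loggable at INFO: " + logger.isLoggable(Level.FINE));

        logger.setLevel(Level.FINE); // lower the level to see debug detail
        System.out.println("FINE loggable at FINE: " + logger.isLoggable(Level.FINE));
    }
}
```

To surface these messages again while debugging, an admin would lower the logger (and handler) level to `FINE` rather than changing the code back.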
src/main/java/propertyFiles/Bundle.properties: 6 changes (5 additions, 1 deletion)
@@ -397,10 +397,14 @@ oauth2.callback.message=<strong>Authentication Error</strong> - Dataverse could

# tab on dataverseuser.xhtml
apitoken.title=API Token
apitoken.message=Your API Token is displayed below after it has been created. Check out our {0}API Guide{1} for more information on using your API Token with the Dataverse APIs.
apitoken.message=Your API Token is valid for a year. Check out our {0}API Guide{1} for more information on using your API Token with the Dataverse APIs.
apitoken.notFound=API Token for {0} has not been created.
apitoken.expired.warning=This token is about to expire, please generate a new one.
apitoken.expired.error=This token is expired, please generate a new one.
apitoken.generateBtn=Create Token
apitoken.regenerateBtn=Recreate Token
apitoken.revokeBtn=Revoke Token
apitoken.expirationDate.label=Expiration Date

#dashboard.xhtml
dashboard.title=Dashboard
src/main/webapp/dataset.xhtml: 2 changes (1 addition, 1 deletion)
@@ -443,7 +443,7 @@
or !empty DatasetPage.datasetVersionUI.subject.value
or !empty DatasetPage.datasetVersionUI.relPublicationCitation
or !empty DatasetPage.datasetVersionUI.notes.value) and !empty DatasetPage.datasetSummaryFields}">
<div class="col-sm-12 metadata-container">
<div class="col-sm-12 metadata-container margin-bottom-half">
<table class="metadata">
<tbody>
<ui:repeat value="#{DatasetPage.datasetSummaryFields}" var="dsf">
src/main/webapp/datasetFieldForEditFragment.xhtml: 4 changes (2 additions, 2 deletions)
@@ -11,7 +11,7 @@
<!-- input text start UPDATE: UI:REMOVE applied due to duplicate ID errors, left code as reference incase test scripts complain -->
<span id="pre-input-#{dsf.datasetFieldType.name}" class="pre-input-tag"/></ui:remove>

<p:inputText value="#{dsfv.valueForEdit}" id="inputText" required="#{required}"
<p:inputText value="#{dsfv.valueForEdit}" id="inputText" pt:aria-required="#{dsf.required}"
styleClass="form-control #{dsfv.datasetField.datasetFieldType.name == 'title' and DatasetPage.editMode == 'CREATE' ? 'datasetfield-title' : ''}"
rendered="#{!dsfv.datasetField.datasetFieldType.controlledVocabulary
and (dsfv.datasetField.datasetFieldType.fieldType == 'TEXT'
@@ -22,7 +22,7 @@
or dsfv.datasetField.datasetFieldType.fieldType == 'EMAIL')}"/>
<p:watermark for="inputText" value="#{dsfv.datasetField.datasetFieldType.localeWatermark}"></p:watermark>

<p:inputTextarea value="#{dsfv.valueForEdit}" id="description" required="#{required}"
<p:inputTextarea value="#{dsfv.valueForEdit}" id="description" pt:aria-required="#{dsf.required}"
rows="5" cols="60" styleClass="form-control"
rendered="#{dsfv.datasetField.datasetFieldType.fieldType == 'TEXTBOX'}"/>
<p:watermark for="description" value="#{dsfv.datasetField.datasetFieldType.localeWatermark}"></p:watermark>
