IQSS/dataverse v4.10


This release includes support for large data transfers and storage, a simplified upgrade process, and internationalization.

All installations will be able to use Dataverse's integration with the Data Capture Module, an optional component for the deposit of large datasets (both a large number of files and large file sizes). Specific support for large datasets includes client-side checksums, non-HTTP uploads (currently rsync over SSH), and preservation of the in-place directory hierarchy. This expands Dataverse to other disciplines and allows installations to handle large-scale data.
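
For a concrete sense of what such a transfer involves, here is an illustrative sketch only (the manifest name, host, and paths are hypothetical; the actual upload package and transfer script are generated by the Data Capture Module):

find mydataset -type f -exec sha1sum {} \; > files.sha
rsync -av -e ssh mydataset/ depositor@dcm.example.edu:/deposit/mydataset/

The first command records a client-side checksum for each file before transfer; the second moves the directory tree as-is over SSH.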

Administrators will be able to configure a Dataverse installation to allow datasets to be mirrored to multiple locations, enabling faster data transfers from closer locations, access to more efficient or cost-effective computation, and other benefits.

Internationalization features provided by Scholars Portal are now available in Dataverse.

Dataverse Installation Administrators will be able to upgrade from one version to another without the need to step through each incremental version.

Configuration options for custom S3 URLs are now available for Amazon S3-compatible storage.
See the Configuration documentation for details.
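
As a sketch of what this configuration might look like (the endpoint URL and region below are hypothetical, and the exact JVM option names and values should be taken from the Configuration guide rather than from this example):

<glassfish install path>/glassfish4/bin/asadmin create-jvm-options "-Ddataverse.files.s3-custom-endpoint-url=https\://s3.example.edu"
<glassfish install path>/glassfish4/bin/asadmin create-jvm-options "-Ddataverse.files.s3-custom-endpoint-region=us-east-1"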

For the complete list of issues, see the 4.10 milestone in GitHub.

For help with upgrading, installing, or general questions please email support@dataverse.org.

Installation:

If this is a new installation, please see our Installation Guide.

Upgrade:

  1. Undeploy the previous version.
  • <glassfish install path>/glassfish4/bin/asadmin list-applications
  • <glassfish install path>/glassfish4/bin/asadmin undeploy dataverse
  2. Stop glassfish, remove the generated directory, and start glassfish again.
  • service glassfish stop
  • remove the generated directory: rm -rf <glassfish install path>/glassfish4/glassfish/domains/domain1/generated
  • service glassfish start
  3. Deploy this version.
  • <glassfish install path>/glassfish4/bin/asadmin deploy <path>/dataverse-4.10.war
  4. Run the database update script:
psql -U <db user> -d <db name> -f upgrade_v4.9.4_to_v4.10.sql
  5. Restart glassfish.
  6. Update the citation metadata block:
curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @citation.tsv -H "Content-type: text/tab-separated-values"
  7. Restart glassfish.
  8. Replace the Solr schema.xml and, optionally, solrconfig.xml to change the search results boost logic.
  • stop the solr instance (service solr stop; the exact command depends on your solr installation/OS, see http://guides.dataverse.org/en/4.10/installation/prerequisites.html#solr-init-script)
  • replace schema.xml and, optionally, solrconfig.xml:
cp /tmp/dvinstall/schema.xml /usr/local/solr/solr-7.3.0/server/solr/collection1/conf
cp /tmp/dvinstall/solrconfig.xml /usr/local/solr/solr-7.3.0/server/solr/collection1/conf
  • start the solr instance (service solr start; the exact command depends on your solr installation/OS)
  9. Kick off an in-place reindex:
    http://guides.dataverse.org/en/4.9.3/admin/solr-search-index.html#reindex-in-place
curl -X DELETE http://localhost:8080/api/admin/index/timestamps
curl http://localhost:8080/api/admin/index/continue
  10. Retroactively store original file sizes.

Starting with release 4.10, the size of the saved original file (for an ingested tabular datafile) is stored in the database. We provide the following API to retrieve and permanently store the sizes for any already existing saved originals: /api/admin/datafiles/integrity/fixmissingoriginalsizes (see the documentation note in the Native API guide, under "Datafile Integrity").

These sizes will become required in a later version (specifically 5.0). In this version, having them makes certain operations more efficient; the primary example is a user downloading the saved originals for multiple files or an entire dataset. Also, when present in the database, the size is added to the file information in the output of the /api/datasets API, which can be useful for some users.

  11. Run ReExportall to generate JSON-LD exports in the new format added in 4.10: http://guides.dataverse.org/en/4.10/admin/metadataexport.html?highlight=export#batch-exports-through-the-api
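
Per the linked guide, the batch re-export can be kicked off with a single admin API call along these lines (again assuming the admin API is reachable on localhost):

curl http://localhost:8080/api/admin/metadata/reExportAll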

A note on upgrading from older versions:

If you are upgrading from v4.x, you must upgrade to each intermediate version before installing this version, with the exception of the database updates noted below.

We now offer an EXPERIMENTAL database upgrade method that allows users to skip over a number of releases. For example, it should now be possible to upgrade a Dataverse database from v4.8.6 directly to v4.10, without having to deploy the war files for the five releases between these two versions and manually run the corresponding database upgrade scripts.

The upgrade script, dbupgrade.sh, is provided in the scripts/database directory of the Dataverse source tree. See the file README_upgrade_across_versions.txt for the instructions.
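
The general shape of the process, assuming a checkout of the Dataverse source tree (the README is authoritative for the inputs the script actually expects):

cd scripts/database
./dbupgrade.sh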
