# Version 1.0 Beta 2 release

Historical release

## Release notes

### Major features / changes
- Download all data, either as a zip or a shell script with links to download directly from S3 (#319, #322, #356)
  - This needs the following ENV variables to be set:
    `DOWNLOAD_FILES_SIZE_LIMIT=100000000 DOWNLOAD_PATH=/shared/downloads DOWNLOAD_FILES_COUNT_LIMIT=150`
  - A copy of the metadata is stored in `metadata.json` in S3, to enable users to download the metadata from S3 (**backwards incompatible change to S3 structure**). This is created for the work and for each subject, session and modality within the work.
  - The system metadata has been moved to a file called `system_metadata.json` (**backwards incompatible change to S3 structure**)
  - Add `meta.json` within modalities
  - When a subject, session or modality is modified, the `updated_date` of the experiment is also modified
  - If a zip file is present and it was generated after the last modified date of the work, it is not generated again and the same file is served
  - If a shell file is present, was generated after the last modified date of the work, and is no more than a day old, it is not generated again and the same file is served
  - The validity of the pre-signed URLs used in the shell file is set to 24 hours
- Download search results as a shell script with links to download directly from S3 (#274)
- Changes to data structure in S3 (**backwards incompatible change to S3 structure**)
  - All object keys are sanitised to follow S3 naming rules (#308, #312 and #318)
  - Improved data structure in S3 (#154, #341)
    - Object keys no longer have a `files` directory
    - Object keys will not start with a slash
  - Data structure in S3 is modified to reflect the new data tree when subject, session or modality titles are modified (#335)
- Sort collections and filesets in alphanumeric order (#282) (**requires a change to the Solr schema, and data will need to be re-indexed to see the sorting improvements, if updating from an earlier version**)
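The ENV variables for the download feature can be exported in the deployment shell (or placed in a `.env` file). A sketch using the names and values from these notes; the per-variable comments are my reading of the names, not confirmed behaviour:

```shell
# Environment for the "download all data" feature (names from these release
# notes; values are the documented examples, tune them to your deployment).
export DOWNLOAD_FILES_SIZE_LIMIT=100000000   # presumably a cap on total download size in bytes
export DOWNLOAD_PATH=/shared/downloads       # where generated zip / shell files are stored
export DOWNLOAD_FILES_COUNT_LIMIT=150        # presumably a cap on the number of files per download
```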
### Minor fixes and UI improvements
- Subject, session and modality titles are sanitised and fixed during import (#334)
- General UI fixes / improvements
  - Social media form entries removed from profile edit page (#79)
  - Allow users to add hundreds of files through the UI (#84)
  - Error pages have the RDMS template (#238)
  - Changed deposit action label in workflow to be user friendly (#255)
  - Fix for uploaded file validation (#266, #308)
  - Added `tombstoned` and `published` tabs to review submissions (#270)
  - Fixed `published` and `draft` tabs in review submissions, along with pagination for each tab (#270)
  - Fix for single-use download link when downloading a file (#309)
  - Fix for error when editing a fileset (#320)
  - Rebrand product to ReSeed (#323)
- CRC 1280 UI changes
  - Changed label "subject name" to "subject title" (#136)
  - Improved modality form to autofill the modality title based on the modality value (#140)
  - Send notification to all participants on publish, archive or tombstone (#248)
  - Display workflow comments in descending date order (#249)
  - Removed public view link from collection show page and fixed pagination for public view (#250)
  - Display date picker below the date field in the sessions form for better visibility (#252)
  - Restrict CRC 1280 groups to only accept experiments, and experiments to only belong to CRC 1280 groups (#254)
  - Fix for JavaScript validation for subjects, sessions and modalities (#257)
  - Simplified CRC 1280 experiment form by adding default values for language and funder (#260, #269)
  - Groups, experiments, subjects, sessions, modalities and files are ordered alphabetically (#282 and #330)
  - Fix for incorrect values for subject, session and modality when editing (#284)
  - Modified breadcrumb for CRC 1280 experiment to include group (#294)
## Rake tasks to fix the data structure in S3
The data structure in S3 has been modified. If upgrading from an earlier version, run the following rake tasks in the web container (after this code has been deployed) to change the structure in S3 to the new data structure.

SSH to the web container:

```shell
docker exec -it rdms_web_1 /bin/bash
```
1. Rake tasks to change the data structure in S3 (#154, #341)
   - Object keys no longer have a `files` directory
   - Object keys will not start with a slash

   ```shell
   bundle exec rake rdms:remove_files_folders_from_s3
   bundle exec rake rdms:remove_forward_slash_from_starting_of_s3_keys
   ```

2. Rake task to add the metadata files needed for download
   - Replace `metadata.json` with `system_metadata.json`
   - Add user `metadata.json`
   - Add `meta.json` within modalities

   ```shell
   rake rdms:fix_metadata_in_s3
   ```
You need to run the rake tasks in order: first the two in step 1, then the one in step 2. If step 1 is run after step 2, the user `metadata.json` will get overwritten with the old `metadata.json` (now called `system_metadata.json`).
The rake tasks change the data structure and create the required metadata files in the buckets starting with the configured S3 prefix. They need to be run only once, although re-running them will do no harm. The outcome of the rake tasks can be observed when you download a work.
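The ordering constraint above can be captured in a small wrapper script. A sketch, assuming it is run inside the web container; the `run` helper and `RUN_TASKS` flag are illustrative scaffolding, not part of RDMS. By default it only prints the commands; set `RUN_TASKS=1` to execute them:

```shell
# Run the S3 migration rake tasks in the documented order: step 1 before step 2.
set -e                        # abort on the first failure so order is preserved
ORDER=""

run() {
  echo "bundle exec rake $1"
  ORDER="$ORDER $1"           # record what ran, in order
  if [ "$RUN_TASKS" = "1" ]; then
    bundle exec rake "$1"
  fi
}

# Step 1 (#154, #341): fix the object keys first
run rdms:remove_files_folders_from_s3
run rdms:remove_forward_slash_from_starting_of_s3_keys
# Step 2: then create the metadata files (must come after step 1)
run rdms:fix_metadata_in_s3
```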
## Updating the Solr schema and reindexing the data

The Solr schema has been modified to enable alphanumeric sorting of collection, work and fileset titles. If upgrading from an earlier version, please follow the steps below.

- The modified Solr schema is in the code base at `hyrax/solr/conf/schema.xml`.
- The Solr config directory `hyrax/solr/conf` gets mounted within the Docker Solr container at `/opt/solr/solr_conf`.
- The Solr instance directory for the core `hyrax_production` is located at `/var/solr/data/hyrax_production` within the Solr container.
### To update the Solr schema

Method 1: copy the new `schema.xml` to `volumes/solr/data/hyrax_production/schema.xml`, making sure the new file retains the same file access permissions as the old file, before starting RDMS.
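One way to keep the permissions intact during the copy is a small helper; a sketch, with the `replace_keeping_mode` name being mine, not from RDMS (`stat -c` is GNU coreutils; on macOS use `stat -f '%Lp'` instead):

```shell
# Overwrite a file with new content while preserving its access mode.
replace_keeping_mode() {
  src=$1; dest=$2
  mode=$(stat -c '%a' "$dest")   # remember the old file's permissions
  cp "$src" "$dest"              # overwrite with the new content
  chmod "$mode" "$dest"          # restore the original permissions
}

# usage (paths from these release notes):
# replace_keeping_mode hyrax/solr/conf/schema.xml volumes/solr/data/hyrax_production/schema.xml
```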
Alternative method:

1. SSH into the Solr container:

   ```shell
   docker ps
   docker exec -it rdms_solr_1 /bin/bash
   ```

2. Copy the updated `schema.xml` from `/opt/solr/solr_conf` to `/var/solr/data/hyrax_production`:

   ```shell
   cp /opt/solr/solr_conf/schema.xml /var/solr/data/hyrax_production/schema.xml
   ```
3. Reload the Solr core. Restarting the container should restart Solr, which in turn will reload the core with the new config. Alternatively, reload the core from the Solr UI's core admin page: `http://localhost:8983/solr/#/~cores/hyrax_production`

Tip: to access the Solr UI running within the Docker container, forward port 8983 and bind it to your local port using SSH, for example:

```shell
ssh host -N -L 8983:localhost:8983
```
If you had any CRC 1280 data imported, you will need to re-index all of the data. Run `ActiveFedora::Base.reindex_everything` from the Rails console of the web container. You may want to do this in `screen` or using `nohup`, as it could take a while to reindex, depending on the data size.
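One way to kick the reindex off detached, sketched from the host; it assumes the container name `rdms_web_1` used earlier in these notes and uses `rails runner`, which evaluates the same expression the console would:

```shell
# Start the full reindex in the background so a dropped SSH session cannot
# kill it; all output goes to reindex.log in the current directory.
nohup docker exec rdms_web_1 \
  bundle exec rails runner 'ActiveFedora::Base.reindex_everything' \
  > reindex.log 2>&1 &
echo "Reindex started; follow progress with: tail -f reindex.log"
```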