D3.2 Research Outputs Assessments v1

Sources

Type: 
Language: 
English
Organisation: 
Date: 
Table of Contents: 

Scope; Executive Summary; 1. Research Outputs Identification; 2. Assessment Criteria; 3. Results of Research Outputs Assessment - Year 1; 4. Presto4U Dataset; 5. Conclusion and Future Work

Summary: 

This deliverable from the Presto4U project assesses “research output” (RO) tools in the digital audiovisual preservation domain that are designed to help solve problems faced by the project’s “Communities of Practice”. Such outputs include software, hardware and/or methodologies with potential for commercial take-up in the near future. The primary emphasis is on the outcomes of EU project research; the project hopes to address commercial developments in its second year.


Chapter 1 identifies the ROs chosen for assessment. The choice was based on three sources: the needs expressed during the various Community of Practice exchanges that took place in the project’s first year, commercial “technology watch” results, and the project partners’ own preservation needs. It goes on to describe a Presto4U tool, PrestoKAT, which supports the semi-automatic identification of tool features and scores them against their “technology readiness level” (TRL). The tools, classified into four broad categories (metadata mapping and validation, storage, quality assessment, and preservation platforms and systems), are then described.


Chapter 2 describes the assessment criteria for each broad category of tools, based on a project-modified version of the ISO/IEC 25023 standard for assessing software quality. What is assessed, the methodology and the functional tests carried out are presented, along with their importance (mandatory, required, optional, etc.).


Chapter 3 presents the tool assessment results. Seven tools were assessed; the evaluation was carried out by Presto4U partners JRS, IT Innovation, EURIX and RAI using their own hardware installations. Chapter 4 describes the Presto4U dataset used to carry out the tool assessment, with the goal of creating a publicly available, representative dataset for the assessment of AV preservation tools that can hopefully be offered for use under a common license by the end of the second project year.


In Chapter 5, the authors conclude by highlighting the main observations from the year 1 assessment exercise and provide some insight into how the task will continue in year 2 based on the lessons learned.

Review: 

This long report presents a great deal of detail and, as such, is not easy to get through; in addition, not all RO assessments were completed, for a variety of reasons. In fact, the deliverable is less valuable for its specific tool assessments than for its in-depth description of the kinds of criteria and metrics applicable to the quality assessment of audiovisual preservation tools. For those tasked with assessing tools for use within an audiovisual digital repository, this report provides a valuable, “ISO-customized” set of criteria and metrics specifically relevant to that setting. For actual tool assessments it is probably best to wait for the “tool catalog” the project will deliver by the end of its second year.

Keywords: 
Reviewer: 
Beth Delaney
Review date: 
2014