Workshop on Research Objects 2019

Peer Review of RO-9

Review 1

Quality of Writing

Is the text easy to follow? Are core concepts defined or referenced? Is it clear what is the author’s contribution?

Research Object / Zenodo

URL for a Research Object or Zenodo record provided?
Guidelines followed?
Open format (e.g. HTML)?
Sufficient metadata, e.g. links to software?
Some form of Data Package provided?
Add text below if you need to clarify your score.

Overall evaluation

Please provide a brief review, including a justification for your scores. Both score and review text are required.

This abstract presents the motivation, design and initial evaluation of the DSDB tool for handling the provenance, versioning, de-duplication, history, and querying of dynamic databases used at the Allen Institute for Cell Science. The tool allows researchers to share and use datasets without duplication, and also performs a series of validations (e.g., verifying that file paths exist and flagging incorrect input values).
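
To give a sense of what is meant here, the following is a minimal, purely illustrative sketch of the kind of validation step described (file-path existence, invalid input values); the function, field names and error messages are hypothetical and not the authors' implementation.

```python
# Hypothetical sketch of dataset-record validation of the kind described
# in the abstract; field names ("file_path", "num_cells") are invented.
from pathlib import Path
from typing import Dict, List


def validate_record(record: Dict[str, object]) -> List[str]:
    """Return a list of validation errors for one dataset record."""
    errors: List[str] = []

    # Check that the referenced file actually exists on disk.
    path = Path(str(record.get("file_path", "")))
    if not path.exists():
        errors.append(f"missing file: {path}")

    # Check that a numeric field holds a sensible value.
    if not isinstance(record.get("num_cells"), int) or record["num_cells"] < 0:
        errors.append("invalid value for 'num_cells'")

    return errors
```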

According to the evaluation, the tool has proved to be beneficial in terms of efficiency gains in storage, versioning and documentation; the main limitation at the moment is slower retrieval times, especially for large databases, which the authors plan to investigate in future work.

Perhaps the most interesting aspect for this workshop is the conceptual model of DSDB, i.e., the DSDB database schema, which shares some concepts with the research object model. In fact, DSDB can be seen as a different kind of implementation and approach for describing and associating datasets and a few related artefacts so that they can be more easily shared and exchanged. Thus, although the core of DSDB is the dataset, it also includes elements for algorithms, executions and annotations.
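
To make the comparison concrete, here is a minimal sketch of how the core concepts named above (dataset, algorithm, execution, annotation) might be related; all class and field names are hypothetical and do not reflect the actual DSDB schema.

```python
# Illustrative model of the concepts mentioned in the abstract; not the
# authors' schema. A research object would aggregate much the same
# elements, plus richer machine-readable relations between them.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Dataset:
    dataset_id: str
    version: int
    file_paths: List[str]                       # de-duplicated references, not copies
    annotations: List[str] = field(default_factory=list)


@dataclass
class Algorithm:
    name: str
    version: str


@dataclass
class Execution:
    algorithm: Algorithm
    inputs: List[Dataset]
    outputs: List[Dataset]                      # provenance: which run produced which version
```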

As all of these, plus many other elements, may be part of a research object, it would be good to discuss and investigate the possibilities, and the potential benefits (if any) for the Allen Institute, of connecting to or reusing the research object concept and approach more directly in future work.

With research objects at the core, datasets and their versions, along with other relevant artefacts, could be encapsulated, discovered, shared and reused more easily, with a richer set of machine-readable relations and annotations. The workshop would be a good place to discuss this.

Review 2

Quality of Writing

Is the text easy to follow? Are core concepts defined or referenced? Is it clear what is the author’s contribution?


Typo in the second sentence: “produce” -> “produces”.

Research Object / Zenodo

URL for a Research Object or Zenodo record provided?
Guidelines followed?
Open format (e.g. HTML)?
Sufficient metadata, e.g. links to software?
Some form of Data Package provided?
Add text below if you need to clarify your score.

Overall evaluation

Please provide a brief review, including a justification for your scores. Both score and review text are required.

This paper provides a case study in computational modelling in cell science and a proposed toolset, which is evaluated.

I believe that the scenario is more broadly applicable as it addresses the phase of the lifecycle where data is live, in use and mutable (described as “no final version” - I really like this characterisation). The main contribution for the workshop audience is perhaps the way this system addresses the handling of provenance.

Although it doesn’t discuss Research Objects explicitly, the case study defines requirements, describes tools and performs an evaluation that directly inform a discussion of the research objects approach - and especially of its implementation.

I believe this to be a useful contribution to the workshop because the challenges addressed in this case study are important and not yet a solved problem in the Research Objects world.