
Saturday, 13 November 2010

Oxford Research Data Management Pages

The University of Oxford has launched the Research Data Management Website. This thematic site has been developed by Research Services in collaboration with OUCS and OULS as part of the EIDCSR Project.


The RDM website is designed to support researchers with their research data management activities and includes information about:

  • research funder requirements in the area of research data management
  • services available within the University to assist researchers in this area
  • guidance on how to produce a data management plan as part of a funding application
  • further sources of advice and online guidance, updates and news, and available tools and training.

Previously, web-based information about research data management was spread across a number of sites within the University. It was felt that a single source of 'signposting' information would be a valuable resource for researchers from all subject disciplines at differing stages of the research cycle, increasing understanding of the benefits of improved research data management and communicating the range of services available.

Friday, 7 May 2010

An interesting new project: Data Management for Bio-Imaging

Data Management for Bio-Imaging, a new data management project funded by JISC, has just created a wiki that will hold relevant information about the project.


The aim of the project is to generate a better understanding and planning of data management for bio-imaging within the John Innes Centre (JIC).

The project plans to document the data flows and infrastructure in the Coen Lab and the JIC Bio-Imaging service. Both use sophisticated instruments, such as light microscopes, CCD systems and confocal microscopes, that generate terabytes of imaging data.

To address their data management needs they are deploying OMERO, the Open Microscopy Environment's image data management platform, which offers features such as the following (see the sketch of scripted access after the list):

  • Managing and organizing
  • Search and browsing
  • 3D projection
  • Metadata, annotation and tagging
  • Sharing, exporting and importing
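
As an illustration of what scripted access to such a system can look like, here is a minimal sketch using the omero-py BlitzGateway client to connect to a server, browse its projects and tag an image. The host, credentials and image name are illustrative assumptions, not details from the project; consult the OMERO documentation for the definitive API.

    # Minimal sketch, assuming the omero-py package and a reachable OMERO server.
    # Host, port, credentials and the image name are illustrative placeholders.
    from omero.gateway import BlitzGateway, TagAnnotationWrapper

    conn = BlitzGateway("username", "password", host="omero.example.org", port=4064)
    if conn.connect():
        # Browse: walk projects, their datasets and the images they contain.
        for project in conn.getObjects("Project"):
            print("Project:", project.getName())
            for dataset in project.listChildren():
                print("  Dataset:", dataset.getName())
                for image in dataset.listChildren():
                    print("    Image:", image.getName())

        # Annotate: create a tag and link it to a named image.
        image = conn.getObject("Image", attributes={"name": "confocal_stack_001"})
        if image is not None:
            tag = TagAnnotationWrapper(conn)
            tag.setValue("coen-lab-confocal")
            tag.save()
            image.linkAnnotation(tag)

        conn.close()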

In addition, they will train users, including post-docs, on the system, and will define strategies to handle user acceptance and encourage image processing.

This is an extremely interesting activity and we'll surely keep a close eye on it.

Friday, 23 October 2009

EIDCSR technical analysis: from soft to hard

After conducting the EIDCSR audit and requirements analysis exercise, we have started converting the high-level requirements gathered into technical requirements. The idea is to produce a systems design document from which a Systems Developer can start the implementation. Howard Noble, from Computing Services, is leading this exercise for the next two months.

To start the technical analysis, Howard and I had a very fruitful meeting this morning. We brainstormed ideas for a high-level system design, trying to identify the practical things that can be done to support the data management workflows of the research groups taking part in EIDCSR.


Using a board to produce a "rich picture" recording the processes we have encountered, along with our own thoughts, was extremely useful. We will now produce a "cleaner" version of this picture and bring it to key people in the research groups in a workshop. This will hopefully help us to communicate what the project aims to achieve, as well as to gather feedback on the design so that researchers' requirements drive any development.

Wednesday, 9 September 2009

Data audit and requirements analysis

One of the initial exercises to be conducted as part of the EIDCSR project was an audit and requirements analysis based on the Data Audit Framework (DAF), documenting the data practices and assets of the research groups participating in the project and capturing their requirements for tools and services. This exercise took place throughout the summer and the report describing the results will be available soon.

As I explained in a previous post, these research groups collaborate as part of a BBSRC grant to conduct research on ventricular architecture, using novel techniques such as Magnetic Resonance Imaging (MRI) and Diffusion Tensor MRI (DTMRI) and combining them with traditional histological techniques, as well as with image processing, data registration and computational models for bio-mathematical simulation.

Their research workflow is well described by Plank et al. (2009)* in the diagram below. It starts with the generation of complementary image stacks that are then processed in different ways to generate meshes that can be used for computational modelling of the heart.


This complex process produces the following data outputs:
  • Histology data: large, high-resolution images produced by microscopes in the lab, representing sections of a heart.
  • MRI and DTMRI data: stacks of TIFF images derived from the raw data produced by the magnet in the lab (see the loading sketch after this list).
  • Segmentation data: outputs resulting from applying image segmentation techniques to the histology and MRI data.
  • Mesh data: volumetric models produced from the segmented data in a mesh generator.
  • Simulations: electrophysiological simulations using the mesh data and other input files that define the models and the parameters.
  • 3D heart atlas: an average representation of the heart ventricles obtained from the histology and MRI data.
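
To make the shape of these outputs concrete, here is a minimal sketch that loads a directory of TIFF slices into a single 3-D volume, a typical first step before segmentation. It assumes the third-party numpy and tifffile packages; the directory and filename pattern are illustrative, not the groups' actual layout.

    # Minimal sketch, assuming the numpy and tifffile packages are installed.
    # The directory and filename pattern are illustrative placeholders.
    import glob

    import numpy as np
    import tifffile

    # Collect the slice files in acquisition order (zero-padded names sort correctly).
    slice_paths = sorted(glob.glob("mri_stack/slice_*.tif"))

    # Read each 2-D slice and stack them into one (depth, height, width) volume.
    volume = np.stack([tifffile.imread(path) for path in slice_paths])

    print("Volume shape:", volume.shape)  # e.g. (512, 1024, 1024)
    print("Voxel dtype:", volume.dtype)   # e.g. uint16 for 16-bit scans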
The research groups' requirements can be grouped under three themes:
  • Secure storage: all the data outputs presented above are stored on a combination of desktop computers and a project NAS system, and researchers recognize the need to keep the data safe through appropriate and resilient back-up procedures.
  • Data transfer: the histology data are large and need to be accessed by researchers within the groups and beyond.
  • Metadata: currently the provenance metadata for some of the data presented above are recorded in printed lab-books. This information is crucial when making the data available to others and is required when publishing articles based on the data. It may also help to improve searching within the NAS system (see the sketch after this list).
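
As one illustration of how the storage and metadata themes might be tackled together, the sketch below walks a data directory, computes a SHA-256 fixity checksum for each file and writes a JSON manifest with placeholder provenance fields. The paths and fields are assumptions for illustration, not the project's actual design.

    # Minimal sketch using only the Python standard library.
    # The directory path and provenance fields are illustrative placeholders.
    import hashlib
    import json
    from pathlib import Path

    def sha256sum(path: Path) -> str:
        # Read in chunks so multi-gigabyte imaging files do not exhaust memory.
        digest = hashlib.sha256()
        with path.open("rb") as handle:
            for chunk in iter(lambda: handle.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def build_manifest(data_root: Path) -> list:
        # One record per file: relative path, size, fixity checksum, plus
        # provenance fields to be filled in (held in printed lab-books today).
        return [
            {
                "path": str(path.relative_to(data_root)),
                "bytes": path.stat().st_size,
                "sha256": sha256sum(path),
                "experiment": None,
                "instrument": None,
            }
            for path in sorted(data_root.rglob("*"))
            if path.is_file()
        ]

    manifest = build_manifest(Path("nas/heart_project"))
    Path("manifest.json").write_text(json.dumps(manifest, indent=2))

Re-running the checksum pass against the manifest would then verify that back-up copies on the NAS remain intact, while the recorded fields give a simple basis for search.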
* Plank, G., Burton, R.A.B., Hales, P., Bishop, M., Mansoori, T., Bernabeu, M.O., Garny, A., Prassl, A.J., Bollensdorff, C., Mason, F., Mahmood, F., Rodriguez, B., Grau, V., Schneider, J.E., Gavaghan, D. and Kohl, P. (2009). Generation of histo-anatomically representative models of the individual heart: tools and application. Phil. Trans. R. Soc. A, 367, 2257-2292.
