Methodology for the Environmental Assessment – shared

As discussed during the JISCMRD Programme Launch in Nottingham, projects thought it would be good to share what each one is doing regarding the Data Asset Framework (DAF) and/or gathering user needs, in order to see if it can be used/re-used by the other projects. We have been describing Kaptur’s approach, and the rationale for it, in a series of blog posts. Previous posts on this topic can be found by searching the tag ‘environmental assessment’; the two most relevant are ‘Methodology for the Environmental Assessment’ and ‘Environmental Assessment interview questions’. Feedback is welcomed.

Kaptur is not using DAF, although we have considered what can be learned from the DAF approach. DAF provides institutions with a means to:

“identify, locate, describe and assess how they are managing their research data assets”

DAF Screencast

A comprehensive website is available: http://www.data-audit.eu/. This includes the DAF Implementation Guide (PDF).

DAF recommends that you begin by deciding what you mean by ‘data assets’; for example, the guide mentions:

“numerical data, statistics, output from experimental equipment, survey results, interview transcripts, databases, images or audiovisual files, amongst other things”

DAF Implementation Guide (PDF), p. 7

Our initial probing interviews and research in the area of visual arts data tell us that we are not yet ready to pin this down to specific assets, although potentially all of the above could be included. One issue arising from the probing interviews was what ‘research data’ means in the first place. We decided to undertake formal interviews to gather detailed qualitative information that could better inform Kaptur and help to build relationships with visual arts researchers at the four institutions. This approach, whilst not following DAF exactly, also included questions about the types of data asset researchers were producing and how these were being managed.

The scope of the Kaptur Environmental Assessment report has been defined in our methodology, which we make available for use and re-use.

Following the imminent publication of our report, the next stage is to establish working groups in each institution, both to continue the dialogue with visual arts researchers and to encompass a wider range of stakeholders. We have been looking at the CARDIO assessment tool, particularly as it is designed to “improve communication and understanding” between stakeholders. However, CARDIO is normally used following a more formal data audit procedure, so we may adapt its approach to suit our timescales and circumstances. For example, there is a clear benefit to holding face-to-face meetings with all the stakeholders, and this will take priority; questions or elements of the CARDIO tool may then be used to inform the agenda for these meetings. This is yet to be discussed, and will be raised at the Steering Group meeting on Monday as part of the Implementation Plan.


#jiscmrd programme launch – Commonalities

Simon Hodson, JISCMRD Programme Manager, has asked all projects to do a short blog post about commonalities.

View from the National College for School Leadership, Nottingham. Photo: MTG

Kaptur has previously highlighted the commonalities with the first round of JISCMRD programme funding (2009–11): we plan to use training materials produced by Project CAiRO and have also spent time looking at JISC Incremental. The commonalities identified so far from the JISCMRD Programme launch are:

1. Disciplinary

The session on the last day put a few of the projects together in an ‘Arts and Humanities’ group. Several of the projects in that group are particularly relevant to us.

2. Pilot infrastructure

Kaptur is one of 17 projects in Strand A of the JISCMRD programme (see Simon Hodson’s blog post on this) – we are therefore seeking both to learn lessons from more experienced projects in this strand (those with previous JISCMRD funding or links) and to find out how similar pilot projects are approaching things.

3. Approach

  • During the Programme Launch there was a lot of talk about DCC tools including DMP Online, DAF, and CARDIO – look out for a future blog post about our environmental assessment methodology.
  • We are also keen to learn lessons from the MaDAM project, which is now MiSS (MaDAM into Sustainable Service) – http://www.miss.manchester.ac.uk/ (great URL!)
  • Research360@Bath looks good too!

Please let me know if I have overlooked any projects that are relevant to Kaptur – we are interested in engaging with other projects and welcome feedback!


#jiscmrd – Kaptur’s post on benefits and metrics #KRDS

Simon Hodson, JISCMRD Programme Manager, has asked all 18-month JISCMRD projects to write a blog post about the key expected benefits that each project will achieve, and the metrics we will use to evidence these at the end of the project.


Following a presentation by Neil Beagrie, Director of Consultancy at Charles Beagrie, the JISCMRD projects were provided with a ‘Summary of Benefits Identified by the RDMI Projects’ and a ‘Summary of Metrics Identified by the RDMI Projects’. We were invited to select three benefits and then match these with the appropriate metrics, making sure to include both quantitative and qualitative metrics for each benefit. I would like to emphasise that the following has not yet been discussed within the project team and is subject to confirmation.

Benefits

  1. Sustainability of research data infrastructure.
  2. Change to user practices.
  3. Mitigating organisational risks.

Metrics

  1. By each institution creating and approving its own Business Costs and Sustainability plan; the ultimate proof will be the longevity of the research data infrastructure. Qualitative data will be gathered through the Steering Group meetings, which will include high-level senior staff across the four institutions. Quantitative data may include percentage or estimated cost savings/efficiencies for central services and/or departments.
  2. By taking a snapshot of existing practice at the four institutions through the Environmental Assessment report, then maintaining user engagement throughout the project and taking further snapshots at key stages to monitor progress. Qualitative data will be gathered through the interviews undertaken as part of the Environmental Assessment report and through ongoing engagement, e.g. working groups and/or focus groups. Quantitative data will be gathered through online questionnaires and/or feedback forms recording the project’s impact on working practice; these will be undertaken at key points (e.g. we are planning an online survey in January) and after training events. If online training materials are created, usage statistics will also be gathered.
  3. By taking the same snapshot-based approach as above. Qualitative data will include a range of stakeholder examples of improved risk management, e.g. organisational practice before Kaptur and how this changed during/afterwards. Quantitative data may include a percentage improvement in routine back-up of data and/or in research data management awareness and policies/systems (a minimal illustration of this kind of metric follows this list).
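
To make the quantitative side of metrics 2 and 3 concrete, here is a minimal sketch of how a percentage-point improvement in routine back-up could be computed from before/after survey snapshots. All figures and function names are hypothetical, purely for illustration – they are not Kaptur data:

```python
# Hypothetical before/after survey snapshots (not real Kaptur figures):
# counts of respondents reporting routine back-up of their research data.

def backup_rate(backing_up: int, total: int) -> float:
    """Share of respondents reporting routine back-up."""
    return backing_up / total

before = backup_rate(18, 60)  # baseline: Environmental Assessment snapshot
after = backup_rate(33, 60)   # later snapshot at a key project stage

improvement_points = (after - before) * 100
print(f"Routine back-up: {before:.0%} -> {after:.0%} "
      f"(+{improvement_points:.0f} percentage points)")
```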

Neil emphasised that when using the KRDS tools it is a good idea for one individual to do the initial work and then develop it in a project team context. The points raised in this blog post will therefore be discussed at our next meeting in early January and possibly again at the Steering Group meeting. Neil also mentioned that it is important to adapt the tools to your project’s needs. From reading the documentation I am also aware of the need to start with the benefits framework tool before moving on to the value chain and benefits impact tool. The example worksheets produced by other projects are really useful, in particular those from the UK Data Archive and the Archaeology Data Service.

Reference: Report and Presentations from the JISC Digital Curation/Preservation Benefits Tools Project Dissemination Workshop


#jiscmrd Programme Launch – Tips from Brian Kelly


Tips from Brian Kelly

  1. Instead of having a separate website and blog, integrate the web pages into the blog or vice versa.
  2. Think about what will happen to the blog if you leave or what happens after the end of the project.
  3. Use the ‘about’ page to say how you are going to use the blog, e.g. your blogging practices and approach, and why – see Blog Policies.
  4. Write a ‘Communications Strategy’.
  5. Engage with high-impact channels: think of a human-interest angle to your story, and be proactive in seeking opportunities, e.g. when a relevant news story relates to your project in some way.
  6. Write a ‘Press Release from the Future’ as a way of setting where you want to be and then working out how you will achieve it.

How we are planning to use these for Kaptur

  1. We plan to integrate the website and blog; our Technical Manager is back in mid-January, so we may ask him to set up a wordpress.org installation on our website instead of the current wordpress.com blog (this will require redirects and is not ideal, so we will think about it first).
  2. We are working towards a Data Management Plan for our project’s research data, including the blog; a blog post about this will be forthcoming.
  3. We have now updated our About page to include a blog policy.
  4. Although the JISC Project Plan contains various plans, such as a Dissemination Plan (ours is available here: Kaptur Project Plan (PDF)), Brian was talking specifically about how to target high-impact channels, e.g. Times Higher Education, radio, and TV. See point 5 for tips.
  5. The visual arts researchers that we are engaging with at each of the institutions have the potential to address these points, subject to the ‘informed consent’ research criteria we are applying. We will look out for these opportunities.
  6. This is on my ‘to-do’ list; ours may include ‘having an article in Times Higher Education’.

On a side note, I chose to use SlideShare for Kaptur after reading Brian Kelly’s blog post about SlideShare, which I then blogged about for a different project back in May: Identifying impact with SlideShare

I have also begun using Storify after Brian showed me how the Event Amplifier uses this. A Storify has been created for this week’s Kultivate event on Linked Data.


#jiscmrd – Day 2 – session report on ‘identifying and supporting researcher requirements’

This session was presented by Jonathan Tedds, Senior Research Liaison Manager (IT Services) at the University of Leicester, and Meik Poschen, Requirements and Evaluation Lead at the Manchester eResearch Centre (MeRC), University of Manchester. Both presentations were interactive, so the key points in this post are a mixture of points from their slides and from general discussion. The group raised the issue of commonality in approaches and methodologies for identifying researcher requirements; an outcome for the programme could be projects sharing what they are doing in this area. The Sustainable Management of Digital Music Research Data project has written a few blog posts about their methodology.


MaDAM – key points

  • researchers themselves were a useful source of information, but it was also important to talk to experimental officers, other PIs, and other research groups in order to see the institution’s wider picture
  • for user-group scoping, try to select groups where there could be mutual benefit – researchers should also gain from taking part and giving their time
  • they used an iterative approach: developers and users had a lot of meetings; the project team observed researchers’ work practice and then involved them in evaluating the technical system and soft infrastructure
  • funder requirements are now clearer – over the course of an 18-month project things change with both researchers and funders
  • one of the huge benefits of MaDAM was awareness raising – at the beginning it may have been perceived as increasing researchers’ workload, but this perception changes over time as funder requirements change – data management is now viewed favourably
  • cultural change is needed, and high-level institutional support is crucial too
  • importance of personal contact and observation of researchers’ day-to-day research practice
  • some questions raised (a back-of-envelope storage projection follows this list):
    how much storage will researchers need over time? how long does data need to be kept in an active or easily accessible state for re-use or sharing? how will the relationship between new policies and research practices develop? how will dissemination practices, and hence scholarly communications, be affected?
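
The first of those questions lends itself to a rough back-of-envelope projection. Below is a minimal sketch with entirely hypothetical figures – the 50 GB starting volume and 40% annual growth are made up for illustration, and real values would come out of requirements gathering:

```python
# Hypothetical projection of one researcher's storage need over time.
# The starting volume and growth rate are illustrative assumptions,
# not figures from MaDAM, Leicester, or Kaptur.

def projected_storage_gb(initial_gb: float, annual_growth: float, years: int) -> float:
    """Storage after `years`, assuming compound growth of `annual_growth` per year."""
    return initial_gb * (1 + annual_growth) ** years

# e.g. a researcher starting with 50 GB whose data grows by 40% a year
for year in range(6):
    print(f"Year {year}: {projected_storage_gb(50, 0.40, year):,.1f} GB")
```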

The group felt that although researchers were getting more engaged in this area, the research councils need to do more to enable cultural change with researchers i.e. there is a sense that the research councils have panicked the institutions rather than the researchers. One institution mentioned that a major grant was rejected on the grounds of the technical appendix; the group agreed that researchers need to know about this.

Research Data Management at Leicester – key points

  • professorial-level champions are good, but it is also good to engage those who are more technically involved within research groups – engaging at different levels
  • in the past a project like HALOGEN would have used in-house IT expertise, but that would have been a one-off solution; they wanted to be able to re-use the infrastructure for future inter-disciplinary and inter-institutional projects
  • Tedds raised the importance of ensuring that requirements analysis is iterative – continuous engagement
  • challenges included: retro-fitting data to make it interoperable (versioning and provenance issues)
  • the key thing is to provide something that requires less effort from researchers because it is more standardised
  • there was a great slide on ‘direct benefits’, which included £1.3 million of funding from the Leverhulme Trust
  • another great slide on ‘indirect benefits’, which included costs avoided
  • Tedds recommended recognising the different cultures and mindsets – the research liaison role helped with this
  • top tip: grab researchers’ attention when they are applying for funding – the system now includes checkboxes they have to select at application time, e.g. ‘do you need support?’ – this also provides a record of demand
  • Leicester’s research computing management group will be chaired by the Pro-Vice-Chancellor for Research, so this will feed back to senior management

There was some discussion in the group about the use of SharePoint – the educational pricing for SharePoint in the Cloud will not now be released until next summer.

The group talked about the issue of researchers’ understanding of what research data is, and about terminology and disciplinary challenges. An example was given of a ‘fabrics database’ that was accepted by an institution; however, there was a misunderstanding about what constituted a ‘database’, and an articulated lorry of wool fabric samples turned up – that was ‘the database’.

We need to provide different options for training – not just face-to-face – as time is a big issue, even when researchers recognise they need it. We could also offer a component to plug into existing courses rather than something totally new.

The group discussed the importance of extending training beyond researchers themselves in order to provide consistency across the board, irrespective of which department researchers go to – a common language, a shared understanding of what we are all doing to support researchers.