A Capability Maturity Model for Research Data Management

1.4 Process Assessment

Process Assessment includes Measurement and Analysis, and Verifying Implementation. Measurement and Analysis describes the need to measure the process and analyze the measurements, and typically includes examples of the measurements that could be taken to determine the status and effectiveness of the Activities Performed. Verifying Implementation describes the steps to ensure that the activities are performed in compliance with the process that has been established, and typically encompasses reviews and audits by management and quality assurance.

Process assessment involves establishing measures of, and control over, the effectiveness and quality of data management so that RDM processes are continuously improved. This key process area builds on the well-defined activities that result from level-3 maturity in RDM capabilities. A research project or organization (group, institution, or community) that is capable of conducting process assessment has reached level-4 capability maturity, i.e., the quantitatively managed level. It is important to point out that each level of capability maturity requires that the previous level has been achieved, because each level is the foundation for the next.

The first step in process assessment is to set quantitative quality goals for both RDM outcomes and processes. Effectiveness and quality are measured for important RDM process activities. Identifying these measures is an intensive effort that is better conducted across all projects as part of an organizational measurement program. In other words, effectiveness and quality measures tend to be project-neutral and should be applicable to all projects in process assessment for RDM.
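
As a rough illustration of how such project-neutral goals might be recorded, the sketch below (in Python) encodes a small catalogue of quantitative quality goals that any project could check its observed values against. The measure names, targets, and units are hypothetical examples, not values defined by this model.

```python
# Illustrative sketch only: the measure names, targets, and units are
# hypothetical examples, not values prescribed by the CMM for RDM.
from dataclasses import dataclass

@dataclass
class QualityGoal:
    """A project-neutral quantitative goal for an RDM process measure."""
    measure: str            # what is measured, e.g. "missing_data_rate"
    target: float           # the quantitative goal
    unit: str               # unit of measurement
    lower_is_better: bool   # direction of improvement

    def is_met(self, observed: float) -> bool:
        """Check an observed value against the goal."""
        return observed <= self.target if self.lower_is_better else observed >= self.target

# An organizational catalogue of goals that any project could adopt.
ORGANIZATIONAL_GOALS = [
    QualityGoal("missing_data_rate", 0.02, "fraction of records", True),
    QualityGoal("metadata_completeness", 0.95, "fraction of required fields", False),
    QualityGoal("dataset_verification_time", 5.0, "working days", True),
]

print(ORGANIZATIONAL_GOALS[0].is_met(0.01))  # -> True
```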

The second step in process assessment focuses on continuous process improvement. The effectiveness and quality measures established in the first step are used to identify weaknesses and strengthen the process proactively, with the goal of preventing defects. Data on the effectiveness of the RDM process are used to perform cost-benefit analyses of RDM.
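
A cost-benefit analysis of this kind can be as simple as comparing the effort spent running the RDM process against the rework that the process prevents. The following minimal sketch uses entirely hypothetical figures and an assumed rework-avoided model of benefit; it is an illustration, not a prescribed method.

```python
# Minimal, purely illustrative cost-benefit comparison for an RDM process.
# All figures are hypothetical placeholders.
def rdm_cost_benefit(process_hours: float,
                     hourly_cost: float,
                     defects_prevented: int,
                     rework_hours_per_defect: float) -> float:
    """Return the benefit-to-cost ratio of the RDM process.

    Benefit is estimated as the rework effort avoided by preventing defects;
    cost is the effort spent running the process itself.
    """
    cost = process_hours * hourly_cost
    benefit = defects_prevented * rework_hours_per_defect * hourly_cost
    return benefit / cost if cost else float("inf")

# Example: 40 hours of process work preventing 12 defects that would each
# have required 6 hours of rework.
ratio = rdm_cost_benefit(40, 50.0, 12, 6)
print(f"benefit/cost ratio: {ratio:.2f}")  # -> 1.80
```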

There is very little in the literature from which to generalize the characteristics of levels 4 and 5 of capability maturity in RDM. Measurement and quality management for RDM are therefore defined by analogy with the original CMM (Paulk et al., 1993).

1.4.1 Measurement and Analysis

The goal of RDM varies because the nature and characteristics of research types and data differ from discipline to discipline. Data flows and stages in field observations and lab experiments, for example, will differ from those in computer simulations or other computationally intensive types of research. The involvement of researchers and data professionals in data flows and stages also differs: data collection during a field visit will usually be conducted by researchers, while datasets ready for curation are handled by data managers or librarians. The measurements for process assessment should maintain a focus on effectiveness and quality while recognizing these differences and complexities. The following measurements are therefore defined regardless of who (researchers, data staff, or librarians) performs the work:

  • The amount of effort that went into the process, e.g., how many redundant runs were performed to complete the processing.
  • Time spent on a task, e.g., how long it took to verify/check data, code data, or transform data.
  • Presence (or absence) of process data collection: collecting data about process effectiveness on the spot is easier than reconstructing it after the fact; doing it afterwards is tedious, and the data can easily become inaccurate.
  • Data points produced: e.g., number of survey responses generated, number of data frames segmented. 

Measurements can be constructed from the perspective of input, output, and throughput, or from the perspective of workflows. The amount of effort, for example, can be considered an input measurement, while data points produced would be an output measurement. Effectiveness means getting things right. Process measurements can help to identify problems, and especially the causes of problems: if the rate of missing data is high, for example, it makes sense to look for what caused the missing data.
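
To make the input/output framing concrete, the sketch below records the kinds of measurements listed above (effort, time spent, data points produced, missing-data rate) for individual activities and flags those whose missing-data rate exceeds a threshold, so that causes can be investigated. The field names and the 5% threshold are illustrative assumptions, not part of the model.

```python
# Sketch of recording and analyzing process measurements; field names and the
# missing-data threshold are illustrative assumptions, not defined by the model.
from dataclasses import dataclass

@dataclass
class ProcessMeasurement:
    activity: str              # RDM activity measured, e.g. "data coding"
    effort_runs: int           # input: how many runs were needed
    hours_spent: float         # input: time spent on the task
    data_points_produced: int  # output: e.g. survey responses generated
    missing_data_rate: float   # quality indicator, 0.0-1.0

def flag_problems(measurements, missing_threshold=0.05):
    """Return activities whose missing-data rate exceeds the threshold,
    so the causes of the missing data can be investigated."""
    return [m.activity for m in measurements
            if m.missing_data_rate > missing_threshold]

records = [
    ProcessMeasurement("survey collection", effort_runs=1, hours_spent=12.0,
                       data_points_produced=480, missing_data_rate=0.08),
    ProcessMeasurement("data coding", effort_runs=3, hours_spent=20.0,
                       data_points_produced=480, missing_data_rate=0.01),
]
print(flag_problems(records))  # -> ['survey collection']
```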

1.4.2 Verifying Implementation

According to the original CMM, "Verifying Implementation describes the steps to ensure that the activities are performed in compliance with the process that has been established. Verification typically encompasses reviews and audits by management and software quality assurance" (Paulk et al., 1993, p. 38). Verifying implementation in the context of RDM focuses on reviews and audits of the key process areas against the established policies and procedures (which are mainly reflected in the commitment to perform, ability to perform, and activities performed). The goal is to identify whether there are any weaknesses in the process and how they can be strengthened.
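
As one possible illustration of such a review, the sketch below compares a set of activities performed against an established policy and reports which required activities are missing. The policy items and activity names are hypothetical examples, not a defined standard.

```python
# Illustrative compliance check for verifying implementation; the policy items
# and the activity log below are hypothetical examples, not a defined standard.
ESTABLISHED_POLICY = {
    "metadata recorded at collection time",
    "data stored in approved repository",
    "quality review before curation hand-off",
}

def audit(activities_performed):
    """Compare the activities performed against the established policy and
    report which required activities are compliant and which are missing."""
    performed = set(activities_performed)
    return {
        "compliant": sorted(ESTABLISHED_POLICY & performed),
        "missing": sorted(ESTABLISHED_POLICY - performed),
    }

result = audit(["metadata recorded at collection time",
                "data stored in approved repository"])
print(result["missing"])  # -> ['quality review before curation hand-off']
```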

Rubric

Rubric for 1.4 - Process Assessment

Level 0
  • General: This process or practice is not being observed.
  • Process assessment: No steps have been taken to establish procedures for measurement, analysis, or verification of the research process in general.

Level 1: Initial
  • General: Data are managed intuitively at the project level without clear goals and practices.
  • Process assessment: Measurement, analysis, or verification of the research process in general have been considered minimally by individual team members, but not codified.

Level 2: Managed
  • General: The DM process is characterized for projects and often reactive.
  • Process assessment: Measurement, analysis, or verification of the research process in general have been recorded for this project, but have not taken wider community needs or standards into account.

Level 3: Defined
  • General: DM is characterized for the organization/community and proactive.
  • Process assessment: The project follows approaches to measurement, analysis, or verification of the research process in general that have been defined for the entire community or institution.

Level 4: Quantitatively Managed
  • General: DM is measured and controlled.
  • Process assessment: Quantitative quality goals have been established, including measurement, analysis, and verification of the research process in general, and both data and practices are systematically measured for quality.

Level 5: Optimizing
  • General: Focus on process improvement.
  • Process assessment: Processes regarding measurement, analysis, or verification of the research process in general are evaluated on a regular basis, and necessary improvements are implemented.

References


Paulk, M. C., Curtis, B., Chrissis, M. B., & Weber, C. V. (1993). Capability Maturity Model for Software, Version 1.1 (No. CMU/SEI-93-TR-024). Software Engineering Institute. Retrieved from http://resources.sei.cmu.edu/library/asset-view.cfm?assetID=11955
