Evaluation Coding

Johnny Saldaña (2009, p.3) suggests that a code in qualitative research “is most often a word or short phrase that … assigns a summative, salient, essence-capturing … attribute for a portion of language-based or visual data. The data can consist of interview transcripts, … field notes, journals, documents, literature, artifacts, photographs, video, websites, email correspondence and so on. The portion of data to be coded … can range in magnitude from a single word to … an entire page of text …”.

In the model below, Saldaña (2009, Fig 1.1) shows that codes can be categorised and that categories can be mapped into themes or concepts and fitted within theory. As drawn by Saldaña, the model shows an inductive process in which theory is developed from codes. A deductive process is equally valid, whereby codes are developed from theory.

Codes-to-theory model (Saldaña, 2009, Fig 1.1)

Johnny Saldaña (2009, p.97) draws on Rallis & Rossman to describe evaluation coding as “the application of non-quantitative codes onto qualitative data that assign judgements about the merit and worth of programs or policy (Rallis & Rossman, 2003, p.492)”.

He draws on Patton to say that program evaluation is “the systematic collection of information about the activities, characteristics and outcomes of programs to make judgements about the program, improve program effectiveness and/or inform decisions about future programming. Policies, organisations and personnel can also be evaluated (Patton, 2002, p.10)”.

“To Rallis & Rossman, evaluation data describe, compare and predict. Description focuses on patterned observations or participant responses of attributes and details that assess quality. Comparison explores how the program measures up to a standard or ideal. Prediction provides recommendations for change, if needed, and how those changes might be implemented” (Saldaña, 2009, p.97, my emphasis).

Saldaña says (p.98) that evaluation coding is “appropriate for policy, critical, action, organizational, and (of course) evaluation studies, particularly across multiple sites and extended periods of time”. He suggests that other coding methods “can be applied to or supplement evaluation coding” and lists magnitude coding, descriptive coding, values coding and grounded theory coding methods for this purpose. He draws on Pitman & Maxwell to observe that evaluation coding “is also customised for specific studies since the coding system must also reflect the questions that initiated and structured the evaluation in the first place (Pitman & Maxwell, 1992, p.765)”.

Finally, Saldaña (p.101) quotes Stake to note that “all evaluation studies are case studies (Stake, 1995, p.95)”.

References:
PATTON, M. Q. 2002. Qualitative research and evaluation methods, Sage Publications, Inc.
PITMAN, M. A. & MAXWELL, J. A. 1992. Qualitative approaches to evaluation: Models and methods. The handbook of qualitative research in education, 729-770.
RALLIS, S. F. & ROSSMAN, G. B. 2003. Mixed methods in evaluation contexts: A pragmatic framework. In: TASHAKKORI, A. & TEDDLIE, C. (eds.) Handbook of mixed methods in social & behavioral research. London: Sage.
SALDAÑA, J. 2009. The coding manual for qualitative researchers, Sage Publications Ltd.
STAKE, R. E. 1995. The art of case study research, Sage Publications, Inc.
YIN, R. K. 2009. Case study research: Design and methods, Sage Publications, Inc.

Evaluation research

If you want to understand ‘evaluation research’, a good way to begin is by reading Ranjit Kumar (2010, ch.18). His model for the concept of evaluation looks like this:

The concept of evaluation (Kumar, 2010, p.325)

Kumar suggests that evaluation can be approached from two perspectives (p.328): “the focus of the evaluation; and the philosophical base that underpins an evaluation”. The types of evaluation that arise from each perspective are shown in the figure below. He emphasises that the perspectives are not mutually exclusive, explaining this with the example of a study which determines the impact of a programme by asking what its clients perceive its effects to have been on them. This study could be classified as an impact/outcome evaluation from the ‘focus of the evaluation’ perspective or, equally, as a client-centred evaluation from the ‘philosophical’ perspective.

Perspectives in the classification of evaluation studies (Kumar, 2010, p.329)

From the perspective of the focus of the evaluation, there are many designs or methods of evaluation. For example, in an impact assessment evaluation, Kumar identifies (p.338) the following commonly used designs:

  • after-only design;
  • before-and-after design;
  • experimental control design;
  • comparative study design;
  • reflexive control design;
  • interrupted time-series design (see model below);
  • replicated cross-sectional design.

Interrupted time-series design (Kumar, 2010, p.340)
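Kumar’s figure shows the logic of this design: a run of observations before the intervention establishes the baseline level and trend, and the post-intervention observations are then examined for a change in level or slope. Purely as an illustration of that logic (this is not from Kumar’s text, and the data and variable names are invented), a minimal segmented-regression sketch in Python might look like this:

```python
# Minimal sketch of an interrupted time-series (segmented regression) analysis.
# All names and numbers here are illustrative assumptions, not Kumar's example.
import numpy as np

rng = np.random.default_rng(seed=0)

months = np.arange(24)                      # 24 monthly observations of a programme outcome
start = 12                                  # the intervention ("interruption") begins at month 12
post = (months >= start).astype(float)      # indicator: 1 after the intervention, 0 before
time_after = np.where(months >= start, months - start, 0).astype(float)

# Simulated outcome: a gentle baseline trend, then a level drop and a steeper decline.
outcome = 50 + 0.2 * months - 5 * post - 0.8 * time_after + rng.normal(0, 1, months.size)

# Design matrix: intercept, baseline trend, level change at the interruption, slope change after it.
X = np.column_stack([np.ones(months.size), months, post, time_after])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)

print(f"baseline level:       {coef[0]:6.2f}")
print(f"baseline trend/month: {coef[1]:6.2f}")
print(f"level change:         {coef[2]:6.2f}")
print(f"slope change/month:   {coef[3]:6.2f}")
```

The estimated level-change and slope-change coefficients are what an impact assessment using this design would interpret as evidence of the programme’s effect.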

From a philosophical perspective, Kumar argues (p.342) that “there are no specific models or methods of evaluation. You use the same methods and models [as for the other perspective] but the required information is gathered from different people or aspects …”

Kumar provides an informative overview. For greater detail and a better appreciation of these perspectives, he recommends Stufflebeam & Shinkfield (1985).

References:
KUMAR, R. 2010. Research methodology: A step-by-step guide for beginners, Sage Publications Ltd.
STUFFLEBEAM, D. L. & SHINKFIELD, A. J. 1985. Systematic evaluation: A self-instructional guide to theory and practice, Kluwer-Nijhoff, Boston.

The project coordinator’s perspective

Ika et al. (2010) present research findings which “suggest that project success [in the international aid industry] is insensitive to the level of project planning efforts but a significant correlation does exist between the use of monitoring and evaluation tools and project profile, a success criterion which is an early pointer of project long-term impact”.

Reference:
IKA, L. A., DIALLO, A. & THUILLIER, D. 2010. Project management in the international development industry: The project coordinator’s perspective. International Journal of Managing Projects in Business, 3, 61-93.