agINFRA Evaluation Planning


agINFRA Evaluation Plan


Content, Context, Process (CCP) model

The general agINFRA evaluation framework includes the Content, Context, Process (CCP) model, which has been developed and validated since the 1980s (Stockdale and Standing 2006). The CCP model supports evaluators in identifying stakeholder communities and in addressing the practical issues of an evaluation, by pointing out important internal and external factors that will affect the evaluation process and its outcomes.

The CCP model distinguishes between three sets of factors that may influence the successful implementation and execution of the evaluation process:

  • the content of an evaluation, taking into consideration what is measured,
  • the context in which the evaluation takes place, and
  • the process of the evaluation, which is seen as a social process.

The model also raises questions about what is being measured and evaluated, by whom, for what purpose, and how and when evaluators are engaged. Linking the CCP elements to one another makes it possible to take account of the multiple dimensions involved in the evaluation process (Stockdale and Standing 2006; Irani and Love 2008).

[Figure: CCP Model]
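To make the three dimensions concrete, the following minimal Python sketch records the guiding questions of an evaluation as structured data, one list per CCP dimension. The class name, fields and example entries are illustrative assumptions, not part of the CCP literature or the agINFRA plan.

```python
# Illustrative sketch only: the field names and example questions are our own
# assumptions, not part of the CCP model or the agINFRA project plan.
from dataclasses import dataclass, field


@dataclass
class CCPEvaluation:
    """Records the guiding questions of an evaluation per CCP dimension."""
    content: list[str] = field(default_factory=list)  # what is measured
    context: list[str] = field(default_factory=list)  # setting of the evaluation
    process: list[str] = field(default_factory=list)  # the evaluation as a social process


# Example instance for an agINFRA-style evaluation (entries are illustrative).
aginfra_eval = CCPEvaluation(
    content=["Which infrastructure components and services are assessed?",
             "How effectively do they make research data accessible and re-usable?"],
    context=["I3 project serving agricultural research communities",
             "Values, expectations and standards of stakeholder and user communities"],
    process=["Who evaluates, and for what purpose?",
             "How and when are evaluators engaged?"],
)

for dimension in ("content", "context", "process"):
    print(dimension.upper())
    for question in getattr(aginfra_eval, dimension):
        print(" -", question)
```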

Evaluation is essentially a practical task that requires a great deal of organisation and communication among the participants. Experience has shown that, beyond the use of proper methods, there are better and worse ways of conducting an evaluation, and that the likelihood of a successful evaluation can be increased through careful attention to organisation and communication. The CCP model suggests considering the following questions in the initial steps of an evaluation process:

What are the background and the context of the evaluation?

agINFRA is an Integrated Infrastructure Initiative (I3) project that develops a scientific data infrastructure (e-infrastructure) for agricultural research communities, aimed at facilitating the use of services and tools for data management, sharing and re-use. agINFRA also addresses existing obstacles to open access to scientific information and data in agriculture, and aims to improve the preparedness of scientific communities to face and exploit the abundance of data that is (or will be) available for agricultural research.

What values have to be taken into account?

A successful evaluation must take account of the values, expectations and standards of the stakeholder and user communities. The point has been made that unless an evaluation takes on board the fundamental values of the communities for which the project outcomes are intended, those outcomes are likely to be dismissed as "counter-cultural". This is not to argue that these values must never be challenged. Rather, it is to emphasise that if using the agINFRA infrastructure and services requires a change in the practices of the intended user communities, such change must be managed carefully and the new practices effectively supported by the services, tools and data provided.

What is being evaluated?

One of the most important parts of the evaluation is the definition of which aspects of the agINFRA infrastructure and services are addressed and what precisely is going to be evaluated. Some definitions may also need to change during the evaluation process; hence, mechanisms are necessary for agreeing on such changes and adjusting the evaluation guidelines, methods and tools accordingly. The evaluation aims at assessing the agINFRA infrastructure, components and services, with a focus on how effectively they help make data from agricultural and related fields of research accessible and re-usable. The agINFRA infrastructure, components and services form a set of elements whose complexity will be hidden from most users, while supporting user categories such as research centres and projects, data providers (repository managers), and aggregators and service providers (e.g. portals) in the area of agricultural research.

Who are the evaluation stakeholders?

The agINFRA evaluation methodology adopts a stakeholder-focused approach that involves the relevant user groups in the evaluation process. The various points of view and the expertise they can bring to the evaluation are likely to be very helpful. Involving the stakeholders can also increase the overall recognition of, and participation in, agINFRA activities, including testing and giving feedback. Furthermore, such community support during the implementation and evaluation phases can contribute to understanding early on the innovative impact that enhanced management, sharing and collaborative use of varied and large volumes of data can bring to the sector.

There are different notions of what stakeholders are and of their role in a project like agINFRA. A useful approach is first to distinguish between direct and indirect stakeholders:

  • Direct stakeholders in the design, implementation and evaluation of agINFRA’s infrastructure, tools (components) and services are the envisaged users and their organizations, the latter in view of the context of use and additional organizational requirements that should be considered;
  • Indirect stakeholders are other parties who take an interest in the development and provision of the agINFRA infrastructure and services (e.g. national and European decision makers in research policies and infrastructures, research associations and others), or may be indirectly affected by their future use (e.g. existing organizations that offer individual services also covered by agINFRA).
[Figure: Evaluation Stakeholders]


agINFRA Evaluation Methodology

[Figure: Evaluation Methodology]

The evaluation process will address all technical elements of agINFRA, i.e. the Infrastructure, the Integrated Services and the individual Components; it will take place in different phases and involve different stakeholder/user groups. The evaluation will include three types of evaluation activities:

  • Internal technical testing and validation, carried out regularly, especially after major updates of technical elements or in preparation for the internal and public demonstrators;
  • User-focused controlled demonstrators/trials and evaluation;
  • User-focused public demonstrators/trials and evaluation.


The critical factors for successful implementation and execution of the evaluation process include, but are not limited to, the following:

  • Taking into account in the evaluation design the insights from the user needs and requirements analysis already carried out in the project;
  • Building on the solid expertise of the project partners in planning and executing evaluation tasks in previous projects;
  • Defining clearly the success criteria for the evaluation results;
  • Incorporating any valuable feedback on the evaluation plan, criteria, methods and techniques;
  • Establishing practical guidelines on how the evaluation process should be conducted and how the results should be collected and reported.

User-focused demonstrators and evaluation

The demonstrators and trials are intended to involve the participating repositories, portals and end users in order to receive feedback on the integrated agINFRA services, allowing for user evaluation and possible improvement of the services and the underlying components.
The user-focused demonstrators and evaluation have been divided into two phases:


Controlled demonstrators / trials & evaluation

In this first round, demonstrators/trials will take place at the involved partner sites with user groups that are invited to try the various services and provide feedback. At each site, from 10 to 50 (or more) users will be engaged, the number depending on the target community and the types of services deployed, e.g. services for repository/IT managers or end-user focused services. To capture user feedback on the services, session scripts will be used that, for example, comprise an initial group presentation with moderated discussion, a guided walk-through with recording of Q&A, individual hands-on use with an observation checklist, etc.; a sketch of such a session script is given below.
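The following Python sketch illustrates one way such a session script could be encoded so that feedback is captured consistently across partner sites. The step names follow the examples above; the report structure, the helper function and the example site are assumptions made for this illustration.

```python
# Illustrative sketch only: one way to encode a trial session script so that
# feedback is captured consistently across partner sites. The step names
# follow the examples in the text; the report structure is an assumption.
SESSION_SCRIPT = [
    {"step": "group_presentation", "format": "moderated discussion",
     "capture": "discussion notes"},
    {"step": "guided_walkthrough", "format": "facilitator-led demo",
     "capture": "recording of Q&A"},
    {"step": "hands_on_use", "format": "individual tasks",
     "capture": "observation checklist"},
]


def session_report(site: str, n_users: int) -> dict:
    """Skeleton feedback report for one controlled-trial session."""
    assert n_users >= 10, "controlled trials engage 10-50 (or more) users per site"
    return {
        "site": site,
        "users": n_users,
        "steps": [{"step": s["step"], "capture": s["capture"], "findings": []}
                  for s in SESSION_SCRIPT],
    }


print(session_report("UAH", 25))  # hypothetical site and head count
```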

Partner responsible for organizing controlled pilot trials (Phase A): AK, ESPOL, FAO, UAH, INFN, OU, SRFG, CRA, Univ. Belgrade, ISI, CAAS-All
Partner responsible for providing the evaluation tools: AK
Participants / user evaluators: 10-50 members of the relevant user groups for each agINFRA integrated service and tool
Responsible partner for collecting and analysing feedback: AK
Date of implementation: M25 - M26


Public demonstrators/trials and evaluation

With the experience gained during the first period, and after adaptation of the deployed services where required, the second round of demonstrators/trials is started. This round will comprise public workshops in the context of international conferences and other major events, with a less controlled validation format but more users overall. Emphasis in this round will also be placed on selected services that may require special attention for possible improvements.

Partner responsible for organizing controlled demonstrators (Phase B): FAO, AK, ESPOL, UAH
Partner responsible for providing the evaluation tools: AK
Participants / user evaluators: Users of CIARD RING, AGRIS / OpenAgris, Organic.Edunet and Green Learning Network
Responsible partner for collecting and analysing feedback: AK
Date of implementation: M27


User-focused public demonstrators and evaluation ("Open Days")

The second round of the user-focused demonstrators/trials will consist of public, hands-on testing sessions organised at international conferences or other larger events ("Open Days"). The public demonstrators will run for three (3) months, during which each pilot trial partner will have to organise and conduct such an agINFRA evaluation event involving several relevant user groups.
The hands-on sessions will have a less rigid format and involve many more users than phases A and B of the controlled trials (30-50 new users for each event), but they will be organised to allow for a solid external evaluation of the agINFRA Integrated Services and Components.

Partner responsible for organizing public demonstrators (Open Days): AK, FAO, SZTAKI, 21c & OU, CAAS-All, IPB & Univ. Belgrade, INFN & CRA, ISI
Partner responsible for providing the evaluation tools: AK
Participants / user evaluators: 30-50 experts of relevant areas for each agINFRA integrated service and tool
Responsible partner for collecting and analysing feedback: AK
Date of implementation: M31-M33


Technical Testing and Validation

The technical testing and evaluation focuses on the underlying infrastructure and services (Grid, Cloud) and on the agINFRA Integrated Services and Components they support. The overall goal is to systematically collect and analyse testing results in order to allow for the refinement and customisation of the infrastructure tools and services that are deployed by, or in support of, the agINFRA Integrated Services and Components.
The technical testing and evaluation will be conducted internally among the technical partners.

Unit test of individual agINFRA services

  • Background and cooperation with WP3 (and WP4/WP5),
  • Methods and tools: related infrastructure features,
  • Test cases, reports and their formats (see the sketch below).
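As an illustration of the kind of unit test meant here, the sketch below exercises a single, self-contained component function in pytest style. The function normalize_keywords is hypothetical; agINFRA's actual component interfaces are defined in the work packages named above and are not reproduced here.

```python
# Illustrative sketch only: "normalize_keywords" is a hypothetical component
# function; agINFRA's real interfaces are defined in WP3/WP4/WP5 and are not
# reproduced here. Run with: pytest test_keywords.py


def normalize_keywords(raw: str) -> list[str]:
    """Hypothetical component: split a keyword string, deduplicate, clean up."""
    return sorted({kw.strip().lower() for kw in raw.split(";") if kw.strip()})


def test_normalize_keywords_basic():
    # Duplicates and stray whitespace should be removed.
    assert normalize_keywords("Wheat; soil ;wheat") == ["soil", "wheat"]


def test_normalize_keywords_empty_input():
    # An empty string yields an empty keyword list rather than an error.
    assert normalize_keywords("") == []
```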

Smoke and integration tests of integrated agINFRA services

  • Background and cooperation with WP3 and WP6,
  • Methods and tools: related infrastructure and gUSE features,
  • Test cases, reports and their formats (see the sketch below).
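A smoke test of integrated services can be as simple as checking that each deployed endpoint answers at all before deeper integration tests run. The sketch below shows one way to do this; the endpoint URLs are placeholders, not real agINFRA or gUSE addresses.

```python
# Illustrative sketch only: the endpoint URLs are placeholders, not real
# agINFRA or gUSE addresses. A smoke test simply checks that each deployed
# service answers before deeper integration testing starts.
import urllib.request

SERVICE_ENDPOINTS = {
    "workflow_portal": "http://example.org/guse/health",     # placeholder
    "metadata_service": "http://example.org/metadata/ping",  # placeholder
}


def smoke_test(endpoints: dict[str, str], timeout: float = 5.0) -> dict[str, bool]:
    """Return a pass/fail map: True if the service answered with HTTP 200."""
    results = {}
    for name, url in endpoints.items():
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results[name] = resp.status == 200
        except OSError:  # connection refused, timeout, HTTP error, DNS failure
            results[name] = False
    return results


if __name__ == "__main__":
    for service, ok in smoke_test(SERVICE_ENDPOINTS).items():
        print(f"{service}: {'PASS' if ok else 'FAIL'}")
```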

Stress test (benchmarking) of integrated agINFRA services

  • Background and cooperation with WP3 and WP6,
  • Methods and tools: related infrastructure and gUSE features,
  • Test cases, reports and their formats (see the sketch below).
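For stress testing, a small benchmarking harness can issue concurrent requests against one integrated service and report latency percentiles. The sketch below is a minimal example of this idea; the target URL, request count and concurrency level are assumptions.

```python
# Illustrative sketch only: the target URL, request count and concurrency
# level are assumptions. The harness issues concurrent requests against one
# integrated service and reports latency percentiles.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "http://example.org/metadata/search?q=wheat"  # placeholder URL


def timed_request(url: str) -> float:
    """Return the response time in seconds; network errors propagate."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()  # include transfer time in the measurement
    return time.perf_counter() - start


def stress_test(url: str, n_requests: int = 100, concurrency: int = 10) -> None:
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = sorted(pool.map(timed_request, [url] * n_requests))
    print(f"median:          {statistics.median(latencies):.3f} s")
    print(f"95th percentile: {latencies[int(0.95 * len(latencies))]:.3f} s")


if __name__ == "__main__":
    stress_test(TARGET)
```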

Acceptance test of integrated agINFRA services

  • Background and cooperation with WP2 (and WP4/WP5),
  • Methods and tools: user survey, etc.,
  • Test cases, reports and their formats (see the sketch below).
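For the acceptance test, user-survey responses need to be aggregated against agreed success criteria. The sketch below shows one minimal way to do this for Likert-style ratings; the questions, the 5-point scale and the acceptance threshold are assumptions chosen for the example, not project-defined criteria.

```python
# Illustrative sketch only: the questions, the 1-5 rating scale and the
# acceptance threshold are assumptions chosen for this example, not
# project-defined success criteria.
from statistics import mean

# Each response maps a survey question to a 1-5 rating (5 = fully agree).
responses = [
    {"easy_to_use": 4, "data_findable": 5, "would_reuse": 4},
    {"easy_to_use": 3, "data_findable": 4, "would_reuse": 5},
    {"easy_to_use": 5, "data_findable": 4, "would_reuse": 4},
]

THRESHOLD = 3.5  # assumed acceptance criterion: mean rating per question

for question in responses[0]:
    avg = mean(r[question] for r in responses)
    verdict = "accepted" if avg >= THRESHOLD else "needs improvement"
    print(f"{question}: mean={avg:.2f} -> {verdict}")
```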


Reporting

All reporting from the evaluation events can be found here.
