Evaluation Design

The first step in any project is to develop a plan for the work to be done. The plan for an evaluation project is called a "design," and preparing it carefully is a vital step in producing an appropriate assessment. A good design maximizes the quality of the evaluation, helps minimize and justify the time and cost of the work, and strengthens the key findings and recommendations by reducing threats to valid results. When you wish to have your program evaluated, be prepared to engage in this planning process to ensure that your questions will be answered and your needs met.

An evaluation design consists of the evaluation questions under study, the methodological strategies for answering those questions, a data collection plan that anticipates and addresses problems that may be encountered, an analysis plan that ensures the questions are answered appropriately, and a product description (usually a report). Taking the time to define the evaluation questions adequately is perhaps the most important task, one the evaluator will need to perform with the client to ensure that the client's needs and concerns are met. Selecting an appropriate methodological approach is probably the most important scientific task the evaluator will perform, taking into consideration resource and time constraints as well as scientific issues. Providing a product description shows the client what to expect from the evaluation and helps ensure that the results will be useful. Each of these components is expanded upon below.

Evaluation Questions
Evaluators help clients develop the "correct" questions for study because how questions are posed has immense implications for the evaluation approach taken, the data collected, and, ultimately, the entire evaluation design. Evaluators want to answer client questions properly, so they will work with clients to ensure that the wording of each question accurately reflects what the client really wishes to know. To ensure optimal results, you, as the client, must devote time to this phase of the evaluation.

Methodological Strategies
There are several general types of evaluation, each meeting one or more basic client needs. "Normative" evaluation aims to determine the extent to which a program is implemented in the way it was meant to be; "process" evaluation aims to describe how the program is actually functioning; and "outcome" or "impact" evaluation aims to assess what effect the program had. Evaluators also distinguish "formative" from "summative" evaluation: formative work focuses on forming, planning, and improving a program, while summative work assesses its end results or summary effects.

Evaluation questions will fall into these categories, and evaluators will ensure that the methodological strategy selected can answer each type of question. For example, merely describing how a program was implemented requires a simpler, less scientifically rigorous approach than attempting to claim that the program had a certain effect on its intended targets. The latter requires a method that assesses changes over time, and perhaps comparisons between program participants and control groups that did not participate in the program. Evaluators are trained to recognize what sort of approach each type of evaluation question requires.
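To make the outcome-evaluation logic above concrete, the following minimal sketch compares changes over time in a program group against a control group that did not participate. The group names and figures are invented for illustration and do not come from any FSIS evaluation; a real analysis would also address sampling, measurement, and statistical significance.

```python
# Hypothetical sketch: comparing outcome changes for a program group
# against a control group, before and after the program.
# All numbers below are illustrative only.

program_before = [3.1, 2.8, 3.4, 3.0]   # outcome scores before the program
program_after  = [4.2, 4.0, 4.5, 3.9]   # outcome scores after the program
control_before = [3.0, 3.2, 2.9, 3.1]   # comparison group, before
control_after  = [3.2, 3.3, 3.0, 3.1]   # comparison group, after

def mean(values):
    return sum(values) / len(values)

# Change over time within each group
program_change = mean(program_after) - mean(program_before)
control_change = mean(control_after) - mean(control_before)

# The program's estimated effect, net of the change that would have
# happened anyway (as approximated by the control group).
estimated_effect = program_change - control_change

print(f"Program group change:     {program_change:.2f}")
print(f"Control group change:     {control_change:.2f}")
print(f"Estimated program effect: {estimated_effect:.2f}")
```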

Data Collection and Analysis
From the evaluation questions, the evaluator will determine the kinds of information needed, the sources of that information (e.g., employees, customers, clients), the methods of collecting it (e.g., questionnaires, interviews, observations), and the timing and frequency of data collection. All the while, the evaluator will keep in mind the resources available for collecting information and the time period in which it is needed, adjusting the plan accordingly. To be truly efficient, any plan should require only the work actually needed to complete the project, and so it is with the data collection and analysis sections of the evaluation plan: only those data needed to address the evaluation questions should be collected, and an analysis plan should be ready for organizing and using those data. You, as the client, would not be expected to devote time to this phase.
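As an illustration of how these elements fit together, the sketch below lays out a hypothetical data collection plan keyed to evaluation questions. The questions, sources, methods, and schedule are assumptions made up for this example, not part of any actual PEIS design.

```python
# Hypothetical data collection plan, one entry per evaluation question.
from dataclasses import dataclass

@dataclass
class DataCollectionItem:
    question: str   # evaluation question the data will answer
    source: str     # e.g., employees, customers, program records
    method: str     # e.g., questionnaire, interview, observation
    timing: str     # when and how often the data are collected

plan = [
    DataCollectionItem(
        question="Is the program implemented as designed?",
        source="Program staff",
        method="Structured interviews",
        timing="Once, mid-implementation",
    ),
    DataCollectionItem(
        question="What effect did the program have on participants?",
        source="Participants and a comparison group",
        method="Questionnaire",
        timing="Before the program and six months after",
    ),
]

for item in plan:
    print(f"{item.question} -> {item.source}; {item.method}; {item.timing}")
```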

Product
Not only will evaluators know exactly what data are needed and be ready to analyze them, but they will also have a plan for presenting those data and results that simply and accurately reports the answers to the evaluation questions. The design will describe the planned products, customized for various audiences if necessary.

Creating and using an in-depth evaluation design, as described above, is the responsibility of the PEIS evaluation staff. Once this in-depth plan has been prepared, PEIS provides a short, summary evaluation plan for client review to ensure that the plan captures the client's concerns and meets their needs and interests. The summary includes the following components: evaluation purpose, study questions, methodology, and deliverables. The deliverables section makes clear what product is expected from the evaluation and when.

Last Modified Sep 17, 2013