Process & Outcome Evaluation
As government agencies and private sector organizations become more concerned about the effectiveness of their programs and policies, program evaluations involving process and outcome analyses are becoming an integral part of program management.
To address these needs, Westat's social scientists apply their expertise in agency-specific programmatic areas and their technical skills in research design, mathematical modeling, and statistical analysis. Westat develops outcome measures and appropriate baselines, examines program processes in relation to outcomes, and develops models to isolate program effects from exogenous factors.
Westat has designed and conducted hundreds of ethnographies, focus groups, in-depth interviews, multimodal surveys, and structured observations to gather data that enhances our understanding of programs and products.
Grounded in a thorough understanding of program assets and objectives, process evaluation is designed to monitor activities during implementation and guide decisions about potential changes in strategy as a program matures. Process evaluation helps ensure that your program is operating as you intended and enhances its effectiveness as it proceeds.
Describing your program's successes and documenting lessons learned, outcome evaluation takes an extended view to measure the realization of short- and long-term program goals. It assesses changes in your target audiences' awareness, knowledge, attitudes, and behaviors, and the extent to which those changes can be attributed to campaign exposure.
Our evaluation protocols are tailored to the specific program and sensitive to the objectives of your stakeholders. They incorporate both qualitative and quantitative methods to capture important nuances. Our techniques provide the information you need to track and assess your campaign's success or make essential changes in your messaging.
We scientifically measure the outcomes and interpret the results of your campaign or web site. Our protocols include:
- Focus groups
- In-depth interviews
- Case studies
- Heuristic reviews
- Web site user modeling
- Low-fidelity/high-fidelity prototyping
- Cognitive interviewing techniques
- Comparative user studies
- Accessibility studies
We often combine these qualitative and quantitative methods with data from secondary sources to provide a comprehensive picture of a program and its processes, costs, and outcomes.
The handbook, developed for the National Science Foundation (NSF), provides project directors and principal investigators with a basic guide for evaluating NSF's educational projects. It builds on firmly established principles, blending technical knowledge and common sense to meet the special needs of NSF and its stakeholders.