Marriott School of Business

Evaluate the Results

The Evaluation stage is where impact meets accountability. This phase focuses on assessing whether, and how effectively, a social innovation is achieving its intended outcomes. We guide students and organizations to move beyond anecdotal evidence and embrace data-driven insights. Evaluation isn’t just about proving success—it’s about learning what works, what doesn’t, and why. By identifying key performance indicators and using both qualitative and quantitative methods, changemakers can refine their strategies and improve outcomes.

Before scaling or replicating an intervention, it’s essential to determine which areas need adjustment. This stage ensures that ventures stay aligned with their mission and responsive to the communities they serve. Explore the toolbox of skills below to strengthen your evaluation approach and build a foundation for lasting impact.

Your Toolbox

Outputs, Outcomes, and Impact

Choose what you measure wisely.

Outputs, outcomes, and impact are all essential to evaluation and all follow an intervention, but they matter in very different ways. Outputs are the products or activities completed by the organization (e.g., classes taught, beds filled, houses built). Outcomes, on the other hand, are the specific changes observed in a population following an intervention (e.g., increased literacy rates, lower chronic homelessness, reduced drug use). Finally, impact represents the portion of those outcomes that can be attributed to the intervention itself.

For example, if an organization is created to end homelessness in a city, its output may be the number of affordable housing units built and filled. Its outcome could be the number of unsheltered people observed on the streets. And its impact could be the portion of any drop in homelessness over time that can be traced to its work.

Make sure you are measuring the right data for your chosen outcome. Measuring the wrong data point can lead to a false picture of whether an intervention is succeeding or falling short.

Can you explicitly name your desired outputs, outcomes, and impact?

Randomized Controlled Trials

Consider using the gold standard of evaluation.

Randomized controlled trials (RCTs) set the bar high. These studies are usually expensive and difficult to run, but they are very reliable at showing what effect an intervention is actually having on a population. RCTs work by measuring changes in a chosen outcome across two randomly selected groups: one that receives the intervention and a "control" group that does not.

For example, if an organization wants to check how well its gift card incentive program is working, it can randomly assign one group of people to receive the program and a similar group to go without it. The organization would then measure how often customers in each group buy gift cards.

RCTs may not be feasible when an organization lacks the necessary resources or when it would be unethical to withhold the intervention from a control group, as in the case of school lunches for needy populations.
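The comparison at the heart of an RCT can be sketched in a few lines of Python. The gift-card scenario and all of the numbers below are hypothetical, invented purely to illustrate random assignment and the treatment-versus-control comparison:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical population of 200 customers, randomly split into a
# treatment group (gets the incentive program) and a control group.
population = list(range(200))
random.shuffle(population)
treatment_ids = set(population[:100])
control_ids = set(population[100:])

def purchases(in_treatment):
    # Invented behavior: everyone buys 0-4 gift cards, and we pretend
    # the program adds roughly one extra purchase on average.
    base = random.randint(0, 4)
    return base + (1 if in_treatment else 0)

treatment = [purchases(True) for _ in treatment_ids]
control = [purchases(False) for _ in control_ids]

avg_t = sum(treatment) / len(treatment)
avg_c = sum(control) / len(control)
effect = avg_t - avg_c  # estimated effect of the program
print(f"treatment mean={avg_t:.2f}, control mean={avg_c:.2f}, effect={effect:.2f}")
```

Because the two groups are formed by random assignment, the difference in their average purchases estimates the program's effect; a real RCT would also report a statistical test of that difference.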

Is an RCT feasible for your intervention?

Pre and Post Tests

To test your intervention, measure an outcome before and after.

One useful way to assess the effectiveness of an intervention is to gather data on the same variable before and after the program is implemented, typically from a large sample of the targeted population. Do the results show an observable change between the pre-intervention and post-intervention measurements?

A pattern may emerge from the data, showing a correlation between the intervention and the variable. But correlation alone does not establish that the intervention caused the change.
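The before-and-after comparison can be sketched as follows; the literacy-score numbers are hypothetical, invented only to show how the average change is computed:

```python
# Hypothetical pre- and post-intervention literacy scores for the
# same eight participants (values invented for illustration).
pre_scores = [62, 70, 55, 48, 80, 66, 59, 73]
post_scores = [68, 74, 60, 55, 82, 71, 63, 77]

# Per-participant change, then the average change across the sample.
changes = [post - pre for pre, post in zip(pre_scores, post_scores)]
avg_change = sum(changes) / len(changes)
print(f"average change: {avg_change:+.2f}")
```

A positive average change is evidence of an association with the intervention, not proof of causation: without a control group, other factors (school calendars, economic shifts, maturation) may have moved the scores too.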

What would the results of a pre- and post-test evaluation say about your intervention?

Organizational Learning

Create a culture of continuous improvement.

Organizational learning describes the value a group places on continually seeking to improve. How much an organization is willing to grow shines through in the systems it implements, such as knowledge sharing among departments, incentives for experimentation, and evaluation processes embedded in every team.

Organizational learning benefits both organizations and their partners. One fruit of this mindset is the innovation it fosters, which leads to better interventions. Organizational learning also increases responsiveness to issues that arise during implementation.

However, this mindset does not mean that an intervention never ends. On the contrary, it means continually looking at the bigger picture of how a population is being served and adjusting accordingly. This adjustment should include the ending of programs once an organization believes its mission has been achieved and a problem has been sustainably eliminated.

To what extent do your organization's systems show a willingness to improve?

Stories of Evaluation in Action


Using Data to Improve Homelessness Support Services in Utah

June 28, 2024 04:24 PM
From the Director: Dr. Eva Witesman makes a case for embracing data-based learning and improvement for support providers serving people experiencing homelessness in Utah.

Stopping Poverty

April 01, 2022 11:54 AM
A chance meeting between four powerhouse social innovators helped change the trajectory of preexisting poverty solutions and is continuing to make waves across the world for reducing poverty.

Shifting From Research to Reality

June 12, 2024 02:57 PM
BYU Public Relations student Alyssa Minor did not know early on that she would write a detailed research paper on child abuse in Ghana's orphanages. But in the Ballard Center Do Good. Better. class, Minor discovered the problem and chose to investigate.