An evaluation is an evidence-based snapshot in time of a program or intervention, grounded in its theory of change. There are several types of evaluations: formative evaluations (closely tied to needs assessments), process evaluations (what happened during program implementation), outcome evaluations (what happened after the program was implemented), and impact evaluations (what impact the program had after implementation). The latter two are also referred to as “summative” evaluations because they are completed after the program is implemented and speak to the outputs or impacts as a whole.

Evaluation findings are usually delivered as a written report, and the tide continues to shift from text-heavy, lengthy reporting toward highly visual, useful reporting. It’s no longer about the length of the report; it’s about ease of understanding and use of the results! This approach is known as “utilization-focused evaluation”.

Evaluation reports are often used to justify funding from payers or grant sources. More importantly, they are used to improve practice by providing information and context about what is working well (so you can keep doing it) and what is not working well (so you can work intentionally to improve it).

If you’re new to evaluation reporting, or interested in saving time while making the most useful reports of your findings, we recommend Depict Data Studio. It offers loads of helpful tips on the latest ways to ensure that clients get the most out of the evaluation reports you produce, along with some very simple hacks that will improve your Microsoft Word and Excel skills by orders of magnitude!

Analytics are the systematic approaches used to extract information from data or statistics.

ScienceSoft created an excellent visual representation (below) and blog post about the four types of analytics, but here is our simpler take:

[Figure: the four types of data analytics]

Descriptive analytics simply describe what is happening. For example, there are 100 clients in a program.

Diagnostic analytics get closer to why something is happening. For example: there were 50 clients in the program last year, and now there are 100. Another data set confirms that funding doubled this year, covering the cost of twice as many clients.

Predictive analytics do just that: predict what is likely to happen in the future, given access to the right data. Things get a bit more complex here, and this is where machine learning can come into play; weather forecasting is a good mental model. For example: when you move from 100 to 200 clients, the quality of services will decrease because there is less face-to-face time with each client.

Prescriptive analytics describe what to do in order to head off a future issue or to ensure that a good outcome is maintained. For example: when you move from 100 to 200 clients, the quality of services will decrease, and to prevent that you will need to hire two more staff members.
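For readers who like to see the arithmetic, the four types can be sketched as a toy example in code. Everything here is illustrative: the client counts, funding figures, staffing numbers, and the simple linear service model are assumptions made up for this sketch, not real program data or a real analytics pipeline.

```python
import math

# Toy sketch of the four analytics types using the client-count example
# from the text. All numbers and the linear service model below are
# made-up assumptions for illustration only.

clients_last_year = 50
clients_this_year = 100

# Descriptive: what is happening?
print(f"Descriptive: {clients_this_year} clients are in the program.")

# Diagnostic: why is it happening? Bring in a second data set (funding).
funding_last_year, funding_this_year = 250_000, 500_000
client_growth = clients_this_year / clients_last_year
funding_growth = funding_this_year / funding_last_year
if client_growth == funding_growth:
    print("Diagnostic: client growth tracks the doubling of funding.")

# Predictive: what is likely to happen? Assume a fixed pool of staff
# minutes per week, so face-to-face time per client shrinks as the
# caseload grows.
staff = 2
minutes_per_staff_per_week = 2_000

def minutes_per_client(n_clients, n_staff):
    return n_staff * minutes_per_staff_per_week / n_clients

current = minutes_per_client(clients_this_year, staff)   # 40 min/client
projected = minutes_per_client(200, staff)               # 20 min/client
print(f"Predictive: at 200 clients, face time drops from "
      f"{current:.0f} to {projected:.0f} min/client/week.")

# Prescriptive: what should we do? Solve for the staff needed to keep
# today's service level once the program reaches 200 clients.
staff_needed = math.ceil(200 * current / minutes_per_staff_per_week)
print(f"Prescriptive: hire {staff_needed - staff} more staff "
      f"to hold {current:.0f} min/client/week.")
```

With these made-up numbers, the prescriptive step lands on “hire 2 more staff,” matching the example in the text. In practice, prescriptive analytics would rest on optimization or simulation rather than inverting a single formula, but the logic is the same: work backward from the outcome you want to the action required.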

These simple examples help our clients and staff gain a better understanding of what “analytics” means. Evaluation reports often include the results of various analytics processes (a.k.a. analyses) in order to report evidence-based information. Keep in mind, of course, that analytics are not the only thing that counts as legitimate evidence.

Understanding the difference between evaluation and analytics will help your organization move forward with evidence-based decision making to better serve your community.

To learn more about CCNY’s data and evaluation toolkits, call us today at (716) 855-0007, ext. 317 or e-mail