Evaluation: Counting and Comparing Totals

One of the most common ways to evaluate our videoconference programs is to count!

What do we count?

  • Number of students impacted (a count that can be hard to get, or skewed, if a teacher does more than one VC)
  • Number of videoconferences
  • Number of types of videoconferences (ASK programs, content providers, free programs, collaborations in all their varieties, meetings, professional development, experts hosted onsite, and so on)
  • Grant funding
  • Total cost of programs
  • District contribution of the cost
  • What else? What do YOU count?

What do we compare to?

When you count the items above, what do you compare them to?

  • Last year’s data
  • The last few years’ data
  • Similar organizations’ data
  • Baseline data
  • Between schools/districts
  • What else? What do YOU compare to?
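One simple way to put several of these comparisons on a common footing is to compute the year-over-year change per district. Here is a minimal sketch; the district names and event counts are made up for illustration, not from my actual reports:

```python
def yoy_change(this_year, last_year):
    """Return {district: (absolute change, percent change)} for districts in both years."""
    return {
        d: (this_year[d] - last_year[d],
            100 * (this_year[d] - last_year[d]) / last_year[d])
        for d in this_year if d in last_year
    }

# Hypothetical event counts per district (illustrative numbers only).
this_year = {"District A": 120, "District B": 45, "District C": 80}
last_year = {"District A": 100, "District B": 60, "District C": 80}

for d, (change, pct) in sorted(yoy_change(this_year, last_year).items()):
    print(f"{d}: {change:+d} events ({pct:+.0f}%)")
```

The same delta-and-percent shape works whether the baseline is last year, a multi-year average, or another organization’s totals.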

Some of my comparisons

In our year-end reports, I always include some comparisons. These are two of them. Do you compare this way also?

This graph compares the total events among the districts I serve. This is mostly for their own benefit, as they like to compare themselves against each other. For a casual observer, the data is less useful because you don’t know the size of my districts. In addition, as seems to be true everywhere, there are certain districts that all the others like to measure themselves against.

Still, isn’t it interesting to see which districts grew their use of VC (when overall use is down about 100 events total)?

There are some drastic changes, some of which I have yet to explain. I need to make some phone calls to find out what is going on and whether there is a possible solution.

Another graph I keep an eye on is related to our funding trends.

The total program costs are what our whole program spent on content providers and ASK programs. The red grant funding columns show what we, Berrien RESA, funded for our districts, using funding from various sources. The difference between the two is what the districts contributed.
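The district share falls out of simple subtraction: district contribution = total program cost − grant funding. A quick sketch of that arithmetic, using hypothetical dollar amounts rather than my actual report figures:

```python
def district_contribution(total_cost, grant_funding):
    """Return {year: district share}, where share = total cost minus grant funding."""
    return {y: total_cost[y] - grant_funding[y] for y in total_cost}

# Hypothetical yearly totals (illustrative dollars, not real report data).
total_cost = {2008: 30000, 2009: 28000}
grant_funding = {2008: 12000, 2009: 21000}

for year, share in sorted(district_contribution(total_cost, grant_funding).items()):
    print(f"{year}: districts covered ${share:,} "
          f"({100 * share / total_cost[year]:.0f}% of total)")
```

Tracking the share as a percentage of total cost makes the funding shift visible even when the overall program budget also changes.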

Note the sharp increase in grant funding and corresponding decrease in district contributions to the program. I fully expected this, with many districts cutting back to participate only in free or fully funded programs. This trend reflects the impact that Michigan’s economy is having on public schools.

I also wonder whether this trend is happening across the nation, and if so, how the content providers are faring. Are they doing OK, or is it hitting them hard too?

What comparing are you doing as you end the year? What questions are raised from the information you’re collecting?
