Tag Archives: Change

Upgrade to our Distance Learning Room

Yesterday and today I’ve been teaching several sessions on videoconferencing & Skype to our annual Berrien RESA Tech Camp. I’ve been super excited because I got to teach with our new upgraded room!

It’s actually not as big an upgrade as some of you can afford, but I’m still excited!

Before

So first, here’s what it looked like before.

This 60-inch monitor was part of a Goals 2000 Grant installation in 1999. Look at how the colors were bleeding apart in the top right. Very annoying!

After

Of course, this is Michigan, and we’re out of money! So the upgrade was actually driven by the Smart award given to MACUL Educator of the Year winners. The award included a Smart Board, projector, clickers, slate, and document camera, as well as all the software. I’ve been so excited because I wanted to figure out how to get desktop videoconferencing working on an interactive whiteboard.

Last Friday, just in time for tech camp, the last pieces were finished in the DL room: mounting the Smart Board and building a cabinet for the Polycom VSX 7000. Here’s what it looks like in a “standards-based” videoconference with the Polycom just dialed out to my office unit.

(Apologies for the cell phone quality picture.)

Here’s what it looks like in a Skype call with Roxanne Glaser for my workshop today. I circled the Logitech webcam so you could see where that is.

Next I want to figure out how to use a long USB extension cable to mount the webcam somewhere close to the monitor for whole-class connections. Although, the more I play with Skype / desktop VC, the more I think it’s nicer to be able to pick the webcam up and move it around as needed (sort of like manual presets! Ha!).

So that’s my new distance learning room. The monitor hanging in the ceiling came out. The pole in the middle of the room came out. It’s a lot cleaner-looking and much more flexible now. Yay! And, now we can do Smart Board training in this room too!

I know, I know. A “real” upgrade would mean upgrading to HD videoconferencing. Well that’s on the list, but for now I’m happy with this!

Hovering Makes Videoconferencing Look Hard

Do you know any techs who insist on being “onsite” for every videoconference?

I’m coming to the conclusion that hovering makes it look hard. If the tech has to come out for every videoconference, how can the teacher or media aide feel that they can do it themselves?

The fact is, teachers can use the VC remote on their own!

A teacher in Jazz Workshop uses the videoconference remote.

In the Jazz Workshop, on Monday afternoon, we send participants off to small groups to practice dialing, moving the camera, setting presets, and adjusting the volume. On their own! And they do fine!!

Here is a comment from a teacher participating in the July Jazz 2010 workshop. Emphasis is mine.

There are a plethora of resources to enhance learning with students with videoconferencing.  Why haven’t I used these resources before?  Fear of technology is the response. However, after this workshop, my perception has been changed.  It is easy to use the equipment and find a videoconference appropriate for your classroom.

I came into this class very apprehensive about videoconferencing.  I wasn’t sure I could run the equipment and felt it was going to be over my head. However, I feel I have grown so much as a VC newbie!  I feel comfortable looking for programs, filling out registration forms, and confident in my ability to be a leader in our building.  I am really excited about sharing this information with our school through a staff meeting or allowing teachers to come in and observe during a videoconference.

How do we help teachers move past their fear?

  • Give them the remote!
  • Let them play with it!!
  • Give them short, easy cheat sheets and let them practice in a non-threatening environment (i.e., at first when they aren’t in front of kids!)
  • Help them see how easy it is!

What do you think? Do you agree? How do you reduce the fear for your teachers?

Other Ways to Evaluate

This year, I decided I wanted more than numbers to evaluate my videoconference program. So I created two surveys, one for teachers who participated in a videoconference, and another for teachers who did not participate in a videoconference. So far, I have 54 responses and am hoping to get more before the end of the school year.

Here is what I asked my teachers:

If they participated in a videoconference this year:

  • What benefits do you see to your students in using videoconferencing?
  • Which videoconferences do you want to do again next year?
  • Which videoconference(s) were NOT good and not worth your time?
  • Are there any other topics that you wish there was a videoconference for?
  • Any other comments?

If they did NOT participate in a videoconference this year:

  • Have you ever done a videoconference in the past?
  • What videoconference(s) do you wish you could participate in next year?
  • What do you need to be able to participate in a videoconference next year?
  • Any other comments?

I also collected some demographic data: grade level, subject area, name & district.

So far I am getting interesting feedback, and hope that it will give me good data to plan programs and events for next year…

What questions are you asking YOUR teachers?

Evaluation: Comparing Against National Data

Yesterday we talked about comparing our end of year report numbers to last year’s data and between schools and districts.

What about comparing your data to national data?

Graph by nDevilTV

In the spring of 2008, many of you contributed to my dissertation study focusing on videoconference coordinators and the use of videoconferencing in K12 schools.

When you look at your end of year data, it might be useful to compare it to the data from this study.

Three measures

There are three ways to measure your videoconferences to compare against this data (a quick calculation sketch follows the list):

  • Total videoconference events (professional development, meetings, and student events, excluding daily courses) divided by the number of students times 100 to get a whole number.
    (Total events / #students * 100)
  • Total student events (projects, collaborations, content providers, etc; excluding daily courses) divided by the number of students times 100 to get a whole number.
    (Student events / #students * 100)
  • Percent of teachers using videoconferencing.
    (#Teachers who used VC / total#teachers)
  • You can add a fourth by adding these measures together.
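
If you want to script these, here’s a minimal sketch of the calculations in Python (the function and variable names are just my own illustration, not from any particular tool):

```python
def utilization_scores(total_events, student_events, teachers_using_vc,
                       total_teachers, total_students):
    """Compute the three utilization measures plus the combined score.

    Exclude daily full-length shared classes from the event counts before
    calling this, since they skew the data.
    """
    total_events_score = total_events / total_students * 100
    student_events_score = student_events / total_students * 100
    percent_teachers = teachers_using_vc / total_teachers * 100
    return {
        "total_events_score": round(total_events_score, 1),
        "student_events_score": round(student_events_score, 1),
        "percent_teachers": round(percent_teachers, 1),
        "total_utilization": round(
            total_events_score + student_events_score + percent_teachers, 1
        ),
    }
```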

Comparison Data

In my dissertation study (277 respondents from six countries and 31 U.S. states), the utilization statistics were:

  • Total events/# of students *100: ranged from 0 to 60 with a mean of 4.
  • Student events/# of students *100: ranged from 0 to 67 with a mean of 4.
  • Percentage of teachers using videoconferencing: ranged from 0 to 100% with a mean of 26%.
  • Total Utilization Score (the three added together): ranged from 0 to 180 with a mean of 35.

So, you can aim for the mean, or you can aim for the highest range. Either way it will give you a feel for how your program is doing in comparison with the respondents to my research study.

Sample Analysis

For the fun of it, I took the data from one of my high-use elementary schools. This was a 2nd-3rd grade building with 16 regular classroom teachers. They did 63 VCs this year. (A quick check using the sketch above follows the list.)

  • Total events: 63/404 students *100 = 15.6
  • Student events: the same. No PD or meetings this year. 15.6
  • Percent of teachers using VC: 14/16 of the regular classroom teachers used VC: 87.5%
  • Total Utilization Score: 118.7 (which is nicely higher than the mean of 35; but not quite as high as the top 180 score in my study).
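
Plugging this building’s numbers into the utilization_scores sketch above reproduces the same figures (again, purely illustrative):

```python
scores = utilization_scores(
    total_events=63,       # 63 VCs this year
    student_events=63,     # no PD or meetings, so the same
    teachers_using_vc=14,  # 14 of the 16 regular classroom teachers
    total_teachers=16,
    total_students=404,
)
print(scores)
# {'total_events_score': 15.6, 'student_events_score': 15.6,
#  'percent_teachers': 87.5, 'total_utilization': 118.7}
# For comparison, the study mean was 35 and the top score was 180.
```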

On the other hand, I have a couple schools who didn’t use it at all this year. So don’t think that all is rosy and perfect in my corner of the world!

What do you think? Is this a fair comparison? Does it weight the percent of teachers using it too much? How does your school compare? How would you measure total utilization?

(And don’t forget: I’m not counting daily full-length shared classes because they skew the data.)

Evaluation: Counting and Comparing Totals

One of the most common ways to evaluate our videoconference programs is to count!

What do we count?

  • Number of students impacted (which is sometimes hard or skewed if a teacher does more than one VC; see the tallying sketch after this list)
  • Number of videoconferences
  • Number of types of videoconferences (ASK programs, content providers, free programs, collaborations in all their varieties, meetings, professional development, experts hosted onsite, etc. etc.)
  • Grant funding
  • Total cost of programs
  • District contribution toward the cost
  • What else? What do YOU count?
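
If your VC log lives in a spreadsheet or a simple export, most of these counts fall out of a few lines of script. Here’s a rough sketch in Python; the field names and sample records are hypothetical, not pulled from any real system:

```python
from collections import Counter

# Hypothetical log: one record per videoconference event.
vc_log = [
    {"district": "District A", "type": "content provider", "teacher": "Smith",
     "students": 25, "cost": 125.00, "grant_funded": 100.00},
    {"district": "District A", "type": "ASK program", "teacher": "Smith",
     "students": 25, "cost": 0.00, "grant_funded": 0.00},
    {"district": "District B", "type": "collaboration", "teacher": "Jones",
     "students": 28, "cost": 0.00, "grant_funded": 0.00},
]

total_vcs = len(vc_log)
vcs_by_type = Counter(event["type"] for event in vc_log)
teachers_using_vc = {event["teacher"] for event in vc_log}    # unique teachers

total_cost = sum(event["cost"] for event in vc_log)
grant_funding = sum(event["grant_funded"] for event in vc_log)
district_contribution = total_cost - grant_funding            # what the districts paid

# "Students impacted" double-counts a class that does more than one VC,
# which is why that number is so easy to skew.
students_impacted = sum(event["students"] for event in vc_log)
```

From there it’s easy to group the same records by district or by year for the comparisons below.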

What do we compare to?

When you count the above items, what do you compare it to?

  • Last year’s data
  • The last few years’ data
  • Other similar organizations’ data
  • Baseline data
  • Between schools/districts
  • What else? What do YOU compare to?

Some of my comparisons

In our year-end reports, I always include some comparisons. These are two of them. Do you compare this way also?

This graph compares the total events among the districts I serve. This is mostly for their own comparison as they like to compare themselves against each other. For a casual observer such as you, the data is less useful because you don’t know the size of my districts. In addition, as seems to be true everywhere, there are certain districts that all the others like to measure themselves against.

Still, isn’t it interesting to see which districts grew their use of VC (when overall use is down about 100 events total)?

There are some drastic changes, some of which I have yet to explain. I need to do some phone calls and see what is going on and if there is a possible solution.

Another graph I keep an eye on is related to our funding trends.

The total program costs are what our whole program spent on content providers and ASK programs. The red grant funding columns show what we, Berrien RESA, funded for our districts, using funding from various sources. The difference between the two is what the districts contributed.

Note the sharp increase in grant funding and corresponding decrease in district contributions to the program. I totally expected this, with many districts cutting back to participate in only free or fully funded programs. This trend reflects the impact that Michigan’s economy is having on public schools.

I also wonder if this trend is happening across the nation, and if so, how the content providers are faring. Are they doing OK, or is it hitting them hard too?

What comparing are you doing as you end the year? What questions are raised from the information you’re collecting?

Informal Evaluation Strategies

by Horia Varian from Flickr Creative Commons

One of the ways to evaluate your videoconference program is to ask questions throughout the school year. Here are some of the questions I ask:

Questions to Ask Teachers

Often when I don’t get to actually watch a videoconference, I like to email the teacher afterwards, particularly if the program is one I haven’t seen before or if the teacher is new to VC. I ask:

  • How did it go?
  • Did the content meet your curriculum?
  • Was the quality of the VC ok?
  • Did you have any problems with it?

Often I can then resolve any issues positively so that the teacher will come back for another videoconference.

Questions to Ask VC Coordinators

I also like to ask questions of my VC coordinators whenever I get a chance. Often on the phone, as we’re discussing an issue or problem, I ask open-ended questions to learn more about how VC is going in their school.

  • How are your teachers doing this year? Are they busy and stressed?
  • How is it going? (often this question brings out barriers, and we can then discuss solutions together)
  • How is your principal supporting VC this year?

The trick is to really listen! Listen to what might seem to be “complaints” or “excuses” in your mind. Listen! Are there ways to address those issues to make it easier for your teachers & coordinators?

What ways do you informally evaluate your program throughout the school year? Please comment and share!

End of Year Evaluation Strategies

by kevinzhengli from Flickr Creative Commons

As we come to the end of the school year, it’s time to reflect on the year, evaluate how it went, and use that data to plan for next year. So this week, we’ll be focusing on evaluation of our videoconference programs.

What Do We Evaluate?

  • How many videoconferences were done by each school
  • Which teachers used VC and why
  • Which teachers who used VC in the past didn’t use it this year, and why
  • Which programs that we offered were effective and which weren’t
  • Effect on student achievement (if possible)
  • Which schools need more assistance
  • What the training needs are for the summer and next year
  • What else can you think of?

Previous Thinking on Evaluation

Before thinking more this week, let’s review some of what has already been said about evaluating our videoconference programs.

For some additional reading, consider these research articles:

How are you evaluating your program from this school year? Please comment & share!

Don't Blame the Teachers! Respect the Resistance!

Thursday night I listened to Larry Cuban over at Classroom 2.0. It was an interesting interview, peppered with entertaining comments in the chat. While listening, I skimmed Larry’s blog. I found a comment on one of his posts that is a nice, succinct summary of the bigger-picture problem with change in schools:

Do high school structures promote enough time and the classroom climate to support frequent and open use of reasoning skills? Hardly. Take for example, the 4 Ts: Time, Teacher load, Textbooks, and Tests. Read more…

We’ve discussed before the challenges high school teachers face in using VC in their curriculum, and they mirror Cuban’s 4 Ts.

I’ve also been reading about change – What’s Worth Fighting For Out There?, Leading in a Culture of Change, and Educational Change Over Time? The Sustainability and Nonsustainability of Three Decades of Secondary School Change and Continuity.

I’ve learned about the big picture of change in education, which vibes with what Cuban is saying. I’ve also learned that it’s important to respect and listen to the resistance. They might be able to see challenges you can’t see. They usually have a good reason for resisting. We need to listen to that! Understand it. Respect it.

So here’s the question for you:

How are YOU respecting the resistance? If you are blaming teachers for not integrating your favorite technology in their curriculum, what are you doing to help them get past the 4 Ts? When’s the last time you really understood and experienced the pressures teachers face? Are you just throwing ideas at them (here’s a great Web 2.0 tool), or are you actually setting up lessons and projects that meet their curriculum goals?

Supporting VCs by a High School Media Specialist

This week I’m finishing up a session of the Planning Interactive Curriculum Connections online class. One of the participants, a high school media specialist, wrote an excellent plan for supporting VCs in her school. I wanted to highlight a few points:

Here at the high school, I can also be available to assist teachers during the actual connection. This includes assisting with the equipment, setting up the room appropriately, and making students aware of what to expect and how to speak and act during the VC. From my experience, most high school teachers feel fairly confident with the equipment after they are given instruction on how to use it. High school students can also assist in running the equipment (muting microphone, changing camera presets, etc.). It is imperative to provide teachers with sufficient support during a VC program. If they feel overwhelmed by the technical aspects of a VC program and keeping their students engaged and on task, chances are they will not enjoy the experience and will not be interested in scheduling future VC programs. Therefore, communication is the key. Teachers should not be left alone until they feel comfortable with running the VC program by themselves. Once they express that they are comfortable, they can be left with contact information in case they face technical problems. – Alma Holtgren, Lakeshore High School, Stevensville, MI

Do you agree? Are you able to provide this type of support to your teachers? If not, how do you compensate? Please comment!

Videoconferencing Implementation

This post is part of a series examining articles on the communication aspects of videoconferencing.

Reference: Baber, J. R. (1996). Re-visioning corporate communication: A case study of videoconferencing implementation. Retrieved from ProQuest Digital Dissertations. (AAT 9700122)

Summary

This study was more about the implementation of corporate communication via videoconferencing than about the actual communication, but it’s still useful and interesting.

Baber (1996) offers the Culture-Process-Technology approach as a framework for the successful implementation of videoconferencing in the corporate environment. The framework recommends:

(1) that organizations should ensure that managers at all levels are willing to support the implementation process; (2) that videoconferencing “champions” be found to administer the system at the project level; (3) that operator training programs be developed to create a wide base of skilled end users; (4) that conference schedules be published regularly to inform end users of meeting times and to sustain ongoing interest in videoconferencing; and (5) that use of videoconferencing system features be consistently modeled to encourage the use of innovation and the re-invention of technology. (p. 128)

Application/Discussion

Do you have these principles in place in your school?

1. Do you have a principal/administrator supporting the implementation of videoconferencing in your school? What does that support look like?
2. Do you have a champion for VC in your school? (Probably you!)
3. Are a lot of people getting skilled with using VC? Can your teachers mute & unmute? Can they use presets (if they are set for them ahead of time)? Can they dial if they are given the IP?
4. How do you organize and publish schedules? Is your system working for you?
5. Are interesting and innovative ways of using VC celebrated and communicated?

What else do you think is important for implementation?