Training Application Reviewers Using Adobe Connect

For two years, I chaired the awards committee of Indiana University’s Graduate and Professional Student Organization (GPSO), a duty I relinquished a few months ago. During that time, I oversaw processes that awarded nearly $50,000 to Indiana University (IU) graduate and professional students for travel and research. Of course, I didn’t review the hundreds of award applications by myself; each semester, several dozen IU graduate and professional students reviewed them. IU provides students and staff access to Adobe Connect, and I used that tool to orient and train these application reviewers. This worked out very, very well, and in this post I will share how and why I used Connect so you can adapt parts of this process for your own use.

Adobe Connect is an online tool that allows participants to share documents and chat using video, voice, and text. It is often used to conduct webinars; my former employer uses it regularly, and EDUCAUSE uses it very effectively for its (free and awesome) EDUCAUSE Live! sessions. I used Connect because it was available and very familiar to me, not because it has unique features missing from other similar tools. In particular, I used Connect to:

  1. Hold training sessions online, hoping that they would be more convenient to schedule and attend than in-person meetings.
  2. Record training sessions so those who could not attend live sessions could view them later.
  3. Communicate with attendees using a microphone to talk over some simple and necessary PowerPoint slides.
  4. Allow participants to use text chat to ask questions and make comments. Connect can allow others to use microphones and webcams, but I wanted to keep things simple and the playing field level for attendees.
  5. Use poll questions during the training as described below.

The central premise of these training sessions is that reviewing award applications closely resembles content analysis, so I approached them much as I approach training content analysis coders:

  1. Provide a broad overview of the philosophy and purpose of the application review process. This included not only a brief review of the mission of the GPSO Awards Committee but also an overview of the entire process, including those parts that fall outside the application review itself. This broad overview not only helped reviewers place their efforts in context but also armed them with knowledge that helped them resolve ambiguities and unanticipated situations. If you’re the touchy-feely sort, it helped empower the reviewers. If you’re not, it helped reduce the number of questions they asked me. Either way, it was very helpful for the process.
  2. Introduce the award applications and the criteria used to review them. As I did this, I briefly tried to explain the rationale for the questions on the applications.
  3. Read through one or two completed applications (made up or drawn from previous semesters) and review them using the criteria previously introduced. To do this, I displayed each question’s contents and the relevant review criteria on screen as I scored it. It was important that I not only review the applications but also “think aloud” while doing so to help the reviewers understand how to apply the review criteria.
  4. Read through another completed application or two and have the reviewers score them. As I showed each question and the relevant review criteria, I opened a poll question asking each reviewer to enter his or her score for that question. After a minute or two, I revealed the poll results and we discussed them. I also talked through how I would have scored that question to help reinforce and clarify the review criteria. Reviewers who had scored the question differently from the rest of the group (and they knew it, because I had shown the poll results) would ask questions, and we would discuss until we arrived at a consensus on how to score that question.

If this were content analysis, I would have repeated step 4 until we reached a pre-determined interrater reliability threshold. But it’s not content analysis, and it seemed unreasonable to ask the volunteer reviewers to spend that much time in training, so we just reviewed one or two applications together. After the training sessions, I sent all of the reviewers (a) a link to the recording of the training session and (b) a copy of the PowerPoint slides.
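For the curious, here is a minimal sketch of what such a reliability check might look like, assuming just two reviewers and using percent agreement plus Cohen’s kappa as the index; the scores and function names are made up for illustration and are not from my actual training sessions:

```python
from collections import Counter

def percent_agreement(scores_a, scores_b):
    """Fraction of items the two reviewers scored identically."""
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

def cohens_kappa(scores_a, scores_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(scores_a)
    observed = percent_agreement(scores_a, scores_b)
    counts_a, counts_b = Counter(scores_a), Counter(scores_b)
    # Expected agreement if both reviewers assigned scores at random
    # according to their own observed score distributions.
    expected = sum(
        (counts_a[s] / n) * (counts_b[s] / n)
        for s in set(scores_a) | set(scores_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical scores from two reviewers on the same ten applications,
# each scored on a 1-5 scale.
reviewer_1 = [4, 3, 5, 2, 4, 4, 3, 5, 2, 4]
reviewer_2 = [4, 3, 4, 2, 4, 5, 3, 5, 2, 4]

print(f"Agreement: {percent_agreement(reviewer_1, reviewer_2):.2f}")
print(f"Cohen's kappa: {cohens_kappa(reviewer_1, reviewer_2):.2f}")
```

In a real content analysis project with several dozen coders, a measure designed for more than two raters (such as Krippendorff’s alpha or Fleiss’ kappa) would be a better fit, but the idea is the same: keep training until the index clears whatever threshold you set in advance.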

This process seemed to work very well. It was easier to schedule the training sessions because I only had to worry about finding the best date and time for the reviewers. To schedule the sessions, I offered several choices on the reviewer application form and used the results to pick the date and time when the most reviewers were available, prioritizing the preferences of new reviewers who had never been through training. It was trivial to record the training sessions and let everyone, attendees and non-attendees alike, view the recordings. Most importantly, I could use poll questions during the training to accurately gauge whether my reviewers had arrived at a consensus understanding of the review criteria. Hiding the poll results while reviewers were answering ensured that responses were not influenced by those who had already answered, and revealing the results afterward let each reviewer see whether his or her understanding was accurate.
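The scheduling step is simple enough to automate. Here is a minimal sketch, assuming availability was collected as a list of acceptable time slots per reviewer; the weighting constant, names, and slots are all hypothetical:

```python
# Count each reviewer's acceptable slots, weighting new reviewers more
# heavily so the chosen time favors those who still need live training.
NEW_REVIEWER_WEIGHT = 2

responses = [
    # (is_new_reviewer, available_slots) -- made-up form responses
    (True,  ["Tue 5pm", "Wed 6pm"]),
    (False, ["Tue 5pm"]),
    (False, ["Wed 6pm", "Thu 7pm"]),
    (True,  ["Wed 6pm"]),
]

scores = {}
for is_new, slots in responses:
    weight = NEW_REVIEWER_WEIGHT if is_new else 1
    for slot in slots:
        scores[slot] = scores.get(slot, 0) + weight

best_slot = max(scores, key=scores.get)
print(best_slot, scores)  # "Wed 6pm" wins in this example
```

In practice I did this tallying by hand from the form results, but the logic was the same: pick the slot that covers the most reviewers, breaking ties in favor of new reviewers.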

I don’t know whether anyone else uses Connect or similar tools to train application reviewers in this manner; I assume others do something similar. The basic process also applies to contexts other than award applications. As long as the technology is accessible, the review criteria are well defined, sample documents are available (they can be made up or old documents can be used – with permission, of course!), and enough attendees can be convinced to attend live (instead of waiting for the recording), this should work reasonably well in other situations where a group of people is asked to apply common criteria to a set of documents, e.g. conference program review or search committees. This process seemed to make my training sessions accessible and effective, and I think it could work for others’ training sessions, too.

I apologize for not providing examples of these training sessions or training materials. I deleted most of my GPSO documents once I handed over my responsibilities to the new chair; there is so much confidential information in those documents that I don’t want access to them anymore! Even if I still had access to the recordings and training documents, they probably contain confidential material that would prevent me from sharing them publicly anyway. Sorry! If you really want or need some examples, please let me know and I would happily mock some up for you.

