The coronavirus has forced college instructors to transition to virtual teaching. But many have little to no experience teaching remotely, much less creating effective online courses in a matter of weeks. 

Students are struggling with the change, too. On Twitter, widely shared posts about the difficulties of transitioning to an online term have drawn a chorus of students describing their experiences, saying they need more flexibility from instructors and are unimpressed with remote classes. 

Within this environment, many colleges are looking to improve communication with their students by surveying them about what's working and what isn't. Beyond gathering data for post-mortems on the unexpected and unconventional spring term, schools are also looking for ways to improve online instruction should the pandemic force campuses to remain shuttered in the fall. 

To get feedback from as many students as possible, higher ed experts say every type of college should be conducting surveys. They should be easy to take and include open-ended questions that can capture a diversity of experiences. 

“Everyone is grappling with this,” said Christine Wolff-Eisenberg, manager of surveys and research at Ithaka S+R, a consultancy that created a survey for higher ed institutions to use to assess their COVID-19 response. “It’s not the kind of thing that’s isolated to just one part of the country or one part of the sector.”

Survey early and often

The Higher Education Data Sharing Consortium (HEDS) worked quickly to make a survey that its member colleges — which are mostly liberal arts schools — and other institutions could use to assess how their spring term was going. Schools could adapt the template to fit their needs. 

Questions cover how well students thought faculty and staff members helped them with the transition online and communicated information about the pandemic. The survey also polls students about which instructional methods are the most and least effective and whether they plan to return in the fall semester. As of May 11, it had garnered responses from more than 30,000 students across 56 schools. 

“It was a very hard transition for students, faculty and the institutions, and we thought the officials we work with would need some feedback as soon as possible as to how that transition was going because it was so abrupt,” said Charles Blaich, director of HEDS. 

HEDS is posting preliminary results that may help administrators make changes more quickly. For instance, it found that students who are confident they will return in the fall are also more likely to feel connected to their institution and that their school has done a good job at protecting them from the health effects of COVID-19. 

Educause, a nonprofit focused on improving higher ed’s use of information technology, created a kit for institutions to use to poll students about their experiences. “The primary goal was, ‘Let’s rapidly put something together that’s practical that members can take and use right away,'” said Mark McCormack, Educause’s senior director of analytics and research. 

Likewise, Ithaka S+R’s student survey includes a mix of questions about their general wellness, course formats and the resources available to them. It mostly asks students about their experiences in the last week rather than speculating about the future. “It’s really intended to take the temperature of students at a moment in time,” Wolff-Eisenberg said. 

That enabled colleges to take quick action on the results instead of waiting until after the semester ended to adjust how classes and services were delivered. For instance, the survey found that many students wanted more information about what financial aid may be available to them — a problem administrators can quickly remedy, she said. 

Getting students to take them

Surveys should be structured so they take five to 10 minutes to complete and include a mix of closed- and open-ended questions. 

This way, they are less likely to overburden students during an already stressful time, while still giving them an opportunity to describe the unique circumstances they’re facing. “When you actually hear their stories of what they’re saying — that’s so much more powerful,” HEDS’ Blaich said. “It gives you a sense of what’s underneath the numbers.”

That was the case for Georgetown University’s survey, which is designed for students worldwide to share how COVID-19 is impacting their lives and academic careers. 

Common themes emerged across 516 students’ responses to open-ended questions. Several talked about their worsening mental health in the wake of the crisis and their desire for instructors to be more flexible with assignments and deadlines. 

“Professors send emails telling us we need to have a stable internet connection, that our background should be tidy and that we should not be distracting during online classes, which struck me as incredibly insensitive,” wrote one student, whom the survey kept anonymous. “They should be more understanding and recognize that these are extreme circumstances.”


To garner more responses, some surveys let students remain anonymous. But HEDS is taking a different approach by asking for contact information. 

That way, school officials could connect students with resources to help with problems they said they were experiencing in the survey. “Some institutions we work with say they’re literally reaching out to every student in some way, shape or form to talk with them and see what’s going on to make sure they’re okay,” Blaich said. “(The survey) is a piece of information you can use to support that other broader contact effort.”

It’s also essential to explain to students that their responses could drive change at the institution. “We’ve got to communicate to students that we need their help in getting feedback,” said Steve Benton, a data scientist at Campus Labs, which makes data collection software for colleges. “How did this go? What can we do to improve? What are their needs?”

To do so, colleges should be ready to email and even call students to remind them about the survey and encourage them to complete it. In some cases, mailing the survey or sending it multiple times may be necessary, he added. 

The results are in. Now what?

If schools can’t reach students who lack reliable access to a computer or internet connection, the data that surveys collect will be incomplete. 

“It’s very important to think about who is (the survey) population,” Benton said. “You certainly don’t want to exclude students who are financially strapped to the point that they don’t have Wi-Fi access.”

That’s why it’s vital to optimize surveys for mobile devices and to ensure they’re accessible to students with disabilities. Educause, for example, tests its surveys on several types of devices — including computers, tablets and smartphones — and offers them in formats that work well with screen readers. 

Colleges are also using the information to determine what can be improved next term. However, institutions should be wary of using the results to completely guide their strategy for the fall. 

“The data that you would collect through a survey like this should be the beginning of the conversation,” Educause’s McCormack said. “I don’t think the survey in and of itself would tell you everything that you would want or need to know, (but) it might give you a couple of signposts or indicators that something may be off.”
