Wrapping a MOOC: Student Perceptions of an Experiment in Blended Learning



Derek O. Bruff

Director, Center for Teaching
Senior Lecturer, Department of Mathematics
Vanderbilt University
Nashville, TN 37235 USA
derek.bruff@vanderbilt.edu

Douglas H. Fisher
Associate Professor of Computer Science and of Computer Engineering
Department of Electrical Engineering and Computer Science
Vanderbilt University
Nashville, TN 37235 USA
douglas.h.fisher@vanderbilt.edu

Kathryn E. McEwen
Graduate Assistant, Center for Teaching
Doctoral Candidate – German, Department of Germanic and Slavic Languages
Vanderbilt University
Nashville, TN 37235 USA
kathryn.e.mcewen@vanderbilt.edu

Blaine E. Smith
Doctoral Candidate – Language, Literacy, and Culture
Department of Teaching and Learning
Vanderbilt University
Nashville, TN 37235 USA
blaine.smith@vanderbilt.edu

Abstract

Although massive open online courses (MOOCs) are seen to be, and are in fact designed to be, stand-alone online courses, their introduction to the higher education landscape has expanded the space of possibilities for blended course designs (those that combine online and face-to-face learning experiences). Instead of replacing courses at higher education institutions, could MOOCs enhance those courses? This paper reports one such exploration, in which a Stanford University Machine Learning MOOC was integrated into a graduate course in machine learning at Vanderbilt University during the Fall 2012 semester. The blended course design, which leveraged a MOOC and its platform for lecturing, grading, and discussion, enabled the Vanderbilt instructor to lead an overload course on a topic much desired by students. The study shows that while students regarded some elements of the course positively, they had concerns about the coupling of the online and in-class components of this particular blended course design. Analysis of student and instructor reflections on the course suggests dimensions for characterizing blended course designs that incorporate MOOCs, either in whole or in part. Given the challenges reported in this case study of integrating a MOOC in its entirety into an on-campus course, the paper advocates for more complex forms of blended learning in which course materials are drawn from multiple MOOCs, as well as from other online sources.

Keywords: massive open online course (MOOC), blended learning, online learning, wrapper, flipped classroom, course cohesion, subject coupling, task coupling, local learning communities, global learning communities, course customization

Introduction

Technology continues to transform education in traditional and online settings (Baldwin, 1998), as the recent proliferation of massive open online courses (MOOCs) demonstrates (Guthrie, 2012; Mangan, 2012; Pappano, 2012). Although MOOCs are seen to be, and in fact are designed to be, stand-alone online courses (Hill, 2012), their introduction to the higher education landscape has expanded the space of possible blended or hybrid course designs (those that combine online and face-to-face learning experiences). Incorporating another instructor's MOOC simplifies the blended course design problem in some respects: it fixes the online component of the blended course while leaving the instructor free to shape the in-class components. However, fitting in-class modules into an existing MOOC in a way that optimizes student engagement, satisfaction, and ultimately learning can be challenging.

This paper reports a case study of a blended graduate course in machine learning at Vanderbilt University in Fall 2012, which incorporated a Stanford University MOOC. It reports student perceptions of the blended course and identifies elements of the blended course design that the authors think are responsible for these perceptions. Drawing on these findings, the paper suggests a number of design considerations of potential interest to instructors wishing to build blended learning experiences around MOOCs or to integrate online and face-to-face components of blended courses more generally. Although the blended course in this study adopted the entirety of one particular MOOC, the paper suggests that other customizations may well be both possible and desirable, particularly those that select from and mix multiple MOOC sources.

Background

Blended Learning

Blended or hybrid approaches to teaching integrate face-to-face (offline) instruction with online materials, creating what can be a flexible and effective model for instruction (Aycock, Garnham, & Kaleta, 2002; Bowen, Chingos, Lack, & Nygren, 2012; Hill, 2012). By leveraging online modes of content delivery outside of class time, blended courses can free face-to-face sessions for instructor feedback, applications, and interaction (Aycock et al., 2002; Hill, 2012). Indeed, a 2010 meta-analysis prepared by the United States Department of Education reports that in recent experimental and quasi-experimental studies, blended instruction has been found to be more effective than either face-to-face or fully online instruction (Means, Toyama, Murphy, Bakia, & Jones, 2010). However, caveats are in order, as the meta-analysis notes "it was the combination of elements in the treatment conditions (which was likely to have included additional learning time and material as well as additional opportunities for collaboration) that produced the observed learning advantages" (Means et al., 2010, p. xviii).

As Aycock et al. (2002) outline, blended approaches demonstrate wide variations not only in the distribution of face-to-face and online time, but also in course design, which reflect and accommodate differences in teaching style and course content. Although there is no "standard" approach to blended courses, they often involve a rigorous, time-intensive redesign of traditional face-to-face courses to fully integrate face-to-face and online learning (Aycock et al., 2002; Stone & Perumean-Chaney, 2011). Students' work online must be made clearly relevant to their work in the classroom, just as the face-to-face sessions must draw on and apply the online materials (Babb, Stewart, & Johnson, 2010; Gilbert & Flores-Zambada, 2011; Toth, Amrein-Beardsley, & Foulger, 2010). Building on this research, this paper will argue, based on the authors' case study, that the degree and type of coupling between online and face-to-face components is an important dimension along which blended courses can be varied.

MOOCs and Blended Learning

Despite variations in format, the "traditional" blended course assumes a common designer of both face-to-face and online learning: namely, the on-campus instructor(s) (Aycock et al., 2002; Gilbert & Flores-Zambada, 2011; Rodriguez & Anicete, 2010). For example, in one version of what is often called a "flipped" or "inverted" classroom (Lage, Platt, & Treglia, 2000), students gain first exposure to course content through online video lectures created by their instructor, then explore that content more deeply during class through active learning exercises also designed by their instructor (Talbert, 2012). Although some versions of the flipped classroom involve materials created by others, such as the use of textbooks for the pre-class first exposure to content (Mazur, 2009), the blend of online and face-to-face learning activities in such courses is designed by the students' on-campus instructor.

MOOCs present a new option for blended course design. Instead of "flipping" one's course by producing online lecture videos or leveraging textbooks, instructors can "wrap" their courses around existing MOOCs (Caulfield, 2012a; Fisher, 2012; Koller, 2012; Mangan, 2012; Shirky, 2012). In this approach, students in an on-campus course are asked to participate in part or in whole in a MOOC hosted at another institution, with the local instructor supplementing that online learning experience with face-to-face classroom interactions. Since MOOCs are designed externally and intended to function as stand-alone courses (Hill, 2012), incorporation of a MOOC in a blended learning experience constrains the face-to-face instructor's course design decisions: the online component is relatively fixed, and only the in-class component can be varied. The online component is, however, only relatively fixed because the instructor of the wrapper can always choose to use only parts of the MOOC, a possibility that the authors return to later in discussing customization around more than one MOOC and other online content.

The challenges posed by "wrapping" a course around a MOOC are not unlike those posed by incorporating a textbook, authored by another, into a course. However, given the variety and interactivity of learning experiences available on most MOOCs – lecture videos, automatically graded quizzes, discussion forums – the use of externally hosted MOOCs in blended courses involves design questions not raised by the use of textbooks. These are the questions explored in the current case study.

Methods

Instructional Context

The setting for the present case study (Yin, 2003) was a graduate-level course on machine learning taught at Vanderbilt University, a research university, by co-author Fisher. The Machine Learning graduate course was typically offered only every other year by the computer science program. Due to demand for the course from another graduate academic program of the University, a special section was run in Fall 2012, an "off" year, as an overload course for Fisher. As a result, many of the 10 students in the course were from outside computer science; all were graduate students with some computing sophistication, but all were new to machine learning. In order to maintain a sustainable workload across all his courses, Fisher decided to draw on some of the educational resources provided by MOOCs, building on his experience incorporating open educational resources (Wiley & Gurrell, 2009), such as online lecture videos created by other faculty, into previous courses (Fisher, 2012). In fact, in an earlier Spring 2012 Machine Learning course, Fisher used online lectures by Stanford professor Andrew Ng, director of the Stanford Artificial Intelligence Lab and co-founder of Coursera.

Students in the Fall 2012 course, however, were asked to go well beyond simply watching Ng's online lectures; they actually enrolled in and were required to complete Ng's Machine Learning MOOC on the Coursera platform. This involved watching lecture videos, completing quizzes and programming assignments, and, optionally, participating in discussion forums. Students were asked to take screenshots of their submitted quizzes and programming assignments and send those to Fisher, allowing that work to contribute to the students' grades in the Vanderbilt course.

The start of the 10-week Stanford MOOC happened to coincide with the beginning of the Vanderbilt semester, one of the reasons Fisher chose to use it as part of his course. However, there were topics in machine learning not addressed by the MOOC that were of potential use to students in their research at Vanderbilt. Thus, students were also assigned additional readings, which were discussed in weekly face-to-face class sessions led by Fisher. Whereas the MOOC provided an introduction to some classic and widely used methods of machine learning, the readings were journal papers, consisting of both recent and seminal research in the field, chosen to build on the topics covered in the MOOC and to introduce other important areas of machine learning. During the final four weeks of the semester, after the MOOC ended, students worked individually on projects of their own design, receiving guidance and feedback from Fisher and each other during the remaining in-class meetings. Student projects of this sort had been components of previous offerings of Fisher's Vanderbilt Machine Learning course; it was the first 10 weeks of the 14-week semester that differed in the Fall 2012 offering.

Figure 1 shows the rough layout of machine learning topics throughout the Fall 2012 course. The left column gives the topics covered by journal readings, which were the focus of weekly in-class discussions; the right column lists the online video topics from Andrew Ng's MOOC. The one-week offset in the readings (left column) was intended to allow Ng's MOOC videos to first introduce concepts before they were then expanded upon in the readings. Arrows between the columns indicate some of the dominant conceptual correspondences between the readings and MOOC videos, though synergies along these lines could only be realized at relatively high levels of abstraction and not at a "nuts-and-bolts" level. Fisher drew on his experience with these same videos in the Spring 2012 Machine Learning course in arriving at this sequencing for the Fall 2012 course.


Figure 1. Topics covered in the wrapper course by readings (left) and MOOC (right)

In many cases no correspondences are shown. For some topics, this is because the connections are pervasive. For example, though no correspondences are shown for Week 6 of the MOOC lectures on experimental evaluation, this material had relevance across nearly all in-class discussions and online topics; in fact, the readings and their discussions presaged and reflected on this material throughout the course. Similarly, Week 7 of the in-class discussions, on projects, also referenced material throughout both online and in-class topics. In contrast, some in-class topics, such as relational learning, inductive logic programming, and knowledge-biased learning, simply had no strong linkage with MOOC topics, certainly not at the nuts-and-bolts level, though in some cases the contrasts suggested by paradigmatic differences were themselves a topic of discussion.

Although the topics listed in Figure 1 are course specific, the broader point the figure conveys, that the readings for in-class discussion were selected with some care, applies generally. The readings also reflect Fisher's priority of including certain topics that, although they had little or no substantive connection to the online material, he thought important, particularly in the Vanderbilt context. And in all cases, even the linkages that did exist were treated at a relatively high level of abstraction.

Fisher describes this course structure as a "wrapper" approach, a term adopted from the machine learning research literature, where it refers to an algorithm that is wrapped around another in order to select the most salient features of the data, thereby improving overall learning. With this approach in mind, he "wrapped" his on-campus course around the Machine Learning MOOC offered on the Coursera platform. The in-class lectures and low-stakes homework assignments Fisher provided in previous offerings of this course were replaced by the MOOC's online lecture videos, automatically graded quizzes, and programming assignments. Doing so enabled Fisher to use class time differently, focusing it more on interactive discussions and more challenging material. This structure is a version of the flipped classroom referenced above. In Fisher's case, the MOOC played the role of the earlier video lectures or textbook, providing students with a structured introduction to some of the course content.
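To make the metaphor concrete, the following is a minimal sketch of wrapper feature selection in the machine learning sense: a greedy forward search wrapped around a learner, using the learner's own cross-validated performance to decide which features to keep. The dataset, estimator, and search strategy are illustrative choices, not materials from either the MOOC or the Vanderbilt course.

```python
# A minimal sketch of "wrapper" feature selection (greedy forward search),
# the technique behind the course's "wrapper" metaphor. The dataset and
# estimator are illustrative choices only.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

X, y = load_diabetes(return_X_y=True)
model = LinearRegression()

selected, remaining = [], list(range(X.shape[1]))
best_score = -np.inf

# Greedily add whichever feature most improves cross-validated R^2.
# The search "wraps around" the learner, consulting it at every step.
while remaining:
    scores = {f: cross_val_score(model, X[:, selected + [f]], y, cv=5).mean()
              for f in remaining}
    f_best, s_best = max(scores.items(), key=lambda kv: kv[1])
    if s_best <= best_score:
        break  # no remaining feature helps the wrapped learner; stop
    selected.append(f_best)
    remaining.remove(f_best)
    best_score = s_best

print(f"Selected feature indices: {selected} (CV R^2 = {best_score:.3f})")
```

The analogy is loose but apt: just as the feature-selection wrapper surrounds a learner and uses it to decide what matters, the in-class wrapper surrounds the MOOC and uses local interaction to draw out what is most salient for the local students.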

Data Collection and Analysis

In order to explore student experiences learning in this wrapped course, a focus group was conducted with the students during one of the weekly class sessions just after the MOOC ended. The focus group, with all 10 students participating, was conducted during the first half hour of class that day. Students were informed that the focus group was part of a research project exploring hybrid teaching models and that their instructor was interested in their feedback on the course. The focus group was audio recorded and transcribed for later analysis. An informal and de-identified summary of the student remarks was shared with Fisher shortly after the focus group, before the end of the semester.

Later in the semester, students were asked to complete the standard end-of-course evaluation forms used widely at the University. Students' responses to two holistic, Likert-scale questions on these forms are discussed below. Additionally, a few weeks after the course had concluded (after winter break), students were asked by Fisher to complete a post-course survey, which was designed by the researchers to further explore some of the themes that emerged from the focus group. The survey consisted of 14 Likert-scale questions and three open-ended questions, and was taken by the students anonymously. Only five of the 10 students in the course completed the survey, yielding a 50% response rate.

Qualitative data analysis for this study involved the constant comparative method (Strauss & Corbin, 1998) and the development of case studies (Yin, 2003). During the initial phase, the transcripts of the focus group and students' responses for the open-ended survey questions underwent line-by-line coding in order to establish categories and subcategories related to students' experiences and views of the Machine Learning class and the wrapper approach. These overarching themes were triangulated (Strauss & Corbin, 1998) with Fisher's perspective as the course instructor. During this iterative process, the researchers met regularly to discuss the emergent categories, refine themes, and connect ideas.

Findings

Overall, students' response to the wrapper approach in the Machine Learning course was enthusiastic. Students described Ng's lecture videos as effectively designed, clearly presented, and informative; they described the MOOC as generally useful for self-paced learning. The students did not engage actively in the online community of peer learners created through the MOOC, preferring to interact with the local learning community provided by the on-campus component of the course. Although their overall response to the wrapper approach was positive, students pointed to challenges in integrating the online and face-to-face components of the course. Student perspectives on these issues are described in the following section.

Value of Self-Paced Learning

According to students, the major advantage of the MOOC over a traditional lecture-based course was its greater flexibility, customization, and accessibility, which students saw as encouraging structured self-paced learning.

Students valued the flexibility offered through the MOOC, which allowed them to watch the weekly video lectures at their own pace and on their own schedule. As one student described:

"I really, really like the absorbing information on your own time at your own speed, and through this sort of video format with someone that you know is a really good lecturer, has really carefully prepared these topics, and I think that's much more efficient [than traditional lectures]." (Focus group transcript, November 14, 2012)

Along with finding "being able to [watch videos] on your own schedule" to be "very valuable," students also reported that the videos' shorter length, typically between five and 15 minutes, helped them keep their attention focused and better digest the lecture content (Focus group transcript, November 14, 2012).

Various features of the online platform also allowed students to customize the way in which they viewed lecture videos, which they found more efficient and conducive to learning. For example, one student explained how the variable viewing speed, captions, and embedded quizzes helped to "make [Coursera] a wonderful learning experience":

"I love the way Coursera is set up, and that you can kind of set your own schedules, watch it when you want. In addition to being able to watch at 2X [double speed], you also have captions throughout, so 2X plus captions makes it really easy to understand what's going on. And they also have questions based throughout the videos, and quizzes and homework assignments through there also, to totally keep you fully integrated with what's going on, and makes it a wonderful learning experience." (Focus group transcript, November 14, 2012)

Another student described the MOOC lecture videos as "basically the best thing ever," having watched the online lectures at "twice the speed," which helped the student "stay focused" and "feel like [he/she] got a lot more out of the material" (Focus group transcript, November 14, 2012). In addition to being able to speed up video playback and customize features, students also found the almost immediate feedback on quizzes and programming assignments to be helpful. One student explained that this "instant feedback" allowed him/her to gauge his/her understanding and "make changes" accordingly.

Although students believed the flexibility of the MOOC's self-paced environment to be effective, they also found it challenging to stay on schedule. One student explained, "You have to be very disciplined to make sure you're keeping up on the material. If not, you'll find that you're trying to play catch-up a lot of times" (Focus group transcript, November 14, 2012). Another student, however, found that the self-paced environment enabled him/her to work ahead. Despite these differences, students described the face-to-face sessions with Fisher as helping to keep them on track with the online material.

Local vs. Global Learning Communities

Although students participated regularly in the Machine Learning MOOC to complete and submit assignments – for example, the programming assignments and quizzes, also submitted to Professor Fisher – they did not actively participate in either the Coursera discussion forums or the study groups formed online. Students cited time constraints as the main reason for not participating more actively in the online discussion forums. Instead, they used the discussion boards to check for course errata or to quickly troubleshoot questions or problems, but tended to ask questions among their local peers.

Additionally, students found the discussion boards helpful for solving problems they encountered, since other students shared strategies and solutions pertaining to those problems. Although no students described posting a question on the discussion boards themselves, students did describe the forums as useful for learning about "other people who were having the same problem" and applying those students' solutions. One student reported, "I knew that if I was stuck on something, thousands of other students were trying to do the same thing. In all cases I could find my specific questions in the online forums" (Survey response, January 19, 2013). Another student affirmed, "Whenever I had trouble on an assignment, I could almost always just go to the forum and look at the answers provided by people who had already run into similar stumbling blocks" (Survey response, January 17, 2013).

Instead of utilizing the online discussion boards, students preferred to ask questions about and discuss course content during the face-to-face class sessions. As one student explained in the focus group:

"I think when I had a question, I tended to ask the other people in here, before I would probably ask it on the discussion board. I mean, me and [another student] would talk before class about some of the material review." (Focus group transcript, November 14, 2012)

Students liked the structure of the wrapper format because it opened up space for productive class discussions related to the content. They also described in-class discussions as valuable for generating new ideas and new research projects. As one student described in the focus group:

"One of the things I liked is that, since you did the lecture material at home whenever you had time for it, it saved the class time for discussion. And so we didn't always discuss stuff that was exactly following along with the course, but whenever we did, I found that a lot more helpful." (Focus group transcript, November 14, 2012)

Another student expressed a similar view: "So, I really liked doing that [online content] sort of outside [of class], and then coming in and sort of like taking all of the knowledge that supposedly you sort of download into your brain and apply." A third student described class time this way: "it's more like you ask questions, you learn." (Focus group transcript, November 14, 2012)

Interestingly, three of the five responses to the online survey question asking how to improve the course suggested even more discussion of the MOOC material during the face-to-face class meetings. One student explained that discussions were valuable because they facilitated "instant feedback from [the] instructor and classmates" (Survey response, January 28, 2013), and another explained:

"I would recommend that you discuss more of the Coursera material in the class. I don't think you should give a repeat lecture of the material, but rather spend some time talking about the methods presented and the main ideas of the methods. This was done in our class to a degree, but I would like even more discussion from Coursera." (Survey response, January 19, 2013)

In the focus group, students also suggested more in-class discussion of the material presented in the MOOC. These suggestions included "short discussion for the first 15 minutes of class to ask questions or consolidate ideas before moving on" and more "discussion of applications" of the online content (Focus group transcript, November 14, 2012).

Misalignment between Face-to-Face and Online Components

According to students, one challenge in this offering of the on-campus Machine Learning course was that the topics covered in class did not always line up, on a week-to-week basis, with the material covered in the video lectures. Students mentioned that they would have preferred a greater degree of alignment between the online and on-campus offerings, so that the in-class material would more directly address, and expand upon, the topics covered online. As one student explained in the focus group:

"I felt like the topics we covered in class – because we'd read some like outside papers – they didn't line up very well with a lot of the online material. I mean, not that it wasn't valuable stuff, but it seemed kind of disjointed to me." (Focus group transcript, November 14, 2012)

The misalignment was particularly problematic for students in terms of the research papers discussed in class. One student commented that the information in the papers was presented in a "less structured format" than the information in the MOOC materials, making the papers seem "less accessible." However, as another student pointed out, the research papers required a "different kind of learning" than the highly structured video lectures. And as that student described, although the papers raised more questions than the online lecture material, the face-to-face sessions provided a space for discussion.

Students emphasized that they were new to machine learning and reported feeling ill-prepared and lacking the context to adequately understand the papers. As beginners, they would have preferred to read papers more directly connected to the video lectures and material covered online. They suggested supplementing the readings with review articles assigned before each paper, or with an outline or key points to guide the assigned reading. Even though there was a consensus among students that the papers were challenging, they described the reading in terms of applying knowledge, an exercise in "Can you get something from it?", that is, the real-life negotiation of meaning. And, in fact, one student reported learning to read machine learning papers over the course of the seminar, despite the challenges:

"One thing I was just going to say about the papers is for me, they were kind of a bitter pill to swallow. I went through and I read these papers, but looking back, I'm glad I did it because I feel like I can go to a machine learning paper, and I can read it, and I won't be as intimidated by it because I've kind of struggled through it all semester reading these things. So, I feel like I'm in a better place now than I was before I started this course." (Focus group transcript, November 14, 2012)

Student Perceptions of the Instructors

Valuing both Ng's and Fisher's contributions, students viewed each as having different roles in the Machine Learning course. Overall, students perceived Ng, a "world-renowned researcher and teacher," as the lead lecturer of the course and explained that they found his teaching style to be effective (Focus group transcript, November 14, 2012). Students explained that "he did a really good job with the course" (Focus group transcript, November 14, 2012). Specifically, one student pointed out: "Andrew Ng does a great job of teaching the skills necessary and highlighting potential problems" (Survey response, January 17, 2013).

In contrast, students perceived Fisher's role in the face-to-face sessions as that of a "facilitator." They described him as following up on their work in the MOOC, explaining concepts and providing background for the papers, and leading class discussions. In the focus group, one student explained:

"I thought he [Fisher] was a facilitator, and he would try to facilitate discussions. He would introduce papers for us to read, and then just kind of follow up and make sure we're doing the Coursera stuff by having us submit everything to him each week. That's the word that comes to my mind." (Focus group transcript, November 14, 2012)

Another student built on this comment about Fisher's role as facilitator:

"He did a really good job in facilitating the discussion of the research papers, I thought. And he made sure that everybody talked, even when we didn't want to. He would tease something out of us to get us to talk about the paper and what we thought or what we didn't understand." (Focus group transcript, November 14, 2012)

As noted above, students in the course were asked to complete the University's standard end-of-semester course evaluation. Six of 10 students responded to the two holistic questions: "Give an overall rating of the instructor" and "Give an overall rating of the course." Each question had an average response of 4.17 (on a 5-point scale, with 3 being average, 4 being very good, and 5 being excellent), with a standard deviation of 0.68. These ratings were comparable to those for Fisher's Spring 2012 Machine Learning course, in which students viewed Ng's lectures but the rest of the MOOC was not used. Before 2012 (Spring and Fall), the last offering of the Machine Learning course by Fisher (or any instructor) was in the Spring of 2006. That offering, occurring well before MOOCs were available, was taught using a more traditional face-to-face approach. The average end-of-semester ratings of instructor and course in 2006 were 3.83 (standard deviation: 0.89) and 3.66 (standard deviation: 1.11), respectively, with six of six students responding. These data, although based on small samples, indicate that the hybrid course of 2012 was somewhat better received than the more traditionally taught course of 2006. While suggestive only, the increase in means and the decrease in standard deviations from 2006 to 2012 are measures that warrant continued tracking in future wrapper courses.
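To illustrate why these comparisons are suggestive only, a Welch's t-test can be computed directly from the summary statistics reported above. This check is not part of the original study, and it assumes the reported values are sample standard deviations.

```python
# Back-of-the-envelope significance check using only the summary statistics
# reported above (n = 6 respondents in each offering). Not an analysis from
# the original study; assumes reported values are sample standard deviations.
from scipy.stats import ttest_ind_from_stats

# Instructor rating: Fall 2012 vs. Spring 2006
instr = ttest_ind_from_stats(4.17, 0.68, 6, 3.83, 0.89, 6, equal_var=False)
# Course rating: Fall 2012 vs. Spring 2006
course = ttest_ind_from_stats(4.17, 0.68, 6, 3.66, 1.11, 6, equal_var=False)

print(f"instructor rating: t = {instr.statistic:.2f}, p = {instr.pvalue:.2f}")
print(f"course rating:     t = {course.statistic:.2f}, p = {course.pvalue:.2f}")
```

Run as written, neither difference approaches conventional significance thresholds, consistent with the paper's caution that these measures warrant continued tracking rather than interpretation on their own.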

Discussion and Conclusion

While these sample sizes are too small to support strong conclusions about the efficacy of the authors' wrapper design, the experience is suggestive and can guide research going forward. This section first reviews key observations from the case study, then introduces the beginnings of a categorization scheme for the kinds of coupling that can arise between the online and in-class components of a blended course. The present study is then framed within this nascent categorization, and an argument is put forward that customizing a wrapper around parts of multiple MOOCs, and other online resources, can both leverage the advantages of MOOC platforms and soften the design constraints that stem from adopting a MOOC en masse.

MOOCs as Learning Resources

It is clear that the students in this Machine Learning course found the online lecture videos provided by the MOOC to be useful, thanks to both their content and their form. Less experienced students (say, first-year undergraduates) might not find online lecture videos, with their lack of instructor-student interaction, as useful; but at least in this teaching context, the online lectures were a valuable resource for the students.

Interestingly, "outsourcing" the lecture component of the course to the MOOC instructor did not diminish the students' view of the on-campus instructor as an effective teacher. They noted that Fisher's role was changed from a lecturer to a facilitator, but the students had an overall positive view towards Fisher and Ng. This is perhaps not surprising given the greater (graduate) experience level of the students in this course. It is also possible that Fisher's inclusion of research papers he selected helped students see him as an expert, even if he was not fulfilling that role in the traditional way of lecturing. Less experienced students (again, consider first-year undergraduates) might not place as much value on the facilitator role taken by the on-campus instructor.

Moreover, it is possible that having two instructors, with different points of view on the course content, helped the students better understand debates within the discipline. Again, expert differences could be challenging for underclassmen and/or for students in fields, such as literature or history, where a single "right" answer is not typically mandated; however, they could also provide a useful tool for helping students move from what Kuhn (1992, pp. 167-168) describes as "absolutist" or "relativist" modes of thought to more "evaluative" modes as they grapple with the observation that experts in a field can and do disagree. Though there were no overt disagreements between the instructors on the material that was covered, Fisher's inclusion of material unrelated to the MOOC coverage probably reflected his different prioritization of material; differing prioritizations among machine learning researchers and practitioners were themselves a topic for some in-class discussion.

Also valuable, at least for some students, were the discussion forums provided within the MOOC. Their use of the online learning community, albeit limited to selective reading and no posting, points to the value of the "M" and the "C" in the acronym MOOC: with thousands of other students ("massive") working through the same material at the same time ("course"), it was highly likely that any difficulty encountered by one of Fisher's students was raised and addressed on the forums.

Although the present case study's results indicate that MOOCs can serve as useful learning resources as part of a blended course, student comments in the study also point to the design challenges involved in wrapping a face-to-face course around a MOOC. The misalignment that students perceived between the online and face-to-face components of the wrapped course speaks to the recommendation that students' work online must be made clearly relevant to their work in the classroom, and vice versa (Babb et al., 2010; Gilbert & Flores-Zambada, 2011; Toth et al., 2010). This design challenge seems particularly difficult when building a blended course around a MOOC, given that the online component is relatively fixed, potentially inviting a schism between the online and face-to-face components of the course. This design challenge is explored in the following subsections.

Coupling between Online and In-Class Components

The authors' study suggests that hybrid courses are characterized and distinguished by the coupling that occurs between the online and face-to-face components, as well as by the cohesion of the hybrid course as a whole. Coupling refers to the kinds and extent of dependency between the online and in-class components of a hybrid course, whereas cohesion refers to the relatedness of the course content overall.

There was a relatively low degree of coupling (or loose coupling) in Fisher's course, by the instructor's design, and to the apparent dissatisfaction of some students. The factors behind the low-coupling design were: (1) there were material and skills that the on-site instructor wanted to cover that were not covered by the MOOC, with limited synergies possible; (2) the on-site instructor, a machine learning expert himself, felt that the MOOC modules were excellent and self-contained; (3) those modules were certainly within the grasp of the graduate students taking the course (indeed, class assessments confirmed this); and (4) the instructor needed to maintain a sustainable workload (this was an overload class), and greater coupling generally requires greater time and effort (Aycock et al., 2002). Under (1), the skills at issue are those of reading and understanding journal papers published in the literature, skills in which graduate students must become practiced.

The coupling that did exist in the course involved limited in-class discussion of MOOC material as it related to some of the readings. These interactions, in which part of a class session was used to synthesize across readings and MOOC lectures, were perhaps closest to a traditional flipped classroom. The authors call this subject coupling, because subject matter is shared across the online and face-to-face components of a course. While most periodic (e.g., weekly) assessments were done through the MOOC, Fisher did give a weekly quiz on the readings, and in some cases these quizzes drew upon both reading and video material (e.g., students received an in-class quiz that asked them to combine concepts from the reading on regression trees with the MOOC material on multivariate regression).
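As a concrete, hypothetical illustration of the kind of synthesis such a quiz might probe, the sketch below fits both ingredients, a regression tree and a multivariate linear regression, to the same synthetic data. The data, settings, and framing are invented for illustration and are not taken from the course's actual quiz.

```python
# Illustrative contrast between the two methods the in-class quiz combined:
# a regression tree (from the readings) and multivariate linear regression
# (from the MOOC). Data and hyperparameters are arbitrary choices.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))                 # two predictors
y = np.where(X[:, 0] > 0, 2.0, -2.0) + 0.5 * X[:, 1]  # step plus linear trend

tree = DecisionTreeRegressor(max_depth=3).fit(X, y)   # axis-aligned splits
linear = LinearRegression().fit(X, y)                 # one global hyperplane

# The tree recovers the discontinuity at X[:, 0] = 0 that a single
# hyperplane must smooth over; the linear model captures the global trend.
print("regression tree R^2:        ", round(tree.score(X, y), 3))
print("multivariate regression R^2:", round(linear.score(X, y), 3))
```

One natural synthesis of the two ideas, fitting a small regression within each leaf of the tree, is the sort of conceptual connection such a quiz could ask students to articulate.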

The instructor also intended that final projects by students would draw on methods presented through the MOOC and the readings, and that the final projects would also allow students to practice research methodologies, which they were learning about through both readings and the MOOC. This is an example of what the authors call task coupling, since online and face-to-face components contribute to the completion of a task, typically by learning and applying complementary subject content and/or skills. (These categories, subject and task, are not intended to be mutually exclusive; indeed, much of what would be characterized as active and experiential learning would involve both types.)

Implications of Coupling on Student Satisfaction

Student feedback suggests that students would have liked stronger subject coupling between the MOOC and face-to-face components, in which the MOOC material is reviewed in class. It remains an open question whether students would have been equally satisfied with task coupling had it been distributed throughout the semester (e.g., regular in-class activities in which students applied what they learned in the MOOC) rather than reserved for the project at the end of the semester.

The previous offering of the Machine Learning course, in the Spring 2012 semester, offers a contrast here. As noted earlier, during that semester Fisher incorporated most of the online lecture videos from the Stanford MOOC into the course, but did not require students to participate in the MOOC itself. Indeed, the MOOC was not offered at that time; only the archived lecture videos were available. (This points to another design constraint in wrapping a course around an external MOOC: the times at which the MOOC is offered might not align well with the local academic calendar.) The Spring 2012 offering of the course had higher subject and task coupling, since the online videos were addressed more directly during class (including through quizzes) and the student projects were started earlier in the semester. Even though students' end-of-semester ratings for the Spring and Fall 2012 offerings were comparable, none of the Spring students mentioned any kind of misalignment between the online and face-to-face components of the course. This experience offers some evidence for what seems a natural conclusion: higher coupling results in fewer student concerns about misalignment.

Despite some student discomfort with it, the low-coupling approach nonetheless resulted in a very satisfactory class as measured by end-of-semester evaluations. Even so, creative approaches to course redesigns that more tightly couple the online and in-class components are certainly of interest. Below, plans are described for a blended approach that draws on multiple online sources, including MOOCs, which may have positive implications for coupling.

Cohesion and Coupling

The low coupling design of the Fall 2012 Machine Learning wrapper contrasts with the holistically designed, highly coupled, blended pre-MOOC courses surveyed earlier. As noted, the instructor of a wrapper can craft in-class activities that more significantly connect with the existing MOOC, through subject, task, and other forms of coupling; however, learning and teaching goals and scope necessarily guide and constrain course design.

In addition to coupling between online and face-to-face components, the content cohesion of a blended course, or of any course for that matter, must also be addressed. Indeed, the degree of course cohesion will influence the degree and types of coupling that are most natural within a course. Fisher's Machine Learning course was a survey course, designed to cover a wide range of machine learning methods, many of which are quite disparate. As such, the course's cohesion was low compared to, say, Ng's stand-alone MOOC, in which topic choice and scaffolding created a strong sense of synergy. Thus, in blended environments there can be not only low coupling between the online and face-to-face components of a course, as in Fisher's wrapper, but also low coupling between the face-to-face modules, or between the online modules. Indeed, to the extent that wrapper courses differ in the degree of scaffolding provided for materials in the online and on-campus components, additional scaffolding of materials in the on-campus classroom could increase students' perception of cohesion, even without incorporating additional forms of coupling.

Furthermore, the Fall 2012 wrapper combined a graduate seminar course (which included journal readings), delivered through the in-class component, with a more structured, lecture-based course delivered through the online component. Including both components in one course was important in the authors' setting, but the very different modalities for learning may have been more responsible for perceptions of schism than the hybrid online and face-to-face structure per se. Moreover, student perceptions of schism may have been magnified because "the two faces" of the two course components (Professors Fisher and Ng) were different. An open question is whether students would have been as concerned with schism between content in a traditional, one-instructor, entirely face-to-face, low-cohesion survey course as they were in the authors' wrapper version of a survey. Generally, student expectations regarding a wrapper should be characterized and addressed, because having multiple instructors in multiple modalities is outside the experience of most students.

Finally, this paper has introduced only the barest categorization scheme for coupling and cohesion, but the authors expect that it can be usefully expanded and deepened. For example, subject coupling (across or within modalities) may be broken down into repetitive (or reinforcing) treatment of the same material, or treatment connected by prerequisite relationships, with some material building on other material. Likewise, forms of task coupling might be further distinguished by whether modules address complementary subject content and/or complementary skills.

Customization and Other Future Work

The Fall 2012 wrapper used the totality of the Stanford MOOC, but customization strategies might instead wrap a course around only part of a MOOC or, more ambitiously, around parts of multiple MOOCs. Opportunities to explore this latter type of customization continue to emerge as the Coursera platform expands; for example, the platform now hosts the lectures of a second Machine Learning MOOC (Domingos, 2013), from the University of Washington. While this course has some intersections with the Stanford MOOC, it also overlaps with the additional content that Fisher included in his wrapper. A next step would be to design a wrapper around two or more MOOCs, with the instructor selecting and mixing lectures and assignments from each MOOC, as well as using other online content, some perhaps even produced by the instructor of the wrapper. This is an exciting possibility, and one that does not require that a MOOC be adopted in its entirety, as is. Currently this process of selection and mixing is technically easy, and such use cases may further drive MOOC providers to design for piecemeal use, accelerating customization and a co-evolution of online and blended course designs. Indeed, some recent ideas in this area highlight the possibilities opened by "mixable" elements (Caulfield, 2012b). In any case, it is expected that a process of mixing online resources will require greater attention to course design, perhaps resulting in certain kinds of coupling between online and in-class components, thus reducing perceptions of schism.

In general, the focus of this paper has been the design of in-class components to complement existing online components (a MOOC, in this case study). However, the more typical perspective of designing online components to complement in-class components is equally important. When the online components are MOOCs, though, the question becomes novel: how should MOOCs be designed to best take advantage of in-class component designs? More generally, how can MOOCs be designed to best leverage differently designed local learning communities?

Finally, drawing on the findings of earlier studies (Aycock et al., 2002; Mehaffy, 2012), the authors believe that greater customization will also lead to a greater realization among instructors that they are members of instructional communities, further promoting opportunities for collegiality and collaboration among instructors and across disciplines. As members of such communities, the authors expect that in many cases instructors of wrapper courses will create and add content to the world's repository as well, content which may in turn be picked up by students outside the wrapper. While Fisher did not create and contribute video himself for the Fall 2012 Machine Learning course, he has done so for other courses, and for his graduate course in artificial intelligence he requires students to create and post content. Fisher (2012) reports that students taking a MOOC in artificial intelligence hosted by another institution visited his YouTube channel for clarification on some concepts, posting a link to the channel on the MOOC discussion board and thereby bringing still other visitors.

While such data are anecdotal, they suggest the fascinating possibility of characterizing student and faculty interactions beyond any single MOOC, including interactions across MOOCs and across media. Longitudinal studies of the nature, extent, and evolution of ad hoc communities, perhaps MOOC-centered but not restricted to a single MOOC, would undoubtedly require data mining across larger spheres of Web interactions than is currently easy to do. Nonetheless, the possibilities for understanding and leveraging student patterns in seeking remedial and advanced material, instructor incentives for creating and posting material, and the movement of people between student and teacher roles are exciting and within current technical abilities, if only the data could be accessed.

References

Aycock, A., Garnham, C., & Kaleta, R. (2002). Lessons learned from the hybrid course project. Teaching with Technology Today, 8(6). Retrieved from http://www.uwsa.edu/ttt/articles/garnham2.htm

Babb, S., Stewart, C., & Johnson, R. (2010). Constructing communication in blended learning environments: Students' perceptions of good practice in hybrid courses. MERLOT Journal of Online Learning and Teaching, 6(4), 735-753. Retrieved from http://jolt.merlot.org/vol6no4/babb_1210.htm

Baldwin, R. G. (1998). Technology's impact on faculty life and work. New Directions for Teaching and Learning, 76, 7-21. doi:10.1002/tl.7601

Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2012). Interactive learning online at public universities: Evidence from randomized trials. New York, NY: Ithaka S+R. Retrieved from http://www.sr.ithaka.org/sites/all/modules/contrib/pubdlcnt/pubdlcnt.php?file=http://www.sr.ithaka.org/sites/default/files/reports/sr-ithaka-interactive-learning-online-at-public-universities.pdf&nid=464

Caulfield, M. (2012a, November 9). How Coursera could walk the talk about MOOC-wrapping [Web log post]. Retrieved from http://mikecaulfield.com/2012/11/09/how-coursera-could-walk-the-talk-about-mooc-wrapping/

Caulfield, M. (2012b, December 11). Threads and the wrappable MOOC [Web log post]. Retrieved from http://www.hapgood.us/2012/12/11/threads-and-the-wrappable-mooc/

Domingos, P. (2013). Machine learning. Retrieved July 25, 2013, from https://www.coursera.org/course/machlearning

Fisher, D. H. (2012, November 6). Warming up to MOOC's [Web log post]. Retrieved from http://www.chronicle.com/blogs/profhacker/warming-up-to-moocs/44022

Gilbert, J. A., & Flores-Zambada, R. (2011). Development and implementation of a "blended" teaching course environment. MERLOT Journal of Online Learning and Teaching, 7(2), 244-260. Retrieved from http://jolt.merlot.org/vol7no2/gilbert_0611.htm

Guthrie, K. M. (2012). Barriers to the adoption of online learning systems. EDUCAUSE Review, 47(4), 50-51. Retrieved from http://www.educause.edu/ero/article/barriers-adoption-online-learning-systems

Hill, P. (2012, November 1). Online educational delivery models: A descriptive view. EDUCAUSE Review, 47(6), 84-97. Retrieved from http://www.educause.edu/ero/article/online-educational-delivery-models-descriptive-view

Koller, D. (2012, November 7). How online courses can form a basis for on-campus teaching. Forbes. Retrieved from http://www.forbes.com/sites/coursera/2012/11/07/how-online-courses-can-form-a-basis-for-on-campus-teaching/

Kuhn, D. (1992). Thinking as argument. Harvard Educational Review, 62(2), 155-178.

Lage, M. J., Platt, G. J., & Treglia, M. (2000). Inverting the classroom: A gateway to creating an inclusive learning environment. The Journal of Economic Education, 31(1), 30-43. doi:10.1080/00220480009596759

Mangan, K. (2012, October 1). Massive excitement about online courses. The Chronicle of Higher Education. Retrieved from http://www.chronicle.com/article/Massive-Excitement-About/134678/

Mazur, E. (2009). Farewell, lecture? Science, 323(5910), 50-51. doi:10.1126/science.1168927

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. Retrieved from http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf

Mehaffy, G. L. (2012). Challenge and change. EDUCAUSE Review, 47(5). Retrieved from http://www.educause.edu/ero/article/challenge-and-change

Pappano, L. (2012, November 2). The year of the MOOC. The New York Times, ED26. Retrieved from http://www.nytimes.com/2012/11/04/education/edlife/massive-open-online-courses-are-multiplying-at-a-rapid-pace.html

Rodriguez, M. A., & Anicete, R. C. R. (2010). Students' views of a mixed hybrid ecology course. MERLOT Journal of Online Learning and Teaching, 6(4), 791-798. Retrieved from http://jolt.merlot.org/vol6no4/rodriguez_1210.htm

Shirky, C. (2012, November 12). Napster, Udacity, and the academy [Web log post]. Retrieved from http://www.shirky.com/weblog/2012/11/napster-udacity-and-the-academy/

Stone, M. T., & Perumean-Chaney, S. (2011). The benefits of online teaching for traditional classroom pedagogy: A case study for improving face-to-face instruction. MERLOT Journal of Online Learning and Teaching, 7(3), 393-400. Retrieved from http://jolt.merlot.org/vol7no3/stone_0911.htm

Strauss, A. L., & Corbin, J. M. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

Talbert, R. (2012, July 13). Screencasting for the inverted proofs class [Web log post]. Retrieved from http://www.chronicle.com/blognetwork/castingoutnines/2012/07/13/screencasting-for-the-inverted-proofs-class/

Toth, M. J., Amrein-Beardsley, A., & Foulger, T. S. (2010). Changing delivery methods, changing practices: Exploring instructional practices in face-to-face and hybrid courses. MERLOT Journal of Online Learning and Teaching, 6(3), 617-633. Retrieved from http://jolt.merlot.org/vol6no3/toth_0910.htm

Wiley, D. A., & Gurrell, S. (2009). A decade of development ... . Open Learning: The Journal of Open and Distance Learning, 24(1), 11-21. doi:10.1080/02680510802627746

Yin, R. K. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.


