Saturday, November 24, 2018

Use of clicker questions in the flipped lecture

The design and delivery of clicker questions is a key feature of a flipped class. I am getting ready to teach the fourth instance of my flipped genetics course (BIS101) at UC Davis, and as I reviewed the material I reflected on the challenge of the clicker questions. My lecture-discussion on RNA interference is a good example. The lecture objective is to get the students to understand how a genetic screen involving a microRNA gene would work, how one would analyze it, and what conclusions could be drawn from it. In summary, a key moment in the climb up the peak of Bloom's Taxonomy of Learning.  

At this point in the course, students have learned Mendelian genetics, the central dogma, and prokaryotic gene regulation, and are learning eukaryotic gene regulation. In preparation for the lecture-discussion, students have viewed online material featuring both descriptions and videos of RNA interference. In addition, I provide a short seminar by Gary Ruvkun, which, while very clear, turns out to be very challenging for many of my students.  

The objective of the two-hour lecture-discussion is for students to understand how an imaginary genetic screen involving lin-4 and lin-14 would have worked. For those who are not worm connoisseurs, lin-4 is a mutation affecting a miRNA that targets the mRNA of lin-14, whose product promotes larval cell growth. lin-4 mutants have excess larval cells; lin-14 mutants fail to develop certain larval cells. lin-14 is epistatic over lin-4. The two mutations were isolated independently, but in this class I openly pretend that lin-14 was isolated as a modifier of lin-4.

A cartoon depiction of a fictional genetic screen in which the neotenic lin-4 phenotype is suppressed by the progeric lin-14 phenotype
I have uploaded a video of the whole lecture-discussion (see below). The video demonstrates the importance of carefully planning the Socratic process of teaching. It displays both success and (partial) failure. Success is illustrated by a series of clicker questions on how a miRNA and its target interact, starting here. As illustrated by the response to this clicker question, students responded well and grasped the concepts. 
One of several clicker questions exploring the concept of how a miRNA and its target interact to provide a new regulatory outcome. Students did very well.

Failure is illustrated by the difficulty in getting students to work through pathway analysis starting with an F2 segregation pattern consistent with recessive epistasis. The problematic phase starts here.

Many students were stumped. The problem with this clicker question is not intrinsic to it: it is a good question. However, in the instance demonstrated in the video, it should have been preceded by a refresher on epistatic analysis, for example, a few easier clicker questions reviewing it.  

The problem is clear: I jumped into the 9:3:4 F2 ratio and the connected pathway without "warming up" the students. The transition was too abrupt. We had covered this type of analysis a few weeks before and students had done well. I should have worked a quick example of epistasis, reminding students of the symbols (such as --| for repression and --> for promotion) and of the strategy used to test hypothetical pathways using mutant phenotypes. Also, this question was delivered in the second hour, and fatigue may have contributed as well.
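For readers who want to check where the 9:3:4 ratio comes from, it can be verified by brute force. This is a minimal sketch, using generic locus names A and B as stand-ins (they are not the actual lin-4/lin-14 alleles): enumerate the 16 gamete combinations of a dihybrid F2 and let the homozygous recessive class at the epistatic locus mask the other locus.

```python
from itertools import product

def f2_phenotypes():
    """Enumerate the 16 gamete combinations of an AaBb x AaBb cross.

    Recessive epistasis: bb (homozygous recessive at the epistatic
    locus) masks whatever phenotype the A locus specifies, collapsing
    the classic 9:3:3:1 dihybrid ratio into 9:3:4.
    """
    gametes = [a + b for a, b in product("Aa", "Bb")]  # AB, Ab, aB, ab
    counts = {"A_B_": 0, "aaB_": 0, "--bb": 0}
    for g1 in gametes:
        for g2 in gametes:
            a_dominant = "A" in g1[0] + g2[0]
            b_dominant = "B" in g1[1] + g2[1]
            if not b_dominant:       # bb is epistatic: masks the A locus
                counts["--bb"] += 1
            elif a_dominant:
                counts["A_B_"] += 1
            else:
                counts["aaB_"] += 1
    return counts                    # 9 : 3 : 4 out of 16
```

Running `f2_phenotypes()` returns the 9, 3 and 4 classes out of 16, the segregation pattern the clicker question started from.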

All considered, this lecture-discussion was not a flop. The students understood much of it and were reminded that genetic experiments are carried out to understand cellular mechanisms. Following this lecture, they used online quizzes and material to review the problems and eventually did well in the exams.

The moral of the story is that the clicker questions must be designed and delivered very carefully. If done properly, the process is very satisfying because students engage and learn to extrapolate simple knowledge and analyze scientific evidence to derive mechanisms.  

Sunday, April 9, 2017

Workload in the flipped course

The problem

Figure 1. Amount of work (2017)
My flipped course requires online preparation, taking online quizzes, and participation in the lecture-discussion. Is this too much work? Both my 2016 and 2017 students thought so by a considerable margin (Fig. 1). How much work is expected?

According to the UC Davis catalog: "Units of credit are assigned to courses based on 1 unit of credit for three hours of work by the student per week. Usually this means one hour of lecture or discussion led by the instructor and two hours of outside preparation by the student." For a four-credit course such as this, that works out to eight hours of outside preparation per week. Eight hours × ten weeks in the quarter = 80 hours. Canvas-logged time averages 140 hours. If these are real work hours, then the coursework might be too much.

In all fairness, the course structure focused students on genetics and this was acknowledged by many students. From the anonymous reviews:

  • "I really liked the amount of homework required for this class.  It really forced me to put in the effort and learn the material before exams.  Plus I liked that the homework broke the material up into smaller chunks. "
  • "I enjoyed the fact that the course made me study everyday compared to the other courses. It really promoted active learning and really forced me to use basic concepts to answer difficult questions. "
  • "I liked the flipped course. It was a lot of work but it prepared me and forced me to keep up with the class. And it made skipping lectures not an option."
Figure 2. Time on Canvas vs. course grade
A minimum amount of work is required. Interestingly, I could not find any relationship between the amount of extra work and the grade. For example, in the A-C range there is no relation between Canvas logged time and grade (Fig. 2). This suggests that either students vary in the amount of work they need to master the material, or that many students do unnecessary or unproductive work.
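For readers who want to run the same check on their own gradebook, the relationship is a one-liner in Pandas. A minimal sketch with invented numbers (the column names and values are illustrative, not my actual Canvas export):

```python
import pandas as pd

# Hypothetical gradebook: hours logged on Canvas and final percent grade.
df = pd.DataFrame({
    "canvas_hours": [90, 140, 120, 200, 160, 110, 180, 130],
    "grade_pct":    [92, 88, 85, 84, 90, 78, 80, 86],
})

# Pearson correlation between logged time and grade; a value near zero
# is consistent with "no relationship" among the A-C students.
r = df["canvas_hours"].corr(df["grade_pct"])
```

A scatter plot of the same two columns (as in Fig. 2) tells the story at a glance; the correlation coefficient just quantifies it.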

Solution 1

In 2017 I implemented a 20% drop policy. At the end of the course, the lowest 20% of test scores was reset to the student's corresponding point mean for the remaining 80%. Dropping and prorating was done independently for four categories: midterm exams, practice quizzes, module-end quizzes and lecture clicker sessions. Students love it and so do I.

It takes huge pressure off them and off me. Students can miss, for example, 5 of 25 lectures, or 3 of 15 MEQs, and still be fine. I avoid all the assorted overhead that would be required if students were held to 100% compliance. Baja trip for your cousin's February wedding? Not an official excuse item, but some students would plead incessantly for mercy nonetheless. This way, no problem: Vaya con Dios! Fight with your significant other screwed up your midterm preparation and performance? No problem. Got drunk and could not wake up on midterm day? You get the idea.

Implementing solution 1

Figure 3. Effect of the 20% drop and prorate action on
clicker scores. Each dot represents a student.  
Coming up with the concept was far easier than implementing it. Consider the challenge: for each student you have, for example, 25 clicker scores. Each clicker session has a different maximum value, ranging from 12 to 26, depending on the number of clicker questions in any given lecture. For each student you have to find the lowest 20%, calculate the mean for the remainder, and prorate correspondingly. Not a calculation you want to do with paper and pencil. I am not sure it is possible to do it with Excel either.

My solution was to use Python and Pandas. It worked well: Fig. 3 displays the plot of the adjusted score vs. the original score. Python is a programming language and Pandas is a Python library for processing large tables. If you have some programming knowledge, it is fairly straightforward to set it up. I plan to put the program (a Jupyter Notebook) on Github. In the meantime, you can email me if you want the program.
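To give a feel for the computation, here is a minimal Pandas sketch of the drop-and-prorate logic (the function and variable names are mine for illustration; this is a sketch of the approach, not the actual notebook I used):

```python
import numpy as np
import pandas as pd

def drop_and_prorate(scores: pd.DataFrame, maxima: pd.Series,
                     frac: float = 0.20) -> pd.DataFrame:
    """Drop each student's lowest `frac` of sessions (ranked by percent
    score) and replace them with points prorated from the mean percent
    of the remaining sessions.

    `scores`: rows = students, columns = clicker sessions.
    `maxima`: maximum points for each session (they differ per lecture).
    """
    pct = scores.div(maxima, axis=1)               # percent score per session
    n_drop = int(np.ceil(frac * scores.shape[1]))  # e.g. 5 of 25 sessions
    adjusted = scores.astype(float).copy()
    for student, row in pct.iterrows():
        dropped = row.nsmallest(n_drop).index      # lowest-percent sessions
        kept_mean = row.drop(dropped).mean()       # mean percent of the rest
        adjusted.loc[student, dropped] = kept_mean * maxima[dropped]
    return adjusted
```

Ranking by percent rather than raw points is what handles the uneven session maxima; the rest is bookkeeping.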

Solution 2?

Well, solution 1 was not enough because students still complain. I think they have a point. True, you get to drop your darkest moments, but you still have to study all the material if you want to stay afloat. So, the drop is a great stress remover, but the work is still due. How, then, do I lighten up the curriculum? I require my students to study in depth items that are probably not covered in every genetics course, for example, LOD mapping in family pedigrees. It is quite satisfying to watch as eventually they all get it. But that knowledge is gained with some sweat. And if I drop it, they do not learn about LOD.... You see my conundrum? Summary: I am still working on this one.

Poetic justice

For what it is worth, the flipped course is a lot of work for the teacher. According to the old Italian adage "Mal comune, mezzo gaudio" [= common trouble, half a feast], this knowledge should lessen the students' pain. How much work for the teacher? Canvas counted 530 hours. That seems like too many; some were likely logged while my computer was idle. Nonetheless, this suggests that I spent 3x the mean student time on Canvas. A lot of work was required to prepare the online content in 2016. In 2017, additional time was required to fix the modules and the quizzes. Part of the work is off Canvas, such as when new videos are made. Implementing the 20% drop, for example, took me 15 hours (I am a slow programmer). It is an important consideration for would-be flippers: the formula I followed is a bit work-heavy on both sides of the teaching divide. There is lots of room for improvement. Ultimately, it is plausible that teachers' and students' workloads would end up similar to those of traditional courses.

Monday, April 3, 2017

Student response to flipping

Students' opinion before taking a flipped course

In 2016 I polled my students anonymously before the flipped course started. A total of 179 students answered the question "How do you feel about this BIS101 employing the flipped classroom teaching model?". Here is the breakdown of their answers:
  • I feel excited and look forward to this class: 11%
  • I am willing to try it but I am somewhat wary: 40%
  • OK, but I would rather have the old style where the prof lectures: 26%
  • I think it is a bad idea and wish I could enroll in a BIS101 with a regular teaching style: 23%
The roughly 50:50 split probably represents most students' general feelings toward flipped classes well. Although I did not poll the 2017 class, I would expect a similar response.

Students' opinion after taking a flipped course 

Figure 1
Figure 2

After the course was completed, I polled my 2017 class using a Canvas anonymous survey. Here is some of the feedback. Students' responses to "Please indicate the overall educational value of the course" were positive, with 168/196 students ranking the course as excellent or very good (Fig. 1). Faced with the statement "The course activities required me to develop critical thinking skills and apply my knowledge in creative ways," 174/196 students "strongly" or "somewhat" agreed (Fig. 2).  
Students also felt positive about how much they learned: 177/196 students "strongly" or "somewhat" agreed that they learned genetics. Students' opinion on how much they learned was also made clear by the response to the statement "I have learned more in this course than in the average, equivalent UCD course," with which 144/196 students "strongly" or "somewhat" agreed.  

Interest in taking another flipped course

Figure 3
Figure 4
I asked whether they were interested in taking another flipped course, either taught by me or by someone else. The answer can be summarized as "yes" in the first case (Fig. 3) and "maybe" in the second (Fig. 4). The dichotomy between a known instructor and an unknown one suggests that some students are wary of this form of active learning and may accept it once they know what they are getting into.

Comments by students

Student comments in the survey were much more likely to be positive than negative. Below are three of each type that capture the attitude of the corresponding group of students.

Students who like it
"I really enjoyed the flipped course formatting. I felt that although my grades were not the best, I was able to retain the information a lot better as compared to other classes..... I never knew I would like something so much. I finally feel I found my interest subject in science. I've always felt so insignificant as a science major, like I do not belong. But since taking this class, I feel I have found what I am truly interested in." [Grade expected: C]

"I did like the flipped course for the sole reason that it allowed me to stay on top of my studying. Many times, in normal lecture style classes, I tend to procrastinate and leave everything until the end. So, in this class, even if I am swamped with work from my other classes, at least I will know something for the exam." [Grade expected: B]

"Hard to say, I enjoyed my experience in this course. It was a lot of work especially with midterms every 2 weeks and meqs [=online midterm-like quizzes], but it was an enjoyable experience." [Grade expected: B]

Students who hate it
"It's very unfair to make students attend all the lectures AND then watch ~20,30,40 minute lectures outside of class just to understand the material. ....." [Grade expected: A]

"The flipped course simply does not work... There is nothing like old school teacher student interaction that relies very little on technology and online modules. I do not want to learn by myself without being properly taught about the subject and not specificly on clicker questions. It just doesn't work.. I arrived to class eager to learn but I spent most of the class period uncomfortably discussing my wrong option on the clicker with another student beside me that, most of the time, also had no idea what was going on in class lecturewise. Not only is it frustrating, but very discouraging. ..." [Grade expected: D]

"The entire flipped course concept did not work for me. This course could have been better if it were taught in a traditional manner. The online workload is unnecessary and a waste of time. Activity points should only be awarded with clicker questions. " [Grade expected: F]

Amount of work

Figure 5
Students perceived the amount of work in this course to be more than in comparable courses (Fig. 5). This was of some concern to many and probably a point of contention with a minority. It is also of concern to me and I am evaluating how to make the load consistent with the credit hours. I will dedicate a future post to this issue.


I had been warned that teaching a flipped course would cause my student evaluations to plummet. This has not been the case. Rather, the course garnered substantial acceptance with a majority of students satisfied with the learning advantages of flipping.

Making videos for a flipped course

Video design and production strategies

Lecture videos provide useful material for student preparation. After some experimentation, here are my favorite guidelines for their design. 
  1. The movies should be very clear, typically based on slide decks or on whiteboards
  2. They must be short, optimally 5 minutes and no longer than 10, or students' attention will drop
  3. Any method that works for the author is a good method
These are by no means rules, and guideline 3 trumps everything else. How do you design a video? The short answer is that it depends. If you are a strong lecturer, one who is typically clear, you can jot down some notes and take a movie of yourself. Here are two good examples:
  1. Joel Ledford, a colleague of mine at UC Davis, delivers a lecture on LUCA using the UCD E-learning studio
  2. Ed Himelblau, a collaborator of mine at Cal Poly, delivers a lecture on Lac operon regulation
Figure 1. E-learning studio at UCD. The lecturer delivers
the lecture from behind the glass board, writing on it.
I am not sure how much time each video required, but I bet each involved less than four hours including planning, delivering the lecture, filming, and editing. Joel uses a fairly sophisticated facility at UC Davis (Fig. 1). Ed is winging it elegantly with a consumer camera (his cell phone?). The presentations exceed my favorite target time, but are clear and easy to follow. 

I have tried the E-learning studio and I do not think it fits my style. The resulting video is too long, and I find my lecture to be adequate but not as compelling as I would like. You can, of course, edit the videos later: using video editing software such as Camtasia, you can add anything on top of your lecture (see the example in the link above).

If a studio setup such as the one in Fig. 1 is not available, you can set up a simple studio at home. Steve Luck, a colleague at UC Davis, scripts and films his own lectures. See his course introduction for an example. By the way, the description of his hybrid course might also be of interest. 

The Ploid recipe for a lecture video

  1. Establish the learning objectives. What do you want your students to learn? What are the targets and what level of mastery do you aim for?
  2. Plan the lecture. I am very visual and use drawings to outline my message. One could also use a slide deck or a textbook.   
  3. Record the visuals. I scribble using Doceri on an iPad and make a video of the drawings using the Doceri time-track feature. 
  4. Script the lecture. Write out exactly what you will say in 500-1000 words. Then transfer it to a teleprompter program. I use Promptsmart on an iPad. Read it while recording it to a sound file. I use Quick Time on a Mac and a dedicated microphone. You should be able to fit 100 words in one minute.
  5. In Camtasia join the sound and video tracks. The clip speed on the video track allows stretching and compacting specific segments of the Doceri video to fit the sound track.
  6. Add head and tail frames for the title and end acknowledgements. 
  7. Upload to YouTube. You are done. 
I find YouTube to be the most convenient repository. It is reliable, and you can set different privacy levels, such as "unlisted" or "public", depending on whether you want just your class or the general public to see the videos. It also provides extensive "Analytics". The following two videos are examples of this method. 
  • Genetic epistasis. 7min. This is my most popular video to date (April 2017) with ~2500 views a month. 
  • Protein motifs. 3min. Not so popular, but I like it. 
The videos are fun to make. After some practice, I can design and produce them with relative ease. Still, from conception to posting, making a video takes me a minimum of 8 hours. 

Sunday, April 2, 2017

Cheating in the flipped class

Do students cheat in online tests?

I think that cheating in my BIS101 is not a major problem, but I am aware that students can easily cheat when taking an online quiz called the MEQ (Module End Quiz). A student wrote me an email near the end of the course wondering about the cost of honesty:
"I just wanted to bring to your notice that there are a lot of people in class that worked on MEQs and GPQs in groups. This heavily affects the curve for the class because people who got help from others that already finished the quiz had an unfair advantage over people who did not have any help..... I personally know people that have a grade much above the current average solely because of their quiz grades. ..."
GPQs (Graded Practice Quizzes) can be taken as many times, and with as much help, as desired. That is not the case for the MEQs.

MEQ: Module End Quiz

Figure 1. Structure of a module
MEQs are online capstone quizzes at the end of each of the 15 modules in my BIS101 course. Collectively they are worth 63 of the 1000 total points. The students are instructed to treat them as online midterm exams: taken alone, with closed books, off the internet, and with no help. You can see two example MEQs in this Course sample.

A cheating index

I decided to derive a cheating indicator by the following formula:
MEQ_cheat = (% score in MEQs) / (% score in in-class exams)
Figure 2. Cheating potential in online Module End Quizzes.
Scores >1 are consistent with cheating. 
The index compares performance on the MEQs to performance on in-class tests, both exams and clicker quizzes. Plotting the distribution of the index (number of students at each level) yields a bell-shaped curve with the majority of students close to 1. Students whose index is below 1 do poorly on the MEQs and much better on in-class tests. Students above 1 do the opposite. The fact that their success on the MEQs is not matched by corresponding success on the exams suggests that they cheat.

There could be other explanations. Understandably, students could be nervous in class. Time pressure is also relatively lower for the MEQs, for which students are given 90 to 120 minutes.
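In Pandas the index takes one line to compute per student. A minimal sketch with invented numbers (the column names and the flagging cutoff of 1.2 are illustrative assumptions, not my actual analysis):

```python
import pandas as pd

# Hypothetical per-student percent scores in the two test categories.
df = pd.DataFrame({
    "meq_pct":  [88, 95, 70, 99],
    "exam_pct": [85, 80, 75, 60],
}, index=["s1", "s2", "s3", "s4"])

# MEQ cheat index = % score in MEQs / % score in in-class exams.
df["meq_cheat"] = df["meq_pct"] / df["exam_pct"]

# Students far above 1 do much better online than in class.
flagged = df[df["meq_cheat"] > 1.2]
```

With real data, plotting the `meq_cheat` column as a histogram gives the bell-shaped distribution described above.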

The most interesting finding is the relation of the "MEQ cheat index" to overall course performance. The next plot compares the MEQ cheat Index to the final course grade.
Figure 3. Cheating potential vs final course grade. MEQ_cheat > 1 suggests cheating
In the swarm plot each student is a dot, and the clusters illustrate the distribution. The "A" students make the most homogeneous cloud. Progressively, the "B", "C" and "D" students become more dispersed. Surprisingly, the C students form two distinct swarms.


Cheating index smaller than 1
These students perform better on the in-class exams; I doubt they need a stressful environment to perform. One explanation is that they use the MEQ experience to identify and address their deficiencies. This is exactly one of my objectives in designing the module structure: MEQs resemble midterms and provide a reality check without a large exposure to point loss. Alternatively, they are procrastinators who cram before exams.

Cheating index larger than 1
These students perform better on the online exams. It is plausible that reduced stress and the additional time allowed enhance their performance. I prefer, however, the explanation that they cheat: they take the exam in groups or get the answer list from better students. The dichotomy displayed by the C students suggests that half of them are honest and the other half dishonest. If this interpretation is correct, students who are struggling are more willing to cross the ethical boundary of the honor system. B students may include a similar cheating component; their MEQ cheat index may be less extreme because they are simply more likely to do better on the in-class exams. 

Impact of cheating
Altogether the MEQs count for 6% of the final grade. Students who cheat could be 1-2% ahead of their honest classmates. Unfair, but unlikely to matter in the long run. Cheating in other grade components is harder. In a form of poetic justice, students who take the MEQ challenge seriously and learn from it may eventually perform better on the exams. At least, this is what I hope. 

Should I flip my course?

Is that stinkin' ol' teaching method really that bad?

There are many professors who teach effective conventional courses. There is nothing wrong with that, and from the point of view of many teachers and students it can be a win-win approach. From the standpoint of the professor's effort, the old way may also get the most bang for the buck. Profs can teach their favorite topics with a relatively parsimonious effort. Often, they are very good at it and deliver fascinating and highly instructional lectures. Active learning components are frequently included, providing compelling experiences.
Ploid in his BIS101 lecture. Credit: Universal Pictures

From the point of view of the student, "passive learning" is a familiar environment, one in which she is likely adept. Several students in my flipped classes have told me that they learn best by taking notes during a professor's conventional lecture, then revising, supplementing, and organizing the notes afterwards. In conclusion, traditional teaching should not be thrown under the bus. Rather, you should carefully reflect on the costs and benefits of changing teaching methods. How much time will it take? Will the effort required pay back in terms of personal satisfaction, career objectives, and enhanced student learning? In my case, the balance of these factors seems positive, but this may not be the case for others.

Why I decided to flip

My decision to flip was partially serendipitous. Three years ago I had limited knowledge of the topic of flipping and active learning. I liked teaching, although I was (and still am) careful to balance it with an active research program. Although I did not realize it at the time, my flipping activity started with my decision to provide custom video content to explain specific, difficult concepts. For example, how to analyze and derive a restriction enzyme site map, or how to map genes on bacteriophages. Once I made a few of these movies, I decided that they could be effective and that I could make better ones. I applied for intramural funding to pursue this objective. After I got the funds, my interaction with education improvement personnel at UC Davis convinced me to try flipping. 

Why I will continue flipping

On the lecturer skill scale, I am pretty good, but not great. I am lively and fairly clear, but not crystalline. I can go off on a tangent and down the rabbit hole of delicious details before I realize that most of my pack is lost. At the ripe age of my late 50s, I realized that I needed more structure and discipline. I also needed to figure out exactly what I wanted my students to learn; I needed very clear objectives. I am also a junkie for the new and untested, for something that forces me to learn, plan, and create. Flipping fulfilled these needs. It forced me to adopt a clear, preplanned structure in which specific objectives are addressed, while providing an outlet for pent-up creativity. Setting up my flipped course involved overcoming challenges, solving problems, addressing errors, and a lot of work. Nonetheless, I feel that I am probably better at it than at conventional teaching. For this reason, I will continue doing it. 

In conclusion, and in case it has not already transpired, I do not consider myself a flipping guru or prophet. A better characterization would be somebody who, while fumbling through it, has considerable fun, some success, and has learned a few lessons. In future posts, I will further address my work, experience, and outcomes connected to flipping. 

Clickers and discussion in the flipped classroom

Lecture flow

Figure 1. Lecture flow
"Lectures" consist of a short introduction followed by Socratic questions and discussion; ad hoc microlectures address lacunae. With 200 students in the classroom, options for discussion and participation are limited, but not zero. My clicker question and discussion strategy follows online examples and colleague recommendations. Succinctly, it entails posing clicker questions for which points are given only for correct answers. Students have about a minute to figure out the answer by themselves. I count down five seconds to the end and give extra time if students ask for it. After the poll closes, it reopens for a second round of the same question. Students can collaborate in the second round by interacting with neighbors or TAs, or by asking me for clarification. Voting preferences are displayed on the laptop running the clicker base station, showing the degree of comprehension. When comprehension is high, I usually close the question (on the clicker receiver) and then proceed to cold-call a student from a random list to discuss the different answers. When comprehension is low, I cold-call with the clicker receiver still open and work through the answers with the students. This enables students to get at least some clicker points, prevents desperation, and keeps them focused on the question.

Student preparedness

Students are supposed to come prepared by working on the online module. While points and deadlines on online quizzes stimulate compliance, experience and exit polls indicate that not all students come prepared.
Figure 2

Figure 2 illustrates students' responses about their preparedness. It indicates that ~1/4 of the students come to class unprepared. I could ignore this fraction and tailor the lecture-discussion to the compliant rest. I am convinced this would be a bad idea, as it would dig a chasm between student types. My strategy instead is to aim for the middle, trying to keep the top students challenged while carrying along as many as possible.

Clicker question strategy

Clicker questions are a critical component of my course. Good planning and design are important for success. Bad clicker questions can throw the session into near chaos, causing confusion, delay, and student detachment from the lecture objective. The safest question, of course, is a dumb question, such as asking the students to spit back a fact that was just presented; it is also ineffective at teaching. In most lectures I attempt to take students through a Socratic journey that ideally resolves into self-discovery and deeper understanding. This is a riskier strategy, but one that offers potentially higher rewards. I plan to illustrate successes and pitfalls of this in future posts.