Flipped Classrooms

The Flipped Classroom: A Survey of the Research, Jacob Lowell Bishop and Dr. Matthew A. Verleger

Presenter: Shanna Shaked

Review of the literature:

  • The authors identified 24 studies (as of 2012) but eliminated some for various reasons, leaving 13.
  • Only 1 study actually tracked a flipped classroom course from beginning to end against a traditional lecture course; it found that students in the flipped classroom performed better on their exams.
  • Lots of suggestive, anecdotal evidence, but little hard proof that flipped classrooms are better for the students.
  • Flipped classrooms are especially beneficial for large classes. Students need to be provided with interactive video lectures outside of the classroom.
    • reading is reported to be an effective way to learn, but most college students will not read material outside of class
    • interactive video lectures have been shown to be more effective than in-class lecturing
    • students in flipped classrooms tend to score better on homework problems and tests

Is total flipping necessary?

  • No obvious evidence that flipping the entire classroom is necessary. Partial flipping seems to produce positive results as well.
    • Partial flipping --> only flip a few class sessions each quarter

How do you quantitatively measure effectiveness in a flipped classroom? How could we do a controlled experiment at UCLA?

  • use COPUS (the Classroom Observation Protocol for Undergraduate STEM) to compare the percentage of time the teacher is talking to the percentage of time students are talking/working
  • Use two otherwise identical classes and try flipping a few class sessions throughout the quarter
  • compare an engaging lecturer to a flipped classroom (Carl Wieman did this)
  • connect with Education Department people and/or Statistics Department people
  • two problems:
    • we haven't had enough practice using innovative techniques; lecture-style classrooms are more polished
    • the important metric is long-term retention; we need to be able to track the students through upper division courses as well
  • What do we want to measure?
    • Pre/post concept inventories
    • exam scores
    • student attitudes (CLASS, developed at CU Boulder) <-- this might be where we see the biggest gains at first
  • What variables do you need to account for?
    • presentation style (does the instructor use humor, etc)
    • student preparedness (match "twin" students between classrooms)
  • Student evaluations (and scores) may drop at first --> long-term tracking is necessary
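The pre/post concept-inventory comparison listed above is conventionally summarized with the Hake normalized gain, g = (post - pre) / (100 - pre), averaged over matched students. A minimal sketch (the function names are our own, not from the meeting):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake normalized gain for one student; scores are percentages (0-100)."""
    if pre_pct >= 100:
        return 0.0  # no room to gain
    return (post_pct - pre_pct) / (100 - pre_pct)

def class_gain(pre_scores, post_scores):
    """Average normalized gain over matched pre/post pairs for one class."""
    gains = [normalized_gain(pre, post)
             for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Compare, e.g., a flipped section against a traditional one:
flipped_g = class_gain([40, 55, 30], [70, 80, 65])
```

Comparing per-class normalized gains (rather than raw post-test scores) partially controls for differences in student preparedness between the two sections.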

 

Informal Discussion

What can you do if you are not doing clicker questions or lecturing?
    - interactive videos/demonstrations
    - model correct problem-solving strategies

Flipped classrooms
    - does the research support the claim that these are better for students than traditional lectures?
    - do students get more out of a live lecture than they do out of a video?
    - if we are asking students to watch videos on their computers, won't they be distracted by other things?
    - would the best approach be to do both lecturing and flipping? Then you're asking the students to spend a lot of time in the classroom.
    - usually there are questions after the video lectures, so the students have to spend time watching the video
    - those questions need to be deep enough that students can't just skim the video to find the answer
    - do flipped classrooms work differently for lower div courses than they do for upper div courses?
    - is it too stressful for students (every day there is some sort of quiz-like scenario)?
        - this is college - it's supposed to be preparing them for real life
        - the struggle is where the learning happens
        - learning is difficult - it doesn't always make the students happy
        - the point of teaching is not to make the students happy - it is to help the students learn the material
    - is it the job of a teacher to teach the material AND to provide emotional/psychological support?

Structured classrooms
    - it's important to be clear about the desired classroom structure from day one ("this is what office hours are for. this is what discussion sections are for.", etc)

How do you know what's going on in students' heads?
    - ask one student how they went about solving a problem
    - cold-calling requires all students to be ready to solve the problem
    - how do you do that without scaring students away?
    - In a large class, do classroom discussions actually engage the class as a whole, or only a small fraction (and not the people you are trying to reach)?
        - is it better if you are cold-calling and asking students if they agree with each other?
    - Can we set up a camera at the front of the classroom and film the students' reactions to different pedagogical strategies?

 

The Importance of Conceptual Knowledge of Solving Physics Problems

Using Qualitative Problem-Solving Strategies to Highlight the Role of Conceptual Knowledge in Solving Problems - William Leonard et al. 1996

Presenter: Anna Boehle

By focusing on problem-solving strategies, are we forgetting/compromising the conceptual knowledge we want students to have?

We know that even efficient problem solvers are not taking away the conceptual knowledge that they should at the end of undergraduate physics courses. You can be an efficient problem-solver if you know how to "plug-and-chug" without really understanding why you are using the equations. We want scientists to be able to explain what they've done, and that takes conceptual knowledge.

When TAs and professors model problem-solving, we usually don't explicitly write down the explanations; we just write the relevant equations on the board. Are we doing our students a disservice?

Suggested way of teaching problem solving:

  1. highlight the major principles and/or concepts (WHAT)
  2. justify the use of those principles/concepts (WHY)
  3. write the procedure down (HOW)

Take the time to write out the strategy before attempting to solve the problem. This method can be lengthy and time-consuming. You might only have time to present/model one problem during class time. Homework solutions should include strategies as well as the quantitative solution (equations). Grading is based on strategies as well as the computation.

Challenge at UCLA: How would we grade strategy writing? We don't have enough TAs to handle that kind of in-depth homework analysis. Is there a way we can automate it by looking for keywords (that's what the Blackboard app does)? Maybe we could use peer graders - they grade each other's homework sets and are, in turn, graded on their evaluations. Alternatively, the TA can spot-grade the homework sets (though students might not like it). Maybe TAs could give feedback during discussion sections.
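The keyword-scanning idea floated above could be sketched as follows; the concept list and function names here are hypothetical illustrations, not the Blackboard implementation:

```python
# Hypothetical keyword map: each concept a grader wants to see in a
# strategy write-up, with substrings that count as mentioning it.
REQUIRED_CONCEPTS = {
    "energy conservation": ["conserv", "energy"],
    "free-body diagram": ["free-body", "force diagram"],
}

def concepts_mentioned(strategy_text):
    """Return the set of required concepts a student's write-up mentions."""
    text = strategy_text.lower()
    return {
        concept
        for concept, keywords in REQUIRED_CONCEPTS.items()
        if any(keyword in text for keyword in keywords)
    }

found = concepts_mentioned(
    "We apply conservation of energy, then draw a free-body diagram."
)
```

A scan like this could only flag write-ups for human follow-up; it cannot judge whether the justification (the WHY step) is actually correct, which is the part the paper cares about.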

Posting homework solutions on the Internet

On the one hand, students expect the solutions to be posted. On the other hand, once you post the solutions, you can never use those problems again. What do we do to combat that? TAs and faculty do NOT want to re-write their problems every year. Is there a more secure way to deliver homework solutions? What about giving weekly quizzes that are similar to homework problems; don't grade the homework, but if the students work through the homework problems, they will do better on the quiz.

Group work could also be a solution; students working in groups need to justify their solutions to each other. Plus, if you have groups submitting a single solution, you cut down on the number of problem sets your TA has to grade.

Most importantly, stress the importance of practice problems to the students; be explicit about your expectations. Would it help students to have a separate required course on study skills for all freshmen? Maybe the problems given should be tougher ("challenge problems") to encourage students to work together and be explicit about their strategies.

Evaluating the effectiveness of this approach

Can you teach two courses - one traditional, and one using a strategic-approach? The IRB (Institutional Review Board) will not let you "experiment" on undergrads by providing a "sub-par" class, but where do they draw the line? Can we do this sort of experiment at UCLA to make sure that we have the data we need to back up the claims we're making? If you don't publish your results, you don't need to go through IRB. But wouldn't you want to publish the results at some point (or at least have it as an option)?

What is the effect of television programs and/or outreach on the education of the general public?

We don't want our physics classes to be like Cosmos - we need to focus on real content, not just on getting students excited. How much excitement should we sacrifice for content? Hype does a lot for the students and will get you very good evaluations, but how much are they actually learning? If students are very excited about the topic but under-prepared for future courses, have you helped them or hurt them? Is there some excitement to be gained (for first-time scientists) from the successful completion of a problem?

The key is to encourage the students to develop critical thinking skills, which usually lie between the initial excitement and the quantitative evaluation of the physics. This is the crux of "thinking like a scientist," which is important for students even in "Physics for Poets" types of courses.

 

AAS/AAPT Update

This was the first meeting of the Winter 2015 quarter.  The meeting consisted of the following:

  1. Laura Vican
    • experiences at AAS - discussing various outreach programs
  2. Josh Samani
    • experiences at AAPT - workshop on research-based methods in alternative problem solving

Next meeting, a member will lead a discussion on problem solving research.

Action items:

  1. Create a repository of papers that future speakers can choose from so that they don't have to search forever for a relevant, interesting paper.