Here are three things I’m trying to *ramp* up in my physics class this fall. By ramp up, I mean, they are all things I’m already doing, but I am trying to make them more coherent and threaded throughout the entire course.
Invention or exploration activities before every assigned reading
Norm at My Institution: Our physics class has a “traditional” flipped model going on. Students are expected to read before coming to class, then they come to class for some combination of conceptual questions, group problem-solving, demonstrations, and labs.
What I’ve been Doing: I’ve been re-flipping this to a limited extent throughout the semester. For example, the day before the reading on acceleration, I have students engage in the Invention Tasks designed by Andrew Boudreaux, Steve Kanim, and Suzanne Brahmia. Or, the week before students read anything about forces, we do explorations with hover pucks and bowling balls.
Plans for this Semester: This semester I am committing to having a “preparation for future learning” activity before every assigned reading. I plan on using a combination of invention tasks, PhET simulations, and hands-on explorations.
Online questions that support deeper processing of reading
Norm at My Institution: Our physics courses use graded multiple-choice quizzes to incentivize reading.
What I’ve been doing: I use online reading questions (for formative assessment), which are “graded” on effort only. The other thing I do is discuss reading and learning explicitly in class. For example, I have started the 2nd day of class by having students watch and discuss this video on shallow vs. deep processing. It’s kind of a cheesy video, but if I stop at the right points and ask the right questions, we get really interesting, engaging, and authentic discussions going about learning.
Plan for this Fall: I wrote about this in this previous post, but the gist is that I’m going to ask online questions that support students in going back to interrogate the text. I also plan on not just “starting the semester” with some discussion about reading; I plan on spending more time in class actually reading, in order to continue to discuss, practice, and foster reading comprehension strategies.
Competency-oriented and formative-assessment-driven quizzes
Norm at My Institution: We grade students on tasks like labs, quizzes, tests, and projects, and give them points for participating through clicker questions.
What I’ve been doing: All my quizzes have been standards-based, with many opportunities for reassessment. To a lesser extent, I dabbled over the summer with standards-based lab skills, wherein successful completion of the lab and lab write-up was merely a “ticket” to get a new data set, which you then had to analyze all on your own. The benefit here was that it changed the game of lab from “getting it down in your lab notebook” to “I better make sure I understand how to do this.”
Plan for the Fall: I also discussed this recently here, but I’m going to pare down the number of standards (by bundling) to emphasize synthesis over isolated skills. I am also working to better align the problems my students practice with the high-stakes exams that will be administered. For labs, I’m trying to develop a rubric I can use in an SBG way, and I plan on assigning grades at the end based on the average of the best 3 plus the last 3 scores.
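To make that grading arithmetic concrete, here is a minimal sketch of how a “best 3 plus last 3” average might be computed. The function name is mine, and I’m assuming scores are listed in chronological order and that a strong final lab can count in both groups:

```python
def lab_grade(scores):
    """Average of the best 3 and the last 3 rubric scores.

    Assumes scores are in chronological order; a score may
    count in both groups (e.g., a strong final lab).
    """
    best_3 = sorted(scores, reverse=True)[:3]
    last_3 = scores[-3:]
    return sum(best_3 + last_3) / 6

# For rubric scores of 2, 3, 4, 1, 3, 4 across six labs:
# best 3 = [4, 4, 3], last 3 = [1, 3, 4], grade = 19/6 ≈ 3.17
```

One nice property of this scheme is that early stumbles fade from the grade while recent work always counts, which rewards growth over the semester.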
Right now, my lab rubric is looking something like this:
I carefully choose, depict, and consistently use variables relevant to the purpose of the lab
- Unambiguous sketch of the experimental setup in which variables are clearly defined
- Variables are used consistently and clearly in data tables, graphs, and algebra
I relate graphical, verbal, and algebraic representations of the model being tested
- Trend lines depicted in graphs are consistent with model being used (or noted for its inconsistency)
- Appropriate verbal descriptions of the model are given
- The case-specific algebraic model is shown to relate to the general model
I make evidence-based arguments to support or reject the use of a model
- A claim is made about whether to reject or support the model
- Specific trends in the data are used to support the claim
- Reasoning is provided to explain why the trends mentioned are relevant
I carry out error analysis to determine bounds on the numerical result
- Estimated uncertainties are provided and seem reasonable
- The “weakest link” is correctly identified using fractional uncertainty
- The uncertainty of final result is correctly determined
- Final result is reported with correct units and sig figs.
My work is represented with due care
- Data tables are neat and legible
- Graphs are labeled and legible with trend lines made with straight edge
- Error analysis work is easy to follow
- Complete sentences that are legible are used when appropriate.
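The “weakest link” idea in the error-analysis criterion can be sketched in a few lines of Python. This is just an illustration of the simple rule common in intro labs, where the input with the largest fractional uncertainty dominates the uncertainty of the result; the function names and the example numbers are mine:

```python
def fractional_uncertainty(value, uncertainty):
    """Fractional uncertainty: |delta_x / x|."""
    return abs(uncertainty / value)

def weakest_link(measurements):
    """Return the name of the measurement with the largest
    fractional uncertainty, given {name: (value, uncertainty)}."""
    return max(measurements,
               key=lambda name: fractional_uncertainty(*measurements[name]))

# Example: density from mass and volume
measurements = {
    "mass (g)": (50.0, 0.5),        # 0.5/50  = 1% fractional uncertainty
    "volume (cm^3)": (20.0, 1.0),   # 1.0/20  = 5% fractional uncertainty
}
# weakest_link(measurements) identifies the volume, so the density is
# uncertain by roughly 5% under this weakest-link approximation.
```

Students could run something like this against their own data tables to check whether the quantity they identified as the weakest link really does carry the largest fractional uncertainty.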