SBG Grain-size: Assess the Small, Evaluate the Large

One of the biggest changes to my assessment plan this year is that I intend to bundle standards in intro physics. See, one of the tensions that exists in competency-based grading is what grain size of standards to use. Small-grain standards have the benefit of being very detailed and explicit about the concepts and skills students need to master. But there are two downsides: Logistically, you can end up with TOO many standards. Learning-wise, you can disassemble doing science into discrete skills so thoroughly that students aren’t really doing science anymore. Large-grain standards have the benefit of keeping the number of standards down (logistically) while also focusing on synthesis of skills (learning-wise). However, large-grain standards may leave students with feedback that is too broad and not targeted to the specific things they need to work on.

Josh Gates, who blogs over at Newton’s Minions, approaches this problem by bundling fine-grained skills and concepts into broader competencies. Students receive feedback on the fine-grained skills and concepts, but competency at the synthesis level is what matters at the end. I am by no means saying this is the only solution or the best solution, but it has particular affordances for my situation. Here’s why I think so:

  • Since I don’t have control over curriculum coverage or pacing, I was always making compromises about which fine-grained standards were most important to use. Students were getting practice and feedback on certain skills but not others. Even with this paring down, I still felt bogged down by having too many standards. Simultaneously, I had too many standards and not enough standards. Re-packaging, I think, will help, because shifting my grain size up or down alone wasn’t going to.
  • Since 40% of the students’ grade is determined by high-stakes exams not written by me, the fine-grained standards I was using gave students practice and feedback on underlying skills, but not enough on synthesis problem-solving. By bundling, I can make sure they get practice and feedback at the level at which they are expected to perform on exams, while also giving them feedback on the fine-grained stuff.

Here’s an example from Josh, in which he’s bundled skills into the competency that students understand the balanced-forces particle model.

[Screenshot: Josh’s bundled standards for the balanced-forces particle model]

I’ll be spending the next couple weeks revising and bundling my old standards to better support students’ learning and better align with the implicit curricular coverage established by the third-party exams.

 

 

Physics Education: Research, Assessment, & Poverty

Participating in this summer’s conferences has really got me thinking about many things in physics education research. Here I’m going to begin writing about one of those things. To start this conversation, I want to talk about a poster from the 2013 Physics Education Research Conference.

Who We Study, Who We Teach

Steve Kanim from New Mexico State University

Steve analyzed Physics Education Research publications from the American Journal of Physics and Physical Review Special Topics–Physics Education Research. The analysis was limited to publications that included actual student data (i.e., no discussions, opinions, sharing of best practices, dissemination-only papers, etc.). Steve finds that 75% of the students we study are enrolled in calc-based physics. This is disproportionate to the distribution of classes we teach–only 33% of the students we teach take calc-based physics. The population least studied is students at two-year colleges, who comprise 25% of the students we teach but less than one percent of the students we study. Students in algebra-based courses are also under-represented in our research.

Steve is careful not to overly criticize our community’s beginnings. Our field has grown, in part, due to the fact that our research has focused on how even our “best” students struggle to develop functional understandings of basic physics concepts. Rather than blaming our past, Steve’s analysis points to a gap we need to address now.

Steve also looked at this data by disaggregating studies based on the SAT Math distribution. From this perspective, it still appears that we are studying students on the high end. For me (Steve did not say this), this is especially critical given the correlations between achievement tests like the SAT and poverty, and between poverty and race. It could easily be said that we have been focusing more of our efforts and resources on the privileged. Steve also mentioned some research, which I can’t remember right now, finding that an SAT Math score of 600 is a threshold for achievement in upper-level physics.

Oh, Force Concept Inventory

Steve’s poster also referenced some research about the FCI, which has got me thinking again about the Force Concept Inventory (FCI) and how it relates to our field’s focus on the upper end. If you don’t know, the FCI is the most widely used assessment/evaluation instrument in physics education, and normalized gain is the most widely used method for reporting student learning outcomes with it.

The idea behind normalized gain is to take students’ pre-test scores into consideration. Normalized gain can be interpreted as the “fraction of the gain that could have occurred.” For example: a student who starts with a score of 40% and ends with 70% gains +30% out of a possible +60% gain, for a normalized gain of 50%.
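As a quick sanity check, that calculation can be written out in a few lines of Python (the function name and error check are just illustrative):

```python
# A minimal sketch of the normalized-gain calculation described above.
def normalized_gain(pre: float, post: float) -> float:
    """Fraction of the possible gain that actually occurred.

    pre, post: scores as percentages (0-100).
    """
    if pre >= 100:
        raise ValueError("a perfect pre-test leaves no room for gain")
    return (post - pre) / (100 - pre)

# The worked example from the text: a student goes from 40% to 70%,
# gaining +30 out of a possible +60, for a normalized gain of 0.5.
g = normalized_gain(40, 70)
```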

Despite normalizing scores this way, normalized gain can still be strongly correlated with pre-test score (Coletta and Phillips, 2005).

[Screenshot: graph of normalized gain vs. pre-test score (Coletta and Phillips, 2005)]

Underlying this correlation are additional findings that normalized gains on the FCI are strongly correlated with student scores on the Lawson Test of Scientific Reasoning, and also with students’ SAT scores (Coletta and Phillips, 2007).

[Screenshot: graphs of normalized gain vs. Lawson test and SAT scores (Coletta and Phillips, 2007)]

A potentially huge problem we have as a community is that we report normalized FCI gains without disaggregating those scores along such measures. I’d argue that this tendency is potentially dangerous, because it can lead us to make claims and offer implications for instruction that are distorted. For examples of how failing to disaggregate student achievement by measures of poverty leads to trouble, see Michael Marder’s prezi on Education and Poverty.
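A minimal sketch of what such disaggregation might look like, using entirely made-up student records (the scores, SAT values, and the 600 cut-point are placeholders, not real data):

```python
# Disaggregating normalized FCI gains by SAT Math, with hypothetical data.
from statistics import mean

def normalized_gain(pre, post):
    return (post - pre) / (100 - pre)

# (pre %, post %, SAT Math) -- made-up records for illustration only
students = [
    (30, 45, 450), (35, 50, 480), (40, 55, 500),
    (45, 70, 610), (50, 80, 650), (55, 85, 700),
]

# Bin by SAT Math and report the mean normalized gain per bin,
# rather than one aggregate number for the whole class.
bins = {"SAT < 600": [], "SAT >= 600": []}
for pre, post, sat in students:
    key = "SAT < 600" if sat < 600 else "SAT >= 600"
    bins[key].append(normalized_gain(pre, post))

for label, gains in bins.items():
    print(label, round(mean(gains), 2))
```

Even this toy example shows how a single class-wide gain would mask the gap between the two bins.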

What can we do?

#1 We need Steve to publish his analysis of the mismatch between who we teach and who we study. This will enable those seeking funding to study under-represented populations to point to Steve’s research on the immense need for such research. It will also enable us to press funding institutions to create more parity in funding priorities. I emailed Steve this morning to offer encouragement and any help in making sure this happens.

#2 We need to begin as a community to publicize our own FCI normalized gains along with accompanying data that aids in meaningful disaggregation. This is true not only for publications about research. It should also include standards of reporting to funding agencies, and even standards of reporting on blogs. For example, right now, my own institution reports normalized FCI gains from our algebra-based physics course to PhysTEC, and PhysTEC shares back scores from all PhysTEC-supported sites without disaggregation. I’ll start this process here: Our normalized gains at MTSU for algebra-based physics hover just below 0.3, and our SAT Math scores are in the 460–570 range, with SAT Reading in the 460–510 range. Note that this falls nicely in line with the graph above. On this issue, we should really support the PER User’s Guide. Although not on the site yet, they are working hard to create an Assessment Database and Analyzer tool that will make it easier for everyone to upload, use, and interpret data in meaningful ways.

#3 Physics Education Researchers as individuals need to go out of their way to engage with more research concerning students who aren’t just down the hall. The disproportionate focus on calc-based physics and the severe shortage of research on two-year colleges is not malicious–it comes from convenience and a desire to improve our own local educational settings. Research-intensive universities are more likely to have students at the higher end of preparation and opportunity, and are also likely to have professors with the time and resources to do research. Instructors at two-year colleges have the opposite situation–no time, resources, or support to conduct research–and they are more likely to have students with less preparation and opportunity. I emailed three community college physics instructors this morning to begin that conversation.

What say you? (Featured Comments)

Eric Brewe: “We should think about the use of normalized gain. It overvalues gains made at high end schools.”

Gasstation without pumps: “One question remains—why are students taking algebra-based physics? … Is the FCI the appropriate measure?”

Writing Goals for next 6 Months

Near-term Grant Writing Goals:

Internal Faculty Research Grant to support follow-up on work pertaining to this post stemming from an undergraduate’s thesis. (Sept 25th)

Internal Public Service Grant to support Physics Teacher Collaboratives (Oct 1st), which we are starting this fall.

Spencer Small Research Grant to support research and instructional efforts on Responsive Teaching in our physics pre-service concentration (Oct 15th)

Longer-Term: Career Grant Next Year

 

Near-term Paper Writing Goals:

C&I paper, with Natasha on a micro-analysis of development of knowledge for teaching (Aug/Sept)

Phys Rev.–PER paper, on varied meanings of “straight” in student discussions of light. (Oct/Nov)

AJP paper, when F ≠ -grad(U)? (Dec/Jan)

Longer-term:  TE paper(s) with Leslie

Two Requests to Review

I just noted how different these requests come off. Ultimately, it doesn’t seem to affect my final decision about reviewing, but it does affect my initial reaction.

Journal #1 Request

Dear [Potential Reviewer]

We are writing to ask if you’d be interested in reviewing a manuscript for [Journal] entitled [Article Name]. Based on your expertise, we feel that you would provide a thoughtful assessment of the manuscript’s strengths and weaknesses.

We hope that you agree to review this paper, in which case you will receive another email with instructions for accessing the manuscript using our on-line review system. We would need this review returned within [time frame]. If you are unable to review for us at this time, we would appreciate receiving names of other potential reviewers you would recommend. In order to expedite the review process, we ask that you please respond to this request within the next 2-3 days.

[Abstract Attached]

Journal #2 Request:

Dear [Potential Reviewer]

We would appreciate your review of this manuscript, which has been submitted to [Journal]. We ask that you return your report within [time frame].

Thank you for your help.

[Paper Attached]

 

Maybe some people like short and to the point?

Advice from My Students

I recently asked my inquiry students to reflect on what they have learned this semester about teaching science. Here are snippets from what they wrote.

“Instead of just giving the students the answers or the details, it is important for them to think for themselves and come up with their own ideas and explanations. Also, I want to remember that it is very important to listen to my students, their ideas, discussions, and arguments that will occur.”

“Children have more knowledge from experiences which you need to try to take what they already know and let them explore ideas with that knowledge.  The more that you build on that knowledge the more what they learn will stick.  Building on that knowledge needs to come from a place of self exploration.  Allow the students to question, work together to find out answers and discover their truth.”

“You can’t stand at the front of a room and talk at a group of children and hope they listen and then expect them to perform well on a test. If children are listening to you explain something that makes sense to you, they won’t necessarily understand it for themselves. You have to allow them time to process things in their own way and then you can help them better understand it after they’ve developed a good general idea of what causes things to happen.”

“Well, I want you to remember that it is okay to not be the main voice during the discussion.”

“Always look at different sides of the situation, and never judge a kid for having a bad idea. Ask them why they think that, but never tell them it’s bad! Tell them you’re curious to know more or to explain more what they mean. A lot of times they will be right, but off on another path.”

“Don’t forget that part of teaching is listening.

Make them wonder about science outside of the classroom. Encourage them to explore and make connections.”

“I must always remember children’s prior knowledge or common sense of the world and things around them. I’ve learned that the most in depth and substantial learning comes from starting with a student’s prior knowledge and building off that. Sometimes you are able to build on their prior knowledge and sometimes their prior knowledge is a little miscued from the actual reality of the topic that you have to guide them through group discussions and hands on experiments and self-reflections that they create a way for themselves to learn the concept and implement it into their everyday thinking and common knowledge.”

“Children as well as nearly all people can learn through developing their own thoughts and ideas. Science might be treated as memorization of facts, but it was all discovered by people who had an idea and explored it further.”

” Teaching is not all about lectures. Listening intently and posing good questions for students can be a better tool!”

“Allowing the students to discuss among themselves will engage them and pull them much deeper into the content.  And don’t be afraid of noise! When the students are discussing and bouncing ideas off each other, learning is happening.  Dont be afraid of disagreements.  If a student can say I agree because, or I disagree because, building on knowledge can happen!”

“You want to be able to show your students that it is ok to have the wrong answer and it’s not always about the right answer, but how you figured out something.”

“Students should have a comfortable classroom environment where they feel confident enough to share their thoughts. They should know that it is okay to be wrong. Always remember how the group discussions help connect everyone’s thinking.”

 

Lots of good reminders here.

Also, I am thinking: Even if they were just spitting back to me what they thought I wanted to hear, at least they knew what I wanted to hear.

Unpacking One Instructional Activity

Facilitating a classroom discussion around a conceptual question that targets a specific learning goal (e.g., interpreting kinematics graphs) and addresses specific student difficulties (e.g., distinguishing position and velocity)

 Teacher Skills/Practices Hoping to Foster and Embed:

Planning:

  • Articulating how a question relates to specific learning goal
  • Articulating how a question relates to specific student difficulties with a topic
  • Anticipating student thinking–what students will say in response to the question to support different answers, both correct and incorrect

Launching:

  • Question posing (poise, clarity, pacing, and checking for understanding)
  • Setting tone and expectations for how students should engage in the task–talking to each other, listening to each other, voting, etc.
  • Monitoring and fostering engagement (as students talk in pairs and during whole-class discussion)
  • Circulating and attending to student ideas

Facilitating

  • Eliciting student ideas and explanations in whole-class discussion
  • Re-voicing or representing student ideas
  • Positioning students as competent learners and authors of ideas
  • Orienting students to peer ideas (to agree, disagree, compare)–reflective tosses
  • Deciding if and when to have students re-vote, talk again with neighbors, etc.

Closing

  • Summarizing arguments heard and positions taken
  • Explaining correct answer by relating students’ ideas /arguments to canonical concepts
  • Explaining away incorrect answers by explicitly addressing alternative conceptions.
  • Checking for understanding of learning goal (e.g., a follow up question), or fostering consolidation (asking students to summarize, generalize, etc.)

Reflecting

  • What student ideas came up that were and were not anticipated? What did you learn about student thinking about the learning target/question/concept?
  • What opportunities to elicit, probe, or re-voice students’ ideas were taken up? Were there any that were missed? How might you have done things differently?
  • In what ways were student arguments leveraged in explaining the physics concepts? How might you have used those ideas differently?
  • What challenges if any were faced in getting participation and engagement? What actions were taken? How successful were they? What other choices could have been made?

 

In practicing and rehearsing, I’m imagining building in organized “trouble” that teachers may have to “repair.” Things I can think of that actually happen in class include:

* No students spontaneously volunteering to share their answer or ideas

* When students are asked to talk with neighbors, a significant fraction of the class remains quiet with blank stares.

* When students are told to talk to their neighbors, they immediately share answers with each other (e.g, “I picked B”, “Oh yeah, me too”), but they don’t go on to explain their reasoning to each other.

* A student starts to explain their reasoning to the class, but stops midway, and dismissively says, “I don’t know.”

* Student gives reasoning that is very brief or too vague to understand.

* Student gives an answer, but does not share their reasoning spontaneously.

* A student says his reasoning and answer really quietly, so no one else can hear.

* A student voices his reasoning quite aggressively or authoritatively, causing the other students to shut down (“Well obviously the right answer is C, because we all know the textbook says that…”).

* A really difficult to understand (or perhaps even bizarre) idea is voiced

* One student voices an idea, and another student laughs or rolls eyes, seeming to mock it.

* A student gives reasoning like,”It’s probably less than, because I would have picked greater than, and I’m always wrong.”

* You try to get the class’ attention after they have been talking with their neighbors, but many students keep talking in small groups.

* When one student is addressing the class with her reasoning, several groups keep talking, making it difficult to hear what was said.

* A significant fraction of the class responds in ways that suggest they have misunderstood the question… or seem to be answering a different question.

* With a sequence of questions, only the same three or four students participate in whole-class discussion.

* One student is completely disengaged-head down, on cell phone, etc.

* A student voices a jumble of science vocabulary and jargon as an explanation.

* Two students talk back and forth for a while, perhaps in a heated argument, and other students start to disengage.

Obviously, in practicing, we carefully “roll” out different kinds of trouble, so as not to overwhelm anyone. But in writing this out, I’m realizing even more how immensely challenging real teaching is, having to respond to all of these kinds of situations (and more) on the fly. But it also makes me realize how beneficial it will be to engage in this kind of work with pre-service teachers.

Teaching of Physics: Instructional Activities?

Brain-dumping ideas about next year’s version of Teaching of Physics, and what kinds of instructional routines I want to be at the core of the things students learn to do in class. There’s way too much here for a one-semester course, and at the same time it doesn’t cover everything… anyway, here is my current brain dump:

Launching and monitoring small-group exploratory activities that are well-structured and designed so that students are expected (and likely) to author initial ideas about a set of phenomena that are “close to” canonical.

(e.g., PBI has students develop an operational definition for uniform motion as they explore the motion of balls on tracks)

 (e.g., Many examples in which students develop rules and ideas for relating motion-detector graphs to how a person moves)

 (e.g., develop generalizations about the patterns that do and don’t allow a light bulb to light while exploring with a piece of wire, a battery, and a bulb)

This seems generic enough to cover lots of good research-based materials. Students would get “exposure” to such curricula while analyzing the curricular structures that make them effective. It opens up issues in managing materials, classroom space, and group work, and in listening to students, using questioning to check for understanding, and using questioning to guide students in different directions.

Facilitating a classroom discussion around a conceptual question that targets a specific learning goal (e.g., interpreting kinematics graphs) and addresses specific student difficulties (e.g., distinguishing position and velocity).

I’m really thinking about Peer Instruction here. I think this is a good first step toward practicing eliciting student ideas, fostering engagement in a whole-class discussion, probing for students to explain their reasoning, re-voicing and representing students’ ideas, and orienting students to other students’ ideas. It also builds capacity for thinking about learning goals and student difficulties, and about the properties of good questions. A big part here is also explaining canonical physics by connecting it to student arguments/ideas and explicitly addressing alternative ideas. What I like about this is that it’s a short cycle, one that can be practiced–starting with anticipating what students will say, to question posing, eliciting ideas in a whole-class discussion, and summarizing arguments in order to emphasize the underlying physics.

Introducing a pedagogical representation (e.g., motion maps) and providing opportunities for guided practice with feedback (e.g., providing students with various scenarios to work through, then white-boarding and discussing ones that groups perhaps diagrammed differently)

This I think is important, because it can provide a model for “lecturing” and “seat work” that can be done meaningfully. It gives students a chance to learn about lots of pedagogical representations they might not have encountered (motion maps, schema systems, energy pie charts, whatever). It also gets teachers thinking about how to monitor student work in order to decide what’s juicy enough that the whole class needs to discuss it to further learning.

Launching a problem-solving activity (e.g., where will fast and slow buggy meet?) and facilitating a strategy-sharing discussion (e.g., whiteboard meeting).

When I think of launching problem-solving, I think a lot about Dan Meyer’s three acts kinds of stuff, but I also think a lot about “Five Practices for Orchestrating Productive Mathematical Discussion”, and those two together form the core of how I think about a lot of this.

Launching, structuring, and monitoring a laboratory investigation in which students are to collect data to determine a quantitative relationship (through graphing).

(e.g., analyzing position and time data for constant velocity buggy)

(e.g., investigating relationship between spring extension and hanging weight)

(e.g., investigating relationships between force, mass, acceleration)

This, of course, isn’t the only kind of laboratory work, but it’s a kind that is common and useful. Lots of materials management, group-work management, etc. The Modeling curriculum has lots of good examples of this kind of thing. Many of our students will have only experienced “confirmation” labs, so this gets us away from that model and toward a model of uncovering relationships. Other kinds of labs are missing here, but they may be able to be incorporated by thinking about the claim-evidence-reasoning structure, etc.

Collaboratively establishing classroom expectations through student engagement in activities and discussion. (e.g., Marshmallow challenge, Weird paper airplanes, Science Notebooks)

This is something I’ve been thinking a lot about, especially since one of our students has really struggled with classroom management. I realize that it’s not fully my job, but we can model some ways to do “science-y” activities that get students to contribute to establishing procedures for how the science class is run. Each of these examples is not just about classroom routines or expectations; each links to the nature of science or learning in meaningful ways, if done right.

Establishing and enacting structures for self/peer assessment of work

This idea is fuzzy in my head a little bit, but I’m thinking about how this is pretty important.

Providing (narrative) feedback to students based on work they have done.

This, too, is fuzzy in my head. But I think it’s the kind of thing to get them examining and interpreting student work in ways that aren’t about “grading,” but about understanding what students are saying, etc.

These next two are more about “larger planning”, about how to productively plan for and respond to student ideas and thinking in order to help them learn something difficult or learn something over a long period of time.

Sequencing and enacting a set of activities/discussions in order to help students understand a particularly difficult concept or situation.

  • Elicit-Confront-Resolve (prior knowledge is problematic in some way)
  • Bridging Analogies (prior knowledge is useful but needs refinement)
  • Inventing in order to PFL (build some new prior knowledge to leverage later)

Eliciting student ideas/explanations for a phenomena and supporting them in a process of developing ways of testing and further developing their ideas.

(e.g., develop an initial model of magnetism as they explore the magnetization of a nail, and help them to test those ideas)

A lot of ISLE activities are structured like this.

Daily Sheet Snippets (again)

I got really excited when our group…, and we came out with what we feel are valid claims that…

It makes sense to me that… what I’m still confused about is…

I feel like we have made sense of…based on the assumption that…

I feel like …., but still want to hear the last groups’ argument on…

What does not make sense to me now is… Is it … or …? It makes sense that it would be…

I really liked the example of… Seeing it this way helps to show…

I was wrong in my original prediction where I thought…, because…

Our claim from Monday about… really confuses me now.

I feel like … is beginning to make more sense. I think that after observing this weekend, it may come together more.

It makes sense to me that… I changed my mind after…, because I now know…

After thinking about…, it makes very little sense.

I’m really blown away with… I never would have guessed that.

I like tinkering with… it helped me to visualize

I do not understand the theory for how… The group has not explained it clearly enough yet.

When we did…it made more sense. Now I realize that…

I believe that it cannot be… or else we would … We do not see…, so it must…

I don’t understand the way that… If it’s…, wouldn’t we see…?

What made sense to me is that… we were able to see that there is no way for… but there is at some point, but when? And how? What’s the difference between … and …?

Initial Moon Ideas in Inquiry (Brain Dump, Processing)

In Inquiry, we have two not-yet satisfying theories to explain the moon phases:

Shadow-Theory: The first is that the earth can act as an obstacle in the path of light from the sun, thereby being capable of casting a shadow on the moon. As the moon passes behind the earth, it can move into the shadow either fully or partially, creating the moon’s phases. This idea clearly places the new moon as occurring when the moon is on the far side of the earth.

Cupping-Theory: The second is that the rays from the sun can only “cup” half the moon–the side facing the sun. When the moon is between the sun and the earth, you are staring at the uncupped side, which is dark. You don’t see the moon because the lit side is facing the wrong way for you to see it.

  • One problem this group is facing is that they *want* the full moon to be when the moon is opposite the sun, but they can’t figure out how the light rays get to the moon. That is, we know we’ve seen a full moon at night, but we can’t figure out how the light gets there, when the moon is tucked behind the earth. Here are possibilities we came up with:
    • Maybe light scatters off the atmosphere of the earth, like the way we’ve seen light scatter off tissue paper.
    • Maybe light “turns” and bends around the earth–the way we’ve seen happen with glass.
    • Maybe light is getting to the moon from other objects besides the sun–stars, planets, redirected sunlight off of asteroids.
    • Maybe the sun is so big that there can still be straight paths that get to the moon, even when it’s behind the earth. Maybe the problem is we aren’t drawing things to scale.
    • Maybe the moon orbits around the earth north-south, not east-west, so it’s never behind the earth.

Love those ideas.

Anyway, there was lots of movement in class toward the cupping theory (which, by the way, is called cupping because a student was trying to show which side of a tennis ball was lit by cupping their hands around one half of the ball). I acknowledged what made sense about the idea, while really selling the problems we had identified. Another problem also came up: a student said that the new moon explanation doesn’t really make sense, because it seems like that would be more of an eclipse, when the moon passes directly over the sun.

Moon Orbiting

The idea about how exactly the moon orbits the earth has led to some other ideas as well.

One group claims that the moon orbits the earth in the opposite direction that the earth spins. This group has some ideas for why this makes sense to them, and I’m not sure yet, but I think it has to do with thinking about what an app on their phone is showing them.

One group claims that the moon orbits the earth in the same direction that the earth spins. This group claims that this can explain why the moon appears to rise later and later each day.

Others have floated the idea that the moon might change its orbit, or that its orbit might not be perfectly aligned with the equator, or perfectly north-south–maybe it’s at an angle.

What about Moon Spin?

Another group worked during the day to develop the claim that the moon doesn’t spin. I pressed them to collect evidence for it, and they are working diligently to examine all the photos we’ve collected over the semester to prove that the same side of the moon is always facing the earth. Curious to see where they’ve gotten.

 

Thoughts about Directions for Tomorrow:

Push groups to see if, using props, we can model all the moon phases by either casting shadows or by creating different moon cuppings. Then, bring the challenge of representing those 3D models in 2D diagrams. Basically, right now our theories only discuss the new and full moon, and I want to push the span of those theories to see how and whether they can explain other phases.

Getting our orbit ideas well-developed enough that they can be linked to observations. One group has done this, linking their model to moon rise times. But we need to do the same thing with north-south orbits: what would we see differently if the moon were passing more northward or southward?

Scale! I’m stuck on whether we should try to stick to things we can figure out from direct observations, OR let them first think about and look up the distance information… and then begin modeling those scales: going out to the football field with props, figuring out how to diagram things to scale, and what implications that has for whether light can get to the moon when it’s behind the earth.
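If we do end up going the look-up route, the scale arithmetic for the football field is quick to sketch out ahead of time. A rough calculation, using standard approximate values for the earth-moon system; the 100-yard (~91.4 m) field length is my assumption about our setup:

```python
# Rough scale-model arithmetic for an earth-moon model on a football field.
# Approximate real values:
EARTH_DIAMETER_KM = 12_742
MOON_DIAMETER_KM = 3_474
EARTH_MOON_DISTANCE_KM = 384_400

# Assumption: we pace off the full earth-moon distance along ~91.4 m (100 yd).
field_m = 91.4
scale = field_m / EARTH_MOON_DISTANCE_KM   # meters of model per real km

earth_m = EARTH_DIAMETER_KM * scale   # scaled earth diameter, in meters
moon_m = MOON_DIAMETER_KM * scale     # scaled moon diameter, in meters

print(f"Scaled earth diameter: {earth_m:.2f} m")
print(f"Scaled moon diameter:  {moon_m:.2f} m")
```

At field scale the earth comes out around 3 m across and the moon under a meter, which already hints at how the shadow geometry might play out with props.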

I need an opportunity to problematize what N, E, S, W mean, and how to relate that to maps, globes, and our earthly perspective. That’s always trouble.

Groups, of course, are interested in other questions:

  • How is the moon seen differently in different parts of the world?
  • Why does the moon sometimes appear BIG, and sometimes appear to be different colors?
  • Where did the moon come from? How would life on earth be different without the moon?
  • What’s the relationship between the moon and the tides?
  • Is there a relationship between the moon and the seasons?
  • Why does the moon appear to “turn” throughout the day?

Moon Clock!

Oh! One last idea that is really cool. The group that is focused on the claim that the earth rotates the same way the moon orbits the earth is using a cool clock analogy. They are claiming the moon only orbits a little bit each day. They point out that when you look at a clock, you can see the second hand moving, and that’s like the earth moving: you can tell the earth is moving by watching the moon/sun. It’s harder to tell the minute hand is moving because it doesn’t move very far; you sort of have to wait a minute before you can tell it’s moved. They think the moon is like this: it’s moving so slowly that you can’t tell within a given day that it’s moved, but if you wait a day, you can tell. They have a diagram labelled “moon clock” that I haven’t yet had a chance to learn about, but I’m intrigued.
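For my own back-pocket planning, the clock analogy lines up nicely with the numbers. A quick sketch, using the moon's roughly 27.3-day orbital period (which the students haven't looked up yet):

```python
# How far does the moon move along its orbit each day, and how much
# later should moonrise come each day if the "minute hand" idea is right?
SIDEREAL_MONTH_DAYS = 27.3   # moon's orbital period relative to the stars

deg_per_day = 360 / SIDEREAL_MONTH_DAYS        # angle the moon covers daily
# The earth has to spin through that extra angle before the moon rises again:
extra_minutes = (deg_per_day / 360) * 24 * 60  # extra rotation time, in minutes

print(f"Moon moves about {deg_per_day:.1f} degrees per day")
print(f"Moonrise comes about {extra_minutes:.0f} minutes later each day")
```

That comes out to roughly 13 degrees and 50-some minutes per day, which is in the right ballpark of the moonrise delays the other group has been tracking.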


Observational Markers that Force(s) are Happening

In teaching physics, our plunge into forces has been:

#1 Qualitative observations with hover pucks, consolidating observations into generalizations about what effects doing nothing, tapping, and pushing each have, to some degree following the activities and discussion outlined by Kelly O’Shea.

#2 Introduction and practice with system schema diagrams, followed by readings and discussion about the ontology of force, interactions, and force pairs, and what this has to do with understanding / learning the force concept.

#3 Yesterday, we formalized our ideas into specific observational markers that force(s) are happening. We then worked our way through reasoning, discussing, and collecting observational evidence that a table exerts an upward force on a book, largely following the instructional sequence outlined by Clement.

Here are our observational markers for force(s) happening:

#1 Squishing or scrunching: visible deformations, like when you stand on a carpet or lie down on a pillow-top mattress.

#2 Stretching or elongating: like when you are stretching a rubber band or pulling back a slingshot.

#3 Sound + contact: two surfaces in contact with an accompanying sound, like a baseball bat hitting a ball with a knocking sound, or the scratchy sound of sandpaper over wood.

#4 Tightness or tautness: like when a string is pulling, you can feel that the string is tight and see that it is straighter than it would otherwise be.

#5 Bending: standing at the edge of a diving board, you can see the board bending.

#6 An object is speeding up, slowing down, turning around, or changing direction.


Tomorrow, we try to operationally quantify the amount of force, largely following the thinking and experimenting outlined by Arons, which is really trying to bootstrap back and forth between observational marker #2 (about stretching) and observational marker #6 (about speeding up). Here is the gist:

  • Use an uncalibrated spring to tug on a low-friction cart of known mass. First, try to keep the amount of “stretching” constant while pulling, and see what effect that has on the cart’s motion using a motion detector. Verify that this is repeatable: the same stretching of the spring attached to the cart always results in the same acceleration. And verify that the cart doesn’t just speed up, but speeds up with a relatively constant acceleration when you have a relatively constant spring stretch.
  • Stretch the spring to different amounts, and see how the effect changes, using the same cart. We don’t assume linearity of spring force, i.e., we don’t assume twice as much stretch = twice as much force. Instead, see what acceleration happens with different stretchings. This allows us to associate each spring stretch with a specific acceleration for the known mass, without making assumptions about the spring.
  • Motivate and invent a unit of force: for example, “1 Robert” could be the amount of force that accelerates the known mass at a rate of 1 m/s/s. Label various spring stretches in terms of their Roberts.
  • Now, vary the mass to see what effect that has: what effect does a “1 Robert” force have on different masses? Construct plots of acceleration vs. Roberts for various masses.
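As a note to myself, the unit bookkeeping in the sequence above boils down to something like this. A minimal sketch; the calibration mass and stretch data are made up for illustration, since in class they'd come from the motion detector:

```python
# Sketch of the "Robert" force-unit bookkeeping (all numbers made up).
# Definition: 1 Robert = the force that accelerates the calibration
# cart at 1 m/s/s.
CAL_MASS_KG = 0.5   # the known calibration cart mass (assumed value)

# Calibration: each spring stretch gets labeled by the acceleration it
# produces on the calibration cart. By the definition above, a stretch
# that accelerates the calibration cart at a m/s/s exerts a Roberts.
stretch_roberts = {"2 cm": 1.0, "4 cm": 1.8, "6 cm": 2.9}  # made-up data

# Prediction to test: a force of F Roberts equals F * CAL_MASS_KG newtons,
# so on a cart of mass m it should produce a = F * CAL_MASS_KG / m.
def predicted_accel(force_roberts, mass_kg):
    return force_roberts * CAL_MASS_KG / mass_kg

# e.g., the "4 cm" stretch (1.8 Roberts) pulling a 1.0 kg cart:
a = predicted_accel(stretch_roberts["4 cm"], 1.0)
print(f"Predicted acceleration: {a:.2f} m/s/s")
```

Note that nothing here assumes the spring is linear; the stretch-to-Roberts table is purely empirical, and only the final prediction step leans on the proportionality we hope the acceleration vs. Roberts plots will reveal.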

Curious to see how this goes.
