What Students Write When You Ask

I ask students to give me written feedback on a variety of things:

  • I hand out daily sheets that ask students to reflect on what did and didn’t make sense that day.
  • I hand out mid-semester feedback forms that ask students to reflect on what is and isn’t helping them learn.

One thing I haven’t written about is that I ask students at the beginning of the semester to write to me about their professional learning goals and their student benchmark goals. I ask the questions the following way, usually after I’ve framed the course a little bit for them.

Professional Learning Goals

If you’re not focusing on yourself as a “student” (who might be worried about grades and such), but rather thinking of yourself as a future professional, what are you hoping to understand better, learn more about, or become skilled at doing through this course?

Student Benchmark Goals

Given that you are a student, what grade would you be satisfied with in this course? What kind of grade would signal to both you and me that there is trouble?

This feedback helps me get a sense of how students are making sense of what this course might offer them, how they view themselves as learners, and what their hopes for the future are. It also gives me a benchmark for talking to students about their performance in class. I approach students who are not on target for their own goals, rather than having to judge which students are struggling according to my own standards. It can also be helpful to know which students have low grade expectations.

Opening the Flood Gates: Self-Efficacy, Science Anxiety, Science Identities

One thing that happens is that these invitations for written feedback often open up a flood of statements about self-efficacy, science identity, and science anxiety:

I was never good at the many science courses I’ve taken before, so I’m hoping this will help

Last semester I took physical science… I almost failed the course because I couldn’t understand anything. I would love to improve my attitude toward science.

A’s aren’t realistic sometimes… I’m not that smart and I don’t have a lot of time.

I’m not much of a critical thinker, so…

I just don’t really understand science…

Given that most of these kinds of statements come from my class for future elementary school teachers, I’m thinking about this in the context of research on Math Anxiety, Elementary School Teachers, and how math anxiety develops in children.

Behind our Department’s Award: Political and Practical

Yesterday, our department received the first-ever “President’s Award for Exceptional Departmental Initiatives for Student Academic Success” from the MTSU President.

The Political View:

Tennessee state schools used to be funded based on the number of students enrolled. The more students you had, the more state money you got. Historically, MTSU has become the largest undergraduate-serving institution in our state, in part by having fairly low admission standards and accepting nearly 100% of students who meet them. I’ve heard that students must either have a GPA of at least 2.0 OR an ACT of 19. For reference, college readiness on the ACT clocks in at 24. At MTSU, the average ACT score of students is around 19, and we mostly draw students from our own state, which has the 3rd lowest ACT scores in the country.

As of two years ago, TN state schools are funded not by enrollment but by measures of retention and graduation–a complex formula that takes into consideration how many freshmen become sophomores, how many sophomores become juniors, how many juniors become seniors, and how many seniors graduate, as well as how many students graduate within four years and how many within six. Within this landscape, MTSU is looking for ways to increase retention and graduation. And like many other schools around the country, MTSU has (too) many introductory math and science courses with high percentages of students who receive a grade of D or F or who withdraw (DFW).

So how do we fit into this? Due to a variety of reforms that we’ve put in place in our introductory physics courses, our DFW rates are far below those of other math and science courses on campus. In addition, over the past 15 years, we have grown from graduating 1-2 physics majors per year to 10-12 per year. And our majors excel–achieving very high marks on the Major Field Test and earning a disproportionate number of Goldwater and Fulbright Awards. Within the current political and financial pressures, our department stands out in both retention and graduation, both in how we compare to other departments and in how we have progressed.

The Practical View

All of this has not happened by accident. So what have we actually done? Here is a brief summary.

#1 Introductory Course Reform. Both our algebra-based and calculus-based physics courses are now run in a studio setting, which involves a lot of collaborative problem-solving and class discussion. The one hour of lecture per week uses interactive engagement methods as well (e.g., clickers). In the calculus-based course, we have adopted a research-based text (Matter & Interactions). In the spring, we hope to pilot a new section of algebra-based physics that uses a research-based curriculum as well. Not only have these reforms reduced DFW rates, but it appears that many of our majors now switch to physics after going through our introductory courses.

#2 Physics Majors as TAs. All physics majors are required to serve as undergraduate TAs in our introductory physics courses. In the spring, we hope to pilot an actual LA (Learning Assistant) program.

#3 Full-time Instructors. While many colleges hire a revolving cast of part-time instructors, our department hires and keeps full-time instructors. Most of them have been around longer than I have.

#4 Targeted Recruiting. Our department has been sending out MTSU physics brochures to students with high ACT scores and an indicated interest in math or science. In the mailing, students are asked to go online and fill out a brief survey in order to get an MTSU Physics and Astronomy T-shirt. That information allows us to follow up with students and invite them on campus later on.

#5 Career-focused Tracks. One criticism often leveled at physics departments is that the curriculum focuses only on preparing students for graduate school in physics. Our department seems to be constantly tweaking our curriculum to meet the needs of our students, to attract majors, and to prepare students in diverse ways. Among others, we now have concentrations in Teaching Physics and Applied Physics, which are more job-oriented.

#6 Changing Prerequisites. We recently shifted many of our courses’ prerequisites from merely having to “pass” the prerequisite course to having to earn a C or better. Looking at the statistics, we found that almost all of the students who received D’s in the prerequisite courses ended up failing or withdrawing from our classes. Raising the prerequisites is not likely to hurt enrollment, because we already turn away students due to over-enrollment.

#7 Free Tutoring Services. Our department pays physics majors to staff a tutoring room several days a week. Previously, only minority students could receive free tutoring services; other students had to pay for tutors.

Other Factors:

#1 Leadership. Our department chair deserves a whole lot of the credit here, along with another faculty member in the department who has committed his career to improving courses, curriculum, and student success. Our efforts are sustained and coherent, rather than fleeting and fragmented.

#2 Data-driven Mindset. Our department has been making changes based on data, not just willy-nilly opinions. We analyzed how DFW rates interacted with prerequisites before making changes. Our lower-than-desired FCI gains are the current impetus to adopt a research-based curriculum and LA program. Data from interviews with graduates led to the most recent Applied Physics concentration. Data about the number of physics teachers we’ve prepared and the state’s need for physics teachers spurred our Physics Teaching concentration. While any one person might not like some of the changes we make, or might think we need to make more, our department seems to be engaged in a process of continual improvement, one that is subject to arguments and outcomes based on evidence. That’s where you want any educational community to be.

 

[Back to School] Applications and Registration

I wrote previously about how I am enrolling as a student this fall, taking one course a semester in order to complete a minor in education in four years. Here I begin writing about my experiences:

First Steps:

To get things started, I had to apply as a non-degree-seeking student and get accepted. The process was fairly easy, with only a few snags after filling out the online application. First, I had to pay a library fee from a poster I had printed this summer. That was only challenging because it took me a while to figure out how to pay it: every fee except library fees is paid through your student account, while library fees are paid directly through the library’s website. Second, I had to call several times to find out why I still hadn’t been accepted. It seems that someone just needed to manually click a button to admit me. Because I’m an employee, there are a few hoops I didn’t have to jump through, including having transcripts sent and meeting with an advisor. For registration, I had to wait until after freshman orientation, because they don’t want employees taking seats away from students. There was still space in the class I needed. Finally, I had to get permission from my department chair to take the course, which involved filling out a form justifying why I was taking the course and certifying that it would not interfere with my job responsibilities.

So now I’m enrolled in MSE 1010: Step 1: Inquiry Approaches to Teaching. It is a one-credit introduction to teaching that includes field experiences co-teaching inquiry lessons at the upper-elementary level. I think they teach students about the 5E lesson plan, provide a nearly complete inquiry lesson plan, and have you prepare for teaching with your partner. Students have to do a practice run with a Master Teacher and get approved before being allowed to go to the school. I think you end up in the classroom twice during the semester.

Annoying Undergrad Things

So far the only annoying thing related to being a student has been the selling of my personal information. I now receive junk mail at my home address from lenders and churches in the area. I have looked into trying to have this stopped, but apparently their system is set up so that if I don’t want my information shared, I have to completely remove myself from the directory. For me as a faculty member, that means I wouldn’t have my office number and email listed. I’ve called around, and someone is looking into how to fix the issue, but I’m not confident it will get resolved.

Anyway, classes start next week. I’m hoping to publicly blog about my experiences here using [Back to School] in the title.

Disaggregation of Learning Gains–Please Argue with Me

In a previous post, I brought up the issue of disaggregating FCI learning gains. For some perspective, Coletta, Phillips, and Steinert (2007) looked at disaggregating FCI learning gains by SAT scores. What they find is that students with higher SAT scores learn disproportionately more than students with lower SAT scores. They find correlations around 0.5, and note that the trend is more parabolic than linear. Whether you are more inclined to view SAT scores as a measure of “cognitive ability” (as the authors do) or as a proxy for SES (as I do), I’m becoming increasingly confident that disaggregation is important, and that everyone should be thinking about it.
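
For reference, the “normalized gain” these studies report is Hake’s g, the fraction of the possible improvement a class actually achieves: g = (%post − %pre) / (100 − %pre), where %pre and %post are the class’s average FCI pretest and posttest percentages. A class that moves from 40% to 70%, for example, has g = (70 − 40)/(100 − 40) = 0.5.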

An emerging site that should especially be thinking about this is the PER User’s Guide. At AAPT Summer 2013, Adrian Madsen gave a talk about their new initiative to develop a data explorer.

[Screenshot: PER User’s Guide data explorer]

The last line of their abstract asks for feedback. What I can offer here is a conversation.

Our institution doesn’t use the SAT much, as we are in ACT country. Here is some mock data that somewhat reflects what our institution actually looks like.

[Figure: mock data: distribution of students by ACT score (red) and normalized FCI gain by ACT score (blue)]

The red points on the graph show a broad distribution of students (some with ACT scores below 15, with the majority between 20 and 25), arising from a fairly low barrier to admittance–it’s something like either a 19 on the ACT or a 2.0 GPA. The blue data points show normalized FCI gains broken down by ACT score, and they show the same overall trend that Coletta, Phillips, and Steinert find with the SAT. A colleague of mine who saw this trend said, “The smart get even smarter.” The highlighted horizontal line just indicates what our average normalized gain tends to be, which is the number one typically reports.

What I’d like to do now is walk through some different scenarios where higher FCI gains are achieved in different ways, to further the conversation about why disaggregation matters.

Scenario #1–Improved Learning for Everyone

[Figure: Scenario #1: same student distribution, gains shifted up across all ACT scores]

In this scenario, a course has higher FCI scores, because instruction leads everyone to do better. Students on the high end receive a bump, but importantly so do students on the low end. In this scenario, the distribution of students remains the same. If we were to make changes like this, we’d be improving learning without addressing the learning gap. You can of course imagine ways in which you improve learning and increase the learning gap, by helping the high end students more than the low end.

Scenario #2–Reducing the Learning Gap

[Figure: Scenario #2: gains improve on the low-ACT end, narrowing the gap]

In this scenario, learning gains are better because students with low ACT scores are learning more, without necessarily helping students on the high end. In this case, learning gains improve and the learning gap decreases.

Scenario #3–Becoming More Selective

[Figure: Scenario #3: distribution of students shifted toward higher ACT scores, same gain trend]

In this scenario, higher learning gains are achieved due to a shift in the distribution of students. This might happen at more prestigious institutions, or it might be achieved by having more stringent requirements for enrolling in introductory physics. Or it could be achieved by designing a course to pressure struggling students into withdrawing, so they don’t end up on the post-test. In this scenario, I’m assuming that the “trend” stays the same, but there could easily be interactions between the distribution (red) and learning (blue). For example, a broad distribution may create difficulties with differentiating instruction, while tighter distributions may benefit from more homogeneous populations.
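
To make the three scenarios concrete, here is a toy numerical sketch in Python (every number is invented for illustration, not real ACT or FCI data) showing how each scenario raises the course-average gain through a different mechanism:

```python
import numpy as np

# Toy illustration only -- every number here is invented, not real ACT/FCI data.
act = np.arange(14, 33)  # ACT score bins

# Baseline: gains trend upward with ACT score (roughly the Coletta et al. shape)
base_gain = np.clip(0.15 + 0.02 * (act - 14), 0.0, 0.7)

# Baseline enrollment: broad distribution centered near ACT ~21
weights = np.exp(-0.5 * ((act - 21) / 3.5) ** 2)
weights /= weights.sum()

def avg_gain(gain, dist):
    """Course-average normalized gain, weighted by the student distribution."""
    return float(np.sum(gain * dist))

# Scenario 1: everyone's gain gets a bump; distribution unchanged
s1_gain = base_gain + 0.10

# Scenario 2: only the low-ACT end improves, narrowing the gap
s2_gain = np.where(act < 21, base_gain + 0.15, base_gain)

# Scenario 3: same gain trend, but a more selective (higher-ACT) population
s3_weights = np.exp(-0.5 * ((act - 25) / 2.5) ** 2)
s3_weights /= s3_weights.sum()

print(f"baseline:   {avg_gain(base_gain, weights):.2f}")
print(f"scenario 1: {avg_gain(s1_gain, weights):.2f}")
print(f"scenario 2: {avg_gain(s2_gain, weights):.2f}")
print(f"scenario 3: {avg_gain(base_gain, s3_weights):.2f}")
```

All three scenarios raise the average relative to the baseline, but for entirely different reasons; only the disaggregated curves (and the underlying distribution) tell them apart.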

Discussion 

What I hope I’ve done here is start a conversation that might convince you that NOT disaggregating masks important features of what’s happening in our classrooms. This, I believe, should be important whether you are a staunch believer in “cognitive ability,” a staunch advocate for “social justice,” or both. First off, not disaggregating makes comparisons among institutions difficult. An average learning gain of 0.7 and one of 0.4 can’t be compared meaningfully without disaggregation. If both courses have similar distributions of students, then they can be compared. If one has a distribution of students toward the right end and the other toward the left end, then comparisons need to be done carefully. Second, disaggregation might allow us to better understand how specific institutions have enacted successful reforms (or failed to), and it might help institutions assess their situation and make reforms in a more informed way.
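
To make that concrete with invented numbers: suppose low-ACT students gain 0.3 and high-ACT students gain 0.6 under identical instruction. A course enrolling 80% low-ACT students averages 0.8(0.3) + 0.2(0.6) = 0.36, while a course enrolling 80% high-ACT students averages 0.2(0.3) + 0.8(0.6) = 0.54. The learning within each subgroup is identical; only the mix of students differs.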

Things I’m not saying:

  • I’m not saying, “Students with high ACT scores will learn no matter what.” Findings from PER suggest that even successful students fail to meaningfully learn even basic physics concepts when instruction relies heavily on lecture. What we are talking about here are courses that do enact reformed instruction.
  • I’m not saying, “You should be satisfied with marginal learning gains if you are at an institution with underprepared students”.
  • I’m not saying, “If you have high learning gains, it must be because you teach at a selective school.”

Things I’m wondering about:

  • How should we disaggregate? Based on standardized tests? Poverty measures? Math or reasoning? There are all kinds of ways we can, do, and could disaggregate. But what standards of reporting can we argue are most important and informative? What obstacles are there to reporting in these ways?
  • What are the downsides of disaggregation? I assume there are, and we should think and talk about them.
  • Scenario one is really interesting to me. Why? There are reasons to suspect education communities alone cannot address the fact that the blue dots trend the way they do. (Perhaps we can lessen the steepness?) The truth is that the correlation between poverty and educational achievement is strong–it is robust across scales (e.g., classrooms, schools, states, countries), and robust across shifts in measures (SAT, PISA, FCI, etc.). That said, there are good reasons to believe that some institutions of learning fare better, despite being subject to the same poverty correlations. For example, I’ve seen Michael Marder talk about poverty and education, in which he shows how every state is subject to the poverty trend, but that despite this, the poorest students in states like Texas and Massachusetts achieve about the same as the richest students in Alabama and Mississippi. In other words, it appears you can shift the line up and down vertically.
  • This isn’t to say that Scenario two isn’t important, or that I don’t care about it. I have argued previously (in my previous post, echoing what is really Steve Kanim’s point) that more research needs to be done regarding physics learning in the courses, institutions, and students with less preparation and opportunity. It could be that students with high ACT and SAT scores learn more in most of our reformed environments exactly because those environments are based on research into how those students learn, rather than because under-prepared students are simply less able to learn. In that case one might say, “the smart get smarter, in part because we mostly study how to help the smart get even smarter.”

 

Bundled ABP Standards

OK, a draft of 13 standards for intro physics is below. That’s realistically one per week. The goal here has been to align standards with the high-stakes assessments that will be administered to students during the semester, while supporting their learning in ways that I know matter. There are lots of compromises going on here, and it still needs some adjustments. The standards are grouped by what students will need to have mastered for each of the four high-stakes assessments–exams that focus on problem-solving.

Student Initiated Re-assessments

Students will have to apply in order to reassess, submitting some evidence that they have practiced. I don’t want crazy barriers to reassessment, but I don’t want anyone reassessing without putting in some work to learn first.

Feedback to Students

I intend to give feedback on learning indicators. Units and Vectors are the only standards that don’t have proficiency indicators, since they aren’t separate problem-solving types students will see. I’m leaning toward evaluating on a four-point scale (a rough sketch of the scoring logic follows the list below):

Developing (Minus): At least one learning indicator

Developing: All learning indicators but no problem-solving proficiency

Proficient (Minus): Problem-solving proficiency but not all learning indicators

Proficient: Problem-solving proficiency with all learning indicators (all together)
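
Here is a minimal sketch of how I’m reading my own rubric, assuming the four levels above (the function and its names are invented purely for illustration):

```python
def score_standard(indicators_met, indicators_total, proficient):
    """Map quiz evidence for one standard onto the four-point scale above.

    This is my own sketch of the rubric's logic; the function and its
    names are invented for illustration.
    """
    all_indicators = indicators_met == indicators_total
    if proficient and all_indicators:
        return "Proficient"
    if proficient:
        return "Proficient (Minus)"
    if all_indicators:
        return "Developing"
    if indicators_met >= 1:
        return "Developing (Minus)"
    return "No evidence yet"  # below the scale; not an official level

# Example: all three indicators shown, but the problem solution fell short
print(score_standard(3, 3, proficient=False))  # -> Developing
```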

Assessment Format

The quizzes will typically have a problem to solve and 1-3 conceptual/reasoning questions. Students can get marks for learning indicators either on conceptual questions or in the midst of problem solving. Proficiencies (whether minus or not) will only be given for completely correct problem-solving with adequate work shown to justify credit.

Students can apply for reassessments that I will bring to class, up until the exam that covers the relevant topic. After that, students must apply for reassessment that happens during office hours. This means that I only have to juggle 3-4 standards at a time. High-stakes exams, including the final, can be used as evidence for developing or proficiency.

 

Concerns:

#1 I don’t have a standard for uniform circular motion, but it’s something that will be on the test. I don’t want to have five standards during the time before exam two. While I could get rid of Newton’s Laws (Basic), I don’t think that’s going to help student learning in the long run.

#2 I’ve folded free-fall into constant acceleration. That’s to keep the number of standards low, especially toward the beginning while we are all figuring this out.

#3 Energy and momentum are bundled–super scary. The reason for this is that students are typically asked to solve problems that involve both concepts together. I have to think about this one, but I am inclined to have standards that align with expectations. They’ll still get separate practice and feedback on energy and momentum concepts.

#4 I have previously done binary grading (Y/N), so I’m concerned about the time it will take me to write and grade these. With fine-grained targeted assessments, I graded them the same day and gave them back to students. It was hectic, but doable. That’s probably not possible anymore. So now, I’m thinking self-assessment at the back of the room is the only way to go.

#5 How will I translate this into a portion of the grade I have control over?

#6 Please tell me what else I should be concerned about. Comments, criticisms, concerns, questions are more than welcome.

 

1.1 Units

  • I am familiar with SI units and their prefixes
  • I can correctly re-express quantities using different units
  • I recognize unit cancellations and can simplify expressions involving them.

 

1.2 Constant Velocity

Learning Indicators

  • Distinguish among position, change-in-position, and distance
  • Use and interpret position vs. time graphs
  • Distinguish between average speed and velocity

Proficiency Indicator

  • Solve complex back-and-forth motion problems

 

1.3 Constant Acceleration

Learning Indicators

  • I can relate acceleration, velocity, and change in velocity
  • I can use and interpret velocity vs. time graphs
  • I use a reliable “getting started” method that includes drawing a sketch, choosing a coordinate system, & identifying variables from text/diagrams
  • I can identify the direction of kinematic vector quantities and utilize such information consistently using algebraic sign.

Proficiency Indicator

  • Solve complex problems involving constant acceleration.

 

2.1 Vectors

  • I can determine the components of vectors given magnitude and angle
  • I can describe the magnitude and angle of a vector given its components

 

2.2 Projectiles

Learning Indicators

  • I use a reliable “getting started” method, including drawing a sketch, choosing a coordinate system, and identifying variables from the text.
  • I correctly identify and distinguish dimensions with constant a and constant v
  • I can recognize when vector analysis is needed and can perform it
  • I can apply the independence of dimensions to qualitatively reason about special cases of projectile motion.

Proficiency Indicator

  • I can solve projectile motion problems.

2.3 Newton’s Laws (Basic)

Learning Indicators

  • Recognize when the forces on an object or system are balanced or unbalanced from graphs, equations, or descriptions of motion
  • Draw a force diagram (FBD) accurately showing directions and types of forces acting on an object or system.
  • Write net force equations describing an object or system.

Proficiency Indicator

  • Solve problems using net force equations and diagrams

 

2.4 Newton’s Laws (Advanced)

Learning Indicators

  • Use trigonometric relationships to find force components
  • Recognize when to and be able to apply specific force models (e.g., static friction, kinetic friction, ideal springs, etc).
  • Write net force equations describing an object or system.

Proficiency Indicator

  • Solve problems using net force equations and diagrams

 

 

3.1 Energy and Momentum

 Learning Indicators

  • I can calculate the work due to a force
  • I can recognize situations where mechanical energy is conserved
  • I can write a correct conservation of energy equation
  • I can recognize situations where conservation of momentum applies
  • I can write a correct conservation of momentum equation

 Proficiency Indicator

  • I can solve problems that require conservation of energy & momentum

 

3.2 Static Equilibrium

Learning Indicators

  • I can determine the torque associated with a force around a given pivot
  • I can write a correct sum of torques statement
  • I can write a correct sum of forces statement

Proficiency Indicator

  • I can solve problems involving static equilibrium

 

3.3 Rotational Kinematics

Learning Indicators

  • I can relate frequency, angular frequency, and period
  • I can relate angular displacements, (average) angular velocity, and time
  • I can relate angular velocities, angular accelerations, and time
  • I can relate angular kinematic variables to tangential kinematic variables

 

Proficiency Indicator

  • I can solve rotational kinematic problems

 

4.1 Oscillations

Learning Indicators

  • I can identify amplitude and period in graphs, equations, and pictures
  • I can identify the factors that do and do not influence frequency for both a simple pendulum and a simple mass-spring system
  • I can compare velocity, acceleration, and force for various points along the motion of an object in a simple mass-spring system
  • I can qualitatively analyze the energy transformation for an oscillating system

 Proficiency Indicator

  • I can analyze an oscillating system using kinematics, forces, and/or energy concepts to solve a problem.

 

4.2 Waves

Learning Indicators

  • Relate string length and wavelength for standing waves on a string
  • Reason about and use relationships for wave speed, wavelength, & frequency
  • Reason about & use relationships that relate wave speed to medium properties

Proficiency Indicator

  • I can solve problems involving vibrations among multiple media

 

4.3 Hydrostatics

Learning Indicators

  • I can quantitatively/qualitatively reason about pressure changes in a liquid
  • I can relate pressure, force, and area and recognize the need to do so
  • I can qualitatively reason about buoyant force using Archimedes principle
  • I use Newton’s laws to analyze the statics/dynamics of submerged objects

Proficiency Indicators

  • I can solve hydrostatic problems
