In-Class Formative Assessment for Physics
OVERVIEW
The materials we have developed consist of carefully sequenced sets of multiple-choice items that emphasize
qualitative, conceptual questions (See Figure 1 for a sample). They are designed to maximize student-instructor
interaction and allow rapid assessment of student learning in a large-class environment. This assessment then aids
instructors in structuring and guiding their presentations and class activities.
The design of the materials is based on the assumption that the solution of even very simple physics problems
invariably hinges on a lengthy chain of concepts and reasoning. Our question sequences guide the student to lay
bare these chains of reasoning, and to construct in-depth understanding of physical concepts by step-by-step
engagement with conceptual sticking points. Carefully linked sequences of activities first lead the student to confront
the conceptual difficulties, and then to resolve them. This strategy is based on one developed at the University of
Washington over the past 30 years [1,2,3,4]. Complex physical problems are broken down into conceptual elements,
allowing students to grapple with each one in turn and then return to synthesize a unifying perspective [5].
Over several years the materials have undergone a continuous process of testing and revision in actual
classroom situations. Constant in-class use reveals ambiguous and confusing wording, which is then rapidly
corrected in new versions of the materials. Analysis of assessment data provides additional guidance for revision.
MOTIVATION
(NB: Here and below, only selected, representative references to the physics education research
literature are given. Relevant references to other and earlier work are provided in the Appendix.)
Research in physics education suggests that instructional methods that incorporate in-class problem-solving
activities with rapid feedback can yield improved learning gains, in comparison to traditional lecture methods [5,6].
A key to the success of these methods is that instructional activities should elicit and address common conceptual
difficulties, difficulties that are often uncovered or probed through in-depth research on student understanding
[1,2,3,4]. When students grapple with conceptual issues by thinking about and solving qualitative problems—
problems in which straightforward algebraic procedures may be insufficient (or inefficient) solution methods—
learning and retention have often been observed to improve. Instructional methods that engage students in
problem-solving activities are sometimes called “active-learning” methods. A particular genre of active-learning
methods used in physics has often been referred to by the term “interactive engagement” [6].
INTERACTIVE ENGAGEMENT
Traditionally, instructors (and textbooks) have tended to focus on presenting clearly, precisely, and in
painstaking detail the concepts and techniques they wish their students to learn. The emphasis is on the
thoroughness and clarity of the presentation [4]. However, in recent decades, research into student learning of
physics and other technical subjects has demonstrated that for each new concept or technique to be learned, there
will often be a number of conceptual “sticking points” for the student [4,7]. Moreover, there has been increasing
recognition of the important role of students’ prior (i.e., pre-instruction) knowledge in generating these sticking
points and in providing a basis for their eventual resolution [1,2,3,4,8]. In addition, more attention has been paid
both to the ways in which students’ ideas are linked and organized, and to the nature of students’ approaches
to applying their knowledge and to solving problems [9]. These realizations have led to a revised view of the
instructor’s role.
In this revised view, the central function of the instructor is to direct the focus of class activities and discussion
toward the key sticking points in the students’ thought process, and toward specific weaknesses in the organization
of students’ knowledge. One has to illuminate in a stark and glaring light, so to speak, the phases in the student’s
thought process where a key concept or organizational link may be lacking, so that in the student’s own mind the
gap to be filled is clearly sensed, and the eventual synthesis of the concept or link becomes dramatically apparent.
Since ideally one must determine where a student stands conceptually—in the process of understanding a
particular idea—in order to guide them to the next phase, some form of back-and-forth interchange with them is
essential, even in very large classes. The main focus of instruction is first, to identify the ways in which students are
putting the idea together in their own minds, so as to pinpoint any errors or gaps that may exist; second, to identify
elements of students’ thinking that can potentially form useful and productive components of an improved
understanding; and third, to allow the students to grapple with a question, example, or problem that requires them
to fill out and perfect their understanding. This could be a problem on which they may all work for several minutes,
or instead something as simple as the question: “What is the next step here?” The essential point is to ensure their
active mental participation as thoroughly as is feasible.
The crux of the instructional problem is that students’ minds are not blank slates, and they do not absorb
concepts simply by being told (or shown) that they are true. They must be guided continually to challenge their
present state of understanding, and to resolve conceptual confusion through a process of active engagement [1].
This may occur either by predicting and then personally investigating the outcome of real physical processes in
the instructional laboratory, or by a step-by-step confrontation with conceptual sticking points in the context of a
theoretical example [3]. Promoting student interaction through the use of cooperative groups can aid this process
by having students challenge each other’s understanding, and by encouraging them to help each other deepen
their comprehension of the subject matter. As any teacher knows, articulating one’s thoughts helps improve one’s
own learning.
These considerations regarding student learning have led to the development and implementation of a
variety of instructional methods which, in the context of physics instruction, have often come to be called by the
general term “interactive engagement” [6]. It is particularly challenging to specify what is meant by this term,
in part because it generally refers not simply to specific behaviors by the instructors and the students, but also
to specific aspects of the content of the instructional materials and activities. These aspects of content refer to
features that are explicitly based on consideration of students’ pre-instruction knowledge and of their typical
learning behaviors. Research has suggested that instruction which incorporates certain useful behaviors without
also utilizing appropriate content may fall far short of the outcomes that result from an appropriate combination of
these two key elements [9,10,11].
In view of these considerations, I will outline some of the prominent features of interactive-engagement instruction
in physics. Interactive-engagement instruction generally refers to:
1) Instruction that is informed and guided by knowledge of students’ pre-instruction knowledge state
[1,2,3,4,12,13,14], as well as of their learning trajectory [15,16]. This refers to both their pre-existing ideas and to
their learning tendencies. These tendencies constitute the ways in which students typically attempt to apply their
pre-existing understanding and reasoning processes to issues that emerge during the course of instruction. These
include in particular:
a) Specific student learning difficulties related to particular physics concepts [1,2,3,4,6, 8,12,14,17]
b) Specific student ideas and knowledge elements that are productive and useful in helping them grapple with
new physics concepts [18]
c) Students’ beliefs about what they need to do in order to learn [14, 19]
d) Students’ actual behaviors in the context of the learning process [20]
2) Instruction that guides students to elicit [14] and address specific difficulties typically encountered when
studying new concepts, whether by relatively direct methods (in which students are guided to “confront” these
difficulties [1-4]) or less direct methods (in which students are guided to “refine” their ideas to “reconcile” them to
physics concepts [18]). Other terms that have been applied to this process include “bridging” [21] (i.e., between
more familiar and less familiar concepts) and “weaving” (i.e., of loosely connected initial ideas into more complete
understanding) [22].
3) Instruction that emphasizes having students “figure things out for themselves” [13] to the extent that is practical
and appropriate. This implies that students are guided to reason out concepts and key ideas through a questioning
and discussion process (“guided inquiry”), in contrast to receiving these ideas fully and clearly developed in advance
of their problem-solving activity [1,2,3,4,13,23]. In the initial stages, instructors tend to ask students many questions
rather than provide either direct answers or detailed formulations of generalized principles. Carefully structured
question sequences are often used in this process. (Detailed formulations of general principles may, however, be
appropriate at a later stage of the process.) [3]
4) Instruction that emphasizes having students engage in a wide variety of problem-solving activities during class
time, in contrast to spending most of the time listening to an instructor speak [6, 8].
5) Instruction that leads students to express their reasoning explicitly both in verbal form by interacting with
instructors and other students, and in written form through explanations written as part of responses to quiz,
homework, and exam problems [1,2,3,4,13,14,22,23,24,25,26]. This helps students more clearly expose—and
therefore modify—their own thought processes.
6) Instruction that incorporates students working together in small groups in which they are led both to express
their own thinking, and to comment on and critique each other’s thinking regarding problems and questions posed
for their consideration [3,4,14,17,26].
7) Instruction that ensures that students receive rapid feedback in the course of their problem-solving activity [5,6]
(rapid in the sense of a minute-to-minute time scale). This includes feedback from instructors through frequent
questions and answers, and feedback from fellow students through small-group interaction.
8) Instruction that emphasizes qualitative reasoning and conceptual thinking [1,2,3,4,5,13,14,23,24,25]. Non-
quantitative means of problem solving are used to strengthen students’ understanding of fundamental ideas, and
to avoid having students focus on mastery of mathematical algorithms as a substitute for that understanding.
9) Instruction that seeks to deepen conceptual understanding by posing problems and eliciting solutions in a wide
variety of contexts and representations, incorporating diagrammatic, graphical, pictorial, verbal, and other means
of representing ideas and resolving questions [2,4,5,14,17,22,23,24,25,26,27,28,29,30,31].
Note that this list emphasizes the content of instructional materials and activities (particularly in items 1, 2, 8,
and 9) as much as it does the specific instructional behaviors (such as those in items 3 through 7). It has become clear
that in order to fulfill the objectives of this form of instruction, substantial prior investigation of students’ thinking and
learning behaviors is required. This type of research lays the basis for, in particular, the first two items in the process
outlined above. Instruction that is based on physics education research of this type is often called “research-based”
instruction. Instruction that, by contrast, employs some of the same learning behaviors but in which the content does
not focus on areas identified with specific learning difficulties is not, apparently, as successful.
Several investigations have addressed the issue of ostensibly “interactive,” yet not-very-effective learning
environments within the context of physics education. A common theme is that such less-effective environments
are missing a key element by not addressing students’ actual learning difficulties. (Such difficulties may be
uncovered through research.) In a study by Redish, Saul, and Steinberg [9], even lectures “with much student
interaction and discussion” had little impact on student learning. Hake discusses and analyzes courses supposedly
based on interactive engagement that produced subpar learning results [6]. In her Ph.D. research, Pam Kraus
looked at this issue more systematically [10]. After a lengthy investigation, she arrived at the following conclusion:
In many of our efforts to improve student understanding of important concepts, we have been able to create an
environment in which students are mentally engaged during the lecture. While we have found this to be a necessary
condition for an instructional intervention to be successful, it has not proved sufficient. Of equal importance is the
nature of the specific questions and situations that students are asked to think about and discuss. ([10], p. 286)
Kraus specifies the key criteria she found effective in improving instruction: eliciting students’ preconceptions
with carefully designed questions, guiding them to confront these ideas through appropriate discussion and
debate involving all the students, and leading students to resolve their difficulties with well-chosen alternative
models. A somewhat different approach that has been reported as successful is to guide students to
generate and then test their own explanations for patterns observed in simple experiments [28].
In a careful study reported by Cummings et al. [11], “studio” instruction that involved students working together in
small groups using computers was compared with research-based instruction in a similar environment. They found
that although the studio-physics classrooms appeared to be interactive and students seemed to be engaged in
their own learning, learning outcomes were the same as with traditional instruction. By contrast, introduction of
research-based techniques and activities generated significant gains in conceptual understanding, although the
studio-classroom environment was otherwise the same as before.
It is worth emphasizing that extensive empirical evidence of the instructional effectiveness of these various
techniques has been published both in the references cited, and in many other sources cited in turn by those
references. To choose just one illustrative example, the effectiveness of the elicit-confront-resolve method, as
implemented in the Tutorials developed at the University of Washington [3,37], has been demonstrated repeatedly
by multiple investigators at a variety of institutions, including the use of longitudinal studies, with very consistent
results [45]. Learning gains generated through use of these materials were clearly superior to those achieved with
more traditional instruction. In view of this vast array of direct empirical evidence, the recent finding of only a
“weakly positive” relationship between science achievement and loosely defined “reformed-oriented practices” [46]
must be taken to reflect limitations either of that particular study, or of the specific instructional practices probed
by that investigation.
THE FULLY INTERACTIVE LECTURE
In our classes, students consider a sequence of questions posed by the instructor, discuss their ideas with each other, and provide their responses to the instructor using a classroom communication
system. The instructor makes immediate use of these responses by tailoring the succeeding questions and
discussion to most effectively match the students’ pace of understanding.
In an office or small-group environment, the instructor is relatively easily able to get an ongoing sense of
where the students are “conceptually,” and how well they are following the ideas that are being presented. By
getting continual feedback from them, the instructor is able to tailor his or her presentation to the students’ actual
pace of understanding. The methods we use allow one, to a large extent, to transform the environment of the
lecture hall into that of a small seminar room in which all the students are actively engaged in the discussion.
Our methods begin with a de-emphasis of lecturing. Instead, students are asked to respond to questions
targeted at known learning difficulties. We use a classroom communication system to obtain instantaneous
feedback from the entire class, and we incorporate group work using both multiple-choice and free-response
items. We refer to this method as the “fully interactive lecture” and have described it in detail elsewhere [47]. In the
remainder of this section I give a brief synopsis of this method. (Note: Since this particular project was restricted to
creation of the multiple-choice items, I will not further discuss the free-response items in this paper.)
We ask questions during class and solicit student responses using printed flashcards (containing letters A, B,
C, D, E, and F) or with an electronic “clicker” system. The questions stress qualitative concepts involving comparison
of magnitudes (e.g., “Which is larger: A, B, or C?”), directions (“Which way will it move?”), and trends (“Will it
decrease, remain the same, or increase?”). These kinds of questions are hard to answer by plugging numbers into
an equation.
We give the students some time to consider their response, 15 seconds to several minutes depending on the
difficulty. Then we ask them to signal their response by holding up one of the cards, everybody at once. Immediately,
we can tell whether most of the students have the answer we were seeking—or if, instead, there is a “split vote,” half
with one answer, half with another. If there is a split vote, we ask them to talk to each other. Eventually, if necessary, we
will step in to—we hope—alleviate the confusion. If they haven’t already figured things out by themselves, they will
now at least be in an excellent position to make sense out of any argument we offer to them.
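For instructors who record responses electronically, the tallying step can be automated. The short Python sketch below is an illustration only, not software developed for this project: it counts responses over the six card letters and flags a “split vote”; the 35 percent cutoff, the function names, and the sample class data are assumptions made for the example.

```python
# Illustrative sketch (not part of the project's materials): tally flash-card or
# clicker responses and flag a "split vote" that warrants peer discussion.
from collections import Counter

CHOICES = "ABCDEF"  # the six letters printed on the flash cards

def tally(responses):
    """Count how many students chose each letter; ignore blank or invalid entries."""
    counts = Counter(r.upper() for r in responses if r.upper() in CHOICES)
    return {letter: counts.get(letter, 0) for letter in CHOICES}

def is_split_vote(counts, threshold=0.35):
    """Call the vote 'split' if the two most popular answers each draw at least
    `threshold` of the valid responses (the 35 percent cutoff is an assumed value)."""
    total = sum(counts.values())
    if total == 0:
        return False
    top_two = sorted(counts.values(), reverse=True)[:2]
    return all(c / total >= threshold for c in top_two)

if __name__ == "__main__":
    votes = list("AAABBBABBAABBBAAAB")   # hypothetical class of 18 students
    counts = tally(votes)
    print(counts)                        # {'A': 9, 'B': 9, 'C': 0, 'D': 0, 'E': 0, 'F': 0}
    print(is_split_vote(counts))         # True -> ask the students to talk to each other
```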
The time allotted per question varies, leading to a rhythm similar to that of one-on-one tutoring. The
questions emphasize qualitative reasoning, to reduce “equation-matching” behavior and to promote deeper
thinking. Questions in a sequence progress from relatively simple to more challenging. They are closely linked to
each other to explore just one or two concepts from a multitude of perspectives, using a variety of representations
such as diagrams, graphs, pictures, words, and equations. We maintain a small conceptual “step size” between
questions for high-precision feedback on student understanding, which allows more precise fine-tuning of the
class discussion. In line with this objective, we employ a large proportion of “easy” questions, that is, questions to
which more than 80 percent of students respond correctly.
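A minimal sketch of how recorded response data might be screened against this criterion is shown below; the data, item labels, and function name are invented for illustration and do not come from the project’s materials.

```python
# Hedged sketch: compute the correct-response rate for each item in a hypothetical
# question sequence and label items that meet the "easy" criterion (>80% correct).

def response_rate(responses, correct_choice):
    """Fraction of students who selected the keyed (correct) answer."""
    if not responses:
        return 0.0
    return sum(1 for r in responses if r == correct_choice) / len(responses)

# Hypothetical sequence: (item label, keyed answer, recorded student responses)
sequence = [
    ("Q1", "B", ["B"] * 42 + ["A"] * 5 + ["C"] * 3),
    ("Q2", "A", ["A"] * 38 + ["C"] * 12),
    ("Q3", "C", ["C"] * 27 + ["B"] * 15 + ["A"] * 8),
]

for label, key, responses in sequence:
    rate = response_rate(responses, key)
    tag = "easy" if rate > 0.80 else "challenging"
    print(f"{label}: {rate:.0%} correct ({tag})")
```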
We find that easy questions build confidence, encourage student participation, and are important signals to
the instructor of students’ current knowledge baseline. Often enough, questions thought by the instructor to be
simple turn out not to be, requiring some backtracking. Because of that inherent degree of unpredictability, some
proportion of the questions asked will turn out to be quite easy for the students. If the discussion bogs down due
to confusion, it can be jump-started with easier questions. The goal is to maintain a continuous and productive
discussion with and among the students.
Many question variants are possible. Almost any physics problem may be turned into an appropriate
conceptual question. By using the basic question paradigms “increase, decrease, remain the same,” “greater than,
less than, equal to,” and “left, right, up, down, in, out,” along with obvious variations, it is possible to rapidly create
many questions that probe students’ qualitative thinking about a system. By introducing minor alterations in a
physical system (adding a force, increasing a resistance, etc.), students can be guided to apply their conceptual
understanding in a variety of contexts. In this way, the instructor is able to provide a vivid model of the flexible and
adaptive mental approach needed for active learning.
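As a rough illustration of the paradigm idea (not an excerpt from the Workbook or the project’s materials), the following sketch pairs an invented scenario prompt with one of the standard answer paradigms to produce a multiple-choice question stem; the dictionary and function names are assumptions.

```python
# Standard answer paradigms named in the text; the prompts below are invented examples.
PARADIGMS = {
    "trend":      ["increase", "decrease", "remain the same"],
    "comparison": ["greater than", "less than", "equal to"],
    "direction":  ["left", "right", "up", "down", "in", "out"],
}

def make_question(prompt, paradigm):
    """Attach the standard answer choices for the chosen paradigm to a scenario prompt."""
    letters = "ABCDEF"
    options = ", ".join(f"({letters[i]}) {choice}"
                        for i, choice in enumerate(PARADIGMS[paradigm]))
    return f"{prompt} {options}"

if __name__ == "__main__":
    print(make_question("If the resistance of bulb 2 is increased, will the brightness of bulb 1:", "trend"))
    print(make_question("Is the magnitude of the force on block A ... the force on block B?", "comparison"))
```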
The development and validation of the question sequences is the central task of this project. Many question
sequences are needed to cover the full range of topics in the physics course curriculum. (Other materials needed
for interactive lecture instruction include free-response worksheets and text reference materials, but these are
under development as part of separate projects.)
RESULTS OF ASSESSMENT
In earlier projects related to this one, we have carried out extensive assessment of student learning. We found
that learning gains on qualitative problems were well above national norms for students in traditional courses,
at the same time that performance on quantitative problems was comparable to (or slightly better than) that of
students in traditional courses [47]. These findings are typical of other research-based instructional methods [4,8].
In our interview-based validation process, students are asked to work through the sequence of questions, explaining their
reasoning as they go, while the interviewer examines the details of the student’s thinking with gently probing
questions. This process can be very effective in 1) uncovering confusing or ambiguous language and word usage;
2) confirming that the students interpret the meaning of the question in the manner intended; and 3) determining
whether the students make any tacit assumptions intended by the question (e.g., no external electric field), and do
not impose any unintended assumptions (e.g., a need to consider very weak forces). The outcome of this process is
to substantially strengthen the quality and utility of the collection of assessment items as a whole. Our data from
this phase of the project are as yet only preliminary, but we hope to significantly expand this aspect of the work in
the future.
One of the goals of this project was to record student responses to each of the assessment items, including
those items already developed and class tested, as well as the items that were developed as a result of the present
project. These response data will provide a baseline benchmark for comparison when other instructors make use
of the assessment materials, and will assist other instructors in planning and interpreting the use of the materials.
Samples of these data (obtained at Iowa State University) are shown in Figure 1. They illustrate that correct
response rates on the first few questions in a given sequence are relatively high (80 percent or greater); as the
sequence progresses to more challenging items, response rates can drop to the 50 to 70 percent level or less. It is
these more challenging questions that usually generate the most productive discussions.
As part of previous projects, initial versions of question sequences for topics in electricity and magnetism,
optics, and modern physics had been created. During the present project, we have worked on additional materials
for magnetism and modern physics, as well as materials for selected topics in mechanics. Ultimately, we intend to
complete question sequences for the full two-semester introductory physics course.
CONCLUSION
Although the methods described here have focused on physics instruction, it is clear that they have broad
potential applicability to a wide variety of technical fields. As may be verified in part by consulting the rapidly
expanding list of citations [51] to Crouch and Mazur’s paper on peer instruction [34], similar methods have been
embraced and found useful by, among others, astronomers, geoscientists, physiologists, chemists, engineers, and
computer scientists.
ACKNOWLEDGMENTS
Thomas J. Greenbowe assumed responsibilities as Principal Investigator for this project upon the author’s
move to the University of Washington.
Much of the preliminary work on this project was carried out by Ngoc Loan P. Nguyen, a former graduate
student at Iowa State University. Mr. Nguyen died unexpectedly in November 2005 as a result of a sudden illness.
This was a devastating loss both personally for this author and for the ongoing work of this project, the completion
of which is now significantly delayed.
This work has been supported in part by NSF DUE-0243258, DUE-0311450, PHY-0406724, and PHY-0604703.
APPENDIX
The Physical Science Study Committee project, initiated in 1956 by MIT physicists Jerrold Zacharias and
Francis Friedman, was one of the first steps in the modern process of reform in science education [52]. Eventually involving a broad array of world-famous
physicists, this project resulted in a dramatic rethinking of the high-school physics curriculum and generated a
new textbook [53], along with ancillary curricular materials. The new curriculum was distinguished by a greatly
increased emphasis—in contrast to traditional curricula—on communicating a deep conceptual understanding of
the broad themes of physical principles. It represented a rejection of traditional efforts that had relied heavily on
memorization of terse formulations and “cookbook”-style instructional laboratories.
Further catalyzed by the launch of Sputnik in 1957, and with strong funding support by the National Science
Foundation (NSF), similar curriculum development efforts were initiated by chemists (in 1957), biologists (in 1959),
mathematicians (also in 1959, although preliminary efforts had started in 1952), and earth scientists (in 1962) [54].
A joint conference in 1959 sponsored by the National Academy of Sciences brought the scientists together with
prominent psychologists and educators such as Harvard’s Jerome Bruner and Piaget collaborator Bärbel Inhelder
[55]. General pedagogical principles that emerged from these discussions were enunciated by Bruner [56], Joseph
Schwab [57], and others. Soon, the reform effort expanded to include the elementary schools and, backed by the
NSF, an explosion of more than a dozen new science curricula aimed at younger students was generated [58].
Prominent physicists again played a central role in several of these curriculum reform projects, notably including
Cornell’s Philip Morrison (in the “Elementary Science Study” project [59]) and Berkeley’s Robert Karplus (a key leader
in the “Science Curriculum Improvement Study” [60]). Beginning in the late 1960s and early 1970s, these instructional
methods were put into action at the university level by the Washington group led by Arnold Arons [61] and Lillian
McDermott [62,63]. In these early efforts, Arons and McDermott put great emphasis on the need for students to
formulate and express reasoned responses in written or verbal form to questions that they themselves raised during
instruction. Initially, these efforts focused on improving the preparation of prospective K-12 science teachers.
Prominent in all of these efforts was a strong emphasis on learning through guided inquiry (sometimes
called “discovery”), utilizing the investigational process of science as a means of teaching scientific concepts
themselves [57]. In this process, students would be expected to engage in “discovery of regularities of previously
unrecognized relations” [64]. The notion that instructors could guide students through a process of discovery was
expressed in the three-phase “learning cycle” propounded by Robert Karplus [65]. In this cycle, students’ initial
exploration activities led them (with instructor guidance) to grasp generalized principles (concepts) and then to
apply these concepts in varied contexts. These ideas of inquiry-based “active” learning could themselves be traced
back to workers who came much earlier, including Piaget [66] and his followers, and to proponents of the ancient
notions of Socratic dialogue. Piaget’s emphasis on the importance of explicitly cultivating reasoning processes that
employed hypothesis formation, proportional reasoning, and control of variables later had an enormous influence
on both physics and chemistry educators [67].
Inspired in part by Piaget’s earlier groundbreaking investigations, science educators began to perceive the
pedagogical importance of the ideas that students brought with them to class. Piaget had emphasized that new
ideas being learned had to be “accommodated,” in a sense, by a student’s already-existing ideas [66]. As Bruner put
it, the learning process at first involves “acquisition of new information—often information that runs counter to or
is a replacement for what the person has previously known implicitly or explicitly. At the very least it is a refinement
of previous knowledge” [68]. Later, researchers began systematic efforts to probe students’ thinking on a variety
of science topics, initially at the elementary and secondary levels [69]. In the late 1970s, Viennot [70] in France,
and McDermott and her students in the United States [71], were among the very first to systematically investigate
understanding of science concepts by students enrolled in university-level courses. These investigations led
immediately to the development and implementation of research-based instructional methods and curricula.
McDermott’s research formed the basis for development of curricular materials that explicitly addressed students’
pre-instruction ideas. The research-based materials guided students both to elicit their pre-instruction ideas, and
then to carry out the thinking processes needed to resolve conceptual and reasoning difficulties that emerged
during the instructional process. By doing this research and then bringing to bear on university-level science
instruction the pedagogical perspectives and methods employed earlier for younger students, McDermott and
other physicist-educators “closed the circle.” They had laid the foundation for an ongoing process of research and
reform in science education that could engage all participants in the process from the elementary grades on
through graduate school. It is on this foundation that the present project is built.
Figure 1. Excerpts from a sequence of “flash-card” questions for interactive lecture, showing student
response rates obtained at Iowa State University. The excerpts consist of three (non-consecutive) pages
from Chapter 3 of the Workbook for Introductory Physics by D. E. Meltzer and K. Manivannan.
REFERENCES
[1] McDermott, L.C., “Millikan Lecture 1990: What we teach and what is learned – closing the gap,” American
Journal of Physics, vol. 59, pp. 301-315, 1991.
[2] McDermott, L.C., “Guest comment: How we teach and how students learn – a mismatch?” American Journal of
Physics, vol. 61, pp. 295-298, 1993.
[3] McDermott, L.C., “Bridging the gap between teaching and learning: The role of research,” in Redish, E.F., and
Rigden, J.S., (eds.), The Changing Role of Physics Departments in Modern Universities: Proceedings of the
International Conference on Undergraduate Physics Education, AIP Conference Proceedings 399, (AIP, Woodbury,
New York), pp. 139-165, 1997.
[4] McDermott, L.C., “Oersted Medal Lecture 2001: Physics Education Research—The key to student learning,”
American Journal of Physics, vol. 69, pp. 1127-1137, 2001.
[5] Reif, F., “Millikan Lecture 1994: Understanding and teaching important scientific thought processes,” American
Journal of Physics, vol. 63, pp. 17-32, 1995.
[6] Hake, R. R., “Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics
test data for introductory physics courses,” American Journal of Physics, vol. 66, pp. 64-74, 1998.
[7] Heron, P.R.L., “Empirical investigations of learning and teaching, Part I: Examining and interpreting student
thinking,” in Redish, E.F., and Vicentini, M., (eds.), Research on Physics Education: Proceedings of the
International School of Physics “Enrico Fermi” Course CLVI, (IOS, Amsterdam), pp. 341-350, 2004; “Part II:
Developing research-based instructional materials,” Ibid., pp. 351-365; Meltzer, D.E., “The questions we ask and
why: methodological orientation in physics education research,” in Marx, J., Franklin, S., and Cummings, K.,
(eds.), Physics Education Research Conference, Madison, Wisconsin, 6-7 August 2003, AIP Conference Proceedings
720, (AIP, Melville, New York), pp. 11-14, 2004; Heron, P.R.L., “Empirical investigations of student understanding,”
Ibid., pp. 15-18.
[8] Redish, E.F., and Steinberg, R.N., “Teaching physics: figuring out what works,” Physics Today, vol. 52, no. 1, p. 24,
1999.
[9] Redish, E F., Saul, J.M, and Steinberg, R.N., “On the effectiveness of active-engagement microcomputer-based
laboratories,” American Journal of Physics, vol. 65, pp. 45-54, 1997.
[10] Kraus, P.A., Promoting Active Learning in Lecture-based Courses: Demonstrations, Tutorials, and Interactive Tutorial
Lectures, Ph.D. dissertation, University of Washington, University Microfilms, UMI Number 9736313, 1997.
[11] Cummings, K., Marx, J., Thornton, R., and Kuhl, D., “Evaluating innovations in studio physics,” American Journal
of Physics, vol. 67, pp. S38-S44, 1999.
[12] Halloun, I.A., and Hestenes, D., “The initial knowledge state of college physics students,” American Journal of
Physics, vol. 53, pp. 1043-1055, 1985.
[13] Hake, R.R., “Promoting student crossover to the Newtonian world,” American Journal of Physics, vol. 55, pp. 878-
884, 1987.
[14] Goldberg, F., and Bendall, S., “Making the invisible visible: A teaching/learning environment that builds on a
new view of the physics learner,” American Journal of Physics, vol. 63, pp. 978-991, 1995.
[15] Thornton, R.K., “Conceptual Dynamics: Following changing student views of force and motion,” in Redish, E.F.,
and Rigden, J.S., (eds.), The Changing Role of Physics Departments in Modern Universities: Proceedings of the
International Conference on Undergraduate Physics Education, AIP Conference Proceedings 399, (AIP, Woodbury,
New York), pp. 241-266, 1997.
[16] Meltzer, D.E., “How do you hit a moving target? Addressing the dynamics of students’ thinking”, in Marx, J.,
Heron, P.R.L., and Franklin, S., (eds.), Physics Education Research Conference, Sacramento, Calif., 4-5 August
2004, AIP Conference Proceedings 790, (AIP, Melville, New York), pp. 7-10, 2005.
[17] Hake, R.R., “Socratic pedagogy in the introductory physics laboratory,” The Physics Teacher, vol. 30, pp. 546-552,
1992.
[18] Elby, A., “Helping physics students learn how to learn,” American Journal of Physics, vol. 69, pp. S54-S64, 2001.
[19] Redish, E. F., Saul, J.M., and Steinberg, R.N., “Student expectations in introductory physics,” American Journal
of Physics, vol. 66, pp. 212-224, 1998; Hammer, D., “Student resources for learning introductory physics,”
American Journal of Physics, vol. 68, pp. S52-S59, 2001; May, D. B., and Etkina, E., “College physics students’
epistemological self-reflection and its relationship to conceptual learning,” American Journal of Physics, vol.
70, pp. 1249-1258, 2002; Adams, W.K., Perkins, K.K., Podolefsky, N.S., Dubson, M., Finkelstein, N.D., and Wieman,
C.E., “New instrument for measuring student beliefs about physics and learning physics: The Colorado
Learning Attitudes about Science Survey,” Physical Review Special Topics–Physics Education Research, vol. 2, pp.
010101-1–010101-14, 2006.
[20] Thornton, R.K., “Uncommon knowledge: Student behavior correlated to conceptual learning,” in Redish, E.F.,
and Vicentini, M., (eds.), Research on Physics Education: Proceedings of the International School of Physics “Enrico
Fermi” Course CLVI, (IOS, Amsterdam), pp. 591-601, 2004.
[21] Clement, J.J., “Using bridging analogies and anchoring intuitions to deal with students’ preconceptions in
physics,” Journal of Research in Science Teaching, vol. 30, pp. 1241-1257, 1993.
[22] Minstrell, J.A., “Teaching science for understanding”, in Resnick, L.B., and Klopfer, L.E., (eds.), Toward the
Thinking Curriculum: Current Cognitive Research, (Association for Supervision and Curriculum Development,
Alexandria, Va.), pp. 129-149, 1989.
[23] Arons, A.B., “Cultivating the capacity for formal reasoning: Objectives and procedures in an introductory
physical science course,” American Journal of Physics, vol. 44, pp. 834-838, 1976.
[24] Redish, E.F., “Implications of cognitive studies for teaching physics,” American Journal of Physics, vol. 62, pp.
796-803, 1994.
[25] Leonard, W.J., Dufresne, R.J., and Mestre, J.P., “Using qualitative problem-solving strategies to highlight the role
of conceptual knowledge in solving problems,” American Journal of Physics, vol. 64, pp. 1495-1503, 1996.
[26] Heller, P., Keith, R., and Anderson, S., “Teaching problem solving through cooperative grouping. Part 1: Group
versus individual problem solving,” American Journal of Physics, vol. 60, pp. 627-636, 1992; Heller, P., and
Hollabaugh, M., “Teaching problem solving through cooperative grouping. Part 2: Designing problems and
structuring groups,” American Journal of Physics, vol. 60, pp. 637-644, 1992.
[27] McDermott, L.C., “A view from physics,” in Gardner, M., Greeno, J.G., Reif, F., Schoenfeld, A.H., diSessa, A., and
Stage, E., (eds.), Toward a Scientific Practice of Science Education, (Erlbaum, Hillsdale, N.J.), pp. 3-30, 1990.
[28] Etkina, E., and Van Heuvelen, A., “Investigative Science Learning Environment: Using the processes of science
and cognitive strategies to learn physics,” in Franklin, S., Marx, J., and Cummings, K., (eds.), Proceedings of the
2001 Physics Education Research Conference, (PERC Publishing, Rochester, N.Y.), pp. 17-21, 2001; Etkina, E.,
Murthy, S., and Zou, X., “Using introductory labs to engage students in experimental design,” American Journal
of Physics, vol. 74, pp. 979-986, 2006.
[29] Van Heuvelen, A., “Learning to think like a physicist: A review of research-based instructional strategies,”
American Journal of Physics, vol. 59, pp. 891-897, 1991.
[30] Hestenes, D., “Modeling methodology for physics teachers,” in Redish, E.F., and Rigden, J.S., (eds.), The
Changing Role of Physics Departments in Modern Universities: Proceedings of the International Conference on
Undergraduate Physics Education, AIP Conference Proceedings 399, (AIP, Woodbury, New York), pp. 241-266,
1997.
[31] Meltzer, D.E., “Relation between students’ problem-solving performance and representational format,”
American Journal of Physics, vol. 73, pp. 463-478, and references therein, 2005.
[32] Van Heuvelen, A., “Overview, case study physics,” American Journal of Physics, vol. 59, pp. 898-907, 1991.
[33] Mazur, E., Peer Instruction: A User’s Manual, (Prentice Hall, Upper Saddle River, N.J.), 1997.
[34] Crouch, C.H., and Mazur, E., “Peer instruction: Ten years of experience and results,” American Journal of Physics,
vol. 69, pp. 970-977, 2001.
[35] Sokoloff, D. R., and Thornton, R.K., “Using interactive lecture demonstrations to create an active learning
environment,” The Physics Teacher, vol. 35, no. 10, pp. 340-347, 1997.
[36] Novak, G.M., Patterson, E.T., Gavrin, A.D., and Christian, W., Just-In-Time Teaching: Blending Active Learning with
Web Technology, (Prentice Hall, Upper Saddle River, N.J.), 1999.
[37] McDermott, L.C., Shaffer, P., and the Physics Education Group, Tutorials in Introductory Physics, (Prentice Hall,
Upper Saddle River, N.J.), 2002.
[38] Dufresne, R.J., Gerace, W.J., Leonard, W.J., Mestre, J.P., and Wenk, L., “Classtalk: A classroom communication
system for active learning,” J. Comp. Higher Educ., 7, pp. 3-47, 1996; Mestre, J. P., Gerace, W.J., Dufresne,
R.J., and Leonard, W.J., “Promoting active learning in large classes using a classroom communication system,”
in Redish, E.F., and Rigden, J.S., (eds.), The Changing Role of Physics Departments in Modern Universities:
Proceedings of the International Conference on Undergraduate Physics Education, AIP Conference Proceedings 399,
(AIP, Woodbury, New York), pp. 241-266, 1997; Wenk, L., Dufresne, R., Gerace, W., Leonard, W., and Mestre, J.,
“Technology-assisted active learning in large lectures”, in McNeal, A.P., and D’Avanzo, C., (eds.), Student-Active
Science, Models of Innovation in College Science Teaching, (Saunders College Publishing, Fort Worth, Tex.),
pp. 431-452, 1997; Beatty, I.D., Gerace, W. J., Leonard, W.J., and Dufresne, R.J., “Designing effective questions for
classroom response system teaching,” American Journal of Physics, vol. 74, pp. 31-39, 2006.
[39] Poulis, J., Massen, C., Robens, E., and Gilbert, M., “Physics lecturing with audience paced feedback,” American
Journal of Physics, 66, pp. 439-441, 1998.
[40] Shapiro, J.A., “Electronic student response found feasible in large science lecture hall”, J. Coll. Sci. Teach., vol. 26,
pp. 408-412, 1997.
[41] Burnstein, R. A., and Lederman, L.M., “Using wireless keypads in lecture classes,” The Physics Teacher, vol. 39,
pp. 8-11, 2001.
[42] Lenaerts, J., Wieme, W., and Van Zele, E., “Peer Instruction: A case study for an introductory magnetism course,”
European Journal of Physics, vol. 24, pp. 7-14, 2003.
[43] Reay, N.W., Bao, L., Li, P., Warnakulasooriya, R., and Baugh, G., “Toward the effective use of voting machines in
physics lectures,” American Journal of Physics, vol. 73, pp. 554-558, 2005.
[44] Beichner, R. J., Saul, J.M., Allain, R.J., Deardorff, D.L., and Abbott, D.S., “Introduction to SCALE-UP: Student-
centered activities for large enrollment university physics,” in Proceedings of the 2000 Annual meeting of the
American Society for Engineering Education, Session 2380, 2000. http://www2.ncsu.edu/ncsu/pams/physics/Physics_Ed/index.html
[45] Numerous papers (more than a dozen) by the Washington group itself have documented learning gains
resulting from this method, as cited for instance in references 1 through 4 above. Confirmatory evidence in
replication experiments has been published in a longitudinal study by investigators at Montana State
University [Francis, G. E., Adams, J.P., and Noonan, E.J., “Do they stay fixed?” The Physics Teacher, vol. 36,
pp. 488-490, 1998.], as well as in other studies at the University of Colorado [Finkelstein, N.D., and Pollock, S.J.,
“Replicating and understanding successful innovations: Implementing tutorials in introductory physics,”
Physical Review Special Topics–Physics Education Research 1: 010101-1–010101-13, 2005], and the University of
Cincinnati [Koenig, K.M., and Endorf, R.J., “Study of TAs’ ability to implement the Tutorials in Introductory
Physics and student conceptual understanding,” in Marx, J., Franklin, S., and Cummings, K., (eds.), 2003 Physics
Education Research Conference, Madison, Wisconsin, 6-7 August 2003, AIP Conference Proceedings 720, (AIP,
Melville, N.Y.), pp. 161-164, 2004.].
[46] Le, V-N, Stecher, B.M., Lockwood, J.R., Hamilton, L.S., Robyn, A., Williams, V.L., Ryan, G., Kerr, K.A., Martinez, J.F.,
and Klein, S.P., Improving Mathematics and Science Education, (RAND Corporation, Santa Monica, Calif.), 2006.
[47] Meltzer, D.E., and Manivannan, K., “Transforming the lecture-hall environment: The fully interactive physics
lecture,” American Journal of Physics, vol. 70, pp. 639-654, 2002.
[48] McDermott, L.C., and Redish, E.F., “Resource Letter: PER-1: Physics Education Research,” American Journal of
Physics, vol. 67, pp. 755-767, 1999.
[49] The model for the interview techniques we employ lies in the “individual demonstration interviews”
pioneered at the University of Washington in the late 1970s. See, e.g., references 1 through 4. This interview
format in turn was substantially motivated by the “clinical interview” technique employed by Piaget in the
early 1900s; see, e.g., Piaget, J., The Child’s Conception of the World, (Littlefield, Adams, Patterson, N. J.), 1963.
[50] For instance, see: Meltzer, D.E. “Investigation of students’ reasoning regarding heat, work, and the first law of
thermodynamics in an introductory calculus-based general physics course,” American Journal of Physics, vol.
72, pp. 1432-1446, 2004.
[52] Finley, G., “The Physical Science Study Committee,” The School Review, vol. 70, pp. 63-81, 1962.
[54] Deboer, G.E., A History of Ideas in Science Education: Implications for Practice, (Teachers College Press, New
York), Chaps. 7 and 8., 1991.
[56] Bruner. J., The Process of Education, (Harvard University Press, Cambridge, Mass.), 1960.
[57] Schwab, J.J., “The teaching of science as enquiry,” in Schwab, J.J., and Brandwein, P.F., The Teaching of Science,
(Harvard University Press, Cambridge, Mass.), pp. 1-103, 1962.
[58] Hurd, P.D., and J.J. Gallagher, New Directions in Elementary Science Teaching, (Wadsworth, Belmont, Calif.), 1969.
[59] Morrison, P., and Walcott, C., “Enlightened opportunism: An informal account of the Elementary Science
Summer Study of 1962,” Journal of Research in Science Teaching, vol. 1, pp. 48-53, 1963.
[60] Karplus, R., “The Science Curriculum Improvement Study,” Journal of Research in Science Teaching, vol. 2, pp.
293-303, 1964.
[61] Arons, A.B., “Cultivating the capacity for formal reasoning: Objectives and procedures in an introductory
physical science course,” American Journal of Physics, vol. 44, pp. 834-838, 1976.
[62] McDermott, L.C., “Combined physics course for future elementary and secondary school teachers,” American
Journal of Physics, vol. 42, pp. 668-676; “Practice-teaching program in physics for future elementary school
teachers,” American Journal of Physics, vol. 42, pp. 737-742, 1974.
[63] McDermott, L.C., “Teacher education and the implementation of elementary science curricula”, American
Journal of Physics, vol. 44, pp. 434-441, 1976.
[65] Atkin, J. M., and Karplus, R., “Discovery or invention,” The Science Teacher, vol. 29, pp. 45-51, 1962; Karplus, R., “Science
teaching and the development of reasoning,” Journal of Research in Science Teaching, vol. 14, pp. 169-175, 1977.
[66] Piaget, J., “The new methods: Their psychological foundations (1935),” in Science of Education and the
Psychology of the Child, (Orion, New York), pp. 139-180, 1970.
[67] McKinnon, J.W., and Renner, J.W., “Are colleges concerned with intellectual development?” American Journal
of Physics, vol. 39, pp. 1047-1052, 1971; Herron, J.D., “Piaget for chemists,” Journal of Chemical Education, vol.
52, pp. 146-150, 1975.
[69] Driver, R.P., “The Representations of Conceptual Frameworks in Young Adolescent Science Students,” Ph.D.
dissertation, University of Illinois at Urbana-Champaign, University Microfilms, UMI Number 7412006, 1973;
Nussbaum, J., and Novak, J.D., “An assessment of children’s concepts of the Earth utilizing structured
interviews,” Science Education, vol. 60, pp. 535-550, 1976; Novick, S., and Menis, J., “A study of student
perceptions of the mole concept,” Journal of Chemical Education, vol. 53, pp. 720-722, 1976. Following on
these early studies, an enormous explosion of work on “alternative conceptions in science” arose and
expanded throughout the field of K-12 science education. This is documented in exhaustive detail in:
Wandersee, J.H., Mintzes, J.J., and Novak, J.D., “Research on alternative conceptions in science,” in Gabel, D.L.,
(ed.), Handbook of Research on Science Teaching and Learning, pp. 177-210, (Macmillan, New York), 1994.
[70] Viennot, L., “Spontaneous reasoning in elementary dynamics,” European Journal of Science Education, vol. 1, pp.
205-221, 1979.
[71] Trowbridge, D.E., and McDermott, L.C., “Investigation of student understanding of the concept of velocity in
one dimension,” American Journal of Physics, vol. 48, pp. 1020-1028, 1980.