Which mathematics research-based assessment should I use in my physics class?
This recommendation initially appeared as an article in the American Journal of Physics: A. Madsen, S. B. McKagan, E. C. Sayre, and C. A. Paul, Resource Letter RBAI-2: Research-based assessment instruments: Beyond physics topics. Am. J. Phys. 87, 350 (2019).
RBAIs for mathematics can be used in physics classes to assess students’ level of math readiness for a given physics class, or to assess students’ understanding of math topics that are covered in physics classes. These tests are often used in concert with, or instead of, locally developed mathematics placement exams. We discuss three mathematics assessments, developed by mathematics education researchers, that you can use before instruction to get a sense of students’ prerequisite mathematics skills and to assess calculus readiness. You could also use these as pre- and post-tests to see how your students’ calculus skills improved because of your course. These are the Precalculus Concept Assessment (PCA) (Carlson, Oehrtman, and Engelke 2010), the Calculus Concept Inventory (CCI) (Epstein 2007; Epstein 2003), and the Calculus Concept Readiness Instrument (CCR) (Carlson, Madison, and West 2015). There are three additional assessments, developed by physicists, that assess mathematics topics often taught in physics classes, namely vectors and mathematical modeling. These are the Quadratic and Linear Conceptual Evaluation (QLCE) (Thornton 2006), the Test of Understanding of Vectors (TUV) (Barniol and Zavala 2014), and the Vector Evaluation Test (VET) (Thornton 2006). You can use these as pre- and post-tests, both to get a sense of what your students know at the start of your course and to see what they learned because of your course. Other tests exist (e.g., the Basic Skills Diagnostic Test (BSDT) (Epstein 1997)), but there is no published information available about them, their developers are unavailable for consultation, and/or we cannot access the assessments.
| Title | Content | Intended Population | Research Validation | Purpose |
|---|---|---|---|---|
| Calculus Concept Inventory (CCI) | Derivatives, functions, limits, ratios, the continuum | Intro college, high school | Gold | Assess student understanding of the most basic principles of calculus. |
| Precalculus Concept Assessment (PCA) | Rate of change, function, process view of functions, covariational reasoning | | | Assess essential knowledge that mathematics education research has revealed to be foundational for students’ learning and understanding of the central ideas of beginning calculus. |
| Calculus Concept Readiness Instrument (CCR) | The function concept, trigonometric functions, and exponential functions | | | Assess the effectiveness of pre-calculus level instruction, or serve as a placement test for entry into calculus. |
| Test of Understanding of Vectors (TUV) | Vectors, components, unit vector, vector addition, subtraction and multiplication, dot and cross product | | | Assess students’ understanding of vector concepts in problems without a physical context. |
| Quadratic and Linear Conceptual Evaluation (QLCE) | Graphing, mathematical modeling | Intro college, high school | | Measure student understanding of equations (linear and quadratic) as functional relationships, and measure students’ mathematical knowledge in both traditional and reform courses. |
| Vector Evaluation Test (VET) | Vector addition and subtraction, component analysis, and comparing magnitudes | Intro college, high school | | Measure students’ conceptual understanding of vectors. |
Precalculus Concept Assessment (PCA)
The Precalculus Concept Assessment (PCA) (Carlson, Oehrtman, and Engelke 2010) is a multiple-choice pre/post assessment of foundational concepts of beginning calculus, including reasoning abilities around the process view of functions, covariational reasoning and computational abilities, understanding of the meaning of function concepts, growth rates of function types, and function representations. The PCA can be used to help a physics faculty member understand their students’ calculus readiness. The PCA questions were developed based on a taxonomy of precalculus concepts (the PCA Taxonomy) using an iterative process of developing questions, testing them with students, interviewing students about their responses, and revising the questions and answer choices.
Calculus Concept Inventory (CCI)
The Calculus Concept Inventory (CCI) (Epstein 2007; Epstein 2003) is a multiple-choice pre/post assessment of the most basic principles of calculus, where questions are conceptual with no computation on the test. The topics covered on the CCI include functions, derivatives, limits, ratios, and the continuum. The CCI was modeled closely after the Force Concept Inventory (FCI), where the questions look trivial to experts, but students in lecture courses score quite poorly on the test. The CCI questions were first developed by a panel of experts who defined the content to be tested and wrote the questions, and then tested iteratively with students and revised.
Comparing the PCA and CCI
While the PCA was developed using a research-based taxonomy of concepts, the CCI was designed to mimic the FCI. This difference means that students’ responses to CCI questions are more likely to surprise physics faculty (“they should have gotten that!”) while PCA questions are more likely to present a robust and varied sense of students’ understanding of function concepts in a classroom. The CCI is designed for more advanced math skills than the PCA and may be inappropriate for students enrolled in conceptual or algebra-based physics classes; however, in courses which require substantial calculus or differential equations (e.g., intermediate mechanics), it may be a more appropriate pre-test.
Calculus Concept Readiness (CCR)
The Calculus Concept Readiness (CCR) (Carlson, Madison, and West 2015) instrument is a multiple-choice pre/post assessment of foundational concepts for introductory calculus, including the function concept, trigonometric functions, and exponential functions. The CCR was developed to assess students’ readiness for calculus courses or to assess the effectiveness of pre-calculus courses. Like the PCA, the CCR was developed using a research-based taxonomy of concepts. The CCR is owned by the Mathematical Association of America and is available for a fee through Maplesoft.
Comparing PCA, CCR and CCI
At first blush, the PCA, CCR, and CCI cover very similar topics at a very similar level. However, their emphases differ, and care should be taken to match the test with your students. The CCR surveys students’ understanding of a broad base of mathematics concepts from pre-calculus, including both functions and trigonometry, while the PCA focuses only on the mathematics needed to move into calculus (primarily functions) before calculus instruction. The CCI is designed to test the core concepts of calculus and is aimed at students before and after calculus instruction. Both the PCA and the CCR were developed by the same team of researchers using very similar development methods, and the tests have a very similar structure and feel. The CCI was independently developed by a different team using less robust research methods. If you use these tests as part of a mathematics placement package or to measure your students’ mathematics skills, the CCR is recommended because of its trigonometry and equation-solving clusters, though you must pay to use it. Physicists are typically not as interested as mathematicians are in the intricacies of how students understand “function” as a concept, devoid of physical context, so the PCA and CCI may not be as helpful as the CCR for these purposes.
Quadratic and Linear Conceptual Evaluation (QLCE)
The Quadratic and Linear Conceptual Evaluation (QLCE) (Thornton 2006) is a multiple-choice assessment about relating kinematics to quadratic graphs and equations, and relating coefficient changes in linear equations to linear graph changes and vice versa. Some questions have a kinematics context, and some questions have a generic context. The developers created the QLCE because they had heard faculty say that their students “understood the math, but not the concepts,” and wanted to see if their physics students did indeed understand these mathematical concepts. There are several sets of questions where students fill in a matrix to answer; to score these with Scantron, you would need to renumber them and use a special Scantron sheet that can take up to 10 answers and multiple responses for each question. These questions were developed based on research into student ideas about quadratic and linear equations and the developers’ experience teaching these concepts to their students.
Test of Understanding of Vectors (TUV)
The Test of Understanding of Vectors (TUV) (Barniol and Zavala 2014) is a multiple-choice test that assesses introductory physics students’ understanding of vector concepts without any physical context. Concepts tested include unit-vector notation, graphical representation of vectors and components, calculation of vector components, vector addition, subtraction and scalar multiplication, and dot and cross product. The TUV questions were developed from students’ open-ended responses to questions about vectors, so the multiple-choice answers strongly reflect students’ ideas about vectors (both correct and incorrect). The TUV was developed in Mexico in Spanish and then translated into English.
Vector Evaluation Test (VET)
The Vector Evaluation Test (VET) (Thornton 2006) is a multiple-choice, multiple-response (can pick more than one option), and open-ended assessment of vector concepts for introductory physics classes. About a quarter of the questions are asked in a physics context, and the rest are given no physical context. The VET questions were based on the developers’ experience with students thinking about vectors.
Comparing the TUV and VET
Both the TUV and VET cover vector decomposition, addition, subtraction, dot products, and cross products, which are the major issues for using vectors in introductory physics. Additionally, the TUV uses both graphical representations and vector-hat representations, so it is possible to compare students’ performance across representations. The VET covers coordinate rotation and time changes of kinematics vectors, so it is more appropriate to use this test if you would like to test more topics instead of more representations. Though it is a more thorough test of the topics it does cover, the TUV’s reliance on few questions per topic means that scores are still sensitive to the peculiarities of the questions on the test.
Recommendations for choosing a mathematics assessment
You can use these mathematics assessments before instruction to get a sense of what your students already know, or after instruction if you are implementing new teaching practices to increase student understanding of a given topic and want to assess their effectiveness. Because the QLCE, PCA, CCR, and CCI test overlapping concepts, you should select the one of these four that best matches your population and assessment needs. Do not mix and match these tests for pre- and post-testing, because you will have difficulty comparing pre-scores to post-scores. If you are using a test only before instruction, to see if your students are ready to take your course or to adjust your teaching to best fit their incoming skills, select a test of more elementary content that might be fully covered in prerequisite classes. If you are using a test before and after instruction, you might select a test that includes some content covered in co-requisite courses.
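If you do give the same test before and after instruction, one common way to summarize the matched pre/post class means is the average normalized gain. This is our suggestion for a summary statistic, not something the assessments themselves prescribe; the scores below are hypothetical class means expressed as percentages of the maximum score.

```python
def normalized_gain(pre_mean: float, post_mean: float) -> float:
    """Average normalized gain <g> = (post - pre) / (100 - pre),
    i.e., the fraction of the possible improvement the class realized.
    Scores are percentages of the maximum test score."""
    if pre_mean >= 100:
        raise ValueError("pre-test mean must be below the maximum score")
    return (post_mean - pre_mean) / (100 - pre_mean)

# Hypothetical class means on the same assessment given pre and post:
pre, post = 45.0, 67.0
g = normalized_gain(pre, post)
print(f"<g> = {g:.2f}")  # (67 - 45) / (100 - 45) = 0.40
```

Because the gain is normalized by the room the class had to improve, it makes classes with different incoming preparation roughly comparable, which is exactly why mixing different tests pre and post undermines the comparison.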
- P. Barniol and G. Zavala, Test of understanding of vectors: A reliable multiple-choice vector concept test, Phys. Rev. ST Phys. Educ. Res. 10 (1), 010121 (2014).
- M. Carlson, B. Madison, and R. West, A Study of Students’ Readiness to Learn Calculus, Int. J. Res. Undergrad. Math. Educ. 1 (2), 209 (2015).
- M. Carlson, M. Oehrtman, and N. Engelke, The Precalculus Concept Assessment: A Tool for Assessing Students’ Reasoning Abilities and Understandings, Cog. Instr. 28 (2), 113 (2010).
- J. Epstein, Cognitive development in an integrated mathematics and science program, Jour. of Coll. Sci. Teach. 27 (3), (1997).
- J. Epstein, The Calculus Concept Inventory - Measurement of the Effect of Teaching Methodology in Mathematics, Notices Amer. Math. Soc. 60 (8), 1018 (2003).
- J. Epstein, Development and Validation of the Calculus Concept Inventory, presented at the Ninth International Conference on Mathematics Education in a Global Community, Charlotte, NC, 2007.
- R. Thornton, Measuring and Improving Student Mathematical Skills for Modeling, presented at the GIREP Conference 2006: Modeling in Physics and Physics Education, Amsterdam, Netherlands, 2006.