Which beliefs and attitudes research-based assessment should I use in my class?

posted April 10, 2021 and revised April 19, 2021
by Adrian Madsen, Sarah B. McKagan, Eleanor C. Sayre, and Cassandra A. Paul

This recommendation initially appeared as an article in the American Journal of Physics: A. Madsen, S. B. McKagan, E. C. Sayre, and C. A. Paul, Resource Letter RBAI-2: Research-based assessment instruments: Beyond physics topics. Am. J. Phys. 87, 350 (2019).

There are 14 research-based assessments of students’ beliefs and attitudes that we discuss here, in four categories: students’ beliefs about learning physics in general, students’ beliefs about specific aspects of physics or their own learning (e.g., labs and problem solving), students’ self-efficacy in their physics class, and students’ views about the nature of science. Additional assessments of motivation, discussed in Lovelace and Brickman 2013, may be of interest but are not discussed here.

Since these surveys of beliefs and attitudes do not assess the content covered in any course, they can be used at the high school level and at all levels of the undergraduate and graduate curriculum (unless otherwise noted below). Many of these surveys can be used across disciplines or have versions specifically tailored to other disciplines. Most of these beliefs and attitudes surveys (unless otherwise noted) are meant to be given as a pre-test at the beginning of the semester and a post-test at the end of the semester, in order to look at the shifts in belief scores during your course. They are also appropriate to give at other times in the semester (e.g., near exams) or across an entire course sequence.

Belief surveys are carefully designed to measure what students believe about a topic rather than simply whether they like that topic. However, they have several important limitations. First, they can only measure self-reported explicit beliefs, not students’ implicit beliefs. For example, a student might say and really believe “When I am solving a physics problem, I try to decide what would be a reasonable value for the answer” but not do that in real life. Second, it may be difficult to tell from students’ answers whether they are thinking about the structure of the course they are enrolled in or about the practice of learning physics more broadly. Finally, many belief surveys assume the context of a typical physics course that includes elements such as solving problems, reading the textbook, and taking exams, and thus may not be appropriate in a very nontraditional physics course or in a context outside of a physics course.

Beliefs About Physics Learning in General

Colorado Learning Attitudes about Science Survey (CLASS)
Focus: Self-reported beliefs about physics and learning physics
Intended population: Upper-level, intermediate, intro college, high school
Format: Agree/disagree, available online
Research validation: Gold
Purpose: Measure students’ beliefs about physics and learning physics and distinguish the beliefs of experts from those of novices.

Maryland Physics Expectations Survey (MPEX)
Focus: Beliefs about one’s physics course
Intended population: Upper-level, intermediate, intro college, high school
Format: Agree/disagree
Research validation: Gold
Purpose: Probe some aspects of student expectations in physics courses and measure the distribution of student views at the beginning and end of the course.

Epistemological Beliefs Assessment for Physical Sciences (EBAPS)
Focus: Epistemological beliefs, structure of knowledge, nature of knowing and learning, real-life applicability, evolving knowledge, source of ability to learn
Intended population: Intro college, high school
Format: Agree/disagree, multiple-choice, contrasting alternatives
Research validation: Silver
Purpose: Probe the epistemological stances of students in introductory physics, chemistry, and physical science.

Views About Science Survey (VASS)
Focus: Structure and validity of scientific knowledge, scientific methodology, learnability of science, reflective thinking, personal relevance of science
Intended population: Intro college, high school
Format: Contrasting alternatives
Research validation: Silver
Purpose: Characterize student views about knowing and learning science and assess the relation of these views to achievement in science courses.

Beliefs About Physics Learning in a Specific Context

Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS)
Focus: Affect, confidence, math-physics-data connection, physics community, uncertainty, troubleshooting, argumentation, experimental design, modeling
Intended population: Upper-level, intermediate, intro college
Format: Agree/disagree, available online
Research validation: Gold
Purpose: Measure students’ epistemologies and expectations around experimental physics.

Attitudes and Approaches to Problem Solving (AAPS)
Focus: Attitudes about problem solving
Intended population: Graduate, upper-level, intermediate, intro college
Format: Agree/disagree
Research validation: Silver
Purpose: Measure students’ attitudes and approaches to problem solving at the introductory and graduate level.

Physics Goal Orientation Survey (PGOS)
Focus: Goal orientation and motivation in physics
Intended population: Intro college
Format: Agree/disagree
Research validation: Silver
Purpose: Assess students’ motivation and goal orientations in university-level physics courses.

Student Assessment of Learning Gains (SALG)
Focus: Self-assessment of learning
Intended population: Intro college
Format: Agree/disagree
Research validation: Silver
Purpose: Understand students’ self-assessment of their learning from different aspects of the course and their gains in skills, attitudes, understanding of concepts, and integrating information.

Attitudes toward Problem Solving Survey (APSS)
Focus: Attitudes about problem solving
Intended population: Intro college
Format: Agree/disagree
Research validation: Bronze
Purpose: Survey students’ attitudes towards and views of problem solving.

Nature of Science

Views of the Nature of Science Questionnaire (VNOS)
Focus: Nature of science, theories and laws, tentativeness, creativity, objectivity, subjectivity, social and cultural influences
Intended population: High school, intro college
Format: Open-ended
Research validation: Silver
Purpose: Elucidate students’ views about several aspects of the nature of science.

Views on Science and Education Questionnaire (VOSE)
Focus: Nature of science, theories and laws, tentativeness, creativity, objectivity, subjectivity, scientific method, teaching the nature of science
Intended population: High school, intro college, intermediate, upper-level
Format: Agree/disagree
Research validation: Silver
Purpose: Create in-depth profiles of the views of students or adults about the nature of science and nature of science instruction.

Self-efficacy

Sources of Self-Efficacy in Science Courses-Physics (SOSESC-P)
Focus: Self-efficacy
Intended population: Intro college
Format: Agree/disagree
Research validation: Bronze
Purpose: Assess students’ beliefs that they can succeed in their physics course.

Physics Self-Efficacy Questionnaire (PSEQ)
Focus: Self-efficacy
Intended population: Intro college
Format: Agree/disagree
Research validation: Bronze
Purpose: Measure students’ self-efficacy in their physics course.

Self-efficacy in Physics Instrument (SEP)
Focus: Self-efficacy
Intended population: Intro college
Format: Agree/disagree
Research validation: Bronze
Purpose: Examine the relationship between physics self-efficacy and student performance in introductory physics classrooms.

Beliefs about physics learning in general

Many physics faculty care about their students learning to think like physicists but often do not assess this because it is not clear how to do so best. Physics education researchers have created several surveys to assess one important aspect of thinking like a physicist: what students believe that learning physics is all about. These surveys are not about whether students like physics, but about how students perceive the discipline of physics or their physics course. These surveys measure students’ self-reported beliefs about physics and their physics courses and how closely these beliefs about physics align with experts’ beliefs. There are four assessments about students’ beliefs about learning physics in general: The Colorado Learning Attitudes about Science Survey (CLASS) (Adams et al. 2006), Maryland Physics Expectations Survey (MPEX) (Redish et al. 1997), Epistemological Beliefs Assessment for Physical Sciences (EBAPS) (The Idea Behind EBAPS, Elby 2001) and the Views About Science Survey (VASS) (Halloun 1997, Halloun and Hestenes 1998).

Colorado Learning Attitudes about Science Survey (CLASS)

The Colorado Learning Attitudes about Science Survey (CLASS—pronounced “sea-lass”) (Adams et al. 2006) asks students to agree or disagree with statements about their beliefs about physics and learning physics around such topics as real-world connections, personal interest, sense-making/effort, and problem solving. Students respond to each statement on a 5-point Likert scale: strongly agree, agree, neutral, disagree, or strongly disagree (Fig. 1). The survey is most commonly scored by collapsing students’ responses into two categories (“strongly agree” and “agree” are grouped, as are “strongly disagree” and “disagree”) and checking whether each collapsed response matches the answer an expert physicist would give. For an explanation of the reasons for collapsing student responses into two categories, see the “scoring” section of Adams et al. 2006. An individual student’s “percent favorable” score is the percentage of questions that they answered in the same way as an expert. It is most common for faculty to look at the shift in their class-average percent favorable score from pre-test to post-test to understand how their course influences students’ attitudes and beliefs about physics, on average. One would hope that after a physics course, students’ beliefs would become more expert-like, so that the class-average percent favorable score would increase from pre- to post-test. Each CLASS question contains only one statement that students can agree or disagree with (as opposed to more than one idea in the same question), to help students interpret the questions consistently. The CLASS questions were developed based on questions from the MPEX and VASS; the CLASS added questions about personal interest, aspects of problem solving, and the coupled beliefs of sense-making and effort that were not included in the MPEX or VASS (Adams et al. 2006).

Figure 1. Example of a 5-point Likert-scale question on the CLASS.
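The collapsed “percent favorable” scoring described above can be sketched in a few lines of code. This is an illustrative sketch only, not the official CLASS scoring tool; the function names and the four-item example are invented for illustration.

```python
# Illustrative sketch of collapsed "percent favorable" scoring
# (hypothetical helper functions; not the official CLASS scoring code).

def collapse(response):
    """Map a 5-point Likert response to 'agree', 'disagree', or 'neutral'."""
    if response in ("strongly agree", "agree"):
        return "agree"
    if response in ("strongly disagree", "disagree"):
        return "disagree"
    return "neutral"

def percent_favorable(responses, expert_answers):
    """Percentage of questions answered in the same (collapsed) direction
    as the expert consensus; neutral responses are never favorable."""
    favorable = sum(
        1 for r, e in zip(responses, expert_answers) if collapse(r) == e
    )
    return 100.0 * favorable / len(expert_answers)

# A made-up four-item example: the student matches the expert on 3 of 4.
student = ["agree", "strongly disagree", "neutral", "strongly agree"]
experts = ["agree", "disagree", "agree", "agree"]
print(percent_favorable(student, experts))  # 75.0
```

The pre-to-post shift faculty typically report is then just the difference between the class-average percent favorable scores on the two administrations.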

Maryland Physics Expectations Survey (MPEX)

The Maryland Physics Expectations Survey (MPEX) (Redish et al. 1997) measures students’ self-reported beliefs about physics and their physics courses, their expectations about learning physics, and how closely these beliefs about physics align with experts’ beliefs. The survey asks students to respond to 5-point Likert-scale questions about how they learn physics, how physics is related to their everyday lives, and about their physics course. Some of the MPEX questions are very course specific, e.g., they ask about a student’s grade in the course. The format and scoring of the MPEX questions are the same as the CLASS questions. The questions on the MPEX were chosen through literature review, discussion with faculty, and the researchers’ personal experiences.

Comparing the CLASS and MPEX

The CLASS and MPEX are very similar, and several items are the same on both tests. The MPEX and CLASS both ask questions about students’ personal beliefs about learning physics, but the MPEX focuses more on students’ expectations for what their specific physics course will be like. While the CLASS does not include questions about expectations for the specific course, it does include questions that only make sense in the context of a physics course, e.g., asking about students’ belief that they can solve a physics problem after studying that physics topic. The MPEX takes longer to complete than the CLASS even though it has fewer questions (34 versus 42), presumably because some MPEX questions contain multiple ideas and so take longer for students to understand and answer. Both assessments have a strong research validation. The CLASS builds on the MPEX and has been used more widely, so there is more comparison data available.

Epistemological Beliefs Assessment for Physical Sciences (EBAPS)

The Epistemological Beliefs Assessment for Physical Sciences (EBAPS) (The Idea Behind EBAPS, Elby 2001) probes students’ epistemology of physics, or their view of what it means to learn and understand physics. The EBAPS also contains questions that are course specific (as opposed to being about learning physics in general); for example, one question asks about how students should study in their physics class. The developers tried to ensure that the EBAPS questions do not have an obvious sanctioned answer and have a rich context in order to elicit students’ views more successfully (Elby 2001). The EBAPS has three question types: Part 1 contains agree/disagree Likert-scale questions, Part 2 contains multiple-choice questions, and Part 3 gives students two statements and asks them to indicate how much they agree with each (similar to the VASS). Answers are scored for sophistication using a non-linear scoring scheme in which different responses carry different weights, depending on how sophisticated the developers judged each answer to be. The EBAPS is most appropriate for high school and college level introductory physics courses. The EBAPS questions were developed based on an extensive review of the MPEX and Schommer’s Epistemological Questionnaire (Schommer 1990). The developers synthesized other researchers’ ideas to create guiding principles, which they used to write the EBAPS questions.
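To make the idea of non-linear scoring concrete, here is a minimal sketch: each response option carries its own weight rather than the uniform spacing of a Likert scale. All of the weights and names below are invented for illustration; the actual EBAPS weights were set by the developers.

```python
# Hypothetical illustration of a non-linear scoring scheme like the one
# EBAPS uses. The weights below are invented for illustration; note the
# unequal spacing between options, which is what makes the scheme non-linear.

ITEM_WEIGHTS = {
    "strongly agree": 0.0,     # least sophisticated stance on this item
    "agree": 0.5,
    "neutral": 2.0,
    "disagree": 3.5,
    "strongly disagree": 4.0,  # most sophisticated stance on this item
}

def sophistication_score(responses, weights_per_item):
    """Average the per-response weights; each item may weight its
    options differently, unlike a uniformly spaced Likert scale."""
    total = sum(w[r] for r, w in zip(responses, weights_per_item))
    return total / len(responses)

responses = ["agree", "strongly disagree", "neutral"]
weights = [ITEM_WEIGHTS] * 3  # here all items share one weight table
print(sophistication_score(responses, weights))  # (0.5 + 4.0 + 2.0) / 3
```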

Comparing EBAPS, CLASS and MPEX

The main difference between the EBAPS and the CLASS and MPEX is the style of the questions, where the EBAPS has three styles of questions, and the MPEX and CLASS include only agree/disagree questions. The content on the EBAPS, MPEX, and CLASS is similar and all have high levels of research validation.

Views About Science Survey (VASS)

The Views About Science Survey (VASS) (Halloun 1997, Halloun and Hestenes 1998) is another survey for probing student beliefs about physics and learning physics. The VASS uses a special question format called contrasting alternative design where students compare and contrast two viewpoints. For example, one question contains the statement “Learning in this course requires:” with the contrasting alternatives “(a) a special talent” and “(b) a serious effort.” Students are asked to compare how much they agree with (a) and (b) by choosing between the following options: (a) >> (b), (a) > (b), (a) = (b), (a) < (b), or (a) << (b). Questions are scored in the same way as the MPEX and CLASS. The VASS can be used in introductory college physics courses and high school physics courses. VASS questions were developed based on an expert/folk taxonomy of student views about science.
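Scoring a contrasting-alternatives item can be sketched along the same lines as the collapsed agree/disagree scoring: responses favoring the expert alternative count as favorable, and a no-preference response counts like a neutral one. This is an illustrative sketch only; the response encodings and function names are invented for illustration.

```python
# Hypothetical sketch of scoring contrasting-alternatives items like the
# VASS example above (illustrative only; not the official VASS scoring).

def favored_alternative(response):
    """Collapse a contrasting-alternatives response to 'a', 'b', or None."""
    if response in ("a>>b", "a>b"):
        return "a"
    if response in ("a<b", "a<<b"):
        return "b"
    return None  # "a=b": no preference, treated like a neutral response

def percent_expert_like(responses, expert_choices):
    """Percentage of items where the student favors the expert alternative."""
    matches = sum(
        1 for r, e in zip(responses, expert_choices)
        if favored_alternative(r) == e
    )
    return 100.0 * matches / len(expert_choices)

# E.g., for "Learning in this course requires (a) a special talent /
# (b) a serious effort", suppose experts favor (b) on that item.
print(percent_expert_like(["a<<b", "a=b", "a>b"], ["b", "b", "a"]))
```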

Comparing VASS, CLASS and MPEX

The biggest difference between the VASS and the CLASS and MPEX is that the VASS uses the contrasting alternatives format. The VASS format can be confusing to students if they do not agree that the answer choices given represent opposites. The VASS may be less reliable for measuring expert-like beliefs but still very useful for discussing students’ ideas about learning physics. The CLASS and MPEX have more obvious expert-like answers, so their results can give you a better idea of how expert-like your students’ views are. Although it may seem that if there is an obvious expert-like answer, students would choose this over reporting their own personal beliefs, Gray et al. 2008 found evidence that for the CLASS, students answer based on their own personal beliefs. The content of the VASS is very similar to the CLASS and MPEX. Like the MPEX, the VASS has several questions that are course specific.

Recommendations for choosing a general belief and attitude assessment

Use the CLASS if you want an assessment that is quick to complete, has a large amount of comparison data available, and whose questions are easy for students to understand. Furthermore, use the CLASS if you want to look at categories of questions that were determined through a rigorous statistical analysis, so that they reflect students’ views of the relationships between questions. The CLASS and MPEX statements refer to the kinds of activities that students do in a traditional introductory physics course, so the questions may not make sense to students if you are teaching in a very non-traditional way. If you have been using the MPEX, EBAPS, or CLASS in the past, you may want to keep using it so you can compare your results over time. The MPEX was designed from a resources perspective, which assumes that students’ ideas are not necessarily coherent; if you want an assessment from this perspective, use the MPEX.

Beliefs about physics learning in a specific context

There are five assessments about students’ beliefs about specific aspects of physics or their own learning, e.g., laboratories and problem solving. These are the Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) (Zwickl et al. 2014), the Attitudes and Approaches to Problem Solving (AAPS) (Mason and Singh 2010, Singh and Mason 2009), the Attitudes toward Problem Solving Survey (APSS) (Cummings et al. 2004), the Physics Goal Orientation Survey (PGOS) (Lindstrom and Sharma 2010), and the Student Assessment of Learning Gains (SALG) (Seymour et al. 2000). These surveys have been created for three specific contexts: experimental physics (E-CLASS), problem solving (AAPS and APSS), and goal orientations (PGOS). There are three additional RBAIs that deal with problem solving itself, rather than attitudes about problem solving, discussed here.

Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS)

The Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) (Zwickl et al. 2014) is designed to measure the influence of a laboratory course on students’ epistemologies and expectations around experimental physics. The E-CLASS asks about a wide range of epistemological beliefs, so that it can be used in courses with a wide range of goals. The E-CLASS asks students to rate their agreement with statements by answering for themselves, “What do YOU think when doing experiments for class?” and answering f