Which beliefs and attitudes research-based assessment should I use in my class?

posted April 10, 2021 and revised November 17, 2022
by Adrian Madsen, Sarah B. McKagan, Eleanor C. Sayre, and Cassandra A. Paul

This recommendation initially appeared as an article in the American Journal of Physics: A. Madsen, S. B. McKagan, E. C. Sayre, and C. A. Paul, "Resource Letter RBAI-2: Research-based assessment instruments: Beyond physics topics," Am. J. Phys. 87, 350 (2019).

We discuss 14 research-based assessments of students' beliefs and attitudes, in four categories: students' beliefs about learning physics in general, students' beliefs about specific aspects of physics or their own learning (e.g., labs and problem solving), students' self-efficacy in their physics class, and students' views about the nature of science. There are also additional assessments of motivation, reviewed in Lovelace and Brickman 2013, that may be of interest but are not discussed here.

Since these surveys of beliefs and attitudes do not assess the content covered in any course, they can be used at the high school level and at all levels of the undergraduate and graduate curriculum (unless otherwise noted below). Many of these surveys can be used across disciplines or have versions specifically tailored to other disciplines. Most of these beliefs and attitudes surveys (unless otherwise noted) are meant to be given as a pre-test at the beginning of the semester and a post-test at the end of the semester, in order to look at shifts in belief scores during your course. They are also appropriate to give at other times in the semester (e.g., near exams) or across an entire course sequence.

Belief surveys are carefully designed to measure what students believe about a topic rather than simply whether they like that topic. However, they have several important limitations. First, they can only measure self-reported explicit beliefs, not students' implicit beliefs. For example, a student might say and really believe "When I am solving a physics problem, I try to decide what would be a reasonable value for the answer" but not do that in real life. Second, it may be difficult to distinguish in students' answers whether they are thinking about the structure of the course they are enrolled in or about learning physics more broadly. Finally, many belief surveys assume the context of a typical physics course that includes elements such as solving problems, reading the textbook, and taking exams, and thus may not be appropriate in a very nontraditional physics course or in a context outside of a physics course.

Beliefs About Physics Learning in General

| Title | Focus | Intended Population | Format | Research Validation | Purpose |
|---|---|---|---|---|---|
| Colorado Learning Attitudes about Science Survey (CLASS) | Self-reported beliefs about physics and learning physics | Upper-level, intermediate, intro college, high school | Agree/disagree, available online | Gold | Measure students' beliefs about physics and learning physics and distinguish the beliefs of experts from those of novices. |
| Maryland Physics Expectations Survey (MPEX) | Beliefs about one's physics course | Upper-level, intermediate, intro college, high school | Agree/disagree | Gold | Probe some aspects of student expectations in physics courses and measure the distribution of student views at the beginning and end of the course. |
| Epistemological Beliefs Assessment for Physical Sciences (EBAPS) | Epistemological beliefs: structure of knowledge, nature of knowing and learning, real-life applicability, evolving knowledge, source of ability to learn | Intro college, high school | Agree/disagree, multiple-choice, contrasting alternatives | Silver | Probe the epistemological stances of students in introductory physics, chemistry, and physical science. |
| Views About Science Survey (VASS) | Structure and validity of scientific knowledge, scientific methodology, learnability of science, reflective thinking, personal relevance of science | Intro college, high school | Contrasting alternatives | Silver | Characterize student views about knowing and learning science and assess the relation of these views to achievement in science courses. |

Beliefs About Physics Learning in a Specific Context

| Title | Focus | Intended Population | Format | Research Validation | Purpose |
|---|---|---|---|---|---|
| Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) | Affect, confidence, math-physics-data connection, physics community, uncertainty, troubleshooting, argumentation, experimental design, modeling | Upper-level, intermediate, intro college | Agree/disagree, available online | Gold | Measure students' epistemologies and expectations around experimental physics. |
| Attitudes and Approaches to Problem Solving (AAPS) | Attitudes about problem solving | Graduate, upper-level, intermediate, intro college | Agree/disagree | Silver | Measure students' attitudes and approaches to problem solving at the introductory and graduate level. |
| Physics Goal Orientation Survey (PGOS) | Goal orientation and motivation in physics | Intro college | Agree/disagree | Silver | Assess students' motivation and goal orientations in university-level physics courses. |
| Student Assessment of Learning Gains (SALG) | Self-assessment of learning | Intro college | Agree/disagree | Silver | Understand students' self-assessment of their learning from different aspects of the course and their gains in skills, attitudes, understanding of concepts, and integrating information. |
| Attitudes toward Problem Solving Survey (APSS) | Attitudes about problem solving | Intro college | Agree/disagree | Bronze | Survey students' attitudes towards and views of problem solving. |

Nature of Science

| Title | Focus | Intended Population | Format | Research Validation | Purpose |
|---|---|---|---|---|---|
| Views of the Nature of Science Questionnaire (VNOS) | Nature of science: theories and laws, tentativeness, creativity, objectivity, subjectivity, social and cultural influences | High school, intro college | Open-ended | Silver | Elucidate students' views about several aspects of the nature of science. |
| Views on Science and Education Questionnaire (VOSE) | Nature of science: theories and laws, tentativeness, creativity, objectivity, subjectivity, scientific method, teaching the nature of science | High school, intro college, intermediate, upper-level | Agree/disagree | Silver | Create in-depth profiles of the views of students or adults about the nature of science and nature of science instruction. |

Self-efficacy

| Title | Focus | Intended Population | Format | Research Validation | Purpose |
|---|---|---|---|---|---|
| Sources of Self-Efficacy in Science Courses-Physics (SOSESC-P) | Self-efficacy | Intro college | Agree/disagree | Bronze | Assess students' beliefs that they can succeed in their physics course. |
| Physics Self-Efficacy Questionnaire (PSEQ) | Self-efficacy | Intro college | Agree/disagree | Bronze | Measure students' self-efficacy in their physics course. |
| Self-efficacy in Physics Instrument (SEP) | Self-efficacy | Intro college | Agree/disagree | Bronze | Examine the relationship between physics self-efficacy and student performance in introductory physics classrooms. |

Beliefs about physics learning in general

Many physics faculty care about their students learning to think like physicists but often do not assess this because it is not clear how best to do so. Physics education researchers have created several surveys to assess one important aspect of thinking like a physicist: what students believe learning physics is all about. These surveys are not about whether students like physics, but about how students perceive the discipline of physics or their physics course. They measure students' self-reported beliefs about physics and their physics courses and how closely these beliefs align with experts' beliefs. There are four assessments of students' beliefs about learning physics in general: the Colorado Learning Attitudes about Science Survey (CLASS) (Adams et al. 2006), the Maryland Physics Expectations Survey (MPEX) (Redish et al. 1997), the Epistemological Beliefs Assessment for Physical Sciences (EBAPS) (The Idea Behind EBAPS, Elby 2001), and the Views About Science Survey (VASS) (Halloun 1997, Halloun and Hestenes 1998).

Colorado Learning Attitudes about Science Survey (CLASS)

The Colorado Learning Attitudes about Science Survey (CLASS, pronounced "sea-lass") (Adams et al. 2006) asks students to agree or disagree with statements about their beliefs about physics and learning physics around such topics as real-world connections, personal interest, sense-making/effort, and problem solving. Students respond to each statement on a 5-point Likert scale: strongly agree, agree, neutral, disagree, or strongly disagree (Fig. 1). The survey is most commonly scored by collapsing students' responses into two categories ("strongly agree" and "agree" are grouped, as are "strongly disagree" and "disagree") according to whether they match the response an expert physicist would give. For an explanation of the reasons for collapsing student responses into two categories, see the "scoring" section of Adams et al. 2006. An individual student's "percent favorable" score is the percentage of questions they answered in the same way as an expert. Most commonly, faculty look at the shift in their class average percent favorable score from pre-test to post-test to understand how their course influences students' attitudes and beliefs about physics, on average. One would hope that after a physics course, students' beliefs would become more expert-like, so that the class average percent favorable score would increase from pre- to post-test. Each CLASS question contains only a single statement for students to agree or disagree with (as opposed to more than one idea in the same question), which helps students interpret the questions consistently. The CLASS questions were developed based on questions from the MPEX and VASS; the CLASS added questions about personal interest, aspects of problem solving, and the coupled beliefs of sense-making and effort that were not included in the MPEX or VASS (Adams et al. 2006).

Figure 1. Example of a 5-point Likert-scale question on the CLASS.
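If you score the survey yourself (e.g., from a spreadsheet of responses rather than the online system), the collapsing and percent-favorable calculation described above takes only a few lines. Here is a minimal Python sketch of that scoring; the item numbers, expert key, and responses are hypothetical, not the actual CLASS key.

```python
# Minimal sketch of CLASS-style "percent favorable" scoring.
# Responses are assumed coded 1-5 (1 = strongly disagree ... 5 = strongly agree);
# the expert key below is hypothetical, NOT the real CLASS key.

def collapse(response: int) -> str | None:
    """Collapse a 5-point Likert response to agree/disagree; neutral drops out."""
    if response in (4, 5):
        return "agree"
    if response in (1, 2):
        return "disagree"
    return None  # neutral (3)

def percent_favorable(responses: dict[int, int], expert_key: dict[int, str]) -> float:
    """Percentage of scored items answered on the same side as the expert."""
    favorable = sum(
        1 for item, resp in responses.items()
        if collapse(resp) == expert_key.get(item)
    )
    return 100 * favorable / len(expert_key)

# Hypothetical 3-item key and one student's pre/post responses:
key = {1: "agree", 2: "disagree", 3: "agree"}
pre = {1: 3, 2: 2, 3: 2}   # 33% favorable
post = {1: 5, 2: 1, 3: 4}  # 100% favorable

shift = percent_favorable(post, key) - percent_favorable(pre, key)
print(round(shift, 1))  # 66.7
```

In practice one reports the class-average pre and post percent favorable scores rather than individual shifts, but the calculation is the same.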

Maryland Physics Expectations Survey (MPEX)

The Maryland Physics Expectations Survey (MPEX) (Redish et al. 1997) measures students' self-reported beliefs about physics and their physics courses, their expectations about learning physics, and how closely these beliefs align with experts' beliefs. The survey asks students to respond to 5-point Likert-scale questions about how they learn physics, how physics relates to their everyday lives, and their physics course. Some of the MPEX questions are very course specific, e.g., they ask about a student's grade in the course. The format and scoring of the MPEX questions are the same as for the CLASS questions. The questions on the MPEX were chosen through literature review, discussion with faculty, and the researchers' personal experiences.

Comparing the CLASS and MPEX

The CLASS and MPEX are very similar, and several items are the same on both tests. Both ask questions about students' personal beliefs about learning physics, but the MPEX focuses more on students' expectations for what their specific physics course will be like. While the CLASS does not include questions about expectations for the specific course, it does include questions that only make sense in the context of a physics course, e.g., asking about students' belief that they can solve a physics problem after studying that physics topic. The MPEX takes longer to complete than the CLASS even though it has fewer questions (34 versus 42), presumably because some of the MPEX questions contain multiple ideas and therefore take longer for students to understand and answer. Both assessments have a strong research validation. The CLASS builds on the MPEX and has been used more widely, so there is more comparison data available.

Epistemological Beliefs Assessment for Physical Sciences (EBAPS)

The Epistemological Beliefs Assessment for Physical Sciences (EBAPS) (The Idea Behind EBAPS, Elby 2001) probes students' epistemology of physics, that is, their view of what it means to learn and understand physics. The EBAPS also contains questions that are course specific (as opposed to being about learning physics in general); for example, one question asks about how students should study in their physics class. The developers tried to ensure that the EBAPS questions do not have an obvious sanctioned answer and have a rich context, in order to elicit students' views more successfully (Elby 2001). The EBAPS has three question types: Part 1 contains agree/disagree Likert-scale questions, Part 2 contains multiple-choice questions, and Part 3 gives students two statements and asks them to indicate how much they agree with each (similar to the VASS). The sophistication of students' answers is scored using a non-linear scheme in which each response option carries a different weight, depending on how sophisticated the developers judged that answer to be. The EBAPS is most appropriate for high school and college-level introductory physics courses. The EBAPS questions were developed based on an extensive review of the MPEX and Schommer's Epistemological Questionnaire (Schommer 1990). The developers synthesized other researchers' ideas to create guiding principles, which they used to write the EBAPS questions.
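To make the non-linear scheme concrete, the sketch below scores two invented items by looking up each answer choice in a weight table; the items, choices, and weights are illustrative assumptions, not the actual EBAPS rubric.

```python
# Sketch of EBAPS-style non-linear scoring: each answer choice carries its own
# developer-assigned sophistication weight rather than a fixed Likert value.
# The items and weights below are invented for illustration.

# item -> {answer choice -> sophistication score from 0 (naive) to 4 (expert-like)}
WEIGHTS = {
    "item_1": {"A": 0, "B": 1, "C": 4, "D": 2, "E": 2},  # deliberately non-monotonic
    "item_2": {"A": 4, "B": 3, "C": 0, "D": 1, "E": 0},
}

def ebaps_score(responses: dict[str, str]) -> float:
    """Average sophistication (0-4) over the items a student answered."""
    scores = [WEIGHTS[item][choice] for item, choice in responses.items()]
    return sum(scores) / len(scores)

print(ebaps_score({"item_1": "C", "item_2": "B"}))  # 3.5
```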

Comparing EBAPS, CLASS and MPEX

The main difference between the EBAPS and the CLASS and MPEX is the style of the questions, where the EBAPS has three styles of questions, and the MPEX and CLASS include only agree/disagree questions. The content on the EBAPS, MPEX, and CLASS is similar and all have high levels of research validation.

Views About Science Survey (VASS)

The Views About Science Survey (VASS) (Halloun 1997, Halloun and Hestenes 1998) is another survey for probing student beliefs about physics and learning physics. The VASS uses a question format called contrasting-alternatives design, in which students compare and contrast two viewpoints. For example, one question contains the statement "Learning in this course requires:" with the contrasting alternatives "(a) a special talent" and "(b) a serious effort." Students indicate how much they agree with (a) relative to (b) by choosing among the following options: (a) >> (b), (a) > (b), (a) = (b), (a) < (b), or (a) << (b). Questions are scored in the same way as on the MPEX and CLASS. The VASS can be used in introductory college physics courses and high school physics courses. The VASS questions were developed based on an expert/folk taxonomy of student views about science.
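The sketch below shows one way to represent and score a single contrasting-alternatives item: a response is favorable when the student leans toward the same alternative as the expert. The expert side here is an assumption for illustration.

```python
# Sketch of scoring a VASS-style contrasting-alternatives item.
# The item text is quoted above; the expert side ("b") is assumed.

OPTIONS = {
    "a>>b": "a", "a>b": "a",   # student leans toward alternative (a)
    "a=b": None,               # equal weight: treated as neutral
    "a<b": "b", "a<<b": "b",   # student leans toward alternative (b)
}

def is_favorable(response: str, expert_side: str) -> bool:
    """True when the student favors the same alternative as the expert."""
    return OPTIONS[response] == expert_side

# "Learning in this course requires: (a) a special talent / (b) a serious effort"
print(is_favorable("a<<b", expert_side="b"))  # True
```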

Comparing VASS, CLASS and MPEX

The biggest difference between the VASS and the CLASS and MPEX is that the VASS uses the contrasting-alternatives format. This format can be confusing to students if they do not agree that the answer choices given represent opposites. The VASS may be less reliable for measuring expert-like beliefs but is still very useful for discussing students' ideas about learning physics. The CLASS and MPEX have more obvious expert-like answers, so their results can give you a better idea of how expert-like your students' views are. Although it may seem that if there is an obvious expert-like answer, students would choose it over reporting their own personal beliefs, Gray et al. 2008 found evidence that on the CLASS, students answer based on their own personal beliefs. The content of the VASS is very similar to that of the CLASS and MPEX. Like the MPEX, the VASS has several questions that are course specific.

Recommendations for choosing a general belief and attitude assessment

Use the CLASS if you want an assessment that is quick to complete, has a large amount of comparison data available, and has questions that are easy for students to understand. Also use the CLASS if you want to look at categories of questions that were determined through a rigorous statistical analysis, so that the categories reflect students' views of how the questions are related. The CLASS and MPEX statements refer to the kinds of activities that students do in a traditional introductory physics course, so the questions may not make sense to students if you are teaching in a very non-traditional way. If you have been using the MPEX, EBAPS, or CLASS in the past, you may want to keep using it so you can compare your results. The MPEX was designed from a resources perspective, which assumes that students' ideas are not coherent, so if you want an assessment from the resources perspective, use the MPEX.

Beliefs about physics learning in a specific context

There are five assessments about students' beliefs about specific aspects of physics or their own learning, e.g., laboratories and problem solving. These are the Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) (Zwickl et al. 2014), the Attitudes and Approaches to Problem Solving (AAPS) (Mason and Singh 2010, Singh and Mason 2009), the Attitudes toward Problem Solving Survey (APSS) (Cummings et al. 2004), the Physics Goal Orientation Survey (PGOS) (Lindstrom and Sharma 2010), and the Student Assessment of Learning Gains (SALG) (Seymour et al. 2000). These surveys address specific contexts: experimental physics (E-CLASS), problem solving (AAPS and APSS), goal orientations (PGOS), and self-assessed learning gains (SALG). There are also three additional RBAIs that assess problem solving itself, rather than attitudes about problem solving, which are discussed elsewhere.

Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS)

The Colorado Learning Attitudes about Science Survey for Experimental Physics (E-CLASS) (Zwickl et al. 2014) is designed to measure the influence of a laboratory course on students' epistemologies and expectations around experimental physics. The E-CLASS asks about a wide range of epistemological beliefs, so that it can be used in courses with a wide range of goals. It asks students to rate their agreement with statements by answering for themselves, "What do YOU think when doing experiments for class?," and answering for a physicist, "What would experimental physicists say about their research?" This helps instructors differentiate students' personal and professional epistemologies. The E-CLASS can be used in introductory, intermediate, or upper-level laboratory courses and is administered online through the developer website. The E-CLASS score is calculated using the responses to the questions about students' personal beliefs (not the prompts about what they think a physicist would say), giving +1 point for an expert-like (favorable) response, 0 points for a neutral response, and –1 point for a novice-like (unfavorable) response. The total score for the 30 questions can range from –30 to +30 points. The percentage of students who give the expert-like response, and how this changes from pre- to post-test, indicates how the course influenced students' beliefs about experimental physics. The E-CLASS questions were developed based on consensus learning goals defined by faculty at the University of Colorado at Boulder for their laboratory curriculum. The questions were modeled after questions on the CLASS and based on common challenges instructors observed students having in laboratory courses.
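The +1/0/–1 scoring can be sketched as follows in Python; the mapping from a 5-point Likert response to favorable/neutral/unfavorable and the toy key are assumptions for illustration.

```python
# Sketch of E-CLASS-style scoring for the personal-belief prompts:
# +1 for an expert-like (favorable) response, 0 for neutral, -1 for a
# novice-like (unfavorable) response, summed over 30 items for a total
# in [-30, +30]. The Likert coding and the toy key are assumptions.

def classify(response: int, expert_agrees: bool) -> int:
    """Map a 1-5 Likert response to +1/0/-1 given which side the expert takes."""
    if response == 3:          # neutral
        return 0
    student_agrees = response >= 4
    return 1 if student_agrees == expert_agrees else -1

def eclass_total(responses: list[int], key: list[bool]) -> int:
    """Sum item scores; with the full 30-item survey this lies in [-30, +30]."""
    return sum(classify(r, e) for r, e in zip(responses, key))

print(eclass_total([5, 1, 3], [True, False, True]))  # 2 on a 3-item toy key
```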

Attitudes toward Problem Solving Survey (APSS)

Two surveys measure students’ attitudes and approaches to problem solving in physics. These surveys are important because the way students think about problem solving can affect how they learn this skill, and faculty can target the development of problem-solving skills to help their students improve.

The Attitudes toward Problem Solving Survey (APSS) (Cummings et al. 2004) is a survey of students' attitudes toward problem solving, e.g., how they think about equations, the process they go through to solve problems, and their views on what problem solving in physics means. As on other attitude and belief surveys, students rate their agreement with statements on a 5-point Likert scale; strongly (dis)agree and (dis)agree responses are collapsed, and the percent expert response is calculated as the percentage of questions on which students agree with the expert response. In addition to the agree/disagree questions, there are also two multiple-choice questions on the APSS. The APSS is appropriate for introductory college courses. Some of the APSS questions were adopted from the MPEX, while others were newly created.

Attitudes and Approaches to Problem Solving (AAPS)

Like the APSS, the Attitudes and Approaches to Problem Solving (AAPS) survey (Mason and Singh 2010, Singh and Mason 2009) measures students' agreement with statements about their attitudes and approaches to problem solving using a 5-point Likert scale. To calculate the average score for a question, +1 is assigned to each favorable response, –1 to each unfavorable response, and 0 to each neutral response; the overall score is the average of the question scores. The AAPS can be used at all levels of undergraduate courses and at the graduate level.
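A minimal Python sketch of this averaging, assuming responses have already been coded as +1/0/–1, with one row per student and one column per question:

```python
# Sketch of AAPS-style scoring: average the coded responses per question,
# then average the question means for an overall score. The coding of raw
# Likert responses into {-1, 0, +1} is assumed to have been done already.

def aaps_overall(class_scores: list[list[int]]) -> float:
    """class_scores[s][q] in {-1, 0, +1}: student s's coded response to question q."""
    n_students = len(class_scores)
    n_questions = len(class_scores[0])
    question_means = [
        sum(row[q] for row in class_scores) / n_students
        for q in range(n_questions)
    ]
    return sum(question_means) / n_questions

print(round(aaps_overall([[1, 0, -1], [1, 1, 0]]), 3))  # 0.333
```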

Comparing the AAPS and APSS

Since the AAPS was developed by expanding the APSS, the topics covered and the questions on the AAPS and APSS are quite similar. Fourteen of the questions are the same or very similar between the tests. The AAPS has more questions (33 questions versus 20 questions), so it covers a few more aspects of problem solving than the APSS, including how students feel about problem solving, how they learn from the problem-solving process, use of pictures/diagrams, and what students do while solving a problem. The AAPS also includes questions that target graduate-level problem solving.

The CLASS, MPEX, EBAPS, and VASS also contain questions about students’ attitudes and beliefs about problem solving, similar to those on the APSS and AAPS. The AAPS and APSS can specifically target problem-solving beliefs, while the CLASS, MPEX, EBAPS, and VASS ask about a wider range of beliefs and attitudes.

Physics Goals Orientation Survey (PGOS)

The Physics Goal Orientation Survey (PGOS) (Lindstrom and Sharma 2010) is a survey of students’ motivations and goal orientations in their physics course. These motivations can influence how students engage in their physics class and how well they learn the material. The PGOS addresses four goal orientations: task orientation (the belief that success is a product of effort, understanding, and collaboration), ego orientation (the belief that success relies on greater ability and attempting to out-perform others), cooperation (when students value interaction with their peers in the learning process), and work avoidance (the goal of minimum effort–maximum gain). The PGOS uses a 5-point Likert scale, with 1 point given for strongly disagree, 5 points for strongly agree, and 2–4 points for disagree, neutral, or agree, respectively. The average score for each of the four goal orientations is calculated separately, and there is no overall score calculated. The PGOS is appropriate for introductory and intermediate university physics courses. It can be given as a pre- and post-test to determine how your course may have influenced students’ goal orientations. The PGOS questions were taken from a previous survey of goal orientation by Duda and Nicholls and revised so that they would be appropriate for a university-level physics course, with some new questions created. The PGOS was developed in Australia.
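Because the PGOS reports a separate mean for each orientation, scoring amounts to averaging the raw 1–5 points within each subscale. The sketch below does this with an invented item-to-orientation assignment; the real groupings come from the published survey.

```python
# Sketch of PGOS-style scoring: mean 1-5 Likert score per goal orientation,
# with no overall composite. The item groupings below are invented.

SUBSCALES = {
    "task": ["q1", "q2"],
    "ego": ["q3"],
    "cooperation": ["q4"],
    "work_avoidance": ["q5"],
}

def pgos_profile(responses: dict[str, int]) -> dict[str, float]:
    """Mean 1-5 score per orientation for one student."""
    return {
        scale: sum(responses[q] for q in items) / len(items)
        for scale, items in SUBSCALES.items()
    }

print(pgos_profile({"q1": 5, "q2": 4, "q3": 2, "q4": 4, "q5": 1}))
# {'task': 4.5, 'ego': 2.0, 'cooperation': 4.0, 'work_avoidance': 1.0}
```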

Student Assessment of Learning Gains (SALG)

The Student Assessment of Learning Gains (SALG) (Seymour et al. 2000) is an online assessment where students self-assess how different parts of their course impacted their learning using a 5-point Likert scale. It is like the student evaluation given at the end of most courses, but the questions only ask students about what they gained from different aspects of the course instead of what they liked. The SALG developers found that students’ observations about what they gained from the class were useful to help faculty improve the course, whereas their observations about what they liked were not helpful (Seymour et al. 2000). You can use the SALG online system to choose questions to include from each of the following categories: understanding of the class content, increase in skills, class impact on attitudes, integration of learning, the class overall, class activities, assignments, graded activities and tests, class resources, the information you were given, and support for you as an individual learner. You can also edit and reorder questions. You can give the SALG at a midpoint in your class to get a sense of which parts of your course could be improved, or at the end to evaluate your students’ understanding of how your course supported their learning. The SALG website also has a “baseline instrument” available that can be used at the beginning of a course. The SALG was developed using data from more than 300 student interviews where students discussed what they had gained from certain aspects of a course, and what they liked.

Recommendations for choosing a specific belief and attitude assessment

Use the E-CLASS if you want to measure students' beliefs in the context of experimental physics. Use the AAPS if you want to probe your students' attitudes about problem solving, including for undergraduate and graduate students. Use the PGOS if you want to understand your students' motivations and goal orientations in their physics course. Use the SALG if you want to understand your students' perspective on which parts of your course helped them learn the most.

Nature of science

There are two main research-based surveys about the nature of science, the Views on Science and Education Questionnaire (VOSE) (Chen 2006) and the Views of the Nature of Science Questionnaire (VNOS) (Lederman et al. 2002), which probe students' views about the values and epistemological assumptions of science. These surveys can help faculty understand how their courses and teaching methods influence students' views of the nature of science. They can be especially useful in courses that aim to develop these views, such as courses for pre-service teachers. Both are intended to be given as a pre- and post-test.

Views on Science and Education Questionnaire (VOSE)

The Views on Science and Education Questionnaire (VOSE) (Chen 2006) is a Likert-scale survey of students' beliefs about the nature of science and about how the nature of science should be taught. The VOSE addresses seven major topics: tentativeness of scientific knowledge, nature of observation, scientific methods, hypotheses, laws and theories, imagination, validation of scientific knowledge, and objectivity and subjectivity in science. It also includes five questions about students' beliefs about teaching the nature of science. Each question consists of a question statement and 3–9 possible responses, with which students can agree or disagree using a 5-point Likert scale. There are no right or wrong answers, but each statement corresponds to a particular "position" on one or more subtopics of the nature of science. The developer has created an extensive list of coding categories to "create an in-depth profile of a [student's] nature of science views and educational ideas" (Chen 2006); the coding categories can be found in Chen 2006. Burton 2013 developed a system for calculating a numerical score for each issue or topic by assigning a number between 0 and 4 to a student's response for each item listed under that issue or topic and calculating the average. The VOSE can be used in high school courses and in introductory, intermediate, and upper-level undergraduate courses. The VOSE questions were developed based on questions from the Views on Science-Technology-Society (VOSTS) (Aikenhead and Ryan 1992) and the VNOS, to address concerns about the VOSTS and VNOS being open-ended and hard to administer and score. The VOSE aims to increase the validity of the survey and decrease interpretation biases, as compared to the VOSTS and VNOS.
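A minimal Python sketch of the Burton-style scoring, assuming each response has already been assigned its 0–4 number; the grouping of items into issues is invented for illustration.

```python
# Sketch of Burton 2013-style VOSE scoring: average the 0-4 numbers assigned
# to the items under each issue. The issue/item structure below is invented.

ISSUES = {
    "tentativeness": ["1a", "1b", "1c"],
    "subjectivity": ["2a", "2b"],
}

def vose_issue_scores(scored_items: dict[str, int]) -> dict[str, float]:
    """scored_items maps item -> 0..4; returns the mean score per issue."""
    return {
        issue: sum(scored_items[item] for item in items) / len(items)
        for issue, items in ISSUES.items()
    }

print(vose_issue_scores({"1a": 4, "1b": 3, "1c": 2, "2a": 0, "2b": 4}))
# {'tentativeness': 3.0, 'subjectivity': 2.0}
```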

Views of the Nature of Science Questionnaire (VNOS)

The Views of the Nature of Science Questionnaire (VNOS) (Lederman et al. 2002) is an open-ended survey of students' ideas about the nature of science, including the empirical, tentative, inferential, creative, and theory-laden nature of science, and the social and cultural influences on scientific knowledge. Many of the questions ask students to give an example to support their ideas. In addition to students' written responses, the developers encourage faculty to conduct individual follow-up interviews with students to better understand the meanings of their responses. Students' responses can be scored as naive, transitional, or informed based on a rubric for each question. The VNOS can be used with middle school, high school, and introductory college students. The VNOS questions were created by the developers and tested with students and experts.
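Since VNOS scoring is rubric-based rather than automatic, analysis usually reduces to tallying the hand-assigned codes per question, as in this small sketch (the codes are illustrative):

```python
# Sketch of summarizing VNOS-style rubric codes: each open-ended response is
# hand-coded as naive, transitional, or informed; this tallies one question.
from collections import Counter

def vnos_distribution(codes: list[str]) -> dict[str, float]:
    """Fraction of students coded at each level for one question."""
    counts = Counter(codes)
    return {level: counts[level] / len(codes)
            for level in ("naive", "transitional", "informed")}

print(vnos_distribution(["naive", "informed", "transitional", "informed"]))
# {'naive': 0.25, 'transitional': 0.25, 'informed': 0.5}
```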

Comparing the VNOS and VOSE

The VNOS and VOSE cover similar topics around the nature of science. The main difference between them is the format: the VNOS is open-ended, while the VOSE asks students to agree/disagree with different options. Because the VNOS is open-ended, it can be time-consuming to score and subject to interpretation bias, though conducting interviews with students about their responses reduces the chance of bias in scoring. Another difference is that, in addition to asking about students' philosophical beliefs about science, the VOSE asks students to agree/disagree with statements about how to teach the nature of science.

Other nature of science assessments 

Many other multiple-choice instruments to assess students' views of the nature of science were developed in the 1960s, 1970s, and 1980s, but these were based on researchers' ideas and not on student interviews or research into student thinking (Abd-El-Khalick and Lederman 2000). The VOSTS (Aikenhead and Ryan 1992), published in 1992, was the first nature of science instrument to use a student-centered design process, including analysis of student responses and student interviews. However, other researchers found many problems with students' interpretations of the VOSTS (Chen 2006, Lederman et al. 2002, Abd-El-Khalick et al. 1998). Both the VOSE and the VNOS were developed in response to these problems.

Surveys about the nature of science, such as the VNOS, have been criticized for measuring only what students say declaratively about the nature of science, which may be quite different from what they do procedurally when engaged in authentic scientific practice (Salter and Atkins 2014). It is worth recognizing that this is an inherent limitation of such surveys. 

Recommendations for choosing a nature of science assessment

Use the VOSE if you want a multiple-choice assessment that is quick and easy to score. Use the VNOS if you would like to use an open-ended survey to get a more detailed understanding of your students’ views on the nature of science.

Self-efficacy

Self-efficacy is a person’s situation-specific belief that they can succeed in a given domain (Bandura 1978). There are three assessments of students’ views of their self-efficacy in their physics classes: Sources of Self-Efficacy in Science Courses-Physics (SOSESC-P) (Fencl and Scheel 2005), Physics Self-Efficacy Questionnaire (PSEQ) (Lindstrom and Sharma 2011), and the Self-efficacy in Physics Instrument (SEP) (Shaw 2004). There are numerous other assessments of self-efficacy with differing focuses, e.g., other disciplines and self-efficacy in general. We focus on those specifically developed for physics courses. All three of these assessments ask students to rate their agreement with statements on a five-point Likert scale and are appropriate for introductory college students.

Sources of Self-Efficacy in Science Courses-Physics (SOSESC-P)

The Sources of Self-Efficacy in Science Courses-Physics (SOSESC-P) (Fencl and Scheel 2005) assesses students’ beliefs that they can succeed in their physics course by asking them to agree or disagree with a series of statements. The questions are divided into four categories, corresponding to four established aspects of self-efficacy: performance accomplishment, social persuasion, vicarious learning, and emotional arousal. These questions ask about students’ feelings about different aspects of the course, how the instructor and other students influenced their views of themselves, the students’ behavior in the course (paying attention, working hard, etc.), and more. Several of the Likert-scale questions on the SOSESC-P were taken from existing mathematics and general academic surveys of self-efficacy. Additional new questions were written based on the developers’ experience with undergraduate science education.

Physics Self-Efficacy Questionnaire (PSEQ)

The Physics Self-Efficacy Questionnaire (PSEQ) (Lindstrom and Sharma 2011) is a similar survey of students’ beliefs that they can succeed in their physics course. The PSEQ has five questions, so it probes only one dimension of self-efficacy. Specifically, the PSEQ focuses on students’ confidence in their ability to succeed in their physics course. The questions do not mention specific portions of the course or specific members of the course (other students, instructor, etc.). They simply ask the students about themselves and their own ability in their physics course. Most of the Likert-scale questions on the PSEQ are modified versions of questions from the General Self-Efficacy Scale (Schwarzer 1993), while one PSEQ question was written by the developers. The PSEQ was developed in Australia.

Self-Efficacy in Physics (SEP)

The Self-Efficacy in Physics (SEP) (Shaw 2004) instrument is another survey that asks students to agree with statements about their beliefs about their ability to succeed in their physics course. The SEP contains eight questions, which are more specific than those on the PSEQ. These questions ask students how good or bad they are at science/mathematics, if they are good at using computers, and if they believe they can solve two specific mechanics problems. The SEP questions were developed based on a literature review and modeled after self-efficacy questions from surveys in other disciplines.

Comparing the SOSESC-P, PSEQ and SEP 

The SOSESC-P has 33 questions, whereas the PSEQ and SEP have 5 and 8 questions, respectively, so the SOSESC-P probes more dimensions of self-efficacy in more depth than the other surveys. There is much more variety in the SEP questions than in the PSEQ questions: the SEP asks students about their belief that they can solve very specific physics problems, their comfort using a computer, and whether they consider themselves good at mathematics, whereas the PSEQ questions are about physics in general. All three have the same level of research validation.

Recommendation for choosing a self-efficacy assessment

If you want to measure detailed changes in your students' course-specific physics self-efficacy, use the SOSESC-P, as it probes several dimensions of self-efficacy and uses several questions to probe each. If you need a shorter self-efficacy assessment that can be combined with another assessment, use the five-question PSEQ, which can give you a general sense of your students' belief and confidence in their ability in your course.

References