‘How the Common Core Works’ Series © 2013 Jim Wright www.interventioncentral.org
How To: Assess Mastery of Math Facts With CBM: Computation Fluency
Computation Fluency measures a student's accuracy and speed in completing 'math facts' using the basic
number operations of addition, subtraction, multiplication, and division. Computation fluency in the
elementary grades is a strong predictor of later success in higher-level math coursework (Gersten, Jordan,
& Flojo, 2005).
For students to attain 'computational fluency', however, they must be both accurate and speedy in solving
basic math facts--ideally through automatic recall (VanDerHeyden & Burns, 2008). In an influential report,
the National Mathematics Advisory Panel (2008) stressed the need for students to become proficient in
math facts, calling on schools to make it a priority to "develop automatic recall of addition and related
subtraction facts, and of multiplication and related division facts." (p. xix).
The Common Core Standards also recognize the importance of computation fluency. For example, a 4th-
grade math standard in Number and Operations in Base Ten (CCSM.4.NBT.4) states that the student will
"fluently add and subtract multi-digit whole numbers using the standard algorithm" (National Governors
Association Center for Best Practices et al., 2010, p. 29). However, the challenge for teachers is to define
specifically what level of performance is required to identify a student as fluent in computation.
CBM-Computation Fluency is a brief, timed assessment that can indicate to teachers whether a student is
developing computation fluency and is thus on track to master grade-appropriate math facts (basic
computation problems). This assessment can be administered to an individual student or to larger groups.
The student is given a worksheet containing math facts and is given 2 minutes to answer as many problems
as possible. The worksheet is then collected and scored, with the student receiving credit for each correct
digit in his or her answers. Teachers can then compare any student's performance to research norms to
determine whether that student is at risk because of delayed computational skills (Burns, VanDerHeyden, &
Jiban, 2006).
Computation Fluency Measures: How to Access Resources. Teachers who would like to screen their
students in grades 1 through 6 for possible delays in computation skills can obtain these free Computation
Fluency assessment resources: (1) materials for assessment, (2) guidelines for administration and scoring,
and (3) research-based norms.
Materials for assessment. Schools can customize their own CBM Computation Fluency assessment
materials at no cost, using the Math Worksheet Generator, a free online application:
http://www.interventioncentral.org/teacher-resources/math-work-sheet-generator
This program generates printable student and examiner assessment sheets for CBM Computation
Fluency.
Guidelines for administration and scoring. Instructions for preparing, administering, and scoring CBM-
Computation Fluency assessments appear later in this document.
Research-based norms. A table, Curriculum-Based Measurement: Computation Fluency Norms, is
included in this document. The table contains fluency benchmarks for grades 1-6, drawn from several
research studies (e.g., Burns, VanDerHeyden, & Jiban, 2006).
References
Burns, M. K., VanDerHeyden, A. M., & Jiban, C. L. (2006). Assessing the instructional level for
mathematics: A comparison of methods. School Psychology Review, 35, 401-418.
Gersten, R., Jordan, N. C., & Flojo, J. R. (2005). Early identification and interventions for students with
mathematics difficulties. Journal of Learning Disabilities, 38, 293-304.
Hosp, M.K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM. New York: Guilford.
National Governors Association Center for Best Practices & Council of Chief State School Officers. (2010).
Common core state standards for mathematics. Washington, DC: Authors. Retrieved from
http://www.corestandards.org/
National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the National
Mathematics Advisory Panel. Washington, DC: U.S. Department of Education. Retrieved from
http://www2.ed.gov/about/bdscomm/list/mathpanel/index.html
VanDerHeyden, A. M., & Burns, M. K. (2008). Examination of the utility of various measures of mathematics
proficiency. Assessment for Effective Intervention, 33, 215-224.
Figure 1: A Sampling of Math Computational
Goals for Addition, Subtraction, Multiplication,
and Division (from Wright, 2002).
Addition
Two 1-digit numbers: sums to 10
Two 3-digit numbers: no regrouping
1- to 2-digit number plus 1- to 2-digit number:
regrouping
Subtraction
Two 1-digit numbers: 0 to 9
2-digit number from a 2-digit number: no
regrouping
2-digit number from a 2-digit number: regrouping
Multiplication
Multiplication facts: 0 to 9
2-digit number times 1-digit number: no
regrouping
3-digit number times 1-digit number: regrouping
Division
Division facts: 0 to 9
2-digit number divided by 1-digit number: no
remainder
2-digit number divided by 1-digit number:
remainder
Wright, J. (2002). Curriculum-Based Assessment Math Computation Probe Generator: Multiple-Skill Worksheets in Mixed Skills. Retrieved from http://www.interventioncentral.org/teacher-resources/math-work-sheet-generator
Curriculum-Based Measurement-Computation Fluency:
Guidelines for Use
CBM-Computation Fluency: Description
CBM-Computation Fluency measures a student's accuracy and speed in completing 'math facts' using the
basic number operations of addition, subtraction, multiplication, and division. CBM-Computation Fluency
probes are 2-minute assessments of basic math facts that are scored for number of 'correct digits'.
There are two types of CBM math probes: single-skill worksheets (those containing like problems) and
multiple-skill worksheets (those containing a mix of problems requiring different math operations). Single-
skill probes give instructors good information about
students' mastery of particular problem-types, while
multiple-skill probes allow the teacher to test children's
math competencies on a range of computational
objectives during a single CBM session.
Both types of math probes can be administered either
individually or to groups of students. The examiner
hands the worksheet(s) out to those students selected
for assessment. Next, the examiner reads aloud the
directions for the worksheet. Then the signal is given to
start, and students proceed to complete as many items
as possible within 2 minutes. The examiner collects the
worksheets at the end of the assessment for scoring.
CBM-Computation Fluency: Materials
The following materials are needed to administer CBM-
Computation Fluency:
Student and examiner copies of CBM Computation
Fluency Probes
Stopwatch
Pencils for students
CBM-Computation Fluency: Preparation
After computational objectives have been selected, the
instructor is ready to prepare math probes. The teacher
may want to create single-skills probes, multiple-skill
probes, or both types of CBM math worksheets. The
teacher will probably want to consult the Common Core
State Standards for Mathematics or district math
curriculum when selecting the kinds of problems to
include in the single- or multiple-skill probe.
Creating the single-skill math probe. As the first step in
putting together a single-skill math probe, the teacher will select one computational objective as a guide.
The worksheet, then, will consist of randomly constructed problems that conform to the chosen
computational objective.
For example, the instructor may select any of the computational objectives in Figure 1 as the basis for a
math probe. The teacher would then construct a series of problems that match the computational goal, as
in Figure 2. In general, single-skill math probes should contain between 80 and 200 problems, and
worksheets should have items on both the front and back of the page. Adequate space should also be left
for the student to show his or her work, especially with more complex problems such as long division.
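As a minimal illustration of this step, the sketch below randomly builds items for one Figure 1 objective ('Two 1-digit numbers: sums to 10', read here as sums of at most 10). It is a hypothetical helper, not the online Math Worksheet Generator, and the function name and defaults are assumptions for illustration only.

```python
import random

def single_skill_probe(num_problems=80, seed=None):
    """Build addition items for one Figure 1 objective: 'Two 1-digit numbers:
    sums to 10' (read here as sums of at most 10). Hypothetical helper, not
    the online Math Worksheet Generator."""
    rng = random.Random(seed)
    items = []
    for _ in range(num_problems):
        a = rng.randint(0, 9)                  # first 1-digit addend
        b = rng.randint(0, min(9, 10 - a))     # keep the sum at or below 10
        items.append((f"{a} + {b}", a + b))    # (problem text, answer-key value)
    return items

# Print a few items with the answer key for the examiner copy.
for text, answer in single_skill_probe(num_problems=5, seed=1):
    print(f"{text} = ____   (key: {answer})")
```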
Creating the Multiple-skill Math Probe. To assemble a multiple-skill math probe, the instructor will first select
the range of math operations and of problem-types that will make up the probe. Once the computational
objectives have been
chosen, the teacher can make up a worksheet of mixed math facts conforming to those objectives. Using
our earlier example, the teacher who wishes to estimate the proficiency of his 4th-grade math group may
decide to create a multiple-skills CBM probe. He could choose to sample only those problem-types that his
students have either mastered or are presently being taught. Figure 3 shows four computation skills with
matching sample problems that might appear on a worksheet of mixed math facts.
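A mixed worksheet can be assembled the same way by sampling each item from a pool of problem builders. The sketch below is a hypothetical example in that spirit; only two of the Figure 3 skills are shown, and the builder names and digit constraints are illustrative assumptions.

```python
import random

def multiple_skill_probe(num_problems=80, seed=None):
    """Mix problem types on one worksheet by sampling from a pool of
    builders. Only two Figure 3 skills are sketched here; other objectives
    would be added as further builder functions."""
    rng = random.Random(seed)

    def subtraction_2digit_regrouping():
        # 2-digit minus 2-digit; regrouping is forced by giving the
        # subtrahend a larger ones digit than the minuend.
        tens_a, ones_a = rng.randint(2, 9), rng.randint(0, 8)
        tens_b, ones_b = rng.randint(1, tens_a - 1), rng.randint(ones_a + 1, 9)
        a, b = 10 * tens_a + ones_a, 10 * tens_b + ones_b
        return f"{a} - {b}", a - b

    def multiplication_3x1_no_regrouping():
        # 3-digit times 1-digit with no carrying: every digit-by-digit
        # product stays below 10.
        b = rng.randint(2, 3)
        digits = [rng.randint(1, 9 // b)] + [rng.randint(0, 9 // b) for _ in range(2)]
        a = int("".join(str(d) for d in digits))
        return f"{a} x {b}", a * b

    builders = [subtraction_2digit_regrouping, multiplication_3x1_no_regrouping]
    return [rng.choice(builders)() for _ in range(num_problems)]
```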
NOTE: Schools can customize their own CBM Computation Fluency assessment materials at no cost, using
the Math Worksheet Generator, a free online application:
http://www.interventioncentral.org/teacher-resources/math-work-sheet-generator
CBM-Computation Fluency: Directions for Administration
1. The examiner distributes copies of math probes to all the students in the group, face down. (Note:
These probes may also be administered individually). The examiner says to the students: "The sheets
on your desk are math facts."
2. If the students are to complete a single-skill probe, the examiner says: "All the problems are [addition or
subtraction or multiplication or division] facts."
Figure 2: Example of a single-skill math probe: Three to five 3- and 4-digit numbers: no regrouping
Sample items: 105 + 600 + 293 | 2031 + 531 + 2322 | 111 + 717 + 260 | 634 + 8240 + 203
Figure 3: Example of a multiple-skill math probe:
Division: 3-digit number divided by 1-digit number: no remainder
Subtraction: 2-digit number from a 2-digit number: regrouping
Multiplication: 3-digit number times 1-digit number: no regrouping
Addition: Three to five 3-digit numbers: no regrouping
Sample items: 431 ÷ 9 | 20 - 18 | 113 x 2 | 106 + 172 + 200 + 600
If the students are to complete a multiple-skill probe, the examiner then says: "There are several types
of problems on the sheet. Some are addition, some are subtraction, some are multiplication, and some
are division [as appropriate]. Look at each problem carefully before you answer it."
3. The examiner then says: "When I say 'begin', turn the worksheet over and begin answering the
problems. Start on the first problem on the left on the top row [point]. Work across and then go to the
next row. If you can't answer a problem, make an 'X' on it and go to the next one. If you finish one side,
go to the back. Are there any questions?"
4. The examiner says 'Begin' and starts the stopwatch. While the students are completing worksheets, the
examiner and any other adults assisting in the assessment circulate around the room to ensure that
students are working on the correct sheet and that they are completing problems in the correct order
(rather than picking out only the easy items).
5. After 2 minutes have passed, the examiner says, "Stop" and collects the CBM computation probes for
scoring.
6. Initial Assessment: If the examiner is assessing the student for the first time, the examiner administers
a total of 3 computation probes during the session using the above procedures and takes the median
(middle) score as the best estimate of the student's computation fluency (see the sketch following this list).
Progress-Monitoring: If the examiner is monitoring student growth in computation (and has previously
collected CBM-Computation Fluency data), only one computation probe is given in the session.
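For the initial-assessment step above, the median rule can be illustrated with a few lines of Python; the probe scores here are made-up values.

```python
from statistics import median

# Initial assessment: three 2-minute probes are given in one sitting and the
# median score is taken as the best single estimate of computation fluency.
probe_scores = [34, 41, 38]        # correct digits earned on each probe (made-up values)
baseline = median(probe_scores)    # -> 38
print(f"Baseline estimate: {baseline} correct digits in 2 minutes")
```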
CBM-Computation Fluency: Directions for Practice
If the student is not yet familiar with CBM-Computation Fluency probes, the teacher can administer one or
more practice computation probes (using the administration guidelines above) and provide coaching and
feedback as needed until assured that the student fully understands the assessment.
CBM-Computation Fluency: Scoring Guidelines
Traditional approaches to computational assessment usually give credit for the total number of correct
answers appearing on a worksheet. If the answer to a problem is found to contain one or more incorrect
digits, that problem is marked wrong and receives no credit. In contrast to this all-or-nothing marking
system, CBM assigns credit to each individual correct digit appearing in the solution to a math fact.
On the face of it, a math scoring system that awards points according to the number of correct digits may
appear unusual, but this alternative approach is grounded in good academic-assessment research and
practice. By separately scoring each digit in the answer of a computation problem, the instructor is better
able to recognize and to give credit for a student's partial math competencies. Scoring computation
problems by the digit rather than as a single answer also allows for a more minute analysis of a child's
number skills.
Imagine, for instance, that a student completed the single-skill addition probe shown in Figure 2 and
produced the answers shown in Figure 4:
Figure 4: Example of completed problems from a single-skill math probe
Completed items: 105 + 600 + 293 = 998 | 2031 + 531 + 2322 = 4884 | 111 + 717 + 260 = 1087 | 634 + 8240 + 203 = 9077
If the answers in Figure 4 were scored as either correct or wrong, the child would receive a score of 1
correct answer out of 4 possible answers (25 percent). However, when each individual digit is scored, it
becomes clear that the student actually correctly computed 12 of 15 possible digits (80 percent). Thus, the
CBM procedure of assigning credit to each correct digit demonstrates itself to be quite sensitive to a
student's emerging, partial competencies in math computation.
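A minimal sketch of this digit-level scoring appears below. It assumes each answer can be compared with the answer key place value by place value from the right; it does not implement the special rules that follow (place-holders, reversed digits, work below the line), and the example values are hypothetical.

```python
def correct_digits(student_answer: str, answer_key: str) -> int:
    """Count correct digits by comparing the student's answer with the
    answer key place value by place value, aligned from the right. This is
    a simplified sketch; it omits the special rules for place-holders,
    reversed digits, and work shown below the line."""
    return sum(1 for s, k in zip(student_answer.strip()[::-1],
                                 answer_key.strip()[::-1]) if s == k)

# Hypothetical item: the key is 1088 and the student wrote 1087,
# so 3 of the 4 possible digits earn credit.
print(correct_digits("1087", "1088"))   # -> 3
# A correct answer shifted out of place value earns nothing (compare the
# '97 x 9 = 8730' example below): no digit lines up with the key.
print(correct_digits("8730", "873"))    # -> 0
```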
The following scoring rules will aid the instructor in marking single- and multiple-skill math probes:
Individual correct digits are counted as correct.
Reversed or rotated digits are not counted as errors unless their change in position makes them appear
to be another digit (e.g., 9 and 6).
Incorrect digits are counted as errors.
Digits that appear in the wrong place value, even if otherwise correct, are scored as errors.
The student is given credit for "place-holder" numerals that are included simply to correctly align the
problem. As long as the student includes the correct space, credit is given whether or not a "0" has
actually been inserted.
In more complex problems such as advanced multiplication, the student is given credit for all correct
numbers that appear below the line.
Credit is not given for any numbers appearing above the line (e.g., numbers marked at the top of
number columns to signify regrouping).
Example
55
x 82
110
4400
4510
Since the student correctly placed 0 in the "place-
holder" position, it is given credit as a correct digit.
Credit would also have been given if the space
were reserved but no 0 had been inserted.
Example
97
x9
8730
"873" is the correct answer to this problem, but no
credit can be given since the addition of the 0
pushes the other digits out of their proper place-
value positions.
Example
1
46
+ 39
85
Credit is given for the 2 digits below the line.
However, the carried "1" above the line does not
receive credit.
Example
33
x 28
264
660
924
Credit is given for all work below the line. In this
example, the student earns credit for 9 correct
digits.
Curriculum-Based Measurement: Computation Fluency Norms
(Burns, VanDerHeyden, & Jiban, 2006; Deno & Mirkin, 1977; Fuchs & Fuchs, 1993; Fuchs &
Fuchs, n.d.)*
CBM-Computation Fluency measures a student's accuracy and speed in completing 'math facts' using the
basic number operations: addition, subtraction, multiplication, and division. CBM-Computation Fluency
probes are 2-minute assessments of basic math facts that are scored for number of 'correct digits'.
NOTE: The norms for grades 2-5 presented below are for 1 minute, while the norms for grades 1 and 6 are
for 2 minutes. To use any of the 1-minute norms, (1) administer and score a standard 2-minute Computation
Fluency probe; (2) divide that student score by 2; and then (3) compare that converted student score to the
appropriate 1-minute norm within grades 2-5 (Burns, VanDerHeyden, & Jiban, 2006).
Grade 1 (End-of-Year Benchmark: Correct Digits per 2 Mins; Fuchs & Fuchs, n.d.)
  Benchmark: 20
  Weekly Growth (Fuchs & Fuchs, 1993): 'Realistic' 0.3 | 'Ambitious' 0.5

Grade 2 (Correct Digits per 1 Min; Burns, VanDerHeyden, & Jiban, 2006)
  Mastery: More than 31 | Instructional: 14-31 | Frustration: Less than 14
  Weekly Growth (Fuchs & Fuchs, 1993): 'Realistic' 0.3 | 'Ambitious' 0.5

Grade 3 (Correct Digits per 1 Min; Burns, VanDerHeyden, & Jiban, 2006)
  Mastery: More than 31 | Instructional: 14-31 | Frustration: Less than 14
  Weekly Growth (Fuchs & Fuchs, 1993): 'Realistic' 0.3 | 'Ambitious' 0.5

Grade 4 (Correct Digits per 1 Min; Burns, VanDerHeyden, & Jiban, 2006)
  Mastery: More than 49 | Instructional: 24-49 | Frustration: Less than 24
  Weekly Growth (Fuchs & Fuchs, 1993): 'Realistic' 0.75 | 'Ambitious' 1.2

Grade 5 (Correct Digits per 1 Min; Burns, VanDerHeyden, & Jiban, 2006)
  Mastery: More than 49 | Instructional: 24-49 | Frustration: Less than 24
  Weekly Growth (Fuchs & Fuchs, 1993): 'Realistic' 0.75 | 'Ambitious' 1.2

Grade 6 (Correct Digits per 2 Mins; Deno & Mirkin, 1977)
  Mastery: More than 79 | Instructional: 40-79 | Frustration: Less than 40
  Weekly Growth (Fuchs & Fuchs, 1993): 'Realistic' 0.45 | 'Ambitious' 1.0
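The 2-minute-to-1-minute conversion described in the note above can also be scripted. The sketch below is one possible implementation, with the grade 2-5 cut points transcribed from the Burns, VanDerHeyden, and Jiban (2006) rows of the table; the example student score is made up.

```python
# Instructional-level cut points (correct digits per 1 minute) for grades 2-5,
# transcribed from the Burns, VanDerHeyden, & Jiban (2006) rows above:
# below the first number = Frustration, between the two = Instructional,
# above the second = Mastery.
CUT_POINTS = {2: (14, 31), 3: (14, 31), 4: (24, 49), 5: (24, 49)}

def performance_level(grade: int, correct_digits_2min: int) -> str:
    """Convert a standard 2-minute probe score to the 1-minute metric by
    dividing by 2, then classify it against the grade 2-5 norms."""
    low, high = CUT_POINTS[grade]
    per_minute = correct_digits_2min / 2
    if per_minute < low:
        return "Frustration"
    if per_minute <= high:
        return "Instructional"
    return "Mastery"

# Hypothetical 4th grader with 58 correct digits on a 2-minute probe:
print(performance_level(4, 58))   # 29 correct digits per minute -> "Instructional"
```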
References:
Burns, M. K., VanDerHeyden, A. M., & Jiban, C. L. (2006). Assessing the instructional level for mathematics: A
comparison of methods. School Psychology Review, 35, 401-418.
Deno, S. L., & Mirkin, P. K. (1977). Data-based program modification: A manual. Reston, VA: Council for
Exceptional Children.
Fuchs, L. S., & Fuchs, D. (n.d.). Using curriculum-based measurement for progress monitoring in math. National
Center on Student Progress Monitoring. Retrieved from http://www.studentprogress.org
Fuchs, L. S., & Fuchs, D. (1993). Formative evaluation of academic progress: How much growth can we expect?
School Psychology Review, 22, 27-49.
Gersten, R., Jordan, N. C., & Flojo, J. R. (2005). Early identification and interventions for students with
mathematics difficulties. Journal of Learning Disabilities, 38, 293-304.
*Reported Characteristics of Student Sample(s) Used to Compile These Norms:
Burns, VanDerHeyden, & Jiban, 2006: Number of Students Assessed: 434 students across grades 2-
5/Geographical Location: Southwest: Sample drawn from 1 elementary school/ Socioeconomic Status: 15% rate
of Free & Reduced Lunch/ Ethnicity of Sample: 74% Caucasian-non-Hispanic; 17% Hispanic or Latino; 6%
African-American; 3% Asian-American; 1% Native American/Limited English Proficiency in Sample: 2% of
students.
Deno & Mirkin, 1977: Number of Students Assessed: Not reported/Geographical Location: Sample drawn from
1 elementary school; location not reported/ Socioeconomic Status: Not reported/ Ethnicity of Sample: Not
reported/Limited English Proficiency in Sample: Not reported.
Fuchs & Fuchs, n.d.: Number of Students Assessed: Not reported/Geographical Location: Not reported/
Socioeconomic Status: Not reported/ Ethnicity of Sample: Not reported/Limited English Proficiency in Sample:
Not reported.
Fuchs & Fuchs, 1993: Number of Students Assessed: Year 1: 177 students in grades 1-6; Year 2: 1,208
students across grades 1-6/Geographical Location: Upper Midwest: Sample drawn from 5 elementary schools/
Socioeconomic Status: 33%-55% rate of Free & Reduced Lunch across participating schools/ Ethnicity of
Sample: Not reported/Limited English Proficiency in Sample: Not reported.
Where to Find Materials: Schools can create their own CBM Computation Fluency assessment materials at no cost,
using the Math Worksheet Generator, a free online application:
http://www.interventioncentral.org/teacher-resources/math-work-sheet-generator
This program generates printable student and examiner assessment sheets for CBM Computation Fluency.
Limitations of These Research Norms: Norms generated from small-scale research studies--like those used here--
provide estimates of student academic performance based on a sampling from only one or two points in time, rather
than a more comprehensive sampling across separate fall, winter, and spring screenings. These norms also have been
compiled from a relatively small student sample that is not fully representative of a diverse 'national' population.
Nonetheless, norms such as these are often the best information that is publicly available for basic academic skills
and therefore do have a definite place in classroom instruction decision-making.
These norms can be useful in general education for setting student performance outcome goals for core instruction
and/or any level of academic intervention. Similarly, these norms can be used to set performance goals for students
with special needs. In both cases, however, single-sample norms would be used only if more comprehensive
fall/winter/spring academic performance norms are not available.
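One common CBM goal-setting convention, offered here as an assumption rather than a rule stated in this document, projects an outcome goal as the baseline score plus the expected weekly growth multiplied by the number of instructional weeks. The sketch below applies that convention with made-up values and assumes the growth rate is expressed on the same correct-digit metric as the baseline score.

```python
def outcome_goal(baseline_digits: float, weeks: int, weekly_growth: float) -> float:
    """Project a performance goal: baseline correct digits plus expected
    weekly growth times the number of instructional weeks. A sketch of one
    common CBM goal-setting convention, not a rule from this document."""
    return baseline_digits + weeks * weekly_growth

# Hypothetical 4th grader: baseline of 38 correct digits per 2 minutes,
# 20 weeks of intervention, 'ambitious' growth rate of 1.2 digits per week.
print(outcome_goal(38, 20, 1.2))   # -> 62.0 correct digits per 2 minutes
```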