2000-2003
Summary. All college students who seek to become licensed teachers in Minnesota must verify their academic skills by completing the Pre-Professional Skills Test (PPST) before they are accepted as candidates by an approved teacher preparation program. Those who fail to reach Minnesota’s minimum qualifying score on any of the three subtests must be offered opportunities by the institutions accepting them as candidates to strengthen their deficient skills in reading, writing, or mathematics. All candidates must pass all areas of the PPST before their preparation program can recommend them to Minnesota’s Board of Teaching for licensure.
While these results are generally encouraging, our expectations of a candidate’s academic skills exceed the level of performance defined by Minnesota’s qualifying scores for reading, writing, and mathematics. In our experience, some of those students who “pass” the PPST with lower scores will face significant challenges as they attempt to acquire the knowledge and skills that will inform their practice as competent teachers.
Context. The 1985 Minnesota Legislature empowered the state’s Board of Teaching to select an academic skills examination and to establish procedures ensuring that all teachers licensed in Minnesota would read, write, and use mathematics at the level required for their roles. The Board chose the PPST, also known as Praxis I, to help prospective teachers meet this licensure requirement.
Testing began in the fall of 1987. The Board used initial test results to set qualifying scores for each subtest estimating the minimum knowledge and skill required for successful performance as an elementary or secondary level teacher. Qualifying scores for reading, writing, and mathematics are currently set at 173, 172, and 169 respectively. These scores represent one standard error of measurement (1 SEM) below the mean of scores gathered during a 1986 field study conducted by the Educational Testing Service. That study included a sample of Minnesota students pursuing teacher licensure and a national sample of students who completed the PPST during that year.
In its 1997 summary report describing the adoption of the PPST and the selection of qualifying scores, the Board of Teaching reasoned that setting qualifying scores at these levels would reduce the risk of rejecting otherwise competent candidates whose theoretical “true scores” would probably equal or exceed the cut score were they to complete the test on several occasions (page 5). The Board later increased the mathematics qualifying score to 171 for those submitting licensure applications after 1 September 2003, regardless of when they might have completed the mathematics examination, encouraging those with scores of 169 or 170 to sit for a second test in an attempt to reach the new qualifying score.
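The Board's cut-score rule can be sketched in a few lines: set each subtest's qualifying score one standard error of measurement (1 SEM) below the field-study mean, so that a candidate whose observed score dips slightly below the mean is not rejected when his or her "true score" likely meets it. The mean and SEM values used below are hypothetical illustrations, not the actual 1986 field-study figures.

```python
# Hypothetical sketch of the Board's cut-score reasoning: the
# qualifying score sits one SEM below the field-study mean.
# The mean and SEM here are invented for illustration.

def qualifying_score(subtest_mean: float, sem: float) -> int:
    """Cut score = field-study mean minus one SEM, rounded to a whole scaled-score point."""
    return round(subtest_mean - sem)

# An assumed subtest mean of 176 with an SEM of 3 scaled points
# would yield a qualifying score of 173.
print(qualifying_score(176, 3))  # -> 173
```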
While we hold significantly higher expectations of the academic skills of those we hope to successfully prepare for the teaching profession through our licensure programs, an examination of recent PPST test results offers one indicator of our candidates’ potential to meet those higher expectations. Performance on the PPST also provides a useful look at prospective education students’ preparation for collegiate study.
Saint Benedict’s and Saint John’s host two paper-and-pencil administrations of this multiple-choice examination during each academic year. Other institutions throughout Minnesota offer additional testing opportunities. Some colleges offer a computer-managed version that now provides equivalent scores. Those who wish to enroll for one of these test sessions do so by completing a registration form and including it with their testing fee of $30.00 for each of the three subtests they expect to complete (reading, writing, and mathematics). Computer-managed testing, arranged by appointment with a cooperating test center, requires less testing time and provides immediate results, but at a higher cost ($85.00 for reading, $110.00 for writing, and $135.00 for mathematics). The following summary of the current version of the PPST is drawn from the second edition of the PPST: Pre-Professional Skills Tests Study Guide published by the Educational Testing Service of Princeton, New Jersey, in 2003 (pages 8-9).
Table 1.0 PPST (Praxis I) Test Design

Skill Area | Test Type | Length | Content
Reading | Computerized | 75 min | Literal Comprehension: 26 items, 56%; Inferential Comprehension: 20 items, 44%
Reading | Paper/Pencil | 60 min | Literal Comprehension: 23 items, 55%; Inferential Comprehension: 17 items, 45%
Writing | Computerized | 68 min | Grammatical Relationships: 12 items, 13%; Structural Relationships: 16 items, 18.5%; Idioms, Word Choice, Mechanics: 16 items, 18.5%; Essay: 1 question, 50% (30 min)
Writing | Paper/Pencil | 60 min | Grammatical Relationships: 10 items, 13%; Structural Relationships: 14 items, 18.5%; Idioms, Word Choice, Mechanics: 14 items, 18.5%; Essay: 1 question, 50% (30 min)
Mathematics (calculators prohibited) | Computerized | 75 min | Conceptual/Procedural Knowledge: 21 items, 45%; Representing Quantitative Data: 13 items, 30%; Measurement/Geometry/Reasoning: 12 items, 25%
Mathematics (calculators prohibited) | Paper/Pencil | 60 min | Conceptual/Procedural Knowledge: 18 items, 45%; Representing Quantitative Data: 12 items, 30%; Measurement/Geometry/Reasoning: 10 items, 25%
We encourage all students who might wish to apply for acceptance by the Education Department as candidates for elementary or secondary licensure to complete the PPST as soon as possible, and we require all to do so by the semester in which they formally request candidacy. Those whose test performance falls below the qualifying score on any of the three skills will have opportunities to review their strengths and address their weaknesses in that area through a variety of learning opportunities before completing another version of the test. Since the time required to complete developmental work varies for each individual, the sample of those completing one or more PPST examinations in any one test year will include some who are thinking about becoming teachers but may not pursue that goal, others who have applied and await the Department’s acceptance, and still others who fell short on a subtest but have been accepted conditionally upon reaching or exceeding the qualifying score.
Findings
Reading Performance. Reading as tested by the PPST includes literal comprehension, defined by ETS as “the ability to accurately and completely understand the explicit content of a written message.” This skill is balanced by an exploration of an examinee’s inferential comprehension, “the ability to evaluate a reading selection and its messages” (PPST, p. 267). Additional information on these two facets of the test appears in Appendix A on page 12 of this report.
Table 2.1 summarizes the reading performance of our candidates as estimated by their scores on that subtest of paper-and-pencil versions of the PPST. Students’ performance was similar for each of the three test years included in this table. Median and high scores, as well as the middle range of scores, for the 66 completing this test in 2000-2001, the 79 who did so in 2001-2002, and the 106 testing during the following year are generally consistent with the performance of all students completing the reading section of the PPST during those years. Our lowest scoring student for each test year (167, 165, and 163 respectively) exceeded the lowest score in the national samples (150), while our highest observed scores fell below the high score in the national sample by one point for each of the three test years.
Table 2.1 Group Summary: PPST Reading Performance (national sample in parentheses)

Test Year | Number | Low Score | Median | High Score | 25th-75th %tile
00-01 | 66 (61,652) | 167 (150) | 179 (179) | 186 (189) | 176-182 (174-182)
01-02 | 79 (67,558) | 165 (150) | 180 (179) | 188 (189) | 178-183 (174-183)
02-03 | 106 (64,019) | 163 (150) | 180 (178) | 188 (189) | 175-182 (173-182)
The Educational Testing Service offers additional summaries of a test group’s performance in each of the two reading skill areas. Table 2.2 summarizes the mean percent correct earned by our students in literal and inferential comprehension for the two most recent of the three test years included in this review. Means for our students are contrasted with those awarded to candidates prepared by other Minnesota programs and with a national sample of all who completed the exam in each of those two test years. Differences between mean percentages for each group and subtest are included.
Table 2.2 Mean Percentage Correct: PPST Reading Performance by Skill Area (CSB/SJU and Minnesota entries show the difference from the Minnesota and U.S. means, respectively)

Content Area | Year | Points | CSB/SJU | Minnesota | U.S.
Literal Comprehension | 01-02 | 25 | 79% (+1%) | 78% (+3%) | 75%
Literal Comprehension | 02-03 | 25 | 75% (-2%) | 77% (+3%) | 74%
Inferential Comprehension | 01-02 | 20 | 79% (+5%) | 74% (+3%) | 71%
Inferential Comprehension | 02-03 | 18 | 70% (-3%) | 73% (+3%) | 70%
Note: ETS did not calculate group mean percent correct by PPST content area for the 2000-2001 test year.
Those of our students who completed the PPST reading test during 2001-2002 correctly answered 79% of the questions testing their literal comprehension of written passages, one percentage point above the average earned by their Minnesota peers (78%). The average percentage correct on that subtest for all Minnesota examinees (78%) in turn exceeded that of the national sample (75%) by three percentage points. A similar pattern appears for inferential comprehension among those taking the test during 2001-2002: our candidates exceeded the mean of all Minnesotans tested in that year (74%) by five percentage points, and Minnesotans exceeded the national mean percent correct for that area (71%) by three.
Those of our candidates completing their PPST reading test in the following year, however, fell below their state peers on both subtests (2002-2003: literal comprehension, 75% versus 77%; inferential comprehension, 70% versus 73%). As ETS will not provide item analysis data for its Praxis tests, we cannot look more closely for explanations of the drop in performance between these two groups.
We also receive an additional group report that may offer some help in our search for similarities and differences among PPST test results for these three years. Table 2.3 places scores earned by our students on each subtest of the PPST Reading exam in a quartile distribution formed from the subtest scores earned by each year’s national sample.
Table 2.3 Score Distribution by Area: PPST Reading Performance

Content Area | Year | 1st Quartile | 2nd Quartile | 3rd Quartile | 4th Quartile | Tested
Literal Comprehension | 00-01 | 5 (8%) | 22 (33%) | 25 (38%) | 14 (21%) | 66
Literal Comprehension | 01-02 | 16 (20%) | 19 (24%) | 25 (32%) | 19 (24%) | 79
Literal Comprehension | 02-03 | 22 (21%) | 38 (36%) | 35 (33%) | 20 (19%) | 106
Inferential Comprehension | 00-01 | 6 (9%) | 29 (44%) | 24 (36%) | 7 (11%) | 66
Inferential Comprehension | 01-02 | 4 (5%) | 28 (35%) | 30 (38%) | 17 (22%) | 79
Inferential Comprehension | 02-03 | 20 (19%) | 38 (36%) | 34 (32%) | 14 (13%) | 106
Note: ETS sets quartiles for each Praxis test so that each captures 25% of the scores earned by all examinees in a given year. A distribution of scores earned by an institution’s candidates may thus fail to replicate the national distribution of 25% of all scores in each quartile.
We should always be concerned about the preparation for college-level reading of students whose scores fall into the first quartile, as this subgroup could include students whose skills in reading may be unequal to the demands placed on them by their college coursework. Looking at this first quartile, we find that more of our students who completed their reading PPST in 2002-2003 may have difficulty drawing defensible inferences from their reading of longer or more complex passages, a weakness that should be watched in subsequent years (02-03: inferential comprehension, 20 of 106, or 19%, in the first quartile, compared with only 4 of 79, or 5%, of those tested during 01-02). Our experience with developmental opportunities in reading comprehension suggests that a significant weakness in inferential understanding will prove difficult to resolve.
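The quartile placement behind tables of this kind can be sketched briefly: the national score distribution supplies the 25th, 50th, and 75th percentile cut points, each student's subtest score is binned against them, and first-quartile students are flagged for developmental work. The cut points, student identifiers, and scores below are invented for illustration and are not ETS values.

```python
# Hypothetical sketch of quartile placement against national cut points.
# Cut points and scores are invented, not actual ETS figures.

def quartile(score: int, cuts: tuple[int, int, int]) -> int:
    """Place a score in quartile 1-4 given the national 25th/50th/75th
    percentile cut points; a score at or below a cut falls in the lower bin."""
    q = 1
    for cut in cuts:
        if score > cut:
            q += 1
    return q

national_cuts = (173, 178, 182)  # hypothetical national cut points
scores = {"s01": 170, "s02": 180, "s03": 185}

# Flag first-quartile students for developmental support.
needs_support = [sid for sid, sc in scores.items() if quartile(sc, national_cuts) == 1]
print(needs_support)  # -> ['s01']
```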
Writing Performance. The Educational Testing Service uses responses to both an essay and multiple-choice questions to describe the writing performance of those students who complete the PPST test for this skill area. Appendix B offers a summary of the concepts included in this test. Table 3.1 describes the writing performance on this test for prospective and accepted candidates completing the test in each of the three test years. A national sample, enclosed in parentheses within each cell, offers a comparison.
Table 3.1 Group Summary: PPST Writing Performance (national sample in parentheses)

Test Year | Number | Low Score | Median | High Score | 25th-75th %tile
00-01 | 64 (62,900) | 170 (152) | 178 (175) | 185 (190) | 174-179 (172-178)
01-02 | 80 (69,070) | 171 (150) | 177 (175) | 189 (190) | 175-180 (172-178)
02-03 | 105 (64,642) | 169 (150) | 177 (175) | 186 (190) | 175-180 (172-178)
As in reading comprehension, the lowest observed scores in writing earned by our prospective and accepted candidates (170 in 00-01, 171 in 01-02, and 169 in 02-03) exceeded the lowest scores in the national samples (152 in the first year, 150 in the second and third) during each of the three test years. Our students earned higher median scores in writing than did their national peers during each year. The highest observed score earned by our students fell below the highest score in the national sample by five scaled score points in the first test year, one in the second, and four in the third.
Table 3.2 provides a closer look at the performance of our students on each of the four major content areas that form the PPST writing test. Our accepted and prospective candidates averaged a higher percentage of correct answers than did their Minnesota and national peers in each of the four content areas included in the writing examination.
Table 3.2 Mean Percentage Correct: PPST Writing Performance (CSB/SJU and Minnesota entries show the difference from the Minnesota and U.S. means, respectively)

Content Area | Year | Points | CSB/SJU | Minnesota | U.S.
Grammatical Relationships | 01-02 | 15 | 55% (+1%) | 54% (+1%) | 53%
Grammatical Relationships | 02-03 | 13 | 59% (+2%) | 57% (+3%) | 54%
Structural Relationships | 01-02 | 19 | 60% (+8%) | 52% (+2%) | 50%
Structural Relationships | 02-03 | 19 | 58% (+2%) | 56% (+4%) | 52%
Idiom & Word Choice; Mechanics | 01-02 | 19 | 62% (+2%) | 60% (+3%) | 57%
Idiom & Word Choice; Mechanics | 02-03 | 19 | 65% (+3%) | 62% (+2%) | 60%
Essay | 01-02 | 12 | 73% (+3%) | 70% (+2%) | 68%
Essay | 02-03 | 12 | 70% (+2%) | 68% (+2%) | 66%
Note: ETS did not calculate group mean percent correct by PPST content area for the 2000-2001 test year.
In each content area, the mean of all students taking this exam in Minnesota exceeded the mean of their national peers. With the exception of the 2001-2002 group’s performance on structural relationships (comparisons, coordination, correlation, negation, parallelism, subordination), the performance of those tested in 2001-2002 appears generally similar to that of those tested in 2002-2003.
Given stronger group performance on the writing examination as a whole (Table 3.1) and in each content area forming that test (Table 3.2), we might expect a strong quartile distribution by content area. Table 3.3, which places our students’ scores within a quartile distribution of the national sample for each test year, confirms this expectation.
Table 3.3 Score Distribution by Area: PPST Writing Performance

Content Area | Year | 1st Quartile | 2nd Quartile | 3rd Quartile | 4th Quartile | Tested
Grammatical Relationships | 00-01 | 7 (11%) | 18 (28%) | 26 (41%) | 13 (20%) | 64
Grammatical Relationships | 01-02 | 9 (11%) | 27 (34%) | 30 (38%) | 14 (18%) | 80
Grammatical Relationships | 02-03 | 6 (6%) | 34 (32%) | 36 (34%) | 29 (28%) | 105
Structural Relationships | 00-01 | 4 (6%) | 20 (31%) | 22 (34%) | 18 (28%) | 64
Structural Relationships | 01-02 | 3 (4%) | 13 (16%) | 35 (44%) | 29 (36%) | 80
Structural Relationships | 02-03 | 13 (12%) | 28 (27%) | 27 (26%) | 37 (35%) | 105
Idiom & Word Choice; Mechanics | 00-01 | 5 (8%) | 20 (31%) | 19 (30%) | 20 (31%) | 64
Idiom & Word Choice; Mechanics | 01-02 | 5 (6%) | 26 (33%) | 35 (44%) | 14 (18%) | 80
Idiom & Word Choice; Mechanics | 02-03 | 16 (15%) | 33 (31%) | 35 (33%) | 21 (20%) | 105
Essay | 00-01 | 11 (17%) | 19 (30%) | 18 (28%) | 16 (25%) | 64
Essay | 01-02 | 9 (11%) | 19 (24%) | 23 (29%) | 29 (36%) | 80
Essay | 02-03 | 10 (10%) | 34 (32%) | 30 (29%) | 31 (30%) | 105
Note: Quartiles are set by ETS to capture the range of scores awarded to candidates in the national sample for each Praxis test administered during each test year. By definition, 25% of the scores of students in that national sample must fall within each quartile. Scores earned by an institution’s candidates and prospective candidates, however, may not replicate the national distribution in each quartile.
Overall, as revealed by their recall or recognition of grammar, structure, or the mechanics of writing and by their written essays, fewer of our students earned low scores placing them into the first quartile of the four content areas than did students in the national samples for each test year. In the most recent two years, about one-third of our students earned scores placing them in the fourth quartile for structural relationships (36% and 35%) and for their essays (36% and 30%). Some improvement in essay performance is evident across the three test years described in this table, with first-quartile essay scores falling from 17% in 00-01 to 10% in 02-03. The performance of those whose scores placed them in the first quartile (the lowest 25% of examination scores) could profit from developmental work.
Mathematics. All who are licensed to teach any grade level in Minnesota should have conceptual knowledge of “the foundational ideas of numbers” and the procedural knowledge “required to represent quantitative relationships” and to “plan, execute, interpret, or complete operations to solve problems” (PPST, 2003, p. 268). They must understand representations of quantitative information as they strive to “retrieve information from data to determine whether statements based on data are true or false, to recognize relationships in and make inferences from data, and to represent a given set of data graphically” (p. 269). Teachers must also “demonstrate a basic understanding of measurement, of the U.S. customary and metric systems of measurement, and of geometric properties and relationships” (p. 269).
The knowledge and use of mathematics defined by the questions included in the PPST can prove a difficult challenge for many who would pursue a vocation as an elementary or secondary level teacher. Table 4.1 offers a group summary of our students’ performance on this test.
Table 4.1 Group Summary: PPST Mathematics Performance (national sample in parentheses)

Test Year | Number | Low Score | Median | High Score | 25th-75th %tile
00-01 | 64 (64,508) | 158 (150) | 182 (178) | 190 (190) | 178-185 (172-183)
01-02 | 80 (70,589) | 160 (150) | 183 (178) | 190 (190) | 179-186 (172-183)
02-03 | 104 (65,609) | 161 (150) | 183 (178) | 190 (190) | 178-186 (171-183)
On the whole, our prospective and accepted candidates for licensure perform somewhat better on the PPST mathematics examination than do their peers in the national sample. The lowest observed score for our students in each year (158 in 00-01) exceeds the lowest score observed among all who completed this test, while the highest observed score (190 in each of the three years) equals the highest score earned in the national samples. The median score earned by our students during each year also exceeds the national median. The range of scores capturing one-half of our students for each test year is smaller (7 score points, 178-185, in 00-01; 7 points, 179-186, in 01-02; 8 points, 178-186, in 02-03) than the corresponding range in the national samples (11 score points in the first and second years, 12 in the third). Overall, this pattern suggests that we are attracting better-prepared candidates with stronger knowledge and skills in mathematics.
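The group statistics used throughout these summaries (low, median, high, and the 25th-75th percentile range that captures the middle half of a group) can be recomputed from a raw score list with Python's standard library. The score list below is invented for illustration, not our candidates' actual results.

```python
# Sketch of the group-summary statistics from an invented score list.
from statistics import median, quantiles

scores = [161, 168, 175, 178, 180, 183, 185, 186, 188, 190]  # hypothetical

# quantiles(..., n=4) returns the 25th, 50th, and 75th percentile estimates.
q1, q2, q3 = quantiles(scores, n=4)

print(min(scores), max(scores), median(scores))  # low, high, median
print(q3 - q1)  # width of the range holding the middle half of the group
```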
If so, we should expect to see confirmation of such performance in an analysis of our students’ performance in each of the content areas included in the PPST mathematics examination. Table 4.2 offers this analysis along with comparisons to state and national samples.
Table 4.2 Mean Percentage Correct: PPST Mathematics Performance (CSB/SJU and Minnesota entries show the difference from the Minnesota and U.S. means, respectively)

Content Area | Year | Points | CSB/SJU | Minnesota | U.S.
Conceptual & Procedural Knowledge | 01-02 | 20 | 74% (+6%) | 68% (+4%) | 64%
Conceptual & Procedural Knowledge | 02-03 | 20 | 73% (+5%) | 68% (+6%) | 62%
Representations of Quantitative Information | 01-02 | 13 | 78% (+3%) | 75% (+5%) | 70%
Representations of Quantitative Information | 02-03 | 13 | 78% (+4%) | 74% (+7%) | 67%
Measurement, Geometry, and Reasoning | 01-02 | 10 | 78% (+8%) | 70% (+6%) | 64%
Measurement, Geometry, and Reasoning | 02-03 | 10 | 76% (+7%) | 69% (+8%) | 61%
Note: ETS did not calculate group mean percent correct by PPST content area for the 2000-2001 test year.
Those of our students who completed a PPST mathematics examination in the second and third test years presented in Table 4.2 attained a higher mean percentage correct than did their Minnesota or national peers on each of the three areas included in this test. Their knowledge of measurement, geometry, and use of mathematical logic seems stronger than that of their Minnesota peers for both years (0102, 78% correct versus 70%, +8%; 0203, 76% versus 69%, +7%). Differences between CSB/SJU students and those in other Minnesota teacher preparation programs are smaller, but still favor our candidates, in the visual representation and use of quantitative information for both test years (+3% in 0102, 78% correct versus 75%; +4 in 0203, 78% versus 74% in Minnesota as a whole).
We might expect to see further evidence of this pattern in a quartile distribution of scores earned by students seeking licensure through our program. Table 4.3 provides this evidence.
Table 4.3 Score Distribution by Area: PPST Mathematics Performance

Content Area | Year | 1st Quartile | 2nd Quartile | 3rd Quartile | 4th Quartile | Tested
Conceptual & Procedural Knowledge | 00-01 | 8 (13%) | 19 (30%) | 22 (34%) | 15 (23%) | 64
Conceptual & Procedural Knowledge | 01-02 | 5 (6%) | 20 (25%) | 24 (30%) | 31 (39%) | 80
Conceptual & Procedural Knowledge | 02-03 | 10 (10%) | 11 (11%) | 42 (40%) | 41 (39%) | 104
Representations of Quantitative Information | 00-01 | 3 (5%) | 15 (23%) | 34 (53%) | 12 (19%) | 64
Representations of Quantitative Information | 01-02 | 7 (9%) | 19 (24%) | 33 (41%) | 21 (26%) | 80
Representations of Quantitative Information | 02-03 | 5 (5%) | 21 (20%) | 47 (45%) | 31 (30%) | 104
Measurement, Geometry, and Reasoning | 00-01 | 5 (8%) | 19 (30%) | 12 (19%) | 28 (44%) | 64
Measurement, Geometry, and Reasoning | 01-02 | 6 (8%) | 11 (14%) | 33 (41%) | 30 (38%) | 80
Measurement, Geometry, and Reasoning | 02-03 | 2 (2%) | 19 (18%) | 45 (43%) | 38 (37%) | 104
Note: Quartiles set by ETS capture the range of scores awarded to those candidates in the national sample who completed the PPST mathematics examination administered during each test year. Scores earned by an institution’s candidates may thus not replicate the national distribution of 25% of scores as noted for each quartile.
While we should retain continuing concern for those of our candidates whose scores fall into the first quartile of all candidates’ PPST mathematics scores, their experience is balanced by the stronger performance of their CSB/SJU peers with scores in the third and fourth quartiles. Despite the comforting trend toward higher scores in mathematics across the three test years, developmental opportunities will still be required by those needing help in this skill area to perform to our expectations.
PPST Failure Rate. The present climate of increasing accountability, expected both of the colleges that prepare teachers and of the state agencies that license them, casts greater importance on the proportion of candidates who fail licensure tests than such instruments might otherwise warrant. All teacher preparation programs in all states must report to the U.S. Department of Education the pass rate of their candidates on all licensure tests required by their state licensing authorities, for the Commissioner’s analysis and report to Congress. States make similar analyses available to their legislators and citizens. Colleges publish test results for current and prospective students to inspect, helping them seek out an institution and a major that will best meet their goals. Summaries of our candidates’ performance on all relevant licensure examinations appear at our Title II website.
Table 5.0 offers the failure rate for each of the three test years included in this report. Drawn from a distribution of test scores provided by ETS that included all PPST tests completed in each test year, the table overstates our students’ actual failure rate, which runs from one to two percent across all licensure examinations. All students who noted on their test registration forms that they were prepared for licensure through our program are included in this analysis; at times, students from other colleges have been incorrectly classified by ETS as our candidates. Further, students may attempt to pass the PPST on more than one occasion during a test year. The results of each attempt are included in Table 5.0, but only the highest score earned during a candidate’s multiple attempts appears in federal Title II reports.
Table 5.0 Examinees Reporting Preparation for Licensure at CSB and SJU Who Did Not Attain Minnesota Qualifying Scores on Praxis I Examinations

Praxis I Subtest | Test Year | Qualifying Score | Proportion Below Qualifying Score
Reading Comprehension | 00-01 | 173 | 4 of 66 (6%)
Reading Comprehension | 01-02 | 173 | 6 of 79 (8%)
Reading Comprehension | 02-03 | 173 | 11 of 106 (10%)
Writing | 00-01 | 172 | 3 of 64 (5%)
Writing | 01-02 | 172 | 2 of 80 (3%)
Writing | 02-03 | 172 | 3 of 105 (3%)
Mathematics | 00-01 | 171 | 5 of 64 (8%)
Mathematics | 01-02 | 171 | 5 of 80 (6%)
Mathematics | 02-03 | 171 | 4 of 104 (4%)
Note: Examinees may complete tests during any year of their enrollment at CSB and SJU. Test results reported in this table may include more than one attempt to pass one or more tests by the same individual in any one test year, and tabulations exclude scores from computer-based versions of the PPST. For these reasons, results presented here may not agree with state and federal Title II reports, which include only the highest score when a candidate makes more than one attempt to pass a test. Title II reports include scores from verified program completers, while Praxis reports may include scores from students who were not prepared for licensure through our program but who nonetheless indicated our colleges as their preparation site on their Praxis I registrations.
In past years all but one or two of our candidates have passed all three sections of the PPST, although some did so only after completing developmental work and a second attempt. Few students required more than two attempts. Actual PPST failure rates hover between one and two percent annually.
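The gap between the per-attempt rates tabulated above and the Title II figures can be made concrete: the table counts every attempt, while Title II keeps only each candidate's highest score. The attempt records below are invented for illustration; the qualifying score of 173 is the actual reading cut score.

```python
# Why a per-attempt tally overstates the failure rate relative to
# Title II reporting. The (candidate, score) records are invented.
READING_CUT = 173  # Minnesota's reading qualifying score

attempts = [("a", 168), ("a", 175), ("b", 170), ("b", 171), ("c", 180)]

# Per-attempt rate, as tabulated in Table 5.0: every attempt counts.
per_attempt = sum(score < READING_CUT for _, score in attempts) / len(attempts)

# Title II style: reduce to each candidate's best score first.
best: dict[str, int] = {}
for cand, score in attempts:
    best[cand] = max(best.get(cand, score), score)
per_candidate = sum(score < READING_CUT for score in best.values()) / len(best)

# Candidate "a" failed once, then passed: the first method counts the
# failure, the second does not.
print(per_attempt, per_candidate)
```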
Observations
The information provided by the Educational Testing Service concerning the PPST performance of our prospective and accepted candidates for teacher licensure affirms that many of those students possess skills in reading, writing, and using mathematics that will sustain their preparation for elementary or secondary teaching (see Tables 2.3, 3.3, and 4.3, third and fourth quartiles). Those whose scores fall into the first and second quartiles, even if they “pass” with scores that equal or exceed the qualifying score for an examination, may still require support as they meet the challenges of collegiate instruction. These students will find that support should they enroll in courses, request guided tutoring, or complete computer-managed instruction designed to strengthen their academic skills.
While trends revealed by only three years of comprehensive data must be advanced with caution, weaker performance in reading may be an emerging area of concern, as more students seem troubled by an inability to draw valid inferences from written passages or to recall the literal meaning of a message (see Tables 2.2 and 2.3). At the same time, fewer students tested during each of the three years reviewed in this report seem to be challenged by weaker skills in mathematics or writing (see Tables 3.2 and 4.3). Should the trend toward weaker reading comprehension continue, we may see demand for developmental assistance grow in reading while requests for help with writing or mathematics stabilize or decline. Such shifts would have significant implications for those who provide or manage our colleges’ developmental services.
References
PPST: Pre-Professional Skills Tests Study Guide (2nd ed.). (2003). Educational Testing Service, Princeton, New Jersey.
Report of Minnesota’s Administration of the Skills Area Examinations: 1987-1996. (1997). Minnesota Board of Teaching, Roseville, Minnesota.
D. Leitzman
July 2004
Appendix A: Design of the PPST Reading Examination
I. Literal Comprehension: The ability to understand accurately and completely the explicit content of a written message.
A. Main Idea questions involve identifying summaries or paraphrases of the main idea or primary purpose of a reading selection.
B. Supporting Idea questions involve identifying summaries or paraphrases of supporting ideas.
C. Organization questions involve recognizing how a reading selection is organized, how it uses language, how the ideas in a selection are related to one another, or how the key phrases and transition words are used in a reading selection.
D. Vocabulary questions involve identifying the meanings of words as they are used in the context of a reading selection.
II. Critical and Inferential Comprehension: The ability to evaluate a reading selection and its messages.
A. Argument evaluation questions involve determining the strengths and weaknesses of arguments in a reading selection, determining the relevance of evidence presented in the reading selection to the assertions made in the selection, or judging whether material presented is fact or opinion.
B. Inferential reasoning questions involve drawing inferences and implications from the directly stated content of a reading selection, determining the logical assumptions underlying a selection, or determining the author’s attitude toward the material discussed.
C. Generalization questions involve recognizing situations that are similar to the material in a reading selection, drawing conclusions about the material in a selection, or applying ideas from the selection to new situations.
This summary is adapted from information on page 267 of the PPST: Pre-Professional Skills Tests Study Guide as published in 2003 by the Educational Testing Service of Princeton, New Jersey.
Appendix B: Design of the PPST Writing Examination
I. Grammatical Relationships. Questions call for identification of errors in adjectives, adverbs, nouns, pronouns, and verbs.
II. Structural Relationships. Questions in this area seek recognition of errors of comparison, coordination, correlation, negation, parallelism, and subordination.
III. Idiom/Word Choice, Mechanics, and No Error. Examinees respond to questions seeking their identification of errors in the use of idiomatic expressions, word choice, capitalization, and punctuation. Additional questions test for recognition of sentences without error.
IV. Essay. Examinees completing this test write an essay that fits an assigned task and audience. They must organize and develop their ideas logically, making clear connections between them. Writers must also provide and sustain a clear focus or thesis. They will use supporting reasons, examples, and details to develop clearly and logically the ideas presented in their essays. They will thus demonstrate facility in their use of language and the ability to use a variety of sentence structures. Successful writers will construct sentences that are generally free from errors in standard written English.
This summary is freely adapted from information on pages 270-271 of the PPST: Pre-Professional Skills Tests Study Guide as published in 2003 by the Educational Testing Service of Princeton, New Jersey.
Appendix C: Design of the PPST Mathematics Examination
I. Conceptual Knowledge: Demonstrate number sense and operation sense—that is, an understanding of the foundational ideas of numbers, number properties, and operations defined on numbers (whole numbers, fractions, and decimals).
A. Order: demonstrate an understanding of order among whole numbers, fractions, and decimals.
B. Equivalence: Demonstrate that a number can be represented in more than one way.
C. Numeration and Place Value: Understand how numbers are named, their place value, and their order of magnitude.
D. Number Properties: Demonstrate an understanding of the properties of whole numbers without necessarily knowing the names of those properties.
E. Operation Properties: Demonstrate an understanding of the commutative, associative, and distributive properties of basic operations (addition, subtraction, multiplication, and division) without knowing the names for those properties. Recognize equivalent computational procedures.
II. Procedural Knowledge: Demonstrate an understanding of the procedures required to represent quantitative relationships and the ability to plan, execute, interpret, or complete operations to solve problems.
A. Computation: Perform computations; adjust the result of a computation to fit the context of a problem; identify numbers or information or operations to solve a problem.
B. Estimation: Estimate the result of a calculation; determine the reasonableness of an estimate.
C. Ratio, Proportion, and Percent: Solve problems involving ratio, proportion, and percent.
D. Probability: Interpret numbers used to express simple probability; assign a probability to an outcome.
E. Equations: Solve simple equations and inequalities; predict the outcome of changing some number or condition in a problem.
F. Algorithmic Thinking: Demonstrate an understanding of the algorithmic point of view—that is, follow a given procedure; recognize various ways to solve a problem; identify, complete, or analyze a procedure; discover patterns in a procedure.
III. Representations of Quantitative Information: Demonstrate an ability to interpret visual displays of quantitative information, retrieve information from data, determine whether statements based on data are true or false, recognize relationships in and make inferences from data, and represent a given set of data graphically.
A. Interpretation: Read and interpret visual displays of quantitative information, such as bar graphs, line graphs, pie charts, pictographs, tables, stem plots, scatter plots, schedules, simple flow charts, and diagrams; recognize relationships in data; determine an average, a range, a mode, or a median.
B. Trends: Given a data display, observe groupings, make comparisons, and make predictions or extrapolations.
C. Inferences: Given a data display, draw conclusions or make inferences from the data.
D. Patterns: Identify and recognize patterns in data such as variation.
E. Connections: Demonstrate an understanding of the relationship between numerical values in a table, the symbolic rule relating table values, and the corresponding graphical representation of the table and the rule; choose a graph appropriate to represent a given set of data; recognize quantitative relationships in symbols or in words.
IV. Measurement and Informal Geometry: Demonstrate a basic understanding of measurement, of the U.S. customary and metric systems of measurement, and of geometric properties and relationships. At least half of the questions will focus on informal geometry.
A. Systems of Measurement: Demonstrate basic literacy in the U.S. customary and metric systems of measurement; convert from one unit to another within the same system; recognize and use appropriate units for making measurements; read a calibrated scale.
B. Measurement: Determine the measurements needed to solve a problem; recognize and use geometric concepts in making linear, area, and volume measurements; solve measurement problems by using a formula, estimating, employing indirect measurement, using rates as measures, making visual comparisons, using scaling/proportional reasoning, or using a nonstandard unit.
C. Geometric Principles: Recognize and use geometric properties and relationships in both pure and real-world situations, such as recognizing a symmetrical design or determining a distance using the Pythagorean relationship.
V. Formal Mathematical Reasoning: Demonstrate the ability to use the basics of logic in a quantitative context.
A. Logical Connectives and Quantifiers: Interpret statements that use logical connectives (and, or, if…then) as well as quantifiers (some, all, none).
B. Validity of Arguments: Use deductive reasoning to determine whether an argument (a series of statements leading to a conclusion) is valid or invalid.
C. Generalization: Identify an appropriate generalization, an example that disproves an inappropriate generalization, or a hidden assumption.
This summary is adapted from information on pages 269 and 270 of the PPST: Pre-Professional Skills Tests Study Guide as published in 2003 by the Educational Testing Service of Princeton, New Jersey.
Appendix D: A Note on Our PPST Tests and Samples
All PPST information included in this summary was drawn from Institutional Summary Reports issued by the Educational Testing Service at the close of each test year. Variations in tests and in reporting practices from year to year reduced the number of students whose scores could be combined and reported. Scores were excluded from the review when different forms of a subtest hindered meaningful comparisons within or among groups. Incompatible scores on discontinued computer-based tests (CBT) were excluded from our analysis. Small samples of SJU students excluded by ETS from reports of larger joint college samples during some test years were also excluded in the absence of summary statistics for these few candidates.
While ETS has introduced new computer-managed tests that describe students' performance using scaled scores placed on the same metric as the paper-and-pencil PPST, this "CPPST" presents examinees with a different test design and testing conditions that may not be equivalent to the experiences of those who complete the paper-and-pencil version. Since ETS did not provide full reports for those who completed CPPST tests and did not combine their scores with those of PPST examinees in common analyses, all students completing the CPPST series have been excluded from our analysis until we learn more about the properties of these new examinations. Table D.1 summarizes all versions of the PPST completed by all prospective and accepted candidates for licensure during each of the three test years examined for this review, noting those which were excluded.
Table D.1 Sampling Frame for Review of PPST Performance
Year     Sample    Test                  Included  Excluded  Reason for Deletion
2000-01  CSB/SJU   PPST Reading 0710     66
                   CBT Reading 0711                32        Scoring
                   PPST Writing 0720     64
                   CBT Writing 0721                30        Scoring
                   PPST Math 0730        64
                   CBT Math 0731                   29        Scoring
         SJU       PPST Reading 0710               12        Sample
                   CBT Reading 0711                21        Sample & Scoring
                   PPST Writing 0720               13        Sample
                   CBT Writing 0721                22        Sample & Scoring
                   PPST Math 0730                  12        Sample
                   CBT Math 0731                   20        Sample & Scoring
2001-02  CSB/SJU   PPST Reading 0710     79
                   CBT Reading 0711                11        Scoring
                   CPPST Reading 5710              26        Test Design
                   PPST Writing 0720     80
                   CBT Writing 0721                11        Scoring
                   CPPST Writing 5720              25        Test Design
                   PPST Math 0730        80
                   CBT Math 0731                   12        Scoring
                   CPPST Math 5730                 26        Test Design
2002-03  CSB/SJU   PPST Reading 0710     106
                   CPPST Reading 5710              39        Test Design
                   PPST Writing 0720     105
                   CPPST Writing 5720              35        Test Design
                   PPST Math 0730        104
                   CPPST Math 5730                 37        Test Design
         SJU       PPST Reading 0710               3         Sample
                   PPST Writing 0720               4         Sample
                   PPST Math 0730                  3         Sample
The following tables reveal similarities between our candidates' performance on the CPPST and the PPST for each skill area on tests completed during the 2002-2003 testing year. National samples for each test are included in parentheses as a point of comparison. The final table, D.3, reveals the number and proportion of those tested whose scaled scores fell below Minnesota's minimum qualifying score for each of the three skill areas.
Table D.2.1 Comparison of PPST and CPPST Scores in Reading; 2002-2003
Test   Number        Low Score  Median     High Score  25th-75th %tile
PPST   106 (64,019)  163 (150)  180 (178)  188 (189)   175-182 (173-182)
CPPST  39 (85,064)   170 (151)  181 (179)  186 (187)   177-183 (174-183)
Table D.2.2 Comparison of PPST and CPPST Scores in Writing; 2002-2003
Test   Number        Low Score  Median     High Score  25th-75th %tile
PPST   105 (64,642)  169 (150)  177 (175)  186 (190)   175-180 (172-178)
CPPST  35 (79,891)   170 (153)  177 (176)  184 (188)   174-179 (173-178)
Table D.2.3 Comparison of PPST and CPPST Scores in Mathematics; 2002-2003
Test   Number        Low Score  Median     High Score  25th-75th %tile
PPST   104 (65,609)  161 (150)  183 (178)  190 (190)   178-186 (171-183)
CPPST  37 (87,091)   159 (150)  184 (178)  190 (190)   178-187 (173-183)
Table D.3 Comparison of Candidate Failure Rates for PPST and CPPST; 2002-2003
Examination     Qualifying Score  Number and Percentage Below Qualifying Score
PPST Reading    173               11 of 106, 10%
CPPST Reading   173               2 of 39, 5%
PPST Writing    172               3 of 105, 3%
CPPST Writing   172               1 of 35, 3%
PPST Math       171               4 of 104, 4%
CPPST Math      171               2 of 37, 5%
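The percentages in Table D.3 are simple proportions rounded to the nearest whole percent. A minimal sketch that recomputes them from the counts in the table itself (the function name is illustrative, not part of any ETS reporting tool):

```python
# Recomputing the Table D.3 failure rates from its own counts.
# failure_rate is an illustrative helper, not an ETS-defined measure.

def failure_rate(n_below, n_total):
    """Percentage of examinees scoring below the qualifying score."""
    return round(100 * n_below / n_total)

rows = [
    ("PPST Reading",  173, 11, 106),
    ("CPPST Reading", 173,  2,  39),
    ("PPST Writing",  172,  3, 105),
    ("CPPST Writing", 172,  1,  35),
    ("PPST Math",     171,  4, 104),
    ("CPPST Math",    171,  2,  37),
]

for exam, cutoff, below, total in rows:
    print(f"{exam}: {below} of {total}, {failure_rate(below, total)}%")
```

Run as written, this reproduces the six percentages reported in the table (10%, 5%, 3%, 3%, 4%, 5%).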