Element 2: Data Collection, Analysis, and Evaluation

Implementation. The components of the unit's assessment system focus on key dimensions of candidates' experiences in their licensure programs, offer useful information likely to strengthen those programs, and fit within the limits of time and energy available for managing the assessment system. Some planned methods for collecting information have not survived beyond their trial application, often because their development proved to be too difficult or their application demanded an investment greater than the value of the information they could provide.

We initially planned to extend the use of performance profiles to methods courses, but the classroom teaching time included in two-credit, half-semester courses proved too brief to provide a meaningful sample of candidates' performance or to allow faculty to judge that performance as our notion of a profile required. Anticipated use of teacher work samples was delayed, then set aside, as guiding candidates to prepare useful samples placed a greater burden on instructors than anticipated.

Since 2009, however, partnerships that more area schools have formed with the unit offer the opportunity to recast two-credit methods courses as full-semester courses with integrated clinical practica. These clinically focused experiences provide candidates with more and longer opportunities to plan, teach, assess, and reflect on K-12 students' learning, individually and in teams. Building on the work of her predecessor, the unit's K-6 mathematics methods instructor refined a two-week clinical experience at Saint Boniface Elementary School offering one week of observation followed by a week of supervised teaching (EDUC 325). The unit's "Reading, Writing and Literacy" course (EDUC 347) places elementary candidates in area schools for a clinical experience extending through most of the semester, during which they provide individual, small-group, and whole-class literacy assessment and instruction under the shared supervision of the course instructor and cooperating teacher. Collaborating with a partner school requesting help for middle level students with modest reading skills, unit faculty combined their mid-level social studies (EDUC 358SOC) and language arts pedagogy (EDUC 358LA) courses to provide candidates a longer-term clinical experience teaching reading skills to students in social studies and English classes. Candidates enrolled in world language pedagogy courses work in teams and individually to plan a two-week language unit that they teach and assess with elementary and middle level students enrolled in two area partner schools.

These recent changes in how we use clinical experiences to prepare candidates for teaching middle level students may encourage a return to some form of work sampling, as many of its elements are employed in these courses. The unit elected to embed preparation for some tasks included in Minnesota's Teacher Performance Assessment in candidates' methods experiences, suggesting that a modified performance profile could be devised to contribute indicators of candidates' progress toward unit goals should unit faculty find it helpful.

In place of a summative indicator, the formative assessments of candidates' emerging understanding of their licensure areas included in all methods courses may continue to serve as a viable estimate for most candidates. All candidates are observed and critiqued as they teach K-12 students in area schools for their methods courses. The literacy profiles developed by elementary candidates working with K-6 students in area schools for Education 347, Reading, Writing and Language Growth; the teaching resource binders developed by elementary candidates in their methods courses; the curriculum projects that all candidates prepare in Human Relations (Education 390); and the classroom management plans all candidates devise for Issues in Education (Education 359) together provide formative screens through which candidates deficient in knowledge and skills cannot pass without resolving their deficiencies.

While attractive as an additional indicator of candidates' maturing knowledge of their subject matter, the future of a formal integrative experience beyond student teaching, as described in the 2001 plan, remains uncertain. We now have assessments of candidates who design an instructional unit and related lessons as part of their student teaching work samples. This indicator can estimate candidates' grasp of the "central concepts, tools of inquiry, and structures of the disciplines they are preparing to teach" (Framework; Program Goal 1). Arts and sciences faculty provide such experiences for many secondary level candidates. The development of an integrative experience remains an attractive option for affirming candidates' knowledge in multi-disciplinary licensure areas such as social studies, but securing resources to design and assess such projects will prove challenging.

The unit has developed and is using an internet-based first-year teacher survey to gather the views of graduates completing their first year of teaching. Unit goals form the core of the instrument, along with questions exploring other themes of interest to the unit. Graduates who were licensed and held teaching positions in 2009, 2010, and 2011 responded to questions focused on their approach to and skills in teaching diverse learners (Unit/MTLE Goal 3, NCATE Standard 4), their preparation for teaching reading following our embedding of new reading standards in nearly all licensure programs, and ways in which the unit might have improved their preparation for their first year of practice. While we are encouraged by the shift from paper-and-pencil to on-line surveys, small samples remain troublesome, reflecting difficulties in securing graduates' email addresses as they move to new positions or change their family names upon marriage.

Using information gathered from graduates responding to the first-year teacher survey, the unit has devised an internet survey of first-year teachers' principals or supervisors and will disseminate it in late summer 2012.

We continue to use our candidate exit survey and interview, grounded in unit goals and completed by all elementary and secondary candidates since fall 2002, to gather their perspectives on areas of their licensure programs in need of review. The exit survey and graduate survey share some items, offering opportunities to look for confirmation of trends in our preparation of candidates.

The unit first used the Academic Profile to assess the academic skills of prospective candidates in 1986. This criterion-referenced test was developed by the Educational Testing Service and the College Board to describe students' college-level skills in reading, writing, and mathematics as well as their knowledge of the natural sciences, humanities, and social sciences. The unit discontinued the Academic Profile in 2003 and adopted candidates' ACT and PPST scores as indicators of expected performance in reading, writing "mechanics," and mathematics. In 2010 Minnesota's Board of Teaching adopted new licensure examinations prepared by Pearson Education to replace the Educational Testing Service (ETS) Praxis tests of academic skills, content knowledge, and pedagogy. The unit now uses ACT scores as well as results from the new Minnesota Teacher Licensure Examination (MTLE) academic skills tests in reading, writing, and mathematics to inform the selection of prospective teachers for acceptance as licensure candidates. Conversion from ETS to Pearson tests was not without difficulty, which some attributed to flaws in planning and test design or to the test vendor.

The unit replaced its Education Department Writing Assessment, first used in 1986, with a set of three writing assignments embedded in each section of its introductory course, Teaching and Learning in a Diverse World (EDUC 111). Completed by all prospective candidates, these embedded essays are anchored in the content and outcomes of the course, are written on topics common to all sections, and are scored by course instructors using a common rubric derived from the "six traits" approach to writing assessment. Despite some unforeseen variations in the testing protocol, preliminary reliability studies suggest merit in continuing to refine this replacement for our former writing examination. Modifications in scoring procedures and more consistent use of the three writing tasks by instructors should further improve the consistency of the writing scores they assign. Correlations of scores from our embedded writing exercises with prospective candidates' ACT writing scores and their scores on the state's new MTLE writing test reflected the divergent approaches to developing and scoring an authentic writing sample. Further work should improve the unit's use of such indicators to identify prospective candidates who could profit from developmental work in writing.
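To illustrate the kind of correlation check described above, the minimal sketch below uses Python's standard library on hypothetical score triples (embedded-essay total, ACT writing, MTLE writing). The values, score ranges, and variable names are invented for illustration only and do not represent unit data.

```python
# Illustrative sketch only: correlating embedded-essay scores with
# ACT and MTLE writing scores. All records below are hypothetical.
from statistics import correlation  # Pearson's r; Python 3.10+

# Hypothetical records: (embedded six-traits total, ACT writing, MTLE writing)
records = [
    (22, 7, 235), (28, 9, 252), (18, 6, 228),
    (25, 8, 244), (30, 10, 259), (20, 7, 231),
]

embedded = [r[0] for r in records]
act      = [r[1] for r in records]
mtle     = [r[2] for r in records]

print("embedded vs. ACT writing :", round(correlation(embedded, act), 2))
print("embedded vs. MTLE writing:", round(correlation(embedded, mtle), 2))
```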

Our directors of elementary and secondary student teaching continue to offer modest revisions of the unit's Student Teaching Performance Profile, used to summarize the performance of student teachers in both programs. Profiles capture observed performance through standards-based ratings on behaviorally anchored scales. They are completed by college supervisors who observe and counsel our candidates during their semester-long student teaching experience, and the directors of student teaching review supervisors' findings on key unit and state standards. Flaws in some supervisors' ratings during the introduction of this summative review process encouraged additional training in the use of this indicator; consequently, all college supervisors are now trained in preparing the profiles for their candidates.

While candidates have planned and completed units of instruction as part of their student teaching experience for many years, the systematic review of those units in the summer of 2000 encouraged our adaptation of the teacher work sample used in some teacher preparation programs. As the primary evidence of whether candidates can help all their students learn, the work sample is an important indicator in our assessment system as well as a significant part of our summative performance profiles.

During the 2011-2012 academic year the unit completed the first two of three phases in a review of the consistency of supervisors' ratings of candidates' performance. Unit plans were scored using rubrics built around the generally similar requirements of the two programs; the results showed high consistency among six supervisors reading and evaluating anonymized candidates' unit plans. Conversation among scorers during this process revealed areas for improvement in the rubrics used for scoring candidates' work as well as in the guidance offered to candidates as they complete the tasks. The final phase of the study, to be concluded in the spring of 2013, will likely confirm the findings in hand.
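A minimal sketch of one way pairwise consistency among raters could be tallied appears below. The plan identifiers, scores, number of raters, and the "within one rubric point" agreement rule are hypothetical stand-ins and do not describe the study's actual procedures.

```python
# Illustrative sketch only: gauging consistency among raters who score
# the same unit plans. Scores and the agreement rule are hypothetical.
from itertools import combinations

# ratings[plan_id] = rubric scores (1-4) assigned by six hypothetical raters
ratings = {
    "plan_A": [3, 3, 4, 3, 3, 3],
    "plan_B": [2, 2, 2, 3, 2, 2],
    "plan_C": [4, 4, 4, 4, 3, 4],
}

agree, total = 0, 0
for scores in ratings.values():
    for a, b in combinations(scores, 2):
        total += 1
        if abs(a - b) <= 1:  # count rater pairs agreeing within one rubric point
            agree += 1

print(f"pairwise agreement (within one point): {agree / total:.0%}")
```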

Minnesota's Board of Teaching has selected the Stanford/Pearson Teacher Performance Assessment (TPA) as an indicator of institutional performance. The unit began integrating the TPA with its teacher work sample in 2009, and candidates completed the four tasks that form the TPA portfolio in the spring of 2010. The TPA has undergone nearly constant revision to accommodate both Minnesota's licensure standards and Pearson's evolving scoring procedures; all of the unit's candidates again completed TPA portfolios in the fall of 2011 and the spring of 2012 and will do so again in the coming academic year, the last of the state's two-year "field test" of this technique. Scores from state-trained TPA raters were not provided during the pilot year or the first field test year, although unit faculty helped score other institutions' TPA packets. Pearson has prepared scores for TPA materials submitted by our candidates during the spring 2012 semester: our unit score on each of eleven scales was "3" on a five-point scale for the 50 scored TPA packets. The limitations of this indicator reflect difficulties in scoring and scorer training, the limited three- to five-day "slice" it captures of candidates' sixteen weeks of teaching, candidates' motivation to complete the required packets, and the TPA's reliance on self-reported and self-provided data. Such difficulties may be resolved as the final year of the field trial is completed, but the value of the data provided by this costly indicator may prove to be modest for our assessment needs. We have added the TPA to the unit plan in hopes that the information it provides may prove useful, but for now we have little to show for our and our candidates' investment in what they now regard as an unproductive addition to their student teaching experience.

In 2000 the unit consulted with the colleges' information technology specialists on the design of a diversity transcript that would capture the wide range of events or activities related in some way to ethnic, cultural, racial, or other forms of diversity. Records might show diversity-related plays or concerts attended, work with diverse students in field or clinical settings, or other ways of learning about the wider world around us. Incremental improvements in an initially flawed system have increased its usability. A diversity transcript is required of every prospective candidate completing the unit's introductory course.

The habit of recording such events beyond the first course has been hindered by difficulties in retrieving, printing, and analyzing the transcript. Further modification is needed to make full use of the information transcripts could provide. We have considered requiring transcripts when prospective candidates apply for acceptance into licensure programs but have not done so because of the system's technological flaws. If accurately maintained, these self-report summaries could prove helpful in identifying needed curricular revisions.

While the findings of assessment research driven by the unit's system were shared with the unit in the first year of its development, more extensive explorations of candidates' performance were hampered by changes in the technology used to develop and maintain a performance database capable of capturing, storing, and examining useful information.

Prototypes developed in several software formats were partially successful but became obsolete as our colleges adopted new programs. Temporary "work-around" solutions, while providing needed data, were both time consuming and unable to realize the full promise of the planned system. The "final" prototype for our performance database, in use since 2001, continues to capture candidate data efficiently. While some difficulties in retrieving data have been overcome with the help of technicians from our colleges' shared Instructional Technology Services, new demands from our state regulatory agency, and perhaps from the federal government, may encourage the unit to purchase a commercial product that can meet increasing requests for more data, provided more often, about all candidates, in an as-yet-untested format for data management. Compliance with such requirements would significantly alter the ways in which we gather and use assessment information.

Click the link below to access the Information Sources Table.

Information Sources Table