
Technology Education Program

1.   Please describe your program's assessment process and what standards you are measuring in relation to the NCATE and State standards of knowledge (content, pedagogy and professional), skills (professional and pedagogical) and dispositions. Is the system course based, end of program based, or other? Be sure to reference how the faculty in your program was involved in developing the assessment process. In addition, describe how the assessment of standards relates to the unit's and program's conceptual framework.

Program Interpretations and Conclusions:

In the entire major, there is only one course taken exclusively by technology education majors. Therefore, it was decided that the assessment process would need to be an "end of program" assessment. Because the state and national standards are very similar, it was decided to build the assessment system around the 11 state standards. One goal was to determine whether the courses and assignments within the core were adequately preparing students. Each candidate must therefore provide artifacts and a reflection for each of the 11 standards. One standard in particular (standard 9) requires students to show competence in five of seven areas of technology education. Since technology education is so broad, I knew this would be difficult to demonstrate, but I wanted to see how students demonstrated their competence and in which areas they felt competent. The same conversation, about how much breadth versus depth the degree should have, has taken place at the national level among other universities. This assessment system is helping us see where our students stand on this dilemma.

2. Below is an analysis of the frequency with which your program cites CTL, WA State Standards/Competencies, and/or national standards within your LiveText artifacts, rubrics, and reports. Please examine the charts and write your program's interpretations and conclusions based on the information provided. (e.g., Are the standards dispersed appropriately in your program? Are all the standards represented as you wish them to be? After reviewing this analysis are there changes your program would recommend making to the way you cite standards or assess your candidates using LiveText?)

Program Interpretations and Conclusions:

The standards shown in the table above are not quite correct. In IET 433 (S 10), CTL 1 should be referenced one time, CTL 4 one time, and CTL 5 four times.

IET 433 (S 9) should show that CTL 2 is referenced seven times, and CTL 3 should be referenced seven times.

If these changes were reflected in the table, it would show that the standards are dispersed more evenly. In the future, we would like to add assessments for the national standards in order to show their correlation to the state standards, which the current data do not reflect.

3. Below you will find one sample of your LiveText Report that identifies an aggregation of candidate learning outcome data. Please examine all of your reports in the LiveText exhibit area and discuss the accuracy, consistency, and fairness of the data, as well as what improvements could be made in the program assessment rubrics, courses, artifacts, or reporting. Include your interpretations relative to how well your candidates are meeting standards. After examining all of your report data, list any changes your program is considering.

Program Interpretations and Conclusions:

Because the technology education program enrolls so few students, the data can easily skew conclusions. A year from now, due to growth in the program, the data set will be roughly three times as large. The current data do show that the program is meeting candidates' needs, and this is further supported by students' success rate in passing the WEST-E.

If the program continues to grow, we would like to see additional courses offered in transportation technology and in information and communication technology, which would help candidates reach the target in those areas. The major also does not require any coursework in construction, but through proper advising students can take construction-related courses (as a specialty course) or use industry-related experience to help meet the standard.

4. Below you will find a chart of the CTL Standards aggregated by course. Please examine the data results and discuss any improvements, if any, you might consider for your program. Using these data, please reflect upon your candidates' success in meeting standards. Compare these data to the data provided in the WEST B and E charts that follow. Is there consistency in the rates of success? What do these data tell you?

Program Interpretations and Conclusions:

Again, the program is small, and therefore the number of students represented is small. The end-of-program assessment had no candidates prior to May 2007, so there are no data for that period. The program is growing and will therefore yield more data in the coming years. If the success of the students who have taken the WEST-E is any indication, we are currently doing things well.


5. Please find below the WEST B data for the teacher residency program. Please use these data, the LiveText data, and the WEST E data found below to predict candidate success in your program. Given these summaries, are there changes to your program or to the unit that your program recommends the CTL consider?

  • Between 2005 and 2007, 49% of candidates passed all three sections of the exam on their first attempt; 84% passed the reading portion, 82% the math portion, and 65% the writing portion on their first attempt.
  • The mean percentage of candidates not passing is 11% for reading, 12% for math, and 25% for writing.

CTL WEST B Data Summary 2002 to Present


Program Interpretations and Conclusions:

Program faculty are surprised and concerned by the relatively low pass rate on the writing portion of the WEST B. The data suggest the unit should examine when students take the exam to determine whether the proper coursework has been completed first. The unit should also examine what remedial steps candidates take before second and third attempts. The data suggest that every endorsement area, including those that do not traditionally emphasize writing, should look at ways to incorporate writing into the curriculum at both the high school and college levels.

6. The WEST E is administered by ETS as a state requirement for program exit, measuring content knowledge by endorsement area. ETS had not sent the final corrected data summary at the time of this report; however, the data we keep on a continuously updated basis are described in the following graph, which compares 2005-2006 and 2006-2007 data by endorsement area. We suspect the 2006-2007 data will change after all scores are received from ETS. According to this set of data, 2005-06 pass rates were 90%. Remember that all candidates must pass the test to be certified, so some take it multiple times. We are working on validating a different process that will show how many times candidates take the test and when. The 2006-07 data indicate pass rates of 87%. If your program is one of those with a pass rate below 80%, what program recommendations are you considering that will positively affect the WEST-E pass rate for 2007-2009?

Program Interpretations and Conclusions:

The data for technology education for 2005-06 appear to be incomplete, as there were candidates who took the WEST-E during that time. The data show that the technology program is on the right track: there is a 100% pass rate, achieved on the first attempt.


Please find below the EBI teacher and principal data for all program completers. Discuss and report in the space provided what your program recommends the unit do to improve overall satisfaction, or what your program is doing to improve the trend.

  • This survey is administered through OSPI and is contracted through Educational Benchmarking Inc. These data are collected for all new teachers in public schools by surveying new teachers and their principals.
  • Average response rate over the seven years: n = 105
  • The graph represents a seven year average satisfaction trend by category
  • Highest satisfaction ratings are in the areas of:
    • Student learning
    • Instructional strategies
    • Management, control and environment
  • Lowest satisfaction ratings are in the areas of:
    • Reading skills
  • Principal responses over five years followed patterns similar to those of teachers (n = 41)


Program Interpretations and Conclusions:

The data show incremental improvement, which indicates some progress is being made. However, the unit should be discussing strategies to improve further.


Please find below first year and third year teacher survey results summarized by graphing mean responses for each question.

  • This survey is administered by CTL and data trend summary represents 2004-07
  • The average response rate for 2004-2007 is 15%
  • First year teacher N= 375, Third year teacher n =200
  • The graph and subsequent ANOVA demonstrate a significantly higher average satisfaction rating among first-year teachers compared to third-year teachers (p < .05)
  • Highest satisfaction ratings are in the areas of:
    • Subject matter knowledge
    • Application of EALR's
  • Lowest satisfaction ratings are in the areas of:
    • Classroom management
    • Involving and collaborating with parents

Program Interpretations and Conclusions:

These data are difficult to interpret due to the difference in N. At first glance, it would appear that the mean score declines by the third year, but since the number of third-year respondents is close to half that of the first-year surveys, it is unclear whether there is truly a drop.

It is not surprising (as previous data show) that subject matter knowledge received a high score. A likely explanation for the lower score in classroom management is that candidates encountered "real" situations that were difficult to recognize and control. The rating on collaborating with parents may indicate an area of concern in the professional sequence.


Please find below a comparative analysis of candidate dispositions from beginning candidates to finishing candidates. Please comment on the changes you observe in your candidates over time and describe how and why you think this occurs. What does your program specifically do to engage candidates in developing professional teacher dispositions?

  • This inventory is administered by the CTL at admissions (N=645), and again at the end of student teaching (N= 195). Some of the 645 candidates have not yet student taught, which is why the n's are different.
  • There is a significant difference in 12 of 34 items (p<.05) between beginning candidates and candidates completing student teaching
  • Change is in the preferred direction from agree to strongly agree
  • This means that somewhere between entry and exit, teacher program candidates are developing stronger professional beliefs and attitudes that reflect the underlying values and commitments of the unit's conceptual framework. Future work will include data that tell us where this change occurs and whether there are differences caused by demographic variables. To read more about this disposition instrument, see the validation study published on the OREA web site under research.

Program Interpretations and Conclusions:

We cannot currently interpret these data because the link is not active.



Final Student Teaching Evaluation Report on LiveText

  • The data report is too large to be placed in this document. Please access the data by going to this link on our assessment system web site
  • The report reveals the final assessment of elements found in state standards IV and V
  • Candidates are generally performing at a high level, although, as depicted by the colors green and red, some candidates are not performing to standard.
  • Examination of those elements indicates some agreement with results provided in the 1st and 3rd year teacher survey.

Please look at these data carefully and discuss with your program faculty some ways the teacher residency program can begin to address the few but common deficits occurring in candidate knowledge and skills relative to the State standard elements. If you need to refer to state standards please refer to this link in the assessment system website:


Program Interpretations and Conclusions:

The data suggest the professional preparation program is satisfactory. The need for preparation in social interaction (i.e., classroom management, parent communication, etc.) correlates with new-teacher in-service evaluations; a different pedagogy is required in these areas. For a student teacher, a single frustrating incident can distort or drive down the data, since the negative experience is what is remembered.


Please examine these data and report any discussions your program has regarding the reported results.

  • This survey is conducted by Career Services and reported to OSPI. The report, however, has been reanalyzed and the summary reflects the new analysis, which covers 2002-2006.
  • Average response rate = 57%
  • Of that 57%, the average percentage of graduates who obtained jobs in state is 94%
  • The average percentage of graduates still seeking a position is 27%
  • Two percent of the 57% have decided not to teach
  • For 2005-2006, 35% of program graduates responded to questions regarding ethnicity and gender. Of those respondents, 90% were Caucasian, 5% were Hispanic, 3% were African-American, and 1.8% were Asian.

Program Interpretations and Conclusions:

The data are interesting, to say the least. In certain fields (most CTE programs) there are more jobs than there are candidates to fill them. In technology education, schools are actually closing their doors because they cannot find teachers to fill the positions. OSPI always seems to find a way to create "other methods of certification" to get industry people certified to teach. The candidates in the table who are not employed but are still seeking a teaching position are likely not CTE teachers.






© Central Washington University   |   All Rights Reserved