Contributing USMA Research Unit(s)
Electrical Engineering and Computer Science, Robotics Research Center
American Society for Engineering Education Conference (ASEE)
Program outcome assessment is an integral part of systematic curriculum review and improvement. Accrediting commissions expect each student to achieve program outcomes by the time of graduation, and programs undergoing accreditation must have an assessment process that demonstrates this achievement. Documenting and assessing just how graduates meet program outcomes can become a tedious and data-intensive process. We report on our “assessment” of our assessment process, which resulted in more streamlined procedures by targeting performance indicators. Our methodology included the development of a learn, practice, and demonstrate model for each outcome that focuses performance indicators at the appropriate point in a student’s development. We target actual outcome achievement during the “demonstrate” phase with rubrics that detail the level of mastery on a modified Likert scale.
We originally used seventy-eight embedded performance indicators spread throughout the curriculum. We reduced this to thirty indicators using a mixture of internal and external measures, such as individual classroom events and Fundamentals of Engineering exam topical-area results. We also put guidelines in place targeting a single outcome measurement per indicator. For example, in our capstone senior design course, virtually every assignment was being reviewed by one of our outcome monitors. By targeting performance indicators at specific sub-events and distinguishing those that had to be assessed during the course from those assessed by advisors or senior faculty, we reduced the embedded performance indicators by a factor of three. We applied similar techniques to reduce individual course directors’ workload. We have found that by streamlining the outcome process and applying a rubric approach across multiple outcomes, we can greatly reduce the number of performance indicators while preserving our ability to accurately assess our program. The reduced assessment workload has enabled us to place more effort into improving the program.
Sadowski, Robert; Shay, Lisa; Korpela, Christopher; and Fretheim, Erik, "Assessing the EE Program Outcome Assessment Process" (2007). West Point Research Papers. 13.