Fall 2008 Showcase
Assessment Workshop for New Faculty/Staff
- Betty Stevens, Student Learning - Assessment - Accreditation (.pdf)
- Briana Nelson Goff and Steven Hawks, Completing an Assessment Plan: Utilizing Available Tools and Resources (.pdf)
- Ray Yunk, Architectural Engineering and Construction Science
The Department of Architectural Engineering and Construction Science has demonstrated an outstanding method of assessment in both its Architectural Engineering and Construction Science & Management programs. Although both programs are accredited, they cast a “broad net” in the variety and thoroughness of their assessment methods, including senior, alumni, and employer surveys; program advisory councils; national examination results; employment and internship opportunities; and traditional academic performance and course assessment. They have a streamlined method of gathering the data, as well as a concise method of reporting it and applying the outcomes directly to the programs.
- Steve Smethers, Journalism & Mass Communications
For consecutive years, the School of Journalism and Mass Communications has demonstrated the impact that "closing the loop" in assessment can provide. Incorporating the results of direct assessment procedures as well as student majors' indirect evaluations, faculty and administrators have responded with an ambitious re-envisioning of the program. They have enhanced their advisement process, and they are instituting an alumni survey. They have examined and improved the way courses are scheduled. They have revised their assessment methods, including development of more effective rubrics, better training for reviewers of the assessment results, and better explanation of the assessment process to the faculty as a whole. They have fostered greater dialogue among professors about how writing standards are enforced throughout the curriculum. They have developed a new 500-level course to better address the needs of undergraduate majors. For their ambitious re-envisioning of the program in response to the results produced by assessment, we honor their achievements.
- Shing Chang & Bradley Kramer, Industrial & Manufacturing Systems Engineering
This department demonstrated outstanding achievement by developing a web-based paperless system that streamlines the assessment process. Assessment data and SLOs are uploaded into an online system used by department faculty. The University is considering developing this technology into a university-wide paperless assessment system.
- Barbara Anderson, Interior Design, and Bob Garcia, Communication Sciences and Disorders
These programs are being recognized for their efforts to initiate change at K-State. Many accredited programs have extensive annual reporting requirements imposed by their external accrediting agencies, which include assessment of student learning. As an example, these two accredited programs have _____ and ____ outcomes that must be met, including individual student outcomes as well as program-level student learning outcomes. These two programs prompted a change in the annual progress reports at K-State to better align with the annual reporting already conducted for accreditation. As a result of their efforts, an alternative annual progress report template has been introduced for accredited programs, allowing them to focus on the multiple requirements of their external accrediting agencies.
- Paul Burden, College of Education
Although students demonstrated that they met the minimum standards on diversity-related SLOs, the College of Education enacted a professional development seminar series dealing with student diversity issues (e.g., poverty, special needs, race/ethnicity, ESL) to more comprehensively accommodate students’ needs and reinforce efforts in courses where diversity is the primary emphasis. This diversity initiative was attended by undergraduate and graduate students, faculty, and staff in the college. The college plans to continue monitoring student performance in this area to determine the effect these initiatives have had on student learning in the area of diversity.
- Anne Phillips, English, and Irma O'Dell, Leadership Studies
In English, faculty have formally assessed five of the nine learning outcomes they originally approved in 2003. In the five years that they have been practicing assessment, they have met regularly throughout the school year to assess the outcomes, ensure that they are being assessed in appropriate required courses, and share the assessment results. Working together, they have ensured that the courses and requirements are educating and benefiting K-State students; moreover, although faculty value and foster academic freedom, they are confident that students in different sections are having comparable learning experiences. We applaud the achievements of the School of Leadership Studies and the Department of English.
- Engineering College Assessment Review Committee (CARC)
The College of Engineering has had a long-standing assessment, evaluation, and continual improvement process associated with its undergraduate academic programs, in accordance with ABET and other accreditation agencies. All but two of the College’s undergraduate programs are accredited by ABET. For the past decade this activity has been coordinated at the college level by the Program Assessment Task Force (PAT Force), on which each department has two representatives. The College Assessment Review Committee (CARC) serves as a subset of the PAT Force. CARC has been instrumental in implementing the review process for the programs’ initial University assessment plans and their annual assessment reports associated with the University assessment program. Consistency with the evaluation rubric and constructive feedback to the programs have been keys to the success of the College's overall assessment activities. It is therefore with great pleasure that we recognize the important and valuable contributions that the College of Engineering’s CARC makes to the College’s and the University’s assessment efforts.
CARC Members include:
- Gary Clark (BAE)* Chair
- Rodney Howell (CIS)*
- Dave Fritchen (ARE/CNS)*
- Dave Soldan (EECE)*
- John Schlup (CHE)*
- Shing Chang (IMSE)*
- David Pacey (MNE)*
- Hani Melhem (CE)*
Best of Assessment Showcase
- Breakout Sessions 1 - 3
- Ray Yunk, Architectural Engineering and Construction Science, College of Architecture, Planning & Design
The Architectural Engineering program is accredited by the Accreditation Board for Engineering and Technology (ABET). The ABET criteria for engineering programs require that each program have an assessment process which documents that graduates have achieved the educational objectives and program outcomes of the engineering program. Thirteen Student Learning Outcomes (SLOs) have been determined to satisfy ABET requirements, adapted to the context of learning objectives for the program.
- Steve Smethers, Journalism and Mass Communications, College of Arts & Sciences
The A.Q. Miller School of Journalism and Mass Communications is one of 112 journalism programs accredited by the Accrediting Council on Education in Journalism and Mass Communications, an organization that heavily emphasizes assessment. The Miller School faculty has adopted an assessment program that includes a list of eleven learning objectives related to practical and conceptual skills important to media-related professions. This presentation will focus on one of those learning objectives and how a combination of direct and indirect measures was used to improve instruction and make curriculum changes.
- Shing Chang and Bradley Kramer, Industrial & Manufacturing Systems Engineering, College of Engineering
This presentation describes an effective outcomes assessment approach that does not add significant work for already-busy faculty and staff members. The system was developed by the industrial and manufacturing systems engineering (IMSE) faculty at Kansas State University (K-State) in a quest to improve the direct assessment of industrial engineering (IE) program outcomes while decreasing the faculty time and effort needed to assure the quality of our programs.
There are two parts to this presentation. The first part describes a course-based assessment process and how information gleaned from the process is used to improve our program. The second part describes the automation of the process through the development of a web-based system that streamlines the collection of assessment data, facilitates the evaluation of assessment data, and automatically archives data, decisions, and responsibilities.
- Breakout Sessions 4 - 6
- Barbara Anderson, Interior Design, and Bob Garcia, Communication Sciences and Disorders, College of Human Ecology
The programs of Interior Design and Communication Sciences and Disorders from the College of Human Ecology will present an overview of student assessment. The audience will learn important factors to consider when developing an assessment program and the positive outcomes for students and faculty.
- Paul Burden, College of Education
Program assessment data indicated that our teacher candidates needed stronger understanding of human diversity. As one avenue to address this, the College of Education offered a series of half-day professional development programs for all college faculty, instructors, and graduate assistants. An advisory task force was selected to make all arrangements for the professional development programs on nine dimensions of diversity: ethnicity, race, socioeconomic status, gender, exceptionalities, language, religion, sexual orientation, and geographical area. This session will examine program objectives, content and format, presenters, attendees, budget, and assessment of the program.
- Anne Phillips, English, and Irma O'Dell, Leadership Studies
This presentation will summarize the English Department's undergraduate assessment program, with particular emphasis on the widespread and ongoing involvement of English Department faculty. Additionally, it will suggest concrete ways in which the assessment process tangibly and identifiably benefits students as well as faculty.
What a joy it has been to work with the School of Leadership Studies faculty on this assessment adventure. There have been ups and downs, but we have all grown from the experience. Our story of assessment is a process. We have worked to develop assignments and rubrics that match our student learning outcomes. We believe that, amid the current demand to establish a culture of evidence, we cannot gather evidence for the sake of gathering evidence. We need to use this evidence to improve leadership studies courses because, as Shavelson (2007) notes, a culture of evidence will not automatically lead to educational improvement.