Summary of Annual Progress Reports on Assessment of Student Learning - September 2006

Connie Manzo and Cia Verschelden, the Office of Assessment 
Based on the College Summaries of Annual Progress Reports (2004-2005) 
submitted to the Provost in June and July 2006

I. Introduction

By the time of the NCA Focused Visit in February 2005, almost all of the undergraduate and graduate degree programs had submitted three-year assessment plans (2005-2007). As noted in the self-study report, “a great many programs are at the point of just beginning to collect data. Thus, many units – especially those that do not have relationships with accrediting agencies – have not yet entered into a feedback/exchange mode of continuous improvement.” Since then, the departments and colleges have begun to implement their assessment plans, and many programs have begun to develop or pilot their assessment instruments or to collect data. Most of the degree programs expected to collect data in 2005-2006, and some have baseline or preliminary data from 2004-2005. As departmental faculty have discussed the results, they have proposed changes in curriculum, teaching approaches, and departmental practices, and suggested refinements to assessment instruments and processes. The degree program Annual Progress Reports (APRs) on Assessment of Student Learning (ASL) cover the 2004-2005 academic year, though some reports included activities through Fall 2005. Thus, the issues commonly being addressed and reported by the degree programs relate to the earlier stages of the assessment process.

II. Undergraduate Degree Program Assessment

College Deans provided a summary of the degree program APRs in their respective colleges and submitted them to the Provost in June 2006. These college summaries and analyses provide much of the basis for this report on the status of assessment progress in each college.

A. College of Agriculture

The College of Agriculture Assessment Review Committee (CARC) reviewed the assessment reports of the 13 undergraduate majors in the College. As most of the assessment plans in the college were approved in Spring 2005, the degree programs have just begun to implement their assessment plans. Many departments are developing or have developed course-embedded assignments, lab reports, tests, exams, and quizzes as the vehicles for assessment. In addition, departments are using samples of student work, senior exit surveys, alumni/employer surveys, and internship projects as evidence of student learning. Many assessment plans in the college assess degree program student learning outcomes (SLOs) in various courses across the curriculum. With clearer alignment of the SLOs and the measures that assess them in these courses, this strategy can help determine progress in student learning through the program. The agricultural education program seems to be the most advanced in the use of rubrics and student portfolios in the assessment process. The department could possibly take a lead role in rubric development across the college. The horticulture program is noted by the CARC as the most advanced in the use of pre- and post-tests to assess its SLOs, as it has developed a local test that is given to first-year and junior/senior majors.

The reporting process was an opportunity for the faculty to review their assessment measures and for the CARC to provide more feedback to refine the assessment plans, especially relating to the validity and reliability of the measures and their appropriateness for assessing the SLOs. The CARC recommends that questions in course-embedded tasks be reviewed and defined more clearly to ensure that the selected questions accurately assess a particular SLO and thus provide better indicators of specific strengths and weaknesses in student learning. The CARC also suggests that some degree programs review whether a large number of courses is needed to assess all the SLOs or whether a capstone course would provide a more appropriate and manageable setting in which to assess some of them. The development of common rubrics to assess some outcomes (e.g., communication skills, critical thinking) is highly recommended to standardize the assessment across multiple instructors and courses.

To support these directions for improving the assessment processes in the college, departments requested more examples of high-quality assessment tools, such as effective rubrics. Examples of effective reports and of strategies for implementing changes based on assessment results will also be helpful.

B. College of Architecture, Planning and Design

A primary focus of the College of Architecture, Planning and Design this year was making program modifications and major college-wide curriculum changes as the degree programs move from the baccalaureate to the master's as the first professional degree. In the landscape architecture program, the SLOs were reviewed and discussed by the faculty, and the CARC recommends that the “assessment plan be amended accordingly to more clearly reflect the new curriculum nomenclature.”

Samples of student work (e.g., drawings, graphics, written presentation) in portfolios compiled from previous semesters and courses or in the form of studio and class assignments are the common direct measures used by the departments in the college. These are meant primarily to assess knowledge, communication, design, and critical thinking skills. Grading rubrics have been used by the Interior Architecture and Product Design (IAPD) program to collect some baseline measures and the rubrics will be presented for faculty review. Exhibits by students are reviewed by faculty and outside critics. These are coupled with student self-assessments. The Architecture program reports some improved student performance and fine-tuning of studio assignments from the assessments collected to date.

The CARC commends the descriptive IAPD report and recommends that future reports from the other departments be more informative and unambiguous so that the assessment plans, especially the consistency between outcomes, measures, and results, can be better understood. The CARC also suggests that, to establish an ongoing culture of assessment, departments be reminded early in the year to execute the plans that the committee reviewed the previous year. The reporting process was not entirely clear to the committee, e.g., “whether reports were to come through our committee or simply be forwarded directly to the graduate school once the changes in nomenclature have been formally approved.” There seems to be a need for better and more timely responses to clarifications requested by the colleges.

C. College of Arts and Sciences (CAS)

The CAS CARC provided a comprehensive summary of the progress of assessment in the college. Because the college includes diverse disciplines, various assessment methodologies are being implemented: assessment of senior majors' skills in capstone courses, where rubrics are being developed or have been developed; pre- and post-test strategies (primarily in the social sciences) to determine knowledge and abilities in both lower-level and advanced courses in the major; and the continued use of indirect measures such as exit interviews and surveys. Since this is the first year, most departments are in the midst of evaluating their assessment tools as they implement them and gather preliminary data. Theatre faculty are piloting work with student portfolios to determine how best they can guide their students and evaluate student work. As noted by the CARC, some departments are already proposing or making specific changes to their curricula, teaching practices, or advising/program presentation as a result of their assessment activities. Examples of “closing the loop” from the CAS are below:

Geography Outcomes: Graduates will be able to… 1) interpret maps and use them to solve geographic problems; 2) comprehend and associate geographic patterns at various spatial scales; and 3) understand the processes and patterns of the physical world and how human actions impact and interact with natural systems. Exam questions and lab exercises in classes were scored, and senior majors were interviewed by faculty. In response to assessment results, “…a lower-level Geographic Information Systems course has been established and a capstone course has been created,” both of which will be required of all geography majors.

Journalism and Mass Communication Outcomes: Students will be able to… 1) conduct research and evaluate information by methods appropriate to the communications profession in which they work; and 2) write correctly and clearly in forms and styles appropriate for the communications professions, audiences and purposes they serve. After scoring student work from several classes, “…faculty have identified areas of concern and are developing approaches for addressing them. More attention to statistical tools and basic grammar are among the curricular adjustments; other areas of concern include advising (for which the department has adapted its web and printed program information and distributed additional information to faculty advisors).”

Sociology Outcomes: 1) Sociological imagination: Appreciate the connection between individuals' personal troubles and social problems. 2) Structural inequality: Understand structures and processes of local and global inequalities across dimensions such as race, class, and gender. After offering an on-line pre- and post-test to students in introductory courses and getting a poor response rate, the faculty have revised the tests and decided to give them in class, not only to introductory classes but to theory and methods courses as well. “In addition, faculty are working on a mechanism that will track the progress of majors and link the pre-/post-test results to lists of students by curriculum. As a result of these assessment procedures, faculty have begun to discuss changes in the way that they teach the introductory courses. For example, more regular faculty will be involved in these courses; graduate students will participate in a teaching seminar before being allowed to teach their own courses.”

Best practice in terms of high faculty involvement is observed in the English, Psychology, Speech Communication, and Theatre departments. The Chemistry and Theatre departments have linked their assessment procedures to professional association expectations. The CARC noted that the department assessment reports of Chemistry, English, Geography, Mathematics, Music, and Philosophy are well-focused, clear, and specific, and could be shared with those departments.

In most cases, the CARC found that the degree programs are implementing effective assessment strategies and that the reports are thorough and clear. A few departments indicated that personnel issues and/or heavy University demands on their time and resources (multiple job searches, for example) have delayed the implementation of their assessment plans. To improve the reports, the CARC suggested more clarity in the descriptions of outcomes and measurement strategies. In addition, more opportunities need to be provided for all faculty, not only the instructors involved in a specific measure or strategy, to receive information about and give input into the assessment process. Some departments need to include more direct measures.

To facilitate communication between the CARC and the departments, the CARC requested a detailed, multi-year calendar of deadlines and expectations so that it can remind the departments ahead of time of upcoming deadlines from the Office of Assessment. The committee is also asking whether the Provost can help compel the ‘recalcitrant’ departments to comply with assessment expectations, e.g., submission of reports.

D. College of Business Administration (CBA)

Locally developed knowledge questions and tests are the common direct measures used by the various departments in the CBA. Most departments use these tests to measure value added in the program in the form of pre- and post-tests, e.g., comparing first-year and senior cohorts or entering majors and students in a capstone course. This methodology can provide useful information for tracking improved learning as students progress through the program. The strategy was initiated by the Department of Accounting and was shared with other units at a faculty meeting. The Management program now has results and will do an item analysis to fine-tune the test. To plot progress, the Marketing program is assessing various SLOs at many points across the program through required courses at different levels. Additional measures used to assess other SLOs are case studies, videotaped student presentations, papers, and exit surveys.

The college has established very helpful protocols to facilitate the assessment process: incorporation of a discussion of assessment practices into its annual faculty retreat in August; presentations by faculty and monthly meetings at the department level to review assessment progress and goals; and the involvement of business executives at annual Department Advisory Council meetings to provide input on the department SLOs and assessment practices. This reflects a collective college effort to sustain assessment and incorporate it into the culture of the degree programs.

The college is eager to keep up the momentum through the next year and would like to develop websites within each department to capture past and current information and results. This will help track progress and actions based on past results or decisions. The resource would include all SLOs of the programs and courses, together with a matrix of the logical links among course, department, CBA, and university SLOs. The CBA has developed its college SLOs, which, together with the university SLOs, serve as a basis for the development of degree program and course SLOs. The Department of Marketing has put this type of information on its website; such readily accessible information will be helpful in their reaccreditation process.

E. College of Education (COE)

The Elementary (EDEL) and Secondary Education (EDSEC) programs assessed the same SLOs and used similar assessment methods. As with many of the degree programs in other colleges, the focus for this year was to begin to implement the assessment plan and identify measures, develop assessment instruments, and collect preliminary data. The three SLOs assessed were those aligned with the Kansas Department of Education Professional Education Standards and the COE Conceptual Framework: diversity, critical thinking and reflection, and professional integrity. Using current direct and indirect measures, data are drawn from the student teaching portfolio, the student teaching final evaluation, performance on the nationwide standardized Principles of Learning and Teaching Praxis test, and a senior exit survey.

Faculty in both departments reviewed the data and reflected on the findings. The programs report satisfactory results from the initial findings, although the pass rates and mean scores for EDEL are different from those for EDSEC. A possible source for this gap is that EDEL interns have more support in completing portfolios as they are often placed in Professional Development Schools with onsite Clinical Instructors while most EDSEC interns are not. Discussions with the Unit Assessment Committee suggest a concern regarding the inter-rater reliability of some of the instruments and this will be addressed in the future. Improvements are suggested in the student teaching final evaluation (EDEL) and the updating of the senior exit survey for both programs. The college would like to ensure consistency, accuracy, and fairness of the assessments and will investigate them more in the coming year to provide a better basis for future data-driven decisions.

As the degree programs also have a history of accreditation, they are requesting to be allowed as much flexibility as possible so that data can be collected and submitted to multiple agencies. They believe that the colleges need more resources for collecting, managing, analyzing, and reporting data.

F. College of Engineering

An essential factor that influences the assessment processes in the College of Engineering is the accreditation of the college's undergraduate degree programs by various accrediting agencies. In 2004-2005, the Accreditation Board for Engineering and Technology (ABET), Inc. conducted an accreditation visit to all but two of the undergraduate degree programs. A significant achievement was the close linkage the degree programs developed between their SLOs and assessment plans and the program outcomes associated with ABET's accreditation plans and professional accreditation criteria. As assessment of the SLOs dovetails with the program accreditation processes, improved student learning would be a core element of improved academic programs.

Also partly driven by accreditation processes, the degree programs assessed more of their SLOs during this assessment period than the required two to five and used various direct and indirect measures to assess student learning and the program itself. Most of the direct measures used to assess the SLOs are embedded in the classroom, such as specific exam questions, laboratory reports, team experiences, capstone experiences, and project reports and presentations. These are used to measure discipline-specific knowledge, abilities, and skills, including critical thinking and oral and written communication. Some degree programs, such as Biological and Agricultural Engineering and Chemical Engineering, have established metrics to align these measures with each SLO (e.g., how specific exam problems, not the overall performance on the exam, relate to each SLO). Determining how accurately a measure assesses a specific SLO is especially important when a degree program is monitoring both evidence of student learning and evidence of program outcomes. Performance on professional exams such as the Fundamentals of Engineering Exam and in student design competitions, senior exit surveys, advisory board feedback, employer feedback, and graduation placement data are also used as complementary or additional measures to assess student learning and/or provide program evaluation and feedback.

Various departmental groups, such as instructional program teams, student teams, industry partnership teams, and departmental assessment committees, have been formed and assessment results and activities are discussed at regular faculty meetings. The departments have collected data on student learning and reported results, and some are implementing changes as a result of their findings. Course revisions or curricular changes are being proposed in the Architectural Engineering, Construction Science and Management, Civil Engineering, and Mechanical Engineering programs. In the Chemical Engineering program, oral and memo reports and redesigned computation techniques were added, together with earlier training in these areas. A new Language Laboratory course was created and changes in a course pedagogy were made in the Computer Science program. In the Industrial Engineering program, the assessment plan was fine-tuned after evaluation of results.

An example of “closing the loop” from the College of Engineering:

Chemical Engineering Outcome: Students will use techniques, skills, and modern engineering tools. To assess learning, work from several required classes was scored, and students, employers, and alumni were asked about their perceptions in appropriate surveys. Employers and alumni indicated a need for more computer skills. Students in an upper-level course did poorly on a key task. The mean response of seniors was 3.7/5.0. Action: Computational technique courses have been redesigned to promote 1) early learning and 2) subsequent utilization of techniques.

In order to further improve the college's measurement techniques, evaluate the assessment results, and implement changes that ensure improved student learning, the college is requesting more support from the Office of Assessment in the form of periodic updates on innovative assessment techniques; periodic training sessions on assessment for faculty (especially new faculty); an improved rubric to be used by the CARCs for evaluating the Annual Progress Reports; and more specific and pertinent questions to be addressed in the College Summary of the Annual Progress Reports (e.g., questions relevant to first-year implementation).

G. College of Human Ecology

All of the degree programs except one collected baseline data in 2004-2005; direct measures were predominantly used; and all faculty in the departments were involved in the discussion of results and evaluation of the data. Widespread faculty involvement indicates a high level of ownership of the assessment process and faculty commitment within the college. Based on the initial results and findings, many degree programs are proposing changes in their assessment methods to better align them with the SLOs being measured, as well as possible revisions in pedagogy to give students better opportunities to improve their learning in areas identified as possible weaknesses. Attention to improving the alignment of measures with SLOs helps yield more useful information to guide improvement. The faculty have taken this opportunity to reflect on specific changes to address weaknesses in student learning and in the assessment methods and instruments. As these are baseline measures, more data will be collected for comparison in future semesters, although preliminary results indicate that students meet the minimum criteria set by the programs.

Most of the programs assess many SLOs in various courses, while the Personal Financial Planning department chose to assess one SLO in a course of graduating seniors. Some of the measures used are lab tests and projects, exam questions or questionnaires, team experiences, design and class projects, case studies, and other class assignments. The Human Nutrition and Athletic Training programs plan to track students through future courses so that they will have longitudinal performance data for some SLOs.

An example of “closing the loop” from the College of Human Ecology:

Apparel and Textiles Outcome: Students will be able to identify generic fibers, yarn structures, fabric structure (including carpet), finishes, and methods of coloration. Students were assessed on fiber identification using an unknown fibers lab test. The results – 82% of students scored 70% or higher – surpassed the performance standard set by the department. Even so, the faculty are using the results to improve student learning: “Although the majority of students passed the lab, 17% failed it. The students had the most difficulty getting the fibers in focus under high power on the microscope. The instructor plans to check every student's slides during the first two labs on samples of known fiber content to make sure that they are really learning the skill and not depending upon their lab partner for help. Hopefully, this tactic will improve performance when they have to conduct experiments alone.”

The college has an assessment coordinator, and the one-on-one meetings among the coordinator, department chairs, and faculty were reported to be very beneficial in identifying specific areas in the individual degree program assessment plans and reports that need to be improved, as well as those that are done well. These meetings also provided a chance for everyone to ask questions and get guidance for future improvement.

H. College of Technology and Aviation

The Professional Pilot and Aviation Maintenance degree programs in the Aviation Department are using the Federal Aviation Administration (FAA) national standardized exam as one of the direct measures of student learning in their programs. They have collected data for a few years, and pre-testing of new first-year students is planned for next year in the Aviation Maintenance degree program. As with most of the degree programs in the college, grading rubrics are also being developed to assess mainly knowledge, written communication skills, and teamwork. Assessment of diversity is being pursued by the Professional Pilot program with a study of the Tuskegee Airmen, and a survey instrument to assess diversity is being developed in the Mechanical Engineering Technology program option.

The college is fortunate to have a Writing Center that is assisting some faculty in developing rubrics to assess written communication skills through student writing assignments. ABET's accreditation review of the Engineering Technology program and its options resulted in commendations for the assessment plans, and the program has begun to implement them by developing assessment instruments; data collection was to start in Fall 2005. As this is only the first year of implementation of the assessment plans, only a few departments have cited preliminary findings. The Electronic and Computer Engineering Technology program, for instance, proposed that for several technical topic areas, additional homework and quizzes were needed to ensure satisfactory student performance.

An example of “closing the loop” from the College of Technology and Aviation:

Aviation Outcome: Students will demonstrate communication skills and apply these skills in the aviation environment. Of 15 students, the department set a goal of 14 scoring 85% or above on the assessment of their oral presentations. Fewer than 14 met the standard. The faculty identified the specific areas in which students scored poorly (e.g., closing, audience rapport, visual aids) and will caution students regarding these factors when presentations are assigned. They also modified the rubric that students use to evaluate their presentations so that these key grading points are emphasized. A similar process was applied to a written communication assignment.

The College CARC plans to provide more guidance to ensure more uniformity in the reports, which currently range from ‘too sketchy’ to ‘too long and detailed.’ As the degree programs move toward executing their assessment plans, the committee also hopes to support the college in making assessment more fully integrated into the culture rather than a seasonal activity at reporting time. To help improve its assessment program, the college feels that examples of ‘best practices’ from other colleges will be very useful. These could be shared on the office website, through workshops/seminars involving both campuses, or through presentations to the Salina faculty.

I. College of Veterinary Medicine

The main measure for assessment in the D.V.M. program is performance on the North American Veterinary Licensing Examination (NAVLE) and state licensing examinations. Additional indirect measures come from feedback on student and new graduate performance through focus groups and student, alumni, and faculty surveys. Feedback from alumni and the general performance of students on externships and within the veterinary teaching hospital give the faculty an idea of whether or not students can function competently as professionals.

III. Graduate Degree Program Assessment

Given that most of the programs had only limited data from 2004-2005, the Graduate School CARC modified the rubric for evaluating the progress reports and mainly audited the process, e.g., whether the assessment plans had been implemented, data were being collected, etc. In general, implementation of the assessment plans is underway. Many of the programs are developing or have piloted their assessment instruments, and the programs with preliminary data have such small numbers of students that results will have to be aggregated over several years to be meaningful as a baseline.

Among the common SLOs assessed in these programs are the abilities to demonstrate advanced knowledge and understanding of the theories, principles, and literature in the specific disciplines; apply theories or models to practical, real-world situations or implement designs; synthesize and critically evaluate information; solve problems using computational and analytical skills and discipline-specific methods; carry out independent research utilizing appropriate research methods, data analysis, and interpretation (in some cases with the purpose of contributing original research); and demonstrate written and oral communication skills. Some programs assess working in disciplinary teams and understanding how the discipline is shaped by diverse points of view, contexts, and cultures. Almost all of the graduate programs use capstone experiences and examination of student work, such as preliminary exams, theses, research proposals, and written or oral defenses, to assess the SLOs. Lopez (1996) observed that these experiences provide students effective means to integrate what they have learned in the program and that they are useful as assessment tools if “structure and content is clearly linked with SLOs of the degree program” and “standards used for evaluating student learning during and upon completion of a capstone course or project are well documented.” Using common scoring rubrics, faculty are able to assess student learning on several outcomes through these integrated projects.

Some degree programs, for example, English, Entomology, Horticulture, Human Nutrition, Counseling and Student Development, Curriculum and Instruction, Kinesiology, Biological & Agricultural Engineering, Civil Engineering, Mechanical & Nuclear Engineering, Geography, and Economics, have defined or operationalized their SLOs more clearly by breaking them down into smaller, more descriptive components. This facilitates a more consistent understanding of what specific knowledge, skills, or behaviors are expected of their students. The process also makes it easier to link measures to the SLOs and to construct scoring rubrics. Clearer definitions and differentiation of the levels of student performance may be needed for those programs that use “% passing” as the primary measure, to better determine specific strengths and weaknesses in student learning. As most of the capstone experiences involve Graduate Committee members, scoring rubrics are being developed to gather feedback from faculty. To aid faculty in interpreting the various levels of the rubrics, some programs, such as English and Entomology, anchor the levels by providing examples of reports or types of manuscripts at each level.

Some departments have assessment committees that initiate and facilitate discussion of assessment plans, activities, or findings. Other departments meet regularly as a faculty group; these seem to demonstrate more collective effort, analysis, and action, which can result in more effective assessment processes and proposed changes. Examples of the more engaged model are Plant Pathology, Horticulture, and Counseling and Student Development.

The Graduate CARC observed that programs that have been involved in assessment of student learning for some time have the most fully-developed assessment plans and processes for collecting and using data; the MS and PhD programs in Mechanical and Nuclear Engineering and Physics were specifically commended.

The implementation of the assessment plans and discussions among faculty have led some of the programs to realize the need to revise parts of their assessment plans and/or refine some of their measures. Even with only preliminary data, some programs have identified areas that need better alignment, more focus, or some adjustments in approach as they continue to implement assessment activities.

Because some of the degree programs may revise their assessment plans, there may be a need to have them submit the revised plans so that the CARC and the Office of Assessment are aware of the most current plan. There is also a need to review the Graduate School CARC process to ensure continuous monitoring of the graduate assessment processes while remaining manageable for the CARC members.

_____________

Lopez, C. (1996, March). Opportunities for improvement: Advice from consultant-evaluators on programs to assess student learning. NCA-CIHE, p. 13.