Measuring things is a part of everyday life. We get health screenings to determine how healthy we are, and a physician uses the results to recommend lifestyle changes or to alter prescriptions. We evaluate how well players handle themselves in a game and then make adjustments at the next practice. Teachers use homework assignments, projects and exams to evaluate whether students are learning the course material. We monitor our vehicles' gas mileage for signs of inefficiency, which may prompt a visit to the mechanic to improve performance. This process – planning, evaluating, adjusting – is called assessment.
Assessment is not about making reports, sending reports, or accumulating reports, but rather the evaluation of our effectiveness according to our purpose and goals. Some questions to be asked: Are students really learning what we say they should? Are the degree programs really preparing students for their fields? Are the service units of the university really providing appropriate support for the students and the rest of the university community?
Additional questions naturally follow: How do we know? What do we use to measure effectiveness? What do the results mean? What changes need to be made to reflect those results and meanings?
The reality is that ALL OF US ALREADY DO ASSESSMENT ALL THE TIME, particularly on a personal level. For the university, however, we must go beyond ourselves and assess at the course level, program level (degrees) and unit level (services). This is not the domain of only administrators; all of us must be involved in assessment at these levels, since we are all partners in providing a quality education for our students.
Each academic and service unit determines its own assessment details, which vary greatly across campus. The questions, concerns, goals, measurement tools and evaluations need to be useful, relevant and meaningful to the unit. Assessment is not about creating paperwork or complying with an administrative directive – it is about making improvements in what we do.
So, here are the three P’s of assessment: PRACTICAL, PARTICULAR and PURPOSEFUL. Remember these words when assessing your courses, programs and services!
You are not alone: the Testing and Outcomes Assessment Office provides helpful support and guidance, and the University Assessment Committee exists to lead and support university-wide efforts. We are here to help, to consult and to encourage you in improving and refining your assessment process. Please contact me or the Testing Office if you have questions or concerns.
Dr. Scott Carrell, on behalf of the University Assessment Committee
Success Stories in 2013-14
Database & System Administration
In June 2013, multiple hardware failures caused the loss of a large amount of user data (about 8 terabytes) along with our primary data backups, which were all stored in Harding's main data center on the Searcy campus. We were forced to recover the lost data from our remote data backup located in the Oklahoma Christian University data center, a process which took about a week to complete because of the long distance and limitations in internet bandwidth between our two campuses.
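As a rough sanity check (the figures here are only assumptions drawn from the numbers above: roughly 8 terabytes restored over about seven days), the sustained transfer rate the remote link delivered works out to on the order of 100 Mbit/s:

```python
# Back-of-the-envelope estimate of the effective restore throughput,
# assuming ~8 TB (decimal units) moved over roughly one week.
data_bytes = 8e12                 # ~8 TB of lost user data
restore_seconds = 7 * 24 * 3600   # ~1 week to complete the restore
mbit_per_s = data_bytes * 8 / restore_seconds / 1e6
print(f"Effective throughput: ~{mbit_per_s:.0f} Mbit/s")
# prints: Effective throughput: ~106 Mbit/s
```

At that rate, even a modest local gigabit network could restore the same data nearly ten times faster, which illustrates why a second local backup set shortens recovery so dramatically.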
This experience prompted us to analyze how we could more quickly restore large amounts of data if a similar data loss were to happen again. As a result of our analysis, we have added a second set of local data backups in a different building on the Searcy campus. Now, if we have a localized failure in either building, any lost data can be quickly restored through the local network from the surviving copy of the backups in the other local building. We would only need to rely on our remote backups at OC in the event of a major disaster that compromises the backups in both of the local buildings.
Jon M. Wrye, Sr.
Manager, Database & System Administration
At the time our department was challenged by the administration to obtain National Recognition by our SPA, only 13 institutions across the US had achieved this status. Harding grads had always ranked number one in performance on national exams in both French and Spanish (NTE and Content Praxis) at the state level, but those assessments were not sufficient. Therefore, during 2008-2009 we began assessing each student orally as a minimum 20% component of each class grade. We created a backward-design curriculum for the entire department with what we felt were realistic stair-steps for each level. However, after the first year, assessment data demonstrated that we had set the norms too low for the beginning levels, and we adjusted accordingly. An Advanced Conversation class and 200-level conversation labs were also added to support oral assessment. Additionally, data from pretests in our Senior Seminar showed critical needs for upper-level students and drove changes to the course content, as well as content modifications in all upper-level courses. Each upper-level course must now demonstrate a minimum 60/40 split between course content (literature, culture) and proficiency skills (oral, reading, writing, listening, grammar, vocabulary). Based upon this data and the scores from our nationally administered exams, we designate a focus area for each subsequent year.
This approach to assessment has yielded dramatic results without reducing the amount of any course content. As an example, results on the pretest in the Spanish Senior Seminar have jumped from an overall class entry performance of 60% on skill-based tasks in 2009 to an average of 77.9% in Spring 2014. Furthermore, during the 2014-15 academic year, the department is targeting improvement in listening comprehension levels based on data from the 2014 senior exit exams in French and Spanish. With regard to the oral assessment protocol implemented in 2008-2009, results within the first two years demonstrated a 25% improvement in performance level, with an average of 95% of our 201 students performing at the 80% level and above. Over the last two years, oral interviews have produced multiple AM (Advanced Mid) ratings in both French and Spanish, compared to IM (Intermediate Mid) in 2009 (a difference of four steps upward). These approaches to assessment have transformed our department at all levels, encompassing faculty and students. We truly are a multi-cultural department...with one of those cultures being assessment.
Chair, Foreign Languages
Carr College of Nursing
National nursing accreditation standards and state boards of nursing require that nursing program graduates pass the national licensing exam, the NCLEX-RN®, on the first attempt. The national accreditation organization of the Carr College of Nursing requires that the percentage of graduates passing the licensing exam equal or exceed the national percentage. When BSN pass rates fell between 2009 and 2011, faculty initiated comprehensive strategies to increase the passing percentage. Effective fall 2013, all nursing courses required a passing standard of ≥ 75% in order to pass the course with a final letter grade of C or higher. In addition to raising the passing standard, faculty began evaluating and revising exam questions to ensure the cognitive level was application or analysis, similar to the national licensing exam. Later in 2013, the faculty formalized item analysis for exam questions to ensure reliability. The faculty also increased consistency in clinical competency evaluations. Additionally, faculty implemented two NCLEX-RN® review courses in the final semester of the program. These changes, along with the change in the HCCN course grade policy, seemed to have an important impact on the NCLEX-RN® pass rate for 2013. The National Council of State Boards of Nursing (NCSBN) published the Total, First Time, US Educated pass rate at 83.04%; the Baccalaureate Degree Program pass rate was 85.18%, and the HCCN graduate pass rate was 91.67%.
The faculty believed the decision to increase the passing standard was validated within the first year of implementation. The first-attempt pass rate for Carr College of Nursing graduates for 2014 is illustrated below.
Carr College of Nursing (HCCN) and US Baccalaureate NCLEX -RN® Pass Rates 2012 to Present
[Chart: US Baccalaureate vs. HCCN Baccalaureate first-time pass rates/percentages]
Susan D. Kehl, Ph.D., RN, CNE
Dean, Carr College of Nursing
Academic Advising – 2014
The biggest change in the Academic Advising office in the 2013-2014 school year was the creation of the UNIV 150 class required for probation students. This class used to be offered to suspension students for no credit, and because our assessments showed some decent success with that class, we decided to offer an enhanced version, for credit, before students were even placed on suspension. We hoped the creation of the UNIV 150 class would eventually reduce the number of students being placed on academic suspension. So far the numbers show that this is true: after the Spring 2013 semester, 66 students were placed on suspension; after the Spring 2014 semester, the number was lower at 56. Hopefully we'll continue to see the number of academic suspensions decline even further in the next couple of years thanks to this course and other efforts implemented by the Academic Advising office and the other offices within the Center for Student Success.
Jake Brownfield, Ed.S.
Director, Academic Advising
Student Health Services
The goal of Student Health Services at Harding University is to encourage total wellness for the students who seek our services. The clinic provides primary care within the scope of practice of registered nurses. The condition and needs of each student are assessed and addressed. Students are either treated with over-the-counter medications and taught self-care pertinent to their situation, or given a referral to the appropriate professional health care provider, doctor, dentist, or counselor. Student Health is also charged with the responsibility of giving excuses for any medical absences and entering them into Pipeline.
In the fall of 2011, Harding instituted an Excuse Policy. It limited each student to seven excuses per semester that were not accompanied by a doctor's note. Once a student had five excuses, the director of the clinic sent an email informing them of their absence status and encouraging them to monitor their attendance closely. In an effort to assess the effects of this policy, data has been kept each semester on the number of patient visits, the number of cautionary emails sent, and the number of students who accumulated the maximum number of excuses despite the emails. A brief summary of that data:
- Fall 2011: 6,546 visits; 29 emails; 6 students reaching or exceeding the limit of 7 excuses
- Spring 2012: 6,781 visits; 42 emails; 15 students reaching the limit
- Fall 2012: 6,902 visits; 45 emails; 9 students reaching the limit
- Spring 2013: 6,601 visits; 69 emails; 12 students reaching the limit
- Fall 2013: 6,892 visits; 52 emails; 6 students reaching the limit
- Spring 2014: 6,495 visits; 42 emails; 10 students reaching the limit
These numbers indicate that this concern for and attention to students' absence history has a positive effect on their attendance, which correlates with their academic success.
Lynn McCarty, B.S.N.
Director, Student Health Services
In the last three years I have given an advising survey to all undergraduate students to assess the quality of advising across campus. The number of survey participants was surprisingly consistent and higher than expected, and the results are helpful. The good news is that we have seen a 3% increase in advising satisfaction over the last three years. While this increase is good, there is still much room for improvement. An advising committee, functioning more like a task force, will be formed this year to improve advising across campus. The committee will create a new mission for advising, create more goals and objectives for advising on campus, develop new ways to communicate with, assist, and encourage the faculty advisors, and even create a new assessment to measure these new goals and actions. Because the percentage of advising satisfaction is almost the same as our retention rate, we suspect that an increase in advising satisfaction will also show an increase in retention. Future assessments will be needed to verify this.
Another area the surveys showed needs improvement is faculty advisors contacting students to invite them to advising sessions. The first survey's results in this area were disappointing and somewhat concerning. Fortunately, there was a 9% increase in this area two years later, so it is encouraging that there was some improvement. Interestingly enough, students self-reported in the comments section that they would like their advisor to invite them in for advising sessions. Thanks to these survey assessments, we know that faculty advisors need to make more of an effort to invite their students in for advising sessions. Future assessments will show whether this increases advising satisfaction, which in theory could increase our retention rates.
Jake Brownfield, Ed.S.
Director, Academic Advising
Past Success Stories
Department of Art and Design, John E. Keller 2007-08
Early in our Assessment process, we noticed how poorly our students were doing in our Senior Knowledge Base Survey. The Survey covers content of the four classes that all of our majors have in common: Art 231 (Prehistory through the Middle Ages), Art 232 (Renaissance through Modern), Art 200 (Two-Dimensional Design), and Art 260 (Color Theory).
We suspected weaknesses in their working knowledge of major historical artists and art periods before we began giving the Knowledge Base Survey; the Survey has put an exclamation point on the problem. Early in our current assessment protocol, we implemented two three-hour Art History survey courses to replace the two-hour Art Appreciation course for all of our Art, Interior Design, Art Education, Graphic Design and Art Therapy majors.
We have seen an increase in their retained knowledge. However, as reflected in the Senior Knowledge Base Survey, we still have a concern over what they remember about major historical artists and art periods as they leave Harding. We are currently discussing ways of strengthening their knowledge. The approach we are currently exploring is the inclusion of more historical information in all of our studio classes. The difficulty will be orchestrating it in a methodical, comprehensive manner.
Success Stories in 2003-04
College of Education
In 2003 the questionnaires used to survey Harding Teacher Education Program alumni and their supervisors were revised. A question was added to determine respondents' perceptions of the adequacy of Harding's program in training teachers to use instructional technology. Although 78% of the alumni and 81% of the supervisors agreed that Harding's program adequately prepared beginning teachers to use instructional technology, this was the only question on which the rate of agreement fell below 90%. Two faculty members requested a faculty development grant to investigate the technology competency of students who had recently completed their first education course that included training in the use of instructional technology. The grant was matched by funds from an instructional technology assessment corporation. The students were able to participate, at no cost to them, in an online assessment for IC3 Internet and Computing Core Certification.
Success Stories in 2002-03
Postal Services Department, Tobey Nichels
- Based on a review of daily records, trends of heavy mail volume were noted, which enabled us to reschedule staff and student employees for added coverage. Result: the overall workload of Campus Mail employees was less stressful while customer service improved.
- The assessment process gave real hands-on evidence of our client satisfaction. If a client is dissatisfied, they may be vocal and memorable, while the many positive comments and kind gestures of other clients are often forgotten. As a result of the survey evaluations, there is improved morale and renewed energy among the staff.
- By proactively pursuing the postage discounts available for volume mailers, the Mailing Center has been able to reduce the cost per piece of mail processed by 8.25%.
Career Center, Rebecca Teague
A point of contact survey yielded the following results based on a 5 point scale:
- Overall satisfaction: 4.5
- Friendliness of Career Center Staff: 4.8
- On Campus Interviews: 4.5
Due to the changes in employment and the economic situation, more opportunities for students to find employment were sought. In addition to the Business, Industry, and Government Fair, Graduate School Fair, Teacher Job Fair, Nursing Career Fair, Opportunity Day and on campus interviews, the Career Center added an e-Fair, using Monster TRAK. It is a free service to students and is useful for:
- Locating Internships
- Promoting opportunities to specific audiences such as liberal arts
- Targeting geographic regions
- Forming consortiums with other schools to share jobs and internships thereby further increasing contacts for students
Department of Student Services, Dr. Dee Carson
- The subscore on the criterion "the dorms enhance your ability to study effectively" was less than desired by the staff. As a result, the student leadership, the Resident Assistants, and Resident Life Coordinators have implemented new strategies to make the dorms a better environment for study purposes.
- Student Orientation programs were also an area of concern. On the senior exit questionnaire, one of the items for response was "being introduced to Harding through an Orientation program, Summer Experience or Student Impact, was of benefit to my success in college." Even though the 3.4 rating was satisfactory, we are working hard to reorganize the orientation programs in order to better benefit new students.
Specifically: Butch Gardner has now assumed the new position of First Year Experience Director and is working with a student/parent/faculty advisory committee to ensure the best possible program.
In 2005-06, a rotation plan was implemented to replace 4 randomly selected charter faculty members each year with new members from the same assessment units. After 3 years, all charter members will have been replaced; thereafter, the plan is to replace the 4 faculty members who have served on the committee the longest with new members from their assessment units. When a volunteer wants to continue service, they can, with the dean's permission, be approved for another term on the committee. A member who cannot complete a term of service will be replaced by a new member from the same assessment unit. The chairperson is appointed to a 3-year term by the Provost.
The individuals below represent the University Assessment Committee for the academic year 2014-2015.
|Forrest Smith, Chair|
College of Pharmacy
Office of Student Life
College of Science
College of Bible & Ministry
College of Arts & Humanities
College of Education
Office of Testing & Outcomes
College of Arts/History
|John Mark Warnick|
Office of the Registrar
College of Pharmacy
College of Education
Adult & Online Studies
College of Nursing
College of Physician Assistant
College of Business
Office of Student
Harding School of Theology
1000 Cherry Rd, Memphis
|**Larry Long, Provost|
|**Marty Spears, Associate Provost|
|*Serving a 2nd term|
Harding University is accredited by the Higher Learning Commission. http://www.hlcommission.org/
Phone: 312-263-0456, 800-621-7440
In an attempt to improve the efficiency and effectiveness of assessment reporting, the university added a custom-designed software package to the university system. The user-friendly Global Assessment Tracking software, named GATE, allows assessment managers and coordinators to easily enter assessment plans and evaluations as well as print several types of reports. College deans and other administrators can access the plans and summaries for their respective areas of responsibility. Full implementation occurred during the 2011-2012 academic year, with several previous years of data added in preparation for the upcoming accreditation review.
Harding’s Quality Initiative Project (2011-2013), which is the first part of the reaffirmation process scheduled for 2015, was based upon the Lumina Degree Qualifications Profile. The 28-page "Degree Quality Profile Report for the Lumina Foundation" is available upon request.
Timeline for Assessment
In September of each year, assessment reports for the past cycle and plans for the new cycle are due. If this deadline cannot be met, an Exemption Form must be completed, turned in to the Office of Testing & Outcomes Assessment, and approved by the Provost's Office.
GATE location: In Pipeline: Home tab > My Info box > Employee Services > GATE (near the bottom)
Past Assessment Plans & Reports for Academic and Administrative & Support units can be viewed by all faculty and staff. Location for past assessments: In Pipeline: Employee tab > Assessment Report box (bottom left corner)
At Harding University we value our students' feedback and continually strive for excellence in teaching and communicating with our students. For this reason, we ask our students every semester to go online and complete evaluations for their courses. These evaluations are a simple online process that takes a small amount of time. The information gathered from these evaluations is reviewed and analyzed by deans and department chairs to ensure that students are receiving the best possible educational experience. The instructors also have an opportunity to review comments made by students to improve communication. All comments left by students are strictly confidential. We ask you to give honest, constructive feedback. The teacher evaluation process is of great value to the experience at Harding University!
As a note of appreciation, students who complete all of their evaluations are given the opportunity to register one day early for classes in the following semester. Classes are usually evaluated in the last week of the class, before finals begin. The only exceptions are classes with fewer than 4 students, labs, and classes with guest lecturers. Every year the department heads review each teacher evaluation to look for any problems that may have arisen. Teachers are not allowed access to their evaluations until grades have been released. All comments and ratings remain anonymous, so that students are protected from any unfair treatment.
Administrators: Provost - Larry Long
|College of Allied Health||Beckie Weaver, Dean|
|College of Pharmacy||Julie Hixson-Wallace, Dean|
|College of Arts & Humanities||Warren Casey, Dean|
|College of Bible & Ministry||Monte Cox, Dean|
|College of Business Administration||Bryan Burks, Dean|
|College of Education||Tony Finley, Dean|
|College of Nursing||Susan Kehl, Dean|
|College of Sciences||Travis Tompson, Dean|
Center for Adult & Online Studies
Harding School of Theology
Mike James, Dean
Jeffery Hopper, Dean
Christopher Davis, Director
Evertt Huffard, Vice President
Administrative & Support Units
|Administrator: Senior Vice President - Jim Carr|
|Glenn Dillard, Asst. Vice President|
Bob Reely, Assoc. Director
Nicky Boyd, Director
Jonathan Roberts, Director
|Administrator: Executive Vice President - David Collins|
Lew Moore, Director
|Administrator: Vice President of Finance, CFO - Mel Sansom|
Judith Hart, Director
Danny Wood, Manager
Randy Smith, Director
|Administrator: Vice President of Information Systems and Technology, CIO - Keith Cronk|
Jean Waldrop, Director
Mike Chalenburg, Asst. Vice President
|Administrator: Vice President of Advancement||Mike Williams, Vice President|
|Administrator: Vice President of Church Relations||Dan Williams, Vice President|
|Additional Administrative & Support Units|
The mission of the Harding University Assessment Committee is to promote a cohesive, structured, dynamic assessment program consistent with the mission of the University, the criteria of the Higher Learning Commission of the North Central Association, and programmatic accreditation associations.
Purpose of the UAC
- To develop and sustain logistics for effective, efficient assessment.
- To maintain quality assurance in assessment throughout the University.
- To recommend assessment policies.
- To promote a culture of assessment among all constituencies.
- To ensure that the process of assessment results in improvement of student learning through a well-defined feedback loop that links to and informs mission, programs, data collection and analysis, strategic planning and budgeting.
University Assessment Statement
Harding University, since its charter in 1924, has been strongly committed to providing the best resources and environment for the teaching-learning process. The board, administration, faculty, and staff are wholeheartedly committed to full compliance with all criteria of the Higher Learning Commission of the North Central Association of Colleges and Schools. The university values continuous, rigorous assessment at every level for its potential to improve student learning and achievement and for its centrality in fulfilling the stated mission of Harding. Thus, a comprehensive assessment program has been developed that includes both the Academic units and the Administrative and Educational Support (AES) units. Specifically, all academic units will be assessed in reference to the following Expanded Statement of Institutional Purpose: The University provides programs that enable students to acquire essential knowledge, skills, and dispositions in their academic disciplines for successful careers, advanced studies, and servant leadership.
AES Units: Administrative and educational support units.
Academic Units: Units that offer degrees.
“Closing the Loop”: A critical component of the assessment cycle by which the results are connected to strategic planning. Also known as “using the results to improve.”
ESIPs: Expanded statement of institutional purposes derived from the mission statement.
HLC-NCA: The Higher Learning Commission of the North Central Association of Colleges and Schools.
Institutional Effectiveness: The extent to which the University achieves its mission and goals.
Outcomes Assessment: The systematic process by which faculty and staff identify the appropriate outcomes for specific programs, determine the extent to which those outcomes are achieved, and use the results to make changes that will improve learning and services. Outcomes assessment supports informed decision-making, self-improvement, and accountability.
Unit Assessment Plan: The three components are:
(1) Intended Outcomes: Seeks to answer the question, “What are we trying to do?”
a. Academic Units: The knowledge, skills, and dispositions that students will demonstrate upon completion of a degree program.
b. AES Units: What the unit intends to accomplish.
(2) Means of Assessment: How will the outcome be measured? Seeks to answer the question, “How well are we doing?”
(3) Criteria for Success: Expresses in specific, measurable terms what constitutes acceptable performance for a specific program or unit.
Unit Assessment Report: An annual report of the actual results obtained, the identified areas for improvement, and the specific changes that will be made for continuous improvement.
Sites of Interest
The University Assessment Committee (UAC) was formed by Dr. Dean Priest and held its first meeting September 19, 2001. Dr. Beth Wilson and Dr. George Oliver were chairpersons from the inception of the committee until 2005-06. Charter members of the committee included: faculty/staff members Deb Bashaw, Nicky Boyd, David Code, Steve McLeod, Jack Shock, Marty Spears, Sheila Sullivan, Linda Thornton and Flavil Yeakley, and student members Rachel Campbell and Aaron Henderson.
During the fall 2001 semester, all Academic Units and Administrative and Educational Support Units prepared 3-column assessment plans following the Nichols model. This was done under the direction and involvement of the UAC and with the encouragement and support of the University’s top academic administrators. The UAC reviewed each unit’s assessment plan, made suggestions for improvement and returned them for revision. All units had approved assessment plans filed and in place by January, 2002.
Dr. Cecilia Lopez visited Harding University in the spring of 2003 as a consultant in preparation for the HLC site visit in the fall of 2004.
Dee Bost, Allen Figley, Gail Fry and Ken Turley joined the committee in the fall of 2003.
In November of 2004 the self-study report for Harding University was approved by the Higher Learning Commission of the North Central Association of Colleges and Schools. During that visit, one noted strength was our assessment tools, materials, and documentation. The UAC will strive to build on this accomplishment in the coming years. The next onsite visit by the HLC is in the spring of 2015.
Dr. Long appointed Dr. Sheila Sullivan to serve as the new chairperson in 2006-07. New committee members included Monte Cox, Mark Davis, Allen Henderson, Donny Lee, Kristin Prince and Forrest Smith.
Since its inception, the UAC has represented a wide variety of University stakeholders and departments, as reflected in this document.
For a list of current members as of academic year 2014-2015, please see The Committee section.