Regardless of country or geography, every higher education institution shares the same objective when it comes to measuring learning: to know that learners have gained the necessary skills and knowledge and that the institution is delivering on its promise. Assessment is critical to measuring the progress of both learners and institutions. However, it is not uncommon for faculty and other school officials to feel strained and overburdened by the different assessments that must be administered at the end of a semester. One way to reduce that friction is to use an external service to help manage business studies assessment and reporting.
Currently, a popular and effective external assessment of learning outcomes in business education is the Business Administration Assessment. The assessment service provides test banks for schools both within and outside the US.
The assessment solution is used by schools worldwide. It allows school officials to measure the general management subject areas of their programs and offers pre- and post-tests, which help business programs identify the learning that actually takes place during the program, separating it from prior knowledge. The following article explores the benefits of using an external assessment, including efficient measurement of learning outcomes, external benchmarking, advanced and automated reports, and a real-world case study.
Measuring Student Learning Outcomes
The quality of business education is about giving learners the skills to succeed in the workforce and their careers; that is the most meaningful goal for a business program. Learners are entering a workforce shaped by rapidly changing business environments worldwide. Therefore, we want students to have the knowledge, competencies, and skills they need to meet employers' expectations and to grow beyond what companies expect of them. The challenge schools face is knowing whether they can deliver on that goal and whether their learners are gaining knowledge and developing skills that support the program's learning outcomes. Placement and employment numbers are significant, but they do not tell you whether the program is meeting specific learning goals.
“It’s good to keep in mind that [higher education institutions] don’t work in isolation. Schools have a direct influence on their community, the workforce, and employers, and by producing quality level graduates, we are actually directly influencing societies and cultures and communities in which these institutions operate.”
~ Dr. Olin O. Oedekoven
Creating a business studies assessment and reporting plan that measures all student learning outcomes can feel complex. Generally, end-of-program assessments include a capstone course, thesis, project, or internship. These are good assessment tools, but for the assessment to be useful, it must cover all of the program's established learning objectives. Unfortunately, covering all program-level learning outcomes with a course-level assessment is challenging. Most of the time, course-level (formative) assessments measure proficiency in only one or two learning outcomes, leaving school officials to figure out how to assess the rest. And although formative assessment is crucial to helping faculty test what they teach, it does not provide a complete picture of retained knowledge or demonstrate knowledge growth across the academic program.
Choosing Your Assessment Tools
One way colleges and universities can cover all student learning outcomes in their assessment plan is by choosing an assessment tool that is easy to customize. For some institutions, this may mean creating their own assessment tool, but for most, it means adopting an external assessment service. The benefit of an external exam is that the companies that provide them often have access to larger pools of data because they serve many colleges and universities. An external exam provider can therefore likely offer a school external benchmarking. External benchmarking is a great way to understand how your students perform compared to their peers at other institutions, and it helps you put your scores into perspective, making business studies assessment and reporting more powerful.
Additionally, external comparison reports can be used to showcase the value and quality of a business program to potential learners, industry partners, and alumni. What better way to attract the attention of interested candidates than showing how well graduating students perform against others? Finally, depending on the business program's accreditation (AMBA, AACSB, IACBE, ACBSP) and quality assurance requirements, a school may need the ability to compare its scores externally to other schools with similar demographic characteristics to help satisfy those requirements.
Another common reason business programs decide against internally developing a programmatic exam is that it burdens faculty time and resources unnecessarily. Faculty are already busy teaching, supervising, conducting scholarly activities, and completing other administrative tasks; requiring them to develop, monitor, and continuously update another assessment tool can be too much. Furthermore, the reporting produced by an internal tool is often limited, and organizing the data can be cumbersome. With an external assessment tool, faculty often have access to more advanced reporting features. As a result, they can focus on analyzing assessment results and using data to inform decisions that impact the program and instruction, thereby improving the quality of education for learners.
A Case Study: International Institute in Geneva
The following case study illustrates a real-world example of the benefits that an external exam provides for business studies assessment and reporting.
Surabhi Aggarwal, Ph.D., Academic Dean, International Institute in Geneva
The International Institute in Geneva was founded in 1997 and offers undergraduate and graduate programs in several disciplines, including international relations, media and communication, computer science, business administration, international management, and international trade.
As an education institution, our vision is to develop professionals who are committed to a sustainable society. To that end, I see two components: for students to apply this vision to their chosen fields regardless of specialty, and to foster global citizenship. Our goal is to help students acquire these skills. This vision and philosophy guide the entire learning process, from the curriculum to our module learning objectives as well as our overall programmatic learning objectives. Consulting from Peregrine helped us better understand program learning outcomes and their relationship to specific module learning outcomes; we are grateful for their assistance. A constant endeavor is to establish the link between our vision and philosophy, what and how we teach, and our assessment process. We try to ensure that our students, faculty, and key stakeholders understand the link between our vision, learning outcomes, assessments, and teaching.
With Peregrine we use an inbound and outbound method of assessment: upon admission, students are evaluated to determine their initial level in the different subject areas, and when students are about to graduate, they take the outbound exam. For undergraduates, this is at the end of the three-year program; for graduate students, the outbound test is taken at the end of their one-year master's program.
To ensure full coverage, we require all entering students to take the Inbound Exam; for the Outbound Exam, we have linked the assessment to our capstone module. We allocate 10% of the module grade to the Peregrine assessment, which breaks down to 5% for taking the test and 5% for a score of 50% or better. It is important to provide an incentive to ensure that students put in a good effort, since reliable results depend on how seriously students take the exam. We made this change a while back, and since then we have seen students taking the test more seriously, which gives us more accurate results. Our standard benchmark is to see a minimum of a 10% increase in scores from the inbound assessment to the outbound assessment.
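The two rules described above, the grading incentive and the improvement benchmark, are straightforward to operationalize. The sketch below is a hypothetical helper, not part of Peregrine's software, and it assumes the "10% increase" benchmark means percentage points rather than a relative change:

```python
def peregrine_component(took_exam: bool, score_pct: float) -> float:
    """Module-grade contribution (out of 10 points) for the assessment:
    5 points for taking the test, 5 more for scoring 50% or better.
    Hypothetical helper mirroring the policy described in the case study."""
    points = 0.0
    if took_exam:
        points += 5.0
        if score_pct >= 50.0:
            points += 5.0
    return points


def meets_benchmark(inbound_avg: float, outbound_avg: float,
                    target_gain: float = 10.0) -> bool:
    """True when the inbound-to-outbound score increase meets the program's
    minimum benchmark (assumed here to be 10 percentage points)."""
    return (outbound_avg - inbound_avg) >= target_gain
```

For example, a student who sits the exam and scores 62% earns the full 10 points, while a cohort moving from a 52% inbound average to a 63% outbound average clears the benchmark.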
We are pleased with how results have improved over the years, and compared to our peers in Europe, we are doing rather well. The opportunity to evaluate the results longitudinally, not only internally but also externally against similar institutions, is an advantage we like with Peregrine. Of course, we do identify certain areas where we need to improve. We use the data we have collected to deliberate and determine whether a subject area is not adequately covered in a module or whether material needs to be presented differently. However, we take it one step further because we want to see if we are getting the same feedback from students, so we survey them on the efficacy of module content and teaching methods.
As an example, a few years back students' scores in economics and organizational behavior were not showing the 10% increase between the Inbound and Outbound exams. We therefore evaluated how students responded to questions on the economics and organizational behavior modules and asked ourselves, 'Was the material delivered in an engaging way?', 'Was the material relevant?', and so on. We came to realize that a different approach to teaching the material was needed, so we restructured the modules and made a faculty change. As a result, we saw an improvement in both the outbound test results and student satisfaction with the modules; it was all logically correlated. With this evaluation, change, and re-evaluation, we were able to close the loop in the continuous improvement cycle.
With the help of Peregrine, we have been able to strengthen our assessment process by adding customized test banks to be in alignment with several specializations we have in place. All our programs are designed to have 60% business topics with the remaining 40% dedicated to specialization areas such as digital media, computer science, international relations, to name a few. Therefore, it is important that our assessments cover both business topics and specialization topics. With this combination we are getting a good global view of our different undergraduate and graduate programs.
To read the full case study, download a FREE copy of the whitepaper, The Use of Summative Assessment to Improve Quality in Business Administration Programs Outside the U.S.
An External Measure for Schools Outside the United States
Business programs teach, 'You are what you measure.' As such, we must practice what we teach. The Business Administration Assessment is a highly flexible and customizable assessment service that allows you to evaluate your students' performance against the rest of the world and gauge where your students stand. The assessment service also helps you understand whether you are achieving your quality goals.
The Business Administration Assessment is more than an exam; it is a complete solution. The assessment comes with 16 reports to help you analyze your results, identify strengths, and discover areas for improvement. The solution helps you streamline your business studies assessment and reporting processes, supporting a sustainable system of continuous improvement. Look at the various reports available below.
Click Here to Download the Quick Guide to Peregrine’s Reports & Client Admin
Individual, Programmatic, Supplemental, and Aggregate Reports
Individual Results Report. A learner-by-learner report of the exam results in Excel that shows the scores and percentiles obtained on the exam at the Topic and Subject levels.
Pairwise Report. A report that shows learner-by-learner results when the same learners who took the Inbound Exam also complete a Mid-point or Outbound Exam. The differences in scores are displayed.
Pairwise Report Executive Summary. A summary of the Pairwise Report showing the average Inbound, Mid-point, and Outbound exam results of learners who completed these exams.
Internal Analysis Report. A report of a selected set of exams compared to an aggregate pool. Results are compared at the topic and subject levels based on percent scores and percentile rankings to determine if student performance is below, at, or above desired thresholds.
Internal Analysis Report Executive Summary. An abbreviated version of the Internal Analysis Report that is commonly used to share the summarized results with stakeholders.
External Comparison Report. A report of a selected set of exams comparing the results against one or more aggregate pools. Comparisons include a comparison of the scores and a comparison of percentage change when Inbound and/or Mid-point exams are included.
External Comparison Report Executive Summary. An abbreviated version of the External Comparison Report that is commonly used to share the summarized results with stakeholders.
Program-Cohort Comparison Report. A side-by-side comparison of the results between two or more academic programs or cohorts of learners where there is an overlap of the exam topics. The report is used to understand any differences existing between the groups.
Longitudinal Report. A side-by-side comparison of the same exam over different exam periods. Up to four exam periods can be shown on the report. The report is most often used to evaluate programmatic change and to understand trends over time.
Gap Analysis Report. A report that identifies potential learning gaps associated with specific response distractors, based on the percent scores relative to a selected aggregate pool and color-coded to a pre-selected percentile benchmark. The report combines elements from the Longitudinal Report, the Internal Analysis Report, and the Response Distractors Report.
Response Distractors Report. A report that summarizes why learners answered questions incorrectly based on five types of response distractors. The report compares the school's results against both the test bank and an aggregate pool to see if learners are selecting incorrect responses at disproportionately higher or lower rates.
Learner Comparison Report. A comprehensive report that combines data analysis elements from both the Internal Analysis and External Comparison reports.
Student Exit Survey Report. A summary report of the results from an optional student survey administered in conjunction with an Outbound Exam.
Grade Scale Report. A report based upon the school's exam results used to determine a school-specific grading scale for the Outbound Exam based on percentile scoring.
Aggregate Extraction Report. A report with the aggregate data in Excel format that can be used for additional data analysis.
Aggregate Schools Report. A listing of the schools included in each of the aggregate pools.
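To make the arithmetic behind a pairwise analysis concrete, the sketch below is an illustrative approximation, not Peregrine's actual implementation: it matches learners who completed both the Inbound and Outbound Exams by a hypothetical learner ID and computes each learner's score change, the core figure in a Pairwise Report and its executive summary.

```python
def pairwise_gains(inbound: dict[str, float],
                   outbound: dict[str, float]) -> dict[str, float]:
    """Score change (Outbound minus Inbound) for each learner who
    completed both exams; learners missing either exam are excluded,
    as in a pairwise matching. Keys are hypothetical learner IDs."""
    return {lid: outbound[lid] - inbound[lid]
            for lid in inbound.keys() & outbound.keys()}


def average_gain(gains: dict[str, float]) -> float:
    """Average gain across matched learners, the kind of headline
    figure an executive summary would report."""
    return sum(gains.values()) / len(gains) if gains else 0.0
```

For instance, with inbound scores `{"s1": 48.0, "s2": 55.0}` and outbound scores `{"s1": 61.0, "s2": 64.0}`, the per-learner gains are 13 and 9 points, for an average gain of 11.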
Closing Remarks
As you travel the world, and as your institution begins to work with partner institutions from different continents, you might notice that assessment carries different connotations. Some fear the burden, while others are confused and overwhelmed by all the requirements. This assessment tool helps demystify assessment, making business studies assessment and reporting easy. It also helps you implement a continuous assessment cycle, allows you to evaluate your programs and systems, and provides systematic trend data that supports your long-term objectives and the accreditation process. It does not matter what country your institution is in or how many branch campuses you have; your quality and accreditation system will applaud your efforts in implementing an external, programmatic assessment. With quality data and analytics, you know better, and you can do better.