
CBT Assessment Model (2011-2012)

NSU College of Business & Technology

Introduction

The College of Business & Technology has the opportunity to design an assessment model that addresses the expectations of two important constituencies. On the one hand, it must provide the data and narrative necessary to meet the mandates of various accrediting organizations. Equally important, on the other hand, it must enhance the learning outcomes and employment potential of graduate and undergraduate students within the College.

Fortunately, the accreditation process allows disciplines to develop their own learning outcomes, the methods for collecting and analyzing those data, and the means of amending the current curriculum to reflect needed changes. Thus, while providing the mandated accreditation materials, disciplines are able to gain insights into the scope of student learning among their majors and to discern areas that may need further attention. The sessions of planning and discussion inherent in the assessment process are welcome occasions for collegial interaction focused on student learning and curriculum. These discipline-defined discussions provide a much-needed forum for the exchange of ideas, pedagogies, methodologies, course delivery media, and so forth.

NSU CBT Assessment Model

            The basic model employed for assessment purposes within the College of Business and Technology consists of a four-step process. First, each discipline/degree program is expected to define learning outcomes for its major and to construct a matrix indicating which courses are responsible for delivering/teaching each learning outcome. Next, a document must be submitted that indicates what data will be collected to measure each learning outcome. The third step is the actual collection of data, and the final step is the evaluation of those data and the implementation of any needed corrections in the outcome/curriculum/assessment process. Each of the four steps is discussed in greater detail below. Appendix A provides the current status of graduate and undergraduate programs vis-à-vis the assessment process.

            Step 1. Develop Learning Outcomes. The development of learning outcomes is contingent upon the performance objectives that stakeholders desire in college graduates, i.e., the attributes, knowledge, and skills that faculty and potential employers expect from someone with an MBA or BBA degree in a given major. From the NSU College of Business and Technology mission statement, one learns that the College intends to graduate professionals who are analytical, timely, effective communicators and leaders with an understanding of the impact of economic, social, political, and global forces on business (see Appendix B for the complete list of CBT performance objectives).

            With these college-level performance objectives in mind, the CBT Core Committee developed a list of learning outcomes that undergraduate students should be expected to achieve by the time of graduation and that contribute to these performance objectives. These learning outcomes (a.k.a. performance criteria) indicate the specific characteristics a student should exhibit in order to demonstrate achievement/mastery of a criterion. The wording used in the construction of learning outcomes is critical, as the specific choice of an observable action verb must be consistent with the learning domain and level expected (e.g., cognitive, affective, psychomotor). Among several sources listing action verbs by desired learning domain, tables from Rogers's work on assessment planning (2004) are included in Appendix C. The learning outcomes developed are specific performance criteria that address the performance objectives in the CBT mission statement. Thirteen outcomes were identified that address several skill areas (communication, interpersonal and organizational, analytical and critical thinking, business) and the adoption of a global orientation as it relates to business (the CBT Core learning outcomes are listed in Appendix D). In addition to the learning outcomes for the CBT Core curriculum, each graduate and undergraduate degree program has developed its own learning outcomes.

Step 2. Identify a Methodology for Measuring Learning Outcomes. Subsequent to the development of learning outcomes, the committee constructed a curriculum map linking specific outcomes to specific core courses (CBT Core Curriculum Map, Appendix E). Faculty teaching core courses have been asked to identify a specific assignment, course activity, or artifact that best represents/demonstrates each desired learning outcome. In choosing an assessment method for a particular outcome, it is incumbent upon the faculty members to consider issues of validity and reliability. From a validity standpoint, the measurement must be relevant, accurate, and useful; from a reliability standpoint, the measurement must yield the same result over repeated administrations or across different reviewers/evaluators (a simple consistency check is sketched after the list below). Ideally, multiple measures would be collected for each learning outcome to maximize the validity and reliability of learning outcome measurement. Among the many available alternatives for data collection, the following list is offered:

• Archival records (esp. biographical, academic, or other file data available from NSU)

• Behavioral observations (student actions in a natural setting; audio or video recording)

• Exit interviews (face-to-face dialogue; esp. attitude/opinion)

• External examiner (third-party objectivity increases external validity)

• Focus groups (requires a trained moderator; identify trends/patterns in perceptions)

• Locally developed exams (content/style geared to specific goals/objectives)

• Oral exams (depth/breadth of student achievement measured via probes, follow-up)

• Performance appraisal (competency-based, direct measure, real world)

• Portfolios (multiple work samples, longitudinal perspective possible)

• Simulations (competency-based, flexible, provides good internal validity)

• Surveys/questionnaires (cover a broad range of attributes quickly; varied stakeholders)

• Standardized exams (commercial, norm-referenced; external validity)
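
Because reliability across reviewers is easier to discuss with a concrete check in hand, the short Python sketch below computes simple percent agreement between two reviewers who scored the same rubric-based assignment. It is a minimal sketch using hypothetical student IDs and scores, not an adopted CBT procedure; disciplines may prefer other consistency statistics.

    # Minimal sketch with hypothetical data: two reviewers score the same
    # rubric-based assignment; simple percent agreement gauges inter-rater reliability.
    rater_a = {"s01": 3, "s02": 2, "s03": 4, "s04": 3}  # student ID -> score from reviewer A
    rater_b = {"s01": 3, "s02": 3, "s03": 4, "s04": 3}  # same students scored by reviewer B

    common = rater_a.keys() & rater_b.keys()
    agreement = sum(rater_a[s] == rater_b[s] for s in common) / len(common)
    print(f"Agreement across {len(common)} students: {agreement:.0%}")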

All graduate and most undergraduate degree programs have determined how they will measure each learning outcome. To the extent that it has been consulted, the CBT Assessment Committee has recommended the use of course-embedded assessment whenever practical. In addition to individual assignments designated as assessment instruments, many disciplines have developed a pool of questions tied to learning outcomes that are administered on common final exams.
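
To show how course-embedded data of this kind might be rolled up by outcome, the following Python sketch tallies common-final-exam questions by the learning outcome each question is tied to and reports a per-outcome success rate. The question labels, outcome names, and scores are hypothetical placeholders, not actual CBT data.

    # Minimal sketch with hypothetical data: roll up common-final-exam questions
    # by the learning outcome each question measures.
    question_to_outcome = {"Q1": "critical thinking", "Q2": "critical thinking",
                           "Q3": "global issues", "Q4": "ethics"}

    # One record per student: question -> 1 if answered correctly, 0 otherwise.
    student_results = [
        {"Q1": 1, "Q2": 0, "Q3": 1, "Q4": 1},
        {"Q1": 1, "Q2": 1, "Q3": 0, "Q4": 1},
        {"Q1": 0, "Q2": 1, "Q3": 1, "Q4": 0},
    ]

    correct, attempts = {}, {}
    for record in student_results:
        for question, score in record.items():
            outcome = question_to_outcome[question]
            correct[outcome] = correct.get(outcome, 0) + score
            attempts[outcome] = attempts.get(outcome, 0) + 1

    for outcome in sorted(attempts):
        print(f"{outcome}: {correct[outcome] / attempts[outcome]:.0%} of attempts correct")

In practice, each discipline's own question-to-outcome mapping and grade records would replace the placeholder values.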

 

            Step 3. Collect Learning Outcomes Data. The collection of learning outcomes data is simply the implementation of the methodology chosen previously for each outcome. It should be noted that more data are not necessarily better. The more clearly defined the learning outcome, the more focused and efficient the data collection will be. Data with high internal and external validity eliminate the perceived need to assess an endless number of cases. Most degree programs are in the beginning stages of data collection. Although limited data are available at this point, anecdotal evidence suggests that some programs are beginning to see the benefits of the assessment process. Even with incomplete data, program effectiveness in achieving particular learning outcomes can be ascertained. Indeed, in more mature assessment milieus it is not uncommon to stage the assessment of outcomes, i.e., to collect data systematically but periodically (e.g., every other semester or every other year). Thus, a reasonable goal might be to expect every degree program to begin its data collection efforts this year, even if every assessment instrument/methodology has not been decided.
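
As a simple illustration of such staged, periodic collection, the Python sketch below rotates a set of outcomes across upcoming terms so that each outcome is assessed on a cycle rather than every semester. The outcome names and terms are hypothetical placeholders rather than a proposed CBT schedule.

    # Minimal sketch with hypothetical names: round-robin assignment of learning
    # outcomes to assessment terms so no single semester carries every outcome.
    outcomes = ["oral communication", "written communication", "critical thinking",
                "ethics", "global issues", "teamwork"]
    terms = ["Fall 2011", "Spring 2012", "Fall 2012", "Spring 2013"]

    schedule = {term: [] for term in terms}
    for i, outcome in enumerate(outcomes):
        schedule[terms[i % len(terms)]].append(outcome)

    for term, assigned in schedule.items():
        print(term, "->", ", ".join(assigned))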

 

            Step 4. Interpret and Integrate Outcome Data. At this final stage, the faculty response is more often “How?” than “How come?” Keeping in mind that the goal of assessment is to document and to improve student learning, “How come?” is quickly addressed. However, questions regarding how to interpret or how to integrate learning outcome data may require more reflection. Needless to say, the more thought that goes into the formulation of a learning outcome and the method used to measure that outcome, the more straightforward the interpretation of the data should be. Ambiguities that arise during interpretation generally suggest poor validity for the assessment instrument, i.e., the measurement is not a direct/strong example of the learning outcome. This type of realization is an inherent part of the assessment process and simply points to the need for a new or additional measurement for that learning outcome. When the data suggest that achievement of a particular learning outcome has been ineffective, there are three general responses available:

• the specific learning outcome may be modified,

• the instrument(s) used to measure that outcome may be modified, or

• the curriculum may be modified.

Faculty within that discipline will determine which of these three options is appropriate. 

            At present, only three of our programs have reported collecting complete learning outcomes data. Documentation on whether or how this information is being used to inform the curriculum or the learning environment has not been made available to the CBT Assessment Committee. Rogers (2004) recommends that the following questions be asked during the evaluation and feedback stage:

            Are the linkages between student learning and curricular strategies present?

                        Do students get an opportunity to learn and develop in ways that will enable them to demonstrate the anticipated outcomes at the appropriate level?

            Are the processes efficient?

                        Are the data collected in a systematic fashion that would indicate both sensitivity to faculty workload and an understanding of the assessment question?

            Is the timeline appropriate?

                        Are data being collected in such a way that the assessment question can be answered without trying to assess every outcome every semester for every student?

            Does the plan pass the “Does this make sense?” test?

                        When all the processes, activities, timelines, etc. are laid out, do they look reasonable given the available resources and the need to create credible processes?

 

            Final Thoughts. 

1. We need to apprise faculty within the College of our expectations for assessment progress for the 2011-2012 academic year. 

For example:

• we expect learning outcomes and a matrix of ‘courses by outcomes’ by the end of the Fall 2011 semester, and

• we expect at least partial data collection to begin during the Spring 2012 semester.

2. If, as Kathy and I discussed, we are expanding the membership of the CBT Assessment Committee, we need to make that official immediately and bring the group together for discussion of the committee’s charge for the 2011-2012 academic year.

3. We need to make a decision regarding the adoption of e-portfolios for college assessment, and regarding the particular application software we intend to use, in order to capitalize on the current interest in and curiosity about e-portfolios. If input is desired from a broad spectrum of faculty, this speaks to the need to bring the expanded Assessment Committee together sooner rather than later.

4. We need to pay attention to the informal channels of communication for evidence of paranoia regarding the assessment process and take every opportunity presented to assure faculty that they are not being evaluated with these assessment data. Though often unspoken, a paramount concern in the final assessment stage is that the data not be used as a measure of individual faculty effectiveness. To allow for such a connection to be made could completely undermine needed faculty cooperation in this process. The learning outcome literature is rife with warnings regarding this ‘misuse’ of outcome data. 

 

Bibliography

Gronlund, N. E. (1981). Measurement and evaluation in teaching (4th ed.). New York: Macmillan.

McBeath, R. J. (Ed.). (1992). Instructing and evaluating in higher education: A guidebook for planning learning outcomes. Englewood Cliffs, NJ: Educational Technology Publications.

Rogers, G. M. (2004). Assessment planning flow chart [CD]. Rose-Hulman Institute of Technology.


 

 

Appendix A: NSU College of Business and Technology Program Assessment Status

 

 

B.B.A. Core, Majors, B.S., B.T. Degrees
August 2011

Degree/Program                            Have Learning   Method to Measure   Current Outcome   Method to Integrate
                                          Outcomes        Outcomes            Data Avail.       Outcome Data

B.B.A.'s
 1. Accounting                            ✓
 2. Business Admin.                       ✓               ✓                   ✓
 3. Entrepreneurship                      ✓               ✓
 4. Financial Mgmt.                       ✓               ~
 5. Financial Planning                    ✓               ~
 6. Hosp./Tourism Mgmt.                   ✓               ✓                   ✓                 ✓
 7. Information Systems                   ✓               ✓                   ✓
 8. International Business                ✓               ✓                   ✓                 ✓
 9. Management-General                    ✓               ✓                   ✓                 ✓
10. Marketing                             ✓               ~
11. Supply Chain Mgmt.                    ✓
12. B.S. Environ. Health/Safety Mgmt.     ✓               ✓                   ✓
13. B.S. Health Care Adm.                 ✓               ✓                   ✓                 ~
14. B.T.                                  ✓
15. B.B.A.                                ✓               ✓                   ✓                 ?

 

 

 

 

College of Business and Technology Graduate Degrees
Assessment Status
August 2011

Degree     Have Learning   Method to Measure   Current Outcome   Method to Integrate
           Outcomes        Outcomes            Data Avail.       Outcome Data

MBA        ✓               ✓                   ✓
PMBA       ✓               ✓
MAFA       ✓               ✓                   ~
MS EHS     ✓

 

 

 

 

Appendix B. Performance Objectives from CBT Mission Statement

 

The NSU College of Business and Technology has as its purpose to graduate professionals who:

            A. use an analytical and systems approach to sound problem solving and decision making;

            B. communicate effectively;

            C. have current, relevant knowledge;

            D. understand the economic, social, political, legal, and technological environment of business in a global context;

            E. are prepared to be dynamic leaders and effective team members;

            F. are professional in their attitude and approach to a career; and

            G. appreciate the civic, social, and ethical responsibilities of business organizations and of business professions.

 


 

 

Appendix C: Lists of Cognitive, Affective, and Psychomotor Action Verbs

(Source: Gloria M. Rogers, “Assessment Planning Flow Chart,” Rose-Hulman Institute of Technology, 2004)

Cognitive learning is demonstrated by knowledge recall and the intellectual skills: comprehending information, organizing ideas, analyzing and synthesizing data, applying knowledge, choosing among alternatives in problem-solving, and evaluating ideas or actions.

Level: Knowledge
Illustrative verbs: arrange, define, describe, duplicate, identify, label, list, match, memorize, name, order, outline, recognize, relate, recall, repeat, reproduce, select, state
Definition: remembering previously learned information
Example: memory of specific facts, terminology, rules, sequences, procedures, classifications, categories, criteria, methodology, principles, theories, and structure

Level: Comprehension
Illustrative verbs: classify, convert, defend, describe, discuss, distinguish, estimate, explain, express, extend, generalize, give examples, identify, indicate, infer, locate, paraphrase, predict, recognize, rewrite, report, restate, review, select, summarize, translate
Definition: grasping the meaning of information
Example: stating a problem in one's own words, translating a chemical formula, understanding a flow chart, translating words and phrases from a foreign language

Level: Application
Illustrative verbs: apply, change, choose, compute, demonstrate, discover, dramatize, employ, illustrate, interpret, manipulate, modify, operate, practice, predict, prepare, produce, relate, schedule, show, sketch, solve, use, write
Definition: applying knowledge to actual situations
Example: taking principles learned in math and applying them to figuring the volume of a cylinder in an internal combustion engine

Level: Analysis
Illustrative verbs: analyze, appraise, break down, calculate, categorize, compare, contrast, criticize, diagram, differentiate, discriminate, distinguish, examine, experiment, identify, illustrate, infer, model, outline, point out, question, relate, select, separate, subdivide, test
Definition: breaking down objects or ideas into simpler parts and seeing how the parts relate and are organized
Example: discussing how fluids and liquids differ, detecting logical fallacies in a student's explanation of Newton's 1st law of motion

Level: Synthesis
Illustrative verbs: arrange, assemble, categorize, collect, combine, comply, compose, construct, create, design, develop, devise, explain, formulate, generate, integrate, manage, modify, organize, plan, prepare, propose, rearrange, reconstruct, relate, reorganize, revise, rewrite, set up, summarize, synthesize, tell, write
Definition: rearranging component ideas into a new whole
Example: writing a comprehensive report on a problem-solving exercise, planning a program or panel discussion, writing a comprehensive term paper

Level: Evaluation
Illustrative verbs: appraise, argue, assess, attach, choose, compare, conclude, contrast, defend, describe, discriminate, estimate, evaluate, explain, judge, justify, interpret, relate, predict, rate, select, summarize, support, value
Definition: making judgments based on internal evidence or external criteria
Example: evaluating alternative solutions to a problem, detecting inconsistencies in the speech of a student government representative




Affective learning is demonstrated by behaviors indicating attitudes of awareness, interest, attention, concern, and responsibility, ability to listen and respond in interactions with others, and ability to demonstrate those attitudinal characteristics or values which are appropriate to the situation and the field of study.

Level: Receiving
Illustrative verbs: asks, chooses, describes, follows, gives, holds, identifies, locates, names, points to, selects, sits erect, replies, uses
Definition: willingness to receive or attend
Example: listening to discussions of controversial issues with an open mind, respecting the rights of others

Level: Responding
Illustrative verbs: answers, assists, complies, conforms, discusses, greets, helps, labels, performs, practices, presents, reads, recites, reports, selects, tells, writes
Definition: active participation indicating positive response or acceptance of an idea or policy
Example: completing homework assignments, participating in team problem-solving activities

Level: Valuing
Illustrative verbs: completes, describes, differentiates, explains, follows, forms, initiates, invites, joins, justifies, proposes, reads, reports, selects, shares, studies, works
Definition: expressing a belief or attitude about the value or worth of something
Example: accepting the idea that an integrated curriculum is a good way to learn, participating in a campus blood drive

Level: Organization
Illustrative verbs: adheres, alters, arranges, combines, compares, completes, defends, explains, generalizes, identifies, integrates, modifies, orders, organizes, prepares, relates, synthesizes
Definition: organizing various values into an internalized system
Example: recognizing one's own abilities, limitations, and values and developing realistic aspirations

Level: Characterization by a value or value complex
Illustrative verbs: acts, discriminates, displays, influences, listens, modifies, performs, practices, proposes, qualifies, questions, revises, serves, solves, uses, verifies
Definition: the value system becomes a way of life
Example: a person's lifestyle influences reactions to many different kinds of situations




Psychomotor learning is demonstrated by physical skills: coordination, dexterity, manipulation, grace, strength, speed; actions which demonstrate the fine motor skills such as use of precision instruments or tools, or actions which evidence gross motor skills such as the use of the body in dance or athletic performance.

Level: Perception
Illustrative verbs: chooses, describes, detects, differentiates, distinguishes, identifies, isolates, relates, selects, separates
Definition: using sense organs to obtain cues needed to guide motor activity
Example: listening to the sounds made by guitar strings before tuning them, recognizing sounds that indicate malfunctioning equipment

Level: Set
Illustrative verbs: begins, displays, explains, moves, proceeds, reacts, responds, shows, starts, volunteers
Definition: being ready to perform a particular action: mental, physical, or emotional
Example: knowing how to use a computer mouse, having an instrument ready to play and watching the conductor at the start of a musical performance, showing eagerness to assemble electronic components to complete a task

Level: Guided response
Illustrative verbs: assembles, builds, calibrates, constructs, dismantles, displays, dissects, fastens, fixes, grinds, heats, manipulates, measures, mends, mixes, organizes, sketches
Definition: performing under the guidance of a model: imitation or trial and error
Example: using a torque wrench just after observing an expert demonstrate its use, experimenting with various ways to measure a given volume of a volatile chemical

Level: Mechanism
Illustrative verbs: (same list as for guided response)
Definition: being able to perform a task habitually with some degree of confidence and proficiency
Example: demonstrating the ability to correctly execute a 60-degree banked turn in an aircraft 70 percent of the time

Level: Complex or overt response
Illustrative verbs: (same list as for guided response)
Definition: performing a task with a high degree of proficiency and skill
Example: dismantling and re-assembling various components of an automobile quickly with no errors

Level: Adaptation
Illustrative verbs: adapts, alters, changes, rearranges, reorganizes, revises, varies
Definition: using previously learned skills to perform new but related tasks
Example: using skills developed learning how to operate an electric typewriter to operate a word processor

Level: Origination
Illustrative verbs: arranges, combines, composes, constructs, creates, designs, originates
Definition: creating new performances after having developed skills
Example: designing a more efficient way to perform an assembly-line task




 


 

 

Appendix D: CBT Core Learning Outcomes

Communication Skills:

Each BBA student will make a professional presentation judged as meeting professional standards prior to graduation.

Each BBA student will present at least three pieces of written communication judged to be effective prior to graduation.

Interpersonal & Organizational Skills:

Each BBA student will demonstrate effective negotiation skills prior to graduation.

Each BBA student will present evidence of experience, prior to graduation, in two of the following professional activities: business etiquette, professional development, career planning, or leadership in a student or professional organization.

Each BBA student will show evidence of effective participation in a team setting, including evidence of being an effective team member and team leader.

Each BBA student will show evidence of effective participation on teams with diverse or multicultural membership.

Analytical and Critical Thinking Skills:

Each BBA student will be able to apply appropriate technology skills in a professional setting.

Each BBA student will show evidence of critical thinking and problem solving as judged through an integrative project or assignment.

Business Skills:

Each BBA student will demonstrate the requisite business knowledge deemed appropriate by the business curriculum committee.

Each BBA student will demonstrate the ability to apply the appropriate business skills to solve a business problem.

Each BBA student will demonstrate the ability to work effectively in a cross-disciplinary setting.

Each BBA student will recognize and analyze ethical dilemmas and propose resolutions for practical business situations.

Global Orientation:

Each BBA student will develop an awareness of global/international issues as they relate to business.

 


 

 

Appendix E: CBT Core Curriculum Map

The curriculum map cross-references each course in the CBT core against the CBT Core learning outcomes, marking with an "X" each outcome that a course is responsible for addressing. The outcome columns are Oral, Written, Negotiate, Team, Diversity, Tech Skill, Critical Thinking, Apply, Integrate, Ethics, and Global Issues. The mapped courses are the prerequisites (SPCH 1113, ENGL 1113, GEOG/SOC SCI 1114, IS 1003, MATH 1513, and BA 3933) and the core courses ECON 2113, ACCT 2103, ECON 2213, ACCT 2203, MKT 3213, MGMT 3183, IS 3063, IS 3113, BA 3XX3, FIN 3213, BLAW 3003, MGMT/TECH 3223, and MGMT 4213.