Assessment: More than just a dirty word


By Kay L. Colley

Texas Wesleyan University



Assessment: The mere mention of the word can send chills up and down the spine of any new or seasoned student media adviser. Whispered in hushed tones or thrown around as an expletive, this 10-letter word connotes educational balderdash, busywork and just plain wrong-headedness to many in the ranks of college media. But much as student media advisers are misunderstood by administrators, assessment is misunderstood by many advisers.

According to the National Academy for Academic Leadership, assessment is a process that describes the current situation of a person, program or unit and provides evidence for that analysis. Assessment involves goals or outcomes, processes and inputs. Goals or outcomes are the desired end results. Processes are the things a person or program does to reach the goals or outcomes, and inputs are the resources needed to accomplish them: students, faculty and staff members, buildings, money, technology and so on. Assessment methods can include surveys, focus groups, portfolios and direct observation, with multiple methods being the preferred way to demonstrate that goals or outcomes have been met.

Sounds simple so far, right? All you have to do is point to readership surveys, weekly critique sessions and “hey, we won some awards.” That should count for something, right?

“No, but…” said Nakia Pope, director for the Center for Excellence in Teaching and Learning at Texas Wesleyan University. “Often it’s the case that the criteria and the method of scoring are not known or disseminated. The ‘but’ would come in when it is.”

In an editor’s corner letter from the Summer/Fall 2005 edition of College Media Review, Pat Parish, former editor of College Media Review and former associate director of student media at Louisiana State University, ruled out contest awards as valid assessments.

“Judges can be fickle, or sloppy, or biased; or your medium loses out to a medium where the adviser’s hand is too often on the mouse,” Parish wrote.

So those Pacemaker awards and Mark of Excellence Awards may look good hanging on the wall, but they won’t necessarily stand up as assessment methods. Assessment methods are only useful when they are valid and reliable, according to the National Academy for Academic Leadership.

ASSESSMENT VALIDITY

Validity is the ability of an assessment tool to measure what it claims to measure. Reliability means your assessment tool will perform consistently across successive uses. Remember Parish’s statement that judges can be fickle? That fickleness is why contests aren’t good assessment tools: the results aren’t reliable, even though contests may be good tools for demonstrating your program’s overall excellence to administrators.

The same fickleness may also be evident in your readership surveys. Just because one group of students likes the weekly sex column doesn’t mean next semester’s students, or even a different sample of students, will. Your readership survey methodology must also be reliable and valid for the survey to serve as an assessment document. Testing your survey before implementing it helps make sure you get the answers you need, and using the same survey year after year allows you to compare results over time.
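Reliability can also be checked quantitatively. As an illustrative sketch only (the questions and scores below are hypothetical, not drawn from any survey mentioned here), Cronbach’s alpha, a standard internal-consistency statistic, indicates whether related survey questions produce consistent answers:

```python
# Illustrative sketch: estimating a readership survey's internal-consistency
# reliability with Cronbach's alpha. All item scores below are hypothetical.

def cronbach_alpha(items):
    """items: one list of respondent scores per survey question."""
    k = len(items)                      # number of questions
    n = len(items[0])                   # number of respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_variance_sum = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_variance_sum / variance(totals))

# Five respondents answering three related questions on a 1-5 scale.
responses = [
    [4, 5, 3, 4, 2],   # "I read the paper every week"
    [4, 4, 3, 5, 2],   # "The paper covers stories I care about"
    [5, 4, 2, 4, 3],   # "I would recommend the paper to a friend"
]
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")  # about 0.86
```

By rule of thumb, an alpha of roughly 0.7 or higher suggests the questions measure the same underlying attitude consistently enough to compare from year to year.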

Now what about those weekly critiques? Certainly they can be used for assessment.

Critiques are valid forms of assessment if they pass the reliability and validity tests, which usually means creating a rubric and sharing it with your staff before critiquing.
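Inter-rater agreement offers one concrete reliability check: have two editors score the same stories against the shared rubric and measure how often they agree beyond chance. As a minimal sketch, assuming hypothetical ratings (no program mentioned here reports using this particular statistic), Cohen’s kappa quantifies that agreement:

```python
# Illustrative sketch: two editors apply the same rubric category to ten
# stories; Cohen's kappa measures their agreement beyond chance. All
# ratings below are hypothetical.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    expected = sum(
        (rater_a.count(label) / n) * (rater_b.count(label) / n)
        for label in labels
    )
    return (observed - expected) / (1 - expected)

editor_1 = ["meets", "meets", "misses", "meets", "misses",
            "meets", "meets", "misses", "meets", "meets"]
editor_2 = ["meets", "misses", "misses", "meets", "misses",
            "meets", "meets", "meets", "meets", "meets"]
print(f"Cohen's kappa: {cohens_kappa(editor_1, editor_2):.2f}")  # about 0.47
```

A kappa near 1.0 means the rubric produces consistent scores no matter who applies it; a value near zero means the scores owe more to chance than to shared criteria.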

In a recent interview, Parish said the diverse structures of student media throughout the nation make assessment standards difficult to create, which may be why the Council for the Advancement of Standards in Higher Education (CAS) still lacks nationwide standards for student media.

In the Winter/Spring 2006 edition of College Media Review, Merv Hendricks, director of student publications at Indiana State University, wrote an article titled “A Path to Assessment” that detailed Indiana State’s assessment plan. Because Indiana State’s student media fell under student affairs, Hendricks and his colleagues reviewed the CAS standards in search of an assessment format that was reliable and valid. CAS had not yet developed standards for student media, although the conversation had started with College Media Advisers (now College Media Association) and the Associated Collegiate Press. So Hendricks and his colleagues adapted the CAS standards written for other student affairs departments, developing Indiana State’s student media assessment plan. He then offered the plan to CAS.

CAS REVIEWS STANDARDS

Seven years later, CAS is still reviewing standards for student media. A vote on the standards under review will occur at the November meeting of the full CAS board. CMA is listed on the CAS website as the professional association providing feedback on the developing standards. Parish, who has served as CMA’s representative to CAS, said the process has been long and difficult. Once the standards are approved, CAS will craft assessment methods.

In the meantime, some student media outlets have begun assessment programs based on their own standards. Indiana State’s assessment plan, adapted from CAS standards, appears in the Winter/Spring 2006 edition of College Media Review, and selected documents are available on the Indiana State student affairs website: http://www.indstate.edu/studentaffairsresearch/StudentPubsRA.htm.

Texas Tech University also crafted a student media assessment plan based on CAS standards and Indiana State’s assessment plan.

At Oregon State University, Kami Hammerschmith, assistant director of student media, represents student media on the Student Affairs Assessment Council, which was started by the Division of Student Affairs. Departments within student affairs are required to write an annual assessment plan and assessment report.

“The plan includes our mission, goals, learning and/or business outcomes and assessment methods,” she said. “In the report, we add the implementation, results and decisions/actions/recommendations shared. This is a standard format for all Student Affairs assessment plans and reports.”

LEARNING TOOL

Hammerschmith said the information is entered into Compliance Assist software in a “read only” capacity so all members of the Student Affairs Assessment Council can see the other departments’ plans. Each year, council members are paired to review another department’s assessment report, so every department receives annual feedback.

“The review process is a great learning tool,” Hammerschmith said.

At Fairfield University, Lei “Tommy” Xie, assistant professor in the department of English and director of the journalism program, said there is no formal assessment mechanism for the entire newspaper. “However, the journalism practicum we have has been a useful tool to ensure that the staff are aware of their performances,” he said. “Also, a media board offers feedback, mostly informal, to the paper at least twice a semester to give the staff a sense of how the publication is received by various groups on and off campus.”

Rachele Kanigel, associate professor of journalism at San Francisco State University and president-elect of College Media Association, said SFSU doesn’t assess student media, but the program does prepare a report each year for Instructionally Related Activities funds, and its accreditation report for the Accrediting Council on Education in Journalism and Mass Communications includes a section on publications. Kanigel also teaches Publications Laboratory, which helps produce the Golden Gate Xpress magazine, website and newspaper. Her syllabus contains extensive tools for grading student performance, but grades cannot be used for assessment.

So just how can you assess student media? At the University of Texas at Arlington’s Shorthorn, Lloyd Goodman, former director of student media, has been assessing student media for at least 10 years. Assessment is done at both the organizational and individual levels, helping Goodman and his staff plan training, staff development and resource allocation.

“We came up with that (assessment plan) many years ago before we were required to do assessment,” Goodman said. “We were seeing in our operation, and we were hearing it from some of the students, if they stayed around semester to semester to semester, they felt like they were doing more but not necessarily doing better. They felt like they were doing the same quality work semester to semester to semester, and our commitment to them was ‘you stick around, we will help you improve.’ We agreed with that. It doesn’t matter how many awards you win.”

The assessment plan Goodman and his staff put in place began with looking at what students should be able to do from semester to semester. Goodman asked what quantifiable skills students should learn each semester, then developed that skill set, or list of competencies, for each position and each semester’s worth of experience.

“We would use this for our basis for training. If 99 percent of your staff is in the first semester, then obviously your training focus is going to be on the first-semester skills,” he said. “We developed it, discussed it and posted it for students on staff as this is our commitment to you, and this is what you should be doing.”
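A competencies grid like the one Goodman describes is, at bottom, a simple lookup from position and semester to expected skills. Purely as a hypothetical sketch (the positions and skills below are invented, not The Shorthorn’s actual grid), it might look like this, with a helper that picks the training focus from the staff’s experience levels:

```python
from collections import Counter

# Hypothetical competencies grid: position -> semester of experience -> skills.
# The entries are invented for illustration; they are not The Shorthorn's grid.
COMPETENCIES = {
    "reporter": {
        1: ["write a basic news story", "conduct an interview"],
        2: ["develop beat sources", "file public-records requests"],
    },
    "photographer": {
        1: ["shoot a news assignment", "write accurate cutlines"],
        2: ["edit a photo story", "mentor first-semester shooters"],
    },
}

def training_focus(staff_semesters):
    """Return the most common experience level on staff -- the semester
    whose skill set training should emphasize."""
    return Counter(staff_semesters).most_common(1)[0][0]

# If nearly everyone is in a first semester, training targets semester-1 skills.
staff = [1, 1, 1, 1, 2]
focus = training_focus(staff)
print(focus)                                # -> 1
print(COMPETENCIES["reporter"][focus])      # first-semester reporter skills
```

Keeping the grid in one shared structure makes it easy to post for the staff, as Goodman’s group did, and to pull a position-by-position checklist when review time comes.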

After establishing criteria, Goodman’s professional staff selected specific job categories and a specific two-week period to review. “That’s what turned it into assessment,” Goodman said. “I thought it provided really nice, usable information that we could then take and build into creating the operation for the next semester.

“The way we do assessment, picking a couple of specific staff groups from the competencies grid keeps the assessment manageable, since we’re working from portfolio reviews,” he said. “We would never try to do all positions, involving 70 student staffers, at the same time.”

STREAMLINING THE PROCESS

Goodman said he has worked to make assessment something The Shorthorn can use but that also fits into the University’s idea of assessment. For seven or eight years, that meant an ongoing process of change, with The Shorthorn doing both internally relevant and externally mandated assessment. That has since changed.

“We were too ambitious,” Goodman said. “Now, we’ve streamlined the process.”

Now, Goodman selects two or three individual learning outcomes for students and two or three objectives for the department each year, working with assessment specialists at UTA to get data that helps The Shorthorn and fits into the University’s assessment plan.

“That’s an ongoing challenge that we’ve had,” he said.

But the first step for Goodman and UTA was determining what was important. What were The Shorthorn’s goals and how would The Shorthorn staff and students know they had reached those goals? This is the heart of assessment.

While student media organizations may function differently (some as clubs, some as classroom projects and others as divisions within student affairs), their goals, as defined by student media mission statements, seem to be pretty much the same from one organization to another: creating a learning environment for students.

“Assessment is about demonstrating that we are accomplishing what we claim to be accomplishing,” said John W. Williams, associate professor of political science at Principia College and former chair of the Mass Communication Department. “If we are failing, we need to change our practices or change our claims.”

So assessment isn’t a “gotcha” moment for student media. It is a way to demonstrate that we do what we do well—all balderdash, busywork and wrong-headedness aside.


Kay L. Colley is an associate professor of mass communications at Texas Wesleyan University and the student media faculty liaison there. She is working on articles about convergent journalism and curriculum, weighing whether curriculum impacts student media or student media impacts curriculum.