Research (Vol. 53) — Corrections and the College Web


Exploring the use of corrections on college newspapers’ websites

Kirstie E. Hettinga
California Lutheran University
Rosemary Clark
The Annenberg School for Communication at University of Pennsylvania
Alyssa Appelman
Northern Kentucky University

Errors: “Ahhhhh.” Photo by Kenny Louie via Creative Commons

Abstract: A previous study found that college newspapers have perceived levels of credibility on par with their professional counterparts, but suggested that quality could be assessed in other ways. Previous research has documented the potential for error corrections to increase perceptions of quality. In a content analysis of College Media Association members’ websites (N = 419), the researchers found that some college publications are publicizing corrections, but some are not. Additionally, these practices seem to depend on publication and university differences. Similarities between college and professional publications are noted, and recommendations for improvement are discussed.


Introduction

“The Daily Illini prides itself on the accuracy of its reporting … When The Daily Illini makes a mistake in its print publication, a correction will run on page 2A as soon as possible. When The Daily Illini makes a mistake in … its online publication, the article will remain posted with a disclaimer listing the mistake and the appropriate changes made to the article.”

This policy from the University of Illinois’ daily student newspaper may be the exception, not the rule. While some newspapers, such as The New York Times, have established policies for errors and corrections both online and in print, this is not the case for all professional publications.[1] As such, corrections may be an area in which college and professional newspapers differ.

For example, The Daily Illini, despite its policy above, publishes some issues that yield no corrections on its website. By comparison, in one week, The New York Times averaged about nine corrections per day, with three corrections being a light day and 16 a heavy day. Certainly, The New York Times would be expected to have more content and, therefore, more mistakes than other publications, but are readers to believe that on the days The Daily Illini printed no corrections, there were absolutely no mistakes in the issue? Sources are often reluctant to point out errors, and “Newspapers can hardly be expected to correct errors that they do not know were made.”[2] However, because college newspapers are learning environments, mistakes are inevitable. So why are there so few corrections?

In online publications, the most likely explanation is that errors were made but their corrections were not publicized. Professional news organizations have been inconsistent in their online corrections policies.[3] Some are simply correcting the error without acknowledging it was there in the first place (i.e., “scrubbing”[4]), while others are correcting the mistake as a note on the original page. Is this the case with college publications? Are student editors merely ignoring the mistakes and leaving them online? Are they fixing the mistakes without acknowledging them? Or are they adequately publicizing their errors and corrections? And do these practices depend on publication and university differences? This research explores how college newspapers use corrections on their websites in order to address such questions.

Literature Review

Student newspapers are often considered a proving ground for students who wish to pursue careers in communication and journalism. However, while some student newspapers are independent through advertising, many smaller student papers require supplemental funding to survive, which may be acquired through student fees or funding from their schools. This dependence on their institutions has actually increased in recent years, as college newspapers, like their professional counterparts, struggle with underpaid staffs[5] and lackluster interest in print ads.[6]

With this institutional connection, many campus newspapers tend to be described as experiential learning opportunities. The characteristics most consistent with experiential learning include promoting students’ initiative, providing students with regular comments and suggestions on assignments, and giving them the ability to learn from their mistakes.[7] A case study of The Muleskinner, Central Missouri State University’s student newspaper, documented the publication’s move under the oversight of the mass communication department.[8] While this setup proved to be effective for Central Missouri (it still runs as a laboratory newspaper today), not all schools have had success with this arrangement. Also, the financial dependence on universities and student fees can lead to fraught relationships and coverage. Indeed, much research regarding student newspapers addresses the relationship between school administration and campus publications,[9] the potential for censorship[10] and issues of control.[11]

No matter the funding structure, “student newspapers can … be valuable and semi-realistic environments in which to teach management, advertising, public relations and law, and in which to learn the complexities and peculiarities of readership communities.”[12] And for students who plan to pursue careers in journalism, having hands-on experience in working for a newspaper, either through internships or working for student papers, is seen as a résumé-builder.[13]

While some may argue that there are significant differences between professional papers and campus papers, previous research found little difference in qualities such as readability, thoroughness and story interest. That research, however, did not assess other measures of “quality,” such as “story accuracy, the balanced use of sources within a news story, the relative importance and placement of articles, the effect of packaging writing with graphics.”[14]

One possible way to examine story accuracy is through the use of corrections. Corrections and clarifications are one way that journalists are able to demonstrate to readers that they care about being accurate. In fact, The New York Times’ decision to print more corrections “may have improved that newspaper’s reputation for fairness and accountability.”[15] This effort is particularly important, as the public’s trust in the news media has continued to diminish. In fact, all forms of news media have reported double-digit declines in reported “believability” between 2002 and 2012.[16] Previous research that compared credibility in print and online products also found that “content credibility of both platforms is problematic.”[17]

Theoretically, media and newspapers serve a vital function in the process of democracy. Silverman of “Regret the Error” noted, “the press plays an essential role in the flow of critical information that affects every part of our lives.”[18] According to democratic theory, “What people know, the accuracy and extent of their understanding, bears directly on their ability to function as citizens.”[19] This would suggest that when information is faulty, media outlets have an obligation to print corrections providing accurate information. Student newspapers, like their professional counterparts, will make mistakes. Arguably, as learning environments, student newspapers have even more potential to contain errors. As a logical extension, then, student newspapers should also have more corrections.

Getting things right should be at the forefront of all journalists’ minds. Accuracy and credibility are strongly linked. As the presence of corrections has been shown to cultivate a good relationship with readers,[20] it may be beneficial for college newspapers to use corrections to further enhance their reputations.

This research seeks to explore whether campus newspapers are, in fact, using corrections in their publications. Specifically, as it has been documented that not all professional newspapers have transferred their correction practices online,[21] this research seeks to examine the presence and use of corrections in the online version of college newspapers. As such, the researchers put forth the following research questions:

  • RQ1: Do college media websites provide information about: (a) corrected errors, (b) contact information, (c) funding structures, and (d) advertising?
  • RQ2: How do corrections on college media websites differ in terms of: (a) type, (b) objectivity, and (c) impact?
  • RQ3: Do characteristics of college media websites affect the likelihood of published corrections?

Methods

Sample and General Procedures

The researchers conducted a content analysis of college newspaper websites. All of the websites belonged to schools or publications that had at least one faculty member or media adviser listed in the College Media Association directory. After removing duplicates, there were more than 500 college newspapers. The authors then removed any newspapers that did not have websites and then any websites that did not have search functions. Ultimately, the researchers coded 419 college newspaper websites. The unit of analysis was the website.
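
The sampling procedure can be thought of as a simple filtering pipeline. The sketch below is a minimal illustration only, not the authors’ actual workflow; the file name and column names (cma_directory.csv, newspaper, website_url, has_search) are hypothetical.

```python
# A minimal sketch (assumed, not the authors' code) of narrowing the sampling frame.
import pandas as pd

directory = pd.read_csv("cma_directory.csv")  # hypothetical export of the CMA directory

# Remove duplicate publications, then drop papers without websites
# and websites that lack a search function.
sample = (
    directory
    .drop_duplicates(subset="newspaper")
    .dropna(subset=["website_url"])
    .query("has_search == True")
)

print(len(sample))  # the study's final coded sample was 419 websites
```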

Coding and Intercoder Reliability       

The websites were coded based on a codebook developed by the primary researcher. Aspects of a previous codebook used to assess corrections at The New York Times were used to code the corrections on the student newspapers’ websites.[22] Two authors coded the first 120 websites, or about 29% (which falls within the range of content units needed for reliability tests[23]), and the primary researcher coded the remaining sites. All of the websites were coded for 17 factors. Intercoder reliability was calculated using Cohen’s Kappa (see Table 1). Of the 17 factors, all but one had strong Kappa values of 0.8 or higher. However, some factors were found to be more subjective. The categories with discrepancies are discussed below.
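
For readers unfamiliar with the statistic, the sketch below shows how Cohen’s Kappa can be computed for a single coding variable; it is an illustration only, and the two coders’ values are invented rather than drawn from the study’s data.

```python
# A minimal sketch of computing Cohen's Kappa for one coding variable.
# In the study, agreement was assessed over the first 120 double-coded websites.
from sklearn.metrics import cohen_kappa_score

# Invented codes for a variable such as contact information
# (1 = Yes, 2 = No, 3 = Unknown), one value per website.
coder_a = [1, 1, 2, 1, 3, 1, 2, 1, 1, 1]
coder_b = [1, 1, 2, 1, 1, 1, 2, 1, 1, 1]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's Kappa = {kappa:.3f}")  # values of 0.8 or higher were treated as strong agreement
```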

Contact information. This measure indicated whether the website had contact information such as email addresses or phone numbers and was coded 1-3 (1 = Yes, 2 = No, 3 = Unknown). The intercoder reliability for this measure was high (k = .812). The codebook did not specify whether a contact “form” counted as a means of contact; it indicated only that coders should look for email addresses and phone numbers. One coder also counted contact forms when these were the only means of contact available.

Accuracy. This measure documented whether there was a statement regarding accuracy or ethics on the site. It was coded 1-2 (1 = Yes, 2 = No). The intercoder reliability for this measure was high (k = .816). While some publications had language as explicit as the statement from The Daily Illini that opens this research, other “language” was as simple as The Bucknellian’s policy on free speech, which states, in part, that the school supports free speech and as such, relies “on the good judgment of Bucknell students to follow journalistic ethical guidelines, good taste and compassion.”

Funding. As much of the existing literature addresses the funding of student newspapers, each website was coded for information regarding how the newspaper is funded. This measure was coded 1-5 (1 = Independent, supported through advertising; 2 = Supported through student fees, such as a media fee; 3 = Supported by student government funding; 4 = Supported through the institution or produced by a class, a laboratory paper; 5 = No information available on website, unknown). The intercoder reliability for this measure was high (k = .825). However, this proved to be a very difficult measure to code. Few publications disclose this information on their websites. Many publications use a hybrid model of some institutional support and some advertising revenue, or else label themselves as “independent” without distinguishing whether that means editorial or financial independence; therefore, many newspapers had to be coded as unknown.

Advertising. As it was so difficult to determine where many college newspapers got their funding, the researchers added a measure for advertising. It was coded 1-2 (1 = Yes, 2 = No). The intercoder reliability for this measure was fair to high (k = .759). This measure was also difficult to code. Occasionally, the only evidence that a newspaper accepted advertisements was the presence of an “advertising sales representative” on the staff list.

The subjective nature of other measured variables — such as the objectivity or subjectivity of corrections, the type of error, and the impact of error — has already been documented.[24] The full list of categories and their options can be seen in Table 2.

Results

Sample description.

Table 2 shows descriptions of the coded college media websites. Most of the schools were located in the South (n = 147), and the category with the fewest schools was international (n = 3). The West was the least represented U.S. region. Most of the schools were four-year programs (n = 347), and the remainder (n = 70) were two-year programs. The majority of schools (n = 391) had some kind of media studies, communication or journalism program. Most of the newspapers were weekly (n = 183), but a good number of the publications did not clearly indicate their publication frequency and were coded as unknown (n = 61). About 84% of the papers (n = 354) were up to date based on their publication frequency.

RQ1: Information on college media websites

(a) Corrected errors. More than half of the websites coded had at least one correction that could be found using the websites’ search functions (n = 237). However, only 6.2% (n = 26) of the websites had a correction for their most recent issue. The largest group of corrections (n = 88) was more than one year old.

(b) Contact information and statements. Most of the websites (n = 374) did have contact information such as a phone number or email address. However, only 17.6% (n = 74) had a statement or language referencing accuracy or ethics on their websites. Only 5.2% (n = 22) had a link that directed readers to a page containing information about how to submit a correction, an error archive, or a policy.

(c) Funding structures. The overwhelming majority of college newspapers (71.1%, n = 298) did not disclose their funding structure. Of those that did report their financial information, just under 10% (n = 41) indicated that the publications were financially independent, and 8.3% (n = 35) described themselves as laboratory papers. A smaller number of student newspapers (n = 10) got support through student fees (2.4%), and 1.7% (n = 7) reported getting some financial assistance from student government.

(d) Advertising. While there was little information about the source of all the publications’ funding, the majority of college newspapers, or 71.4% (n = 299), did have some information regarding advertising on their websites. This suggests that the majority could have earned income through advertising, but does not make clear to what extent the advertising supports the publications.

RQ2: Types of corrections on college media websites

(a) Type. Of the errors/corrections that were coded, most did not fall under Tillinghast’s original 14 categories and had to be classified as “other” (n = 63).[25] An example of an “other” correction appeared in Ithaca College’s student newspaper, The Ithacan. The correction reads, “The original story said that Toibin came to speak to students in the Ithaca College Honors Program, but he came as a visitor for the Ithaca Seminar Program, including the Honors Program.” This correction could be classified as a clarification, which has been suggested as a possible addition to Tillinghast’s original categories.[26]

The most common error after “other” was “names” (n = 38) followed by “other numbers” (n = 30) and “over emphasis” (n = 21).

(b) Objectivity. Most of the errors coded were objective errors of fact (n = 211).

(c) Impact. The majority of errors were coded as “low-impact” (n = 195). During the coding process, the researchers had to amend the codebook for the categories of objective/subjective, type and impact. Nine publications indicated that an error had occurred but provided no additional information, such as what the mistake was or how it happened. For example, on an article in the Loyola Phoenix, the student newspaper of Loyola University Chicago, a correction read, “Editor’s note: This version of the article has been updated from the version that appeared in print Wednesday, Feb. 27, in order to reflect corrections to the article. The Phoenix regrets these errors.”

RQ3: Presence/Absence of Published Corrections

Logistic regression was used to analyze the influence of seven publication and university characteristics (frequency, funding, type of school, degree offered, presence of accuracy statement, presence of ads, presence of correction link) on whether the website published corrections (Nagelkerke R2 = .26, omnibus model χ2 = 85.62, p < .001).

Frequency was positively related to publishing corrections, Wald = 38.27, p < .001. Specifically, newspapers that publish daily (b = -2.83, SE = .59, Wald = 23.10, p < .001), semi-daily (b = -1.78, SE = .49, Wald = 13.18, p < .001), and weekly (b = -.90, SE = .34, Wald = 6.88, p = .009) were significantly more likely to publish corrections. (Note: Publishing corrections was coded as 1 and not publishing was coded as 2; therefore, the negative beta weight indicates a greater likelihood of publishing.)

Funding was also positively related to publishing corrections, Wald = 10.36, p = .035. Specifically, newspapers that are independent (b = -.75, SE = .43, Wald = 3.04, p = .08) and those that are supported with student government fees (b = 2.33, SE = 1.19, Wald = 3.84, p = .05) were more likely to publish corrections.

Type of school was negatively but not significantly related to publishing corrections, b = .10, SE = .35, Wald = .09, p = .77. Degree offered was negatively but not significantly related to publishing corrections, b = .53, SE = .50, Wald = 1.11, p = .29. Presence of an accuracy statement was positively but not significantly related to publishing corrections, b = -.41, SE = .33, Wald = 1.56, p = .21. Presence of ads was positively but not significantly related to publishing corrections, b = -.24, SE = .27, Wald = .79, p = .38. Presence of correction link was not significantly related to publishing corrections, b = .02, SE = .60, Wald = .001, p = .98.
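
For illustration, the sketch below shows how a comparable binary logistic regression could be specified in Python with statsmodels. It is not the authors’ analysis script; the data file and column names are hypothetical, and the outcome is recoded to 0/1 so that positive coefficients indicate a greater likelihood of publishing corrections (the reverse of the 1/2 coding noted above).

```python
# A minimal sketch, assuming a hypothetical coded dataset, of a binary logistic
# regression with the seven publication and university characteristics as predictors.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("college_media_websites.csv")  # hypothetical file of coded websites

# Recode the outcome: in the study, 1 = corrections found, 2 = none found.
df["published"] = (df["corrections_found"] == 1).astype(int)

model = smf.logit(
    "published ~ C(frequency) + C(funding) + C(school_type) + C(degree)"
    " + C(accuracy_statement) + C(ads) + C(correction_link)",
    data=df,
).fit()

print(model.summary())   # coefficients, Wald-type z statistics, and p-values
print(model.prsquared)   # McFadden pseudo R-squared (the paper reports Nagelkerke R2)
```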

Summary

In response to RQ1, most college media websites had at least one correction, most provided contact information, and most included advertising information. However, most did not include statements about accuracy or ethics, most did not disclose their funding structure, and only 5% linked readers to information about how to submit a correction. Additionally, in response to RQ2, most corrections were objective, low impact, and “other.”  Finally, in response to RQ3, college newspapers that were most likely to publish corrections were those that published more frequently and those that were independent or funded by student government fees.

Discussion

Interpretation

The goal of this study was primarily to document the use of corrections on college newspaper websites. Are college publications adequately publicizing their errors and corrections? And do these practices depend on publication and university differences? Based on this study, yes, some college publications are adequately publicizing corrections, and, yes, these practices seem to depend on publication and university differences.

The first main finding of this study is that researchers could find corrections in just over half of the websites examined. This means that almost half (43%) did not have easily identifiable corrections. As it is difficult to believe that these student publications are perfect, this observation yields additional questions. Are college journalists fixing mistakes without acknowledging them (i.e., scrubbing)? Are these publications publishing corrections in their print editions? Do they have policies regarding how to address errors? As discussed earlier, professional news organizations have been inconsistent in their online corrections policies; based on this study, it appears that college publications have been inconsistent, as well.

Another commonality between professional and college newspapers is the similarity in kinds of corrections and their impact. In this study, most corrections were objective, low impact, and “other.” Previous research has documented similar patterns for corrections at The New York Times.[27] This was mirrored in the corrections coded on the college newspaper websites, which suggests similarities between the publication types.

Interestingly, the findings seem to suggest that the more professional a student newspaper is, based on its publication schedule and its financial independence, the more likely it is to use corrections. It is important to note, however, that these student publications all have faculty members or advisers who are members of the College Media Association; this could mean that the publications in this study are already more professional than other student media outlets.

During the coding process, the researchers also observed that many college newspapers lacked information about themselves, which suggests deeper issues with transparency. Nearly 15% (n = 61) of the websites coded provided no way for the researchers to determine the frequency of publication. Among those whose frequencies could be documented, the researchers were often forced to use advertising information to determine print schedules. Additionally, almost 10% of the websites (n = 41) failed to provide any means of contact, such as a phone number or email address. This lack of transparency was also noted in the lack of information about publications’ funding—nearly three-quarters of the websites coded did not reveal their financial situations. The lack of transparency in errors and corrections, then, could be seen as part of a larger lack of transparency across the publication.

Practical Implications

This research suggests that college newspapers are similar to professional publications in terms of the types of errors they correct and in terms of their less-than-vigilant approach to chronicling errors online. As responding to mistakes promotes credibility,[28] it may be in the best interest of campus publications to re-establish their corrections policies, especially online.

To fully serve their democratic function, student newspapers have just as much of an obligation to publish corrections as their professional counterparts do. As many students ultimately prefer their campus publications for community news,[29] college newspapers must strive to provide accurate information and corrections whenever necessary.

Limitations and Future Research

This research is limited in that it provides only a snapshot of college newspapers. Not all college newspapers are members of the College Media Association and, as such, did not have the potential to be included in this sample. Additionally, the regions are not equally represented, with more schools in the southern region of the United States being included than schools from other regions. This may reflect the regional reach of the College Media Association, which was based at Vanderbilt University in Tennessee when the sample was generated. Therefore, this research is not representative of college newspapers in the United States.

The finding that few websites were transparent with information about the publication was interesting in itself; however, the current research is limited by the lack of information about funding structures. Moreover, upon review, the researchers noted potential overlap between the categories of student government funding and student fee funding. Future research should more thoroughly investigate student newspaper funding and the reasons for other areas of missing data, such as publication frequency, by obtaining more information through an interview-based study.

Additionally, this data does not suggest that college newspapers are not making or correcting errors; it simply shows that they are not publicizing those errors or corrections on their websites. As discussed earlier, professional online news organizations have been inconsistent in their online corrections policies. Some are simply correcting the error without acknowledging it was there in the first place (i.e., “scrubbing”), while others are correcting the mistake as a note on the original page. It is, therefore, likely that this is happening on college newspaper websites, as well. This study, then, is meant to show the preponderance, or lack thereof, of official, publicized corrections on news websites; because of the inconsistency in online corrections policies, this study cannot make claims about the number of published mistakes, corrected or otherwise. A future study that monitored individual articles for revisions and updates could begin to address this concern, but such studies will still be limited until online publications establish consistent methods for acknowledging and correcting errors.

This research does establish a starting point for research about accuracy and the use of corrections at college newspapers. Future research should compare print and Web editions of campus newspapers. It is possible newspapers are using corrections more frequently than this research documented, but that the corrections have not been transferred online. Additionally, other research may wish to examine the prevalence of student newspaper websites in the South. Other studies could also examine handbooks or policy manuals for college newspapers to see if they have policies or procedures in place for handling errors.

Despite limitations, what this study does show is that certain publications are more likely to publish corrections than others. Although the presence of corrections may or may not correlate with the presence of errors, it does indicate a focus on quality and transparency. This study’s findings indicate that, on the whole, college newspapers’ websites still have work to do in increasing quality and transparency through the publication of corrections.

Appendix

Table 1

Intercoder Reliability for 17 coding variables

[Table 1 not reproduced here]

*Intercoder reliability using Cohen’s Kappa was calculated for two coders for about 29 percent of the overall sample (n = 120).  Total N = 419 college media websites.

Table 2

Frequency and valid percent statistics for 17 coding variables

[Table 2 not reproduced here]

Note: N = 419 college media websites.


  1. Kirstie Hettinga, “Typing Corrections: Examining Corrections and Their Role in Democratic Theory,” in Conference Paper — Association for Education in Journalism and Mass Communication (St. Louis, MO., 2011).
  2. Scott R. Maier, “Setting the Record Straight,” Journalism Practice 1, no. 1 (January 2007): 33–43.
  3. Kirstie Hettinga, “Best Practices for Corrections at Online Newspapers” (International Communication Association, Phoenix, AZ, 2012).
  4. Craig Silverman, “Scrubbing Away Their Sins,” Columbia Journalism Review, December 5, 2008, http://www.cjr.org/behind_the_news/scrubbing_away_their_sins.php?page=all.
  5. Allie Grasgreen, “College Newspapers Turn to Student Fees for Funding,” Inside Higher Ed, accessed October 26, 2013, http://www.insidehighered.com/news/2013/04/26/college-newspapers-turn-student-fees-funding.
  6. Eric Fidler, “Campus Newspapers: Hard Times, Hard Choices,” Gateway Journalism Review 42, no. 326 (Spring 2012): 12–24.
  7. Wanda Brandon, “Experiential Learning: A New Research Path to the Study of Journalism Education,” Journalism & Mass Communication Educator 57, no. 1 (Spring 2002): 59–66.
  8. Kuldip R. Rampal, “Department-Run Campus Newspaper Has Definite Educational Advantages,” Journalism Educator 37, no. 1 (Spring 1982): 48–50.
  9. Tammy Merrett, “Administrators Move to Control Collegiate Press,” St. Louis Journalism Review 37, no. 297 (July 2007): 22–26.
  10. Shaniece Bickham and Jae-Hwa Shin, “Who’s in Control of Student Newspapers? An Analysis of Influences, Self-Censorship, and Censorship of Content,” Conference Papers — International Communication Association, Annual Meeting 2010, 1.
  11. Derigan A. Silver, “Policy, Practice and Intent: Forum Analysis and the Uncertain Status of the Student Press at Public Colleges and Universities,” Communication Law & Policy 12, no. 2 (Spring 2007): 201–30.
  12. David C. Nelson, “Making the Most of Your College Paper,” Journalism & Mass Communication Educator 43, no. 39 (1988): 39–41.
  13. Shawn M. Neidorf, “Wanted: A First Job in Journalism-An Exploration of Factors That May Influence Initial Job-Search Outcomes for News-Editorial Students,” Journalism & Mass Communication Educator 63, no. 1 (Spring 2008): 56–65.
  14. John V. Bodle, “The Instructional Independence of Daily Student Newspapers,” Journalism & Mass Communication Educator 51, no. 4 (Winter 1997): 16–26.
  15. Neil Nemeth and Craig Sanders, “Number of Corrections Increase at Two National Newspapers,” Newspaper Research Journal 30, no. 3 (Summer 2009): 90–104.
  16. “Further Decline in Credibility Ratings for Most News Organizations,” Pew Research Center for the People and the Press, August 16, 2012, http://www.people-press.org/2012/08/16/further-decline-in-credibility-ratings-for-most-news-organizations/.
  17. Gregg A. Payne and David M. Dozier, “Readers’ View of Credibility Similar for Online, Print,” Newspaper Research Journal 34, no. 4 (Fall 2013): 54–67.
  18. Craig Silverman, Regret the Error: How Media Mistakes Pollute the Press and Imperil Free Speech (Sterling Publishing Company, Inc., 2007).
  19. Jeffrey Scheuer, The Big Picture: Why Democracies Need Journalistic Excellence (Routledge, 2008).
  20. Neil Nemeth and Craig Sanders, “Number of Corrections Increase at Two National Newspapers,” Newspaper Research Journal 30, no. 3 (Summer 2009): 90–104.
  21. Kirstie Hettinga, “Typing Corrections: Examining Corrections and Their Role in Democratic Theory,” in Conference Paper — Association for Education in Journalism and Mass Communication (St. Louis, MO., 2011).
  22. Kirstie E. Hettinga and Alyssa Appelman, “Corrections of Newspaper Errors Have Little Impact,” Newspaper Research Journal 35, no. 1 (Winter 2014): 51–63.
  23. Stephen Lacy and Daniel Riffe, “Sampling Error and Selecting Intercoder Reliability Samples for Nominal Content Categories,” Journalism and Mass Communication Quarterly 73, no. 4 (Winter 1996): 963–73.
  24. Kirstie E. Hettinga and Alyssa Appelman, “Corrections of Newspaper Errors Have Little Impact,” Newspaper Research Journal 35, no. 1 (Winter 2014): 51–63.
  25. William A. Tillinghast, “Source Control and Evaluation of Newspaper Inaccuracies,” Newspaper Research Journal 5, no. 1 (Fall 1983): 13–24.
  26. Kirstie E. Hettinga and Alyssa Appelman, “Corrections of Newspaper Errors Have Little Impact,” Newspaper Research Journal 35, no. 1 (Winter 2014): 51–63.
  27. Kirstie E. Hettinga and Alyssa Appelman, “Corrections of Newspaper Errors Have Little Impact,” Newspaper Research Journal 35, no. 1 (Winter 2014): 51–63.
  28. Neil Nemeth and Craig Sanders, “Number of Corrections Increase at Two National Newspapers,” Newspaper Research Journal 30, no. 3 (Summer 2009): 90–104.
  29. Bill Krueger, “Students Prefer Printed College Newspapers over Online,” Poynter, n.d.

Kirstie Hettinga, Ph.D., is an assistant professor of communication at California Lutheran University. She teaches courses in media writing and editing and serves as the faculty adviser to Cal Lutheran’s student newspaper, The Echo. Her research addresses issues of credibility and transparency in news media. Her work has previously been published in Newspaper Research Journal, Journal of Mass Media Ethics, and Media Ethics. She earned her bachelor’s degree in print journalism and master’s degree in mass communication from California State University, Fresno and her doctorate in mass communication from The Pennsylvania State University.

Rosemary Clark is a Ph.D. student at the University of Pennsylvania’s Annenberg School for Communication. Her research traces how feminists in the United States have used traditional and digital media as sites of resistance across the movement’s history. She has an MA in communication from the University of Pennsylvania and a BA in media and communication studies from Ursinus College.

 

 

 

Alyssa Appelman, Ph.D., is an assistant professor in the Department of Communication in the College of Informatics at Northern Kentucky University, where she teaches courses in journalism and mass communication. Her research focuses on journalism and media effects. She examines the cognitive and perceptual effects of news style, and she is particularly interested in the ways news conventions affect knowledge-gain and credibility perceptions. She received her bachelor’s and master’s degrees in journalism from the University of Missouri-Columbia and her doctoral degree in mass communication from The Pennsylvania State University.