Three hundred and fifty years after his birth, the work of Irish satirist Jonathan Swift continues to enjoy great popularity among contemporary readers. Library data tells us that Swift is the most popular Irish author, and that the work for which he is best known, Gulliver’s Travels, is the most popular work by an Irish author in world literature.
“Gulliver’s Travels belongs not just to Irish literature, but to world literature and its relevance only increases over time,” said Dr. Aileen Douglas, Head of the School of English at Trinity College Dublin, in the Irish Times last week. Dublin is marking the 350th anniversary of Swift’s birth with its Swift350 celebration throughout 2017.
Swift, who was born in Dublin in 1667, published Gulliver’s Travels in 1726. The work is now held by more than 40,000 libraries worldwide. Overall, Swift’s works account for nearly 240,000 library holdings worldwide.
Rounding out the top five most popular works by an Irish author are Dracula by Bram Stoker; The Vicar of Wakefield by Oliver Goldsmith; The Picture of Dorian Gray by Oscar Wilde; and Artemis Fowl by Eoin Colfer. Oscar Wilde, Eve Bunting, George Bernard Shaw and Oliver Goldsmith follow Swift in the ranking of the top five most popular Irish authors.
Our research also revealed that Eoin Colfer is the most popular contemporary Irish author, and that Van Morrison’s Astral Weeks is the most popular Irish musical work.
These findings are derived from WorldCat, a union database of the catalogs of thousands of libraries around the world. We define popularity in terms of library holdings—the number of appearances by an author or work in library collections worldwide. These library collections are where world literature is stewarded and defined.
The findings on Irish authors are part of OCLC Research’s continuing work exploring cultural patterns and trends through library bibliographic and holdings data. Published materials are an important way countries project their cultural, intellectual, literary and musical traditions.
OCLC Research has developed methods for identifying a national presence in the published record, encompassing materials that are published in, are authored by people from, and/or are about a particular country. Earlier studies have applied these methods to Scotland and New Zealand.
The next study, focused on Ireland, will be released later this year. Lorcan Dempsey, OCLC Chief Strategist and Vice President, Membership and Research, presented preliminary findings from the Irish study at the recent CONUL annual conference in Athlone, Ireland. CONUL is a consortium of Ireland’s main research libraries.
These studies reveal the importance of library data, not just as an organizational tool to track and find library resources, but also as a research resource that can be studied and analyzed in the aggregate—as the collective collection of the world’s libraries.
Dublin’s Swift350 celebration includes special exhibits at the Library of Trinity College Dublin and the Dublin City Public Libraries. An international conference this week, June 7–9, 2017 at Trinity College Dublin, focuses on Swift and his work.
The post Gulliver’s Travels – the most popular Irish work by the most popular Irish author in world literature appeared first on OCLC Next.
Facial recognition technology is coming to an airport near you. While this may seem like we’re now living in the future, this technology already exists in Google Photos, and it’s being implemented in a new security camera from Nest. It’s beginning to hit the mainstream. Will library cards be replaced with facial recognition software? At the rate things are going, we may well see that in our lifetime.
From the Ohio Web Library:
Some thoughts on the role information plays in the response to recent attacks.
I just realized that I haven’t shared my slides from LOEX a few weeks ago. So, scroll to the bottom for the complete slide deck. Of course, it would help to have some context to understand what was going on. So here are my presentation notes, slightly reworked for this post. Also, there were a couple of caveats I made during the presentation:
Anyways. Let’s begin…

Introduction
Everyone is talking about fake news these days when, in fact, fake news isn’t really new at all. Fake news is simply a means (among many) of monetizing confirmation bias. And it’s been with us for as long as there’s been news. Cicero complains about fake news in De Oratore. The yellow journalism of the 1890s started a major war. The tabloids in the supermarket won’t leave Bat Boy alone. My contention is that the existence of fake news is not the problem we should be focusing on. I mean, it is a problem, but the more salient problem we’re dealing with is the rhetoric of fake news: the spread of a deep mistrust of traditional media coupled with the valorization of motivated reasoning. Otherwise known as post-truth.

Post-truth
We know that trust in the media is at an all-time low, and the current rhetorical climate is to blame. When the President of the United States dismisses every criticism as “fake news” and urges his followers to reject any news that doesn’t glorify his (objectively terrible) regime, we’re looking at post-truth. The massive deregulation of the 1990s (cf. the Telecommunications Act of 1996) exacerbated nascent market forces in the media industry, leading media companies to slowly trade journalistic integrity for shareholder value. The distinction between professional journalist and partisan hack slowly disintegrated, the amateur enthusiast was elevated to “speak truth to power,” and readers were circumscribed into filter bubble after filter bubble. Remember how participatory journalism was supposed to save the media? Fifteen years later, the new lingo is “don’t read the comments.” Journalism is incorporated, information is democratized, the expert is dead. It’s all post-truth. And if it feels liberating, it’s not. As Habermas warned, the existence of an independent, trustworthy news media is essential to the proper functioning of a democratic society (assuming that’s what you’re into). We’ve only got two ways out: (1) make the news media more trustworthy and (2) help people understand the social nature of that trustworthiness. I’ll set aside the first, only because it’s a different conversation; however, I do want to argue that librarians can help with the second…but it’s not going to be with LibGuides. We can only help if we pay attention to the underlying cognitive processes that lead to the post-truth mindset.

Post-Truth Psychology
There are dozens of cognitive biases that contribute to the post-truth mindset. I’ll just focus on a few:
These and other cognitive biases show up in the library classroom. Students are motivated to find articles that back up their thesis statement rather than articles that may help guide and refine their thesis. Students are more skeptical of the traditional news sources that their professors may prefer (e.g., the NYT or WaPo). Students sometimes lack the domain knowledge needed to make accurate judgments about an information source. Students may react negatively when exposed to potentially controversial topics. All this and more. So what should we do?

Post-Truth Pedagogy
If we want to help students learn to evaluate popular sources (i.e., news media), then we have to do our best to avoid triggering the cognitive biases that contribute to the post-truth mindset. Surprisingly, there is virtually nothing in the library literature on avoiding confirmation bias in the classroom. Thankfully, there are hundreds, if not thousands, of articles in other disciplines, going back decades in some cases. Here are a few ideas that have been proven to work:

Timing matters
Cognitive biases set in very quickly and the effectiveness of an instruction intervention decreases over time (Tetlock & Kim, 1987). When working with novice learners, try to get instruction on information evaluation in as early as possible, preferably before they have settled on a research question.

Accountability matters
Students show more integrative, complex reasoning when asked to justify their decisions to someone else (Druckman, 2012; Taber & Lodge, 2006; Tetlock, 1983). So, give them as much opportunity as possible to explore their decisions with their peers. Develop activities and mechanisms for peer instruction. Put another way, lecturing about how to evaluate information is less successful than asking students to explain their search decisions to a peer. When students feel empowered and take ownership over their own search and evaluation behaviors, they are more receptive to suggestions for improvement.

Google is not our enemy
Okay, raise your hand if you use a large, multi-subject database to introduce students to searching for popular sources. I think lots of us do. I used to. And then one day I realized something. When I wake up in the morning, head downstairs, make breakfast for the kids, and then sit down to read the morning news, I never say to myself, “all right, let’s fire up ProQuest.” Hell no. I go to Facebook. Google News. Washington Post. Local paper. RSS feeds. Twitter. I never get my news from a database and neither do our students. Nor should they. I want to teach students how to evaluate information in a natural setting, not the artificial setting of a database. So many markers of credibility are stripped out when news gets repackaged for databases: size and placement on the original page, comments, images, related articles, and so on. So, we examine Google, talk about algorithms, etc.

Avoid controversial topics
Polarizing examples encourage defensive rationalization and directional reasoning. This has been proven over and over again. It’s the way we are. Just in terms of the neuroscience, the way I get agitated when I read something praising Trump is the same way a Trump-supporter gets agitated when they see criticism of Trump. The amygdala lights up, adrenaline spikes, and it’s fight-or-flight. When that happens, it can be harder for some students to focus on the lesson’s objectives. Things like curiosity, openness to new ideas, empathy, critical thinking…these are all affected by strong emotional responses. This is why I don’t stand in front of a class and say “all right, let’s all Google abortion” or “how credible is this article on rape culture” or “let’s come up with synonyms for white supremacy.” These implicitly force students to take a side and focus on the issue rather than on the broader critical thinking skills that are the actual focus of the lesson. This isn’t to say that students shouldn’t explore complex social issues; this is just to say that a 50-minute library session for a Freshman composition class isn’t the best place to do it. Or at least, it’s not the place to do it so blatantly. Save it for a class that’s explicitly addressing those topics. Otherwise it comes across like crass, #critlib virtue-signaling. Speaking of which. While I think we ought to avoid using polarizing topics directly, there is a place for indirect engagement with complex social issues. When I teach students about Google, I use an activity almost identical to this one by Jacob Berg and we talk about AdWords and PageRank and SEO and bias and the under-representation of marginalized voices and so on. It’s an approach built upon leading students to uncover these things organically. Like, giving them the cognitive tools to uncover biases on their own. 
Berg uses Safiya Noble’s example of searching Google Images for ‘beautiful women’ and asking students to discuss why all of the photos are of white women. I sometimes do the same thing, only with the phrase ‘successful person’ and students discuss why it’s mostly photos of corporate white guys in suits. Again, it’s a Socratic, indirect approach (“Why are there no people of color in the results?”) rather than a top-down, direct approach (“Let’s evaluate these results for #blacklivesmatter”).

Rethink reliability
But wait! You’re still talking about searching. When do you get to the information evaluation part? When do we get to use the CRAAP test? The answers are: “you already have” and “never.” Rather than leave information evaluation to a silly mnemonic that gets slapped on after you find an article, I think evaluation should be built into the entire search process. And it all comes down to distinguishing between reliability and usefulness. Let’s take a look at the CRAAP test. Here’s an article I used as an example in my presentation:
Is it any good? Let’s CRAAP test it.
The idea here is that the CRAAP test makes a lot of epistemological assumptions that obscure just how difficult evaluation really is. Another idea is that the CRAAP test has nothing to do with reliability: it’s a test for the usefulness of an article. Reliability comes from somewhere else. Importantly, reliability is a property of information sources, not of information itself. Authors, publishers, news outlets, etc. can be reliable; single articles cannot.
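This source-level notion of reliability can be made precise probabilistically: a source is reliable to the degree that its reports tend to be true, and Bayes’ rule tells us how much a report from such a source should shift our confidence in a claim. Here is a minimal sketch in Python; the reliability figures are purely illustrative, not measurements of any real outlet:

```python
def posterior_credence(prior, reliability):
    """Update our credence in a claim after a source reports it, via Bayes' rule.

    prior:       initial probability that the claim is true
    reliability: probability the source reports a claim given that it is true;
                 we assume, for simplicity, that it reports false claims
                 at rate 1 - reliability
    """
    # Total probability that the source reports the claim at all
    p_report = reliability * prior + (1 - reliability) * (1 - prior)
    # Bayes' rule: P(true | reported) = P(reported | true) * P(true) / P(reported)
    return reliability * prior / p_report

# A claim we initially treat as a coin flip (prior = 0.5):
print(round(posterior_credence(0.5, 0.9), 3))   # high-reliability source → 0.9
print(round(posterior_credence(0.5, 0.55), 3))  # barely-better-than-chance source → 0.55
```

A report from a 90%-reliable source lifts a 50/50 claim to 0.9 credence, while a barely-better-than-chance source scarcely moves it: reliability is a matter of degree, which is exactly why it attaches to sources rather than to individual articles.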
Basically, an information source is reliable (credible) to the extent that it tends to produce true beliefs. I highly recommend checking out Alvin Goldman’s work on reliabilism for the full details. But the short version is that, while no information source is perfect, some information sources are more likely to lead to true beliefs. Take a look at this 2012 poll from Fairleigh Dickinson University:
Sure, all media are biased. But some media are more biased than others, and some media are more likely to yield true beliefs. You can measure it. People who watch FOX News display less political awareness than people who listen to NPR. It’s not that FOX News is unreliable and NPR is reliable; it’s a matter of degree. Just like a broken clock is right twice a day, FOX News has some factual reporting (but only outside of its morning and evening “entertainment” programming). But, on balance, FOX News has far less factual reporting than the Washington Post or NPR. And this gets back to the question of accuracy. We can sidestep having to become experts ourselves if we allow that we have better epistemic grounds to trust articles from some publications than from others. I didn’t have time in my presentation, and it gets sort of technical, but if you look at reliability through a Bayesian lens, you can start to really dig into related issues like consistency across multiple sources, the likelihood that a source would report something (e.g., how likely is it for FOX to report something that makes Trump look bad), and other things.

Focus on the process
Anyways, the approach I recommend to teaching information evaluation is to get students to the point that they are thinking about evaluation before they even begin searching. Focus on the search process; don’t leave evaluation as an afterthought. Think about it this way. Cognitive biases are often subtle and subconscious. Likewise, information evaluation is occurring during the search process, but it, too, is often subtle and subconscious. And that’s where cognitive biases are going to have the greatest effect. So turn searching into evaluating. Show how Google works and how results are manipulated. Show them things like the ‘site:’ operator. Talk about truth and fairness in reporting. Talk about media ethics. Talk about consistency across multiple sources. Examine how different sources cover the same event. Again, it’s not about asking “is this article credible?” It’s about getting students to ask “given the source of this article, and given the way the issue is reported elsewhere, can I trust what I’m reading?”

Conclusion
Wow. This turned into a 2,500-word rant. Sorry about that. The nutshell is this: simply giving students a bullet-pointed list of “ways to spot fake news” isn’t sufficient; you need to teach in a way that avoids triggering poor cognitive processes. Time instruction correctly. Include accountability; have students justify their search behaviors to their peers. Avoid emotionally charged examples. Move beyond acronyms. Focus on the search process, not the search results. Teaching information evaluation is as much about how you teach as it is about what you teach.

Slides

Sources
Cassino, D., Woolley, P., & Jenkins, K. (2012). What you know depends on what you watch: Current events knowledge across popular news sources. Fairleigh Dickinson University’s Public Mind Poll. Retrieved from publicmind.fdu.edu/2012/confirmed/final.pdf
Druckman, J. N. (2012). The politics of motivation. Critical Review, 24(2), 199-216. doi:10.1080/08913811.2012.711022
Goldman, A. (1999). Knowledge in a social world. Oxford: Oxford University Press.
Gunther, A. C., & Liebhart, J. L. (2006). Broad reach or biased source? Decomposing the hostile media effect. Journal of Communication, 56(3), 449-466. doi:10.1111/j.1460-2466.2006.00295.x
Gunther, A. C., & Schmitt, K. (2004). Mapping boundaries of the hostile media effect. Journal of Communication, 54(1), 55-70.
Habermas, J. (2006). Political communication in media society: Does democracy still enjoy an epistemic dimension? The impact of normative theory on empirical research. Communication Theory, 16(4), 411-426. doi:10.1111/j.1468-2885.2006.00280.x
Habermas, J. (1989). The structural transformation of the public sphere: An inquiry into a category of bourgeois society (T. Burger, Trans.). Cambridge, MA: MIT Press.
Jonas, E., Schulz-Hardt, S., Frey, D., & Thelen, N. (2001). Confirmation bias in sequential information search after preliminary decisions: An expansion of dissonance theoretical research on selective exposure to information. Journal of Personality and Social Psychology, 80(4), 557-571. doi:10.1037/0022-3514.80.4.557
Kaplan, J. T., Gimbel, S. I., & Harris, S. (2016). Neural correlates of maintaining one’s political beliefs in the face of counterevidence. Scientific Reports, 6.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134. doi:10.1037/0022-3514.77.6.1121
Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480-498.
Kuran, T., & Sunstein, C. R. (1999). Availability cascades and risk regulation. Stanford Law Review, 683-768.
Lavine, H. G., Johnston, C. D., & Steenbergen, M. R. (2012). The ambivalent partisan: How critical loyalty promotes democracy. New York: Oxford University Press.
Lenker, M. (2016). Motivated reasoning, political information, and information literacy education. portal: Libraries and the Academy, 16(3), 511-528.
Schaffner, B., & Luks, S. (2017). This is what Trump voters said when asked to compare his inauguration crowd with Obama’s. The Washington Post. Retrieved from https://www.washingtonpost.com/news/monkey-cage/wp/2017/01/25/we-asked-people-which-inauguration-crowd-was-bigger-heres-what-they-said/?utm_term=.db79a652500f
Seeber, K. (2017). Wiretaps and CRAAP. Retrieved from http://kevinseeber.com/blog/wiretaps-and-craap/
Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755-769. doi:10.1111/j.1540-5907.2006.00214.x
Tetlock, P. E. (1983). Accountability and complexity of thought. Journal of Personality and Social Psychology: Attitudes and Social Cognition, 45(1), 74-83. doi:10.1037/0022-3514.45.1.74
Tetlock, P. E., & Kim, J. I. (1987). Accountability and judgment processes in a personality prediction task. Journal of Personality and Social Psychology: Attitudes and Social Cognition, 52(4), 700-709. doi:10.1037/0022-3514.52.4.700
Vraga, E. K., Tully, M., Akin, H., & Rojas, H. (2012). Modifying perceptions of hostility and credibility of news coverage of an environmental controversy through media literacy. Journalism, 13(7), 942-959. doi:10.1177/1464884912455906
We have daily contact with a lot of information systems. Do we talk about them when we talk about information literacy?
I’ve been working in interlibrary loan for a long time, and the collaboration I see within this group of librarians is amazing. Back in March, we held the first OCLC Resource Sharing Conference with the theme of Sharing Breakthroughs. I see this community sharing all the time, whether at an event, via a listserv or just a simple phone call.
A member-driven program committee helped shape the agenda and librarians provided much of the program content. Thank you to all who presented and participated. It was a great example of how well the resource sharing community works together to share and celebrate our breakthroughs.
We’ve posted all presentation recordings, including a wonderful keynote about storytelling from Todd Babiak, on the conference site. I invite you to view them, share them and think about attending or presenting next year. We’re so happy to announce that the conference will take place March 13–15, 2018, at the Hyatt Regency Jacksonville Riverfront in Jacksonville, Florida, USA.

More ways to connect
It’s a busy time for libraries and for the OCLC Resource Sharing Team. In addition to working with the community on events like the OCLC Resource Sharing Conference, we’re engaging with the community as we build Tipasa and continue to migrate ILLiad libraries to this new ILL management system. We’re also reaching out to our VDX and Navigator libraries as we plan for future migration to Relais D2D. There are many opportunities for you to be involved, share your thoughts and help guide the process forward:
In the meantime, OCLC ILL staff are excited to be presenters at a number of events this summer. We hope you can join us at one or more of them:
Our resource sharing community has nearly 50 years of shared experience, and that is a critical resource for our future. Our membership is also a passionate and engaged worldwide community that, together, fills an ILL request every two seconds. We will take that experience and passion and use it to build cooperative services and resources that better meet the challenges of an increasingly connected, worldwide audience.
Back in January, the American Library Association published a report [pdf] on the activities of school and public libraries aimed at teaching children to write computer code. The report focused on children of school age, but in the marketplace you can find books and educational toys that teach coding to children as young as 3 years old. Is there really an advantage to starting children on coding this early? Does your library have coding books or activities for preschoolers?
Articles from Ohio Web Library:
© A Program of the colleges and universities of Minnesota State