The social media landscape is changing, and we all need better media literacy skills to help us make good choices. In this post, I’ll highlight how the “SIFT” web literacy framework can be helpful as many people start or continue building social connections on Mastodon, a “federated,” open source software platform that is an attractive alternative to Twitter. Even if you are not (yet) on Mastodon or do not plan to join, the media literacy strategies of SIFT can be enormously helpful as we navigate our highly polarized information environment. Here’s the core thesis I want to communicate in this post: We are literally in the middle of a confusing and chaotic information war, in which the social media profiles we encounter may be created and used by a variety of “bad actors” whose agenda is NOT authentic dialog or information sharing. In this environment, SIFT can be an essential media literacy “tool” and strategy in our “toolkits” as citizens, learners, and teachers.

SIFT’s Role in our Polluted Information Landscape

As I write this post on Saturday/Sunday, November 26-27, 2022, Twitter appears to be a dumpster fire headed toward obsolescence.

I sincerely hope this does not happen, because I’ve invested countless hours over the past 16 years building my own Twitter lists, which continue to be a primary way I filter and read my news each day, as well as discover new ideas. However, the mass exodus of Twitter employees following Elon Musk’s firing of half of them upon taking control of the company, along with his decision to grant “general amnesty” to many previously suspended Twitter accounts, seems likely to promote more “harassment, hate speech and misinformation” on the platform and make it a less desirable place for millions of people as well as organizations to share news, ideas and links.

“SIFT” is a web literacy framework developed by Mike Caulfield (@holden), which I’ve been using extensively for the past three years in my middle school media literacy courses. We use SIFT in our media literacy unit on “Froot Loop Conspiracy Theories,” analyzing multiple videos relating to the Apollo Moon landings and the conspiratorial idea that NASA never landed men on the moon between 1969 and 1972, but instead those events were faked on a Hollywood sound stage. SIFT involves four different “moves” or stages:

  1. Stop (Especially when content elicits a strong or emotional response from you)
  2. Investigate the source (including the use of “lateral reading”)
  3. Find better coverage
  4. Trace claims, quotes and media to their original context

SIFT is a web literacy and media literacy strategy which is constantly needed and relevant in our modern information landscape, which is increasingly polluted and fractured. Today, as many “early adopter / innovator” educators (like yours truly) are exploring and using Mastodon and building connections there, SIFT is vital to understand and use. Since the dawn of “web 2.0 technologies” in the mid-2000s, the “culture war” in the United States has intensified to a fever pitch, and it can be perilous both personally and professionally to actively engage in public, online social spaces. I do not share that observation and assertion to intimidate you from building a professional and/or personal network on Mastodon or other platforms, but rather, to highlight our need for both caution and careful intentionality in the ways we engage with other people (many of whom are strangers) as well as bots online.

As social media users, we are living in very interesting times. Since its inception in 2006, Twitter has become a vital platform in our information ecosystem, not because a majority of people in our society have an account and use it, but because most journalists and many celebrities do. Topics which trend on Twitter often find their way into mainstream media, and Twitter is a “speed of thought” platform for both the sharing of legitimate news as well as misinformation / disinformation / malinformation. Many governmental, business and non-profit organizations use Twitter as a primary means of sharing information and current news. In many ways, Twitter has become our shared “town square” for information sharing.

Content Collapse

“Content Collapse” is now a feature of our information landscape, and present when using social media platforms like Twitter and Mastodon. Among other things, content collapse means that the sharer of the information, idea or link does not know and cannot control the “context” of the audience. Therefore, we (and others) can encounter information online for which we lack meaningful background / context about the sharer, their expertise / credentials, and the purpose / intended audience of their message. We also can (and perhaps often do, depending on who we follow / the timelines we watch) encounter information that can be false, misleading, or deceptively biased. We live in the middle of a literal information war, in which state-sponsored bad actors, non-state bad actors, and individual bad actors pour content into an information ecosystem which can seem both confusing and chaotic.

There are multiple effects of “content collapse” when we are browsing social media. One effect, especially when we are viewing a chronological timeline view which is not algorithmically customized for our preferences and liked / followed accounts and posts, can be a diverse array of posts about widely varying topics.

I’ll share a story from this past week on Mastodon, in this attempt to highlight the importance of SIFT and its value in helping us make better choices online.

This was my experience earlier this week: I saw posts about several important but controversial and polarizing topics:

  1. Was Elon Musk’s purchase of Twitter a “triumph” for conservative forces, desiring to silence marginalized, traditionally underrepresented minority voices who have utilized Twitter effectively to organize and educate others? (Mastodon post)
  2. Are advocates of “Christian Nationalism” in the United States abandoning the core teachings and principles espoused by Jesus Christ? (Mastodon post)
  3. Is Twitter, under the leadership and policy changes of Musk, becoming even more dangerous / unsafe for members of vulnerable communities? (Mastodon post)

Enter SIFT.

According to the SIFT web literacy framework, whenever we encounter new information from unfamiliar sources, PARTICULARLY polarizing content which elicits a strong emotional response, our first step is to STOP.

When I initially found the Mastodon posts linked above, my first instinct was to click LIKE or FAVORITE. (The name of that button depends on the Mastodon interface you’re using. My current favorite is the “Toot!” iOS app, thanks to Miguel Guhlin and Sandy Kendell’s recommendations.) However, “interacting in any way” with a social media post on Mastodon (liking / favoriting or boosting / re-sharing) is premature, if using SIFT, because it’s first important to INVESTIGATE THE SOURCE.

SIFT Steps: INVESTIGATE the Source and TRACE to the Original

We “INVESTIGATE the source” and “TRACE to the original” with SIFT to determine who the author is, if they have apparent expertise and professional pedigree in the given topic, and also to ascertain if they appear to be an authentic actor (genuine human being) versus a bot or sock puppet account. If possible, we engage in “lateral reading,” which involves the use of OTHER SOURCES (not just the individual / account’s self-created profile) to learn more about them.

Advanced Google searches (like reverse image searches using TinEye or Google Images) and “exact phrase searches” can be helpful in this investigation phase of SIFT.
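To illustrate the “exact phrase search” technique: wrapping a suspicious quote in quotation marks asks the search engine to match the wording verbatim, which can surface the original context of a claim. Here is a minimal Python sketch of building such a query URL (the function name and sample phrase are my own illustrative choices, not part of the SIFT framework itself):

```python
# Build an "exact phrase" Google search URL. Wrapping a phrase in
# quotation marks asks the engine to match the wording verbatim,
# which helps trace a quote or claim back to its original context.
from urllib.parse import quote_plus

def exact_phrase_search_url(phrase: str) -> str:
    # quote_plus percent-encodes the quotation marks and spaces
    return "https://www.google.com/search?q=" + quote_plus(f'"{phrase}"')

print(exact_phrase_search_url("faked on a Hollywood sound stage"))
# https://www.google.com/search?q=%22faked+on+a+Hollywood+sound+stage%22
```

The same quoted-phrase trick works directly in the search box, of course; the point is simply that verbatim matching narrows results to pages repeating the exact wording.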

Let me preface all of what I am going to share here with a disclaimer: After doing many “due diligence” online searches for these online profiles / people, I still am not 100% positive who is operating these accounts. I was able to confirm the identity of one person, with image searches that identified his city of residence, the church where he attends and is a member, and via a YouTube video that was posted by CBC/Radio-Canada six years ago and featured his eyewitness testimony at a breaking news event.

My overall takeaway from engaging in a LENGTHY and TIME CONSUMING background search on these sources is that this process can be both complex and difficult. “Verifying your source” on Twitter was sometimes easier in the pre-Elon days of verified accounts, but today on Mastodon it’s an even crazier “wild west” of unverified accounts. This presents formidable source verification challenges even when we feel reasonably well equipped with SIFT skills and other techniques for effective online research.

Liking or Re-sharing Equals Endorsement

A number of people on Twitter include a disclaimer on their profile like, “RT <> endorsement,” which means that retweeting (re-sharing) a post should not be understood as an endorsement of those ideas, or that the channel author agrees with all the ideas / points of view included in the re-shared post. This social media profile disclaimer is false, however, IMHO. When we re-share someone’s social media post, we are explicitly choosing to “give that post more oxygen.” We are amplifying that idea so more people can see / read / watch / listen / click on that post. We ARE endorsing the post if we re-share / boost / re-tweet it. Even liking / favoriting a post can equate to a meaningful endorsement, and in algorithmically driven platforms like Twitter and Facebook (but at this point, not Mastodon) it can result in more people being shown that post in their own feeds. With algorithmically driven social media feeds like YouTube, TikTok, Facebook, Instagram, Twitter, etc., we “train the machine” by interacting with content. My point is: Any interaction we have with content on a social media platform is potentially consequential, for ourselves and for others.

Consider NOT Sharing / Amplifying Controversial Topics

When I started writing this post, I was originally going to exhaustively share all my “investigate the source” steps for the Mastodon posts linked above. I literally spent a few HOURS investigating and researching them. I’m not going to do that.

When it comes to some controversial topics, there are good reasons to just STOP and not share posts about them further online at all. I’m a media literacy educator and researcher, and since 2019 I’ve been working on the media literacy project, “Conspiracies and Culture Wars.” The topics referenced in these potentially controversial Mastodon posts I encountered this week were / are of professional interest to me. They are also, however, highly polarizing and therefore potentially perilous to share / amplify. Unlike Twitter, where I have multiple accounts / channels to thematically separate things I like to share online, currently on Mastodon I just have one account. So there are two reasons to just STOP and not share this content any further: Doing so could risk amplifying polarizing content, and it also could discourage other educators (with whom I genuinely would love to connect around topics related to media literacy, educational technology and pedagogy) from engaging with me further / following me.

“Verification” on Mastodon versus Twitter

How can you tell if a profile on Mastodon today is legitimate, meaning it’s operated by the actual person whose name is on the account, and it’s not a bot or sock puppet account? This can be tricky.

Mastodon does NOT include a centralized account verification process like pre-Elon Musk Twitter did, but some accounts DO include a blue circle and white verification checkmark. One example of a prominent celebrity doing this is the actor George Takei. His Mastodon account has a verification checkmark, but that’s just an emoji; it’s not an official “verified account” icon issued by a central authority.

George’s Twitter account (which WAS verified pre-Elon) includes a link to that Mastodon account, so I’m pretty sure it’s legit, but this is an important distinction to make about accounts on Mastodon versus Twitter. Including a verification checkmark in Mastodon can be misleading and confusing.

If a Mastodon user today wants to project authenticity and enable others to “verify” their identity, Mastodon (currently) provides a verification process for profile links. I successfully did this on November 25, 2022. Note the “verified” links on my Mastodon profile (as shown in the “Toot!” iOS app) are highlighted in GREEN and have a checkmark beside them. Other third party Mastodon apps and the Mastodon web-interface have similar ways of differentiating verified profile links.
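Under the hood, Mastodon’s profile link verification relies on the IndieWeb rel="me" convention: you add a link on your own website pointing back to your Mastodon profile with rel="me", and Mastodon fetches that page to confirm the back-link exists. The sketch below imitates that check in Python; the function name and sample profile URL are my own illustrative choices, not Mastodon’s actual code:

```python
# Sketch of the rel="me" back-link check behind Mastodon's link
# verification: a linked page "verifies" only if it contains a link
# back to the profile marked rel="me".
import re

def has_rel_me_backlink(page_html: str, profile_url: str) -> bool:
    """Return True if page_html contains an <a> or <link> tag with
    rel="me" whose href points at profile_url."""
    # Collect all opening <a ...> and <link ...> tags
    tags = re.findall(r'<(?:a|link)\b[^>]*>', page_html, re.IGNORECASE)
    for tag in tags:
        rel = re.search(r'rel=["\']([^"\']*)["\']', tag, re.IGNORECASE)
        href = re.search(r'href=["\']([^"\']*)["\']', tag, re.IGNORECASE)
        if rel and href and "me" in rel.group(1).lower().split():
            # Compare URLs, ignoring a trailing slash
            if href.group(1).rstrip("/") == profile_url.rstrip("/"):
                return True
    return False

# Example: a personal homepage linking back to a (hypothetical) profile
page = '<a rel="me" href="https://mastodon.example/@alice">Mastodon</a>'
print(has_rel_me_backlink(page, "https://mastodon.example/@alice"))  # True
```

Real Mastodon servers do this fetch-and-check automatically when you save your profile, then display the link as verified (green with a checkmark, in many clients) if the back-link is found.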

Conclusions

We live in a very challenging information and communication environment. It is exciting to be making connections on Mastodon with many other early adopter / innovator educators, but it’s also challenging and (at times) even alarming to interact with so many other folks as well as (potentially) bots. SIFT is not only a web literacy strategy I want to continue sharing and teaching to my own students, but also one I want to keep practicing personally. I wonder how we can share SIFT with other adults in our families and communities? I think it would be helpful to develop (and perhaps someone already has) some exercises and “case studies” to show and share with civic and church groups. I know Rotary, Lions, Kiwanis, and other service clubs still meet and are often looking for good program ideas. I wonder if any other media literacy educators are engaging with civic groups like this?

The steps of SIFT can be time consuming and confusing. We may not emerge with clear-cut answers. We need, however, to “engage in the messiness” of investigating and exploring sources of our information, and remembering that the first step (STOP) may be the most important of all.

If you enjoyed this post and found it useful, subscribe to Wes’ free newsletter. Check out Wes’ video tutorial library, “Playing with Media.” Information about more ways to learn with Dr. Wesley Fryer are available on wesfryer.com/after.

