
Media Without Borders

The phrase ‘media without borders’ describes the web as an environment where the distinctions between media, entertainment, and thought-provoking work have eroded, a situation with both personal and societal downsides. This is not an accident. To quote author and scholar Shoshana Zuboff’s book The Age of Surveillance Capitalism: “Google is a shape-shifter, but each shape harbors the same aim: to hunt and capture raw material. Baby, won’t you ride my car? Talk to my phone? Wear my shirt? Use my map?…” The internet has replaced our playgrounds, and the social-network extraction machine is the older dude offering us candy and waving us toward his van. All adults mean well, right? Not really.

The issue is not the access to information the internet provides. Rather, it is a frustration with media as the vehicle through which everything is framed, where we construct – or warp – reality. The chief flaw of social networks and online media is their borderless, unregulated nature. Scrolling through my own Instagram feed, it becomes impossible to differentiate art from advertisement, real-life accounts from fiction, vulnerability from self-aggrandizement, and politics from formless, nihilistic memery. Meta-universes of online parody often serve as the inspiration for ‘real-world’ debate, a discussion that is incredibly lucrative for internet companies. My gaze is a precious commodity: everything I interact with on virtually any platform competes for my time and attention and tries to shape my point of view.

The sheer volume of media forces us to interact only with what is most entertaining. Most of the time I don’t really have robust criteria for avoiding clickbait sinkholes, because their presentation has become so seamlessly captivating. When I brought this up with my roommate, he said he rarely asks himself why he likes something. “I’m loyal to content creators or media outlets because they were able to capture my attention much earlier [when he was younger].” His comment reflects how we passively drift through the web, trusting little beyond the content creators who won us over earlier in life; yet the average person still spends close to 24 hours a week online, probably without wanting to. As all forms of online media get better at yanking our monkey-brains and monopolizing our downtime, it’s easy to be entertained by more or less everything while walking away with very little.

‘Clickbait,’ the banner under which some shapeshifters on the internet profit off views, is the less ugly face of misrepresentation on social media. Clickbait nudges us toward ‘inaction’ rather than ‘action’; it functions more to sap our downtime than to mobilize us. An exchange between Congresswoman Alexandria Ocasio-Cortez and Mark Zuckerberg at his most recent congressional hearing is particularly helpful for understanding how coercive forces on social media can direct action. It also illustrates a lack of accountability from the platforms’ architects.

Ocasio-Cortez: “Would I be able to run advertisements on Facebook targeting Republicans in primaries saying that they voted for the Green New Deal?” Zuckerberg’s spookily placid response: “I don’t know the answer off the top of my head […] I think, probably?”

Thanks, Zuck. This exchange is both frightening and hilarious. Could someone spread misinformation online to achieve massively influential results? Mark’s answer, “probably,” is not only misleading, it’s a flat-out lie. Looking beyond the 2016 US election and Russia’s involvement, Facebook campaigns have been both the instigator of and platform for an ongoing genocide in Myanmar, where the military rolled out a systematic campaign of Facebook propaganda targeted at the country’s mostly Muslim Rohingya minority. The campaign was riddled with fake names and sham accounts created by military personnel, and it fueled the resulting rapes and murders of Rohingya people.

Facebook isn’t the only social network to blame; YouTube’s AI suggestion engine (which feeds you content based on what you already watch) has recently come under scrutiny for pushing people who would otherwise be politically middle-of-the-road in radical directions. According to New York Times writer Paul Mozur, YouTube is thought to be responsible for the rise of Brazil’s radical right-wing movement. Many of the nation’s young people get their political ideas regularly, and often exclusively, from radical YouTube channels. One of these impressionable young people, described in the NYT piece as a heavy consumer of radical-right YouTube content, is Brazil’s current leader, President Bolsonaro. Bolsonaro is notorious for awful politics: denying climate change, refusing to protect the Amazon rainforest, and encouraging the displacement of indigenous people.

Studies show that while most people consume information that mirrors their existing views, exposure to conflicting viewpoints tends to reduce prejudice and bolster creative thinking. Online filter bubbles and suggestion engines are making us dumber, and we unknowingly play into what psychologists call the ‘mere-exposure effect,’ whereby people develop a preference for ideas simply because repeated exposure has made them familiar. This is the most standard branding tactic in the book. The more you see something, the more likely you are to agree with it, even if it’s wrong.

Two things are happening: (1) people are getting their information from social networks more than from traditional news sources, and (2) interacting with those networks activates our confirmation biases and shapes our worldview, patterning real-world action. Alek Minassian drove a van into a crowd of pedestrians in Toronto in April last year, killing ten people. His last Facebook post: “The Incel Rebellion has already begun!” The Incel movement, a violent online subculture whose members define themselves as incels (involuntary celibates) unable to find a romantic or sexual partner, was incubated in the confirmation-bias cesspool that is Reddit. For many people, Reddit is a news source, meme repository, community, and, sometimes, a harbor for like-minded crazies.

The incels exemplify how cult-like thinking can span unprecedented distances online, and they show that our understanding of the web as an unlimited library is woefully optimistic. Just as easily as the internet can democratize education, it can become a breeding ground for a predatory hive-mind, like inceldom, because no one regulates what readers should and should not listen to and believe. You and you alone are the arbiter of your place on the web, and no one has cared to teach you how to navigate the space or how to stay intellectually vigilant against the primal desire for inclusion.

It takes deliberate effort to combat your own psychological vulnerability to online manipulation, largely because the internet profits from an intentional lack of borders. Facebook has a vested interest in being not only your social network, but also your television, chatbox, news outlet, grocery store, outlet mall, currency, best friend, pet…whatever. The more time you spend in the company of adverts, the more money the companies that disseminate them make. Why change? As a result, it has become increasingly difficult to know exactly when you are being entertained and when you are being seduced by a view that runs counter to your beliefs. We clearly lack the perception it takes to recognize how these forces influence us, as well as the ability to see through the subtle film that separates propaganda from entertainment.

My more technically inclined peers tell me there is a movement within STEM education that emphasizes ethics in computer science. Corporations are now pushing to diversify the overall pool of coders to help eliminate certain racial (and other) biases from the algorithms themselves. Yet in our fervent effort to equip the young members of society with code-building literacy, we pump out a swell of engineers to maintain and update the infrastructure of the internet without ever questioning the ethics of that infrastructure. If nothing is going to be done about restructuring the internet to be less predatory, which is likely, we need a more concerted effort to teach the public how to resist the ubiquitous forces of coercion. Corporations should improve their ability to spot fake accounts and root out violent propaganda, but a comprehensive approach to revitalizing unbiased democratic debate should include public education.

Efforts to strengthen our increasingly digital global society have been made on the other side of the world. After a wave of cyberattacks in 2007, accompanied by violent unrest stoked by online Russian propaganda, Estonia rolled out a comprehensive cyber-resilience strategy that takes an interdisciplinary approach to cyber issues. Along with government-verifiable internet IDs and online voting, its position relies heavily on public education. Estonia’s efforts have helped identify and coordinate education and training solutions for all NATO bodies in cyber defence against social-media propaganda attacks.

In 2016, Karoliina Ainge, head of Estonian cyber security policy, outlined a strategy for national security that equips people with encryption tools. Estonia’s approach is one the rest of the world should consider: “Our approach has been to help people use trusted communications. We are very strong proponents of strong encryption and have given all citizens access to encryption tools. We believe that if you give people the ways and means to communicate securely it’s a more positive approach than simply banning things. It’s a more positive reinforcement of the use of technology.” I agree with Ainge that positive reinforcement is better than banning things. Taking this strategy one step further, we should educate young people about their own biases and give them the tools for intellectual integrity when surfing the web for information.

It makes sense to tell our children to refuse candy from strangers. Similarly, educating the youngest members of our society to read through media framing, challenge its assumptions, and understand the psychological forces behind it could pay massive dividends for our societies and democracies over time, leading to more productive discourse and less radical partisan deadlock online. Ultimately, the internet should be a decentralized tool for getting the facts, not a battleground for social machination and radical political indoctrination.