11 Social Media & Humanity: Has It Worn Out Its Welcome? (Lonstein)

Introduction

In this chapter, students will examine social media's meteoric rise from 2002 to today. We will discuss social media's virtues and challenges to the extent they are known as of this writing. Assessing social media's impact upon humanity is in many ways a fool's errand when one considers that the dawn of the technology was only around 2002, when the platform Friendster was introduced, followed by Myspace, which reached one million users in 2004. (Jones, 2024) That was the same year Facebook hit the market, led by a Harvard student, Mark Zuckerberg, and others, and began to spread throughout the Ivy League colleges. Soon after that, it became the most ubiquitous and popular social media platform of all time. The video streaming site YouTube and the short-form text platform Twitter followed, and the expansion continues today: Chinese mega-platform TikTok has been downloaded over three billion times globally and, as of March 2023, had over 150 million users in the United States alone. (TikTok, 2023)

Throughout the last two decades, social media has undergone a metamorphosis from a mechanism for family and friends to connect and reconnect into a medium for sharing notes, photos, and videos. With combined global usage in the billions by 2015, business, government, entertainment, and most forms of "traditional media" followed. By 2023, according to Forbes Advisor, there were 4.9 billion unique social media users globally, with usage projected to reach 5.85 billion by 2027. (Wong, 2023) Considering that the global population, as of January 1, 2024, was estimated at just over 8 billion people, access to over 60% of the world's population makes social media a communication tool more powerful than any mankind has ever witnessed. The few companies that control social media platforms and the internet enjoy massive power over almost every aspect of our lives, and with such concentration of power comes risk.
Through this lens, we will examine social media’s past, present, and future, as well as its benefits, dangers, and effects on humanity.

Section 1: Assessing the Current State of Social Media

The Rise of Social Media

When Myspace and the budding Facebook were vying for the title of social media champion in the early 2000s, these platforms were significantly less dynamic, interactive, and captivating than what followed. For instance, the 2004 version of Facebook resembled a static biography page more than the immersive platform we know today. This evolution underscores the transformative power of social media.

Similarly, Myspace, then the leader in the social media industry, also lacked a "real-time" feel and animation and, for many users, enough bells and whistles to maintain their interest. Most importantly, the social media products of the early 2000s lacked one of the most critical ingredients of their communication dominance today: mobility. Largely confined to the desktop or laptop computer, social media left users wanting a freedom of movement it could not yet deliver.

In 2007 and 2008, the world was introduced to the iPhone, and with it came simple operation, a powerful camera, and almost constant connectivity. These factors made it the perfect complement to Facebook and other burgeoning platforms such as Twitter and Snapchat. Social media was now live and mobile, and thanks to enhanced processing power and transmission speeds, it reached the dawn of live-streaming video in 2015. The two inaugural entrants into live social media streaming were Meerkat and Periscope. Both platforms used Twitter's instant messaging and push notification functions, allowing users to tell the world instantly that they were going live. (Pullen, 2015)

Figure 11-1: Mark Zuckerberg Facebook circa 2011 (Source: Japan Times, 2024)

Live streaming quickly spread across the social media marketplace, with Facebook and Google's YouTube introducing it the following year. Before long, Twitter purchased Periscope, and Amazon gobbled up the live game-streaming platform Twitch. Social media was becoming more of a commercial venture than ever before, with news, sports, and entertainment companies jumping on board alongside a new phenomenon: social video marketplaces. The future was bright, the possibilities endless; what could go wrong?

Challenges and Criticisms

2016 was a year of significant change and turmoil in the world, with the United Kingdom approving the "Brexit" departure from the European Union and Donald Trump winning an upset victory for the presidency of the United States. Both outcomes were shaped in part by social media. According to Stephen Yan, those who sought to leave the EU were aided in no small part by its effective use: "The Leave campaign's more potent and passionate social media strategy might have been pivotal in influencing the outcome considering the closeness of the outcome. Due to their social media mechanisms, they might have swayed non-voters towards voting for Leave. The [Remain] social media strategy was more rational, which, in turn, would not have as profound an effect as the emotional-based Leave tactics. Overall, the Leave movement was able to target more potential undecided voters on social media, potentially influencing what was a close referendum." (Yan, 2019) Later in the year, the election of Donald Trump to the Presidency of the United States proved to be even more heavily influenced by social media. Trump himself was quoted as saying, "I doubt I would be here if it were not for social media, to be honest with you," to Maria Bartiromo, according to an account published by the Politico website. (McCaskill, 2017)

Figure 11-2: Meerkat & Twitter (Source: Pullen, 2015)

With its widespread usage and the ability to mobilize supporters and shape narratives, social media has become a dream tool for savvy campaign operatives. However, the high stakes involved in matters influenced by social media also increased the risk of its misuse or manipulation, a factor that became very apparent by the end of 2016.

In 2013, Cambridge Analytica, a British political consulting firm specializing in data analysis and strategic communication for electoral processes, was formed. The Company claimed to use data to develop psychographic profiles of individuals to influence their voting behavior. (Ingram, 2018) In 2014, Cambridge Analytica developed an app called “This Is Your Digital Life,” which presented itself as a harmless personality quiz. Approximately 270,000 Facebook users downloaded and used the app, which collected their data and data from their Facebook friends. The activity harvested data on approximately 87 million users. (Kawamoto, 2018)

The collected data helped create detailed voter profiles and target individuals with personalized political advertisements. Facebook faced severe backlash for failing to protect user data and for not taking more decisive action when it first learned of the misuse in 2015. In 2018, CEO Mark Zuckerberg testified before the U.S. Congress, after which Facebook implemented changes to its platform to enhance data privacy and security. In 2019, the Federal Trade Commission (FTC) fined Facebook $5 billion for privacy violations. Cambridge Analytica declared bankruptcy in May 2018 amid the fallout from the scandal. (KTLA, 2018) The scandal underscored the vulnerabilities of user data management and highlighted the potential for misuse in the digital age. It became a significant factor in examining the ethical use of data and the need for greater transparency and accountability from technology companies.

Figure 11-3: Cambridge Analytica (Source: Brito, 2018)

Other concerns regarding social media include, but are not limited to, the following:

  1. Mental Health Impacts: The design of social media platforms often encourages addictive behavior, with features like endless scrolling and notifications keeping users engaged for extended periods. Studies have linked excessive social media use to increased feelings of depression, anxiety, loneliness, and low self-esteem, particularly among teenagers and young adults.
  2. Misinformation and Fake News: Social media platforms can rapidly disseminate false information, conspiracy theories, and fake news, often outpacing corrections and fact-checking efforts.
  3. Echo Chambers: Algorithms that prioritize content users are likely to engage with can create echo chambers, reinforcing existing beliefs and contributing to polarization.
  4. Overreaction and Manipulation in the Name of Moderation: Social media platforms overcorrect and remove "harmful content," leading to controversies over free speech, censorship, and biased moderation practices. (Harvard Law, 2023)

Section 2: Has Social Media Outlived Its Usefulness?

Pros and Cons of Social Media

The following list is a small sample of how social media affects nearly every aspect of our lives, some of it good and some bad. Like any other technology, social media is capable of benefiting mankind in some cases; in others, misuse by users and by the social media companies themselves can cause great harm at scale.

Connectivity and Communication

Pros

Social media allows users to connect with friends, family, and like-minded individuals across the globe, fostering relationships and community. It lets users quickly access news, educational content, and real-time event updates. The platforms provide spaces for users to share their thoughts, talents, and creativity, offering opportunities for personal growth and recognition. They also serve as a source of emotional support, providing communities where users can share experiences and seek advice.

Cons

Privacy Risks

Users often face data privacy and security issues, as personal information can be harvested, shared, or leaked.

Mental Health Impacts

Prolonged use of social media can cause anxiety, depression, and low self-esteem, often due to cyberbullying and the pressure of social comparison. (World Crunch, 2023) One of the most significant concerns with social media is its potential to foster addictive behaviors. Features designed to maximize engagement can lead to constant checking and scrolling, impacting productivity and real-life social interactions. It is essential to be mindful of our social media use and to set healthy boundaries to prevent addiction. (Katella, 2024)

Figure 11-4: World Crunch 2023 (Source: Graiewski, 2023)

Misinformation: Users are susceptible to misinformation and fake news, which can skew perceptions and beliefs. (Muhammed, 2022)

Businesses

Benefits

Marketing and Advertising: Social media platforms offer businesses cost-effective advertising solutions with precise targeting capabilities, increasing their reach and customer engagement.

Customer Insights: Businesses can gather valuable consumer preferences and behavior data, helping them tailor their products and services. Social media enables businesses to build brand identities, engage with customers directly, and enhance customer loyalty.

Networking and Partnerships: Platforms facilitate networking and partnerships, expanding business opportunities and collaborations. In fact, in 2024, an estimated 5 billion people globally use social media, so from the perspective of potential customer density alone, no other market can compete. (Marketwatch, 2024)

Drawbacks

Reputation & Data Security Risks

Negative feedback and viral complaints can quickly damage a company’s reputation, requiring vigilant reputation management. Businesses must navigate complex data privacy regulations, which can impose legal and compliance costs.

Algorithm Dependence

Changes in platform algorithms can affect a business's visibility and engagement, making businesses dependent on external factors that are often changed without notice to the advertiser, compounded by the platforms' seemingly stubborn reluctance to provide transparency. Case in point: Google and its search algorithms dominate 90% of online search traffic. A recent leak of Google's algorithm model suggests "that click-through rate affects ranking, that subdomains have their [own] rankings, that newer websites are thrown into a separate 'sandbox' until they start ranking higher in Search and that the age of a domain is a consideration in ranking." Such a lack of transparency from a near-monopolistic player does little to inspire confidence and predictability in businesses that have little choice but to play the game. (Khan, 2024)

Government

Benefits

Social media allows governments to communicate with citizens, disseminate information, and gather feedback quickly and efficiently, especially during crises or emergencies. It can also promote government transparency and hold officials accountable through public scrutiny.

Drawbacks

Disinformation Campaigns: Social media misuse can spread disinformation, manipulate public opinion, and interfere with government operations, especially in a crisis. (Shahbazi, 2024)

Regulatory Challenges

Governments face difficulties in effectively regulating social media and balancing free speech with the need to prevent harm.

Surveillance and Censorship

Some governments use social media for surveillance and censorship, infringing on citizens’ rights to privacy and free expression. 

Public Disorder

Platforms can help organize and incite public disorder or unrest, challenging social order. 

Figure 11-5: Protests after George Floyd's Death 2020 (Source: Associated Press, 2020)

Perhaps one of the most significant events involving public sentiment, disorder, viral spread, and a digital call to action was the George Floyd protests in May 2020. They occurred amid the hotly contested 2020 United States Presidential Campaign and erupted following the death of George Floyd at the hands of Minneapolis police officers. Social media, in various capacities, exponentially expanded the protests' breadth and scope. Social media platforms were critical in mobilizing, organizing, and amplifying the protests, shaping public discourse, and influencing responses from authorities and organizations. (Hughes, 2023)

Social media’s impact is multifaceted, offering significant benefits while posing substantial challenges. It enhances users’ connectivity and access to information but raises privacy and mental health concerns. Businesses benefit from targeted marketing and customer engagement but must manage reputation and compliance issues. Governments can engage with the public and enhance transparency yet face regulatory and disinformation challenges. Society gains from social movements and economic growth but must contend with polarization and privacy erosion. Balancing these aspects requires continuous effort from all stakeholders to maximize benefits while mitigating drawbacks.

Social media and big technology companies have become the de facto source of information for the citizens of many countries. This rapid shift away from "traditional media" has fundamentally altered how news and information are disseminated and consumed, with profound implications. Platforms like Facebook, Twitter, Instagram, and TikTok have billions of users globally, allowing information to spread quickly and reach a vast audience. These platforms use opaque algorithms to curate content based on users' interests, behaviors, and interactions. This personalization can help users find relevant information but also risks creating echo chambers. Social media's role as a de facto source of information has transformed the information landscape, offering both opportunities and challenges. While it enhances access to diverse perspectives and real-time updates, it also necessitates critical thinking and media literacy to navigate issues like misinformation and echo chambers. The ongoing evolution of social media will continue to shape how people consume and interact with information.
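The echo-chamber dynamic described above can be illustrated with a deliberately simplified sketch. This is not any platform's actual algorithm (those are proprietary and far more complex); it is a hypothetical model showing how ranking content by predicted engagement, and then reinforcing whatever the user engages with, narrows what the user sees over time.

```python
# Illustrative sketch only: a toy engagement-ranked feed, not any real
# platform's algorithm. It demonstrates the feedback loop behind echo
# chambers: engagement raises a topic's weight, which raises its rank,
# which invites more engagement.

def rank_feed(posts, interest_weights):
    """Order posts by how strongly the user's profile favors each topic."""
    return sorted(posts,
                  key=lambda p: interest_weights.get(p["topic"], 0.0),
                  reverse=True)

def update_interests(interest_weights, clicked_post, rate=0.2):
    """Reinforce the topic of whatever the user engaged with."""
    topic = clicked_post["topic"]
    interest_weights[topic] = interest_weights.get(topic, 0.0) + rate
    return interest_weights

posts = [{"topic": "politics"}, {"topic": "sports"}, {"topic": "science"}]
weights = {"politics": 0.5, "sports": 0.4, "science": 0.1}

# Simulate five rounds in which the user always clicks the top-ranked post.
for _ in range(5):
    feed = rank_feed(posts, weights)
    weights = update_interests(weights, feed[0])

# After a few iterations, one topic dominates the user's profile,
# and the feed shows ever more of the same.
print(weights)
```

Under these assumed starting weights, "politics" begins with a slight edge, wins every round, and ends up dominating the profile while the other topics never gain ground. Real recommender systems add many counter-pressures (diversity injection, freshness, demotion signals), but the core reinforcement loop is the mechanism critics point to.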

The private companies that run these platforms are far less answerable and transparent than traditional broadcast media, which in the United States is overseen by the Federal Communications Commission. (Federal Communications Commission, 2024) Unlike broadcast media under the FCC, social media operates with little oversight. Social media companies often push back against governmental oversight by claiming that such regulation would infringe upon their constitutional rights. Even more problematic than the constitutional argument is that technology companies, particularly those in the social media business, enjoy special protection from liability under Section 230 of the Communications Decency Act. See 47 United States Code § 230. (United States Congress, 1996) Essentially, "This section gave distributors of user-generated content on the internet immunity for the content posted by users. Someone was still liable for the pernicious content, but the burden shifted from the deep pockets of the companies to the person who posted the content." (Morris, 2021)

Figure 11-6: Mark Zuckerberg Apologizes to Parents in Congress (Source: NBC News, 2024)

Section 3: Solutions for Social Media

Can It Be Fixed?

Regulations

Section 230 of the Communications Decency Act (CDA)

It protects online platforms from liability for content users post while allowing them to moderate content in good faith. (United States Congress, 1996)

Children’s Online Privacy Protection Act (COPPA)

It regulates the collection of personal information from children under 13 through online services, including social media platforms, requires parental consent for data collection, and provides data protection and privacy guidelines. (United States Congress, 1998)

California Consumer Privacy Act (CCPA)

It grants California residents rights over their data, including the right to know what data is being collected, to delete that data, and to opt out of its sale. The CCPA applies to businesses that meet specific criteria, including social media platforms. (California, 2024)

New York SHIELD Act

Imposes data security requirements on businesses to protect the private information of New York residents. The SHIELD Act significantly strengthens New York's data security laws by expanding the types of private information for which companies must provide consumer notice in the event of a breach and by requiring companies to develop, implement, and maintain reasonable safeguards to protect the security, confidentiality, and integrity of that information. (State of New York, 2019)

Digital Services Act (DSA)– European Union

It aims to create a safer digital space by regulating online content, services, and platforms. Imposes obligations on platforms to remove illegal content promptly, enhance transparency in content moderation, and ensure the safety of users. (European Union, 2022)

Online Safety Bill – United Kingdom

Seeks to protect users from harmful content on social media platforms and other online services. Imposes duties of care on platforms to prevent the spread of illegal content and protect children from harmful material. The bill establishes the Office of Communications (Ofcom) as the regulator with enforcement powers. (Parliament – United Kingdom, 2023)

Online Safety Act 2021- Australia

Provides a framework for enhancing online safety, particularly for children and vulnerable users. Grants the eSafety Commissioner powers to order the removal of harmful content, impose fines, and promote online safety education. (Australian Government, 2022)

Network Enforcement Act (NetzDG)- Germany

Social media platforms must remove “obviously illegal” content within 24 hours of receiving a complaint. Imposes fines on platforms that fail to comply with content removal requirements. Mandates transparency reports on content moderation practices. (Bundestag – Germany, 2018)

Information Technology Rules 2021-India

Sets guidelines for social media intermediaries, including requirements for grievance redressal, content takedown, and compliance with law enforcement requests. Significant social media intermediaries must appoint a chief compliance officer and other critical officers to ensure compliance. (Ministry of Electronics & Information Technology – Government of India, 2021)

Proposed Bill to Address Online Harms – Canada

The proposed Online Harms Act would specifically target seven types of harmful content:

Content that sexually victimizes a child or re-victimizes a survivor;

Intimate content communicated without consent;

Content used to bully a child;

Content that induces a child to harm themselves;

Content that incites hatred;

Content that incites violence and

Content that incites violent extremism or terrorism. (Government of Canada, 2024) 

Figure 11-7: Censorship by Proxy (Source: Envato, 2024)

The Common Flaw of Social Media Legislation

Ambiguity and Vagueness

Regulations often suffer from vague definitions of critical terms such as “harmful content,” “hate speech,” or “misinformation.” This lack of clarity can lead to inconsistent enforcement and difficulties in compliance. The ambiguity in laws can result in different interpretations, leading to legal disputes and inconsistent application of the regulations across various jurisdictions. Stringent regulations and the threat of heavy penalties can lead to over-censorship by platforms, as they may choose to err on the side of caution and remove content that is not necessarily harmful or illegal, thereby stifling free speech. (Koltay, Andras, 2022)

Jurisdictional Issues

Social media platforms operate globally, but regulations are typically enacted on a national or regional level, creating challenges in enforcement, particularly when content originates from or is hosted in a different jurisdiction. The global nature of social media necessitates international cooperation and harmonization of regulations, which has proven elusive. Cross-border enforcement and coordination among regulatory bodies are complex and can be ineffective. (Ortner, 2019)

Platform Resistance – Censorship by Proxy – Collaboration with Government

Social media companies play a crucial role in enhancing transparency and accountability. However, they may resist transparency requirements, citing concerns about proprietary technology or gaming of their systems. Without robust oversight mechanisms, platforms might not fully comply with transparency and accountability regulations, undermining their effectiveness. This resistance can erode user trust, as users may question the platforms' commitment to their safety and privacy. The inherent lack of transparency in social media censorship practices, particularly regarding government collaboration, poses significant challenges to user trust, free speech, and platform credibility. (Panagiotopoulos, 2014)

Doctor, Do No Harm: Less May Be More

Section 230 of the Communications Decency Act has been a foundational element of the internet, granting online platforms immunity from liability for user-generated content while allowing them to moderate in good faith. However, a growing argument exists that removing or significantly amending Section 230 could help curb social media abuse.

Incentivizing Responsible Behavior and Reducing Harmful Content:

Without the blanket immunity provided by Section 230, social media platforms would have a greater incentive to implement more effective and responsible content moderation policies. They would need to actively prevent the spread of harmful content, such as hate speech, misinformation, and illegal activities, to avoid legal liabilities. If platforms can be held liable for the content they host, they may invest more resources into monitoring and removing abusive or harmful content. This could lead to a cleaner, safer online environment for users. (Smith, 2021).

Conclusions

The current protections under Section 230 can allow platforms to shirk responsibility for the rampant spread of misinformation and fake news. Removing these protections could force platforms to take more decisive action against false information, as they would be legally accountable for the damages caused by such content. Without Section 230, platforms would be compelled to establish robust fact-checking and content verification systems to avoid legal repercussions, thereby improving the quality and reliability of information available on social media. The key is a framework in which the user, and not just the platform, has a natural right to opine, to be correct, and to be wrong. As I wrote in 2022:

Less is more when choosing between censoring or allowing uncomfortable speech to remain online. The preference is that only the most egregious, imminently dangerous content is subject to removal. Consider the suggestions of Robert D. Richards and Clay Calvert to use counter-speech as a partial and less draconian tool for dealing with perceived offensive content.

First, terms of service and use policies are in dire need of simplification and clarity. What takes multiple pages in small print could be simplified and condensed into a Magna Carta and Bill of Rights for the digital age. Simplification would be a win-win.

Still, platforms are under immense pressure to protect users and the public from perceived offensive, illegal, or harmful content. They also enjoy protections that print, audio, and television media do not—Section 230 immunity. Social media’s poor behavior should come at a cost that consumers and the marketplace determine. While some users, advertisers, investors, or governments may become upset by such a policy, this “public square” is unlikely ever to be tranquil.

The cornerstone of reform must start with the “presumption of innocence.” Censorship or deplatforming should only occur once the platform or person seeking censorship meets this burden of proof. Neutral human beings should be making such decisions, not social media machines, algorithms, employees, or vendors.

The adjudication process must be swift, transparent, and neutral. Many neutral arbitration forums, such as the American Arbitration Association or the European Arbitration Chamber, already exist, and others could follow.

With funding from—not oversight by—social media companies, a simple version of small claims courts could rapidly evolve. The accused should not have to pay or hire legal counsel, and the process and forums must not become bureaucratic, complex, or corrupted; constant monitoring of neutrality and transparency is vital. The process must mandate regularly updated financial disclosures for all arbitrators and staff, and these disclosures must also be securely stored but made available to the parties. (Lonstein, 2022)

Questions for students

  1. What is content moderation, and why do social media platforms implement it?
  2. How do you define free speech in the context of social media?
  3. What types of content should be moderated or removed from social media platforms, and why?
  4. What role should transparency play in social media content moderation?
  5. How can social media companies balance the need for moderation with the protection of free speech?
  6. What are the potential benefits and drawbacks of government regulation of social media content moderation?
  7. Should governments have the authority to mandate the removal of certain types of content on social media? Why or why not?
  8. What ethical responsibilities do social media platforms have when moderating content? 

References

Associated Press. (2020, May 30). Protests over George Floyd’s death overwhelm authorities again. Retrieved from WHYY: https://whyy.org/articles/protests-over-george-floyd-death-overwhelm-authorities-again/

Australian Government. (2022, January 23). Online Safety Act 2021. Retrieved from Federal Register of Legislation: https://www.legislation.gov.au/

Brito, R. (2018, March 21). Brazil prosecutors open investigation into Cambridge Analytica. Retrieved from Reuters: https://www.reuters.com/article/idUSKBN1GX35A/

Bundestag – Germany. (2018, March 22). The Network Enforcement Act. Retrieved from Bundes Fur Justizamt: https://www.bundesjustizamt.de/SharedDocs/Downloads/DE/NetzDG/Leitlinien_Geldbussen_en.pdf?__blob=publicationFile&v=3

California, State of. (2024, March 13). California Consumer Privacy Act. Retrieved from State of California Department of Justice: https://oag.ca.gov/privacy/ccpa

Envato. (2024, June 1). My Downloads. Retrieved from Envato elements: https://elements.envato.com/account/downloads

European Union. (2018, May 25). What is the GDPR? The EU’s new data privacy and security law? Retrieved from GDPR.EU: https://gdpr.eu/what-is-gdpr/

European Union. (2022, October 19). Digital Services Act. Retrieved from Eur-Lex: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32022R2065

Federal Communications Commission. (2024, May 15). What We Do. Retrieved from FCC: https://www.fcc.gov/about-fcc/what-we-do

Government of Canada. (2024, February 26). Proposed Bill to address Online Harms. Retrieved from Government of Canada: https://www.canada.ca/en/canadian-heritage/services/online-harms.html

Graiewski, M. (2023, April 1). Who Is Responsible For The Internet’s Harm To Society? Retrieved from World Crunch: https://worldcrunch.com/culture-society/internet-harms-teens-young-adults

Harvard Law. (2023, September 27). Rappaport Forum talks about First Amendment limits of content moderation and ‘lawful but awful’ speech on social media. Retrieved from Harvard Law Today: https://hls.harvard.edu/today/rappaport-forum-talks-first-amendment-limits-of-content-moderation-lawful-but-awful-speech-on-social-media/

Hughes, S. (2023, May 5). Social Media Case Study: The Killing of George Floyd. Retrieved from The Institute of Strategic Risk Management: https://theisrm.org/en/social-media-case-study-the-killing-of-george-floyd

Ingram, D. (2018, March 20). Factbox: Who is Cambridge Analytica, and what did it do? Retrieved from Reuters: https://www.reuters.com/article/idUSKBN1GW07F/

Japan Times. (2024, February 11). Facebook, the old-timer of the social network, is turning 20. Retrieved from The Japan Times: https://www.japantimes.co.jp/business/2024/02/01/companies/facebook-turns-20/

Jones, M. (2024, February 26). The Complete History of Social Media: A Timeline of the Invention of Online Networking. Retrieved from historycooperative.org: https://historycooperative.org/the-history-of-social-media/

Katella, K. (2024, January 8). How Social Media Affects Your Teen’s Mental Health: A Parent’s Guide. Retrieved from Yale Medicine: https://www.yalemedicine.org/news/social-media-teen-mental-health-a-parents-guide

Kawamoto, D. (2018, March 20). How to Access the Voter Information Dirt Cambridge Analytica Has on You. Retrieved from Dark Reading: https://www.darkreading.com/cybersecurity-analytics/how-to-access-the-voter-information-dirt-cambridge-analytica-has-on-you

Khan, I. (2024, May 30). Google Algorithm Leak Contradicts What Google Has Said About Website Rankings. Retrieved from CNET: https://www.cnet.com/tech/services-and-software/google-algorithm-leak-contradicts-what-google-has-said-about-website-rankings/

Koltay, Andras. (2022). The Protection of Freedom of Expression from Social Media. Mercer Law Review, 548-554.

KTLA. (2018, May 18). Cambridge Analytica Files for Bankruptcy After Facebook-Linked Scandal. Retrieved from KTLA5: https://ktla.com/news/nationworld/cambridge-analytica-files-for-bankruptcy-after-facebook-linked-scandal/

Lonstein, W. (2022, November 1). Modern, Real-World Reformation For Moderating Social Media Content. Retrieved from Forbes: https://www.forbes.com/sites/forbestechcouncil/2022/11/01/modern-real-world-reformation-for-moderating-social-media-content/?sh=25719a466d6d

Marketwatch. (2024, May 28). Top Social Media Statistics 2024. Retrieved from Marketwatch: https://www.marketwatch.com/guides/business/social-media-statistics/

McCaskill, N. D. (2017, October 20). Trump credits social media for his election. Retrieved from Politico: https://www.politico.com/story/2017/10/20/trump-social-media-election-244009

Ministry of Electronics & Information Technology – Government of India. (2021, February). The Information Technology (Intermediary Guidelines and Digital Media Ethics Code). Retrieved from Ministry of Electronics & Information Technology: https://www.meity.gov.in/writereaddata/files/Information%20Technology%20%28Intermediary%20Guidelines%20and%20Digital%20Media%20Ethics%20Code%29%20Rules%2C%202021%20%28updated%2006.04.2023%29-.pdf

Morris, R. G. (2021). The Futility of Regulating Social Media. Notre Dame Journal on Emerging Technologies, 64-65.

Muhammed, S. (2022). The disaster of misinformation: a review of research in social media. Int J Data Sci Anal.

NBC News. (2024, January 31). Senate hearing highlights: Lawmakers grill CEOs from TikTok, X, and Meta about online child safety. Retrieved from NBC News: https://www.nbcnews.com/tech/live-blog/senate-hearing-online-child-safety-big-tech-live-updates-rcna136235

New York Times. (2020, May 29). ‘Absolute Chaos’ in Minneapolis as Protests Grow Across the U.S. Retrieved from New York Times: https://www.nytimes.com/2020/05/29/us/floyd-protests-usa.html

Ortner, D. (2019, August 12). Government regulation of social media would kill the internet — and free speech. Retrieved from The Hill: https://thehill.com/opinion/technology/456900-government-regulation-of-social-media-would-kill-the-internet-and-free/

Panagiotopoulos, P. E. (2014). Citizen-government collaboration on social media: the case of Twitter in the 2011 riots in England. Government Information Quarterly, 349-357.

Parliament – United Kingdom. (2023, October 26). Online Safety Act 2023. Retrieved from Legislation: https://www.legislation.gov.uk/ukpga/2023/50

Pullen, J. P. (2015, March 27). Periscope vs. Meerkat: Which Is the Livestreaming App For You? Retrieved from Time: https://time.com/3761315/periscope-meerkat-livestreaming-twitter/

Shahbazi, M. (2024). Social media trust: Fighting misinformation in the time of crisis. International Journal of Management, 3-10.

Smith, M. D. (2021, August 12). It’s Time to Update Section 230. Retrieved from Harvard Business Review, Government Policy and Regulation: https://hbr.org/2021/08/its-time-to-update-section-230

State of New York. (2019, October 23). The Shield Act. New York General Business Law Article 39-F, Sections 899-AA and 899-BB. Albany, New York, United States: New York Legislature.

TikTok. (2023, March 21). Celebrating our thriving community of 150 million Americans. Retrieved from TikTok Newsroom: https://newsroom.tiktok.com/en-us/150-m-us-users

United States Congress. (1996, February 8). The Communications Decency Act. 47 United States Code Section 230. Washington D.C., District of Columbia, USA: United States Government.

United States Congress. (1998, October 21). Children’s Online Privacy Protection Act of 1998. U.S. Code Title 15 Commerce and Trade. Washington D.C., District of Columbia, United States: United States Government.

Wong, B. (2023, March 18). Top Social Media Statistics And Trends Of 2024. Retrieved from Forbes Advisor: https://www.forbes.com/advisor/business/social-media-statistics/#source

Worldcrunch. (2023, April 1). Who Is Responsible For The Internet’s Harm To Society? Retrieved from Worldcrunch: https://worldcrunch.com/culture-society/internet-harms-teens-young-adults

Yan, S. (2019). Social Media & Brexit: The Role of Social Media in the Outcome of the UK’s EU Referendum. netstudies.

License

Advanced Technologies for Humanity Copyright © 2024 by Nichols, R.K.; Ackerman, P.E, Andrews, E., Carter, C.M., DeMaio, D.D., Knaple, B.S.,  Larson, H., Lonstein, W.D., McCreight, R., Muehlfelder, T., Mumm, H.C., Murthy, R., Ryan, J.J.C.H., Sharkey, K.L., Sincavage, S.M. is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, except where otherwise noted.