Bitcoin Forum
Author Topic: Can Mastodon Survive Europe’s Digital Services Act?  (Read 31 times)
Hydrogen (OP)
Legendary
Activity: 2562
Merit: 1441
December 08, 2022, 11:35:47 PM
 #1

Quote
It has been around two weeks since Elon Musk, the world’s richest man, acquired Twitter, and already fears about what this means for free speech on the microblogging platform have begun to proliferate. With Musk firing some of Twitter’s key personnel, including legal chief Vijaya Gadde, and terminating contracts with outsourced content moderators, many users are looking for an alternative.

A substantial number are migrating to the ‘fediverse,’ and specifically to Mastodon, a similar microblogging platform that has been called “Twitter, with the underlying architecture of email”. Mastodon’s decentralization raises substantial questions about how existing regulatory regimes, such as Europe’s Digital Services Act (DSA), will apply.

The Move to Mastodon

The fediverse – a portmanteau of federation and universe – is a network of interconnected servers that communicate with each other based on decentralized networking protocols. These servers can be used for web publishing and file hosting, and allow users to communicate with each other despite being on independent servers.

For Mastodon, interoperability is key. Think of it as an email account – a user may use a Google mail service, but that does not stop her from communicating with someone who uses Hotmail, or even someone who hosts their own mail server. As long as a common set of protocols is followed, users can communicate across servers easily. The idea behind such a decentralized architecture is to give users direct control of their online use and presence. Mastodon is one of many social networks that operate using free and open-source software; other examples include Peertube, which is similar to YouTube, and diaspora*, which more closely resembles Facebook.
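To make the “independent servers, common protocol” idea concrete, here is a minimal sketch in Python (using the requests library) that reads the public metadata two instances expose over Mastodon's REST API. The instance domains are examples only, and any server that implements the same API could be queried the same way.

Code:
# Each Mastodon instance is an independent server, but all of them
# expose the same public REST API, so one client can talk to any of
# them. The instance domains below are examples only.
import requests

INSTANCES = ["mastodon.social", "fosstodon.org"]

def instance_info(domain: str) -> dict:
    # GET /api/v1/instance returns public metadata for that server:
    # title, description, rules, and basic usage statistics.
    resp = requests.get(f"https://{domain}/api/v1/instance", timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    for domain in INSTANCES:
        info = instance_info(domain)
        stats = info.get("stats", {})
        print(f"{domain}: '{info.get('title')}' - "
              f"{stats.get('user_count', '?')} registered users")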

Since Musk’s acquisition of Twitter and the turmoil it has caused, Mastodon’s sign-up rate has risen from 60–80 new users an hour to 3,568 new registrations in a single hour on the morning of 7th November. It has now amassed more than 6 million user accounts and is still growing.

To sign up to Mastodon, a user can join any of a number of different servers (known as ‘instances’); these instances determine the content users get to see and the community guidelines to which they must subscribe. In essence, the administrator(s) of each instance acts as the ‘moderator’, deciding what is or is not allowed on that instance, and has the power to filter or block content that contradicts the established rules. The administrator(s) can either act as the moderator themselves or use a team of moderators. Within an instance, a user can post text or other media, follow and communicate with other users (within and outside their instance), and share data publicly or with a selected group.

Just like Twitter, Mastodon uses hashtags, has a character limit for posts (500 instead of Twitter’s 280), and is already populated with cat pictures. Although some users have complained about the complexity of the sign-up process and the site’s overall user-friendliness (or lack thereof), Mastodon has proven a healthy alternative and has demonstrated that users are ready to walk away from established social media services if they are presented with options.

Content Moderation on Mastodon

Ultimately, though, the future of Mastodon will hang on the way its individual instances and the site – as a collaborative whole – deal with content moderation and free speech. Mastodon’s appeal lies in its decentralization. When Eugen Rochko founded the network in 2016, it came from a “feeling of distrust of the top-down control that Twitter exercised”. Countering this distrust, while also proudly claiming that it is “not for sale”, the Mastodon network has no single owner or administrator that can set the rules; instead, the administrator of each local instance sets the rules of their server, which users have to abide by. If a user disagrees with these rules, they can easily switch to an instance that aligns with their viewpoint, creating robust avenues for free speech. If an administrator finds that a user has published something in violation of the instance’s rules, they can take the content down, or even remove the user from the instance; the user can then simply move to a different server.

The administrator can also block content from the instance they run if it disrupts users. In 2019, the social media platform Gab – a hub for white supremacists – tested Mastodon’s limits on content moderation. Even though Mastodon could not deny Gab the use of its open-source software, since anyone can use the software if “they keep the same license and make their modifications public”, individual instances were able to block, and consequently isolate, Gab and its users. With no ability to interoperate with other instances, Gab became an instance of no value to the Mastodon collective. In response, mastodon.social – one of the servers run by Mastodon – updated its policy regarding the promotion of instances on its official website, before blocking Gab for good.

Although there is no central authority on Mastodon, when you sign up to the network it shows you some popular instances that you can join to get a general idea of the content on the network. These instances need to abide by certain rules, such as not allowing racism, sexism, homophobia, or transphobia on their servers. This shows how content (or rather a platform) can be moderated on a decentralized network: while offending content is not necessarily removed from the network entirely, each instance administrator can take local action to avoid and ultimately ostracize ‘problematic’ servers.

The clear benefits of such decentralized systems – especially if they are non-profit, like Mastodon – are diffused content moderation responsibilities, user empowerment, and disincentives for user conflict (especially related to driving engagement, as seen in big social media). However, this still leaves us with the question of manifestly objectionable content – such as child sexual exploitation material or terrorist content. Certainly, instances have their own incentives to moderate and get rid of such content; however, it is also important to remember that decentralized networks are not above government legislation, nor are they a panacea for content moderation. In the same way that governments can order the takedown of a website, they can also order the takedown of Mastodon instances.

Mastodon and the Digital Services Act

As Mastodon continues to gain in popularity, a question that remains is how existing legislative efforts may affect the entire website and/or its instances. Notably, the Digital Services Act (DSA) in Europe was created to address content moderation issues that manifest themselves in much larger and more centralized platforms, like Facebook. What will the DSA’s effect be on Mastodon?

Currently, there are more than 3,000 instances on the network – all with their own users, guidelines, and administrators. In this context, the DSA does not provide clarity on questions of decentralized social media. However, based on the categorizations of the DSA, it is most probable that each instance would be seen as an independent ‘online platform’ on which a user hosts and publishes content that can reach a potentially unlimited number of users. Thus, each of these instances will need to comply with a set of minimum obligations for intermediary and hosting services, including having a single point of contact and legal representative, providing clear terms and conditions, publishing bi-annual transparency reports, having a notice-and-action mechanism, and communicating information about removals or restrictions to both notice and content providers.

Today, given the non-profit model and the limited, volunteer administration of most existing instances, all Mastodon servers would seem to be exempt from the obligations placed on large online platforms. Nonetheless, what will it mean if an instance ends up generating more than EUR 10 million in annual turnover or hires more than 50 staff members? Under the DSA, if these thresholds are met, the administrators of that instance would need to implement additional requirements, including a complaint-handling system, cooperation with trusted flaggers and out-of-court dispute bodies, enhanced transparency reporting, the adoption of child protection measures, and a ban on dark patterns. Failure to comply with these obligations may result in fines or the geo-blocking of the instance across the EU market.

Additionally, in theory, there is always the possibility that an instance may reach the threshold for the DSA’s ‘Very Large Online Platform’ (VLOP) status if its user base continues growing and hits the 45 million monthly usage mark. Today, mastodon.social is the largest instance, with 835,227 users. If it goes beyond the VLOP user threshold, there is a significant number of obligations this instance would have to comply with, such as risk assessments and independent audits. This could prove an expensive and burdensome undertaking, given its current turnover. It is therefore important for the European Commission to provide further clarification about these instances, and to do so quickly.
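To make the tiers discussed above easier to follow, here is a rough, illustrative sketch in Python of the thresholds the article mentions (the EUR 10 million turnover / 50 staff point at which the small-enterprise exemption falls away, and the 45 million monthly user mark for VLOP designation). The numbers and classification logic come only from the article's summary and are no substitute for an actual DSA assessment; the turnover and staff figures in the example are made up.

Code:
# Illustrative sketch only: classify a single instance against the
# DSA thresholds summarised above. A real DSA assessment is far more
# nuanced than this.
from dataclasses import dataclass

VLOP_MONTHLY_USERS = 45_000_000            # 'Very Large Online Platform' mark
SMALL_ENTERPRISE_TURNOVER_EUR = 10_000_000  # exemption limit per the article
SMALL_ENTERPRISE_STAFF = 50

@dataclass
class Instance:
    name: str
    monthly_active_users: int
    annual_turnover_eur: int
    staff: int

def dsa_tier(i: Instance) -> str:
    if i.monthly_active_users >= VLOP_MONTHLY_USERS:
        return "VLOP: risk assessments, independent audits, and more"
    if (i.annual_turnover_eur > SMALL_ENTERPRISE_TURNOVER_EUR
            or i.staff > SMALL_ENTERPRISE_STAFF):
        return ("online platform: complaint handling, trusted flaggers, "
                "enhanced transparency reporting, child protection, "
                "no dark patterns")
    return ("baseline intermediary/hosting duties: point of contact, "
            "clear terms, transparency reports, notice and action")

# Example: today's largest instance, with the user count quoted above
# and hypothetical (made-up) turnover and staff figures.
print(dsa_tier(Instance("mastodon.social", 835_227, 500_000, 5)))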

It is difficult to project what will happen if, and when, Mastodon user numbers reach the likes of platforms such as Twitter and Facebook, especially in the realm of content moderation. Since moderation in major social media platforms is conducted by a central authority, the DSA can effectively hold a single entity accountable through obligations. This becomes more complex in decentralized networks, where content moderation is predominantly community-driven.

Regulatory Ambiguity and the Fediverse

Presently, Mastodon attempts to answer the problems of content moderation through its decentralized architecture. There is no central authority or control that one could point to and hold responsible for content moderation practices; instead, moderation happens in an organic, bottom-up manner. With regard to how upcoming digital regulations can apply to these platforms, we are still left with a plethora of questions, which only grow when we consider how a decentralized network could implement these requirements.

Ambiguity over the fediverse shows that Internet regulation should be designed to accommodate the widest possible range of creativity and innovation, rather than with only certain players in mind. The last thing Europe wants is regulation that restricts future innovation and raises barriers to entry for new businesses and users alike.



https://techpolicy.press/can-mastodon-survive-europes-digital-services-act/


....


Mastodon was one of the most popular recommended alternatives to Twitter in the hours after the Elon Musk sale went through. Mastodon is decentralized, open source, and minimalist, which makes it popular as a throwback to some of the features that made Twitter appealing before it became mainstream.

Unfortunately, Mastodon's decentralized design, coupled with the EU's Digital Services Act (DSA) lacking clear guidelines on how decentralized social media platforms should be regulated, could impact Mastodon negatively in a worst-case scenario. Exactly what the outcome will be is not clear.

However, it seems that substantial clarification is needed on how the EU's Digital Services Act applies to decentralized social media.
rokok lokal
Member
Activity: 120
Merit: 25
December 09, 2022, 04:48:17 AM
 #2

Mastodon probably hosts a stronger community, with more features and a larger user base, and the developers are pretty smart. While it may not meet all the European regulations, it is still a step in the right direction. Mastodon may yet have a chance to survive Europe's Digital Services Act.
