We Need Product Safety Regulations for Social Media

Like many people, I’ve used Twitter, or X, less and less over the last year. There is no single reason for this: the system has simply become less useful and fun. But when the terrible news about the attacks in Israel broke recently, I turned to X for information. Instead of updates from journalists (which is what I used to see during breaking news events), I was confronted with graphic images of the attacks that were brutal and terrifying. I wasn’t the only one; some of these posts had millions of views and were shared by thousands of people.

This wasn’t an ugly episode of bad content moderation. It was the strategic use of social media to amplify a terror attack, made possible by unsafe product design. This misuse of X could happen because, over the past year, Elon Musk has systematically dismantled many of the systems that kept Twitter users safe and laid off nearly all of the employees who worked on trust and safety at the platform. The events in Israel and Gaza have served as a reminder that social media is, before anything else, a consumer product. And like any other mass consumer product, using it carries big risks.

When you get in a car, you expect it will have functioning brakes. When you pick up medicine at the pharmacy, you expect it won’t be tainted. But it wasn’t always like this. The safety of cars, pharmaceuticals and dozens of other products was terrible when they first came to market. It took much research, many lawsuits, and regulation to figure out how to get the benefits of these products without harming people.

Like cars and medicines, social media needs product safety standards to keep users safe. We still don’t have all the answers on how to build those standards, which is why social media companies must share more information about their algorithms and platforms with the public. The bipartisan Platform Accountability and Transparency Act would give users the information they need now to make the most informed decisions about what social media products they use and also let researchers get started figuring out what those product safety standards could be.

Social media risks go beyond amplified terrorism. The dangers that attention-maximizing algorithms pose to teens, and particularly to girls, whose brains are still developing, have become impossible to ignore. Other product design elements, often called “dark patterns,” that are built to keep people using a platform longer also appear to tip young users into social media overuse, which has been associated with eating disorders and suicidal ideation. This is why 41 states and the District of Columbia are suing Meta, the company behind Facebook and Instagram. The complaint accuses the company of engaging in a “scheme to exploit young users for profit” and of building product features to keep kids logged on to its platforms longer, while knowing that this was damaging to their mental health.

Whenever they are criticized, internet platforms deflect blame onto their users. They say it’s their users’ fault for engaging with harmful content in the first place, even if those users are children or the content is financial fraud. They also claim to be defending free speech. It’s true that governments all over the world order platforms to remove content, and some repressive regimes abuse this process. But the current issues we are facing aren’t really about content moderation. X’s policies already prohibit violent terrorist imagery. The content was widely seen anyway only because Musk took away the people and systems that stop terrorists from leveraging the platform. Meta isn’t being sued because of the content its users post but because of the product design decisions it made while allegedly knowing they were dangerous to its users. Platforms already have systems to remove violent or harmful content. But if their feed algorithms recommend content faster than their safety systems can remove it, that’s simply unsafe design.

More research is desperately needed, but some things are becoming clear. Dark patterns like autoplaying videos and endless feeds are particularly dangerous to children, whose brains are not developed yet and who often lack the mental maturity to put their phones down. Engagement-based recommendation algorithms disproportionately recommend extreme content.

In other parts of the world, authorities are already taking steps to hold social media platforms accountable for their content. In October, the European Commission requested information from X about the spread of terrorist and violent content as well as hate speech on the platform. Under the Digital Services Act, which came into force in Europe this year, platforms are required to take action to stop the spread of this illegal content and can be fined up to 6 percent of their global revenues if they don’t do so. If this law is enforced, maintaining the safety of their algorithms and networks will be the most financially sound decision for platforms to make, since ethics alone do not seem to have generated much motivation.

In the U.S., the legal picture is murkier. The case against Facebook and Instagram will likely take years to work through our courts. Yet, there is something that Congress can do now: pass the bipartisan Platform Accountability and Transparency Act. This bill would finally require platforms to disclose more about how their products function so that users can make more informed decisions. Moreover, researchers could get started on the work needed to make social media safer for everyone.

Two things are clear: First, online safety problems are leading to real, offline suffering. Second, social media companies can’t, or won’t, solve these safety problems on their own. And those problems aren’t going away. As X is showing us, even safety issues like the amplification of terror that we thought were solved can pop right back up. As our society moves online to an ever-greater degree, the idea that anyone, even teens, can just “stay off social media” becomes less and less realistic. It’s time we require social media to take safety seriously, for everyone’s sake.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.


