
How Can We Push for Better, More Human Technology? (In Light of the Facebook Leaked Documents)

The recently leaked Facebook documents show that algorithms matter: they can harm people psychologically and socially, especially younger people. What is the public’s role in this issue?

The Wall Street Journal recently ran a series of articles entitled The Facebook Files. The series shows, in acute detail, that Facebook’s platforms are riddled with flaws that cause harm, and that the company knows it full well. The internal documents the WSJ acquired include research reports and internal presentations to senior management. Together they show, again and again, that Facebook and Instagram have ill effects on users, especially younger users.

Researchers inside Instagram, which is owned by Facebook, have been studying for years how the photo-sharing app affects its millions of young users. What they have found, over and over, is that Instagram is bad for mental health. For instance, one leaked presentation stated that 32% of teenage girls surveyed said that when they felt bad about their bodies, Instagram made them feel worse.

The same goes for Facebook itself. While Mark Zuckerberg has acknowledged that Facebook can cause harm in some ways (for instance, by making people feel lonely), he argues that “there is a certain cynical element,” a pessimistic take on social media’s effects that overlooks all of its benefits. But Facebook’s own research suggests that Facebook does harm its users. In one presentation, Facebook engineers discussed how they conducted a “mood experiment” on nearly 700,000 unwitting Facebook users in 2012 and 2013 by manipulating the content of their newsfeeds (which were already automatically filtered according to what Facebook thought each user might like).

So how can responsible individuals push for change? How can you or I make a difference in the digital world? First, we can push for more transparency from Facebook and Instagram about how their algorithms work, so it is clear how our feeds are being filtered.

We can also push for Facebook to use psychology, rather than technology alone, as the foundation of its product development. The current algorithms have been built with a single goal in mind: engagement and time on the platform. But how might the world change for the better if the user’s psychological health, rather than raw engagement, were the logic driving the algorithm, while the platform still remained engaging?
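None of Facebook’s ranking code is public, so what follows is purely a toy sketch: the post names, the scoring weights, and the “predicted well-being” signal are invented assumptions for illustration, not anything drawn from the leaked documents. It only shows, in the most schematic terms, what it would mean for a feed ranker to weigh how content makes a user feel alongside how well it keeps them scrolling.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_engagement: float  # e.g. likelihood of a like/comment/share, 0..1
    predicted_wellbeing: float   # hypothetical signal: expected effect on the user's mood, -1..1

def rank_feed(posts, wellbeing_weight=0.6):
    """Toy ranking: blend engagement with a (hypothetical) well-being signal.

    A pure engagement ranker would sort by predicted_engagement alone; here,
    content predicted to make the user feel worse is pushed down even if it
    would keep them on the platform longer.
    """
    def score(post: Post) -> float:
        return ((1 - wellbeing_weight) * post.predicted_engagement
                + wellbeing_weight * post.predicted_wellbeing)
    return sorted(posts, key=score, reverse=True)

# Example: a highly engaging but upsetting post ranks below a moderately
# engaging, positive one once well-being carries real weight in the score.
feed = [
    Post("outrage-bait", predicted_engagement=0.9, predicted_wellbeing=-0.7),
    Post("friend-update", predicted_engagement=0.6, predicted_wellbeing=0.5),
]
for post in rank_feed(feed):
    print(post.post_id)
```

The point of the sketch is not the specific numbers but the change in objective: once the score rewards how a post is expected to make the user feel, and not just how long it keeps them scrolling, the highly engaging but upsetting post no longer wins.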

Lastly, Facebook should be more transparent about what it is doing with our data. The company holds the power of knowledge over all of its users, which makes it terrifying that it can refuse to tell us exactly what information it is collecting and using for profit.

As consumers, we have an active role in pushing these companies toward better products by choosing not to use Facebook or Instagram if they continue down this road. And as voters, we can push the government to play a key role in making sure technology advances people rather than harming them psychologically and socially. It worked with the big tobacco companies, and it can be done again with large social media companies. People can still smoke if they choose, but they now do so with full knowledge and transparency about the effects.

As it stands, Facebook does not give its users enough information about how exactly the company uses their data and who it sells it to (and Facebook has admitted as much). It is up to us, as consumers of Facebook products and as voters, to push the company toward becoming better and more transparent. We can do so by using our power: we vote with our dollars every time we make a purchase or choose which social media platform we will use.

So while Facebook might not have your best interests at heart when it comes to protecting you from negative psychological effects or being transparent about how it sells your data, these things ultimately do matter, and we can speak up and be advocates for change.
