Now, hours of testimony and thousands of pages of documents from Facebook whistleblower Frances Haugen have prompted a re-examination of the impact Facebook and its algorithms have on teenagers, democracy, and society at large. The disclosures have raised questions about whether Facebook, and perhaps platforms like it, should use a bevy of algorithms to determine what images, videos, and news users see.
But algorithms that pick and choose what we see are central not only to Facebook but to many social media platforms following in Facebook’s footsteps. For example, TikTok would be unrecognizable without a content-recommendation algorithm running the show. And the bigger the platform, the greater the need for algorithms for filtering and sorting content.
Algorithms are not going away. But there are ways for Facebook to make them better, experts in algorithms and artificial intelligence told Granthshala Business. However, it will require something Facebook has so far appeared reluctant to offer (despite executive talking points): greater transparency and control for users.
What’s in an algorithm?
An algorithm is a set of mathematical steps or instructions, especially for a computer, that describes what to do with some input in order to produce some output. You can think of it roughly similar to a recipe, where the ingredients are the input and the final dish is the output. On Facebook and other social media sites, however, you and your actions – what you type or the pictures you post – are the input. What the social network shows you – whether it’s a post from your best friend or an ad for camping gear – is the output.
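To make the recipe analogy concrete, here is a toy sketch (not Facebook's actual code; the rule and labels are invented for illustration) of an algorithm as a function from inputs to an output:

```python
# Toy illustration: an algorithm maps inputs to an output,
# much like a recipe turns ingredients into a dish.
def feed_output(user_actions):
    """Input: a list of things the user did; output: what to show next."""
    # Hypothetical rule: if the user just posted a photo, surface a
    # friend's post; otherwise show an ad. Real feed-ranking systems
    # weigh thousands of signals, not one.
    if "posted_photo" in user_actions:
        return "friend_post"
    return "ad"

print(feed_output(["posted_photo", "liked_page"]))  # friend_post
print(feed_output(["scrolled"]))                    # ad
```

The point is not the rule itself but the shape: actions in, content out.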
At their best, these algorithms can help personalize feeds so that users can discover new people and content that matches their interests based on prior activity. In a worst-case scenario, as Haugen and others have pointed out, they run the risk of directing people down an annoying rabbit hole that could expose them to toxic materials and misinformation. In any case, they keep people scrolling for longer, potentially helping Facebook earn more money by showing users more ads.
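The dynamic described above can be sketched in a few lines. This is a hypothetical example (the post names and scores are assumptions, not real data): when posts are ranked purely by predicted engagement, whatever keeps people clicking rises to the top, regardless of whether it is wholesome or toxic.

```python
# Hypothetical engagement-based ranking (not Facebook's code):
# each post carries a predicted probability that a user will engage.
posts = [
    {"id": "family_update", "predicted_clicks": 0.10},
    {"id": "outrage_bait",  "predicted_clicks": 0.45},
    {"id": "camping_ad",    "predicted_clicks": 0.25},
]

# Sort by predicted engagement, highest first -- the feed order.
ranked = sorted(posts, key=lambda p: p["predicted_clicks"], reverse=True)
print([p["id"] for p in ranked])
# ['outrage_bait', 'camping_ad', 'family_update']
```

Optimizing for this single number is what critics mean when they say the metric, not the content, drives the feed.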
Several algorithms work together to create the experience you see online on Facebook, Instagram, and elsewhere. This can make it even more complicated to tease out what’s going on inside such systems, especially at a large company like Facebook, where multiple teams build different algorithms.
“If some higher power at Facebook were to say, ‘Fix the algorithm in X or Y way,’ it’s really hard, because these have become really complex systems with many inputs, many weights, and they’re kind of multiple systems working together,” said Hilary Ross, a senior program manager at Harvard University’s Berkman Klein Center for Internet and Society and its Institute for Rebooting Social Media.
“You can even imagine having some say in this. You may be able to select preferences for the things you want customized for yourself,” she said, “such as how often you see content from your immediate family, pictures of high school friends, or babies. All of those things can change over time. Why not let users control them?”
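The kind of user control Ross describes could look something like the sketch below. Everything here is an assumption for illustration: the slider names, weights, and scores are invented, and no real Facebook API works this way. The idea is simply that user-chosen weights re-rank what the platform's own scoring would otherwise show.

```python
# Hypothetical user-controlled feed preferences: sliders the user sets
# (values here are made up) multiply the platform's base scores.
user_prefs = {"immediate_family": 2.0, "high_school_friends": 0.5, "ads": 0.1}

posts = [
    {"id": "mom_photo",  "source": "immediate_family",    "base_score": 0.3},
    {"id": "reunion",    "source": "high_school_friends", "base_score": 0.6},
    {"id": "camping_ad", "source": "ads",                 "base_score": 0.9},
]

def adjusted(post):
    # Scale the platform's score by the weight the user chose for
    # that kind of content; unknown sources keep their base score.
    return post["base_score"] * user_prefs.get(post["source"], 1.0)

ranked = sorted(posts, key=adjusted, reverse=True)
print([p["id"] for p in ranked])
# ['mom_photo', 'reunion', 'camping_ad']
```

With the family slider turned up, a modest family post outranks a high-engagement ad, which is exactly the trade-off user controls would expose.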
Transparency is important, she said, because it encourages good behavior from social networks.
Another way social networks can be pushed toward greater transparency is through independent audits of their algorithmic practices, according to Sasha Costanza-Chock, director of research and design at the Algorithmic Justice League. They envision fully independent researchers, investigative journalists, or people inside regulatory bodies – not the social media companies themselves, or companies they hire – who have the knowledge, skills, and legal authority to demand access to algorithmic systems, in order to ensure laws aren’t violated and best practices are followed.
James Mickens, a computer science professor at Harvard and co-director of the Berkman Klein Center’s Institute for Rebooting Social Media, suggested looking at how elections can be audited without revealing private information about voters (such as who each person voted for) for insight into how algorithms might be audited and improved. He believes this could inform the design of an audit system that lets people outside Facebook provide oversight while protecting sensitive data.
Other Metrics for Success
Experts say a major barrier to meaningful improvement is social networks’ focus on engagement, or the amount of time users spend scrolling, clicking, and otherwise interacting with social media posts and ads.
This is difficult to change, experts said, although many agree that it may involve considering users’ feelings when using social media, not just the amount of time they spend using it.
“Engagement is not synonymous with good mental health,” Mickens said.
Can algorithms really help fix Facebook’s problems? Mickens, at least, is hopeful that the answer is yes. He thinks they can be tailored more toward the public interest. “The question is: What will convince these companies to start thinking this way?” he said.
In the past, some would have said that this would require pressure from advertisers whose dollars back these platforms. But in his testimony, Haugen seemed to bet on a different answer: pressure from Congress.
Credit : www.cnn.com