Is Zuckerberg Doing a Bill Gates?
It’s easy to be cynical when multi-billionaires say they want to change the world for good, but there is a lot of candid and refreshing honesty in Mark Zuckerberg’s 6,000-word note, or, more accurately, manifesto, Building Global Community, released on Facebook yesterday.
In it, Zuckerberg stresses the importance of communities and groups, in both the online and offline worlds, and explains how Facebook is trying to encourage the creation of new groups and get people to join them.
The honesty is pretty searing, and it surfaces several times as he fesses up to some of Facebook’s failings. He says: “There are billions of posts, comments and messages across our services each day, and since it’s impossible to review all of them, we review content once it is reported to us. There have been terribly tragic events – like suicides, some live streamed – that perhaps could have been prevented if someone had realized what was happening and reported them sooner. There are cases of bullying and harassment every day, that our team must be alerted to before we can help out. These stories show we must find a way to do more.”
He goes on to say that Facebook is turning to artificial intelligence to surface content that Facebook’s team should review. “This is still very early in development, but we have started to have it look at some content, and it already generates about one-third of all reports to the team that reviews content for our community,” he says.
“It will take many years to fully develop these systems. Right now, we’re starting to explore ways to use AI to tell the difference between news stories about terrorism and actual terrorist propaganda so we can quickly remove anyone trying to use our services to recruit for a terrorist organization. This is technically difficult as it requires building AI that can read and understand news, but we need to work on this to help fight terrorism worldwide.”
And again on fake news, he says: “Accuracy of information is very important. We know there is misinformation and even outright hoax content on Facebook, and we take this very seriously. We’ve made progress fighting hoaxes the way we fight spam, but we have more work to do. We are proceeding carefully because there is not always a clear line between hoaxes, satire and opinion. In a free society, it’s important that people have the power to share their opinion, even if others think they’re wrong. Our approach will focus less on banning misinformation, and more on surfacing additional perspectives and information, including that fact checkers dispute an item’s accuracy.”
He’s honest, too, in admitting to mistakes in identifying content that should, or should not, be allowed on Facebook, saying: “In the last year, the complexity of the issues we’ve seen has outstripped our existing processes for governing the community. We saw this in errors taking down newsworthy videos related to Black Lives Matter and police violence, and in removing the historical Terror of War photo from Vietnam. We’ve seen this in misclassifying hate speech in political debates in both directions – taking down accounts and content that should be left up and leaving up content that was hateful and should be taken down. Both the number of issues and their cultural importance has increased recently.
“This has been painful for me because I often agree with those criticizing us that we’re making mistakes. These mistakes are almost never because we hold ideological positions at odds with the community, but instead are operational scaling issues. Our guiding philosophy for the Community Standards is to try to reflect the cultural norms of our community. When in doubt, we always favor giving people the power to share more.”
In one sense, you get the feeling that Zuckerberg is trying to explain just how difficult it is to run a social network that is global in terms of its coverage and membership. He even says at one point: “With a community of almost two billion people, it is less feasible to have a single set of standards to govern the entire community so we need to evolve towards a system of more local governance.”
Going forward, he says, Facebook plans to let each of its users set their own preferences for what they want, or do not want, to see on Facebook when it comes to things like nudity, violence, graphic content and profanity.
“What you decide will be your personal settings … For those who don’t make a decision, the default will be whatever the majority of people in your region selected, like a referendum,” he says.
He adds that AI will help in identifying content that may breach any of these criteria, but cautions that major advances in AI are required to understand text, photos and videos to judge whether they contain hate speech, graphic violence, sexually explicit content, and more. “At our current pace of research, we hope to begin handling some of these cases in 2017, but others will not be possible for many years,” he says.
Alongside the honest admissions of failure and work in progress, Zuckerberg also finds time in the post to point out some of the good things the company has done, like Amber Alerts to help find missing children, and $15m (£12m) raised for victims of the Nepal earthquake in 2015.
But reading the whole thing from start to finish, you get the feeling that Zuckerberg is well on the road to “doing a Bill Gates”, for want of a better way of putting it. He’s made his billions, and indeed has pledged to donate 99 per cent of his Facebook shares – worth £36bn – to the charitable foundation he runs with his wife.
In a BBC interview following the publication of the post, Zuckerberg pointedly refused to offer an opinion on US president Donald Trump, and reading Zuckerberg’s manifesto, you can’t imagine there’s much in there that the US president would agree with. Who knows, if the rumours are true, Zuckerberg might even run against him four years from now. Now that would be a debate worth watching.