Wikibirthday

20th January 2021

It's hard to imagine that anyone who spends time online doesn't rely on Wikipedia as the default source of information about anything. Twenty years old last Friday, Wikipedia is remarkable for many reasons, not least that, in this era of social media behemoths driven by algorithms which seem almost designed to encourage bad behaviour, it has remained open, not for profit, advertisement free and run by hundreds of thousands of volunteer editors. It is the best of what the pioneers of the world wide web envisioned it might become. Happy birthday, Jimmy Wales and Larry Sanger.

Heather Kelly, Washington Post

Wikipedia is a thing that shouldn’t work, but somehow does. Hundreds of thousands of volunteers, without pay, collectively trying to document every corner of human knowledge, including history happening in real time.

This month, the online encyclopedia’s strengths and quirks were on full display as hundreds of volunteers furiously worked to create a page for the Capitol riots as events unfolded on Jan. 6. As the event transitioned from a protest to something more violent, Wikipedia’s volunteer editors added key details while debating the article title, as shared by editor Molly White. Was it a protest, an insurrection or a riot? It ended up as “2021 storming of the United States Capitol.” Hundreds of people at a time were working on the ballooning document, which has now been touched by nearly 1,000 editors, runs to more than 10,000 words and has been viewed nearly 2 million times.

Like most Wikipedia articles, it will continue to change, a fluid draft of history meant to stick as closely to dispassionate facts as possible while regularly swatting off attempts to insert opinions and disinformation.

"I think the large number of editors helps to make sure different viewpoints are considered," said White, who has put in 12 hours of editing on the page and related wikis since last week. “Any changes must be carefully sourced, and there are constant discussions to ensure neutral tone and appropriate weight to topics within the page. ... It is the lower-profile pages that are more susceptible.”

Wikipedia is turning 20 years old on Friday, and in the midst of heightened concerns about the spread of disinformation and misinformation, its pages on controversial topics or current events can be a balm. The page for QAnon gets straight to the point in its first line, saying it “is a disproven and discredited far-right conspiracy theory.” The page for the Proud Boys is equally straightforward, calling them “a far-right, neo-fascist, and male-only political organization that promotes and engages in political violence in the United States and Canada.”

Founded in 2001 by Jimmy Wales and Larry Sanger, Wikipedia is an ad-free site edited by volunteers and hosted by the nonprofit Wikimedia Foundation. It’s one of the 20 most popular sites on the Internet, and its pages are regularly the top results for Google searches. Anyone interested in changing an article is allowed to do so, and people with more experience can gain more privileges. Some editors have specialties, others are generalists, and they all donate their time and energy to try to keep the resource clean and informative in multiple languages. Editors follow a few basic tenets, including that articles should have a neutral point of view, that editors should treat each other with respect and that there are no firm rules.

They’ve had a busy year. Like many other sites, Wikipedia saw an increase in use during the pandemic, especially the early months. In addition to having more time to contribute, editors have also had an unending stream of news topics to work on.

Many of the conversations happening between Wikipedia’s editors reflect what’s happening inside news publications and tech companies, but they play out largely in public. Every article has a visible history of revisions, and its Talk page shows the back and forth between the people editing it. It’s a real example of the word tech companies frequently throw around when discussing their controversial moderation decisions: transparency.
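Because that record is public, it can also be read programmatically. As a minimal sketch, assuming the requests library is installed, the documented MediaWiki API can list an article’s recent revisions; the title below is just the Capitol article mentioned above:

```python
# Fetch the five most recent revisions of a Wikipedia article via the
# public MediaWiki API, the same history visible on the page itself.
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "2021 storming of the United States Capitol",
    "rvprop": "timestamp|user|comment",
    "rvlimit": 5,
    "format": "json",
}

data = requests.get(API_URL, params=params).json()
page = next(iter(data["query"]["pages"].values()))
for rev in page["revisions"]:
    print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```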

What’s most striking about Wikipedia is its sheer size: the number of articles (55 million), the number of volunteers (270,000 active editors a month) and even the number of edits that have taken place (it just passed a billion).
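Those headline numbers can be checked against live data. A sketch under the same assumptions as above; note the API reports a single language edition, so English Wikipedia’s counts will be lower than the all-languages totals quoted here:

```python
# Query live site statistics for English Wikipedia via the MediaWiki API.
import requests

resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "meta": "siteinfo",
        "siprop": "statistics",
        "format": "json",
    },
)
stats = resp.json()["query"]["statistics"]
print("articles:", stats["articles"])
print("edits:", stats["edits"])
print("active users (last 30 days):", stats["activeusers"])
```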

“It is remarkable that it exists when you think about the history of knowledge in the world and who has access to it and the very idea that people can participate in it,” said Katherine Maher, CEO of the Wikimedia Foundation. “It is a somewhat radical act to be able to write your own history, and in many places in the world this is not a thing people take for granted.”

Wikipedia still has its share of errors and incorrect information, though it says most “vandalism” is removed within five minutes. There’s a rich history of hidden pranks and hoax entries; partisan protest edits, like the repeated deletion of Donald Trump’s entry; and angry vandalism, like when Beyoncé’s fans attacked Beck’s page after he beat her out for a Grammy. Last year it was discovered that an American teenager had created nearly half of the Scots-language Wikipedia’s pages without actually knowing the language. Screenshots of these temporarily altered entries can go viral and be seen long after the pages are fixed. Given the site’s size, smaller errors can go undetected for years.

The site has also struggled with diversity among its editors, who skew largely white and male for English-language entries. Wikipedia says that less than 20 percent of its editors identify as women, and a 2018 survey conducted by Wikimedia found 14 percent of editors had experienced some form of harassment.

But it doesn’t face the same kinds of issues with disinformation that the big tech companies do. Everyone from Facebook and Twitter to Snapchat has struggled with moderation, attempting to balance a desire not to be seen as censoring users against an overwhelming volume of problematic, violent and racist user-generated content. Many social media companies have been hesitant to say outright that some sources are less trustworthy than others, for fear of alienating part of their audience. Moderation is largely handled by paid workers who review posts flagged by people or automated systems and follow ever-changing internal rules to determine what stays up.

Facebook has had to hire tens of thousands of moderators and has built custom artificial intelligence systems which now detect more than 94 percent of the hate speech that is removed from the site. It has tried to rank sources by how trustworthy they are and to suppress the reach of problematic content. In the past week, it started removing content around the “Stop the Steal” phrase. Ahead of the election, Twitter tried to make it harder for misinformation to spread by removing the ability to retweet without a comment. And most recently, all the big social media sites, from Facebook to Pinterest, banned or suspended President Trump’s accounts after the Capitol riots.

While it, too, is populated entirely with content from users, Wikipedia has a number of things on its side when it comes to moderation. One of its key advantages is that there is only one page for each subject, with duplicates removed by editors, meaning it is not set up in a way that lets things go viral. While thousands of misleading articles about the November election results might circulate on Facebook, there is only one main article to tend to on Wikipedia.

It also has a number of tools meant to keep articles clean. There’s the ability to protect and lock down pages, limiting the ability of new editors to change them. People who frequently make false edits can be banned. Editors follow policies meant to keep out anything untrue, such as requiring sources for all claims. And when it comes to those sources, there is of course a Wikipedia page that lists commonly cited sources and rates them according to how reliable they are. Still, the site is open about not being a reliable source itself.
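A page’s protection status is itself public. As a minimal sketch, again via the documented MediaWiki API, with the title only an example drawn from this article:

```python
# Check whether a Wikipedia page is protected, and at what level.
import requests

resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "prop": "info",
        "inprop": "protection",
        "titles": "QAnon",
        "format": "json",
    },
)
page = next(iter(resp.json()["query"]["pages"].values()))
for p in page.get("protection", []):
    # e.g. type "edit" at level "autoconfirmed" means brand-new or
    # anonymous accounts cannot edit the page directly
    print(p["type"], "->", p["level"], "until", p.get("expiry", "infinity"))
```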

“Wikipedia is very open about the fact that we’re not a reliable source. It’s actually a tenet of Wikipedia, which isn’t to say we’re not a good place to start,” Maher said. “We are a great place to start. We just want people to have the ability to read the content on Wikipedia with a critical eye.”

Not everyone sees it the same way. Some of the big tech companies have turned to Wikipedia as a source for fact checks. In 2018, YouTube surprised Wikipedia editors and Wikimedia when it announced the site was going to be a fact-check partner, with text from its articles under controversial videos.

For the past 15 years, Anne Clin has worked as a volunteer editor on Wikipedia under the name Risker. She started by fixing a single typo and has gone on to become an administrator who spends between one and four hours a day editing. Her specialty is areas of conflict, such as hotly debated topics. She looks out for articles that need to be protected, meaning only experienced editors can make changes, and watches for nonpublic information, like phone numbers, that needs to be removed. The site has changed dramatically since she started.

“It’s an evolution. I probably wouldn’t have considered it a reliable source back in 2005. It was getting there, it was just starting off,” said Clin, who recently retired from her job in health care. “But as we have developed over the years and created our infrastructure to support really good quality data, it’s really helped a lot.”

While the more obvious targets for disinformation are locked down, Clin and other editors have learned to keep an eye on unexpected pages. If someone can’t say the QAnon conspiracy theory is true on the topic’s main article, they might find ways to sprinkle its misinformation into lesser-trafficked wikis.

That’s how they discovered the page for Benford’s law, an obscure mathematical observation about how often each digit appears as the leading digit in naturally occurring sets of numbers. Some followers of the QAnon conspiracy theory had seized on it as proof that the November election was fraudulent, and changed the Wikipedia page accordingly. Clin turned to a group of editors who work specifically on mathematical articles, who quickly cleaned it up. Stories about QAnon’s Benford’s law conspiracy theory are still circulating on social media.
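For reference, the law itself is easy to state: in many naturally occurring data sets, the leading digit d appears with probability log10(1 + 1/d), so 1 leads roughly 30 percent of the time and 9 under 5 percent. A short illustration follows; the editors’ point was that election tallies are under no obligation to follow this distribution:

```python
# Print the leading-digit frequencies predicted by Benford's law.
import math

for d in range(1, 10):
    p = math.log10(1 + 1 / d)
    print(f"leading digit {d}: {p:.1%}")
```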

From Scottish Community Alliance - https://scottishcommunityalliance.org.uk/

www.wikipedia.org