These Bots Are Responsible For Protecting Wikipedia

Feb. 2, 2020



Wikipedia has been every student’s saviour when it comes to finishing school or college projects. The world’s largest crowdsourced website contains information on any topic you can imagine.

As you already know, Wikipedia is an online encyclopedia built on verifiable information. The idea that anyone with an internet connection could freely edit that information sounded bananas. It should never have worked, yet somehow the site still serves its purpose.

Enter The Bots

Dr. Jeff Nickerson, a professor at the Stevens Institute of Technology in Hoboken, N.J., who has studied Wikipedia bots, told Digital Trends that the primary reason the bots were created was protection against vandalism.

He explained that people frequently go to a Wikipedia page and deface it. Given the amount of traffic on the website, it becomes tedious and difficult for the editors who maintain those pages to keep reverting the damage by hand.

“So one logical kind of protection [was] to have a bot that can detect these attacks,” stated Nickerson.
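To make that idea concrete, here is a minimal, purely illustrative sketch of the kind of heuristic check an anti-vandalism bot might run on an incoming edit. The function name, word list, and thresholds are all hypothetical; real Wikipedia bots such as ClueBot NG use far more sophisticated machine-learning models.

```python
import re


def looks_like_vandalism(old_text: str, new_text: str) -> bool:
    """Flag an edit using a few simple heuristics (illustrative only)."""
    # Placeholder blocklist; a real bot would use much richer signals.
    blocklist = {"dumb", "stupid"}

    # Words introduced by the edit, normalized to lowercase letters.
    old_words = set(re.findall(r"[a-z]+", old_text.lower()))
    new_words = set(re.findall(r"[a-z]+", new_text.lower()))
    added = new_words - old_words

    # Heuristic 1: the edit introduces blocklisted words.
    if added & blocklist:
        return True
    # Heuristic 2: the edit blanks most of the page.
    if len(new_text) < 0.2 * len(old_text):
        return True
    # Heuristic 3: the edit adds long runs of a repeated character ("aaaaaa").
    if any(ch * 6 in new_text for ch in "abcdefghijklmnopqrstuvwxyz!"):
        return True
    return False


print(looks_like_vandalism("Paris is the capital of France.",
                           "Paris is dumb."))  # → True
```

A real bot would watch the site's stream of recent changes, score each diff like this, and automatically revert edits that score above a threshold.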

The study conducted by the researchers divided the bots into nine categories according to the roles and responsibilities assigned to them.

The Wikipedia we know and trust for all our school and college projects wouldn't be the same without these little guys, who work tirelessly to make the platform more refined and trustworthy. In this day and age, bots have a negative reputation. But these bots show that every coin has two sides: Wikipedia's bots are the immune system that protects the site, and they give us hope that technology can really help us. After all, we created technology; technology did not create us.

Bringing you the latest in technology, gaming, and entertainment is our superhero team of staff writers. They have a keen eye for the latest stories, happenings, and even memes for tech enthusiasts.