The Internet has given rise to countless applications since its creation, but few are as multi-faceted and pervasive as bots.
A bot is an online software application that typically carries out simple and repetitive automated tasks. Today, bots are so numerous that they account for more online traffic than humans.
A bot’s purpose is determined by a set of directives forming the bot’s algorithm, and whether the bot’s motivation is helpful or harmful is up to the human programmer.
Malicious bots have garnered wide publicity because of how common they are: in 2016, 94 percent of all websites experienced a bot attack. Malicious bots take many forms, including bots that assume false identities to bypass website security and bots used to extract data from secure databases without authorization.
In the realm of social media, bots have become associated with misdirection and spamming, due to networks of bots (“botnets”) contributing to social media fraud.
One common bot used for this type of fraud is the amplification bot. These bots pose as humans and tweet and retweet posts from clients who pay for their services.
But They’re Also Helpful
While helpful bots rarely find themselves in news headlines, they’re just as widely used as malicious bots.
Feed Fetchers are one type of helpful bot that most internet users interact with daily. Feed Fetchers are bots programmed to retrieve data for the purpose of displaying it on sites like Google and Twitter.
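At its simplest, fetching a feed means requesting a structured document and pulling out the entries. The sketch below, with an invented sample feed standing in for one retrieved over the network, shows the idea in Python:

```python
import xml.etree.ElementTree as ET

# A tiny sample RSS feed, standing in for a document a Feed Fetcher
# would retrieve over HTTP. The contents are invented for illustration.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item><title>First headline</title><link>https://example.com/1</link></item>
    <item><title>Second headline</title><link>https://example.com/2</link></item>
  </channel>
</rss>"""

def extract_headlines(feed_xml: str) -> list:
    """Pull the item titles out of an RSS 2.0 feed document."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

headlines = extract_headlines(SAMPLE_FEED)
```

Real feed fetchers layer scheduling, caching, and politeness rules on top of this, but the core loop is just fetch, parse, display.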
Bots are becoming increasingly important as more industries move to online platforms.
One industry that’s benefiting from this trend: Media.
Bots have proven to be vital tools in journalism, and newsrooms have been using them for years now.
Forward-thinking newsrooms use bots to automate tedious and repetitive tasks, so reporters can focus on more thoughtful work.
Bot-building companies and workshops allow newsrooms to build bots tailored to specific needs. These newsbots give an edge in the modern news cycle to news organizations faced with decreasing revenue and personnel cuts.
Investigative journalists have benefited from bots, using them to delve through large data sets and to keep tabs on those in power.
Bots used to monitor specific areas of the Internet are called web crawlers. Web crawlers allow journalists to monitor sites and social media profiles for breaking news.
Reuters’ News Tracer bot reports breaking news events at a speed that previously was difficult, if not impossible, for human reporters to achieve.
Bots That Write
Bots also are used to write news stories, but their uses are limited; mostly they function as a kind of “mad libs reporter.”
Editors provide the templates for these news stories, including spaces in the template for the information the bots are to search out. Once the bots are given the parameters of their task, they search for necessary information and fill in the blanks.
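Conceptually, this fill-in-the-blanks process fits in a few lines of Python; the template and field names below are invented for illustration, not drawn from any newsroom’s actual system:

```python
# Hypothetical earnings-report template an editor might supply.
# The placeholder names are assumptions made for this example.
TEMPLATE = (
    "{company} reported quarterly revenue of {revenue}, "
    "{direction} {change} percent from a year earlier."
)

def write_story(template: str, facts: dict) -> str:
    """Fill an editor's template with the facts the bot has gathered."""
    return template.format(**facts)

story = write_story(TEMPLATE, {
    "company": "Example Corp",
    "revenue": "$2.1 billion",
    "direction": "up",
    "change": "7",
})
```

The hard part in practice is not the formatting but gathering and verifying the facts that go into the blanks.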
A new company in Japan provides insight into how mainstream newsrooms might use bots in the coming months and years. JX Press Corp uses machine learning to comb social media sites for potential breaking news, which is then processed into stories through fully automated reporting. The company features a newsroom staffed by engineers — not journalists — something that could become more common in the future.
As newsrooms become inundated with bot technology, the line between human- and machine-written stories will continue to blur.
Subscribe to Beyond Bylines to get media trends, journalist interviews, blogger profiles, and more sent right to your inbox.
Julian Dossett is a Cision editor and black coffee enthusiast. He’s based in New Mexico.