NEW YORK — Computer-generated writers … writing computer-generated stories?
Sports Illustrated is the latest media company to see its reputation damaged by being less than forthcoming – if not outright dishonest – about who or what is writing its stories at the dawn of the artificial intelligence age.
The once-powerful publication said it was firing a company that produced articles for its website under the bylines of authors who apparently don’t exist. But it denied a published report that the stories themselves were written by an artificial intelligence tool.
Earlier this year, experiments with AI went awry at both the Gannett newspaper chain and the CNET technology website. Many companies are testing the new technology at a time when human workers fear it could cost jobs. But the process is fraught in journalism, which builds and markets its values-based products around the notions of truth and transparency.
While there’s nothing wrong in media companies experimenting with artificial intelligence, “the mistake is in trying to hide it, and in doing it poorly,” said Tom Rosenstiel, a University of Maryland professor who teaches journalism ethics.
“If you want to be in the truth-telling business, which journalists claim they do, you shouldn’t tell lies,” Rosenstiel said. “A secret is a form of lying.”
CONFLICTING ACCOUNTS OF WHAT HAPPENED
Sports Illustrated, now run by the Arena Group as a website and monthly publication, was at one time a weekly in the Time Inc. stable of magazines known for its sterling writing. “Its ambitions were grand,” said Jeff Jarvis, author of “Magazine,” a book he describes as an elegy for the industry.
On Monday, the Futurism website reported that Sports Illustrated had published product reviews under the bylines of authors it could not identify. Futurism found a picture of one listed author, Drew Ortiz, on a website that sells AI-generated portraits.
The magazine’s author profile said that “Drew has spent much of his life outdoors, and is excited to guide you through his never-ending list of the best products to keep you from falling to the perils of nature.”
After Futurism questioned Sports Illustrated, it said, all of the authors with AI-generated portraits disappeared from the magazine’s website. No explanation was offered.
Futurism quoted an unnamed person at the magazine who said artificial intelligence was used in the creation of some content as well – “no matter how much they say that it’s not.”
Sports Illustrated said the articles in question were created by a third-party company, AdVon Commerce, which assured the magazine that they were written and edited by humans. AdVon had its writers use pen names, “actions we don’t condone,” Sports Illustrated said.
“We are removing the content while our internal investigation continues and have since ended the partnership,” the magazine said. A message to AdVon wasn’t immediately returned on Tuesday.
In a statement, the Sports Illustrated Union said it was horrified by the Futurism story.
“We demand answers and transparency from Arena group management about what exactly has been published under the SI name,” the union said. “We demand the company commit to adhering to basic journalistic standards, including not publishing computer-written stories by fake people.”
NOT THE FIRST SUCH SITUATION
Gannett paused an experiment at some of its newspapers this summer in which AI was used to generate articles on high school sports events, after errors were discovered. The articles carried the byline “LedeAI.”
Some of the unpleasant publicity that resulted might have been avoided if the newspapers had been explicit about the role of the technology and how it helped create articles that journalists would not otherwise have been available to write, Jarvis said. Gannett said a lack of staff had nothing to do with the experiment.
This past winter, it was reported that CNET had used AI to create explanatory news articles about financial services topics attributed to “CNET Money Staff.” The only way for readers to learn that technology was involved in the writing was to click on that author attribution.
Only after its experiment was discovered and written about by other publications did CNET discuss it with readers. In a note, then-editor Connie Guglielmo said that 77 machine-generated stories were posted, and that several required corrections. The site subsequently made it clearer when AI is being used in story creation.
“The process may not always be easy or pretty, but we’re going to continue embracing it, and any new technology that we believe makes life better,” Guglielmo wrote.
Other companies have been more up front about their experiments. BuzzFeed, for example, attributed a travel article on Santa Barbara, Calif., to writer Emma Heegar and Buzzy the Robot, “our creative AI assistant.”
“We’ll be developing content that is AI-native – cool new things that you couldn’t do at all without AI – and things that are enhanced by AI but created by humans,” BuzzFeed said in a note to readers.
The Associated Press has been using technology to assist in articles about financial earnings reports since 2014, and more recently in some sports stories. At the end of each such story is a note that explains technology’s role in its production, a spokeswoman said.
For instance, a short article about an upcoming NBA matchup earlier this month had this note at the end: “The Associated Press created this story using technology provided by Data Skrive and data from Sportradar.”