White House Wants Public Input on Artificial Intelligence

(RepublicanNews.org) – With the rise of powerful artificial intelligence programs such as ChatGPT, the federal government wants to hear from the public about how best to regulate these computer systems.

The National Telecommunications and Information Administration (NTIA) is an agency within the Commerce Department that advises the White House on communications and information policy. On April 11, the NTIA asked the public for input as it develops policies “to ensure artificial intelligence (AI) systems work as claimed—and without causing harm.”

What might that “harm” look like? The possibilities seem endless. Anyone who has used ChatGPT will quickly see how unnervingly human-like its written responses to users’ questions can be. It is often difficult or impossible to tell that the text it produces was written by a machine rather than a person.

It is not only that ChatGPT can give specific, understandable, and accurate answers to problems posed by users. It can also do so in a way that convinces a user he or she is talking to a human, not a robot.

How will users know whether the answers are real or fabricated? What about news reports written by ChatGPT? These are the questions the NTIA and data scientists are trying to answer.

There are already 100 million active users of ChatGPT around the globe.

Alan Davidson, the NTIA’s administrator, said that “responsible AI systems could bring enormous benefits,” but that humans first have to figure out how to handle the potential damage such sophisticated “intelligence” can do. He added that businesses and consumers need to be able to trust these tools if they are to reach their full potential.

Last week, President Joe Biden said it was still unclear whether artificial intelligence is more dangerous than helpful. “Tech companies have a responsibility, in my view, to make sure their products are safe before making them public,” he said.

Copyright 2023, RepublicanNews.org