NYC’s business chatbot is reportedly doling out ‘dangerously inaccurate’ information



Cheyenne is Engadget’s weekend editor and covers a little bit of everything. She’s particularly interested in emerging technology and niche gadgets, climate change, space, privacy, and internet culture. She’ll talk your ear off about Tamagotchis if you get her started.

An AI chatbot released by the New York City government to help business owners access pertinent information has been spouting falsehoods, at times even misinforming users about actions that are against the law, according to a report from The Markup. The report includes numerous examples of inaccuracies in the chatbot’s responses to questions about housing policies, workers’ rights and other topics. The chatbot is part of the city’s MyCity portal, billed as “a one-stop shop for city services and benefits.”

In The Markup’s tests, the chatbot repeatedly provided incorrect information. In response to the question, “Can I make my store cashless?”, for example, it replied, “Yes, you can make your store cashless in New York City,” even though New York City banned cashless stores in 2020.

