As the Popo Agie River wends its way down from the glaciers atop Wyoming’s Wind River Mountains toward the city of Lander, it flows into a limestone cave and disappears. The formation, known as the Sinks, spits the river back out a quarter of a mile east at another feature called the Rise, where the water emerges a little more voluminous and a little warmer, and brown and rainbow trout weighing as much as 10 pounds mingle in its now-smooth pools. The quarter-mile journey from the Sinks to the Rise takes the river two hours.
Scientists first discovered this quirk of the middle fork of the Popo Agie (pronounced puh-PO-zhuh) in 1983 by pouring red dye into the river upstream and waiting for it to resurface. Geologists attribute the river’s mysterious delay to the water passing through exceedingly small crevices in the rock that slow its flow.
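For a rough sense of just how slow that underground passage is, here is a back-of-the-envelope calculation in Python of the river’s implied average speed between the Sinks and the Rise, using the quarter-mile distance and two-hour transit time reported above. The comparison speed for a free-flowing river is an assumed ballpark figure, not a measurement.

```python
# Implied average speed of the Popo Agie between the Sinks and the Rise.
# Distance and transit time come from the dye-trace result described above;
# the open-channel comparison speed is an assumed ballpark value.
MILE_IN_METERS = 1609.34

distance_m = 0.25 * MILE_IN_METERS  # quarter mile from the Sinks to the Rise
time_s = 2 * 3600                   # two hours underground

underground_speed = distance_m / time_s  # about 0.056 m/s
assumed_open_channel_speed = 1.0         # assumed ~1 m/s for a free-flowing river

print(f"Underground speed: {underground_speed:.3f} m/s")
print(f"Roughly {assumed_open_channel_speed / underground_speed:.0f}x slower "
      "than an assumed free-flowing speed of 1 m/s")
```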
Like many rivers in the arid West, the Popo Agie is a vital water source. Ranchers, farmers, businesses, and recreationists rely on detailed data about it, especially day-to-day streamflow measurements. That’s exactly the type of empirical information collected by the U.S. Geological Survey (USGS).
On Saturday, an Associated Press investigation revealed that OpenAI's Whisper transcription tool creates fabricated text in medical and business settings despite warnings against such use. The AP interviewed more than 12 software engineers, developers, and researchers who found the model regularly invents text that speakers never said, a phenomenon often called a "confabulation" or "hallucination" in the AI field.
Upon its release in 2022, OpenAI claimed that Whisper approached "human level robustness" in audio transcription accuracy. However, a University of Michigan researcher told the AP that Whisper created false text in 80 percent of public meeting transcripts examined. Another developer, unnamed in the AP report, claimed to have found invented content in almost all of his 26,000 test transcriptions.
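To make that failure mode concrete, here is a minimal sketch of how a developer might probe Whisper for invented text, assuming the open-source openai-whisper and jiwer packages; the audio file and reference transcript below are placeholders. Words that jiwer counts as insertions are exactly the kind of content the researchers describe: text Whisper produced that the speaker never said.

```python
# A minimal sketch of checking a Whisper transcript against a known reference.
# Assumes `pip install openai-whisper jiwer`; the audio path and reference
# string below are placeholders, not real test data.
import whisper
from jiwer import process_words

model = whisper.load_model("base")             # one of the smaller Whisper models
result = model.transcribe("meeting_clip.wav")  # hypothetical audio file
hypothesis = result["text"].strip()

# Ground-truth transcript of the same clip (placeholder text).
reference = "Please confirm the patient's follow-up appointment for Tuesday."

# process_words aligns the two texts and tallies error types; insertions
# are words present in Whisper's output but absent from the reference.
measures = process_words(reference, hypothesis)
print(f"Word error rate: {measures.wer:.2%}")
print(f"Inserted (invented) words: {measures.insertions}")
```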
The fabrications pose particular risks in health care settings. Despite OpenAI's warnings against using Whisper for "high-risk domains," over 30,000 medical workers now use Whisper-based tools to transcribe patient visits, according to the AP report. The Mankato Clinic in Minnesota and Children's Hospital Los Angeles are among 40 health systems using a Whisper-powered AI copilot service from medical tech company Nabla that is fine-tuned on medical terminology.