According to former employees, the AI startup EvenUp misled investors about the capabilities of its technology. The company claimed to have developed an AI system that could understand and summarize long documents, but employees said the technology often produced inaccurate or nonsensical outputs, a failure mode known in AI as “hallucination.” EvenUp’s CEO denied the allegations, stating that the company’s technology worked as advertised. Former employees, however, provided examples of the system’s errors, including a summary that described a document about the Iraq War as being about the Vietnam War. The allegations raise concerns that AI companies may overstate their capabilities to attract funding, and they highlight the difficulty of building reliable AI systems for complex tasks such as text summarization.
Source: https://www.businessinsider.com/evenup-ai-errors-hallucinations-former-employees-2024-11