Such a thing exists already (Better way to get to information - #7 by Tomasz), but due to the limitations of LLMs (hallucinations) it requires human approval/moderation. LLMs are simply too unreliable, even when they are provided with the right context.