• 0 Posts
  • 13 Comments
Joined 2 years ago
Cake day: August 7th, 2023

  • That isn’t necessarily true, though for now there’s no way to tell since they’ve yet to release their code. If the timeline is anything like their last paper’s, it will be out around a month after publication, which will be Nov 20th.

    There have been similar papers on adversarial perturbations that confuse image classification models; I’m not sure how successful they’ve been IRL.
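
    The core trick in those image-classifier papers is an adversarial perturbation: nudge the input slightly in the direction that most changes the model’s output. A minimal FGSM-style sketch, using a made-up linear scorer so the gradient is analytic (the names, sizes, and epsilon here are illustrative, not from any specific paper):

    ```python
    import numpy as np

    def fgsm_perturb(x, w, eps):
        """Nudge input x against a linear score w.x by stepping eps
        in the sign of the gradient (FGSM-style, toy linear model)."""
        grad = w  # d(w.x)/dx for a linear scorer is just w
        return x - eps * np.sign(grad)  # step that lowers the score

    rng = np.random.default_rng(0)
    w = rng.normal(size=8)      # hypothetical model weights
    x = rng.normal(size=8)      # hypothetical input
    x_adv = fgsm_perturb(x, w, eps=0.1)

    # A small, barely-visible change to x reliably lowers the score:
    print(w @ x, "->", w @ x_adv)
    ```

    Real attacks do the same thing against a deep network’s gradient instead of a linear one, which is why the perturbations can look like imperceptible noise to a human.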







  • LLMs only predict the next token. Sometimes those predictions are correct, sometimes they’re incorrect. Larger models trained on a greater number of examples make better predictions, but they are always just predictions. This is why incorrect responses often sound plausible even when they don’t make logical sense.

    Fixing hallucinations is more about decreasing the rate of inaccurate predictions than about fixing an actual flaw in the model itself.
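
    The point above can be sketched in a few lines: the model emits a probability distribution over its vocabulary, and the "answer" is just whichever continuation scores highest — plausible by construction, but never checked for truth. The tiny vocabulary and logits below are made up for illustration:

    ```python
    import numpy as np

    vocab = ["the", "sky", "is", "blue", "green"]
    # Hypothetical logits the model might output for the token after "the sky is"
    logits = np.array([0.1, 0.2, 0.3, 2.5, 1.9])

    # Softmax turns logits into a probability distribution over the vocabulary
    probs = np.exp(logits) / np.exp(logits).sum()

    # The prediction is simply the highest-probability token
    next_token = vocab[int(np.argmax(probs))]
    print(next_token)  # -> blue
    ```

    Note that "green" also gets substantial probability; sampling instead of taking the argmax would sometimes emit it, and the sentence would still read as fluent. That is a hallucination in miniature: a confident-sounding token with no fact-checking behind it.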