this post was submitted on 30 Jun 2023

LocalLLaMA

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

[–] notfromhere@lemmy.one 1 points 1 year ago (1 children)

I read the author's blog post on SuperHOT, and it sounded like it kept perplexity low even at large contexts. I could have read it wrong, but I thought it wasn't supposed to increase perplexity at all.
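
For background, SuperHOT extends the context window with linear RoPE position interpolation: position indices are scaled down so an 8K sequence maps into the position range the base model saw during its 2K pretraining. Here is a minimal NumPy sketch of the idea; the head dimension and the 0.25 scale factor (which corresponds to a 2K-to-8K extension) are illustrative, not taken from the blog post.

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0, scale=1.0):
    """RoPE rotation angles for each (position, frequency) pair.

    scale < 1 implements linear position interpolation: positions
    are compressed so that longer sequences stay within the range
    the model was pretrained on.
    """
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(positions * scale, inv_freq)  # shape (seq_len, dim/2)

# Base model: trained on positions 0..2047.
orig = rope_angles(np.arange(2048), dim=128)

# SuperHOT-style: 8192 positions squeezed into the same angle range.
interp = rope_angles(np.arange(8192), dim=128, scale=0.25)

# Position 4*p under interpolation lands exactly where position p did
# originally, so short prompts get compressed too, even though they
# would have fit without scaling. That is one plausible reason a 2K
# prompt could score worse on a SuperHOT merge than on the base model.
assert np.allclose(interp[4 * 100], orig[100])
```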

[–] simple@lemmy.mywire.xyz 2 points 1 year ago

The increase in perplexity is very small at 8K context, but it's still there. With 2K context it seems much larger. I could be misunderstanding something myself, but my little test at 2K does suggest something is going on with 2K contexts on SuperHOT models.
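
This is not the commenter's actual script, but a minimal sketch of this kind of test: sliding-window perplexity at a fixed context length using the Hugging Face transformers API. The model path and eval file are placeholders; swap in the SuperHOT merge and the text being evaluated.

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/superhot-merge"  # placeholder model path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
model.eval()

def perplexity(text, ctx_len):
    """Perplexity over non-overlapping windows of ctx_len tokens."""
    ids = tokenizer(text, return_tensors="pt").input_ids[0]
    total_nll, n_tokens = 0.0, 0
    for i in range(0, ids.size(0) - ctx_len + 1, ctx_len):
        window = ids[i : i + ctx_len].unsqueeze(0).to(model.device)
        with torch.no_grad():
            # labels are shifted internally; .loss is mean NLL per token
            loss = model(window, labels=window).loss
        total_nll += loss.item() * (ctx_len - 1)
        n_tokens += ctx_len - 1
    return math.exp(total_nll / n_tokens)

text = open("eval.txt").read()  # placeholder eval corpus
for n in (2048, 4096, 8192):
    print(f"ctx={n}: ppl={perplexity(text, n):.2f}")
```

Comparing the 2048 result against the unpatched base model at the same context length would isolate the effect the commenter is describing.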