dgerard@awful.systems 19 points 2 months ago

Basically there isn't significant improvement to be had in the tokeniser, because the models have already been trained on all the data on Earth. So all they have left is overengineering.

corbin@awful.systems 9 points 2 months ago

Calling it now: codepoint-level non-tokenizing, with a remapping step that only recognizes the few thousand most popular codepoints, would outperform what OpenAI has forced themselves into using. The evidence is circumstantial but strong; for example, arithmetic isn't learned properly because BPE tokenizers obscure Arabic digits by merging runs of them into opaque multi-digit tokens. They can't backpedal on this without breaking some of their API and re-pretraining a model, and they make a big deal about how expensive GPT pretraining is, so they're stuck in their local minimum.
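
A minimal sketch of the scheme corbin describes, assuming a frequency-cut codepoint vocabulary with one reserved unknown id (all function and variable names here are hypothetical, not anything OpenAI ships):

```python
from collections import Counter

def build_codepoint_vocab(corpus, vocab_size=4096):
    """Map the most popular codepoints to ids; everything else is unknown."""
    counts = Counter(ch for text in corpus for ch in text)
    vocab = {}  # id 0 is reserved for unknown codepoints
    for ch, _ in counts.most_common(vocab_size - 1):
        vocab[ch] = len(vocab) + 1
    return vocab

def encode(text, vocab, unk_id=0):
    """One id per codepoint: no merges, no learned token boundaries."""
    return [vocab.get(ch, unk_id) for ch in text]

corpus = ["the quick brown fox", "12 + 34 = 46"]
vocab = build_codepoint_vocab(corpus)
print(encode("12 + 34 = 46", vocab))  # each digit keeps its own stable id
```

The arithmetic point follows directly: under this scheme "1234" always encodes to the same four per-digit ids, whereas a BPE vocabulary merges digit runs into chunks whose boundaries shift with context.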

anton@lemmy.blahaj.zone 6 points 2 months ago

But then it can't SolidGoldMagikarp SolidGoldMagikarp SolidGoldMagikarp SolidGoldMagikarp
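
(Context for the joke: " SolidGoldMagikarp" is the best-known of the "glitch tokens", strings that earned their own entry in the GPT-2/GPT-3 BPE vocabulary but appeared almost never in training, so models of that era responded to them erratically. A pure codepoint vocabulary has no such under-trained entries to glitch on. A quick check with the tiktoken library, assuming it is installed:)

```python
import tiktoken

# r50k_base is the GPT-2/GPT-3-era encoding bundled with tiktoken.
enc = tiktoken.get_encoding("r50k_base")
print(enc.encode(" SolidGoldMagikarp"))  # one id: the whole string is a single BPE token
```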

UnseriousAcademic@awful.systems 4 points 2 months ago

The only viable use case, in my opinion, is to utilise its strong abilities in SolidGoldMagikarp to actualise our goals in the SolidGoldMagikarp sector and achieve increased margins on SolidGoldMagikarp.
