this post was submitted on 28 Jul 2023
25 points (96.3% liked)

LocalLLaMA

2249 readers

Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

founded 1 year ago
[–] j4k3@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Try asking it about reverse Polish (postfix) notation, then prompt it to solve: 3 3 +

We argued a bit; Ms. Example 7Bitchs earned her new name. 13B was less argumentative about corrections, but I couldn't find an angle that coaxed correct responses. Early programming languages handled Polish notation much better because it is stack based and linear, with no arbitrary precedence rules.

Also, coming from the early years of programming, I'm really surprised that no one has been training a model to code in a threaded interpreted language like Forth. It is super powerful and flexible, with far fewer rules and little arbitrary syntax, but most importantly it is linear and builds on itself word by word. Its core building mechanic is already tokenized.