this post was submitted on 04 Aug 2023
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
you are viewing a single comment's thread
Wouldn't go amiss.
Okay, point taken. I've been guilty of lurking inappropriately and I can model the consequences of that.
I have a reasonable amount of direct, purposeful experience with llama.cpp and 7B/13B/30B models to offer. And there's a context: I'm exploring its potential role in generating content for a sci-fi web comic project. Hardly groundbreaking, I know, but I'm hoping it'll help me create something outside the box.
For additional context, I'm a cognitive psychologist by discipline and a cognitive scientist by profession (now retired), and I worked in classic AI back in the day.
Over on TheBloke's Discord server, I've been posting the responses of a small variety of pre-trained LLMs to the 50 questions of the OCEAN personality questionnaire, with each question presented 25 times per model. I was curious to see whether any reliable pattern emerged from the pre-training:
OCEAN questionnaire full-size jpeg
Looks like the larger models enable a wider range of responses; I guess that's an expected consequence of a smoother manifold.
Happy to answer any questions that people may have and will be posting more in future.
Cheers, Graham
I would love to see more of this, and maybe make it its own post for more traction and discussion. Do you have a link to those pictures elsewhere? Can't seem to get a large version loaded on desktop, haha.
I edited the post to include a link to the Discord image. If there's interest I can make a post with more details (I used Python's pexpect to communicate with a spawned llama.cpp process).
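For anyone curious about the mechanics, here is a minimal sketch of that pexpect approach. The binary path, model file, and the "> " prompt marker are assumptions for illustration (llama.cpp's interactive mode prints a prompt when run with `-i`); the `ask` helper is a hypothetical name, not part of any library.

```python
# Sketch: driving an interactive llama.cpp session with pexpect.
# Paths, flags, and the "> " prompt marker below are assumptions,
# not a definitive recipe.
import pexpect

def ask(child: pexpect.spawn, question: str, timeout: int = 120) -> str:
    """Send one questionnaire item and capture everything up to the next prompt."""
    child.sendline(question)
    # Block until the process prints its prompt again, then return
    # the text that appeared before it (i.e. the model's reply).
    child.expect_exact("> ", timeout=timeout)
    return child.before.decode(errors="replace")

# Hypothetical usage against a llama.cpp binary:
# child = pexpect.spawn("./main -m models/7B/model.gguf -i -r 'User:'")
# child.expect_exact("> ")
# reply = ask(child, "I am the life of the party. Agree or disagree?")
```

Repeating the 50 OCEAN items 25 times each is then just a loop over `ask`, logging each reply for later tabulation.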