this post was submitted on 05 Mar 2024
61 points (94.2% liked)

AI

4117 readers
14 users here now

Artificial intelligence (AI) is intelligence demonstrated by machines, unlike the natural intelligence displayed by humans and animals, which involves consciousness and emotionality. The distinction between the former and the latter categories is often revealed by the acronym chosen.

founded 3 years ago
top 16 comments
[–] original_reader@lemm.ee 12 points 8 months ago
[–] desmosthenes@lemmy.world 8 points 8 months ago

it works surprisingly well on Apple ARM architecture; responses are typically near-instant

[–] gandalf_der_12te@feddit.de 7 points 8 months ago* (last edited 8 months ago) (1 children)

Where did this come from? Did OpenAI finally release the source code? And where does the training data come from? Is the training data public domain/appropriately licensed?

[–] desmosthenes@lemmy.world 1 points 8 months ago

that’s all on the site - OpenAI offers an API for access to its models
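For context: many local runners expose an OpenAI-compatible chat endpoint, so switching between a local model and OpenAI's hosted API is mostly a matter of base URL and key. A minimal sketch of the request shape, assuming a hypothetical local server at `localhost:8080` and a hypothetical model name (both are placeholders, not confirmed by the thread):

```python
import json

# Assumed OpenAI-compatible endpoint of a local model server (hypothetical URL).
BASE_URL = "http://localhost:8080/v1/chat/completions"

payload = {
    "model": "mistral-7b-instruct",  # hypothetical local model name
    "messages": [{"role": "user", "content": "Hello!"}],
}

body = json.dumps(payload).encode()
# urllib.request.urlopen(BASE_URL, data=body) would send it; omitted here
# since no server is assumed to be running.
print(json.loads(body)["model"])
```

The same JSON body works against OpenAI's hosted endpoint with a different base URL and an API key header, which is why apps can offer both in one interface.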

[–] PlutoniumAcid@lemmy.world 7 points 8 months ago (1 children)

There's a long listing of models to choose from. How do I pick one? What are the differences and benefits/drawbacks?

[–] desmosthenes@lemmy.world 1 points 8 months ago

it’s gonna take experimentation; there’s a list of all models in the app and on the site, plus maybe a little googling. you can still use OpenAI too. Mistral is solid overall though, and good for programming

[–] Empricorn@feddit.nl 6 points 8 months ago (3 children)

Is running using a GPU a bad thing? I'm new to this...

[–] Fisch@lemmy.ml 13 points 8 months ago

No, a GPU would be ideal, but not everyone has one, especially one with enough VRAM. I have an AMD card with 12 GB of VRAM and can run 7B to 13B models, but even the 7B models (which seem to be the smallest that are still good) use a little more than 8 GB of VRAM, and most people probably have an Nvidia card with 8 GB or less. 13B models get very close to using the full 12 GB.
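The figures above follow from simple arithmetic: the weights alone take roughly (parameter count x bits per weight / 8) bytes, plus some headroom for the KV cache and activations. A rough sketch, where the 1.5 GB overhead is a hand-wavy assumption rather than a measured figure:

```python
def est_vram_gb(params_b: float, bits: int, overhead_gb: float = 1.5) -> float:
    """Rough VRAM estimate: weight size plus a fixed allowance for
    KV cache and activations (the overhead figure is an assumption)."""
    weights_gb = params_b * bits / 8  # params in billions -> GB at `bits` per weight
    return weights_gb + overhead_gb

# 7B at 8-bit: 7 + 1.5 = 8.5 GB, matching the "a little more than 8 GB"
# figure above. 13B at 8-bit would be ~14.5 GB, which is why 13B models
# push right up against a 12 GB card unless quantized more aggressively:
print(round(est_vram_gb(7, 8), 1))   # 8.5
print(round(est_vram_gb(13, 4), 1))  # 8.0
```

This is also why quantization matters so much: dropping a 13B model from 8-bit to 4-bit roughly halves the weight memory and brings it back within reach of an 8-12 GB card.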

[–] DoYouNot@lemmy.world 9 points 8 months ago

Not everyone has a dedicated GPU, I would guess. GPUs are good at tensor calculations, but they're not the only way to run these models.

[–] Sabata11792@kbin.social 2 points 8 months ago

It's better if you have a good GPU, but it will still run without a card from the last few years. It can run on the CPU, but it's much slower.

[–] DoucheBagMcSwag@lemmy.dbzer0.com 0 points 8 months ago (3 children)

Uh... how are they going to pay for the server load?

[–] sane@feddit.de 9 points 8 months ago (1 children)
[–] null 4 points 8 months ago

What server load?

[–] hswolf@lemmy.world 2 points 8 months ago (1 children)

you just need to pay your energy bill on time, that's all

[–] DoucheBagMcSwag@lemmy.dbzer0.com 1 points 8 months ago

I just read it's local