Matburnx

joined 1 year ago
[–] Matburnx@sh.itjust.works 2 points 11 months ago (1 children)

Well, I use my laptop as a daily driver, so training an AI in the background even when I'm not using it seems a bit complicated. The Markov chain seems like an interesting alternative for what I'm looking for. Do any tools exist for using one, or should I build one from scratch?
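In case it helps anyone else wondering the same thing: a word-level Markov chain is simple enough to build from scratch with just the standard library. A minimal sketch (the function names and the tiny corpus are illustrations, not from any particular tool):

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each tuple of `order` consecutive words to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=30, seed=None):
    """Walk the chain, picking a random successor of the current state at each step."""
    state = seed or random.choice(list(chain))
    out = list(state)
    for _ in range(length):
        successors = chain.get(tuple(out[-len(state):]))
        if not successors:
            break  # dead end: this state was only ever seen at the end of the corpus
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "le chat dort le chat mange le chien dort le chien court"
chain = build_chain(corpus, order=1)
print(generate(chain, length=10, seed=("le",)))
```

Libraries like markovify also exist if you'd rather not roll your own.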

[–] Matburnx@sh.itjust.works 1 points 11 months ago (1 children)

That seems pretty disappointing. It seemed to me like it could have been somewhat possible. I've trained a 0.8M-parameter model and it was spitting out something that looked like French, though not actual French. So I need to test it, but I feel like if I do it with a few million parameters it could work. It still wouldn't be coherent, but at least it could form real sentences. Again, I don't know much about this, so I'm surely wrong. I also think the dataset may be the issue: I didn't use a general-purpose dataset, only French books in a txt file.
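For anyone curious how a 0.8M-parameter config pencils out: in a GPT-2-style transformer, most parameters come from the embeddings plus roughly 12·n_embd² per layer. A back-of-the-envelope sketch (the example config is hypothetical, and biases/layernorms are ignored):

```python
def gpt_params(n_layer, n_embd, vocab_size, block_size):
    """Rough parameter count for a GPT-2-style model (ignores biases and layernorms)."""
    embeddings = vocab_size * n_embd + block_size * n_embd  # token + position embeddings
    attention = 4 * n_embd * n_embd   # q, k, v projections + output projection
    mlp = 8 * n_embd * n_embd         # two linear layers with a 4x hidden width
    return embeddings + n_layer * (attention + mlp)

# Hypothetical tiny config that lands near 0.8M params with a char-level vocab
print(gpt_params(n_layer=4, n_embd=128, vocab_size=100, block_size=256) / 1e6, "M params")
```

At char level the vocabulary is tiny, so almost all the budget goes into the transformer blocks rather than the embedding table.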

 

Hi, I'm currently starting to learn how LLMs work in depth, so I started using nanoGPT to understand how to train a model, and I'd like to play around with the code a little more. I've set myself the goal of training a model that can write basic French. It doesn't have to be coherent or deep in its writing, just French with correct grammar. I only have a laptop without a proper GPU, so I can't really train a model with billions of parameters. Do you think it's possible without too large a dataset or too intensive training? Would it be a better idea to use something different from nanoGPT?

TLDR: I'd like to train my own LLM on my laptop, which doesn't have a GPU. It's only for learning purposes, so my goal is just for it to write basic French. Is it doable? If it is, do you have any tips to make this easier?
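For context, nanoGPT's character-level examples prepare data roughly like this: build a vocabulary from the raw text, encode everything to integer IDs, and split into train/validation sets. A minimal sketch of that idea (the sample text and the 90/10 split are illustrative, not nanoGPT's exact script):

```python
# Char-level dataset prep in the spirit of nanoGPT's shakespeare_char example.
text = "Bonjour le monde. Bonjour à tous."  # stand-in for your French books .txt

chars = sorted(set(text))                    # the vocabulary: every distinct character
stoi = {ch: i for i, ch in enumerate(chars)} # char -> integer id
itos = {i: ch for i, ch in enumerate(chars)} # integer id -> char

def encode(s):
    return [stoi[c] for c in s]

def decode(ids):
    return "".join(itos[i] for i in ids)

data = encode(text)
n = int(0.9 * len(data))                     # 90/10 train/validation split
train_ids, val_ids = data[:n], data[n:]

print("vocab size:", len(chars))
print(decode(encode("Bonjour")))
```

With a char-level vocab of under ~200 symbols for French (accents included), a model small enough for CPU training can still learn spelling and word shapes, which matches what you're describing.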

[–] Matburnx@sh.itjust.works 3 points 1 year ago

I just did that and cleaned all the models I downloaded and didn't want to use anymore. Thanks for the advice :)

[–] Matburnx@sh.itjust.works 2 points 1 year ago

I used openplayground a little after it was released, and I installed it through pip, so the models were downloaded somewhere in the Python files. I used a piece of software that shows file sizes, as taladar suggested, and found them that way. Thanks for all the help, though :)

[–] Matburnx@sh.itjust.works 1 points 1 year ago (2 children)

Oh ok, thanks, this one was pretty easy. Do you have any idea where the models downloaded by openplayground are?

 

This might seem like a dumb question, but my disk space is currently pretty low and I'd like to clean some of my files.

A lot of space has been taken up by the models I downloaded with different projects, like the ones from oobabooga or LocalGPT; however, I can't find the folder where they were downloaded. So I'd like to know if anyone knows where it is.

I'm on Windows, if that changes anything. Thanks in advance for your answers.
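In case it helps future readers: many pip-installed projects cache their weights under Hugging Face's default cache folder, `~/.cache/huggingface/hub` (on Windows, `C:\Users\<you>\.cache\huggingface\hub`), unless the `HF_HOME` environment variable points elsewhere. A small sketch to list the largest entries there:

```python
import os
from pathlib import Path

# Default Hugging Face cache location (overridable via the HF_HOME env var)
cache = Path(os.environ.get("HF_HOME", Path.home() / ".cache" / "huggingface")) / "hub"

def dir_size(path):
    """Total size in bytes of all files under `path`, recursively."""
    return sum(f.stat().st_size for f in path.rglob("*") if f.is_file())

if cache.exists():
    entries = sorted(((dir_size(d), d.name) for d in cache.iterdir() if d.is_dir()),
                     reverse=True)
    for size, name in entries:
        print(f"{size / 1e9:7.2f} GB  {name}")
else:
    print(f"No cache found at {cache}")
```

Note that some projects (oobabooga's text-generation-webui, for instance) keep weights in their own `models/` folder instead, so it's worth checking the project directory too.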