[–] s-kostyaev@alien.top 1 points 10 months ago

See also [this reply](https://github.com/s-kostyaev/ellama/issues/13#issuecomment-1807954046) for suggestions you can try when using ellama on weak hardware.

[–] s-kostyaev@alien.top 1 points 10 months ago

But you can use it with the OpenAI or Google API.
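As a rough sketch, ellama's backend is selected through the `llm` package it builds on, so a hosted provider can be configured like this (the key and model name below are placeholders, not recommendations; `setopt` needs Emacs 29+):

```elisp
;; Sketch: point ellama at a hosted OpenAI-compatible provider
;; via the llm-openai backend. Key and model are placeholders.
(require 'llm-openai)
(setopt ellama-provider
        (make-llm-openai :key "YOUR-OPENAI-KEY"
                         :chat-model "gpt-4o"))
```

A Google backend (e.g. `llm-gemini`) can be swapped in the same way by constructing the corresponding provider object.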

[–] s-kostyaev@alien.top 1 points 10 months ago

Most Chromebooks have very weak hardware. I don't think it will run fast enough to be useful.

[–] s-kostyaev@alien.top 1 points 10 months ago (5 children)

This and other things are also possible with ellama. It also works with local models.
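For local models, a similar sketch uses the `llm-ollama` backend, assuming an Ollama server is running locally (the model name here is just an example):

```elisp
;; Sketch: use a local model served by Ollama as ellama's provider.
(require 'llm-ollama)
(setopt ellama-provider
        (make-llm-ollama :chat-model "zephyr"
                         :embedding-model "zephyr"))
```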