s-kostyaev
Most Chromebooks have very weak hardware. I don't think it will run fast enough to be useful.
But you can use it with the OpenAI or Google API.
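For example, a remote provider can be configured through the `llm` package that ellama is built on. This is a minimal sketch, not a complete setup: the API key and model name are placeholders you would replace with your own.

```elisp
;; Sketch: point ellama at a remote OpenAI-compatible provider
;; instead of a local model. Assumes the `llm' package is installed;
;; the key and chat model below are placeholder values.
(require 'llm-openai)
(setopt ellama-provider
        (make-llm-openai :key "YOUR-OPENAI-API-KEY"
                         :chat-model "gpt-3.5-turbo"))
```

A Google provider can be set up the same way by swapping in the corresponding `llm` provider constructor.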
See also [this reply](https://github.com/s-kostyaev/ellama/issues/13#issuecomment-1807954046) about using ellama on weak hardware:
You can try: