robber

joined 2 years ago
[–] robber@lemmy.ml 54 points 2 months ago (2 children)

Swiss lemming here. Switzerland has open-sourced multiple projects before, most notably the app for storing your COVID test / vaccination status - it was even officially available on F-Droid. I was very surprised at the time, and I'm happy to see there are now efforts to make such behavior more the default and less of an edge case, even if there are exceptions.

[–] robber@lemmy.ml 1 point 2 months ago

Yes, you can just go ahead and install nix on your distro and use e.g. nix-shell to create a development environment.
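
If it helps, a minimal example (the package names from nixpkgs are just my assumption about what you'd want):

```
# drop into a temporary shell that has node and python available,
# without installing anything system-wide
nix-shell -p nodejs python3
```

The tools only exist inside that shell; your distro's own package manager is left alone.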

[–] robber@lemmy.ml 5 points 3 months ago (2 children)

I recently switched to Debian and use nix to install / provide the likes of node / python / go for development.
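
Roughly like this, as a sketch (the exact nixpkgs attribute names and project layout are my assumptions) - a shell.nix in the project root, then running nix-shell drops me into an environment with those tools:

```
# shell.nix - minimal per-project dev environment
{ pkgs ? import <nixpkgs> {} }:

pkgs.mkShell {
  # provided only inside `nix-shell`, nothing gets installed globally on Debian
  packages = [
    pkgs.nodejs
    pkgs.python3
    pkgs.go
  ];
}
```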

[–] robber@lemmy.ml 106 points 3 months ago (7 children)

Step by step, it seems, YouTube is evolving into something that has previously been called TV.

[–] robber@lemmy.ml 1 point 3 months ago

Totally agree, that would be even better.

[–] robber@lemmy.ml 46 points 3 months ago (4 children)

I totally agree about making it optional, but I have to say the idea of auto-generating alt text sounds like a really useful application of AI - hardly anyone likes writing it manually, yet a significant number of beautiful people rely on it.

[–] robber@lemmy.ml 1 point 3 months ago

Same problem here - my company requires 2FA for remote network access. MS Authenticator requires Google Services on Android, which I don't have, so no working from home for me, I guess.

[–] robber@lemmy.ml 3 points 3 months ago* (last edited 3 months ago)

To be fair, there are a lot of Flatpaks published by the devs themselves (especially in the Gnome/GTK ecosystem).

[–] robber@lemmy.ml 2 points 3 months ago (1 children)

Sounds like a rather frustrating journey for you.

[–] robber@lemmy.ml 3 points 3 months ago (1 children)

Thanks! Glad to see the 8x7B isn't performing too badly - I assume that's a Mistral model? Also, do you know whether the CPU significantly affects inference speed in such a setup?

[–] robber@lemmy.ml 4 points 3 months ago (3 children)

So you access the models directly via terminal? Is that convenient? Also, do you get satisfying inference speed and quality with a 16GB card?

[–] robber@lemmy.ml 9 points 4 months ago

IIRC extensions are sadly not a part of stable Gnome Web yet.
