kaityy

joined 8 months ago
[–] kaityy@lemmy.blahaj.zone 18 points 5 months ago

luckily Element (the client I use) tends to be pretty good about actually surfacing notifications in an intuitive way. Although tbf I'm only in like 6 rooms/spaces total, so idk.

[–] kaityy@lemmy.blahaj.zone 6 points 5 months ago

Wait ur actually soo cutee whatt??? the stripes, the lighting and all the colors matching is so cutee omg, now I kinda want that outfit too hehe

[–] kaityy@lemmy.blahaj.zone 8 points 5 months ago

wow that's actually pretty cool though ngl

[–] kaityy@lemmy.blahaj.zone 4 points 5 months ago (3 children)

for some reason i feel like i shouldn't run that command

[–] kaityy@lemmy.blahaj.zone 16 points 5 months ago (3 children)

tbh idk what you mean. The stuff by my dresser? that's where it already belongs. Or do you mean my bed? there's no laundry on it haha

[–] kaityy@lemmy.blahaj.zone 37 points 5 months ago* (last edited 5 months ago) (12 children)

my bad. It's a federated messaging app, kinda like Discord (unlike Lemmy, it's not based on ActivityPub, so it's not technically part of the so-called 'fediverse'). Most people are more familiar with the most popular client used to access Matrix, which is Element. Maybe I should've opened with that, idk haha.

Anyways, it doesn't have to be on Matrix/Element, just hoping there's something that isn't a shitty app. I need more queer online friends lol.
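
(for the curious, here's roughly what the "federated" part looks like from code — just a minimal sketch I threw together using the matrix-nio Python library, and the homeserver, account, password, and room ID in it are all made-up placeholders)

```python
# minimal sketch using matrix-nio (an async Matrix client library for Python)
# everything below (homeserver, user, password, room id) is a made-up placeholder
import asyncio
from nio import AsyncClient

async def main():
    # each account lives on a homeserver; homeservers federate with each other,
    # which is what makes Matrix "federated" rather than one central service
    client = AsyncClient("https://matrix.example.org", "@kaityy:matrix.example.org")
    await client.login("not-a-real-password")

    # send a plain text message into a room (rooms can span multiple servers)
    await client.room_send(
        room_id="!someRoomId:matrix.example.org",
        message_type="m.room.message",
        content={"msgtype": "m.text", "body": "hi from a script!"},
    )
    await client.close()

asyncio.run(main())
```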

[–] kaityy@lemmy.blahaj.zone 22 points 5 months ago

okay that actually is funny yeah i like it

[–] kaityy@lemmy.blahaj.zone 23 points 5 months ago (2 children)

this shit prolly funny as fuck

[–] kaityy@lemmy.blahaj.zone 1 points 5 months ago (3 children)

At least with the more advanced LLMs (and I'd assume the same goes for stuff like image processing and generation), it takes a pretty considerable amount of GPU memory just to get the thing to run at all, and even more compute to actually spit something out. Some people have enough to run the basics, but most laptops simply couldn't handle it, and very few people have the hardware to get the kind of outputs the more advanced AIs produce.

Now, that's not to say it shouldn't be an option, or that it's fine for them to force some remote AI baked into your proprietary OS that you can't remove without breaking the user license agreement. Just saying it's unfortunately harder to implement locally than we both probably wish it was.
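
Like, just as a rough back-of-the-envelope sketch (my own assumptions, not anything official): the weights alone take roughly parameter-count × bytes-per-parameter of VRAM, before you even count activations or the KV cache.

```python
# rough back-of-the-envelope sketch, my own assumptions:
# weight memory ~= parameter count * bytes per parameter, plus some overhead
# for activations / KV cache (the 1.2x factor is a guess, not a measured number)

def estimate_vram_gb(params_billion: float,
                     bytes_per_param: float = 2.0,  # fp16 weights
                     overhead: float = 1.2) -> float:
    """Very rough VRAM needed just to load and run a model, in GB."""
    weight_gb = params_billion * bytes_per_param  # 1e9 params * bytes ~= GB
    return weight_gb * overhead

print(estimate_vram_gb(7))    # ~16.8 GB -- already more than most laptop GPUs have
print(estimate_vram_gb(70))   # ~168 GB  -- basically datacenter territory
```

Quantizing down to 4-bit helps a lot, but even then the bigger models blow way past what a typical laptop GPU ships with.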

[–] kaityy@lemmy.blahaj.zone 8 points 5 months ago (2 children)

im not sure why, but these are reminding me of something....... hmm, not sure what......

[–] kaityy@lemmy.blahaj.zone 12 points 5 months ago

me when 🥺
