Th4tGuyII

joined 5 months ago
[–] Th4tGuyII@fedia.io 10 points 4 months ago (1 children)

How would you even answer this question?

You can only "rescind it" by letting it expire, which is what they've done - the contract was fulfilled when they received their licence. Unless you can travel back in time, you can't just untake your driver's test and unreceive your licence, ergo you can't unfulfil the terms of the "contract".

[–] Th4tGuyII@fedia.io 3 points 4 months ago (1 children)

Tell you what, it certainly is a "special" military operation, considering it's taken over a year and at this point requires conscription... An awful lot like if Russia had started a war

[–] Th4tGuyII@fedia.io 10 points 4 months ago (11 children)

In the grand scheme of things, I suspect we actually don't have that much power to stop the industrial machine.

Even if every person on here, on Reddit, and on every left-leaning social media platform revolted against the powers that be right now, we wouldn't resolve anything. Not really. They'd send the military out, shoot us down (possibly quite literally), then go back to business as usual.

Unless there's a business incentive to change our ways, capitalism will not follow - instead it'll do everything it can to resist that change. By the time there's enough economic incentive, it'll be far too late to be worth fixing.

[–] Th4tGuyII@fedia.io 8 points 4 months ago* (last edited 4 months ago) (1 children)

I thought Earth was Neapolitan style, but it was Chicago deep dish the whole time! If only I'd known earlier!

[–] Th4tGuyII@fedia.io 35 points 4 months ago* (last edited 4 months ago)

As others have said, our reference of time comes from our own universe's rules.
Ergo if rendering 1 second of our time took 10 years of their time, we wouldn't measure 10 years, we'd measure 1 second, so we'd have no way of knowing.
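
A minimal sketch of that point, using the 10-years-per-second figure above (the constant and variable names are made up purely for illustration):

```python
# Hypothetical illustration: an observer inside a simulation can only
# measure simulated ticks, however long each tick takes to render outside.
HOST_SECONDS_PER_SIM_SECOND = 10 * 365 * 24 * 3600  # assume 10 host-years per simulated second

sim_clock = 0   # the only clock visible from inside
host_clock = 0  # rendering cost, paid entirely outside the simulation

for _ in range(5):  # render 5 simulated seconds
    host_clock += HOST_SECONDS_PER_SIM_SECOND
    sim_clock += 1

print(sim_clock)   # 5 - from inside, exactly 5 seconds passed
print(host_clock)  # enormous, but unmeasurable from inside
```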

It's worth remembering that simulation theory is, at least for now, unfalsifiable. By its nature there's always a counterargument to any evidence against it, therefore it always remains a non-zero possibility, just like how most religions operate.

[–] Th4tGuyII@fedia.io 1 points 4 months ago

At the time I got my current system, I went with a 1TB SSD for the main drive and a 4TB HDD as a data drive.

For my next system, I think I'll split that a bit more evenly, as most of my games end up on the HDD, which means they take a while to load

[–] Th4tGuyII@fedia.io 4 points 4 months ago

They're noisy, very noisy

[–] Th4tGuyII@fedia.io 90 points 4 months ago (4 children)

I mean, it's a beautiful name - who really cares if it's named after a genus of cicadas? There are worse-sounding "normal" names out there. Plus it's named after OP's passion, and I think that shows a lot of love

[–] Th4tGuyII@fedia.io 1 points 4 months ago (1 children)

> So providing a fine-tuned model shouldn't either.

I didn't mean in terms of providing. I meant that if someone provided a base model, and someone else took that, built upon it, then used it for a harmful purpose - of course the person who modified it should be liable, not the base provider.

It's like if someone took a version of Linux, modified it, then used that modified version for an illegal act - you wouldn't go after the person who made the unmodified version.

[–] Th4tGuyII@fedia.io 18 points 4 months ago (3 children)

> SB 1047 is a California state bill that would make large AI model providers – such as Meta, OpenAI, Anthropic, and Mistral – liable for the potentially catastrophic dangers of their AI systems.

Now this sounds like a complicated debate - but it seems to me like everyone against this bill stands to benefit monetarily from not having to deal with the safety aspect of AI, and that does sound suspicious to me.

> Another technical piece of this bill relates to open-source AI models. [...] There’s a caveat that if a developer spends more than 25% of the cost to train Llama 3 on fine-tuning, that developer is now responsible. That said, opponents of the bill still find this unfair and not the right approach.

Regarding the open-source models: it makes sense that if a developer takes the model and does a significant portion of the fine-tuning, they should be liable for the result of that...

But should the main developer still be liable if a bad actor does less than 25% fine-tuning and uses exploits in the base model?

One could argue that developers should be examining their black boxes for vulnerabilities, rather than shrugging, saying it can't be done, then demanding they not be held liable.
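
To make the caveat above concrete, here's a rough sketch of how that 25% threshold would work - the dollar figures are entirely made up, since Llama 3's real training cost isn't public:

```python
# Hypothetical numbers purely for illustration of the bill's caveat.
base_training_cost = 100_000_000  # assumed cost to train the base model ($)
fine_tuning_cost = 20_000_000     # assumed downstream fine-tuning spend ($)

# As described above, liability shifts to the fine-tuner only once
# their spend exceeds 25% of the base model's training cost.
threshold = 0.25 * base_training_cost
liable_party = "fine-tuner" if fine_tuning_cost > threshold else "base model provider"
print(liable_party)  # "base model provider" (20M does not exceed 25M)
```

Which is exactly where the question above bites: a bad actor can stay under that threshold, and the liability stays with the base provider.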

[–] Th4tGuyII@fedia.io 19 points 4 months ago (1 children)

When you're calling a terrorist organisation your ally, surely you've got to realise what side of history you're on, right?

I mean Putin probably does, but he doesn't care as long as he gets to be ~~king~~ president

[–] Th4tGuyII@fedia.io 30 points 4 months ago* (last edited 4 months ago) (1 children)

Alright, fine. You can have 20 miles or so, but nothing more. Oh, and all the islands around it are still ours.
