this post was submitted on 24 Feb 2024
114 points (97.5% liked)

Free Open-Source Artificial Intelligence

This would be a good opportunity to provide a thoughtful, sane, and coherent response voicing your opinion on future regulatory policy for AI, to counter the fearmongering.

How to submit a comment: https://www.regulations.gov/document/NTIA-2023-0009-0001

All electronic public comments on this action, identified by Regulations.gov docket number NTIA–2023–0009, may be submitted through the Federal e-Rulemaking Portal. The docket established for this request for comment can be found at www.Regulations.gov, NTIA–2023–0009. To make a submission, click the ‘‘Comment Now!’’ icon, complete the required fields, and enter or attach your comments. Additional instructions can be found in the “Instructions” section below, after “Supplementary Information.”

[–] RobotToaster@mander.xyz 62 points 8 months ago (2 children)

Of course they only want to regulate open source models...

[–] AbouBenAdhem@lemmy.world 15 points 8 months ago* (last edited 8 months ago) (1 children)

From their list of concerns:

The benefits and risks of making model weights widely available compared to the benefits and risks associated with closed models;

Innovation, competition, safety, security, trustworthiness, equity, and national security concerns with making AI model weights more or less open; and

The role of the U.S. government in guiding, supporting, or restricting the availability of AI model weights.

It seems like they’re concerned about both open and closed models, and they’re interested in supporting as well as potentially regulating both.

[–] GluWu@lemm.ee 10 points 8 months ago (1 children)

Lol, no they aren't. That's just legalese. A government regulatory agency isn't going to open feedback about what it should be doing regulation-wise and prompt the entire thing with "Why should we close AI to only corporations willing to put up enough money?" They have to at least include the "but maybe potentially let average people use AI" part so that when they close everything down they can point back and say "look, we were open to talking about open solutions".

[–] AbouBenAdhem@lemmy.world 2 points 8 months ago

That tends to be the outcome of processes like this, and sometimes it's because the agency already decided on policy ahead of time and only asked for public input for the sake of appearances. But in other cases the request for input is in good faith, and industry interests end up dominating the discussion only because other voices convince themselves they'd be ignored anyway.

In the case of new industries still in flux, it’s more likely that commercial interests haven’t yet infiltrated the relevant agencies to dictate policy from within—which is why they have to rely on hyperbolic scare tactics and hope no one contradicts them.

[–] rufus@discuss.tchncs.de 5 points 8 months ago* (last edited 8 months ago)

Haha, that'd be fun. The EU with the least regulation on open-weight models, and the US exactly the other way around.