this post was submitted on 06 Jul 2024
42 points (100.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] skillissuer@discuss.tchncs.de 22 points 4 months ago

damn they got their supply of good idea powder back

hide your defense budget before they start staring at goats again

[–] dgerard@awful.systems 21 points 4 months ago (1 children)

This article is heavy on the hype, and my eyes are bleeding trying to extract what's actually happening here in reality

[–] conciselyverbose@sh.itjust.works 7 points 4 months ago (1 children)

They use LLMs for what they can actually do: bullet-point the core concepts in a huge volume of information, and parse large document sets for specific queries that previously would have needed a tech trying a bunch of variations on a bunch of keywords, etc. Provided you have humans overseeing the summaries, have the queries surface the actual full relevant documents, and fall back to a human for failed searches, it can potentially add a useful layer of value.
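A minimal sketch of that kind of human-in-the-loop flow — all function names, data shapes, and the toy keyword matching here are invented for illustration, not taken from the article or any real system:

```python
# Illustrative human-in-the-loop retrieval/summary loop (hypothetical API).
from dataclasses import dataclass


@dataclass
class SearchResult:
    doc_id: str
    summary: str       # LLM-generated bullet points (a human reviews these)
    source_text: str   # full document surfaced alongside the summary


def search(query: str, corpus: dict[str, str], llm_summarize) -> list[SearchResult]:
    """Return summaries plus the full source documents so a human can verify."""
    hits = [(doc_id, text) for doc_id, text in corpus.items()
            if any(word in text.lower() for word in query.lower().split())]
    return [SearchResult(doc_id, llm_summarize(text), text) for doc_id, text in hits]


def search_with_fallback(query, corpus, llm_summarize, human_search):
    results = search(query, corpus, llm_summarize)
    if not results:
        # Failed search falls back to a human analyst rather than
        # letting the model guess.
        return human_search(query)
    # A human still compares each summary against its source_text.
    return results
```

The point of the shape is that the model never stands between the analyst and the underlying document: every summary ships with its source, and the failure path is a person.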

They're probably also using it for propaganda shit, because that's a lot of what intelligence is. And the fake documents and web presences that make up cover identities could (again, with human oversight) probably be produced in much greater volume to build them out.

[–] skillissuer@discuss.tchncs.de 11 points 4 months ago (1 children)

> Provided you have humans overseeing the summaries

right, at which point you're better off just doing it the right way from the beginning, not to mention the tiny detail of not shoving classified information into sam altman's black box

[–] conciselyverbose@sh.itjust.works 9 points 4 months ago* (last edited 4 months ago) (2 children)

I'm not really arguing the merit, just answering how I'm reading the article.

The systems are airgapped and never exfiltrate information so that shouldn't really be a concern.

Humans are also a potential liability to a classified operation. If you can get the same results with 2 human analysts overseeing/supplementing the work of AI as you would with 2 human analysts overseeing/supplementing 5 junior people, it's worth evaluating. You absolutely should never be blindly trusting an LLM for anything. They're not intelligent. But they can be used as a tool by capable people to increase their effectiveness.

[–] dgerard@awful.systems 11 points 4 months ago

the other thing about text like this is that many of the claims about what they're doing will be completely false, because someone will have misunderstood and then tried to reconstruct a sensible version

[–] skillissuer@discuss.tchncs.de 6 points 4 months ago (3 children)

it's not airgapped, it's still cloud; it can't be. it's some kind of "secure" cloud that passed some kind of audit. openai has already had a breach or a few, so i'm not entirely sure it will pan out

[–] V0ldek@awful.systems 2 points 4 months ago (2 children)

Iirc OpenAI uses Microsoft's cloud?

If so, MSFT has a special airgapped cloud specifically for USGov.

[–] froztbyte@awful.systems 2 points 4 months ago

tbh I personally wouldn’t expect/suspect this to be using any of the flavours of govcloud for mass-market flavours (because that has implications on staff hiring etc)

the easy way to handle this is to have a backend/frontend separation with baseline access controlled simply by construction of routing and zone primitives. it’s relatively simple (albeit moderately involved) to do this on most cloud providers
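A toy sketch of what "access controlled by construction" can mean — every zone name and rule below is invented for illustration; real deployments do this with the cloud provider's network primitives (VPCs, routing tables, security groups), not application code:

```python
# Toy model of zone-based routing: a path between zones exists only if it
# was explicitly constructed, so isolation holds by default.

ROUTES = {
    ("public", "frontend"): True,    # users reach the frontend
    ("frontend", "backend"): True,   # frontend proxies to the model backend
}


def allowed(src_zone: str, dst_zone: str) -> bool:
    """A route is permitted only if it was explicitly created."""
    return ROUTES.get((src_zone, dst_zone), False)


# The backend is unreachable from the public zone: no route was ever created,
# so nothing needs to be "blocked" after the fact.
assert allowed("public", "frontend")
assert allowed("frontend", "backend")
assert not allowed("public", "backend")
```

The design point is deny-by-default: the absence of a route is the access control, rather than a filter bolted on top of a flat network.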

[–] gnomicutterance@awful.systems 2 points 4 months ago (1 children)

they probably do. I worked for a content-as-a-service company that had a contract to deliver our product, airgapped, to a three-letter agency on a regular schedule, and we were a tiny company. Microsoft's biggest customer is probably the U.S. government; I'd be shocked if they don't provide an in-house airgapped set of full Azure services for the entire intelligence agency system.

[–] V0ldek@awful.systems 3 points 4 months ago

They do. Source: I worked at MSFT in Azure Identity. It's completely separate, has its own rollout schedule for all products, etc.

There's also a physically separate cloud for China 🙃

[–] conciselyverbose@sh.itjust.works 1 points 4 months ago

My interpretation of what they're saying is that it's on their own servers in their own location that can only be accessed from specific access points.

Talking about networks as airgapped isn't abnormal.

I see you've never taken part in a FedRAMP audit. They're brutal.

[–] antifuchs@awful.systems 17 points 4 months ago

Ah yes, the cia is no stranger to the artifice of intelligence.

[–] cwood@awful.systems 14 points 4 months ago (2 children)

Just a minor paragraph rewrite for clarity.

“The reality of generative AI is you’ve got to have a foundation of cloud computing,” AWS Vice President of Worldwide Public Sector Dave Levy, whose compensation relies on him successfully growing Amazon's computer rental income, told Nextgov/FCW in a June 26 interview at AWS Summit. “You’ve got to get your data in a place where you can actually do something with it.”

It's always so tedious when these little conflict of interest notes are left out of articles.

[–] pearsaltchocolatebar@discuss.online 3 points 4 months ago (1 children)

They're not wrong. It's super expensive and time consuming to properly train a generative AI model.

[–] sailor_sega_saturn@awful.systems 7 points 4 months ago* (last edited 4 months ago)

That doesn't imply cloud computing is a hard requirement, just that a server might be a requirement.

In a different universe where the cloud / SaaS never took over the market, Cat-GTPurr could be distributed on mail-order Blu-ray discs or (in the worst case) a spinning drive or two, or downloaded once via BitTorrent, and then hosted locally. The cost of such a distribution would be a rounding error for most big tech companies.

[–] swlabr@awful.systems 13 points 4 months ago

Best case, this somehow causes the CIA to implode and the west to collapse along with it. Worst case, I'd have to give AI companies credit for providing the tools for said implosion. True worst case… I mean we are already there, i.e. the CIA exists and is operational.