Phoenix

joined 1 year ago
[–] Phoenix@programming.dev 7 points 1 year ago (3 children)

That's only the first stage. Once you get tired enough, you start writing code that not even you can understand the next morning, but which you're loath to change because "it just works".

[–] Phoenix@programming.dev 10 points 1 year ago

"The bug is fixed, but we inadvertently created two new ones, one of which broke production because it was inexplicably not caught."

[–] Phoenix@programming.dev 3 points 1 year ago* (last edited 1 year ago)

If you want to disabuse yourself of the notion that AI is close to replacing programmers for anything but the most mundane and trivial tasks, try having GPT-4 generate a novel implementation of moderate complexity. Watch it import mystery libraries that do exactly what you want the code to do, but that don't actually exist.

Yeah, you can do a lot without writing a single line of code. You can certainly interact with the models, because others who can code have already done the legwork. But someone still has to do it.
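As an aside, one cheap sanity check against those mystery libraries is to verify that the imports in generated code actually resolve before you trust it. A minimal sketch using only the Python standard library (the `totally_real_solver` module name is a made-up stand-in for a hallucinated import):

```python
import ast
import importlib.util

def missing_imports(source: str) -> list[str]:
    """Return top-level module names imported by `source` that can't be found locally."""
    tree = ast.parse(source)
    modules: set[str] = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            modules.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            modules.add(node.module.split(".")[0])
    # find_spec returns None when no installed module matches the name
    return sorted(m for m in modules if importlib.util.find_spec(m) is None)

generated = "import os\nimport totally_real_solver\n"
print(missing_imports(generated))  # the hallucinated module shows up here
```

Of course this only catches modules that aren't installed at all, not real libraries being used with invented functions, which is the harder half of the problem.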

[–] Phoenix@programming.dev 12 points 1 year ago (7 children)

It really is big. From baby's first prompting on a big corpo model, learning how tokens work, to setting up your own environment to run models locally (because hey, not everyone knows how to use git), to soft prompting, to training your own weights.

Nobody is realistically writing foundation models unless they work at Google or whatever, though.

[–] Phoenix@programming.dev 1 points 1 year ago

Wayne June has excellent readings of Lovecraft's works.

[–] Phoenix@programming.dev 2 points 1 year ago

Der Prozess/The Trial is one of the better ones in my opinion. Really captures the essence of Kafka. Of course, it is also one of his longer works.

[–] Phoenix@programming.dev 1 points 1 year ago

I've used WSL to run DeepSpeed before because, inexplicably, Microsoft didn't develop it for their own platform...

[–] Phoenix@programming.dev 2 points 1 year ago* (last edited 1 year ago)

I read it a long time ago. The format is interesting, novel certainly; I suppose that's the selling point, more than the prose.

To me it seemed like there were many competing "ways" to read it as well. Like a maze, you can go different paths. Do you read it front to back? Niggle through the citations? Thread back through the holes? It's not often you get a book that has this much re-read value.

[–] Phoenix@programming.dev 2 points 1 year ago

Ah, the hyperbolic time chamber of job experience.

[–] Phoenix@programming.dev 1 points 1 year ago

The assertion that they cannot be cheap is funny, when Vicuna 13B was fine-tuned for all of $300.

Not $300,000. $300. And that gets you a model that's almost at parity with ChatGPT.

[–] Phoenix@programming.dev 2 points 1 year ago (1 children)

Don't know about Lovecraft, but the big creepypasta/web story thing these days is the SCP foundation. But I assume you've already heard about that one?

[–] Phoenix@programming.dev 15 points 1 year ago (4 children)

The real mystery is when they want five years of experience for the tech that's been out for three.
