this post was submitted on 13 Aug 2023
385 points (74.2% liked)

Technology

[–] Windex007@lemmy.world 64 points 1 year ago (4 children)

As a Sr. Dev, I'm always floored by stories of people trying to integrate ChatGPT into their development workflow.

It's not a truth machine. It has no conception of correctness. It's designed to make responses that look correct.

Would you hire a dev with no comprehension of the task, who cannot reliably communicate what their code does, cannot be tasked with finding and fixing their own bugs, is incapable of accountability, cannot be reliably coached, is often wrong and refuses to accept or admit it, cannot comprehend PR feedback, and who requires significantly greater scrutiny of their work because it is by explicit design created to look correct?

ChatGPT is by pretty much every metric the exact opposite of what I want from a dev in an enterprise development setting.

[–] JackbyDev@programming.dev 33 points 1 year ago (1 children)

Search engines aren't truth machines either. StackOverflow reputation is not a truth machine either. These are all tools to use. Blind trust in any of them is incorrect. I get your point, I really do, but it's just as foolish as believing everyone using StackOverflow just copies and pastes the top rated answer into their code and commits it without testing then calls it a day. Part of mentoring junior devs is enabling them to be good problem solvers, not just solving their problems. Showing them how to properly use these tools and how to validate things is what you should be doing, not just giving them a solution.

[–] Windex007@lemmy.world 7 points 1 year ago (1 children)

I agree with everything you just said, but I think that without greater context it's maybe still unclear to some why I still place ChatGPT in a league of its own.

I guess I'm maybe some kind of relic from a bygone era, because tbh I just can't relate to the "I copied and pasted this from stack overflow and it just worked" memes. Maybe I underestimate how many people in the industry are that fundamentally different from how we work.

Google is not for obtaining code snippets. It's for finding docs, for troubleshooting error messages, etc.

If you have like... Design or patterning questions, bring that to the team. We'll run through it together with the benefits of having the contextual knowledge of our problem domain, internal code references, and our deployment architecture. We'll all come out of the conversation smarter, and we're less likely to end up needing to make avoidable pivots later on.

The additional time required to validate a ChatGPT-generated piece of code could instead have been invested in the dev doing it right, and fitting it properly within our context, the first time. The dev will be smarter for it, and that investment will pay off every moment forward.

[–] JackbyDev@programming.dev 2 points 1 year ago

I guess I see your point. I haven't asked ChatGPT to generate code and tried to use it except for once ages ago but even then I didn't really check it and it was a niche piece of software without many examples online.

[–] SupraMario@lemmy.world 11 points 1 year ago

Don't underestimate C-levels who read a Bloomberg article about AI, try to run their entire company off of it... then wonder why everything is on fire.

[–] flameguy21@lemm.ee 6 points 1 year ago (1 children)

Honestly, once ChatGPT started giving answers that consistently didn't work, I just started googling stuff again because it was quicker and easier than getting the AI to regurgitate Stack Overflow answers.

[–] ewe@lemmy.world 6 points 1 year ago

Would you hire a dev with no comprehension of the task, who cannot reliably communicate what their code does, cannot be tasked with finding and fixing their own bugs, is incapable of accountability, cannot be reliably coached, is often wrong and refuses to accept or admit it, cannot comprehend PR feedback, and who requires significantly greater scrutiny of their work because it is by explicit design created to look correct?

Not me, but my boss would... wait a minute...