this post was submitted on 21 Sep 2024
84 points (71.4% liked)

Technology

Please remove this if it's not allowed.

I see a lot of people in here who get mad at AI-generated code, and I am wondering why. I wrote a couple of Bash scripts with the help of ChatGPT and, if anything, I think it's great.

Now, I obviously didn't tell it to write the entire script by itself; that would be a horrible idea. Instead, I would ask it questions along the way and test its output before putting it in my scripts.

I am fairly competent at writing programs. I know how and when to use arrays, loops, functions, conditionals, etc. I just don't know Bash's syntax. I could have used any other language I knew, but I chose Bash because it made the most sense: Bash ships with most Linux distros out of the box, so nobody has to install another interpreter or compiler for another language. I don't like Bash because of its, dare I say, weird syntax, but it fit my purpose best, so I chose it. Also, I had never written anything of this complexity in Bash before, just a bunch of commands on separate lines so that I wouldn't have to type them one after another. This one, though, required many rather advanced features. I was not motivated to learn Bash; I just wanted to put my idea into action.

I did start with an internet search, but the guides I found were lacking. I could not easily find how to pass values into a function and return from it, how to remove a trailing slash from a directory path, how to loop over an array, how to catch errors from the previous command, how to separate the letters and numbers in a string, etc.
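For what it's worth, most of those idioms fit in a few lines of Bash. A minimal sketch (the function names, example paths, and strings here are made up for illustration, not from the original post):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Pass values into a function as positional parameters; "return" a
# string by echoing it and capturing the output with $(...).
strip_trailing_slash() {
    local path="$1"
    printf '%s\n' "${path%/}"   # parameter expansion drops one trailing slash
}

# Separate the leading letters from the trailing digits of e.g. "abc123".
split_letters_digits() {
    local s="$1"
    local letters="${s%%[0-9]*}"    # everything before the first digit
    local digits="${s#"$letters"}"  # the remainder
    printf '%s %s\n' "$letters" "$digits"
}

# Loop over an array.
dirs=("/tmp/a/" "/var/log" "/home/user/")
for d in "${dirs[@]}"; do
    strip_trailing_slash "$d"
done

# Catch a failure of the previous command explicitly.
if grep -q root /etc/passwd; then
    echo "found"
else
    echo "grep exited with a nonzero status" >&2
fi
```

The trailing-slash and letters/digits tricks both rely on Bash parameter expansion (`${var%pattern}`, `${var%%pattern}`, `${var#pattern}`), which is exactly the kind of syntax that's hard to search for if you don't already know what it's called.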

That is where ChatGPT helped greatly. I would ask it to write these pieces of code whenever I encountered them, then test its code with various inputs to see if it worked as expected. If not, I would tell it which case failed, and it would revise the code before I put it in my scripts.

Thanks to ChatGPT, someone with zero knowledge of Bash can quickly and easily write fairly advanced Bash. I don't think it would have gone this quickly if I had done it the old-fashioned way; I would have gotten there eventually, but it would have taken far too long. Thanks to ChatGPT, I could just write all this quickly and forget about it. If I were motivated to learn Bash, I would certainly take the time to learn it properly.

What do you think? What negative experiences have you had with AI chatbots that made you hate them?

(page 2) 50 comments
[–] Numuruzero@lemmy.dbzer0.com 5 points 2 months ago (1 children)

I have a coworker who is essentially building a custom program in Sheets using AppScript, and has been using CGPT/Gemini the whole way.

While this person has a basic grasp of the fundamentals, there's a lot of missing knowledge that gets filled in by the bots. Ultimately, after enough fiddling, it will spit out usable code that works how it's supposed to, but honestly it ends up taking significantly longer to guide the bot to just the right solution for a given problem. Not to mention the code is just a mess: even though it works, there's no real consistency, since it was built up across separate prompts.

I'm confident that in this case and likely in plenty of other cases like it, the amount of time it takes to learn how to ask the bot the right questions in totality would be better spent just reading the documentation for whatever language is being used. At that point it might be worth it to spit out simple code that can be easily debugged.

Ultimately, it just feels like you're offloading complexity from one layer to the next, and in so doing quickly acquiring tech debt.

Exactly my experience as well. Using AI will take about the same amount of time as just doing it myself, but at least I'll understand the code at the end if I do it myself. Even if AI was a little faster to get working code, writing it yourself will pay off in debugging later.

And honestly, I enjoy writing code more than chatting with a bot. So if the time spent is going to be similar, I'm going to lean toward DIY every time.

[–] TheGrandNagus@lemmy.world 5 points 2 months ago* (last edited 2 months ago)

A lot of people are very reactionary when it comes to LLMs and any of the other "AI" technologies.

For myself, I definitely roll my eyes at some of the "let's shoehorn 'AI' into this!" marketing, and I definitely have reservations about some datasets stealing/profiting from user data, and part of me worries about the other knock-on effects of AI (e.g. recently it was found that some foraging books on Amazon were AI generated and, if followed, would've led to people being poisoned. That's pretty fucking bad).

...but it can also be a great tool, too. My sister is blind, and honestly, AI-assisted screen readers will be a game changer. AI describing images online that haven't been properly tagged for blind people (most of them, btw!) is huge too. This is a thing that is making my little sister's life better in a massive way.

It's been useful for me in terms of translation (Google translate is bad), in terms of making templates that take a lot of the tedious legwork out of programming, effortlessly clearing up some audio clarity issues for some voluntary voice acting "work" I've done for a huge game mod, and for quickly spotting programming or grammar mistakes that a human could easily miss.

I wish people could just have rational, adult discussions about AI tech without it just descending into some kind of almost religious shouting match.

[–] essteeyou@lemmy.world 4 points 2 months ago

I use it as a time-saving device. The hardest part is spotting when it's not actually saving you time, but costing you time in back-and-forth over some little bug. I'm often better off fixing it myself when it gets stuck.

I find it's just like having another developer to bounce ideas off. I don't want it to produce 10k lines of code at a time, I want it to be digestible so I can tell if it's correct.

[–] Soup@lemmy.cafe 3 points 2 months ago

Because, despite how easy it is to dupe people into thinking your methods are altruistic, AI exists to save money by eradicating jobs.

AI is the enemy. No matter how you frame it.

[–] madsen@lemmy.world 3 points 2 months ago

but chose bash because it made the most sense, that bash is shipped with most linux distros out of the box and one does not have to install another interpreter/compiler for another language.

Last time I checked (because I was writing Bash scripts based on the same assumption), Python was actually present on more Linux systems out of the box than Bash.
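Whichever assumption is right, it's easy to check on the systems you actually target. A quick POSIX-sh sketch (the interpreter list is just a set of common examples, not from the comment):

```shell
# Report which common interpreters this system actually ships.
# `command -v` is the portable way to locate a program in PATH.
for interp in sh bash python3 python perl; do
    if command -v "$interp" >/dev/null 2>&1; then
        printf '%s: %s\n' "$interp" "$(command -v "$interp")"
    else
        printf '%s: not found\n' "$interp"
    fi
done
```

Running this on a few representative machines (or in the relevant container base images) settles the "what's installed out of the box" question for your specific case.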

[–] obbeel@lemmy.eco.br 2 points 2 months ago* (last edited 2 months ago)

I have worked with somewhat large codebases before using LLMs. You can ask the LLM to point out a specific problem and give it the context. I honestly don't see myself as capable without an LLM. And it is a good teacher; I learn a lot from using LLMs. No free advertisement for any of the suppliers here, but they are just useful.

You get access to information you can't find anywhere else on the web. There is a large, reflexive backlash against it, but it is useful.

(Edit) Also, I would like to add that people who say questions won't be asked anymore have seemingly never tried getting answers in an online discussion forum: people are viciously ill-tempered when answering.

With an LLM, you can just bother it endlessly and learn more about the world while you do it.

[–] lvxferre@mander.xyz 1 points 2 months ago

[NB: I'm no programmer. I can write a few lines of Bash because I use Linux; I'm just relaying what I've read. I do use those bots, but for something else: as a translation aid.]

The reasons that I've seen programmers complaining about LLM chatbots are:

  1. concerns that AI will make human programmers obsolete
  2. concerns that AI will reduce the market for human programmers
  3. concerns about the copyright of the AI output
  4. concerns about code quality (e.g. it invents libraries and functions out of thin air)
  5. concerns about the environmental impact of AI

In my opinion the first one is babble, the third one is complicated, but the other three are sensible.
