this post was submitted on 04 Feb 2024
57 points (90.1% liked)

I personally think it's a lot of bs

all 31 comments
[–] gerryflap@feddit.nl 58 points 9 months ago (6 children)

As a software engineer/data scientist who has spent quite a while finding good AI work, this sounds like absolute bullshit. Most companies don't need AI. Prompt engineering seems like a niche thing; I can't imagine that most companies really need someone who does that. It really frustrates me that these bullshit articles keep coming out without any sense or reason. AI is cool technology (imo), but currently it's just the latest bait for CEOs, managers, etc. Somehow these kinds of people are so vulnerable to hype words without ever thinking for more than a second about how to use it or whether it's even useful.

[–] lemmyvore@feddit.nl 10 points 9 months ago (1 children)

To add to this, even when they need AI work, they don't look for actual AI specialists. What they do is take one person who's done some random ML work and a bunch of random junior devs and that's the "AI team".

[–] jadero@programming.dev 4 points 9 months ago

That's very closely related to something I've come to think about tech: nerd equivalency. If there is a computer involved, then a nerd is required and they are all interchangeable.

Basically, someone says "we're not moving fast enough, hire another nerd!" and nobody in the chain of command or in the hiring process has a clue which particular skills are required, assuming that everyone can do everything.

That's why so many corporate projects have what amounts to random people doing randomly assigned work producing insecure, unreliable products with obscure and even hostile UIs.

[–] TWeaK@lemm.ee 8 points 9 months ago

Not only that, but these "bootcamps" aren't exactly going to be churning out the highly skilled people needed to really make good use of AI systems.

[–] jadero@programming.dev 4 points 9 months ago (1 children)

AI is cool technology (imo), but currently it's just the latest bait for CEOs, managers, etc. Somehow these kinds of people are so vulnerable to hype words without ever thinking for more than a second about how to use it or whether it's even useful.

I think that's a general problem with most technology that is fundamentally about computing.

People outside any field have only the barest grasp of that field, but the problems are so much worse as soon as computers are involved. They are so ubiquitous and so useful to so many people with little or no training or understanding that everyone just succumbs to a form of magical thinking.

[–] MalReynolds 2 points 9 months ago* (last edited 9 months ago)

A: starts spouting technobabble

B: dummy mode on...

[–] Scrof@sopuli.xyz 2 points 9 months ago

Not needing it isn't the same as not hiring for it and allocating vast amounts of resources to it.

[–] will_a113@lemmy.ml 34 points 9 months ago (2 children)

ML =/= AI. There are legit uses for ML that don’t have anything to do with LLMs and the cloud. I worked on an ML project 3 or 4 years ago to listen for fan noise that might indicate the fan was about to fail. We trained a tiny GAN on good and bad noises. It runs on a tiny CPU, locally. Highly specialized work, and I have to imagine there are and will continue to be lots of similar opportunities to bring efficiencies by getting computers to make good observations and decisions - even if only about “simple” things like “does this thing seem like it’s about to break?”
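To make the "tiny, locally-run model" idea concrete, here is a minimal sketch. It is not the commenter's actual code: they trained a GAN, while the stand-in below is a much simpler small CNN classifier over fixed-size spectrogram patches, labelling clips as healthy or failing fan noise. All shapes, layer sizes, and the random placeholder data are illustrative assumptions.

```python
# Minimal sketch of a tiny, locally-runnable audio classifier (illustrative only).
# Assumes each clip has already been turned into a 32x64 mel-spectrogram patch.
import numpy as np
import tensorflow as tf

N_MELS, N_FRAMES = 32, 64  # assumed spectrogram size per clip


def build_tiny_model() -> tf.keras.Model:
    # A few thousand parameters -- small enough to run on a low-power CPU.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(N_MELS, N_FRAMES, 1)),
        tf.keras.layers.Conv2D(8, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # P(failing)
    ])


if __name__ == "__main__":
    model = build_tiny_model()
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    # Random arrays stand in for labelled "good"/"bad" fan-noise spectrograms.
    x = np.random.rand(256, N_MELS, N_FRAMES, 1).astype("float32")
    y = np.random.randint(0, 2, size=(256, 1)).astype("float32")
    model.fit(x, y, epochs=2, batch_size=32, verbose=0)
    print("P(failing) for one clip:",
          float(model.predict(x[:1], verbose=0)[0, 0]))
```

On real data the random arrays would be replaced by labelled spectrograms, and a model this size is the kind of thing that can comfortably run inference on a small embedded CPU.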

[–] Daxtron2@startrek.website 10 points 9 months ago (1 children)

ML literally is a subfield of AI

[–] will_a113@lemmy.ml 1 points 9 months ago (2 children)

Fair enough. ML ⊆ AI then. But these days, when everyone is talking breathlessly about AI taking away jobs, they’re almost always talking about LLMs. This article is about ML in particular, which is a different discipline with different applications.

[–] infinitepcg@lemmy.world 2 points 9 months ago

It's not a different discipline, an LLM is an example of a machine learning model.

[–] Lmaydev@programming.dev 1 points 9 months ago

LLMs are neural networks, which is literally ML.

The LLM designation refers to what they are trained to do.

[–] Sanctus@lemmy.world 4 points 9 months ago

GANs are so much fun and so tedious. I trained one on dungeon generation in college. It sucked, but it mostly worked in the end. I don't know exactly how an LLM works, but GANs are way different.

[–] NounsAndWords@lemmy.world 26 points 9 months ago

The most important line on that page:

"FORTUNE may receive compensation for some links to products and services on this website. Offers may be subject to change without notice."

[–] Pat_Riot@lemmy.today 22 points 9 months ago

They really need like 20, but to get colleges to host the courses needed, you have to be able to fill the classrooms. Release bullshit articles like this and a bunch of kids hoping they might have a future with a roof over their heads sign up for the new thing being offered. It's really insidious.

[–] sj_zero@lotide.fbxl.net 16 points 9 months ago

1 million will be required so they can be paid minimum wage with 30 years of experience and a PhD.

[–] De_Narm@lemmy.world 2 points 9 months ago* (last edited 9 months ago)

Not with the stuff we currently have.

Image generation is neat. But now there are so many badly generated images out there that any new model likely feeds on shitty data. There's also the whole copyright debate over the images used.

LLMs are neat. But there is no point in widespread adoption. Any model that is built to generate correct sentences but not correct content mostly just wastes the user's time and nothing else. They are impressive, they can be fun, and sometimes they are really helpful - but you never know whether or not they're hallucinating any given piece of information.

Voice and avatar generation is neat. Like genuinely neat. But they are so easy to use already that you don't need that many specialists.

[–] otl@lemmy.sdf.org 2 points 9 months ago

Here’s the article’s source: https://www3.weforum.org/docs/WEF_Future_of_Jobs_2023.pdf

That report’s data is a survey they sent out to companies. Quantifying “so… what do you think is gonna happen?” seems… shonky?

[–] autotldr@lemmings.world 1 points 9 months ago

This is the best summary I could come up with:


Many businesses across a variety of industries are spending more on AI—from Papa John’s to Canva—thus translating to a need for workers to have relevant skillsets such as knowing natural language processing, prompt engineering, and Python.

One broad example is online learning platform Udemy, which hosts dozens of offerings across a wide spectrum of experience level, length, and price.

Kara Sasse, chief product officer at Springboard, says the bootcamps are catered to fit the needs of those working professionals who are eager to upskill and succeed in increasingly AI-focused job environments.

“As ML and AI continue to transform every aspect of our lives, forward-thinking organizations must actively take inventory of potential skills gaps and look for professionals with the tools to succeed in this evolving landscape,” Sasse tells Fortune.

Fullstack Academy similarly offers an AI and ML bootcamp that covers fundamentals as well as emphasizes practical application, according to the company’s CEO, Nelis Parts.

Topic examples: Deep Learning with Keras and TensorFlow; Applied Data Science with Python; Essentials of Generative AI, Prompt Engineering, and ChatGPT


The original article contains 773 words, the summary contains 175 words. Saved 77%. I'm a bot and I'm open source!