this post was submitted on 07 Sep 2024

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] UnseriousAcademic@awful.systems 35 points 2 months ago (6 children)

The learning facilitators they mention are the key to understanding all of this. They still need humans in the room to maintain discipline and make sure the kids actually engage with the AI. But roles that were once teachers have been redefined as "learning facilitators", and apparently former teachers have rejoined the school in these new roles.

Like a lot of automation, the main selling point is deskilling roles, reducing pay and making people more easily replaceable (you don't need a teaching qualification to be a "learning facilitator" to the AI), while producing a worse service that is just good enough as long as it is wrapped in hard-to-verify claims and assumptions about what education actually is. Of course it also means you get a new middleman parasite siphoning off funds that used to flow to staff.

[–] Saleh@feddit.org 13 points 2 months ago

They could just have the kids read actual books, designed by actual pedagogic experts, which actually help them learn through studying them.

Now nobody knows whether the "AI" is even teaching real things, whether it is only using properly vetted material, or whether the structure it proposes makes sense.

Yes, teachers are fallible, but they are also human and can emotionally understand what is going on during learning in a way a trained algorithm just cannot. It also means there needs to be a clearly defined "goal" of knowledge and competencies, and the algorithm can only fill in the gaps, rather than encourage students to seek knowledge beyond the established set.

Also, I am skeptical how much of this is even "AI" in the sense of needing a machine learning approach, rather than just regular computer tests of which "level" has been reached in each category and where improvement is still needed. Chances are, this could be done with an Excel sheet.

[–] Gradually_Adjusting@lemmy.world 12 points 2 months ago

Aren't there laws about who gets to teach kids? I know there are strictures on teacher-to-student ratios, but how can those exist without a written definition of what a teacher is?

[–] dgerard@awful.systems 9 points 2 months ago

so the thing is this is a private school at the sort of fees that attract really good teachers and use them as a selling point, so I don't actually think being cheap is the goal here. I think some idiot thinks this is actually a good idea.

[–] zbyte64@awful.systems 8 points 2 months ago

Unfortunately this trend is happening in the States even without the AI buzzwords (though those are there too). You give every kid a tablet with educational apps that feed into a curriculum algorithm. Teachers are told by the algorithm which student needs help on what; basically, they become facilitators to the app. Then you also have "student summarizers" which will "analyze" a student's written or audio submission and flatten it down to some uniform stats.

[–] Maeve@kbin.earth 8 points 2 months ago

In some areas of the USA, teaching degrees aren't required to actually teach. I hope I don't see this go worldwide.

[–] dgerard@awful.systems 32 points 2 months ago* (last edited 2 months ago) (1 children)

the kicker here:

this GCSE costs £27,000 per student per year

they're paying this much not to have teachers

I'm trying to find any detail on what they actually do in this thing and I can't find anything. As far as I can tell the AI does ??shit?? and the "learning facilitators" are the "teachers".

[–] mpk@awful.systems 16 points 2 months ago (1 children)

Seems like an awesome way to get tech millionaires with weird ideas about education from reading too much Ayn Rand to cough up 27 grand a year to educate their unfortunate kids.

[–] dgerard@awful.systems 13 points 2 months ago* (last edited 2 months ago)

I'm trying to work out what the fuck is up with this school. Like that's a private school fee, sure.

But in what world did this look like a good idea?

for comparison, there's a school that does an online GCSE at £5k/yr full rate, popular with diplomats and expats, but for about half the students it's paid for by the local council as disability support (kids who can't attend a physical school for some reason). I predict everyone involved would shit if they tried this AI nonsense on them.

[–] homesweethomeMrL@lemmy.world 19 points 2 months ago

Hahahaha! That is by far the stupidest embrace of “AI” possible.

I’d say I can’t wait for it to fail, but whenever I say that things tend to become intractably lodged in the culture, so. Great success!

[–] antifuchs@awful.systems 19 points 2 months ago

Of all the awful and bad reasons to homeschool, “my government forces my kids to learn parroted bullshit” is probably the most annoyingly valid.

[–] LordCrom@lemmy.world 18 points 2 months ago

Here's a thought: take that money per kid per year and pay a teacher a decent salary for a class of 15 kids.

[–] Etterra@lemmy.world 7 points 2 months ago

There's absolutely no way this could possibly go wrong ever.

[–] MossyFeathers@pawb.social 7 points 2 months ago* (last edited 2 months ago) (2 children)

Something tells me they're not just slapping chatGPT on the school computers and telling kids to go at it; surely one of the parents would have been up-to-date enough to know it's a scam otherwise. At the very least, surely the students will start to get upset that they're getting made fun of for the "facts" they're learning from chatGPT, complain to their parents, and cause the school to get sued.

It seems like a very stupid scam to try to teach rich kids with chatGPT, which is why I'm wondering if they're using something else. They could be acting as a testbed for a new AI designed specifically for teaching. I wouldn't put it past rich people to use their kids as guinea pigs if it meant they could save or make money elsewhere.

Unfortunately, the article doesn't mention what kind of AI they're using.

[–] blakestacey@awful.systems 11 points 2 months ago

Something tells me they’re not just slapping chatGPT on the school computers and telling kids to go at it; surely one of the parents would have been up-to-date enough to know it’s a scam otherwise.

If people with money had that much good sense, the world would be a well-nigh unfathomably different place....

[–] dgerard@awful.systems 6 points 2 months ago

I extremely much want the details on how this all works.

[–] potentiallynotfelix@lemdro.id 6 points 2 months ago (1 children)

The UK has the worst priorities.

[–] mpk@awful.systems 14 points 2 months ago

This isn't the UK government or UK public education policy, to be fair to the UK. It's a £27,000-per-year private school in London - the sort that helps ram the possibly-not-so-bright kids of the wealthy through their GCSEs and A-Levels.

[–] El_guapazo@lemmy.world -2 points 2 months ago

The US hates robots. It'll never catch on.

They hate ATMs, automated drive-through kiosks, hitchhiking robots, and electric cars.