this post was submitted on 31 Aug 2024
876 points (98.6% liked)

[–] dsilverz@thelemmy.club 52 points 2 weeks ago (3 children)

Dev here. JavaScript engines (especially Chromium's) have a memory limit (reported via performance.memory.jsHeapSizeLimit): about 4 GB in the best case. LocalStorage and SessionStorage (the JS features that would be used to store neural-network weights and training data) have even lower limits. So while I fear that locally AI-driven advertisement could happen in a closer future, it's not currently technically feasible in the existing Chromium (Chrome, Vivaldi, Edge, Opera, etc.) and Gecko (Firefox) implementations.
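A hedged sketch of probing those ceilings. Note that performance.memory is a non-standard, Chromium-only API; navigator.storage.estimate() is the standardized way to ask about an origin's storage quota. The bytesToGiB helper and function name are just illustrative:

```javascript
// Hypothetical helper to make the numbers readable.
const bytesToGiB = (n) => n / 1024 ** 3;

function reportBrowserLimits() {
  // Chromium exposes the JS heap ceiling here; Firefox and Safari do not.
  if (typeof performance !== "undefined" && performance.memory) {
    const limit = bytesToGiB(performance.memory.jsHeapSizeLimit);
    console.log(`JS heap limit: ~${limit.toFixed(2)} GiB`);
  }
  // localStorage is typically capped at only ~5-10 MB per origin,
  // nowhere near enough for model weights; the Storage API reports
  // the broader per-origin quota (IndexedDB, Cache API, etc.).
  if (typeof navigator !== "undefined" && navigator.storage?.estimate) {
    navigator.storage.estimate().then(({ usage, quota }) =>
      console.log(`origin storage: ${usage} of ${quota} bytes used`));
  }
}

reportBrowserLimits();
```

Outside a browser (or outside Chromium), the guards simply skip whatever isn't available.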

[–] borth@sh.itjust.works 40 points 2 weeks ago (2 children)

Then Alphabet will come up with a new bullshit idea, "remove the limits for 'trusted' advertisers" so that they can inject more code than allowed as long as they keep paying for their ad "partnership"

[–] thepreciousboar@lemm.ee 27 points 2 weeks ago (1 children)

That's why we need to fight against the Chromium monopoly

[–] dsilverz@thelemmy.club 7 points 2 weeks ago (2 children)

It became difficult as Web technologies grew more complex: implementing native CPU instructions through WASM, Bluetooth through Web Bluetooth, 3D graphics through WebGL, plus NFC, motion sensors, serial ports, and so on. Nowadays it's simply too hard to maintain a browser engine, and many of the former alternatives have been abandoned and deprecated.
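A small sketch of that surface area. Each entry below checks a real browser global for one of the features listed; every one of them corresponds to its own lengthy specification, and availability varies widely by browser (in a non-browser runtime most of these simply report false):

```javascript
// Feature-detect a slice of the modern Web platform's API surface.
// Each true/false answer stands in for an entire spec an engine
// author would have to implement.
const surface = {
  wasm: typeof WebAssembly !== "undefined",
  webgl: typeof WebGLRenderingContext !== "undefined",
  bluetooth: typeof navigator !== "undefined" && "bluetooth" in navigator,
  nfc: typeof NDEFReader !== "undefined",
  serial: typeof navigator !== "undefined" && "serial" in navigator,
  motion: typeof DeviceMotionEvent !== "undefined",
};

console.log(surface);
```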

[–] zea_64@lemmy.blahaj.zone 9 points 2 weeks ago (1 children)

I dare anyone to even just compile a document containing all the standards you'd need to implement

[–] dsilverz@thelemmy.club 17 points 2 weeks ago

Actually, there is a compilation of all the standards specifications. It's maintained by the W3C (World Wide Web Consortium), which documents all the technical details in depth as "Technical Reports", available at https://www.w3.org/TR/ . To date, there are 309 published Technical Reports with "Standard" status.

Fun fact: while searching for the link to post here, I came across a Candidate Recommendation entitled "Web Neural Network API", published exactly yesterday. It seems they intend to implement browser-native neural-network capabilities within the Web specifications, and it seems the "closer future" I mentioned is even closer... 🤔
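For the curious, a hedged sketch of what that draft describes. navigator.ml, MLGraphBuilder, and the descriptor fields below are taken from the W3C candidate spec as drafted at the time of writing; names and signatures may still change before browsers ship it, so treat this as an illustration, not a stable API:

```javascript
// Hypothetical WebNN usage per the candidate spec: build a one-op
// graph computing c = a + b on 2x2 float32 tensors.
async function tinyWebNNDemo() {
  // Guard: WebNN is not shipped in most browsers (or in Node) yet.
  if (typeof navigator === "undefined" || !("ml" in navigator)) {
    console.log("WebNN not available in this environment");
    return null;
  }
  const context = await navigator.ml.createContext();
  const builder = new MLGraphBuilder(context);
  const desc = { dataType: "float32", dimensions: [2, 2] };
  const a = builder.input("a", desc);
  const b = builder.input("b", desc);
  // Compile the graph with one named output.
  const graph = await builder.build({ c: builder.add(a, b) });
  // Actually executing the graph (compute/dispatch) has changed
  // between drafts, so it's omitted here.
  return graph;
}
```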

[–] superb@lemmy.blahaj.zone 1 points 2 weeks ago* (last edited 2 weeks ago)

implementing native CPU instructions through WASM

This is purely a nitpick, but WASM lets you run WASM instructions, not native CPU instructions. It does let you get much closer to the speed of running native instructions, though.
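A concrete illustration of that distinction. The byte array below is a hand-assembled minimal WASM module exporting add(a, b); the page only ever supplies this portable bytecode, and it's the engine that JIT-compiles it to native machine code:

```javascript
// A minimal hand-assembled WASM module: magic + version, then type,
// function, export, and code sections defining add(a, b) -> a + b.
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm", version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

// Validation and native compilation happen inside the engine here.
const instance = new WebAssembly.Instance(new WebAssembly.Module(wasmBytes));
console.log(instance.exports.add(2, 3)); // → 5
```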

[–] dsilverz@thelemmy.club 3 points 2 weeks ago

remove the limits for ‘trusted’ advertisers

Exactly... including themselves, since they're a major player in the advertising market (Google AdSense).

[–] TootSweet@lemmy.world 15 points 2 weeks ago (2 children)

I really hope you don't know about this 4GB limit specifically because you've run up against it while doing anything real-world.

[–] Daxtron2@startrek.website 6 points 2 weeks ago (1 children)

Canvas code can get out of hand very quickly if not done right

[–] TootSweet@lemmy.world 1 points 2 weeks ago (2 children)

I've made exactly two projects that used canvas, both of which I "released" in a sense. One contains 248 KB of JS code and the other 246 KB, and that's before minification.

So I guess that means I did my canvas code right. Lol.

(Unless you meant 3d canvas or WebGL stuff with which I haven't played.)

[–] thanks_shakey_snake@lemmy.ca 6 points 2 weeks ago

I think they're referring to the memory footprint, not the source code file size.

[–] Daxtron2@startrek.website 2 points 2 weeks ago

Code size isn't really related to how much graphics data you're throwing in RAM
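The gap between script size and graphics memory is easy to put numbers on. A minimal sketch (the 4K resolution is just an illustrative assumption): every canvas pixel costs 4 bytes of RGBA data, so a single full-frame buffer of a 4K canvas already dwarfs a ~250 KB source file:

```javascript
// An RGBA canvas buffer costs width * height * 4 bytes, regardless
// of how small the drawing code that filled it is.
const canvasBytes = (w, h) => w * h * 4;

// One full-frame buffer for a 4K (3840x2160) canvas, e.g. what a
// single getImageData() snapshot would allocate:
const bytes = canvasBytes(3840, 2160);
console.log(`${(bytes / 1024 ** 2).toFixed(1)} MiB`); // → 31.6 MiB
```

Hold a few snapshots, layers, or off-screen canvases at once and the RAM cost multiplies, while the code size stays put.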

[–] dsilverz@thelemmy.club 6 points 2 weeks ago

Not yet, but I often code experiments of my own involving datasets (I like to experiment with natural language processing, randomness, programmatic art, and demoscenes; the list goes on).

[–] sysop@lemmy.world 2 points 2 weeks ago

It would just slowly accumulate it over time, a little bit here, a little bit there, until it has a whole fleet of stuff queued up to serve you: while you're generating more and more bits for more videos, it's serving you videos built from the bits you already made, and sharing them over WebSockets, which the JS CDNs force-feed our browsers, to centralized servers that pair up similar users with similar ad tastes to help compile.

Some shit like that. Adtech is cyber terrorism. Never forget.