[–] wolfshadowheart@kbin.social 1 points 1 year ago

Yup, seems like the move. Anything we can do to offload the work of loading and sifting through models will greatly increase their efficiency. MythicAI has already built analog chips that run models, and they work, but they're fairly specific to their use case (MythicAI does AI recognition in real-time video; surveillance, basically). The approach should adapt fairly easily to other types of models, though; it just needs to be popularized.
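For anyone wondering what the analog part actually does: these chips compute the big matrix-vector multiplies of a neural net inside the memory array itself, using physical currents instead of digital arithmetic, trading a little precision for a big drop in power. A rough toy model of that trade-off (plain Python/NumPy, with made-up precision and noise figures, not MythicAI's real design):

```python
import numpy as np

# Toy model of an analog in-memory matrix-vector multiply, the operation
# these chips accelerate. Weights are "programmed" into cells with limited
# precision, and the analog readout adds some noise. All numbers here are
# illustrative, not MythicAI's actual specs.

def analog_matvec(weights, x, weight_bits=8, noise_std=0.01):
    # Quantize weights to the precision an analog cell can realistically hold
    scale = np.abs(weights).max() / (2 ** (weight_bits - 1) - 1)
    programmed = np.round(weights / scale) * scale
    # In hardware the multiply-accumulate happens in the analog domain;
    # here we just emulate it digitally and add readout noise on top.
    result = programmed @ x
    noise = np.random.normal(0.0, noise_std * np.abs(result).max(), result.shape)
    return result + noise

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256))   # stand-in for one layer's weight matrix
x = rng.standard_normal(256)          # stand-in for an activation vector

exact = W @ x
approx = analog_matvec(W, x)
print("relative error:", np.linalg.norm(exact - approx) / np.linalg.norm(exact))
```

The dense multiply-accumulate is the bulk of what any neural net does, which is why the approach shouldn't stay limited to video recognition.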

Analog looks really useful since it excels at one very specific kind of math, and if we could pair an analog processor with its own system RAM module, the VRAM needed for AI could theoretically plummet. Right now we're just brute-forcing it with Tensor cores: load a model, have its results delivered, and that's roughly 400 watts for the duration... MythicAI's analog chips draw only 3.5 watts and deliver the results more quickly.
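To put those wattage figures into rough per-inference numbers (the 400 W and 3.5 W come from above; the latencies are placeholder assumptions, not measurements):

```python
# Back-of-envelope energy per inference: GPU brute force vs. analog chip.
# Power figures come from the comparison above; the latencies are
# hypothetical placeholders chosen only to illustrate the arithmetic.

gpu_power_w = 400.0        # rough draw of a GPU doing inference
analog_power_w = 3.5       # MythicAI-style analog chip

gpu_latency_s = 0.050      # assumed 50 ms per inference (hypothetical)
analog_latency_s = 0.020   # assumed 20 ms per inference (hypothetical)

gpu_energy_j = gpu_power_w * gpu_latency_s
analog_energy_j = analog_power_w * analog_latency_s

print(f"GPU:    {gpu_energy_j:.2f} J per inference")
print(f"Analog: {analog_energy_j:.3f} J per inference")
print(f"Ratio:  ~{gpu_energy_j / analog_energy_j:.0f}x less energy on the analog chip")
```

Even if the latency guesses are way off, the power gap alone is two orders of magnitude, which is the whole argument.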

Anyway, the future is looking promising. Current implementations of AI are mediocre, but their biggest hurdle has been the sheer amount of energy they take. The benefits go up immensely if the energy cost goes down, and the quality of AI is only going to get better from here. Rather than shun or abhor the technology, we should probably be looking at ways to embrace it without it needing its own electric grid.