this post was submitted on 03 Jul 2023

Machine Learning

https://public.dhe.ibm.com/ibmdl/export/pub/software/server/ibm-ai/conda/#/

The ability to run models larger than GPU memory seems extremely valuable on the face of it. Why did they give up? Not everyone has an 80GB GPU.

Was the performance too slow?
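For context, the core idea behind running a model larger than GPU memory is to keep weights in host RAM and stream one layer at a time onto the device, so peak device memory is bounded by the largest layer rather than the whole model. The trade-off is exactly the performance question raised above: every layer must cross the PCIe bus on every forward pass. A minimal sketch of the memory accounting (no real GPU; layer sizes and the `activation` budget are hypothetical numbers, not from IBM's implementation):

```python
# Simulates layer-streaming inference to show why a model larger than
# GPU memory can still run: only one layer's weights plus the current
# activation are resident on the device at a time.

def stream_inference(layer_sizes, activation=1):
    """Return the peak 'device' memory (in GB) for layer-by-layer
    streaming. Peak = max(layer) + activation, not sum(layers)."""
    peak = 0
    for size in layer_sizes:
        resident = size + activation  # copy this layer in and run it
        peak = max(peak, resident)
        # the layer's weights are evicted back to host memory here
    return peak

layers = [8, 12, 6, 10]        # hypothetical per-layer sizes in GB
total = sum(layers)            # 36 GB of weights: exceeds a 24 GB GPU
peak = stream_inference(layers)  # 13 GB: fits comfortably
```

The catch is that those 36 GB of weights are re-transferred over PCIe on every pass, which is plausibly why the performance was judged too slow to keep investing in.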

[–] hobs@lemmy.ml 1 points 1 year ago

Maybe they aren't investing in advancing Watson as quickly as they used to, or perhaps they are rearchitecting. I'm trying to upgrade legacy transformers code to TF 2.0, and it's a big lift.