- cross-posted to:
- [email protected]
- [email protected]
Running AI models without matrix math means far less power consumption—and fewer GPUs?
Let’s pop that bubble
I don’t think making LLMs cheaper and easier to run is going to “pop that bubble,” if it even is a bubble. If anything, this will boost AI applications tremendously.