GPU vs TPU vs LPU

There is hot debate over which form of AI processor is better, and which will dominate the market in the years ahead.

To my knowledge:

- Nvidia GPU - uses HBM; flexible across different AI models and applications; proven technology.

- Google TPU - uses HBM too; more rigid, built specifically for AI workloads; lower cost and lower power requirements; more efficient than a GPU.

- Groq LPU - uses SRAM; specialized for certain AI workloads, such as language processing; good for inference; much lower cost, roughly 1/5 that of a GPU.

My speculation for the next 3-5 years:

If Nvidia acquires LPU technology, it could offer flexibility at a balanced cost, with LPUs handling specific parts of the AI pipeline. That would strike a good balance.

In that case, the split between NVDA solutions and TPU-like solutions would be around 7:3.

What can be expected 5-10 years from now?

As AI matures, TPU-like solutions would prevail.

I would expect the NVDA vs TPU-like split to reach 50:50, and eventually 3:7.

Anyway, I will place bets on both NVDA and GOOG. Both look good at the moment.


Kenzo

2026 Jan 5


Note: 

https://towardsdev.com/meet-groq-making-large-language-models-lightning-fast-0f8a885073b9
