The world of AI is moving so fast that it's hard to keep up. Before we get to the hard news, watch this GenAI-powered robot, Figure 01, making a cup of coffee after learning from its mistakes. YouTube:
https://youtu.be/VZ5enBFRGZE?si=GYQejPr6kquLuPjX&t...
Apple and Google's Gemini
Apple is reportedly in talks with Google to incorporate the Gemini AI engine into iPhones. I'm surprised Apple is considering this, because they've had Siri for years, and I thought they would integrate Siri with an LLM of their own. Samsung already agreed in January to put Gemini in its Galaxy phones.
This shows that Google has a real moat with Gemini, at least for now. BTW, I find myself preferring Gemini to ChatGPT for my own daily questions and research.
Nvidia announces new Blackwell chips and software
Nvidia today announced its new Blackwell-based GB200 processors, which offer "20 petaflops in AI performance versus 4 petaflops for the H100. ... Nvidia said that the system can deploy a 27-trillion-parameter [Large Language] model. That’s much larger than even the biggest models, such as GPT-4, which reportedly has 1.7 trillion parameters. They said Amazon Web Services would build a server cluster with 20,000 GB200 chips.
Nvidia’s current Hopper-based H100 costs between $25,000 and $40,000 per chip, with whole systems that cost as much as $200,000, according to analyst estimates."
https://www.cnbc.com/2024/03/18/nvidia-announces-g...
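To put those quoted figures in perspective, here is a quick back-of-the-envelope calculation. It's a rough sketch using only the numbers above; the petaflop and parameter counts are Nvidia's and CNBC's headline figures, not independent benchmarks.

# Rough comparison using only the figures quoted above (headline numbers, not benchmarks).
GB200_PFLOPS = 20            # "20 petaflops in AI performance" per GB200
H100_PFLOPS = 4              # "versus 4 petaflops for the H100"
AWS_CLUSTER_CHIPS = 20_000   # reported size of the AWS server cluster

GPT4_PARAMS_T = 1.7          # GPT-4's reported parameter count, in trillions
BLACKWELL_MAX_PARAMS_T = 27  # Nvidia's claimed deployable model size, in trillions

speedup = GB200_PFLOPS / H100_PFLOPS
cluster_exaflops = AWS_CLUSTER_CHIPS * GB200_PFLOPS / 1_000   # 1,000 petaflops = 1 exaflop
model_scale = BLACKWELL_MAX_PARAMS_T / GPT4_PARAMS_T

print(f"GB200 vs. H100 headline AI compute: {speedup:.0f}x")                    # 5x
print(f"20,000-chip AWS cluster: ~{cluster_exaflops:,.0f} exaflops")            # ~400 exaflops
print(f"27T-parameter ceiling vs. GPT-4's reported 1.7T: ~{model_scale:.0f}x")  # ~16x

Rough as it is, that works out to a 5x per-chip jump over the H100 and an AWS cluster on the order of 400 exaflops of headline AI compute.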