What to Expect for OpenAI GPT-4 and GPT-5

OpenAI CEO Sam Altman was interviewed about GPT-4 and other AI topics at a conference one year ago. GPT-4 is coming, but the current focus is on coding, which is also where the available compute is going. GPT-4 will be a text-only model (as opposed to multi-modal), and it will not be much bigger than GPT-3, …

Read more

AI Model Trained With 174 Trillion Parameters

The BaGuaLu AI system used the Chinese Sunway exaflop supercomputer to train the largest AI model, with over 174 trillion parameters. The remarkable capabilities of neural-network AI systems like ChatGPT (AI-generated novel text and stories), DALL-E (AI-generated novel images), and AlphaFold2 (protein folding) come from the growth of AI models. …

Read more

Neuromorphic AI Advantages and Recent Progress

Neuromorphic AI develops spiking neural networks. Spiking neural networks can run different algorithms than conventional neural networks because they have temporal properties: the timing of inputs matters, not just their magnitude. They require less data and less energy than regular neural networks; energy use can drop by a factor of 100 to 1,000. Spiking neural networks thrive where they must respond to real-world inputs in real time. …
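The temporal behavior described above can be illustrated with the simplest spiking-neuron model, a leaky integrate-and-fire (LIF) neuron. This is a minimal sketch for intuition only; the parameter values (`threshold`, `leak`) are assumptions chosen for clarity, not values from any specific neuromorphic system.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# All parameter values below are illustrative assumptions.

def lif_neuron(inputs, threshold=1.0, leak=0.9, reset=0.0):
    """Integrate a sequence of input currents over discrete time steps.

    Returns the list of time steps at which the neuron spiked.
    Unlike a conventional artificial neuron, the output depends on
    the timing of inputs, not just their sum.
    """
    v = 0.0          # membrane potential
    spikes = []
    for t, current in enumerate(inputs):
        v = leak * v + current   # leaky integration of the input
        if v >= threshold:       # fire when the threshold is crossed
            spikes.append(t)
            v = reset            # reset the potential after a spike
    return spikes

# Spikes occur only when enough input arrives close together in time;
# the same total input spread out over many steps leaks away unfired.
print(lif_neuron([0.5, 0.6, 0.0, 0.2, 0.0, 0.9, 0.3]))  # → [1, 5]
```

The neuron only produces an output event at steps 1 and 5, where recent inputs accumulate past the threshold; between events it stays silent, which is the source of the energy savings claimed for spiking hardware.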

Read more

16 Wafer Scale Chips for an Exaflop AI Supercomputer

Cerebras Systems, the pioneer in accelerating artificial intelligence (AI) compute, today unveiled Andromeda, a 13.5 million core AI supercomputer, now available and being used for commercial and academic work. Built with a cluster of 16 Cerebras CS-2 systems and leveraging Cerebras MemoryX and SwarmX technologies, Andromeda delivers more than 1 Exaflop of AI compute and …

Read more