There were announcements every week in 2023 about Large Language Models. It appears there will be a similar pace of announcements about humanoid robots in 2024.
The leaders are Teslabot, Sanctuary AI and Figure. However, there are dozens of other companies.
Nextbigfuture wrote just days ago that Elon Musk will likely speed up the scaling of AI training compute beyond the planned 5 to 10 times increase in 2024 (from about 15 exaflops to 100 exaflops by October 2024 and perhaps 150 exaflops by December 2024) toward a 100 times increase. That would mean striving for roughly 1,500 exaflops of AI training compute by December 2024. Elon Musk also described a project in Kuwait for 2025 that will have zettaflops of compute and a gigawatt of power.
The timeline split of AI vs Robot Hardware has changed
the last 90 days i’ve witnessed industry leading AI in our lab running on humanoid hardware, and frankly it’s blown me away
i’m watching robots performing complex tasks entirely with neural nets. AI trained tasks that i… pic.twitter.com/B59zBIgW3l
— Brett Adcock (@adcock_brett) December 26, 2023
There is an open-source robot, ALOHA, built with standard off-the-shelf hardware by three Stanford PhDs working on it part time, that can perform cooking and household chores. There is a split between fully autonomous AI and teleoperation.
There are multiple humanoid robotics companies with advanced hand control. Sanctuary AI has about $100 million in funding and has piloted its humanoid robots performing dozens of tasks at a Canadian Tire-owned store.
Sanctuary AI's humanoid robot was making coffee two years ago. The company uses a mix of automation and teleoperation.
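To make the autonomy-versus-teleoperation split concrete, here is a minimal behavior-cloning sketch in PyTorch. It is an illustration only, not the ALOHA or Sanctuary AI code, and the sizes and data are placeholders: teleoperated demonstrations are logged as observation-action pairs, and a small network is trained to imitate them so the same hardware can later run without a human operator.

```python
# Minimal behavior-cloning sketch (illustration only, not the ALOHA or Sanctuary code).
# Teleoperated demonstrations are logged as (observation, action) pairs, then a small
# network is trained to map observations to actions so the robot can run autonomously.

import torch
import torch.nn as nn

OBS_DIM, ACT_DIM = 32, 7   # placeholder sizes: e.g. joint states in, joint targets out

# Fake "teleoperation log" standing in for recorded human demonstrations.
demo_obs = torch.randn(1000, OBS_DIM)
demo_act = torch.randn(1000, ACT_DIM)

policy = nn.Sequential(
    nn.Linear(OBS_DIM, 128), nn.ReLU(),
    nn.Linear(128, ACT_DIM),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

for epoch in range(100):
    pred_act = policy(demo_obs)                       # policy's guess at the demonstrator's action
    loss = nn.functional.mse_loss(pred_act, demo_act) # imitate the human teleoperator
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# At run time the trained policy replaces the human teleoperator:
action = policy(torch.randn(1, OBS_DIM))              # observation in, motor command out
```

Real systems use camera images and much larger sequence models, but the data flow is the same: record teleoperation, train on the demonstrations, then let the learned policy drive the robot.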

AGI? In 2024? I am 99% certain that’s a no.
Real world AI? Based on how people are defining AI so charitably, that’s a maybe.
Humanoid robots? A bigger and bigger "human interest" story in 2024; they will be in training and experimenting mode.
How much for the repair, recovery, reboot, spare parts, routine maintenance robot that comes with it?
Maintenance can be a revenue stream, much as car dealers sell insurance and regular servicing.
I agree completely! This year people will be amazed by robots in the same way they were wowed by Midjourney and ChatGPT last year. Here's a link to a stunning video by Agility Robotics asking their robot Digit to sort items using an LLM as the locomotion control driver for the robot. This was impossible just a few months ago.
https://www.youtube.com/watch?v=CnkM0AecxYA
This next video shows Digit walking around different environments at Berkeley. The group is called Hybrid Robotics at Berkeley, and they have already posted the LLM code for it on GitHub!
https://www.youtube.com/watch?v=eFoBfFhwo18
LLMs are going to absolutely turbocharge the speed at which robots are deployed in the real world doing real-world jobs. LLMs are the breakthrough tech in robotics everyone has been waiting for! By the end of this year I believe we will be watching absolutely mind-blowing videos of multiple humanoid robots doing incredibly complex tasks from a simple voice command prompt. LLMs will cause autonomous humanoid robots to arrive much faster than anyone predicted.
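To make the pattern in that Agility Robotics demo concrete, here is a rough, hypothetical sketch of the "LLM as high-level driver" idea. It is not Agility's or Berkeley's actual code; the function names and commands are made up for illustration. The language model turns a plain-language request into a short list of discrete robot commands, and a stubbed low-level controller executes each one.

```python
# Hypothetical sketch of the "LLM as high-level driver" pattern -- not Agility Robotics'
# or the Berkeley Hybrid Robotics code. A language model turns a natural-language request
# into discrete robot commands; a lower-level controller (stubbed out here) executes them.

from dataclasses import dataclass
from typing import List


@dataclass
class Command:
    action: str   # e.g. "walk_to", "pick", "place"
    target: str   # e.g. "table", "matching bin"


def mock_llm_plan(instruction: str) -> List[Command]:
    """Stand-in for a real LLM call. A deployed system would send the instruction
    plus a description of the robot's command set to a model and parse its reply."""
    if "sort" in instruction.lower():
        return [
            Command("walk_to", "table"),
            Command("pick", "item"),
            Command("walk_to", "matching bin"),
            Command("place", "matching bin"),
        ]
    return [Command("idle", "")]


def execute(cmd: Command) -> None:
    """Stub for the low-level controller (locomotion, grasping, balance)."""
    print(f"executing: {cmd.action} -> {cmd.target}")


if __name__ == "__main__":
    for cmd in mock_llm_plan("Sort the items on the table into the correct bins."):
        execute(cmd)
```

In a deployed system, the mock planning function would be replaced by a real LLM API call whose prompt lists the robot's available commands, and the execute stub would hand each command to the locomotion and manipulation controllers.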
I don’t think the progress will be so fast. Tesla will most likely be the first to do it on a massive scale.
So how does quantum computing tie into all this? Does QC have large power requirements? How does it impact the large Nvidia farms mentioned earlier?