Tesla builds FSD hardware into all Tesla cars, even for owners who do not pay for FSD. This means owners can change their minds, sign up for FSD, and have it activated. It also means the 144 trillion-operations-per-second (144 TOPS) AI computer will be available to run xAI's Grok. There are 5 million Tesla cars globally. Perhaps 60%, or 3 million, would be available to get paid to run AI computing, according to an estimate by Elon Musk. This would be about 500 ExaOPS (8-bit integer operations) based upon the current fleet, and it will likely double every two years as annual sales increase.
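The fleet-compute estimate can be checked with simple arithmetic. A minimal sketch, using only the figures from the paragraph above (5 million cars, ~60% opt-in, 144 TOPS per FSD computer):

```python
# Rough fleet inference-compute estimate from the article's figures.
FLEET_SIZE = 5_000_000   # Tesla vehicles globally
OPT_IN_RATE = 0.60       # ~60% assumed available, per Musk's estimate
TOPS_PER_CAR = 144       # FSD computer, trillions of INT8 ops/sec

cars = FLEET_SIZE * OPT_IN_RATE          # 3,000,000 cars
total_tops = cars * TOPS_PER_CAR         # fleet total, in TOPS
total_exaops = total_tops / 1_000_000    # 1 ExaOPS = 1,000,000 TOPS

print(f"{cars:,.0f} cars -> {total_exaops:.0f} ExaOPS (INT8)")
# -> 3,000,000 cars -> 432 ExaOPS (INT8)
```

The exact figure comes out to 432 ExaOPS, which the article rounds to "about 500 ExaOPS."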
Provided our vehicle AI computer is able to run the model, Tesla will probably have the most amount of true usable inference compute on Earth.
Even in a robotaxi future, the cars will only be used for ~1/3 of hours/week, leaving 2/3 for distributed inference, like SETI.
— Elon Musk (@elonmusk) November 4, 2023
Grok is designed to answer questions with a bit of wit and has a rebellious streak, so please don’t use it if you hate humor!
A unique and fundamental advantage of Grok is that it has real-time knowledge of the world via the 𝕏 platform. It will also answer spicy questions that are rejected by most other AI systems.
Grok is still a very early beta product – the best we could do with 2 months of training – so expect it to improve rapidly with each passing week with your help.
X Premium Plus costs $16 per month; ChatGPT Plus costs $20 per month.
The @xAI Grok AI assistant will be provided as part of 𝕏 Premium+, so I recommend signing up for that.
Just $16/month via web. https://t.co/wEEIZNjEkp
— Elon Musk (@elonmusk) November 4, 2023
Example of Grok vs typical GPT, where Grok has current information, but other doesn’t pic.twitter.com/hBRXmQ8KFi
— Elon Musk (@elonmusk) November 5, 2023
𝕏 is on fire today!
Elon bought a company ~13 months ago for $44B. I think it is going to be worth a lot more within just 12 months!
1. People are going to subscribe to Premium+ to get Grok
2. 𝕏 is going to be cash positive very soon, if not already!
3. I have the past 3… pic.twitter.com/3NUnHsT3AN
— Nicklas 🇸🇪🚗T🐂📈🍀♻️🚀 (@NicklasNilsso14) November 5, 2023
NEWS: More details on @xAI’s Grok chatbot:
– Strong "feel" for news
– Context window is 25k characters
– Live search engine focuses on X
– API, image, and audio recognition are planned – the current model has hints of the latter two
– A version of Grok will run natively on Tesla cars https://t.co/BckIawF0Sl
— X News Daily (@xDaily) November 4, 2023
Groooook https://t.co/kuJE4XhJdq
— Elon Musk (@elonmusk) November 5, 2023

Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technology and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.
If Tesla will give me free charges at their public chargers then they can use my car for distributed computing inference. Win-win for me and Tesla.
That’s rather silly. Those models are extremely data hungry. Typically it’s not compute bound but bandwidth bound. How do you feed such a system?
Starlink V2 adding direct 4G-LTE connection to unmodified mobile devices means all existing Tesla vehicles could use Starlink for up to a few Mbps connection – which is enough for many functions including some uses of this distributed AI inference compute.
Tesla will likely pay cars to opt into distributed computing inference.
They damned well better, if they do this. They’re not renting the car to you, they sold it.
Who will pay for the electricity consumption?
exactly
Something like 500 watts continuous for 20 hours per day? 10 kWh per day.
Most electricity in the US is 15c/kWh or less.
$1.50 per day per vehicle.
Belgium is 0.4 euros per kWh
Are you serious? That’s insane. In Canada I pay about 10 cents Canadian, or 0.07 euros.
Yes unfortunately, during 2022 one kWh was above 1 euro, crazy indeed.
Since $1.50 × 365 = $547.50, if your car lasts 10 years you are donating $5,475 in electricity to X; if the car lasts 20 years, it is $10,950; and if the car lasts 37 years (as some very enthusiastic estimates of battery life suggest), you end up donating $20,257.50 in electricity to X/Twitter. It is not nothing.
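The commenter's lifetime-cost arithmetic checks out. A quick sketch, assuming the 10 kWh/day and $0.15/kWh figures cited earlier in the thread:

```python
# Lifetime electricity cost of running the car's AI computer for inference.
KWH_PER_DAY = 10        # 500 W continuous for 20 hours/day
PRICE_PER_KWH = 0.15    # typical US rate cited above, $/kWh

daily_cost = KWH_PER_DAY * PRICE_PER_KWH   # $1.50 per day
for years in (10, 20, 37):
    total = daily_cost * 365 * years
    print(f"{years} years: ${total:,.2f}")
# -> 10 years: $5,475.00
# -> 20 years: $10,950.00
# -> 37 years: $20,257.50
```

At a Belgian-style rate of 0.40 euros/kWh, every figure would be roughly 2.7 times higher.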
Why would anyone donate anything? They’ll just get paid for allowing the use of their asset, like regular people.
I would not donate anything to X/Tesla or anyone else, but you assume that you can opt out of Autopilot and then also opt out of being exploited as a free-lunch datacenter.
I really can’t see them doing this without compensating the owner for well-over the cost electricity.
That’s a steal too; cloud computing prints money.
My house uses 6 kWh per day, so that represents an enormous increase.
Average US household electricity consumption is around 30 kWh/day.
X/Tesla/Elon Musk/X users, obviously. The car owner decides if it’s enough money when they choose to opt in.