Elon Musk indicates that the FSD computers in all Tesla cars could be used to run AI models. This would let Tesla car owners share in the revenue of a Virtual Cloud Compute (VCC). The VCC would be a compute version of the Powerwall Virtual Power Plant (VPP). For comparison, through the Emergency Load Reduction Program (ELRP), homeowners receive $2.00 for every additional kWh their Powerwall delivers during an event.
In Texas, Tesla Electric members who meet the eligibility criteria automatically become part of the Virtual Power Plant. While participating, your Powerwall is dispatched when the grid needs support, and you earn $10 per Powerwall per month on your electric bill in exchange for its contribution. This is in addition to the monthly Sellback Credits earned for energy you send back to the grid. You do not have to compromise your energy security to participate.
Cern Basher calculates that Tesla HW3 car owners could make about $400 per year participating in a VCC, and about $1,000 per year with HW4.
Making $1,000 per year would reduce the cost of ownership by 10-12%.
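As a rough sanity check on those figures, here is a minimal sketch in Python. The per-hardware earning rates are Cern Basher's estimates from above; the annual cost of ownership is not stated directly, so it is back-solved here from the claimed 10-12% reduction.

```python
# Back-of-envelope check on per-owner VCC earnings.
# Earning rates are Cern Basher's estimates; the ownership cost
# range is back-solved from the claimed 10-12% reduction.

HW3_EARNINGS = 400    # USD/year, HW3 estimate
HW4_EARNINGS = 1000   # USD/year, HW4 estimate

# If $1,000/year cuts cost of ownership by 10-12%, the implied
# annual cost of ownership is $1,000/0.12 to $1,000/0.10.
cost_low = HW4_EARNINGS / 0.12   # ~$8,333/year
cost_high = HW4_EARNINGS / 0.10  # ~$10,000/year

print(f"Implied annual cost of ownership: ${cost_low:,.0f}-${cost_high:,.0f}")
print(f"HW3 earnings offset that cost by: "
      f"{HW3_EARNINGS / cost_high:.0%}-{HW3_EARNINGS / cost_low:.0%}")
```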
I think the value could be higher if xAI is more successful and higher-value AI models are run on the VCC.
Provided our vehicle AI computer is able to run the model, Tesla will probably have the most amount of true usable inference compute on Earth.
Even in a robotaxi future, the cars will only be used for ~1/3 of hours/week, leaving 2/3 for distributed inference, like SETI.
— Elon Musk (@elonmusk) November 4, 2023
What if your Tesla could earn money while parked?
Elon Musk’s new idea could turn your car’s downtime into a cash machine, … or will it?
Some have suggested this might be Tesla’s next multi-billion dollar market opportunity.
How much might a Tesla owner make by allowing… pic.twitter.com/DXX3hFJXkR
— Cern Basher, CFA (@CernBasher) November 7, 2023
Tesla currently has a global fleet of 5 million cars.
Ten million cars could generate about $4 billion per year with the VCC.
A hundred million cars by 2030 (most with more powerful and profitable HW4, HW5 and HW6 chips) could generate about $40-100 billion of revenue each year.
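A minimal sketch of the fleet-level arithmetic behind these projections (the per-car rates are Cern Basher's estimates, the fleet sizes are the article's projections, and the idle fraction comes from Musk's tweet above):

```python
# Fleet-level VCC revenue scenarios. Per-car rates follow Cern
# Basher's estimates; fleet sizes follow the article's projections.

HOURS_PER_WEEK = 168
IDLE_FRACTION = 2 / 3   # Musk: cars used ~1/3 of hours, idle ~2/3
idle_hours = HOURS_PER_WEEK * IDLE_FRACTION  # ~112 h/week for inference

scenarios = [
    # (fleet size, low USD/car/year, high USD/car/year)
    (10_000_000,  400,   400),   # near-term, mostly HW3 economics
    (100_000_000, 400, 1_000),   # 2030 fleet, mostly HW4/HW5/HW6
]

print(f"Idle hours available per car: ~{idle_hours:.0f} h/week")
for fleet, low, high in scenarios:
    lo_b, hi_b = fleet * low / 1e9, fleet * high / 1e9
    label = f"${lo_b:.0f}B" if lo_b == hi_b else f"${lo_b:.0f}B-${hi_b:.0f}B"
    print(f"{fleet / 1e6:.0f}M cars: {label} per year")
```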
Tesla Making Profitable Assets for Customers
Powerwalls and solar panels for VPPs (Virtual Power Plants)
Megapacks earning profit for utilities via Autobidder
Car FSD chips powering the VCC (Virtual Cloud Compute)
Teslabots will provide profitable work and will also earn VCC and VPP money.

Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technologies and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology.
Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels.
A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.
AI processing isn’t just crunching numbers like SETI or crypto mining; it also involves shunting huge amounts of data around. A Starlink connection wouldn’t be up to the job.
Each car, or node, would only be able to attack its own very small, discrete part of the problem. That’s the main reason distributed computing has never taken off: it’s only suitable for limited use cases, and certainly not for AI language models.
Inference is a pre-trained NN executing a prompt. With multimodal models, that might include sound, images, text or relatively small data files. The only data that has to be uploaded or downloaded is the prompt and the response to it. LTE is easily capable of that, and every Tesla already has it. Gen 2 Starlink would let every Tesla use its mobile-device LTE to connect anywhere with a view of the sky. Teslas are mobile devices with 4G but much more powerful inference hardware than most others.
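The commenter's point can be checked with rough numbers. This is a sketch only; the payload sizes and LTE throughput below are illustrative assumptions, not measured figures.

```python
# Rough check that LTE bandwidth suffices for inference payloads.
# Payload sizes and link speed are illustrative assumptions.

LTE_MBPS = 10     # conservative real-world LTE throughput, megabits/s
PROMPT_KB = 4     # a few KB of prompt text or a small file
RESPONSE_KB = 8   # generated text response
IMAGE_MB = 2.0    # a multimodal image input

def transfer_seconds(megabits: float) -> float:
    """Time to move a payload over the assumed LTE link."""
    return megabits / LTE_MBPS

text_round_trip = transfer_seconds((PROMPT_KB + RESPONSE_KB) * 8 / 1000)
image_upload = transfer_seconds(IMAGE_MB * 8)

print(f"Text prompt + response: ~{text_round_trip * 1000:.0f} ms of link time")
print(f"{IMAGE_MB:.0f} MB image upload:  ~{image_upload:.1f} s")
# Training is the workload a cellular link cannot support: gradient
# exchange per step can run to gigabytes, thousands of times per job.
```

Even the multimodal case stays within what a 4G link handles routinely; the bandwidth objection applies to training traffic, not to prompt/response inference.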
FSD car computers are optimized for inference, not training.
Training is extremely compute-intensive, and that is where the market is.
There are several distributed platforms, often mixed with blockchains, that provide this service. You buy some tokens (crypto) and state the parameters for the training job you want processed. The system allocates suitable hardware among thousands of distributed providers, who get paid in tokens. The job is sent out, processed and sent back. The concept is tried and tested, and works a bit like crypto mining, except the compute power is hopefully used for something more productive.
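A minimal sketch of how such a token-based allocation loop can work in the abstract. This is not any particular platform's API; the names, fields and matching rule below are all hypothetical.

```python
# Hypothetical token-based compute marketplace. Not any real
# platform's API; all names and fields are illustrative.

from dataclasses import dataclass

@dataclass
class Provider:
    node_id: str
    tflops: float         # advertised compute capability
    price_per_hour: int   # asking price in tokens

@dataclass
class Job:
    job_id: str
    min_tflops: float     # buyer's stated hardware requirement
    budget: int           # tokens the buyer escrows for the job

def allocate(job: Job, providers: list[Provider]) -> Provider | None:
    """Pick the cheapest provider that meets the job's hardware bar."""
    eligible = [p for p in providers if p.tflops >= job.min_tflops]
    return min(eligible, key=lambda p: p.price_per_hour, default=None)

providers = [
    Provider("node-a", tflops=35.0, price_per_hour=5),
    Provider("node-b", tflops=120.0, price_per_hour=12),
]
job = Job("train-001", min_tflops=100.0, budget=500)

print(allocate(job, providers))  # node-b: the only node meeting the bar
# In a real system the job is then dispatched, results verified, and
# the escrowed tokens released to the provider on completion.
```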
However, the FSD computers can’t deal with a training workload for many reasons.
I doubt there is a market for inference. The whole point of AI inference is that it’s lightweight and energy-efficient (cheap) to run once a model is trained.
Just like us humans: we spend decades learning, training and educating ourselves. Once done, we only need some food and water once in a while to solve almost any problem quickly.