Editor’s note: This story was previously published early this month. It has since been updated to include the most relevant information on Dojo and other custom AI.
According to Morgan Stanley, Tesla (TSLA) stock is worth $400, and it has nothing to do with its electric vehicles.
Rather, that hefty valuation is thanks to its “Dojo” supercomputer, which could create $500 billion in value by enhancing the EV maker’s AI capabilities.
That’s a big figure.
And yet, it understates the enormous potential economic impact of Tesla’s supercomputer.
Dojo could revolutionize the entire $15.7 trillion AI industry, unleashing a new era of innovation and growth in artificial intelligence.
That’s why you need to act now and invest in the best AI stocks on the market before Elon Musk’s Dojo transforms the global economy.
Here’s the opportunity…
Dojo: The Brains Behind Tesla
Dojo is Tesla’s supercomputer and the “brain” behind its self-driving operations. It processes the driving data from Tesla cars on the road and develops self-driving algorithms that power the autonomous vehicle capabilities in Tesla cars.
Dojo is Tesla’s AI.
It is an amazing technology. How can Tesla cars drive themselves autonomously while other cars lack full self-driving capability? The answer is Dojo.
It is an AI dedicated to creating autonomous vehicles. It has access to the world's largest self-driving data set, collected from Tesla cars on the road, and it was built and refined by some of the world's top AI engineers.
Dojo is arguably the most important technology for unlocking the future of the global automotive market.
But it is also much more than that.
It marks a tipping point for the “AI Revolution.”
Until now, most AI applications have been built on GPUs from Nvidia (NVDA). The semiconductor company has established itself as the maker of the best general-purpose AI chips, and most major AI models in the world today were built on Nvidia chips.
Dojo represents a significant departure from that norm.
A Custom AI Supercomputer for Self-Driving Cars
Tesla used to power all of its self-driving operations with a large Nvidia GPU-based supercomputer. Dojo is set to replace that.
In other words, Tesla – like everyone else – used to rely on Nvidia GPUs to power its AI. Now, though, it has developed its own supercomputer running its own chips, custom-built for its own AI needs.
And that gets to the core theme of the second wave of the “AI Boom” that Dojo is ushering in right now: customization.
The first wave of the “AI Boom” played out in 2023, and it was all about a race to build broad, general-purpose AI models. Companies didn’t much care what AI models they were building; they just wanted to build some sort of AI model to get a foot in the door of the “AI Boom.”
But Dojo marks a critical shift into the second wave of the “AI Boom” – one that will transform the industry in 2024 and beyond.
Dojo isn’t built for broad, general-purpose AI. It is built for a narrow, task-specific purpose: creating the AI that powers self-driving cars.
It’s customized AI.
Welcome to the Era of Custom AI
To power this custom AI, Tesla is building its own custom chips.
It isn’t alone.
Amazon (AMZN) has developed two AI chips customized specifically for building AI models on the firm’s cloud service, AWS.
One is for high-performance inference (AWS Inferentia) and one is for deep-learning training (AWS Trainium). Together, Amazon believes these two AI chips could power all AI functions on AWS in the future.
Meanwhile, Alphabet (GOOGL, GOOG) is already on the fifth generation of its custom Tensor Processing Units, or TPUs, for neural network development. These custom AI chips are built specifically to optimize the integration of AI into search and advertising.
Microsoft (MSFT) is making a big bet on its own chip designs, with one focused on artificial intelligence and the other focused on cloud computing, as it looks to compete with semiconductor giants and reduce its reliance on other companies.
The Maia 100 chip is aimed at AI workloads, and could go up against offerings from Nvidia. The chip will be available for Azure cloud customers and it is already being tested with Bing and Office AI products.
One of the most prominent users of the Maia 100 is OpenAI, which is backed by billions of dollars of investment from Microsoft. OpenAI is testing the chip for its ambitious and cutting-edge AI projects.
The other chip, called Cobalt, is designed for cloud computing tasks, such as running software applications and databases. Both Maia and Cobalt, which are fabricated on 5-nanometer process technology from Taiwan Semiconductor (TSM), are slated to show up in Microsoft data centers next year.
Last but not least, Meta (META) is developing its own custom chip for running AI models, dubbed the MTIA chip – or Meta Training and Inference Accelerator chip. The MTIA chip is part of a family of chips that Meta is developing for both training and inference workloads. [Editor’s note: Training is the process of teaching an AI model to perform a task, while inference is the process of using a trained model to make predictions or decisions – John Kilhefner, Senior Managing Editor.]
The MTIA chip is designed specifically for inference workloads, which are ubiquitous at Meta and form the basis for a wide range of use cases, such as content understanding, feeds, generative AI, and ads ranking. By rethinking how to innovate across its infrastructure, Meta is creating a scalable foundation to power emerging opportunities in areas like generative AI and the metaverse.
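To make the training-versus-inference distinction from the editor’s note concrete, here is a minimal toy sketch: a one-parameter model that is first trained (its weight is adjusted to fit data) and then used for inference (the frozen weight makes predictions on new input). This is purely illustrative and has nothing to do with Meta’s actual MTIA workloads or any real framework.

```python
# Toy example: training vs. inference with a one-parameter model y = w * x.
X = [1.0, 2.0, 3.0, 4.0]
Y = [2.0 * x for x in X]  # ground truth the model should learn: y = 2x

# Training: repeatedly nudge the weight w to shrink the mean squared error.
w = 0.0
lr = 0.01
for _ in range(500):
    grad = sum(2 * (w * x - y) * x for x, y in zip(X, Y)) / len(X)
    w -= lr * grad

# Inference: apply the trained, now-fixed weight to new, unseen input.
def predict(x):
    return w * x

print(predict(10.0))  # approximately 20.0 after training
```

Training is the expensive loop that runs once (or occasionally); inference is the cheap prediction step that runs billions of times in production, which is why chips like Inferentia and MTIA target it specifically.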
The Final Word on Dojo
So, folks, you’ve seen it: the world’s most important tech companies have made their move, betting big on custom AI. They’re not satisfied with relying on Nvidia anymore.
They want to dominate the AI race on their own terms.
As such, there will be a huge reshuffling in the industry.
The old winners of 2023, like Nvidia, won’t shine quite as bright as new challengers arise.
Who will emerge as the new leaders of the $15.7 trillion AI Revolution?
You don’t want to miss this chance to find out.
On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article.