
News

Elon Musk-founded OpenAI gets $1 billion boost from Microsoft investment

[Source: OpenAI]


Microsoft’s interest in expanding its Azure cloud computing service to include artificial intelligence (AI) supercomputing technologies has led to a new partnership agreement with the Elon Musk-backed company OpenAI. Microsoft recently invested $1 billion in the venture to develop an Azure-based hardware and software platform that will scale to artificial general intelligence (AGI). In turn, OpenAI will use Microsoft as its exclusive cloud provider.

OpenAI is a nonprofit AI research organization co-founded by Musk and Y Combinator’s Sam Altman, with early backing from funders including serial entrepreneur Peter Thiel, with the goal of developing beneficial, open source AI to counter any future rise of harmful AI. Musk stepped down from the Board of Directors in early 2018 to avoid any conflicts with Tesla’s Autopilot program; however, he remains a benefactor and advisor. Tesla’s Director of AI and Autopilot Vision, Andrej Karpathy, previously worked as a neural network researcher at OpenAI.

While the venture is backed by significant private investment, the long-term goals of OpenAI require even greater resources. The company’s motivation to create the new investment partnership with Microsoft was partially due to financial constraints caused by its computing hardware needs. The financial requirements to retain top talent are also significant: OpenAI’s 2016 tax filings revealed its top researcher was paid a $1.9 million salary, with others receiving substantial amounts as well.

Harry Shum of Microsoft and Sam Altman of OpenAI discuss their new partnership and the future of AI. | Image: Microsoft/YouTube

“OpenAI is producing a sequence of increasingly powerful AI technologies, which requires a lot of capital for computational power. The most obvious way to cover costs is to build a product, but that would mean changing our focus. Instead, we intend to license some of our pre-AGI technologies, with Microsoft becoming our preferred partner for commercializing them,” OpenAI’s press release announcing the new partnership explained.

The connection between Microsoft and OpenAI is not new. In 2016, the companies jointly announced they were working together to run most of OpenAI’s large-scale experiments on Azure, making it their primary cloud platform for deep learning and AI. Azure had hardware configurations optimized for AI computing needs and a roadmap to expand those capabilities even further. One of the stated joint goals between Microsoft and OpenAI is the democratization of AI, and cloud computing is a large part of making that a reality, as hardware and software resources no longer need to be local to the user.


OpenAI has already created some impressive AI capabilities. In August 2018, bots the company created for the video game Dota 2 defeated a team of highly skilled human players in two games out of three. Accomplishing the task required significant hardware and training. The nonprofit research lab employed a scaled-up version of Proximal Policy Optimization running on 256 GPUs and 128,000 CPU cores to complete roughly 180 years’ worth of gameplay every day through reinforcement learning, which allowed the bots to develop advanced skills for the game. The company also released OpenAI Gym, an open source toolkit for training AI with games.
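To give a sense of what the Proximal Policy Optimization method mentioned above actually computes, here is a minimal sketch of its clipped surrogate objective for a single sample. This is an illustration only: OpenAI’s Dota 2 system was a large-scale distributed implementation with neural network policies, and the epsilon value here is an assumed default, not a detail from the article.

```python
def ppo_clipped_objective(ratio, advantage, epsilon=0.2):
    """Clipped surrogate objective for one sampled action.

    ratio:     pi_new(a|s) / pi_old(a|s), how much the updated policy
               changed the probability of the action that was taken.
    advantage: estimate of how much better the action was than average.
    epsilon:   clip range; limits how far a single update can move
               the policy away from the one that collected the data.
    """
    # Clamp the probability ratio into [1 - epsilon, 1 + epsilon].
    clipped = max(min(ratio, 1 + epsilon), 1 - epsilon)
    # Taking the minimum makes the objective pessimistic: large policy
    # shifts cannot be rewarded beyond the clipped estimate.
    return min(ratio * advantage, clipped * advantage)

# A large positive advantage cannot push the objective past the
# clipped ratio, which keeps each update conservative.
print(ppo_clipped_objective(1.5, 2.0))   # capped at 1.2 * 2.0 = 2.4
print(ppo_clipped_objective(0.5, -1.0))  # capped at 0.8 * -1.0 = -0.8
```

Conservative per-update steps like this are what let training run stably for months across hundreds of GPUs without the policy collapsing.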

In 2017, OpenAI announced that it had successfully trained its AI-powered robots to perform a task after watching it once in virtual reality. After being shown how to stack a series of colored blocks in a virtual reality simulation, a robot was able to successfully mimic the actions. To accomplish this, OpenAI trained the robot in a simulated, virtual environment with nuances like lighting, shadows, and background noise so that, in the real environment, it knew to filter out noise and focus only on the important elements, much as a human brain would.

OpenAI also successfully taught AI bots to create their own language for communicating with each other in 2017. A paper published on the topic explained how the bots used reinforcement learning to accomplish simple goals through trial and error. After being given clues such as “Go to” or “Look at” by the researchers, the bots were then required to create their own machine language to communicate with each other.

The company’s latest partnership with Microsoft will now expand its access to the resources needed to achieve even more impressive artificial intelligence feats.




Elon Musk

Elon Musk reveals date of Tesla Full Self-Driving’s next massive release

Initially planned for a January or February release, v14.3 aims to add some reasoning and logic to the decisions that Full Self-Driving makes, which could improve a lot of things, including Navigation, currently a major complaint of many owners.


Tesla CEO Elon Musk revealed the release timing of Full Self-Driving’s next massive update: v14.3.

For months, Tesla owners with Hardware 4 have been using Full Self-Driving v14.2 and its subsequent point releases. Currently, the most up-to-date FSD version is v14.2.2.5, which has drawn mixed reviews: with each release, some things improve while others regress slightly.

For the most part, overall behavior has improved.

However, many owners have been looking forward to the next release, v14.3, about which Musk has said many great things. Back in November, Musk said that v14.3 “is where the last big piece of the puzzle lands.”


He added:

“We’re gonna add a lot of reasoning and RL (reinforcement learning). To get to serious scale, Tesla will probably need to build a giant chip fab. To have a few hundred gigawatts of AI chips per year, I don’t see that capability coming online fast enough, so we will probably have to build a fab.”


Tesla Full Self-Driving v14.2 is a considerable improvement over early versions of the suite, but we have written about the somewhat confusing updates that have come with recent versions.



These updates have been incredibly difficult to gauge in terms of progress: some things have improved, but there seems to be real regression in a handful of areas, especially confidence and assertiveness.

Musk confirmed today on X that Tesla is already testing v14.3 internally. It will see a wide release “in a few weeks,” so we should probably expect it by late April.

Overall, there are high hopes that v14.3 could be a true game changer for Tesla Full Self-Driving, as many believe it could be the version running on the Robotaxis in Austin, Texas, some of which operate driverless and unsupervised.

It could also include some major additions, such as “Banish,” also referred to as “Reverse Summon,” which would have the vehicle find a parking spot on its own after dropping occupants off at their destination.

What Tesla will roll out, and exactly when it arrives, remains to be seen, but fans have been ready for a new version, as v14.2.2.5 has run its course. Many readers have told us their biggest request is fixing Navigation errors, which seem to be among the most universal complaints from daily FSD users.


Cybertruck

Chattanooga Charge: Tesla and EV fans ready for the Southeast’s wildest Tesla party

From Cybertruck Convoys to Kid-Friendly Fun Zones: The Chattanooga Charge Has Something for Everyone


Hundreds of like-minded Tesla and EV enthusiasts are descending on Chattanooga Charge this weekend for the largest Tesla meet in the Southeast, taking place March 20–22, 2026, at the stunning Tennessee Riverpark.

If you were there last year, you’ll know it’s the ultimate experience: see the wildest Teslas in action, check out the best in EV tech, and, arguably the most fun of all, finally put a name to a face and connect with those social media buddies IRL. Oh, and the epic nighttime Tesla light show, which transforms the Riverpark into something out of a sci-fi film, simply must be seen in person.

This year’s event takes everything up a notch, with over 100 Cybertrucks expected to be on display, many sporting jaw-dropping modifications and custom wraps that push the boundaries of what these stainless steel beasts can look like.

Whether you’re a diehard Tesla fan, EV supporter, or just EV-mod-curious, the sheer spectacle is worth the drive.


The Chattanooga Charge doesn’t wait until Saturday morning to get started. The weekend technically kicks off Friday, March 20th, and the venue sets the tone immediately. Come share roadtrip stories over drinks at the W-XYZ Rooftop Bar on the top floor of the Aloft Chattanooga Hamilton Place Hotel, with sunset views over the city.

Come morning, nurse your hangover with some good coffee, then convoy with hundreds of other Tesla and EV drivers through Chattanooga to the event for morning meet and greets before the speaker panel starts and the food trucks fire up.

Tesla owner clubs travel from across the country to be here, not just to show off their vehicles, but to connect, share, and celebrate a shared passion for the future of driving.


Sounds like a plan to me. See you there, guys. Don’t miss it. Get your tickets at ChattanoogaCharge.com and join the charge. 🔋⚡

Chattanooga Charge is a premier Tesla and EV gathering inspired by the X Takeover, known as one of the largest Tesla events. What began as a bold idea from the team at DIY Wraps/TESBROS, hosted in their hometown of Chattanooga, Tennessee, quickly became a movement across social media. The first annual Chattanooga Charge united over 16 Tesla clubs from 16 states, proof that the EV community was hungry for something big in the South. Year after year, the event has grown in scale, ambition, and heart.


News

Tesla Full Self-Driving gets latest bit of scrutiny from NHTSA

The analysis impacts roughly 3.2 million vehicles across the company’s entire lineup, and aims to identify how the suite’s degradation detection systems work and how effective they are when the cars encounter difficult visibility conditions.


Credit: Tesla

The National Highway Traffic Safety Administration (NHTSA) has elevated its probe into Tesla’s Full Self-Driving (Supervised) suite to an Engineering Analysis.


A step up to an Engineering Analysis is often required before the NHTSA orders an automaker to issue a recall. However, it is not a guarantee that a recall will be issued.

The NHTSA wants to examine Tesla FSD’s ability to assess road conditions with reduced visibility, as well as to detect degradation and alert the driver with sufficient time to respond.

The Office of Defects Investigation (ODI) will evaluate the performance of FSD in degraded roadway conditions and the updates or modifications Tesla makes to the degradation detection system, including the timing, purpose, and capabilities of the updates.


Tesla routinely ships software updates to improve the capabilities of the FSD suite, so it will be interesting to see if various versions of FSD are tested. Interestingly, you can find many examples from real-world users of FSD handling snow-covered roads, heavy rain, and single-lane backroads.

However, there are incidents that the NHTSA has cited to justify the probe. The agency said:

“Available incident data raise concerns that Tesla’s degradation detection system, both as originally deployed and later updated, fails to detect and/or warn the driver appropriately under degraded visibility conditions such as glare and airborne obscurants. In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.”

The report goes on to say that a review of Tesla’s responses revealed additional crashes in similar environments in which FSD “did not detect a degraded state, and/or it did not present the driver with an alert with adequate time for the driver to react. In each of these crashes, FSD also lost track of or never detected a lead vehicle in its path.”


The next steps of the NHTSA Engineering Analysis require the agency to gather further information on Tesla’s attempts to upgrade the degradation detection system. It will also analyze six recent potentially related incidents.

The investigation is listed as EA26002.
