Elon Musk-founded OpenAI gets $1 billion boost from Microsoft investment


Microsoft’s interest in expanding its Azure cloud computing service to include artificial intelligence (AI) supercomputing technologies has led to a new partnership agreement with the Elon Musk-backed company, OpenAI. Microsoft recently invested $1 billion in the venture to develop an Azure-based hardware and software platform that will scale to artificial general intelligence (AGI). In turn, OpenAI will use Microsoft as its exclusive cloud provider.

OpenAI is a nonprofit AI research organization co-founded by Musk and Y Combinator’s Sam Altman, with backing from serial entrepreneur Peter Thiel, to develop beneficial, open source AI that could counter any future rise of harmful AI. Musk stepped down from the Board of Directors in early 2018 to avoid conflicts of interest with Tesla’s Autopilot program; however, he remains a benefactor and advisor. Tesla’s Director of AI and Autopilot Vision, Andrej Karpathy, previously worked as a neural network researcher at OpenAI.

While the venture is backed by significant private investment, the long-term goals of OpenAI require even greater resources. The company’s motivation to create the new investment partnership with Microsoft was driven in part by the financial constraints of its computing hardware needs. The cost of retaining top talent is also significant: OpenAI’s 2016 tax filings revealed its top researcher was paid a $1.9 million salary, with others receiving substantial compensation as well.

Harry Shum of Microsoft and Sam Altman of OpenAI discuss their new partnership and the future of AI. | Image: Microsoft/YouTube

“OpenAI is producing a sequence of increasingly powerful AI technologies, which requires a lot of capital for computational power. The most obvious way to cover costs is to build a product, but that would mean changing our focus. Instead, we intend to license some of our pre-AGI technologies, with Microsoft becoming our preferred partner for commercializing them,” OpenAI’s press release announcing the new partnership explained.

The connection between Microsoft and OpenAI is not new. In 2016, the companies jointly announced they were working together to run most of OpenAI’s large-scale experiments on Azure, making it their primary cloud platform for deep learning and AI. Azure had hardware configurations optimized for AI computing needs and a roadmap to expand those capabilities even further. One of the stated joint goals between Microsoft and OpenAI is the democratization of AI, and cloud computing is a large part of making that a reality as hardware and software resources are no longer required to be local to the user.


OpenAI has already demonstrated some impressive AI capabilities. In August last year, bots the company created for the video game Dota 2 defeated a team of highly skilled human players in two games out of three. Accomplishing this required serious amounts of hardware and training: the nonprofit research lab ran a scaled-up version of Proximal Policy Optimization (PPO) on 256 GPUs and 128,000 CPU cores, playing through roughly 180 years’ worth of gameplay every day via reinforcement learning, which allowed the bots to develop advanced skills for the game. The company also released Gym, an open source toolkit for training AI agents on games.
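At the heart of PPO is a clipped surrogate objective that limits how far each policy update can move away from the previous policy. A minimal, illustrative sketch of that core idea (not OpenAI’s actual implementation, which trains full neural network policies) might look like this:

```python
def ppo_clip_objective(ratio, advantage, eps=0.2):
    """PPO's clipped surrogate objective for a single action.

    ratio: new_policy_prob / old_policy_prob for the action taken
    advantage: estimated advantage (how much better the action was than average)
    eps: clipping range (0.2 is the value used in the original PPO paper)
    """
    # Clamp the probability ratio into [1 - eps, 1 + eps]
    clipped_ratio = max(1.0 - eps, min(1.0 + eps, ratio))
    # Take the minimum of the raw and clipped terms, so large policy
    # shifts can never be rewarded beyond the clipping range.
    return min(ratio * advantage, clipped_ratio * advantage)

# A ratio of 1.5 with a positive advantage gets clipped back to 1.2
print(ppo_clip_objective(1.5, 1.0))  # prints 1.2
```

In training, this objective is averaged over batches of sampled actions and maximized by gradient ascent; the clipping is what allows PPO to scale up stably across the hundreds of GPUs described above.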

In 2017, OpenAI announced that it had successfully trained its AI-powered robots to perform a task after watching it performed once in virtual reality. After being shown how to stack a series of colored blocks in a virtual reality simulation, a robot was able to successfully mimic the actions. To accomplish this, OpenAI trained the robot in a simulated virtual environment with nuances like lighting, shadows, and background noise so that, in the real environment, it knew to filter out noise and focus only on the important elements, as a human brain would.

OpenAI also successfully taught AI bots to create their own language for communicating with each other in 2017. A paper published on the topic explained how the bots used reinforcement learning to accomplish simple goals through trial and error. After being given cues such as “Go to” or “Look at” by the researchers, the bots were required to create their own machine language to communicate with one another.

The company’s latest commitment to Microsoft will now expand its access to the resources needed to achieve even more impressive artificial intelligence feats.



Tesla Full Self-Driving gets latest bit of scrutiny from NHTSA

The analysis impacts roughly 3.2 million vehicles across the company’s entire lineup, and aims to identify how the suite’s degradation detection systems work and how effective they are when the cars encounter difficult visibility conditions.



The National Highway Traffic Safety Administration (NHTSA) has elevated its probe into Tesla’s Full Self-Driving (Supervised) suite to an Engineering Analysis.


An Engineering Analysis is typically the step the NHTSA takes before telling an automaker to issue a recall. However, it is not a guarantee that a recall will follow.

The NHTSA wants to examine Tesla FSD’s ability to assess road conditions with reduced visibility, as well as to detect degradation and alert the driver with sufficient time to respond.

The Office of Defects Investigation (ODI) will evaluate the performance of FSD in degraded roadway conditions and the updates or modifications Tesla makes to the degradation detection system, including the timing, purpose, and capabilities of the updates.


Tesla routinely ships software updates to improve the capabilities of the FSD suite, so it will be interesting to see if various versions of FSD are tested. Interestingly, you can find many examples from real-world users of FSD handling snow-covered roads, heavy rain, and single-lane backroads.

However, there are incidents that the NHTSA has cited to justify the probe, at least for now. The agency said:

“Available incident data raise concerns that Tesla’s degradation detection system, both as originally deployed and later updated, fails to detect and/or warn the driver appropriately under degraded visibility conditions such as glare and airborne obscurants. In the crashes that ODI has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.”

The report goes on to say that a review of Tesla’s responses revealed additional crashes in similar environments in which FSD “did not detect a degraded state, and/or it did not present the driver with an alert with adequate time for the driver to react. In each of these crashes, FSD also lost track of or never detected a lead vehicle in its path.”


The next steps of the NHTSA Engineering Analysis require the agency to gather further information on Tesla’s attempts to upgrade the degradation detection system. It will also analyze six recent potentially related incidents.

The investigation is listed as EA26002.


SpaceX’s Starship V3 is almost ready and it will change space travel forever

SpaceX is targeting April for the debut test launch of Starship V3 (“Version 3”).


SpaceX is closing in on one of the most anticipated rocket launches in history as the company readies for a planned April test launch and debut of its next-generation Starship V3.

The latest iteration, Starship V3, has a slightly taller Super Heavy booster and Starship upper stage than their predecessors and produces stronger, more efficient thrust using SpaceX’s upgraded Raptor 3 engines. V3 also features increased propellant capacity, targeting a total payload capacity of over 100 tons to low Earth orbit, compared to around 35 tons for its predecessor. With Musk’s lifelong aspiration to one day colonize Mars, the increased payload capacity matters enormously, because Mars missions require moving massive amounts of cargo, fuel, and eventually people. But the most critical upgrade may be orbital refueling. SpaceX’s entire deep space architecture depends on moving large amounts of propellant in space, and orbital refueling capability turns Starship from just a rocket into a true transport system. Without it, neither the Moon nor Mars is reachable at scale.

With a fully reusable Starship and Super Heavy, SpaceX aims to drive marginal launch costs down by as much as tenfold compared to current market leaders. To put that in perspective, getting a kilogram of cargo to orbit today costs thousands of dollars. Bring that number down far enough, and space stops being an exclusive domain. That price point unlocks mass deployment of satellite constellations, large-scale science payloads, and affordable human transport beyond Earth orbit. It also means the Moon stops being a destination we visit and starts being one we inhabit.


Elon Musk pivots SpaceX plans to Moon base before Mars

NASA expects Starship to take off for the Moon’s South Pole in 2028, with the ultimate goal of establishing a permanently crewed science station there. A successful V3 flight this spring keeps that timeline alive. As for Mars, Musk has shifted focus toward building a self-sustaining city on the Moon first, arguing that the Moon can be reached every 10 days versus Mars’s 26-month alignment window. Mars remains the horizon, but the Moon is the proving ground.

Elon Musk hasn’t been shy about hyping the upcoming Starship V3 launch. In a social media post on Wednesday, he confirmed the first V3 flight is getting closer. SpaceX also announced that its initial activation campaign for V3 and Starbase Pad 2 was complete, wrapping up several days of cryogenic fuel testing on a V3 vehicle for the first time. The countdown is on. April can’t come soon enough.


Tesla Cybertruck gets long-awaited safety feature

Tesla has announced the rollout of its innovative anti-dooring protection feature to the Cybertruck via the 2026.8 software update.



Tesla is rolling out a new and long-awaited feature to the Cybertruck all-electric pickup: a safety addition geared toward protecting pedestrians and cyclists and preventing collisions with other vehicles.


This safety enhancement uses the vehicle’s existing cameras to detect approaching cyclists, pedestrians, or vehicles in the blind spot while parked. Upon attempting to open a door, if a hazard is detected, the system activates: the blind spot indicator light flashes, an audible chime sounds, and the door will not open on the initial button press.

Drivers must wait briefly and press the button again to override, providing crucial seconds to avoid an accident.
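The described button behavior amounts to a small state machine: a first press while a hazard is detected warns and holds the door, and a subsequent press overrides. A minimal sketch of that logic, with hypothetical names (this is not Tesla’s code, and the brief-wait timer is omitted for simplicity):

```python
class DoorGuard:
    """Illustrative sketch of the described anti-dooring behavior."""

    def __init__(self):
        self.pending_override = False

    def press_door_button(self, hazard_detected):
        """Return True if the door opens on this press."""
        if not hazard_detected:
            # No hazard in the blind spot: the door opens normally.
            self.pending_override = False
            return True
        if self.pending_override:
            # Second press with a hazard still present: driver overrides.
            self.pending_override = False
            return True
        # First press with a hazard: flash the blind spot indicator,
        # sound a chime, and hold the door closed.
        self.pending_override = True
        return False
```

For example, with a cyclist approaching, the first press would return `False` (door held, warning issued) and a second press would return `True` (override).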


The feature, also known as Blind Spot Warning While Parked, comes standard on every new Model 3 and Model Y, and is now extending to the Cybertruck. Leveraging Tesla’s vision-based system without requiring new hardware, it represents a cost-effective software solution that builds on community suggestions dating back to 2018.


This technology addresses the persistent danger of “dooring,” where a driver opens a car door into the path of a passing cyclist or pedestrian.


Dooring incidents are alarmingly common in urban environments.

According to Chicago data, in 2011 alone, there were 344 reported dooring crashes, accounting for approximately 20 percent of all bicycle crashes in the city, nearly one incident per day.


While numbers have fluctuated (dropping to 11 percent in 2014 before rising again), dooring consistently represents 10-20 percent of bike-related crashes in major cities.

A national analysis of emergency department data estimates over 17,000 dooring-related injuries treated in the U.S. over a decade, with many involving fractures, contusions, and head trauma, particularly affecting upper extremities.

By automatically intervening, Tesla’s system not only protects vulnerable road users but also safeguards its owners from potential liability and enhances overall road safety.

As cities promote cycling for sustainable transport, features like this demonstrate how advanced driver assistance and camera systems can evolve beyond highway driving to everyday urban scenarios.


Enthusiastic responses on social media highlight appreciation for the proactive safety measure, with some calling for broader rollout to older models where hardware permits. Tesla continues to push the boundaries of vehicle safety through over-the-air updates, making its fleet smarter and safer over time.
