
Elon Musk’s OpenAI to battle in Dota 2 World Championship video game tournament

OpenAI, a research lab co-founded by Elon Musk, has developed a new breed of AI agents that are capable of playing Dota 2, a complex strategy game, in 5-on-5 multiplayer matches. OpenAI’s new bots have so far been able to beat amateur and semi-professional teams. With this accomplished, the research lab is now looking to bring its bots to The International, Dota 2’s premier tournament, this coming August.

The new bots go by the name of OpenAI Five, a reference to the number of neural networks working together in the team. To train the neural networks, the AI has been playing roughly 180 years’ worth of gameplay every day using reinforcement learning. This enables the AI to learn the intricacies of the game, which is far more complicated than board games like Chess and Go. Dota 2, for example, hides information from players, so the system cannot perceive the entire playing field at any given moment.

The hardware employed by the research lab to train OpenAI Five is impressive. The five neural networks train through a scaled-up version of Proximal Policy Optimization running on 256 GPUs and 128,000 CPU cores. The same setup was adopted at a much smaller scale last year when OpenAI rolled out an artificial intelligence system that proved capable of beating the best Dota 2 players in the world in 1-on-1 matches.
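OpenAI has not published the training code described here, but the core idea of Proximal Policy Optimization is its clipped surrogate objective, which limits how far a single update can move the policy away from the one that gathered the data. The sketch below is a minimal, illustrative NumPy version of that objective (not OpenAI's actual implementation); the function name, batch values, and default clip range of 0.2 are assumptions for the example.

```python
import numpy as np

def ppo_clip_loss(ratio, advantage, epsilon=0.2):
    """Clipped surrogate objective used by PPO.

    ratio:     pi_new(a|s) / pi_old(a|s) for each sampled action
    advantage: estimated advantage of each sampled action
    epsilon:   clip range (0.2 is a commonly used default)
    """
    unclipped = ratio * advantage
    # Clipping the ratio caps the incentive to push the new policy
    # far from the old one in a single gradient step.
    clipped = np.clip(ratio, 1.0 - epsilon, 1.0 + epsilon) * advantage
    # PPO maximizes the elementwise minimum; we return its negation
    # so the result can be minimized by a standard optimizer.
    return -np.minimum(unclipped, clipped).mean()

# Toy batch of 3 probability ratios and advantage estimates
ratios = np.array([1.5, 0.8, 1.0])
advs = np.array([2.0, -1.0, 0.5])
loss = ppo_clip_loss(ratios, advs)  # -> -0.7
```

In practice this loss is computed over mini-batches of self-play experience and combined with value-function and entropy terms; the 256-GPU setup the article mentions parallelizes exactly this kind of update across many simultaneous games.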

Currently, however, OpenAI Five can only play the game with several restrictions. For one, the AI system can only use five of the 115 heroes available in the game. Skills such as Invisibility, Summons, and the placement of wards are also disabled. The research lab, however, hopes that over time, the neural networks will be able to play the game without any restrictions at all.

As can be seen in a recent video shared by the research lab, OpenAI Five is actually being received well by the Dota 2 community. Professional Dota 2 player Blitz, for one, noted that the bots are adopting strategies that are incredibly effective. In a match against OpenAI Five, Blitz, together with four employees of the research lab, put up a fight before getting dominated by the artificial intelligence. In a statement after the game, Blitz sheepishly stated that the bots capitalized on every small error he made during the match.


“I think the team fight aspect of the bot(s) was excellent. It didn’t mess up. When it came to coordination, it was some of the best pure team fighting because it felt like I was getting hammered every single time I made a mistake. I feel like normal humans don’t do that,” the professional Dota 2 player said.

So what’s the secret behind OpenAI Five? In a statement to The Verge, OpenAI CTO Greg Brockman noted that unlike human players, the bots have “no ego” when they play the game. The teamwork aspect of the bots was also trained by allowing them to work individually at first, then encouraging them to work together.

“The bots are totally willing to sacrifice a lane or abandon a hero for the greater good. For fun, we had a human drop in to replace one of the bots. We hadn’t trained them to do anything special, but he said he just felt so well-supported. Anything he wanted, the bots got him,” Brockman said.

Ultimately, Brockman is encouraged by OpenAI Five’s development so far. The research, after all, is motivated by the idea that if AI systems can be trained to perform complex tasks such as learning a game as intricate as Dota 2, they could eventually be used to solve equally complex real-world challenges. Some examples of real-world applications could be designing and managing a city’s transport infrastructure, or the logistics of a massive business.

“This is an exciting milestone, and it’s really because it’s about transitioning to real-life applications. If you’ve got a simulation of a problem and you can run it at a large enough scale, there’s no barrier to what you can do with this,” he said.


Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips--or even to just say a simple hello--send a message to his email, simon@teslarati.com or his handle on X, @ResidentSponge.



Tesla FSD v14.2.2 is getting rave reviews from drivers

So far, early testers have reported buttery-smooth drives with confident performance, even at night or on twisty roads.

Credit: @BLKMDL3/X

Tesla Full Self-Driving (Supervised) v14.2.2 is receiving positive reviews from owners, with several drivers praising the build’s lack of hesitation during lane changes and its smoother decision-making, among other improvements.

The update, which started rolling out on Monday, also adds features like dynamic arrival pin adjustment. So far, early testers have reported buttery-smooth drives with confident performance, even at night or on twisty roads.

Owners highlight major improvements

Longtime Tesla owner and FSD user @BLKMDL3 shared a detailed 10-hour impression of FSD v14.2.2, noting that the system exhibited “zero lane change hesitation” and “extremely refined” lane choices. He praised Mad Max mode’s performance, stellar parking in locations including ticket dispensers, and impressive canyon runs even in dark conditions.

Fellow FSD user Dan Burkland reported an hour of FSD v14.2.2’s nighttime driving with “zero hesitations” and “buttery smooth” confidence reminiscent of Robotaxi rides in areas such as Austin, Texas. Veteran FSD user Whole Mars Catalog also demonstrated voice navigation via Grok, while Tesla owner Devin Olsen completed a nearly two-hour drive with FSD v14.2.2 in heavy traffic and rain with strong performance.

Closer to unsupervised

FSD has been receiving rave reviews, even from Tesla’s competitors. Xpeng CEO He Xiaopeng, for one, offered fresh praise for FSD v14.2 after visiting Silicon Valley. Following extended test drives of Tesla vehicles running the latest FSD software, He stated that the system has made major strides, reinforcing his view that Tesla’s approach is indeed the proper path towards autonomy.


According to He, Tesla’s FSD has evolved from a smooth Level 2 advanced driver assistance system into what he described as a “near-Level 4” experience in terms of capabilities. While acknowledging that areas of improvement are still present, the Xpeng CEO stated that FSD’s current iteration significantly surpasses last year’s capabilities. He also reiterated his belief that Tesla’s strategy of using the same autonomous software and hardware architecture across private vehicles and robotaxis is the right long-term approach, as it would allow users to bypass intermediate autonomy stages and move closer to Level 4 functionality.



Elon Musk’s Grok AI to be used in U.S. War Department’s bespoke AI platform

The partnership aims to provide advanced capabilities to 3 million military and civilian personnel.

Credit: xAI

The U.S. Department of War announced Monday an agreement with Elon Musk’s xAI to embed the company’s frontier artificial intelligence systems, powered by the Grok family of models, into the department’s bespoke AI platform GenAI.mil. 

The partnership aims to provide advanced capabilities to 3 million military and civilian personnel, with initial deployment targeted for early 2026 at Impact Level 5 (IL5) for secure handling of Controlled Unclassified Information.

xAI Integration

As noted by the War Department’s press release, GenAI.mil, its bespoke AI platform, will gain access to xAI’s suite of government-focused tools, which enable real-time global insights from the X platform for “decisive information advantage.” The rollout builds on xAI’s July launch of products for U.S. government customers, including federal, state, local, and national security use cases.

“Targeted for initial deployment in early 2026, this integration will allow all military and civilian personnel to use xAI’s capabilities at Impact Level 5 (IL5), enabling the secure handling of Controlled Unclassified Information (CUI) in daily workflows. Users will also gain access to real‑time global insights from the X platform, providing War Department personnel with a decisive information advantage,” the Department of War wrote in a press release. 

Strategic advantages

The deal marks another step in the Department of War’s efforts to use cutting-edge AI in its operations. xAI, for its part, highlighted that its tools can support administrative tasks at the federal, state and local levels, as well as “critical mission use cases” at the front line of military operations.


“The War Department will continue scaling an AI ecosystem built for speed, security, and decision superiority. Newly IL5-certified capabilities will empower every aspect of the Department’s workforce, turning AI into a daily operational asset. This announcement marks another milestone in America’s AI revolution, and the War Department is driving that momentum forward,” the War Department noted.



Tesla FSD (Supervised) v14.2.2 starts rolling out

The update focuses on smoother real-world performance, better obstacle awareness, and precise end-of-trip routing, among other improvements.

Credit: Grok Imagine

Tesla has started rolling out Full Self-Driving (Supervised) v14.2.2, bringing further refinements to its most advanced driver-assist system. The new FSD update focuses on smoother real-world performance, better obstacle awareness, and precise end-of-trip routing, among other improvements.

Key FSD v14.2.2 improvements

As noted by Not a Tesla App, FSD v14.2.2 upgrades the vision encoder neural network with higher resolution features, enhancing detection of emergency vehicles, road obstacles, and human gestures. New Arrival Options let users select preferred drop-off styles, such as Parking Lot, Street, Driveway, Parking Garage, or Curbside, with the navigation pin automatically adjusting to the user’s ideal spot for precision.

Other additions include pulling over for emergency vehicles, real-time vision-based detours for blocked roads, improved gate and debris handling, and extreme Speed Profiles for customized driving styles. Reliability gains cover fault recovery, residue alerts on the windshield, and automatic narrow-field camera washing for new 2026 Model Y units.

FSD v14.2.2 also improves handling of unprotected turns, lane changes, cut-ins, and school bus scenarios, among other things. Tesla also noted that users’ FSD statistics will be saved under Controls > Autopilot, which should help drivers easily view how much they are using FSD in their daily drives.

Key FSD v14.2.2 release notes

Full Self-Driving (Supervised) v14.2.2 includes:

  • Upgraded the neural network vision encoder, leveraging higher resolution features to further improve scenarios like handling emergency vehicles, obstacles on the road, and human gestures.
  • Added Arrival Options for you to select where FSD should park: in a Parking Lot, on the Street, in a Driveway, in a Parking Garage, or at the Curbside.
  • Added handling to pull over or yield for emergency vehicles (e.g. police cars, fire trucks, ambulances).
  • Added navigation and routing into the vision-based neural network for real-time handling of blocked roads and detours.
  • Added additional Speed Profile to further customize driving style preference.
  • Improved handling for static and dynamic gates.
  • Improved offsetting for road debris (e.g. tires, tree branches, boxes).
  • Improved handling of several scenarios, including unprotected turns, lane changes, vehicle cut-ins, and school buses.
  • Improved FSD’s ability to manage system faults and recover smoothly from degraded operation for enhanced reliability.
  • Added alerting for residue build-up on interior windshield that may impact front camera visibility. If affected, visit Service for cleaning!
  • Added automatic narrow field washing to provide rapid and efficient front camera self-cleaning, and optimized aerodynamics wash at higher vehicle speeds.
  • Reduced camera visibility can lead to increased attention monitoring sensitivity.

Upcoming Improvements:

  • Overall smoothness and sentience.
  • Parking spot selection and parking quality.