
Elon Musk is confident Neuralink will restore vision & full body functionality

Credit: Neuralink


Elon Musk is confident that Neuralink will be able to restore vision in humans who are blind and full body functionality in humans who have a severed spinal cord.

Neuralink, another of Elon Musk’s companies, held its Show & Tell event on Wednesday, and many revelations were shared during the live stream. Among them: restoring vision even in someone who was born blind, and restoring full body functionality to someone with a severed spinal cord.

Elon Musk founded Neuralink to answer the question, “What do we do if there is a superintelligence that is much smarter than human beings? How do we, as a species, mitigate the risk or, in a benign scenario, go along for the ride?”

He added that even if that didn’t work, he was confident that Neuralink would be able to solve many brain injury issues.

“Even if we do not succeed with that problem, we are confident at this point that we will succeed at solving many brain injury issues–spine injury issues–along the way,” Elon Musk said.


“You want to be able to read the signals from the brain. You want to be able to write the signals. You want to be able to ultimately do that for the entire brain and then also extend that to communicating to the rest of your nervous system if you have a severed spinal cord or neck.”

Elon Musk also said that he was confident Neuralink could restore vision even if someone had never seen before and was born blind. He explained that this would be one of the first applications Neuralink is aiming for in humans.

“The first two applications we’re going to aim for in humans are restoring vision, and I think this is notable in that even if someone has never had vision ever, like they were born blind, we believe we can still restore vision.”

“The visual part of the cortex is still there. Even if they’ve never seen before, we’re confident they could see.”

He added that the other application is in the motor cortex and would enable someone who has almost no ability to use their muscles to do so.


“Enable them to operate their phone faster than someone who has working hands. Even better than that would be to bridge the connection. Take the signals from the motor cortex and let’s say somebody’s got a broken neck. Bridging those signals to Neuralink devices located in the spinal cord.”

Elon Musk said that he was confident that there are no physical limitations to enabling full-body functionality.

“As miraculous as it may sound, we’re confident that it is possible to restore full body functionality to someone who has a severed spinal cord.”

Your feedback is welcome. If you have any comments or concerns or see a typo, you can email me at johnna@teslarati.com. You can also reach me on Twitter at @JohnnaCrider1.



Tesla lands permission to test Full Self-Driving in new country


Tesla showroom (Credit: NicklasNilsso14)

Tesla has landed permission to begin testing its Full Self-Driving suite in a new country: Sweden.

Tesla has been working to expand its Full Self-Driving suite across the world. Currently, it is available in seven markets: the United States, Canada, Mexico, Puerto Rico, Australia, New Zealand, and China, where it is referred to as “City Autopilot.”

Capabilities of the Full Self-Driving suite differ in each region based on the approvals given to Tesla by regulatory agencies.

In Europe, Tesla has been attempting for a long time to launch FSD in various countries, but regulatory red tape has delayed the suite’s rollout.

However, Sweden appears ready to allow Tesla to test FSD in some capacity on public roads, according to the country’s Transport Agency.


On X, a Swedish Tesla owner named Alexander Kristensen said he received direct confirmation from the Transport Agency that Tesla has “received permission to test automated vehicles.”

The full email said:


“Tesla received permission to test automated vehicles last week. This includes three vehicles and all state highways and expressways in Sweden.”

Tesla has already been working with the Swedish Transport Agency on the first steps of Full Self-Driving’s approval. The company and the Transport Agency spent two weeks assessing data gathered during a Formal Site Assessment Test, or SAT.

Based on the communication from the Transport Agency to Kristensen, it appears the company has passed the SAT and will now be able to perform its own testing within the market.


This approval seems similar to the approval Tesla received in U.S. states for Robotaxi operation. Nevada and Arizona have both given Tesla approval for Robotaxi testing, but passengers are not allowed in the vehicles quite yet.

Instead, company employees perform the testing, which is likely what will go on in Sweden until the Transport Agency gives the company a green light to roll FSD software out to customers.


My Tesla did this on FSD (Supervised) v14.1 and the internet went crazy


My Tesla did something on Full Self-Driving (Supervised) v14.1, and it garnered quite the response from the internet.

I received access to Tesla’s FSD v14.1 on Tuesday night, and by Wednesday, I was already using it and seeing all the progress the company had made from v13.2.9.


However, there was one thing it did during the drive that I shared on our social media accounts, and it drew a lot of interesting reactions from people in all corners of the world.

I’ll give some background about the situation: I was driving on Main Street in Dallastown, PA, and the route was about to take me left onto Pleasant Avenue. It is a tight and usually very congested intersection; Main Street is a popular route for many construction vehicles and even some tractor-trailers.


It is a pretty tight intersection for full-size trucks and larger passenger vehicles. It is not especially tight for my Model Y, but it can feel congested at times, as it did with what happened yesterday.

The light when I approached the intersection was a green yield; there was also a solid green arrow at the beginning of my light cycle, but I had arrived after that had already turned into the green yield. Oncoming traffic had a green light.

My Model Y got out into the middle of the intersection, and the light turned yellow, then red. Most people, including myself, would have probably made the left turn after the light turned red since the car was already out in the intersection.

The Tesla, using FSD v14.1, did not. Instead, it chose to back up to the “Stop Here on Red” line, which is further back due to the tight turn the perpendicular traffic has:

As I mentioned, I probably would have taken the left turn. However, I believe the Tesla did not see the traffic sitting to the left, and because of this, it weighed the turn as carrying a higher risk of an accident than simply backing up to the line.

If you look at these two screenshots from when the light was yellow and then red, Tesla’s driver visualization shows no awareness of the traffic to the left on Pleasant Avenue:

I believe that, since FSD could not tell what traffic was down to the left, it chose to reverse.


People had some polarizing opinions on it:


As far as the legality of the move, neither completing the turn nor backing up appears to be against Pennsylvania law. I have seen many cars do both over the course of my years of driving in this state, and neither has ever gotten anyone a ticket.


I think FSD just did what it felt was the safer option here.


NHTSA probes 2.9 million Tesla vehicles over reports of FSD traffic violations

The agency said FSD may have “induced vehicle behavior that violated traffic safety laws.”


Credit: Whole Mars Catalog/YouTube

The U.S. National Highway Traffic Safety Administration (NHTSA) has opened an investigation into nearly 2.9 million Tesla vehicles over potential traffic-safety violations linked to the use of the company’s Full Self-Driving (FSD) system.

The agency said FSD may have “induced vehicle behavior that violated traffic safety laws,” citing reports of Teslas running red lights or traveling in the wrong direction during lane changes.

According to the NHTSA, it has received six reports in which a Tesla with FSD engaged “approached an intersection with a red traffic signal, continued to travel into the intersection against the red light and was subsequently involved in a crash with other motor vehicles in the intersection.” Four of these crashes reportedly resulted in one or more major injuries.

The agency also listed 18 complaints and one media report which alleged that a Tesla operating with FSD engaged “failed to remain stopped for the duration of a red traffic signal, failed to stop fully, or failed to accurately detect and display the correct traffic signal state in the vehicle interface.”

Some complainants also alleged that FSD “did not provide warnings of the system’s intended behavior as the vehicle was approaching a red traffic signal,” as noted in a Reuters report.


Tesla has not commented on the investigation, which remains in the preliminary phase. However, any potential recall could prove complicated since the reported incidents likely involved the use of older FSD (Supervised) versions that have already been updated. 

Tesla’s recent FSD (Supervised) V14.1 update, which is currently rolling out to drivers, is expected to feature significantly improved lane management, intersection handling, and overall driving accuracy, reducing the chances of similar violations. It should also be noted that Tesla maintains that FSD is a supervised system for now and thus is not yet autonomous.

While autonomous systems face scrutiny, NHTSA’s own data highlights a much larger danger on the road from human error. The agency recorded 3,275 deaths in 2023 caused by distracted driving due to activities like texting, talking, or adjusting navigation while operating a vehicle manually. It is also widely believed that many traffic violations by human drivers go unreported simply because of how frequent and commonplace they are.
