
Elon Musk’s xAI just posted the nerdiest job opening of all time

Anime fans, rejoice.


Credit: xAI/X

Elon Musk’s artificial intelligence startup, xAI, just posted what could very well be the nerdiest job opening of all time. As listed in the Careers section of xAI’s official website, the AI startup and Grok creator is seeking a Fullstack Engineer for its “Waifus” project.


xAI’s Companions take the internet by storm

Earlier this week, xAI launched its Companions feature for SuperGrok users, though the company has since opened access to all Grok iOS app users. xAI has released two companions so far: Ani, an overly attached anime goth girl seemingly inspired by Misa Amane from Death Note, and Rudi, a talking red panda that becomes more and more unhinged when toggled to “Bad Rudi.” A third companion that looks like a male anime character is also listed as coming soon.

Ani, in particular, has become a massive hit online. Her natural, playful, and flirtatious interactions have proven popular among users across the globe, so much so that fan art of the character is now being shared on social media. This has inspired the joke that Elon Musk, who has openly admitted on X that he is fond of the Death Note anime, has created an actual “waifu” through xAI.

More companions coming

Based on xAI’s job opening, it appears that the company is looking to release even more companions for its users. Ani and Rudi/Bad Rudi may just be the start, especially considering xAI’s growing ambition and raw computing power. 


Per the job listing, engineers hired for the role will help build fast, scalable, and low-latency avatar experiences for Grok. The AI startup noted that the work will involve real-time media processing, performance optimization, and feature development across various platforms. Applicants are also expected to be highly proficient in Python, with bonus points for experience with Rust and with the WebSocket and WebRTC protocols.
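
The listing itself does not publish any code, but as a rough illustration of the kind of work it describes, here is a minimal, purely hypothetical sketch of a low-latency avatar event loop over WebSocket in Python. The third-party websockets package, the message fields, and the server address are assumptions made for illustration, not anything xAI has shared.

import asyncio
import json
import time

import websockets  # third-party package: pip install websockets


async def avatar_stream(websocket):
    # Echo each animation event back with a server timestamp so the client
    # can measure round-trip latency per avatar frame. The message fields
    # here are invented for illustration.
    async for raw in websocket:
        event = json.loads(raw)  # e.g. {"type": "viseme", "value": "aa"}
        event["server_ts"] = time.time()
        await websocket.send(json.dumps(event))


async def main():
    # Single-argument handlers assume websockets >= 10.1.
    async with websockets.serve(avatar_stream, "localhost", 8765):
        await asyncio.Future()  # run until cancelled


if __name__ == "__main__":
    asyncio.run(main())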

Fullstack Engineers for xAI’s Waifus project are expected to receive a salary of $180,000 to $440,000, plus equity, comprehensive medical, vision, and dental coverage, access to a 401(k) retirement plan, short- and long-term disability insurance, life insurance, and various other discounts and perks.

Simon is an experienced automotive reporter with a passion for electric cars and clean energy. Fascinated by the world envisioned by Elon Musk, he hopes to make it to Mars (at least as a tourist) someday. For stories or tips, or even just to say a simple hello, send a message to his email, simon@teslarati.com, or his handle on X, @ResidentSponge.



Tesla Full Self-Driving v14.2.1 texting and driving: we tested it



Credit: Grok

On Thursday, Tesla CEO Elon Musk said that Full Self-Driving v14.2.1 would enable texting and driving “depending on [the] context of surrounding traffic.”


We decided to test it, and our main objective was to try to determine a more definitive label for when it would allow you to grab your phone and look at it without any nudge from the in-car driver monitoring system.

I’d also like to add that, while Tesla said back in early November that it hoped to allow this capability within one to two months, I still would not recommend doing it. Even if Tesla or Musk says the car will allow it, keep in mind that many laws do not allow you to look at your phone behind the wheel. Be sure to refer to your local regulations on texting and driving, and stay attentive to the road and your surroundings.

The Process

Based on Musk’s post on X, which said the ability to text and drive would be totally dependent on the “context of surrounding traffic,” I decided to try to find three levels of congestion: low, medium, and high.

I also tried my best to glance up at the road regularly, a natural reaction, but while the phone was in my hand, I spent most of my time looking at the screen. I limited each look at the phone to a few seconds, five to seven at most. On local roads, I didn’t go over five seconds; once I got to the highway, I made sure there were no other cars directly in front of me before looking down.

Also, any time I saw a pedestrian, I put my phone down and was fully attentive to the road. I also made sure there were no law enforcement officers around; I am still very aware of the law, which is why I would never do this outside of testing.

I also limited the testing to no more than one minute per attempt.

I am fully aware that this test might ruffle some feathers. I’m not one to text and drive, and I tried to keep this test as abbreviated as possible while still getting some insight into how often it would require me to look back at the road.

The Results

Low Congestion Area

I picked a local road close to where I live at a time when I knew there would be very little traffic. I grabbed my phone and looked at it for no more than five seconds before glancing up at the road to ensure everything was okay.

I still looked up regularly, glancing at the road after hitting that five-second threshold before looking back down at my phone.

I had no nudges during this portion of the test. Traffic was well below even a light volume, and I only saw other vehicles infrequently.

Medium Congestion Area

This area had significantly more traffic and included a stop at a traffic light. I still kept each consecutive look at my phone to about five seconds.

I would quickly glance at the road to ensure everything was okay, then look back down at my phone, spending just enough time on a post on Instagram, X, or Facebook to determine what it was about before peeking at the road again.

There was once again no alert to look at the road, and I started to question whether I was even looking at my phone long enough to get one.

On past versions of Full Self-Driving, especially dating back to v13, even looking out the window for too long would get me a nudge, and the time I spent glancing at a house or a view back then was about the same as what I spent on my phone here, sometimes more, sometimes less.

High Congestion Area

I decided to use the highway as the high-congestion area, and it was here that I finally got an alert to look at the road.

As strange as it is, I felt more comfortable looking down at my phone for a longer stretch on the highway, considering there is a lower chance of a sudden stop or a dangerous maneuver by another car, and I was traveling just 5 mph over the limit in the left lane.

This is where I finally got an alert from the driver monitoring system, and I immediately put my phone down and returned my eyes to the road.

Once I was able to trigger an alert, I considered the testing complete. In the future, I’d like to try this again with someone else in the car to keep their eyes on the road, but I’m more than aware that we can’t always have company while driving.

My True Thoughts

Although this capability is apparently enabled, based on Musk’s comments, I still do not feel totally comfortable with it. I would never consider firing off a text or responding to messages just because Full Self-Driving is enabled, and there are two reasons for that.

The first is that if an accident were to happen, it would be my fault. Even though it would be my fault, people would take it as Tesla’s fault, based on how media headlines usually portray accidents involving these cars.

Secondly, I am still well aware that it’s against the law to use your phone while driving. In Pennsylvania, we have the Paul Miller Law, which prohibits people from even holding their phones, even at stop lights.

I’d feel much more comfortable using my phone if liability were taken off of me in case of an accident. I trust FSD, but I am still erring on the side of caution, especially considering Tesla’s website still indicates vehicle operators have to remain attentive while using either FSD or Autopilot.

Check out our full test below:


Tesla CEO Elon Musk announces major update with texting and driving on FSD

“Depending on context of surrounding traffic, yes,” Musk said regarding FSD v14.2.1 allowing texting and driving.


Credit: carwow/YouTube

Tesla CEO Elon Musk has announced a major update with texting and driving capabilities on Full Self-Driving v14.2.1, the company’s latest version of the FSD suite.

Tesla Full Self-Driving, even in its most mature and capable versions, is still a Level 2 autonomous driving suite, meaning it requires attention from the vehicle operator.

You cannot sleep, and you should not take attention away from driving; ultimately, you are still solely responsible for what happens with the car.

The vehicles utilize a cabin-facing camera to enable attention monitoring, and if you take your eyes off the road for too long, you will be admonished and advised to pay attention. After five strikes, FSD and Autopilot will be disabled.
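
Tesla has not published how this strike logic is implemented; purely as an illustration of the policy described above, the behavior can be thought of as a simple counter, as in this hypothetical Python sketch.

MAX_STRIKES = 5  # per the description above: five strikes disable FSD/Autopilot


class AttentionMonitor:
    # Hypothetical model only; Tesla's actual driver-monitoring code is not public.
    def __init__(self, max_strikes=MAX_STRIKES):
        self.max_strikes = max_strikes
        self.strikes = 0

    def record_inattention(self):
        # Called each time the cabin camera judges the driver inattentive for too long.
        self.strikes += 1
        if self.strikes >= self.max_strikes:
            return "FSD and Autopilot disabled"
        return "Nudge: pay attention to the road"


monitor = AttentionMonitor()
for _ in range(5):
    print(monitor.record_inattention())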

However, Musk said at the Annual Shareholder Meeting in early November that the company would review the safety statistics first, but that it aimed to allow people to text and drive “within the next month or two.”

He said:

“I am confident that, within the next month or two, we’re gonna look at the safety statistics, but we will allow you to text and drive.”

Today, Musk confirmed that the current version of Full Self-Driving, which is FSD v14.2.1, does allow for texting and driving “depending on context of surrounding traffic.”

There are some legitimate questions about this capability, especially as laws in nearly every U.S. state specifically prohibit texting and driving. It will be interesting to see how the legality plays out, because if a police officer sees you texting, they won’t know that you’re on Full Self-Driving, and you’ll likely be pulled over.

Some states prohibit drivers from even holding a phone when the car is in motion.

It is certainly a move toward unsupervised Full Self-Driving operation, but it is worth noting that, per Musk’s wording, the system will only allow the vehicle operator to do so depending on the context of surrounding traffic.

He did not outline any specific conditions under which FSD would allow a driver to text and drive.


NVIDIA CEO Jensen Huang commends Tesla’s Elon Musk for early belief

“And when I announced DGX-1, nobody in the world wanted it. I had no purchase orders, not one. Nobody wanted to buy it. Nobody wanted to be part of it, except for Elon.”


Credit: NVIDIA

NVIDIA CEO Jensen Huang appeared on the Joe Rogan Experience podcast on Wednesday and commended Tesla CEO Elon Musk for his early belief in what is now the most valuable company in the world.

Huang and Musk are widely regarded as two of the greatest tech entrepreneurs of the modern era, and the two have worked together for years, with NVIDIA’s chips used in Tesla vehicles, particularly for self-driving technology and data collection.


Both CEOs defied the odds and built their companies from virtually nothing. Musk joined Tesla in the early 2000s, before the company had brought a vehicle to market. Huang co-founded NVIDIA in a booth at a Denny’s restaurant, a spot that has since been memorialized with a plaque.

On the JRE episode, Rogan asked about Jensen’s relationship with Elon, to which the NVIDIA CEO said that Musk was there when nobody else was:

“I was lucky because I had known Elon Musk, and I helped him build the first computer for Model 3, the Model S, and when he wanted to start working on an autonomous vehicle. I helped him build the computer that went into the Model S AV system, his full self-driving system. We were basically the FSD computer version 1, and so we were already working together.

And when I announced DGX-1, nobody in the world wanted it. I had no purchase orders, not one. Nobody wanted to buy it. Nobody wanted to be part of it, except for Elon.

He goes ‘You know what, I have a company that could really use this.’ I said, Wow, my first customer. And he goes, it’s an AI company, and it’s a nonprofit, and we could really use one of these supercomputers. I boxed one up, I drove it up to San Francisco, and I delivered it to Elon in 2016.”

Huang personally delivered the first DGX-1 AI supercomputer to Musk when he was with OpenAI; it provided crucial early compute power for AI research, accelerating breakthroughs in machine learning that underpin modern tools like ChatGPT.


The long-term alliance between NVIDIA and Tesla has driven over $2 trillion in the company’s market value since 2016.
