Google wants to make “good” AI with your help

Google office in Zurich [Credit: Google]

As a company whose reach extends to well over a billion people, Google is taking both its immense technical capabilities and its social responsibility seriously. Through its just-announced “AI Impact Challenge,” it has pledged tangible support to organizations that want to address societal challenges using artificial intelligence. Whether an idea needs coaching, grant funding from a $25 million pool, or cloud credits and consulting, Google will be there to help.

Toward this effort, the company has already provided an educational guide to machine learning, the primary tool it wants organizations to use in their problem-solving. It might seem counterintuitive for applicants to need training in the very thing they’re proposing, but that is part of the point of Google’s support. To quote Google’s project page directly: “We want people from as many backgrounds as possible to surface problems that AI can help solve, and to be empowered to create solutions themselves…We don’t expect applicants to be AI experts.” Submissions are open until January 22, 2019, and winners will be announced in spring 2019.

Need inspiration for an idea? Or perhaps some examples of the kinds of problems artificial intelligence can help solve? Google’s page dedicated to its “AI for social good” mission features projects already working toward societally beneficial goals. Here’s a breakdown of some of them:

  • The “Smart Wildfire Sensor” is a device that identifies and predicts areas in a forest that are susceptible to wildfires. To do this, it uses data from tools measuring wind speed, wind direction, humidity, and temperature combined with Google’s open source machine learning tool TensorFlow for photographic analysis of biomass (accumulated fallen branches and trees).
  • Whales face preventable dangers such as entanglement in fishing gear and collisions with vessels, a challenge being addressed by using whale songs and machine learning to locate where the animals are singing from. The National Oceanic and Atmospheric Administration (NOAA) uses underwater audio recordings to identify areas where whales are present and mitigate the dangers there. The thousands of hours of accumulated recordings presented a data challenge well suited to Google’s existing sound-classification AI, which now supports NOAA’s conservation efforts.
  • Birth asphyxia, one of the leading causes of infant mortality worldwide, is a threat that demands every tool available to new parents. Using machine learning trained to recognize the cries of newborns with this condition, the company Ubenwa has developed a mobile app that lets a recording of a baby’s cry be uploaded and diagnosed.
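As a toy illustration of the sensor-fusion idea behind the wildfire sensor, the sketch below combines weather readings and an image-derived biomass score into a single risk estimate. The feature ranges, weights, and threshold here are invented for illustration and are not from Google’s project, which pairs such readings with TensorFlow-based photographic analysis of biomass.

```python
# Toy sketch: combine weather readings into a wildfire-risk score.
# All normalization ranges and weights are invented for illustration only.

def wildfire_risk(wind_speed_kmh, humidity_pct, temperature_c, biomass_score):
    """Return a 0..1 risk estimate from normalized sensor inputs.

    biomass_score is assumed to come from an image classifier
    (e.g., a TensorFlow model scoring accumulated dry biomass 0..1).
    """
    # Normalize each reading to roughly 0..1 (ranges are illustrative).
    wind = min(wind_speed_kmh / 60.0, 1.0)           # high wind spreads fire
    dryness = 1.0 - min(humidity_pct / 100.0, 1.0)   # low humidity -> dry fuel
    heat = min(max(temperature_c - 10, 0) / 35.0, 1.0)
    # Weighted combination; the weights are made up for this sketch.
    return 0.25 * wind + 0.30 * dryness + 0.25 * heat + 0.20 * biomass_score

# Usage: a hot, dry, windy day with plenty of fallen biomass scores high.
risk = wildfire_risk(wind_speed_kmh=45, humidity_pct=15,
                     temperature_c=38, biomass_score=0.9)
print(f"risk = {risk:.2f}")  # closer to 1.0 means more susceptible
```

In the real device, the biomass score would come from photographic analysis rather than a hand-set number, but the principle of merging weather telemetry with a learned visual signal is the same.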

“With great power comes great responsibility” is a motto that applies to modern tech just as much as to superheroes. The fast-paced field of artificial intelligence produces frequent developments that challenge our security as a society and therefore demand caution, and nowhere does the motto apply more clearly today than to the massive companies driving the innovations the public uses at grand scale.

Google felt the weight of that responsibility sharply when its role in helping the US Department of Defense analyze drone footage (Project Maven) was revealed. The military work appeared to violate the “Don’t be evil” clause of the company’s Code of Conduct at the time, and renewal of the contract has since been canceled. Google’s work on a Chinese search engine with censorship built in to meet the government’s requirements has also drawn protest from both inside and outside the company. Against this background, a new project focused on doing “good” things for society might be seen as possible damage control. The timing may be suspect, but it’s worth noting that, as the projects described above show, Google has been working to help with societal needs for quite some time already.


Overall, headlines in recent years have demonstrated just how flexible AI can be in solving the challenges facing our world. While fears about future “intelligent” computers may have some foundation in reality, it may do us good to focus on the hope such technology can also bring. Whatever Google’s motivation for launching its “AI for social good” project, if good is achieved, it may just be a win for us all.

Accidental computer geek, fascinated by most history and the multiplanetary future on its way. Quite keen on the democratization of space. | It's pronounced day-sha, but I answer to almost any variation thereof.


Tesla pulls back the curtain on Cybercab mass production

Tesla’s Cybercab drives itself off the Gigafactory Texas line in a striking new production video.

Tesla Cybercab production units rolling off the factory line in Gigafactory Texas (Credit: Tesla)

Tesla has provided a first look from inside a production Cybercab as it drove itself off the assembly line at Gigafactory Texas. The video footage, posted on X, opens on the factory floor with robotic arms and assembly equipment visible through the Cybercab windshield, then follows the car through a branded tunnel marked “Cybercab” before it autonomously navigates to a holding lot.

The first Cybercab rolled off the Giga Texas production line on February 17, 2026, with Musk writing on X, “Congratulations to the Tesla team on making the first production Cybercab.” April marked the official shift to volume production. The Giga Texas line is being prepared to produce hundreds of units per week, with 60 units already spotted on the Gigafactory campus earlier this month.


The Cybercab was first revealed publicly at Tesla’s “We, Robot” event in October 2024 at Warner Bros. Studios in Burbank, California, where 20 pre-production units gave attendees rides around the studio lot. Musk said he believed the average operating cost would be around $0.20 per mile, and that buyers would be able to purchase one for under $30,000. The two-seat design is deliberate. Musk noted that 90 percent of miles driven involve one or two people, making a compact two-passenger vehicle the most efficient configuration for a fleet-scale robotaxi. Eliminating rear seats also removes complexity and cost, supporting that sub-$30,000 target.

Tesla’s annual production goal is 2 million Cybercabs per year once several factories reach full design capacity. The Cybercab has no steering wheel, no pedals, and relies entirely on Tesla’s vision-based FSD system. What the video shows is the first evidence of that system working not as a demo, but as a production reality, driving itself off the line and into the world.


Elon Musk talks Tesla Roadster’s future

Elon Musk confirmed the Roadster as Tesla’s last manually driven car, with a debut coming soon.

Tesla Roadster driving along sunset cliff (Credit: Grok)

During Tesla’s Q1 2026 earnings call on April 22, Elon Musk made a brief but notable comment about the long-awaited next generation Roadster while describing Tesla’s future vehicle lineup. “Long term, the only manually driven car will be the new Tesla Roadster,” he said. “Speaking of which, we may be able to debut that in a month or so. It requires a lot of testing and validation before we can actually have a demo and not have something go wrong with the demo.”

That single statement is the entire Roadster update from yesterday’s call, and while it represents another timeline shift, it comes as no surprise with Tesla heads-down on the mass rollout of its Robotaxi service across US cities and the industrial-scale production of the humanoid Optimus.

The fact that Musk specifically framed the Roadster as the last manually driven Tesla is significant on its own. As the rest of the lineup moves toward full autonomy, the Roadster becomes something rare in the Tesla-sphere by keeping the driver in control. Driving enthusiasts who buy a $200,000 supercar are not doing so to be passengers. They want the physical connection to the road, the feel of acceleration under their own input, and the experience of controlling something with that level of performance. FSD, however capable it becomes, removes that entirely. The Roadster signals that Tesla understands this distinction and is building a car specifically for the people who consider driving itself the point.

The specs for the Roadster Musk has teased over the years are genuinely unlike anything in production. The base model targets 0 to 60 mph in 1.9 seconds, a top speed above 250 mph, and up to 620 miles of range from a 200 kWh battery. The optional SpaceX package takes it further, rumored to add roughly ten cold gas thrusters operating at 10,000 psi, borrowed directly from Falcon 9 rocket technology. With thrusters, Musk has claimed 0 to 60 mph in as little as 1.1 seconds. In a 2021 Joe Rogan interview he went further, stating “I want it to hover. We got to figure out how to make it hover without killing people.” Tesla filed a patent for ground effect technology in August 2025, suggesting the hover concept has not been abandoned. The starting price remains $200,000, with the Founders Series requiring a $250,000 full deposit. Some reservation holders placed those deposits in 2017 and are approaching a full decade of waiting.

With production now targeted for 2027 or 2028 at the earliest, the Roadster remains Tesla’s most audacious promise and its longest-running delay. But if what Musk is testing lives up to even half of what he has described, the demo alone should be worth waiting for.

Tesla confirmed HW3 can’t do Unsupervised FSD but there’s more to the story

Tesla confirmed HW3 vehicles cannot run unsupervised FSD, replacing its free upgrade promise with a discounted trade-in.


Tesla has officially confirmed that early vehicles with its Autopilot Hardware 3 (HW3) will not be capable of unsupervised Full Self-Driving, while extending a path forward for legacy owners through a discounted trade-in program. The announcement came by way of Elon Musk in today’s Tesla Q1 2026 earnings call.

The history here matters. HW3 launched in April 2019, and Tesla sold Full Self-Driving packages to owners on the understanding that the hardware was sufficient for full autonomy. Some owners paid between $8,000 and $15,000 for FSD during that period. For years, as FSD’s AI models grew more demanding, HW3 vehicles fell progressively further behind, eventually landing on FSD v12.6 in January 2025 while AI4 vehicles moved to v13 and then v14. Musk acknowledged in January 2025 that HW3 simply could not reach unsupervised operation, and he alluded to a difficult hardware retrofit.


The near-term offering is more concrete. Tesla’s head of Autopilot Ashok Elluswamy confirmed on today’s call that a V14-lite will be coming to HW3 vehicles in late June, bringing all the V14 features currently running on AI4 hardware. That is a meaningful software update for owners who have been frozen at v12.6 for over a year, and it represents genuine effort to keep older hardware relevant. Unsupervised FSD for vehicles is now targeted for Q4 2026 at the earliest, with Musk describing it as a gradual, geography-limited rollout.

For HW3 owners, the over-the-air V14-lite update is welcome, and the discounted trade-in path at least acknowledges an old obligation. What happens next with the trade-in pricing will define how this chapter ultimately gets written. If Tesla prices the hardware path fairly, acknowledges what early adopters are owed, and delivers V14-lite on the June timeline it committed to today, it has a real opportunity to convert one of the longest-running sore subjects among early adopters into a loyalty story.
