Do you trust Google Maps blindly? Maybe you get caught in a tragic freefall like this

  • Google Maps has been accused of showing incorrect road information in several cases, resulting in major accidents and even deaths.
In a tragic incident in Uttar Pradesh, three people died when Google Maps wrongly took their car to an under-construction bridge, causing the vehicle to fall into a river. The incident took place in Bareilly district of Uttar Pradesh on Saturday when the victims were on their way to attend a wedding. They were relying on Google Maps to reach the venue, when the GPS took them to an incomplete flyover.

This incident has once again raised questions on the credibility of Google Maps. It is one of the most popular and widely used navigation systems by motorists, cyclists and pedestrians. However, in many cases, navigation systems have been accused of showing incorrect data, resulting in major accidents like the latest one in UP.

Earlier this year, a Ford Endeavour SUV in Telangana fell into a canal after Google Maps misled the driver. In another incident, a man in the US lost his life because Google Maps gave him wrong route information, leading him onto a broken bridge. Similar incidents have happened all over the world, raising questions about the reliability of this navigation system.

Why can't Google Maps be blindly trusted?

Google Maps, Apple Maps and Waze, among other navigation applications, have become indispensable tools for the modern driver. They make it convenient to reach a specific location, guiding drivers to their destination with turn-by-turn instructions.

However, despite the convenience they bring, the Achilles' heel of these navigation applications is their map data. Although developers try to make maps as accurate as possible, the applications do not always show correct information. Disruptions such as incomplete roads and construction work may not be reflected in the data. And while some applications like Waze crowdsource reports in an effort to capture changes quickly, keeping map data fully up to date in real time is still not possible.

In such cases, it is best to check with locals rather than blindly trust navigation applications, as first-hand local information is often more reliable than map data, especially in remote areas.

First publication date: 25 November 2024, 10:37 am IST

Tesla braces for its first trial involving Autopilot fatality

Self-driving capability is central to Tesla’s financial future, according to Musk, whose own reputation as an engineering leader is being challenged by plaintiffs in one of two lawsuits who allege that he personally leads the group behind the technology that failed. Wins by Tesla could bolster confidence in, and sales of, the software, which costs up to $15,000 per vehicle.

Tesla faces two trials in quick succession, with more to follow. The first, scheduled for mid-September in a California state court, is a civil lawsuit containing allegations that the Autopilot system caused owner Micah Lee’s Model 3 to suddenly veer off a highway east of Los Angeles at 65 miles per hour, strike a palm tree and burst into flames, all in the span of seconds.

The 2019 crash, which has not been previously reported, killed Lee and seriously injured his two passengers, including a then-8-year-old boy who was disemboweled. The lawsuit, filed against Tesla by the passengers and Lee’s estate, accuses Tesla of knowing that Autopilot and other safety systems were defective when it sold the car.

Musk ‘de facto leader’ of Autopilot team

The second trial, set for early October in a Florida state court, arose out of a 2019 crash north of Miami where owner Stephen Banner’s Model 3 drove under the trailer of an 18-wheeler big rig truck that had pulled into the road, shearing off the Tesla’s roof and killing Banner. Autopilot failed to brake, steer or do anything to avoid the collision, according to the lawsuit filed by Banner’s wife.

Tesla denied liability for both accidents, blamed driver error and said Autopilot is safe when monitored by humans. Tesla said in court documents that drivers must pay attention to the road and keep their hands on the steering wheel. “There are no self-driving cars on the road today,” the company said.

The civil proceedings will likely reveal new evidence about what Musk and other company officials knew about Autopilot’s capabilities – and any possible deficiencies. Banner’s attorneys, for instance, argue in a pretrial court filing that internal emails show Musk is the Autopilot team’s “de facto leader”.

Tesla and Musk did not respond to Reuters’ emailed questions for this article, but Musk has made no secret of his involvement in self-driving software engineering, often tweeting about his test-driving of a Tesla equipped with “Full Self-Driving” software. He has for years promised that Tesla would achieve self-driving capability only to miss his own targets.

Tesla won a bellwether trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the “Autopilot” and “Full Self-Driving” names. The case was about an accident where a Model S swerved into the curb and injured its driver, and jurors told Reuters after the verdict that they believed Tesla warned drivers about its system and driver distraction was to blame.

Stakes higher for Tesla

The stakes for Tesla are much higher in the September and October trials, the first of a series related to Autopilot this year and next, because people died.

“If Tesla racks up a lot of wins in these cases, I think they’re going to get more favorable settlements in other cases,” said Matthew Wansley, a former general counsel of automated driving startup nuTonomy and now an associate professor of law at Cardozo School of Law.

On the other hand, “a big loss for Tesla – especially with a big damages award” could “dramatically shape the narrative going forward,” said Bryant Walker Smith, a law professor at the University of South Carolina.

In court filings, the company has argued that Lee consumed alcohol before getting behind the wheel and that it is not clear whether Autopilot was engaged at the time of the crash.

Jonathan Michaels, an attorney for the plaintiffs, declined to comment on Tesla’s specific arguments, but said “we’re fully aware of Tesla’s false claims including their shameful attempts to blame the victims for their known defective autopilot system.”

In the Florida case, Banner’s attorneys also filed a motion arguing punitive damages were warranted. The attorneys have deposed several Tesla executives and received internal documents from the company that they said show Musk and engineers were aware of, and did not fix, shortcomings.

In one deposition, former executive Christopher Moore testified there are limitations to Autopilot, saying it “is not designed to detect every possible hazard or every possible obstacle or vehicle that could be on the road,” according to a transcript reviewed by Reuters.

In 2016, a few months after a fatal accident where a Tesla crashed into a semi-trailer truck, Musk told reporters that the automaker was updating Autopilot with improved radar sensors that likely would have prevented the fatality.

But Adam (Nicklas) Gustafsson, a Tesla Autopilot systems engineer who investigated both accidents in Florida, said that in the almost three years between that 2016 crash and Banner’s accident, no changes were made to Autopilot’s systems to account for cross-traffic, according to court documents submitted by plaintiff lawyers.

The lawyers tried to blame the lack of change on Musk. “Elon Musk has acknowledged problems with the Tesla autopilot system not working properly,” according to plaintiffs’ documents. Former Autopilot engineer Richard Baverstock, who was also deposed, stated that “almost everything” he did at Tesla was done at the request of “Elon,” according to the documents.

Tesla filed an emergency motion in court late on Wednesday seeking to keep deposition transcripts of its employees and other documents secret. Banner’s attorney, Lake “Trey” Lytal III, said he would oppose the motion.

“The great thing about our judicial system is Billion Dollar Corporations can only keep secrets for so long,” he wrote in a text message.

First Published Date: 28 Aug 2023, 20:00 IST
