Meanwhile, about 5 people a day die on UK roads, mainly due to human error. Globally, about 3700.
True, but then how many autonomous cars are on the road, including those being tested? Most fatal accidents involving human-driven vehicles are down to poor judgement and/or lazy, inattentive behaviour.
Nearly 100% of fatal accidents involving autonomous vehicles are caused by inherent flaws in the tech, whether hardware and/or software.
That means that if everyone behaved responsibly, very few fatal accidents would occur that were due to the driver or vehicle of non-autonomous vehicles - mainly the sudden onset of illness (heart attacks, strokes, etc.) or catastrophic failures of parts such as tyres.
With autonomous vehicles, it would likely be much higher, albeit at a lower rate than currently for human-driven vehicles.
In my view, unless and until the hardware and software issues are ironed out to the extent that situations in which a human would or should easily avoid an accident are also avoided by the autonomous vehicle, such systems should not be legally allowed beyond small-scale testing.
And such systems still, seemingly, cannot operate safely or reliably under many circumstances and conditions.
Edited by Engineer Andy on 18/08/2022 at 12:25
|
With autonomous vehicles, it would likely be much higher, albeit at a lower rate than currently for human-driven vehicles.
If they are better than human drivers, why would you not allow them?
|
With autonomous vehicles, it would likely be much higher, albeit at a lower rate than currently for human-driven vehicles.
If they are better than human drivers, why would you not allow them?
They may be 'better', but as yet only because humans choose not to be. If such autonomous systems have literally fatal flaws, it means that every time they come across certain situations they make serious errors, possibly resulting in a serious accident.
At least humans, with training and human abilities, have the capacity not to make the same mistake again. That many do not just means they shouldn't be on the road.
With a computerised system, there would be no choice in the matter, and (currently) the risk of a serious accident would be higher (minor accidents perhaps not), except for reckless drivers (who choose to drive that way) or those whose skills have seriously diminished with age or health and who, again, shouldn't be driving.
|
With a computerised system, there would be no choice in the matter
That may change soon, but the video is 11 months old, so I'm not certain how far they have gone since then.
Tesla's AI chip REVEALED! (Project Dojo) - YouTube
If you've already seen it, just ignore it, but it is interesting IMO.
|
With autonomous vehicles, it would likely be much higher, albeit at a lower rate than currently for human-driven vehicles.
If they are better than human drivers, why would you not allow them?
They may be 'better', but as yet only because humans choose not to be. If such autonomous systems have literally fatal flaws, it means that every time they come across certain situations they make serious errors, possibly resulting in a serious accident.
At least humans, with training and human abilities, have the capacity not to make the same mistake again. That many do not just means they shouldn't be on the road.
People aren't going to get any better at driving though - so why not let the cars do the driving? Overall you'll have fewer accidents, and cars will get better at driving over the years as they develop further. At the moment there are no fully autonomous vehicles that can go anywhere they like.
|
<< People aren't going to get any better at driving though - so why not let the cars do the driving - and overall you'll have fewer accidents...and cars will get better at driving over the years as they develop more. >>
The designers of autonomous cars have a lot of catching up to do. There is over a century of experience with human drivers, but it will take quite a lot of beta-testing to discover all the scenarios that designers of AI have not thought of. One day the autonomous car may become 'perfect' but I won't be able to wait that long.
|
<< People aren't going to get any better at driving though - so why not let the cars do the driving - and overall you'll have fewer accidents...and cars will get better at driving over the years as they develop more. >>
The designers of autonomous cars have a lot of catching up to do. There is over a century of experience with human drivers, but it will take quite a lot of beta-testing to discover all the scenarios that designers of AI have not thought of. One day the autonomous car may become 'perfect' but I won't be able to wait that long.
I personally wouldn't like to be anywhere near (never mind in as a passenger) one of these 'beta test' autonomous vehicles. The problem is that I can't see them being either easily identified or easily avoided.
They should only be let onto the road in decent numbers via normal sales (as opposed to test-only vehicles in limited numbers with specially trained operatives) if and when all the issues are ironed out. The public don't deserve to be used as proverbial guinea pigs, whether buying one or just being on the road with them.
I'd put good money on many 'citizen users' being reckless whilst 'operating' (or not) such vehicles by engaging in behaviour such as watching TV or playing games on a phone, eating/drinking, doing their hair/make-up etc - things certainly not conducive to being able to 'take control' instantly if a situation arises where the car chucks its proverbial toys out of the pram and either doesn't know what to do or makes a mistake.
I suspect many of these people will think that such cars are designed so they can specifically DO such acts. The Knight 2000 they are not.
|
3700 globally? Try 1.3 million www.who.int/news-room/fact-sheets/detail/road-traf...s
These intermediate levels of autonomy - Level 2, Level 3 and, to a degree, Level 4 - still require human interaction, monitoring and decision-making. I don't see any advantage in that for drivers. The idea of sitting there, ready to step in immediately if the software doesn't make a natural or predictable decision, is a terrible user experience.
|
Future historians will possibly look back with astonishment at our permitting self-driving cars on single-carriageway roads where vehicles often pass close to each other at combined closing speeds of over 100mph.
On the other hand, they might look back with astonishment that we allowed cars driven by mere humans of variable states of competence on single carriageway roads where vehicles often....etc.
|
3700 globally? Try 1.3 million www.who.int/news-room/fact-sheets/detail/road-traf...s
Sorry, I should have stated 3700 daily, and yes, you are correct - that equates to roughly 1.3 million annually.
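As a quick sanity check on those figures (taking the ~3700 deaths per day quoted above and the WHO's ~1.3 million per year as the assumed inputs), the daily and annual numbers are roughly consistent:

```python
# Rough consistency check: does ~3700 road deaths per day
# line up with the WHO's ~1.3 million per year?
daily_deaths = 3700
annual_estimate = daily_deaths * 365  # 1,350,500
who_annual = 1_300_000

# The simple extrapolation lands within about 4% of the WHO figure,
# so "3700 a day" and "1.3 million a year" describe the same toll.
relative_gap = abs(annual_estimate - who_annual) / who_annual
print(annual_estimate, round(relative_gap, 3))
```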
|
<< Sorry, should have stated 3700 daily and yes you are correct that equates to 1.3 million annually. >>
You have no need to apologise, as you said 'a day' earlier in the same sentence!
World population continues to increase despite occasional attempts to offset it :-))
|
I was knocked off my bike (which is now broken) yesterday by a yoof on a scooter, cutting a crossroads corner fairly fast, but unusually long sightlines meant I had time to shout "WTF are you DOING?" (in English, sadly) before impact.
He appeared to be homing in on my body heat.
Perhaps some kind of hybrid system?
|
cutting a crossroads corner fairly fast
Lots of vehicles of all kinds do that now without looking, and do cause crashes, but as for scooters - the electric ones are becoming a pain: often two standing on board, and those really motor along.
I gather BMW are, from 2024, allowed to run fully autonomous cars in Germany, and possibly elsewhere, though I'm not sure about that at the moment.
|
In 120 years of human-driven vehicles, despite training, improved road layouts, speed limits and improved vehicle technology (brakes etc.), we have still not managed to eliminate human error as the principal cause of incidents (accidents they are not!).
Technology has made major strides in delivering usually cheaper, better and safer solutions. It does this by designing failure out of systems and products.
I have far more confidence that technology will reduce road incidents through improved software, design and engineering than that human inadequacies will somehow be overcome. One definition of stupidity is continually trying that which has already been proven to fail!
Edited by Terry W on 19/08/2022 at 18:53
|
In 120 years of human-driven vehicles, despite training, improved road layouts, speed limits and improved vehicle technology (brakes etc.), we have still not managed to eliminate human error as the principal cause of incidents (accidents they are not!).
Bit longer than that.
www.youtube.com/watch?v=frE9rXnaHpE
www.youtube.com/watch?v=esvihQymZWI
Edited by edlithgow on 20/08/2022 at 00:46
|
In 120 years of human-driven vehicles, despite training, improved road layouts, speed limits and improved vehicle technology (brakes etc.), we have still not managed to eliminate human error as the principal cause of incidents (accidents they are not!).
Technology has made major strides in delivering usually cheaper, better and safer solutions. It does this by designing failure out of systems and products.
I have far more confidence that technology will reduce road incidents through improved software, design and engineering than that human inadequacies will somehow be overcome. One definition of stupidity is continually trying that which has already been proven to fail!
The problem is that the manufacturers of such systems for cars have admitted on many occasions in the recent past that their systems still cannot cope with all possible situations on the roads. What then? At least a human being has the capacity to deal with such incidents (provided they were taught properly and their test pass wasn't fabricated).
That we are seemingly going to be used as proverbial guinea pigs for such systems, putting lives at risk, with the chances of an incident not resulting in death no better than flipping a coin each time, doesn't fill me with confidence. History is replete with so-called experts' over-confidence in new technology being accepted by naive and/or corrupt politicians who then regret what they allowed.
On the other side of the coin, many of the same advocates of driverless cars think that driverless trains - which HAVE been proven over decades to be perfectly safe, and which have a significantly more limited scope of operational circumstances to deal with - should not be implemented UK-wide, because a few militant railway workers who pretend it is unsafe want to keep their big salaries and grip on power.
Oh, the irony...
Edited by Engineer Andy on 21/08/2022 at 09:09
|
The problem is that the manufacturers of such systems for cars have admitted on many occasions in the recent past that their systems still cannot cope with all possible situations on the roads. What then?
You do as Tesla are doing: they've built a supercomputer to train AI to drive in all scenarios, which is why the cars send information back to the main centre about every scenario the car has been through during any drive. Tesla reckon on full, foolproof autonomy by 2030 - possibly sooner, depending on the states they can drive in.
|
The problem is that the manufacturers of such systems for cars have admitted on many occasions in the recent past that their systems still cannot cope with all possible situations on the roads. What then?
You do as Tesla are doing: they've built a supercomputer to train AI to drive in all scenarios, which is why the cars send information back to the main centre about every scenario the car has been through during any drive. Tesla reckon on full, foolproof autonomy by 2030 - possibly sooner, depending on the states they can drive in.
The ultimate test for an autonomous car computer system would be to somehow submit it to the hazard perception test as used in the UK driving theory test.
That's only fair: you cannot take the UK practical driving test without first passing the theory test, which includes the hazard perception test, so why should self-driving cars get a free pass just because they are supposed to be better than a human driver?
From what I have seen of the onboard footage of self-driving cars, their systems are set up to react to on-road hazards instead of scanning and planning to anticipate and act on a situation before it becomes a real hazard, which is the whole point of hazard perception training.
I appreciate that very few, if any, commenters on here have had to take and pass this test; I had to, as part of ongoing instructor testing. Here are some test examples - hazardperceptiontest.net/ - and there are others if you search for them.
|
One definition of stupidity is continually trying that which has already been proven to fail!
But - if at first you don't succeed... The other dictum is that (at least for many people) you only learn from your own mistakes.
|
cutting a crossroads corner fairly fast
Lots of any vehicles do that now without looking, and do cause crashes, but as for scooters -the electric ones are becoming a pain- often two standing on board and those really motor along
This was a 125 or 150 cc petrol automatic, the standard type here.
The commonest electric scooters (Gogoro battery-swap models) are about the same size, weight and performance, and don't seem to be much of a specific issue, though they are quieter.
One hardly sees the footboard types on the road at all, possibly due to natural selection.
|