Vladimir Ivanovich Kirillov
Grand Master Username: soviet
Post Number: 541 Registered: 2-2013
| Posted on Friday, 01 July, 2016 - 14:57: | |
A Tesla self-driving car has just rewarded its passenger by killing him after crashing into a truck in Florida. No doubt lawyers with large fins will do exceedingly well out of this one. |
richard george yeaman
Grand Master Username: richyrich
Post Number: 541 Registered: 4-2012
| Posted on Friday, 01 July, 2016 - 18:09: | |
I have to agree with you on this one, Vladimir. I think that technology at this level is a step too far. I personally would not feel safe in such a vehicle, and my sympathy is with the family of the deceased. Richard. |
David Gore
Moderator Username: david_gore
Post Number: 2096 Registered: 4-2003
| Posted on Friday, 01 July, 2016 - 22:42: | |
The news item I saw regarding this accident said the cause was the minimal colour difference between the road environment and the white semi-trailer that turned across the lane the Tesla was travelling in; the car went under the trailer. The subframe of the trailer was at the windscreen height of the Tesla, and I presume the passenger was decapitated by the trailer subframe.

I saw the aftermath of a similar crash between a Range Rover and a coal truck in fog near Wollongong 30 years ago where the driver and passenger were decapitated, and it was not a sight you would want to see - another coal truck and I were the first arrivals after the accident.

It appears to me the Tesla system relies on colour differentials, and this was the prime cause of the failure to pick up the turning trailer. It is probable the guidance system needs a second hazard-recognition system, such as radar, to identify hazards that a colour-recognition system fails to identify correctly. |
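David's suggestion of a second, independent hazard-recognition system can be sketched as a simple "either sensor triggers braking" check. This is a toy illustration in Python with invented names and thresholds, not Tesla's actual logic:

```python
# Toy sketch of redundant hazard detection: brake if EITHER sensor
# reports an obstacle, so one sensor's blind spot (e.g. a white
# trailer against a bright sky fooling the camera) is covered by
# the other. All thresholds are invented for illustration.

def camera_sees_obstacle(contrast: float, threshold: float = 0.2) -> bool:
    """A vision system fails when the obstacle's contrast against
    the background falls below its detection threshold."""
    return contrast >= threshold

def radar_sees_obstacle(distance_m: float, max_range_m: float = 150.0) -> bool:
    """Radar ranges off a metal trailer regardless of its colour."""
    return distance_m <= max_range_m

def should_brake(contrast: float, distance_m: float) -> bool:
    # Redundancy: any one positive detection is enough to brake.
    return camera_sees_obstacle(contrast) or radar_sees_obstacle(distance_m)

# White trailer against a white sky: near-zero contrast blinds the
# camera, but radar still returns an echo at 60 m.
print(should_brake(contrast=0.05, distance_m=60.0))  # True - radar catches it
print(camera_sees_obstacle(0.05))                    # False - camera alone misses
```

The point of the sketch is only that a colour/contrast failure and a radar failure are unlikely to coincide, which is the redundancy argument made above.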
richard george yeaman
Grand Master Username: richyrich
Post Number: 542 Registered: 4-2012
| Posted on Saturday, 02 July, 2016 - 00:08: | |
Hi David, as you know my wife is blind. Recently she has been given training to use colour ID on her phone. The results achieved depend on the amount of light available; when the flash function of the camera is used like a torch, the results are much more accurate. And if she gets the colour wrong, nobody gets killed. Richard. |
David Gore
Moderator Username: david_gore
Post Number: 2097 Registered: 4-2003
| Posted on Saturday, 02 July, 2016 - 09:14: | |
Richard, It is wonderful that modern technology is being used for worthwhile purposes such as the modified phone provided to your wife. While we "oldies" make fun of the younger generation's obsession with "selfies" and texting, the same technology is now helping those who can benefit from its additional capabilities. I can only imagine the change for the better your wife is experiencing as she explores the capabilities of the technology provided to her. We often do not fully appreciate the benefits we receive in life until we lose them. |
Robert Noel Reddington
Grand Master Username: bob_uk
Post Number: 1008 Registered: 5-2015
| Posted on Saturday, 02 July, 2016 - 23:29: | |
Self-driving cars are a step too far. Yes, we can do this, but why? I LIKE the physical connection of touching the controls, steering, turning switches etc. The Shadow may not feel dynamic with lots of feedback, but piloting the machine still gives a smile. Legally, who exactly was driving? Also, you have to hover one's hands over the steering wheel when on autopilot - might as well steer the car. Arm ache.

The fans of autopilot somehow think the computer can do it better and any human work moving muscles is bad. On trucks, sometimes one has to get over the steering and put some effort into it, and this can be quite enjoyable. One journey I used to make was from Bournemouth football to the cop shop with the 10-cell van - naughty fans getting over-excited. To get down this lane: 30 to 40 mph running the bus deep into the corners, then 20 mph and over the steering with lots of lock to keep the rear of the bus well left, clear of oncoming traffic. Get it right and it feels nice; make a hash of it and you come almost to a stop, or end up clipping a car, or worse.

I don't use Facebook, I use email. My cousin got a nasty troll saying he had cheated on his wife when she was dying a few years back. The culprit eventually said it was just a wind-up. One could thump him, but he still does it; he seems to enjoy trouble-making and being awkward. He also kept saying that the local police authority were stealing money from fines. Absolutely impossible - the accounts are audited very often and treble-checked to the penny by lots of people, including central government.

It's fantastic the stuff they come up with to assist the blind. Love and kind regards to your wife, and I hope this device works better than expected. |
Bob Reynolds
Grand Master Username: bobreynolds
Post Number: 410 Registered: 8-2012
| Posted on Sunday, 03 July, 2016 - 03:46: | |
"Self-driving cars are a step too far. Yes, we can do this, but why?" Because not everybody can drive, or they might be disabled, etc!

"The fans of autopilot somehow think the computer can do it better and any human work moving muscles is bad." I am sure that the technology can already do it a lot better and more safely than some of the young boy-racer types who have just passed their test and are trying to show off, or someone who is drunk, etc. I think ultimately (not yet) the roads will be safer with self-driving cars, considering some of the idiots who are speeding around causing horrendous accidents: overtaking on blind corners, pulling out of junctions without looking properly, driving along motorways in the wrong direction, etc.; although transport will probably be a lot slower.

As I understand it, the Tesla wasn't a self-driving car (they're not legal yet, are they?). It just had a lane-assist feature which helped the driver to change lanes. I am sure they have plenty of get-out clauses in the instructions: "Use this facility at your own risk", etc.

I am very surprised that the system depended purely on visual feedback (if that is indeed the case) and didn't include some sort of radar device. Even simple parking sensors use ultrasonic ranging. It's not difficult or expensive technology. |
Robert Noel Reddington
Grand Master Username: bob_uk
Post Number: 1011 Registered: 5-2015
| Posted on Sunday, 03 July, 2016 - 07:45: | |
Unfortunately, stupid driving is common. Fortunately, the odds are that they will get away with it physically; one day, though, they come unstuck because of those same odds. The idiots can drive but can't be bothered to do it properly. My mate's son is a good driver, but given the chance he gets too lazy to turn on his indicators, doesn't slow down for corners, etc., and before you know it his driving is a shambles. We tell him and he does it properly. It's a bit like being tidy: fine when prompted, but left alone and the sink is full of washing up. A lot of bad driving is like a bad habit. Self-driving would be good for disabled drivers etc. |
Patrick Ryan
Prolific User Username: patrick_r
Post Number: 285 Registered: 4-2016
| Posted on Sunday, 03 July, 2016 - 09:06: | |
Interesting topic, gents. I work for the Volvo Group, and high tech is indeed very much in all of our trucks: GPS-guided transmissions that change gear better than any human, knowing when to change gear based on GPS data, engine torque and truck load. Coast or roll functions are also there to assist economy. Electric steering, and even steering that will correct the truck by itself if one runs over the fog line or service-lane line. We have radars and cameras all over the trucks galore, dynamic braking, stability control, even autonomous braking, but they are just an aid; it is still up to the driver to assess, based on the input from all of the tech all over the truck.

It then comes down to people like me to instruct drivers in how to operate the truck utilising all of these inputs now available to them. Sometimes this can take a whole day per driver. It's amazing, on return visits to a fleet, how many drivers have turned off these aids; sometimes the fleet gets us to program out the off switch. Fleets are very much in favour of these driver aids, but a lot of drivers are not.

We even have new systems that allow any fleet manager to "log into" the truck and view a virtual dashboard that lets them see everything, including live fuel flow/usage readings. The systems will even send a pop-up to the manager's computer screen or a text to their phone if the driver does something out of the ordinary.

But still, these are just aids. I still believe someone needs to be "responsible" for what they are in control of, but by all means be better informed as a driver (especially a driver of a heavy vehicle) by utilising every aid or high-tech input at their disposal. |
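The GPS-guided gear changing Patrick describes amounts to choosing a gear from the upcoming road grade and the vehicle's load, rather than waiting for engine speed to sag. A toy sketch of the idea, with invented thresholds and nothing like Volvo's actual software:

```python
def pick_gear(grade_pct: float, load_t: float, current_gear: int) -> int:
    """Toy predictive shift: downshift BEFORE a climb, in proportion
    to the upcoming grade (known from GPS map data) and vehicle load,
    instead of reacting after engine speed drops. Thresholds invented."""
    drop = 0
    if grade_pct > 2.0:                     # climb ahead per GPS elevation data
        drop += 1
    if grade_pct > 5.0:                     # steep climb: shed another gear
        drop += 1
    if load_t > 30.0 and grade_pct > 2.0:   # heavy load needs an extra gear down
        drop += 1
    return max(1, current_gear - drop)      # never drop below first gear

# A 40-tonne truck in 12th gear approaching a 6% grade pre-selects 9th,
# while the same truck on the flat stays in 12th.
print(pick_gear(grade_pct=6.0, load_t=40.0, current_gear=12))  # 9
print(pick_gear(grade_pct=0.0, load_t=40.0, current_gear=12))  # 12
```

The design point is the "predictive" part: the GPS data lets the transmission act on road conditions it has not yet reached, which a purely reactive shift strategy cannot do.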
Brian Vogel
Grand Master Username: guyslp
Post Number: 1992 Registered: 6-2009
| Posted on Sunday, 03 July, 2016 - 09:11: | |
Mr. Reynolds, Hear, hear! There is plenty of evidence, in plenty of very complex command-and-control situations, that computers can and do outstrip humans in managing things safely. To me it's a matter of getting all the sensors in place and working correctly more than anything else. The basic coding already exists, and will only get more and more sophisticated.

From what's been published so far about the Tesla accident, it is entirely possible that this driver could have done the same thing had the autopilot not been on. He clearly wasn't monitoring anything himself. You are correct that this is not billed as fully autonomous driving, but it is much closer to that than lane-assist. This man posted a ton of YouTube videos, prior to this horrible accident, of himself using the system in many different settings. He also ignored his own advice, which was to always be ready, at a moment's notice, to take control back. Brian |
Christian S. Hansen
Prolific User Username: enquiring_mind
Post Number: 298 Registered: 4-2015
| Posted on Sunday, 03 July, 2016 - 11:08: | |
Hmmm... Truly autonomous control is one thing, but a semi-autonomous system that relies on the "human" maintaining constant vigilance so as to regain control if necessary is in the same bucket as "useless". It is one thing for the "computer" to request an override in order to update, or to do diagnostics, or whatever, but entirely another to bleep out in a computer-generated fake voice "Alert! Alert! Input error! Returning control to you!" while the poor human has about 1/10 of a second to regain situational awareness and avoid the impending disaster. If you can't "not pay attention", what is the point?

Anyone remember "HAL" from the movie "2001"? "HAL" is "IBM" with each letter backed off by one. With the emergence of AI (artificial intelligence), I can imagine a "conveyance" developing a malevolent streak that gets its kicks from playing "chicken" with its human passengers... dangling impending disaster, then avoiding it at the last second as the "occupant" sweats bullets wondering if it is time to override and regain control - or, as "HAL" did, preventing your override! I suppose it is inevitable, but with the attendant loss of personal freedom, since our movements will be tracked and controlled 24/7 by some computer in the sky. Big Brother is watching you! |
Robert Noel Reddington
Grand Master Username: bob_uk
Post Number: 1013 Registered: 5-2015
| Posted on Sunday, 03 July, 2016 - 21:45: | |
I agree, sort of, with Mr Hansen. If one has to monitor the self-driving car, then one might as well get on and do the whole job oneself. Is not an auto 'box as good as a computer? Almost - a mechanical computer? Naturally, "computer" and electronics are the first thought.

Marine and aviation give more time. A plane at 30,000 ft has 4.5 miles before it hits the ground; a car may have just yards before a collision. Boats can get problems, but 30 miles offshore gives plenty of time for the crew to sort out a computer glitch or go entirely manual. Press the big red button? Incidentally, a retired ship's engineer told me they used to do this as a check. Also, part of security was "loose lips cost lives" and all crew keeping beady eyes on everything. Then there are fat-finger mistakes. Rogue computers have been a good movie subject; let's hope it doesn't happen. |
Geoff Wootton
Grand Master Username: dounraey
Post Number: 1301 Registered: 5-2012
| Posted on Wednesday, 06 July, 2016 - 13:52: | |
Hi Folks. We're safely installed in Tulsa, Oklahoma. It's a surprisingly beautiful state: rolling green hills and country roads, not unlike the ones you find in the UK, but alas, without the country pubs. Great driving roads for the Rolls (when it gets here).

About three hundred miles out we hit a heavy storm, the type that feels like someone is training a fire hose on the windscreen. I slowed down to 30 mph and followed the barely visible white lines of the road, looking out for slow-moving or stationary vehicles ahead. It occurred to me that this is what a Tesla's computer "sees" all the time. There are YouTube videos of the cars getting confused when the white lines get too faint. I was surprised at this. I had assumed they were doing a full image analysis of the road ahead, multiple times per second, and had built in all the cues we as humans use subconsciously every time we drive our cars. Apparently not. I suspect if they released the algorithms they use to control their cars, we would all be a little shocked.

I think there is a great deal of hype surrounding these cars. I also think it is wholly irresponsible to release them as beta-test versions to the general public. In my view, fully autonomous control of a motor vehicle is 30-50 years away. When I worked in IT, we used to refer to unexpected negative behavior of a program as a "gotcha". We would rewrite the code and reboot the system; no one got hurt. It's a bit difficult to reboot someone who's just been decapitated by a semi-trailer. I think anyone allowing themselves to be used as a beta tester in one of these vehicles is bonkers. In 30 years' time, when the work has been done, the software engineers will look back and wonder what on earth we were thinking. Just my opinion. Geoff |
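Geoff's point about faint lane markings can be shown with a toy detector: if lane finding boils down to thresholding pixel brightness, the lane simply vanishes once the paint fades toward the road colour. This is illustrative Python only, with made-up brightness values; real lane-keeping pipelines are far more elaborate:

```python
# Toy version of the lane-marking problem: a detector that thresholds
# pixel brightness in a single image row loses the lane entirely once
# the paint fades toward the road colour. All values are invented.

ROAD, FRESH_PAINT, FADED_PAINT = 0.30, 0.90, 0.38
THRESHOLD = 0.55  # cut roughly midway between road and fresh paint

def find_lane_pixels(row):
    """Return the indices in a row of pixels bright enough to be paint."""
    return [i for i, px in enumerate(row) if px > THRESHOLD]

fresh = [ROAD, ROAD, FRESH_PAINT, ROAD, ROAD, FRESH_PAINT, ROAD]
faded = [ROAD, ROAD, FADED_PAINT, ROAD, ROAD, FADED_PAINT, ROAD]

print(find_lane_pixels(fresh))  # [2, 5] - both markings found
print(find_lane_pixels(faded))  # []     - the lane "disappears"
```

A human in the same storm uses many extra cues (kerb lines, the car ahead, road camber); a detector this naive has only the paint, which is why faint lines defeat it.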
Geoff Wootton
Grand Master Username: dounraey
Post Number: 1305 Registered: 5-2012
| Posted on Wednesday, 13 July, 2016 - 10:13: | |
Another Tesla crash over the weekend, in Montana, where the driver blames the autopilot. Luckily he was OK, as was the driver who rolled his Model X on the Pennsylvania Turnpike, again blaming the autopilot. Tesla are, of course, disputing that their autopilot feature is to blame in both crashes. Some of the Tesla crashes are clearly due to driver error, like the 18-year-old who launched her father's car in Pullach, Germany. She managed to fly 82 ft through the air before crash-landing the 5000 lb car in a field. Luckily, again, no one was seriously injured. The crash did prove how well built the cars are, to survive such an impact. It will be interesting to see how the stats develop as the sample size of miles covered increases. Geoff |
Brian Vogel
Grand Master Username: guyslp
Post Number: 2008 Registered: 6-2009
| Posted on Wednesday, 13 July, 2016 - 14:04: | |
One thing that Tesla is going to have going for it, unless it has actually created an inferior product in its software, is the amount of data it collects, both from each individual car and collectively. I have the feeling that we're going to see virtually every Tesla accident start being "Autopilot's fault" when, in reality, few of them will be when it's used as intended and drivers don't misbehave. It appears that, for all practical intents and purposes, these cars come equipped with their own "black box" as far as what can be teased out. Brian |
Geoff Wootton
Grand Master Username: dounraey
Post Number: 1306 Registered: 5-2012
| Posted on Wednesday, 13 July, 2016 - 15:30: | |
In practice, accidents will never be Tesla's fault, as drivers are instructed to always have their hands on the steering wheel and be ready to take control immediately. This removes all responsibility for accidents from Tesla's autopilot feature. The problem is human psychology. After a few thousand miles of impeccable behavior, drivers will get a false sense of security and naturally concentrate less as their trust in the autopilot feature grows. So the driver is just waiting for a black-swan event to occur, and then disaster.

Tesla stated, in the case of the tractor-trailer crash, that neither the autopilot nor the driver saw the trailer, as it was painted white against a white skyline. I think this is true, but mainly because I doubt the driver was even looking where he was going. Does anyone seriously think a driver who is concentrating on the road ahead would not be able to see a trailer cutting across the lanes, regardless of the color of the trailer or the background? I just don't believe it. If that man had not purchased a Tesla and allowed himself to be seduced by the autopilot feature, he would be alive today. I am in no doubt about that. It seems to me it is only a matter of time before Tesla sees some very large lawsuits coming its way. Geoff |
Brian Vogel
Grand Master Username: guyslp
Post Number: 2009 Registered: 6-2009
| Posted on Wednesday, 13 July, 2016 - 23:34: | |
Geoff, I agree entirely about the human psychology part. I've been following the whole alpha and beta development of this sort of function fairly closely. Google observed precisely what you describe when it was toying with semi-autonomous driving, ditched it as a result, and began to focus exclusively on fully autonomous technology. As to your final comment, I agree that's inevitable. That being said, it would have been inevitable for reasons other than this particular one, too. The United States has become hyperlitigious during my lifetime in virtually every way, and I don't see that changing. Not that some lawsuits are not entirely appropriate, but we've reached the point where people can and do sue over the ridiculous. Brian |
Robert Noel Reddington
Grand Master Username: bob_uk
Post Number: 1031 Registered: 5-2015
| Posted on Thursday, 14 July, 2016 - 05:41: | |
Judge Judy will sort out the ridiculous claims. Engineering is about making life easier. A bridge to save walking further. The wheel. I prefer to put in the physical effort and turn and press stuff. However going to sleep on the back seat while the car does the boring motorway stuff appeals to me. |
Vladimir Ivanovich Kirillov
Grand Master Username: soviet
Post Number: 558 Registered: 2-2013
| Posted on Thursday, 14 July, 2016 - 07:55: | |
That's right Bob Judge Judy will make these autopilot loons walk carrying bricks in barbed wire handbags!!! |
Lluís Gimeno-Fabra
Grand Master Username: lluís
Post Number: 413 Registered: 8-2007
| Posted on Monday, 18 July, 2016 - 06:55: | |
PS: I have driven a Tesla P85 with autopilot... You would have to be mad to let that thing drive alone. In reality, the function is made to assist the cruise control when changing lanes, or when others change lanes, and in that sense it's great - just as a GPS is great at navigating but is never intended to let you switch your brain off... |
Robert Noel Reddington
Grand Master Username: bob_uk
Post Number: 1093 Registered: 5-2015
| Posted on Thursday, 11 August, 2016 - 05:44: | |
Another accident, this time in China. A Tesla hit a parked car while on steering assist. The driver was told it was autopilot, which Tesla says is wrong. The system detected no hands on the wheel; the driver was on the phone. |
Patrick Ryan
Grand Master Username: patrick_r
Post Number: 414 Registered: 4-2016
| Posted on Thursday, 11 August, 2016 - 07:44: | |
I can see this technology being banned soon, thanks to idiots who don't know how to use the technology correctly. |
Robert Noel Reddington
Grand Master Username: bob_uk
Post Number: 1100 Registered: 5-2015
| Posted on Friday, 12 August, 2016 - 09:12: | |
Today I drove through complicated road works with chicanes etc. I swooped through the whole lot at 15 mph; it was very easy. I didn't need steering assist, and judging by the accident in China, steering assist would not have worked. Traction control is a must on some cars, because they will go out of control otherwise. Jezza made a comment to Ian Wright on Top Gear: Ian Wright smashed up a Ferrari by switching off the traction control, and Jezza said "no, you don't want to do that". Why it has a switch is a mystery to me. No technology is foolproof. |
StevenBrown
Frequent User Username: stevenbrown
Post Number: 76 Registered: 10-2008
| Posted on Saturday, 27 August, 2016 - 05:10: | |
Hate to state the obvious, but a self-driving car is only safe when the majority of the vehicles on the road are also self-driving. Even as a driving aid, this tech is almost pointless without some training to use it properly. It's the same as Ferrari and Porsche here: both offered track time and performance instruction with new purchases, to teach how to use the performance. The same is needed with this technology, or we'll have accidents. |
Geoff Wootton
Grand Master Username: dounraey
Post Number: 1400 Registered: 5-2012
| Posted on Friday, 09 September, 2016 - 02:47: | |
Another Tesla fatality, this time in Holland. The cause is not yet known, however this was a single vehicle crash where the car left the road and hit a tree, which may raise eyebrows. A spokesman for Tesla was quoted as saying "There are going to be educational moments". Geoff |
Bob Reynolds
Grand Master Username: bobreynolds
Post Number: 416 Registered: 8-2012
| Posted on Friday, 09 September, 2016 - 07:15: | |
But not for those killed. |
Brian Vogel
Grand Master Username: guyslp
Post Number: 2063 Registered: 6-2009
| Posted on Friday, 09 September, 2016 - 07:46: | |
Well, unless this wreck in the Netherlands is somehow directly connected to the Autopilot feature, it sounds like a fairly common sort of accident in my neck of the woods (no pun intended). According to a report by Fortune, Autopilot was not engaged at the time of the accident. The number of "car versus tree" fatalities that result from high-speed driving on "low-speed" roads and sailing off into a large tree is larger than one would hope. It also seems frequently to involve young men and muscle cars. Brian |
David Gore
Moderator Username: david_gore
Post Number: 2196 Registered: 4-2003
| Posted on Friday, 09 September, 2016 - 08:21: | |
"The number of "car versus tree" fatalities that result from high-speed driving on "low-speed" roads and sailing off into a large tree is larger than one would hope. It also seems frequently to involve young men and muscle cars." Add driver fatigue from not taking appropriate rest breaks, driver inexperience, city drivers on country roads travelling at speeds they are not used to, swerving to avoid animals [very frequent in Australia], using mobile/cell phones, and avoiding oncoming drivers on the wrong side of the road for whatever reason. |
Robert Noel Reddington
Grand Master Username: bob_uk
Post Number: 1118 Registered: 5-2015
| Posted on Monday, 12 September, 2016 - 09:14: | |
A recent study by Swansea University in Wales, UK, found that older drivers and women have fewer accidents. I am 65 on Christmas Day, and I don't have near misses, just minor inconveniences where I have to slow down and maybe stop. I never go faster to get out of trouble, because when one hits the tree one will be going faster. |