Whoopsy powerline II+ strength tested college student style (and car talk)

:thinking: I don’t understand. Self-driving cars are infinitely safer than human-driven cars.

You’ll have to get used to self-driving cars, because one day manually controlled cars will be illegal on public streets. And speed limits will be doubled; city traffic will move much faster.

Whenever it happens, we will see. Have you watched the movie I, Robot? Not that I am Will Smith :rofl::rofl: … but I share the same thoughts as his character :smiley:


Nope, never seen it. I saw Gemini Man a few weeks ago. It was only okay.

Good to see Anker was able to stand by their warranty.

This is science: you can’t improve the human, not unless you’re into eugenics or GMO humans.
So a human has slow reactions and gets distracted.

Machines will just get faster and faster, and they are given the dedicated job of driving. Fewer decisions to make means those decisions are made faster.

So humans won’t get any better at driving, but self-driving cars will. The debate is when.

Initially self-driving will be confined to controlled situations, then move onto the open road as an option. Then the self-driving accident rate will cross from above the human rate (now) to below it (future). Then the insurance companies will charge more for insurance on a human-driven car, which pushes human-driven numbers down and self-driving numbers up. That uptick in usage means more self-driven miles, so more situations where self-driving makes an error are found and fixed. You then get exponential safety improvement.

A network mesh can let cars talk with each other, so, say, if there is stationary traffic round a tight bend, the last car round the bend is telling your car to slow down, something a human, whose eyes only see in straight lines, cannot solve.
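That mesh idea can be sketched in a few lines. This is a toy illustration, not any real V2V protocol; the message format, field names (`road_id`, `position_m`), and the 30 km/h slow-down are all made up for the example:

```python
from dataclasses import dataclass

@dataclass
class HazardMessage:
    road_id: str       # hypothetical road-segment identifier
    position_m: float  # distance along the segment, in metres
    reason: str

class Car:
    def __init__(self, road_id: str, position_m: float, speed_kmh: float):
        self.road_id = road_id
        self.position_m = position_m
        self.speed_kmh = speed_kmh

    def broadcast_if_stopped(self):
        # A stationary car announces itself to the mesh.
        if self.speed_kmh == 0:
            return HazardMessage(self.road_id, self.position_m, "stationary traffic")
        return None

    def on_message(self, msg: HazardMessage):
        # Slow down if the hazard is ahead on our segment, even when it
        # is hidden round a blind bend that no eyes could see past.
        if msg.road_id == self.road_id and msg.position_m > self.position_m:
            self.speed_kmh = min(self.speed_kmh, 30)

# The last car round the bend warns the car approaching behind it.
stopped = Car("bend-segment", position_m=500.0, speed_kmh=0)
approaching = Car("bend-segment", position_m=200.0, speed_kmh=90)

msg = stopped.broadcast_if_stopped()
if msg:
    approaching.on_message(msg)

print(approaching.speed_kmh)  # 30
```

The point is only that the warning propagates through messages rather than line of sight, which is exactly what the human driver cannot do.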

Eventually self-driving cars will be substantially safer than human-driven ones, and you’ll get a mass vote bias to ban human driving, when, for example, a child is killed by a human driver and months have passed since the last self-driven accident. Then the law changes and humans are banned from driving. The only debate is when.

When self-driving becomes safer, those humans who still want to drive will become social pariahs in the period while the law catches up.

Once you remove the human from driving, the vehicle can move between charging stations on its own, so you remove the need for homes to also be charging locations. The self-driving vehicles cooperate to get a car to where you need it, predicting demand. Your AI home will know what you’re probably going to do next, and eventually a vehicle will appear just as you’re about to need one.

Cities which no longer need parking, and which have banned human driving, will become safer, have the best air quality, and attract the most educated residents. So the jobs move out of human-driving cities, and when you move to where the job is, you’re not even permitted to own or drive a car.

As a finger-in-the-air estimate, change takes a generation, since children accept something their parents rejected. So on the order of 2 or 3 human reproductive cycles, i.e. 60-90 years, until humans are arrested for driving.

Too much belief in AI.
At the moment our IT engineers are not able to program a working OS for a simple plane, one still WITH pilots on board.

You all know that there are fully autonomous vehicles on the road right now… right?


What happens when computers program computers?
With machine learning (“ML”), the humans don’t even program it any more.

Imagine a synthetic environment where an ML aircraft system runs inside a fake reality of different scenarios; you can test the software faster. Far faster than a real aircraft piloted by humans.

But I know, Arnie settled the robots-and-AI argument 36 years ago.


We were doing program verification.
This is a very interesting field of informatics.
It was only a simple attempt to check a program with another algorithm, to see whether in all situations it does what it should do…
Not easy to realize, but it works.
I don’t think many “programs” have had such a verification.
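The flavour of that idea can be shown with a tiny sketch: one obviously correct “specification” function, one optimised implementation, and a checker that compares them across a whole input domain. This is exhaustive checking over a finite range, the crudest form of the idea, not the formal verification used on real systems; both function names are invented for the example:

```python
def spec_abs(x: int) -> int:
    # Reference specification: the obvious, clearly correct version.
    return x if x >= 0 else -x

def fast_abs(x: int) -> int:
    # "Optimised" implementation we want to verify: a branchless
    # two's-complement trick (mask is 0 for x >= 0, -1 for x < 0).
    mask = x >> 31
    return (x + mask) ^ mask

# The checking "algorithm": confirm that in every situation (here,
# every input in a finite domain) the program does what it should do.
for x in range(-10_000, 10_001):
    assert fast_abs(x) == spec_abs(x), f"mismatch at {x}"
print("verified on the tested domain")
```

Even this toy shows why it is “not easy to realize”: the checker is only as trustworthy as the specification, and covering *all* situations of a real program needs far more than a loop.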