Self-Driving Cars Are Dangerously Confused By LED Lights. A transcript of the video follows:
“Here's a weird thing I've noticed. Look: this vehicle has its rear lights on, and they look normal. But I'm also filming it on my GH5. Just watch for a second and see what happens: they turn off, and then they come back on again. What's that? And there are the front lights doing it too. Crazy, I reckon.
I can explain what's happening here, but first I want to say why it's important. You could argue that it doesn't matter that there's a strange interaction between my camera and the lights on the back of a car: so long as it looks the way it's supposed to look to human eyes, that's all that matters, because ultimately it's a human driver who's going to be making decisions about safety based on the cars around them.
Except that, well, personally I hope that in the not-too-distant future I won't have to be the human making those decisions. I want a computer to be making those decisions for me. In other words, I want my self-driving car.
So while LED lights on cars might look normal to our analogue human senses, a self-driving car builds up a model of the world based on digital senses, so the way it perceives LED lights on cars might be closer to the footage that I captured on my digital camera. Let's see if we can concoct a scenario in which that kind of digital artefact might be a problem. Here in the UK, the rear lights of a car are red and the turn signals, or indicators as we call them here, are amber.
But in America, turn signals can be red. So an artificial intelligence might receive this video footage and conclude that the vehicle intends to turn, or, if it can see both lights, that the vehicle is indicating a hazard, and based on that information it could make a dangerous decision. I'm not saying that the turn signal system in the UK is better than the system in America.
I'm just heavily implying it. This isn't the only hazardous scenario we can imagine, but to cook up a few more we need to understand why it's happening in the first place, and it's all to do with the way we vary the brightness of LEDs. You could vary the voltage, and that would vary the brightness.
But actually it's easier and cheaper to instead simply turn the LED off and then turn it back on again, and just do that repeatedly, but so fast that it's not perceptible to the human eye. So, for example, if it's turning on and off 50 times a second, that won't be noticeable to most people most of the time.
Let's look at this on a timeline, where the LED is turning on and off 50 times per second.
Each block here represents a 50th of a second, and the LED is on for half the time in each one of those blocks, so in this scenario the LED would appear to be half as bright as if it were on continuously. If you wanted to make it appear dimmer than that, you would just have it on for a shorter fraction of that 50th of a second; if you wanted it to be brighter, you would have it on for a larger fraction of that 50th of a second. The fraction of time for which the LED is on is called the duty cycle,
and this method of dimming LEDs is called pulse-width modulation.
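To make that concrete, here is a minimal sketch of the idea, using the 50 Hz figure from the timeline above (the brightness values are just illustrative): the perceived brightness is simply the duty cycle, and the on-time within each period follows from it.

```python
# Pulse-width modulation: perceived brightness scales with the duty cycle.
# Illustrative numbers only (50 Hz PWM, as in the timeline above).

PWM_FREQUENCY_HZ = 50
PERIOD_S = 1 / PWM_FREQUENCY_HZ          # one "block" on the timeline: 20 ms

def on_time_for_brightness(duty_cycle: float) -> float:
    """How long the LED stays on within each 20 ms period."""
    return duty_cycle * PERIOD_S

for duty_cycle in (0.25, 0.5, 0.75, 1.0):
    print(f"duty cycle {duty_cycle:.0%}: on for "
          f"{on_time_for_brightness(duty_cycle) * 1000:.0f} ms of every 20 ms, "
          f"so it looks {duty_cycle:.0%} as bright as fully on")
```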
Now, let's add a camera to the timeline. Here, the camera is operating at 50 frames per second, and we're showing that the shutter is only open for a small fraction of that 50th of a second.
So what the camera sees while the shutter is open is that the LED is on, and it's on for the whole duration that the shutter is open. So when you look back at the footage, you'll see that the LED appears to be on, which is good, because that's what the human perceives as well. Except
we've already introduced a digital artefact, because from the point of view of the camera the LED appears to be on a hundred percent of the time, so it will appear brighter to the camera than it would to a human. But even worse, imagine we started the camera a little bit later, so we're shifting everything along a little bit. Now the shutter opens when the LED is off, and it will for every single frame.
So when you look back at the footage, the LED appears to be off, and now we already have another disastrous scenario, where the lights don't appear to be on at all.
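A minimal sketch of that phase dependence, assuming the same idealised square-wave PWM and the 50 fps camera described above (the phase offsets are made up for illustration, and the shutter is treated as so short that only the instant it opens matters):

```python
# When the LED's PWM frequency equals the camera's frame rate, every frame
# samples the same point in the LED's cycle, so the starting phase decides
# whether the LED looks permanently on or permanently off.

PWM_FREQUENCY_HZ = 50          # LED switching rate
FRAME_RATE_FPS = 50            # camera frame rate (the same, in this scenario)
DUTY_CYCLE = 0.5               # LED on for half of each period

def led_is_on(t: float) -> bool:
    """Square-wave PWM: on for the first half of each period."""
    return (t * PWM_FREQUENCY_HZ) % 1.0 < DUTY_CYCLE

def frame_sees_light(phase_offset_s: float, frame_index: int) -> bool:
    """Does a very short shutter catch the LED on at the moment it opens?"""
    shutter_opens_at = frame_index / FRAME_RATE_FPS + phase_offset_s
    return led_is_on(shutter_opens_at)

for phase in (0.000, 0.012):   # start the camera 0 ms vs 12 ms "later"
    frames = [frame_sees_light(phase, i) for i in range(5)]
    print(f"phase offset {phase * 1000:4.0f} ms -> frames see LED on: {frames}")
```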
For example, watching this footage you might assume that the BMW driver has neglected to use the turn signal. But a BMW driver would never do that, so it must be an issue with the camera. It can also cause problems with LED traffic lights: imagine a red light, telling you to stop, that the camera just can't see.
The timeline that we've been using up to now shows an LED and a camera with the same frequency, 50 hertz, but in reality that might not happen particularly often.
For example, maybe the frequency of the LED is going to be slightly less than the frame rate of the camera. So let's stretch out the LED line to illustrate that, and now you can see: sometimes the open shutter of the camera lines up with the LED when it's on, and sometimes it lines up with the LED when it's off. So when you watch the footage back, what you perceive is an LED that seems to be turning on and off repeatedly, which is exactly what I got with the Mercedes here. This slow apparent turning on and off of the LEDs is called the beat frequency.
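As a rough back-of-the-envelope sketch (the 48 Hz figure is just an assumed example of "slightly less than the frame rate", not taken from the footage), the apparent blink rate is the difference between the two frequencies:

```python
# Beat frequency: when the LED's PWM rate and the camera's frame rate are
# slightly different, the sampled brightness drifts in and out of phase.
# The apparent slow blinking happens at the difference of the two rates.

FRAME_RATE_FPS = 50.0      # camera
PWM_FREQUENCY_HZ = 48.0    # LED, assumed slightly lower for illustration

beat_hz = abs(PWM_FREQUENCY_HZ - FRAME_RATE_FPS)   # 2 Hz
apparent_period_s = 1 / beat_hz                     # 0.5 s per on/off cycle

print(f"Apparent blink rate: {beat_hz:.1f} Hz "
      f"(one slow on/off cycle every {apparent_period_s:.2f} s)")
```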
Practically speaking, you could probably quite easily teach an artificial intelligence to recognize that this footage from the Mercedes isn't a turn signal, because the light is on for far too long and it's only off for a brief period; it doesn't look like a turn signal. But the point is there's no standardization in the LEDs in cars, which is to say the duty cycle could be anything and the frequency could be anything, and it's trivial to concoct a beat frequency that looks just like a turn signal.
In fact, this footage was captured in the wild, and I think this would even fool a human observer. Which actually reminds me: this isn't just a problem for self-driving cars and AI. Some car manufacturers are starting to replace wing mirrors with cameras, so you look at the central screen in your car
instead of at your wing mirrors, and yeah, it would be easy to think that a car was turning in this scenario. There are a few more scenarios,
but I'll tell you about just one more, which is when the frequency and the frame rate are quite different: you end up with this horrible, really fast flicker, and out of the corner of your eye a human could interpret that as an emergency vehicle nearby. These might seem like edge-case bugs in the system, and they are; they're only going to crop up very rarely. But it's not like an edge-case bug in a mobile phone, for example, where if it's really expensive to fix the bug you could just ignore it, because it's only affecting a handful of people and they'll probably just jump ship to Android. If the edge-case bug exists in a self-driving car, then it could kill or injure that handful of people, so you have to find a fix.
It would be nice if we could just control the environment, like force all car manufacturers to revert to incandescent bulbs. But incandescent bulbs are bad for other reasons: they're hugely energy inefficient, and you can't use them to create these really interesting designs that we see in car lighting these days.
But also, it's just not how the world works, is it? And actually that's a general problem that autonomous vehicles face: the world is messy, and you're going to have to deal with that. If we can't change the world around the car, we have to change the car. But what does that solution look like? I spoke to someone who knows a lot more about these things than I do. Yes, my name's Rob Stead.
We organize conferences for nerds who work on automotive safety systems to do with cameras and other perception sensors.
So what do you do? Do you train the AI on all these different LED flicker scenarios, or do you fix the camera? Yes. So the latter, really. Rather than saying, okay, we accept that there's a flashing light here and we need to train the system differently,
there's mitigation that you can put in place at the sensor level, so that what the camera does eventually see is a solid brake light when it's meant to see a solid brake light. One solution would be to make sure that the shutter of the camera is open for the entirety of the frame; that way, if the frequency of the LEDs is at least, I don't know, double the frame rate of the camera,
then the camera should be capturing roughly the right brightness in each frame.
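Here is a rough numerical sketch of why that helps, assuming the same idealised square-wave PWM as before and a made-up 210 Hz LED (a bit more than four times a 50 fps frame rate); none of these numbers come from a real camera, they just show the trend.

```python
# Compare a short shutter (samples one instant) with a shutter held open for
# the whole frame (integrates the light) when filming a PWM-dimmed LED.
# Illustrative numbers: 50 fps camera, LED PWM at 210 Hz, 50% duty cycle.

FRAME_RATE_FPS = 50
FRAME_S = 1 / FRAME_RATE_FPS
PWM_HZ = 210
DUTY_CYCLE = 0.5
STEPS = 20_000  # numerical integration resolution within one frame

def led_is_on(t: float) -> bool:
    return (t * PWM_HZ) % 1.0 < DUTY_CYCLE

def full_frame_brightness(phase_s: float) -> float:
    """Average brightness when the shutter stays open for the entire frame."""
    on = sum(led_is_on(phase_s + (i + 0.5) * FRAME_S / STEPS) for i in range(STEPS))
    return on / STEPS

phases = [k * FRAME_S / 40 for k in range(40)]        # many possible frame start phases
short = [1.0 if led_is_on(p) else 0.0 for p in phases]  # short shutter: one instant
full = [full_frame_brightness(p) for p in phases]       # shutter open all frame

print(f"short shutter reads anywhere from {min(short):.0%} to {max(short):.0%}")
print(f"full-frame shutter reads {min(full):.0%} to {max(full):.0%} "
      f"(true duty cycle {DUTY_CYCLE:.0%})")
```

In this example the whole-frame exposure is wrong by only a couple of percent in the worst case, rather than producing a light that can look completely off.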
I can illustrate that quite nicely with my camera. See, here the shutter is only open for a brief fraction of the frame, and we're getting quite bad flicker from the front headlights. But as I increase the amount of time that the shutter is open for, the flicker starts to fade away. The problem now, as you can see, is that the image is overexposed, because we're letting too much light onto the sensor. To counteract that, we can close down the aperture, reducing the amount of light hitting the sensor. There's still a little bit of rolling-shutter artefact,
but it's nowhere near as bad. So is that the solution: keep the shutter open for the whole frame and make the aperture smaller? Well, yes
and no. We do want to keep the shutter open for the whole frame, but we can't change the size of the aperture. I spoke to some engineers working on the problem, and they tell me that the cameras on cars are fixed-aperture. That's because the variable aperture mechanism in a lens is quite delicate, and if you think about what a car goes through (imagine the camera that's attached to the boot of a car, and repeatedly slamming that boot closed), that delicate mechanism isn't going to survive.
So instead they're working on camera sensors that have the capability of being much less sensitive, or, to put it in normal camera terms, to create a camera that has the possibility of a much lower ISO. That way you could keep the shutter open for the whole frame without it being overexposed.
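As a back-of-the-envelope sketch of that trade-off (the shutter times and ISO values here are invented for illustration): at a fixed aperture, exposure scales roughly with shutter time times sensor sensitivity, so holding the shutter open ten times longer means the sensor needs to be about ten times less sensitive to end up with the same exposure.

```python
# Rough exposure bookkeeping at a fixed aperture: exposure ~ shutter_time * ISO.
# Invented numbers, just to show why a full-frame shutter needs a much less
# sensitive (lower-ISO) sensor to avoid overexposing.

FRAME_RATE_FPS = 50
full_frame_shutter_s = 1 / FRAME_RATE_FPS      # 20 ms: shutter open for the whole frame
short_shutter_s = 1 / 500                      # 2 ms: an assumed short exposure
iso_with_short_shutter = 800                   # assumed baseline sensitivity

# Keeping the same exposure when the shutter stays open 10x longer
# means the sensitivity has to drop by the same factor.
required_iso = iso_with_short_shutter * short_shutter_s / full_frame_shutter_s
print(f"Equivalent sensitivity needed: ISO {required_iso:.0f}")   # ISO 80
```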
You might be thinking that if you keep the shutter open for the entirety of a frame, you're going to occasionally have really bad motion blur to deal with, and that's true. But it seems as though it's easier for artificial intelligence to deal with motion blur than it is with these flickering LED artefacts. So when is it going to be fixed? Well, that's an interesting question,
but at the same time there are loads and loads of problems that need to be fixed before we see the mass adoption of self-driving cars, and they're all being solved in parallel, so it's really hard to say. It's one of those situations where, you know, it's quite easy to develop the first 98 percent of the technology, to make it 98 percent safe,
but in order for us to deploy fully driverless vehicles on the road, mixed in with other traffic, they have to be super, super safe. I often talk about speculative technologies, and it throws up a certain type of person you meet in the comments who says, well, this is why it's not going to work, the end. And I can understand why it's fun to leave a comment like that, because it feels good.
It's like: I've thought of this thing that makes it all impossible, none of you thought of it, you're all idiots, and I'm really clever, so I'm going to type it. What I really like is that you have this other type of person who says: yeah,
we knew about that problem, and we built it anyway, because that's how you figure out how to solve the problem. These are all technical challenges, and when you break it down they're all quite niche technical challenges that come together to create the whole system. Yeah, and there are plenty of nerds out there that like to work on solving those problems. Thanks to the IEEE P2020
standards group for helping with this video and for letting me use their footage, and thanks to Merck for sponsoring this video. Merck is a science and technology company.
As part of their curiosity initiative, they want us all to be more curious in our everyday lives. Like, for example, on your commute, maybe you start thinking about how the lights on modern cars work and how they might interact with autonomous vehicles in the future. It can't just be me. On their Worlds of Curiosity website they talk about why curiosity is so important and how it could lead to a better world; link to that in the description.
I hope you enjoyed this video. If you did, don't forget to hit subscribe, and I'll see you next time.”
description:
“This video is sponsored by Merck, a science and technology company. Find out how curiosity can change our future: https://www.merckgroup.com/en/worlds-of-curiosity/?ko=smo #alwayscurious

LED lights can look like they're doing weird things when viewed through a digital camera. It's all to do with the frame rate of the camera and the duty cycle of the LED. It's a real problem for autonomous vehicles: self-driving cars analyse footage from cameras to make driving decisions that affect road safety. Under the right conditions a brake light can even look like a turn signal! So what's the solution?

The image of the car from the thumbnail is by Michael Shick: https://commons.wikimedia.org/wiki/File:Google_self_driving_car_at_the_Googleplex.jpg

The shrug emoji from the thumbnail is by EmmanuelCordoliani: https://commons.wikimedia.org/wiki/File:Person_shrugging_Emoji.png

You can buy my books here:
https://stevemould.com/books

You can support me on Patreon here:
https://www.patreon.com/stevemould

just like these amazing people:

Glenn Watson
Peter Turner
Joël van der Loo
Matthew Cocke
Mark Brouwer
Deneb

Twitter: http://twitter.com/moulds
Instagram: https://www.instagram.com/stevemouldscience/
Facebook: https://www.facebook.com/stevemouldscience/
Buy nerdy maths things: http://mathsgear.co.uk”