  1. Are you ever stressed by the fact that one tiny mistake you make could potentially cost (actual) lives?
  2. To what degree would you feel responsible if a car using software you wrote caused an accident?
  3. How rigorous is the testing done on your project (both dev testing and QA testing)?
  4. Do you have automated testing? If so, how reliable are your tests?
  5. Do you ever submit code that has known issues (regardless of severity) just to meet deadlines?
  6. How do you handle ethical issues, such as the trolley dilemma? (iflscience.com/editors-blog/trolle...) Is there a special (legal) team that gives you requirements/tasks in this sense? What if you disagree with them (morally)?
 

Hi,

I'll try to answer all your points:

  1. Because of how these systems and the development process are designed, it can never be the mistake of only one developer: there are code reviews, there is something called a functional safety component, and there are multiple levels of tests. So if something goes wrong, it would be the mistake of multiple people. Of course, such things can still happen. But given how many people are dying in traffic accidents today, I would already be happy if that number could be strongly decreased by autonomous driving, even if it never reaches 0.

  2. As I already mentioned, it can never be the responsibility of only one developer; it would have to be a mistake by multiple people. But if so, of course I would be partly responsible. As things stand now, most driver assistance systems, like the emergency brake assistant, state explicitly that they don't work 100% reliably, yet they still strongly decrease the number of accidents. It's like driving my car every day: of course there's a risk that I cause an accident, but I still do it.

  3. Right now, most such projects are still in the research phase. That means the cars are driven only by test drivers who have completed special driving courses. In this phase the testing is not so strict. Once a safety-critical system goes into serial development, there are a lot of restrictions on the process, on reviews, and on testing at both the code and the system level. For safety cases there are even software techniques: for example, two processors calculate the same result with two different algorithms, and if the results don't match, the system switches into an error state. There are a lot of other mechanisms like that as well.
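The two-processors-two-algorithms pattern described above can be sketched roughly as follows. This is a simplified illustration, not code from any real project: the braking-distance example, the function names, and the tolerance value are all invented for the sketch.

```python
def braking_distance_formula(speed_mps: float) -> float:
    """Algorithm 1: closed-form physics formula (assumed friction 0.7)."""
    return speed_mps ** 2 / (2 * 0.7 * 9.81)

def braking_distance_integrated(speed_mps: float, dt: float = 0.001) -> float:
    """Algorithm 2: the same quantity via numeric integration."""
    decel = 0.7 * 9.81
    v, d = speed_mps, 0.0
    while v > 0:
        d += v * dt
        v -= decel * dt
    return d

def checked_braking_distance(speed_mps: float, tolerance: float = 0.05):
    """Run both independent algorithms; if they disagree beyond the
    tolerance, report an error state instead of trusting either result."""
    d1 = braking_distance_formula(speed_mps)
    d2 = braking_distance_integrated(speed_mps)
    if abs(d1 - d2) > tolerance * max(d1, d2):
        return ("ERROR_STATE", None)  # divergent results: don't trust either
    return ("OK", d1)
```

The point of using two genuinely different algorithms (rather than running the same one twice) is that a systematic bug in one implementation is unlikely to produce the same wrong answer in the other.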

  4. There are automated tests on different levels as well: unit tests, software-in-the-loop, hardware-in-the-loop. They try to cover all cases, but of course issues can still slip through. There is manual testing on top of that.
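At the unit-test level, such a check might look like the sketch below. The decision function, its name, and its thresholds are hypothetical, invented only to show the shape of the tests; real emergency-brake logic is far more involved.

```python
import unittest

def should_emergency_brake(distance_m: float, closing_speed_mps: float,
                           reaction_s: float = 0.5) -> bool:
    """Hypothetical decision: brake if the obstacle would be reached
    within the reaction time plus a one-second safety margin."""
    if closing_speed_mps <= 0:        # obstacle not approaching
        return False
    time_to_collision = distance_m / closing_speed_mps
    return time_to_collision < reaction_s + 1.0

class EmergencyBrakeTest(unittest.TestCase):
    def test_far_obstacle_no_brake(self):
        self.assertFalse(should_emergency_brake(100.0, 10.0))

    def test_close_obstacle_brakes(self):
        self.assertTrue(should_emergency_brake(10.0, 10.0))

    def test_receding_obstacle_no_brake(self):
        self.assertFalse(should_emergency_brake(5.0, -2.0))
```

Tests like these run on every commit; the software-in-the-loop and hardware-in-the-loop stages then exercise the same logic against simulated and real sensor data.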

  5. Since this topic is still in the research phase, yes. But the parts that are not yet working properly are always documented, and such software should never be published. There are so-called software releases, which are tested properly.

  6. Requirements engineering is handled by a separate team; they break the high-level requirements and use cases down into smaller, component-level requirements. So as a developer, I never make such decisions. And right now, development is really concentrated on the more common use cases: overtaking, handling traffic lights, changing lanes properly and so on. In this phase such corner cases are not really analysed yet. On the other hand, in such a (really rare) situation you as the driver of a car would need to make your own decision as well. Have you ever thought about what your decision would be in such a case?

 

Thanks for taking the time to answer all my questions. It probably took you a while to write all those answers :)

I just have 3 things to add:

  • "In this phase such corner cases are not really analyzed yet" - corner cases that are deferred for later usually end up at the bottom of a backlog and are forgotten over time. While that usually works for "regular" software, I would opt for a more rigorous approach in your case, since the software you're developing is, quite literally, a matter of life and death.

  • "you as the driver of a car would need to make your own decision as well. Have you ever thought about what your decision would be in such a case?"

That's precisely the problem. When it's my decision and my decision alone, then it's on me.

But when a fully autonomous car makes a decision like that, I'm not ok with living with the guilt of a decision I didn't make.
That's why these moral issues are so delicate and should be analyzed thoroughly before anything is implemented.

  • "So as a developer, I never make such decisions"

Just some friendly advice, be very careful with that.
You may think that "you're just following orders", but that's exactly what the developers at Volkswagen thought when they wrote the software that tricked the pollution tests. And one of them went to jail for it: bbc.com/news/business-41053740.

Thanks again for taking the time to answer my questions.

Best of luck in the future!

 

How are the vehicles programmed to behave if they lose one or more sensors while in motion/driving?

For example: if an autonomous car loses its front-facing camera, can it take appropriate action without intervention from the driver?

 

First of all, the different sensors can replace each other in some cases. For example, in heavy rain you can't rely on the camera, but the radar/lidar is enough, etc.
On the other hand, in case of sensor errors the system goes into a so-called degradation mode. Depending on the type of degradation, it either introduces some limitations into the functionality or disables it entirely. In the second case, the car brings itself into a safe state as far as possible, e.g. by stopping.
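The degradation logic described above could be sketched as a small mapping from failed sensors to an operating mode. Everything here is illustrative: the sensor names, the redundant/essential split, and the mode names are assumptions for the sketch, not an actual architecture.

```python
from enum import Enum

class Mode(Enum):
    NOMINAL = "nominal"
    LIMITED = "limited"    # functionality restricted (e.g. reduced speed)
    DISABLED = "disabled"  # function off; bring the car to a safe stop

# Hypothetical classification: redundant sensors can partly cover for
# each other, while losing an essential sensor forces a safe stop.
REDUNDANT = {"camera", "radar"}
ESSENTIAL = {"wheel_odometry"}

def degrade(failed_sensors: set) -> Mode:
    """Pick an operating mode based on which sensors have failed."""
    if failed_sensors & ESSENTIAL or failed_sensors >= REDUNDANT:
        return Mode.DISABLED  # no trustworthy perception left
    if failed_sensors:
        return Mode.LIMITED   # one redundant sensor can still cover
    return Mode.NOMINAL
```

So losing only the front camera would land in the "limited" branch (radar still covers), while losing both camera and radar, or an essential sensor, triggers the safe-stop behaviour.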

 
  1. What programming language do you use?

  2. How big is your team, and what roles do its members have?

  3. Do you use agile methods, e.g. Scrum?

 
  1. Mainly C++, but Python is used as well for prototyping and tooling
  2. It is a really huge project with over 100 engineers working on it, in very different roles: system engineers, product owners, software engineers, test engineers, etc.
  3. Yes
 

I have a feeling that if an autonomous car is built to work on extremely crowded and poorly defined roads, it would work even better anywhere else. How far from reality am I?

 

I'm not sure I understand you correctly. But yes, the city scenarios are the most complicated ones. Still, there can be other tricky situations as well: difficult weather conditions (fog, heavy rain, snowy road surface) or, for example, road construction.

 

What do you think will be the timeline before autonomous cars become commonplace on the roads? And, how long do you think before the majority of cars are autonomous?

 

Here I'm a bit pessimistic. Most companies are planning their first serial production within the upcoming 5-7 years, but I think it will take a bit longer. Once it is introduced, though, I think it will become popular quite fast. And since most new cars today already have most of the sensors (front camera, back camera, radar, ultrasonic sensors) and it is mainly the software that needs to change, I don't expect prices to be too high. On the other hand, there are different levels of autonomous driving. Features like the highway pilot or the traffic jam assistant are already almost perfect.


Marcell Lipp
Hi, I'm a software developer with around 5 years of experience. Currently I'm mainly focusing on software architecture, technical project leading, programmer soft skills and blogging.