
markrossatos for AWS Community Builders

Originally published at Medium

AWS Deep Racer Event Hosting — In-Person Racing

Our event set-up in the Atos London offices

This is the third and final part in my three-part series on AWS Deep Racer. If you’ve arrived here directly, the first part in the series can be viewed here, and mainly covers setting up AWS Deep Racer in a single AWS account using multi-user mode. The second part can be viewed here, and mainly covers running a virtual online race. I’d recommend you read those first, then return to read about the physical racing. There’s quite a lot to consider when hosting an in-person race, so let’s get started…

The room you host the event in needs to be a good size. The re:Invent 2018 track we had printed came in at 7.9m x 5.2m, and some of the newer tracks are even longer! I would strongly recommend printing the track at the correct dimensions, matching the one your models trained on in the virtual world, to give them the best chance. You also need to consider what surface you’re laying the track on and how you’ll keep it stretched out. We’ve previously laid our track on a tiled floor and it worked well; this time we laid it on a carpet-tiled floor, and because it had a little bit of give we experienced a few wrinkles, which got worse during the day as people followed the car around. There are other options too: for example, you could create your track with white tape on an existing dark-coloured floor to keep your costs to a minimum, but a printed track certainly looks more professional. If you need help deciding what to build and how, AWS provide a nice guide here.

Lighting, and which objects are in the room, is another important consideration. Trying to remove or cover white objects that the car may be able to see is a sensible idea, otherwise you may find your models behave differently than they did in the virtual world. We had to cover some white support posts, as well as turning off some screens with a white background that seemed to be interfering with the cars. Another technique, which has the added benefit of stopping the cars getting too far away when they veer off track, is to build barriers. Our barriers were knee high, which is logistically a lot easier than the waist-high ones you see at AWS re:Invent and the summits; however, when watching video of what the car’s camera was picking up, I noticed it could see over the barrier at certain points. Try to make the lighting as consistent as possible too, to avoid lots of shadows or particularly bright spots compared with the rest of the set-up. We used our standard interior room lighting and, for consistency, pulled down the blinds, as half of the walls in the room were floor-to-ceiling glass.

Moving on to the cars, I would strongly recommend you have multiples of everything available; after all, ‘everything fails all the time’, as Werner Vogels would say! Having multiple cars gives resilience in case of failure, so you can carry on racing whilst troubleshooting any issues. It also allows you to get a production line going: whilst one car is racing on the track, the next car can have a model loaded onto it, which avoids long gaps with no car on the track and keeps the audience engaged. Extra batteries, on top of those inside each car, would also be a sensible addition, so batteries can be charging whilst cars are racing. Details on batteries and other spare parts can be found here. Details on how to set up and calibrate the cars can be found on the AWS website; you’ll want to calibrate the cars initially, and if model performance deteriorates during the day a recalibration may be required, as heavy crashes into the barriers can affect calibration.

A couple of other useful additions to your set-up are a mechanism for timing each lap and a wireless network. Timing could range from a simple manual timer (e.g. a phone or stopwatch) to, for a more professional job, an automated mechanism on the start/finish line using a pressure sensor and a Raspberry Pi. A separate wireless router is useful too, as you can preconfigure everything to that network so you don’t have to reconfigure things if you take them to different offices, and all the higher-bandwidth activity can stay local.
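If you go down the automated route, the logic is straightforward: record the time of each start/finish-line crossing and ignore triggers that arrive too quickly to be a real lap. Here’s a minimal sketch of that idea, assuming a pressure or contact strip wired to a Raspberry Pi GPIO pin and read with the gpiozero library; the pin number and the minimum lap time are assumptions you’d tune for your own set-up.

```python
# Minimal lap-timer sketch for a Raspberry Pi.
# Assumes a pressure/contact strip on the start/finish line wired to GPIO 17
# (the pin and the debounce window below are assumptions, not our exact build).
from time import monotonic
from gpiozero import Button

SENSOR_PIN = 17          # hypothetical wiring; change to match your sensor
MIN_LAP_SECONDS = 5.0    # ignore repeat triggers as the car rolls over the strip

sensor = Button(SENSOR_PIN, pull_up=True)

last_crossing = None
lap = 0

print("Waiting for the first crossing of the start/finish line...")
while True:
    sensor.wait_for_press()            # blocks until the strip is pressed
    now = monotonic()
    if last_crossing is None:
        last_crossing = now            # first crossing starts the clock
    elif now - last_crossing >= MIN_LAP_SECONDS:
        lap += 1
        print(f"Lap {lap}: {now - last_crossing:.3f}s")
        last_crossing = now
    sensor.wait_for_release()          # avoid double-counting a single crossing
```

The minimum-lap-time check is doing the same job as a debounce: the car’s wheels can trigger the sensor more than once as it passes, so anything quicker than a plausible lap is treated as part of the same crossing.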

Once you have everything in place, I strongly recommend you test everything. This allows you to test the team that’s running the event, so everyone knows what they’re doing, and it also allows you to test the environment, in case anything is adversely affecting the cars.

Our in-person race was for our top 30 participants from a virtual qualifying round. We gave them additional hours of training so they could further alter and improve their models between the end of qualifying and the day of the race. Around 20 of the qualifiers were from the UK and could get to our London offices, whilst the other 10 were from further afield (USA, Guatemala, Denmark, France, Romania and Germany). To keep our remote colleagues engaged we set up a couple of webcams to capture the action from different angles and, with the help of some colleagues, streamed the webcams and a live leaderboard into the metaverse, complete with a representation of the track!

AWS Deep Racer goes into the metaverse!

We gave each racer an initial time slot to come and test out their models. Some models transferred well to the physical world, whereas others that had performed well in the virtual world struggled a bit. In machine learning this problem is known as overfitting: some of the models that did very well in the virtual world had become overly reliant on the training data (the virtual environment) and weren’t so good at generalising to things they’d not previously seen (e.g. a wrinkle in the track, a shadow, etc.). This was a good learning point for participants, and something that transfers into the real world: a road isn’t a sterile environment, it could have rubbish on it, a pothole, or markings that have faded. Once everyone had been given a round, we invited the top 10 back for a further shoot-out to see if their times could be improved.
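If it helps to picture what overfitting looks like outside of DeepRacer, here’s a small, purely illustrative Python sketch (nothing to do with the DeepRacer service itself): a high-degree polynomial can fit a handful of noisy training points almost perfectly, but typically does worse on fresh points from the same underlying curve, much like a model that aces the virtual track but stumbles on a wrinkle or a shadow.

```python
# Toy illustration of overfitting, not DeepRacer code.
# "Virtual world" = training points; "physical world" = new points from the
# same underlying curve, sampled in slightly different places with new noise.
import numpy as np

rng = np.random.default_rng(seed=0)

x_train = np.linspace(0, 1, 8)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.1, x_train.size)

x_test = np.linspace(0.02, 0.98, 50)
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.1, x_test.size)

for degree in (3, 7):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    # The degree-7 fit passes almost exactly through its training points
    # (near-zero training error) but tends to wobble between them, so its
    # error on the new points is usually noticeably worse.
    print(f"degree {degree}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```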

The results of the top racers were impressive, and interestingly the podium for the in-person event was completely different to that of the virtual qualifiers, with each of the top 3 going quicker in person than they’d managed in the virtual qualifiers.

Atos and Cloudreach in-person leaderboard top 12

Our winner, Marco, had an excellent model. Not only was it the fastest, it also got around the track consistently: in his first run he posted a sub-9s time, and then in the top-10 shoot-out he lowered that to an incredible 7.939s.


Fastest lap from our live finals

A worthy winner, and in the process, to everyone’s surprise, he bagged himself an all-expenses-paid trip to re:Invent, presented to him by one of our Atos OneCloud leadership team, Santi Ribas!

Presenter Santi Ribas (left) with our podium left to right (Nickson, Marco, Simon)

Participants and AWS colleagues who helped to run the event

Finally, I’d like to thank those who helped to organise and run the event: my Atos colleagues Matt Knight and Neil Clark, who helped with the virtual and physical races, and my colleagues Vrushali Malankar and Kshitij Bhatnagar for creating the metaverse. I’d also like to thank my AWS colleagues Sathya Paduchuri, Bharath Sridharan, Rajan Patel, Stuart Lupton, Pete Moles and Jenny Vega.
