
How we made the return process more efficient thanks to an IPA beer notification hack

We built a return app using AWS Rekognition.
Daniel Wellington is a Swedish fashion brand founded in 2011. Since its inception, it has sold over 11 million watches and established itself as one of the fastest-growing and most beloved brands in the history of the watch industry.

Most people know we were the pioneer of influencer marketing. That is impossible to achieve without advanced technology. To make it possible, we have been using Amazon Web Services (AWS) since 2014. We use services like Amazon Elastic Container Service, AWS Lambda, Amazon DynamoDB, Amazon SageMaker and Amazon Rekognition.

We develop our services and applications on the AWS cloud platform and make extensive use of all its amazing serverless features, neatly matched with microservice-based architectures.

Our design principle is serverless first. If serverless is not available or practical, containers are recommended; EC2 is legacy. JavaScript and Go are some of our languages of choice.

As a global fashion company that has used technology to disrupt an entire industry, we are continually learning and challenging our way of doing business. Our latest venture is to understand how machine learning can help us develop business value.

About a year ago, we started looking into how we could raise machine learning awareness in the business. Since then, we have started an internal machine learning project where we are building a model to predict and detect certain events. Besides that, one of our employees, Anders, has also been playing around with the Rekognition service in his spare time, more specifically its text-in-image feature.

The actual beer menu that is automatically captured and published several times per hour by the hipster bar.

Why?

Well, what Anders tried to do was to get a notification whenever a hipster bar on the south side of Stockholm added a new IPA to the beer menu.

Why over-engineer the problem?

The problem is that they update the menu so often that they write it by hand and publish a picture of it on their website instead of maintaining the menu as text on the site.
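For the curious, here is a rough sketch of how a hack like that could look with Rekognition's text detection and the AWS SDK for JavaScript (v3). The menu URL, region and the notification step are placeholders for illustration, not the exact setup:

```typescript
import { RekognitionClient, DetectTextCommand } from "@aws-sdk/client-rekognition";

// Placeholder URL -- the bar publishes a fresh photo of the menu several times per hour.
const MENU_IMAGE_URL = "https://example.com/beer-menu.jpg";

const rekognition = new RekognitionClient({ region: "eu-west-1" });

async function checkMenuForIpa(): Promise<void> {
  // Fetch the latest photo of the hand-written menu.
  const response = await fetch(MENU_IMAGE_URL);
  const imageBytes = new Uint8Array(await response.arrayBuffer());

  // Ask Rekognition to read the text in the image.
  const { TextDetections = [] } = await rekognition.send(
    new DetectTextCommand({ Image: { Bytes: imageBytes } })
  );

  // Keep whole lines of detected text and look for anything mentioning "IPA".
  const ipaLines = TextDetections
    .filter((d) => d.Type === "LINE" && d.DetectedText)
    .map((d) => d.DetectedText as string)
    .filter((line) => /\bIPA\b/i.test(line));

  if (ipaLines.length > 0) {
    // The real hack would send a notification here (Slack, SNS, etc.);
    // in this sketch we simply log the matching menu lines.
    console.log("IPA spotted on the menu:", ipaLines);
  }
}

checkMenuForIpa().catch(console.error);
```

Run on a schedule (for example an EventBridge-triggered Lambda), this is all it takes to turn a hand-written menu photo into an alert.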

With Anders' marvellous personal experience using the Rekognition service, he wondered if we could employ it to solve an actual business problem. What we came up with was to use it to scan and read the back plates of our watches to help the warehouse workers who take care of returns.

During peak season, each of them may have to handle several hundred returns per day. That is a very tedious process, and we learned that it took quite a while to look up the details in different systems and relabel the return. The relabelling part alone took at least 40 and up to 60 seconds per item. At the same time, we realized that the business had been looking for a third-party solution for almost 18 months.

So we decided to solve the problem with ML technology. The outcome is a React Native app that communicates with an Amazon API Gateway and several Lambda functions, making use of SQS, Rekognition, and finally a serverless printer server (yes, you read that right).

The architecture diagram
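To give an idea of how the pieces fit together, here is a simplified sketch of the kind of Lambda handler that sits behind the API Gateway. The payload shape, the environment variable and the queue message format are illustrative rather than our exact implementation:

```typescript
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";
import { RekognitionClient, DetectTextCommand } from "@aws-sdk/client-rekognition";
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const rekognition = new RekognitionClient({});
const sqs = new SQSClient({});

// Hypothetical queue consumed by the serverless printer server.
const PRINT_QUEUE_URL = process.env.PRINT_QUEUE_URL ?? "";

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // The React Native app posts a base64-encoded photo of the watch back plate.
  const { imageBase64 } = JSON.parse(event.body ?? "{}");
  const imageBytes = Buffer.from(imageBase64, "base64");

  // Read the text engraved on the back plate.
  const { TextDetections = [] } = await rekognition.send(
    new DetectTextCommand({ Image: { Bytes: imageBytes } })
  );
  const lines = TextDetections
    .filter((d) => d.Type === "LINE" && d.DetectedText)
    .map((d) => d.DetectedText as string);

  // Queue a relabelling/print job with whatever was read from the plate.
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: PRINT_QUEUE_URL,
      MessageBody: JSON.stringify({ lines }),
    })
  );

  return { statusCode: 200, body: JSON.stringify({ lines }) };
};
```

Putting SQS between the Lambda and the printer server keeps the API response fast and lets print jobs be retried independently of the scan.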

Now, you may think that OCR is not a hard thing to do, and wonder why we are using a managed service when there are plenty of open source projects available. Well, we compared several frameworks against the AWS service and found that Rekognition had the highest precision, while the other frameworks were simply not good enough. Besides the best accuracy, we also found that the cost of running Rekognition is low.

Our warehouse workers can save up to 56 seconds per item, at a price of about $0.0013 per scan (Rekognition plus Lambda).
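A quick back-of-envelope calculation shows why that trade-off is easy (assuming, purely for illustration, 300 returns per worker on a peak day; the other figures are the ones above):

```typescript
// Hypothetical peak-day volume per worker; the per-scan numbers come from the text above.
const returnsPerDay = 300;
const secondsSavedPerScan = 56;
const costPerScanUsd = 0.0013; // Rekognition + Lambda

const hoursSaved = (returnsPerDay * secondsSavedPerScan) / 3600; // ~4.7 hours
const dailyCostUsd = returnsPerDay * costPerScanUsd;             // ~$0.39

console.log(`${hoursSaved.toFixed(1)} hours saved for about $${dailyCostUsd.toFixed(2)} per day`);
```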

As a result, we brought the relabelling process down to four seconds, and sometimes even faster, with higher quality and accuracy. The best part is that it took us less than two weeks to build and deploy it as an MVP. Coffee consumption in the warehouse dropped, because the workers no longer need to spend time on this tedious, repetitive task.

Here is a demo of it:

What’s next? We have now taken this project out of the proof-of-concept phase and are building a proper implementation of the app with all the additional requirements needed in a production environment, such as authentication, the ability to choose the AWS region closest to the warehouse, running automated builds, etc.

We will stay curious and keep experimenting until we set our eyes on the next problem to solve with the power of machine learning.

Do you want to know more about what it is like to work with technology at Daniel Wellington? Take a few minutes to watch the video.
