Rob Sherling

Serverless Backends with AWS Cloud: Sending tweets and collecting emails

This is part of a series on AWS serverless architecture. The original blog post series can be found both here and on my blog J-bytes.

The final disclaimer

I know I've said it about a dozen times before, but here it is for the last time: This tutorial was built to teach you cool AWS services and give you some practical experience with them. There are many, many different ways that you could have done this, and not all of them are equally suited for all purposes. Still, this very closely resembles actual production code that I have used in a company (with their explicit permission to write this guide, of course). It works, and it's battle-tested.

Also, go ahead and install MySQL on the computer you plan to do this on. You don't have to use MySQL, of course, but we'll be using the mysql2 gem in Ruby, so I recommend it. Feel free to substitute your favorite DB configuration as appropriate. If you don't know how to install and minimally configure a MySQL/PostgreSQL database (but somehow followed and understood all the code and words until now?), this is the time to learn. Go hit some tutorials, then come back.

Good. Let's make the DB.

Create statement that will probably drive DBAs insane (something something VARCHAR utf-8 something):

CREATE DATABASE twitter;
use twitter;
CREATE TABLE users (
  uid int unique,
  sent boolean,
  token VARCHAR(255),
  token_iv VARCHAR(255),
  screen_name VARCHAR(255),
  screen_name_iv VARCHAR(255),
  secret VARCHAR(255),
  secret_iv VARCHAR(255),
  message TEXT
);

Also, go ahead and install the mysql2 and twitter gems.
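If you like keeping dependencies in a Gemfile, something like this is all it takes (a plain gem install works just as well; this is only a suggestion):

# Gemfile - optional; "gem install mysql2 twitter" works just as well.
source 'https://rubygems.org'

gem 'mysql2'
gem 'twitter'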

Emails and CSV

So, at some point, your client will most likely ask you for a list of emails from the people who have signed up, so that they can send them "targeted messages." Here is how you do that.

I'm assuming Linux/Mac, but I see no reason why this shouldn't work exactly the same on Windows, assuming you have Ruby and Node.js installed.

First, download your Data Pipeline-created emails file (it's JSON, because of course it is) and rename it to something easier to type, like "encrypted_emails". If you look at the contents, it should look like mostly garbage. This is good; AES is working and we're passing all the compliance checks. Let's get that back into a normal format we can send our client so they can use it. Note: best practice would be to send the encrypted file to the client over a secure connection, then have them decrypt the file using a password delivered by a team of highly-trained agents using a briefcase with a dead-man switch. I can afford normal email, so we used that to send the data.

Make a folder in your project space called "local". Inside, place the file decrypt.js, and change the 'ADD PASSWORD' part to the AES password that you saved in the appropriate S3 config bucket. Note: in real life you would put this in an environment variable, both for security and so an accidental code commit doesn't reveal your password.

This code simply takes a string and restores it to its unencrypted state. We're writing it in Node.js because I could not get Ruby and Node to play nicely with their respective crypto libraries, and this way took 10 seconds and was guaranteed to work, and deadlines, people, deadlines.
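I can't reproduce the original file here, but a minimal sketch of what decrypt.js might look like is below. It assumes the encryption side used aes-256-cbc with a key derived by hashing your AES password, hex-encoded ciphertext, and the per-value IV stored next to each field; whatever your encryption Lambda actually did (cipher, key derivation, encodings), mirror it here exactly. It also assumes the script is called as node decrypt.js <ciphertext> <iv> and prints the plaintext, which is how the Ruby sketches later in this post will use it.

// decrypt.js - a sketch, not the exact file from the post.
// Assumptions: aes-256-cbc, hex-encoded ciphertext and IV, and a 32-byte key
// derived by hashing the AES password. Mirror your encryption Lambda exactly.
// Usage: node decrypt.js <hex ciphertext> <hex iv>
const crypto = require('crypto');

const password = process.env.AES_PASSWORD || 'ADD PASSWORD'; // env var in real life
const key = crypto.createHash('sha256').update(password).digest(); // 32 bytes for aes-256

const [ciphertext, iv] = process.argv.slice(2);
if (!ciphertext || !iv) {
  console.error('Usage: node decrypt.js <hex ciphertext> <hex iv>');
  process.exit(1);
}

const decipher = crypto.createDecipheriv('aes-256-cbc', key, Buffer.from(iv, 'hex'));
let plaintext = decipher.update(ciphertext, 'hex', 'utf8');
plaintext += decipher.final('utf8');

// Print the plaintext so a calling script can capture it.
process.stdout.write(plaintext);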

Next, in that same local folder, place the file decrypt-emails.rb. Make sure that the encrypted_emails file is also in the local folder. Read the script for the explanation. Then, in your terminal:

ruby decrypt-emails.rb <S3 email file name here>

You should see a "processed_emails.csv" file as output.

Success! If you are still seeing garbage strings in that document, it is because your AES password is misconfigured. Make sure you have the right stage and password.
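For reference, here's roughly the shape decrypt-emails.rb takes. This is a sketch, not the actual file: it assumes the Data Pipeline export has one JSON object per line with hex-encoded 'email' and 'email_iv' values (your export format and attribute names may differ, so adjust the parsing), and it shells out to the decrypt.js above so both halves share the same crypto code.

# decrypt-emails.rb - a sketch, not the exact file from the post.
# Assumptions: one JSON object per line in the export, with hex-encoded
# 'email' and 'email_iv' keys. Adjust key names/nesting to match your export.
require 'json'
require 'csv'
require 'shellwords'

input_file = ARGV[0] || 'encrypted_emails'

CSV.open('processed_emails.csv', 'w') do |csv|
  csv << ['email']
  File.foreach(input_file) do |line|
    line = line.strip
    next if line.empty?
    item = JSON.parse(line)
    # Shell out to the Node script so decryption matches the Lambda exactly.
    email = `node decrypt.js #{Shellwords.escape(item['email'])} #{Shellwords.escape(item['email_iv'])}`.strip
    csv << [email]
  end
end

puts 'Wrote processed_emails.csv'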

Tweets. All the tweets.

Download the Twitter data from our S3 bucket and rename it to something mysterious and indecipherable like "twitter_data". Move it to our recently-created "local" folder.

We'll need two files here: one to load our file into our prepared MySQL database, and one to send tweets out of that DB.

See the creation statement above if you haven't created the appropriate database/tables on your localhost MySQL yet.

Put the following two files into your "local" folder. The code assumes a MySQL root user with no password; change as appropriate.

s3_to_mysql.rb - Change data where needed

send_tweets.rb - input your Twitter client data and MySQL data as needed, and remember to put the correct AES password in decrypt.js before running. Again, not to beat a dead horse, but please use environment variables in real life.

Feel free to modify that code to read the tweet from a file, enforce the 140-character limit, etc.
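To give you an idea of what you're looking at before you open the real file, here's a rough sketch of send_tweets.rb. It's a sketch under assumptions, not the file itself: it assumes the decrypt.js calling convention from earlier (ciphertext and IV as arguments), a passwordless local root MySQL user, and that "sending a tweet" here means DMing each user from their own account with their stored token and secret. Swap in your real consumer key and secret (from environment variables, ideally).

# send_tweets.rb - a sketch, not the exact file from the post.
# Assumptions: passwordless local root user, the decrypt.js calling convention
# shown earlier, and hex-encoded token/secret/screen_name columns with IVs.
require 'mysql2'
require 'twitter'
require 'shellwords'

message = ARGV.join(' ')
abort 'Usage: ruby send_tweets.rb <message to tweet here>' if message.empty?

# Shared helper: shell out to the same Node script used for the emails.
def decrypt(ciphertext, iv)
  `node decrypt.js #{Shellwords.escape(ciphertext)} #{Shellwords.escape(iv)}`.strip
end

db = Mysql2::Client.new(host: 'localhost', username: 'root', database: 'twitter')
mark_sent = db.prepare('UPDATE users SET sent = true, message = ? WHERE uid = ?')

db.query('SELECT * FROM users WHERE sent IS NOT TRUE').each do |row|
  token       = decrypt(row['token'], row['token_iv'])
  secret      = decrypt(row['secret'], row['secret_iv'])
  screen_name = decrypt(row['screen_name'], row['screen_name_iv'])

  client = Twitter::REST::Client.new do |config|
    config.consumer_key        = 'YOUR CONSUMER KEY'    # environment variables in real life
    config.consumer_secret     = 'YOUR CONSUMER SECRET'
    config.access_token        = token
    config.access_token_secret = secret
  end

  # DM the user from their own account, then record what we sent.
  client.create_direct_message(screen_name, message)
  mark_sent.execute(message, row['uid'])
  puts "Sent to @#{screen_name}"
end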

Note that s3_to_mysql.rb can be used in any project, because there is nothing specific to our project in it. Just make a MySQL table that has all the keys that our DynamoDB table does, and run the script.
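Here's a sketch of what that can look like. Again, this is not the original file: it assumes the export is one JSON object per line whose keys match the table's column names (adjust the parsing if your Data Pipeline output is shaped differently), and it builds the INSERT from each item's keys so nothing project-specific is hard-coded beyond the table name.

# s3_to_mysql.rb - a sketch, not the exact file from the post.
# Assumptions: one JSON object per line in the export, with keys that match
# the target table's column names. Adjust the parsing to your actual export.
require 'json'
require 'mysql2'

input_file = ARGV[0]
abort 'Usage: ruby s3_to_mysql.rb <twitter data file name>' unless input_file

TABLE = 'users' # the table from the CREATE statement above

db = Mysql2::Client.new(host: 'localhost', username: 'root', database: 'twitter')

File.foreach(input_file) do |line|
  line = line.strip
  next if line.empty?
  item = JSON.parse(line)

  # Build the INSERT from whatever keys this item has, so the script stays
  # generic: any table whose columns match the export's keys will work.
  columns      = item.keys
  placeholders = Array.new(columns.size, '?').join(', ')
  insert       = db.prepare("INSERT INTO #{TABLE} (#{columns.join(', ')}) VALUES (#{placeholders})")
  insert.execute(*item.values)
end

puts 'Import complete.'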

Anyway, in the console, run:

ruby s3_to_mysql.rb <s3_twitter data file name>

Your MySQL database should now be populated with data! Then run:

ruby send_tweets.rb <message to tweet here>

The Twitter account(s) that you registered should now have a DM from themselves with your message. If so, congratulations!

Epilogue

I wish I had a cool way to wrap up all these hours of tutorial and frustration along the way, but all I can say is:

It took a long, long time to get all this code down to what you see. If I had to do it again, I could do it much faster. No one at my company even knew if this would work when I started; even the guy who asked me to build it was taking a guess. I didn't see any kind of tutorial on how to connect most of this stuff, and that deeply bothered me. There were so many pitfalls and "gotchas" along the way that I was frustrated to the point of needing to walk away. I am looking directly at you, CloudWatch logging roles for Lambda.

When I started, I had never used Lambda, but I knew how the internet worked, so I could try to piece it together. Now I'm the Lambda/API Gateway authority at my company, and it feels really cool. Don't be afraid to try new technologies, and be even less afraid to ask for help when you need it.

Until next time,

-Rob
