Analyzing recent mentions of a user on Twitter with TensorflowJs Part 1

I recently encountered many small web projects utilizing TensorflowJS, so I decided to give it a go myself. Here I want to use AI to predict how nice or nasty the recent tweets mentioning a user are. So let's break it down into its parts: first, we need to get some tweets from the Twitter API (NodeJs), then we can feed them to our prediction system and present the results in our web app (React).

In this post, we assume that you know the basics of NodeJs; leave the rest to me, and I'll try my best not to bore you with details 😄
You can skip to Part Two here.

Part One: Fetch & serve some tweets

  • Get the keys from Twitter

  • Set up our minimal API

  • Fetch some tweets

Register on the Twitter developer platform

First of all, we need a Twitter account. You can get one here on Twitter.com. Then you should apply for a developer account here; after answering some questions, they'll approve your account. Head to your developer portal and, in the projects and apps section, create an app. Save the Bearer token somewhere at hand.

Setting up our API (please let go of Express)

You can do this in any language available to you, but I'm quite happy with NodeJS. And please don't use ExpressJs anymore; it hasn't seen meaningful maintenance in years. Use Fastify instead: it is syntactically very similar, has many more features, and offers a stable, modern API. It also handles asynchronous code properly (something ExpressJs fundamentally lacks).
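
To give you a quick taste of what I mean by handling asynchronous code, here is a tiny standalone sketch (not part of our project): in Fastify, an async route handler can simply return a value or throw, and the framework takes care of sending the reply or the error.

// a tiny sketch of Fastify's async handling, not part of our project
const fastify = require("fastify")();

// the returned value is serialized and sent as the reply;
// a thrown error or rejected promise becomes a proper error response
fastify.get("/async-example", async () => {
  const data = await Promise.resolve({ hello: "world" });
  return data;
});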

TL;DR

You can find a working example of the API here; just make sure to provide TWITTER_BEARER_TOKEN as an environment variable.

First, we need to initialize our project with:

npm init -y

Then we need to add our dependencies:

npm install fastify fastify-cors --save

We also need some dev dependencies to make our lives easier as developers:

npm install --save-dev nodemon

All you need now is a single .js file (preferably index.js in the root of the project directory) and a script to run it. Go to package.json and add this script:

//package.json
  "scripts": {
    "dev": " nodemon index.js",
  },

Now you can run your code.
Let's add some REST endpoints to see if our setup is working.

// index.js

// initializing the fastify instance
const fastify = require("fastify")({});
// register cors middleware in order to prevent CORS error when we request from our localhost
fastify.register(require("fastify-cors"), {
  origin: true,
});

// to check if it is working
fastify.all("/health-check", (req, reply) => {
  reply.send(`I'm OK!`);
});

const app = async () => {
  try {
    await fastify.listen(process.env.PORT || 4000, "0.0.0.0");
    console.log(`Our beautiful API is working. Let's conquer the world!!`);
  } catch (err) {
    console.log(
      `Our great API shamefully encountered an error on startup, our hope is lost`
    );
    fastify.log.error(err);
    process.exit(1);
  }
};

app();

Type this in your CLI and watch what happens:

npm run dev

Now open http://localhost:4000/health-check in your browser, and you should see a tiny, beautiful "I'm OK!".

Let's fetch 'em :)

It's time to get tweets from the Twitter API, and for that we need the Bearer token we received after Twitter's approval of our developer account. But we can't put it directly in our code; that's unsafe. So we pass it as an environment variable, and for that we need dotenv:

npm install --save-dev dotenv

Make sure to update your dev script accordingly:

//package.json
  "scripts": {
    "dev": " nodemon -r ./node_modules/dotenv/config index.js",
  },

Also create a ".env" file in the root of the project with your token:

# .env
TWITTER_BEARER_TOKEN=someOddLookingCharactersAlongSideEAchOtHer

Now we can use "process.env.TWITTER_BEARER_TOKEN" without compromising our token. Cool.
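
As an optional extra (my own habit, not required for the tutorial), you can fail fast at startup when the token is missing, so a misconfigured environment is obvious right away:

// index.js (optional): bail out early if the token was not provided
if (!process.env.TWITTER_BEARER_TOKEN) {
  console.log(`TWITTER_BEARER_TOKEN is missing, check your .env file`);
  process.exit(1);
}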

Now it's time to set up an endpoint that returns some mentions of a specific user. For that, we need to send HTTP requests to the Twitter API, which is itself a REST API. We could use the NodeJs http module, but for the sake of simplicity and ease we are going to add Axios, a js library built on top of Node's http that makes our lives easier:

npm install --save axios
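
As a small optional refactor (the snippets below keep the explicit headers, so feel free to skip this), you could create a preconfigured Axios instance so the Authorization header isn't repeated on every request:

// index.js (optional): a preconfigured Axios instance for the Twitter API
const axios = require("axios");

const twitterApi = axios.create({
  baseURL: "https://api.twitter.com/2",
  headers: { Authorization: `Bearer ${process.env.TWITTER_BEARER_TOKEN}` },
});

// usage: const { data } = await twitterApi.get(`/users/by/username/${user}?...`);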

Also, let's write some query factory functions to keep the code clean:

// index.js

/**
 * a neat query factory for getting user from twitter API
 * @param {string} user
 * @returns {string}
 */

const getUserQuery = (user) =>
  `https://api.twitter.com/2/users/by/username/${user}?user.fields=id,name,profile_image_url&tweet.fields=id,text,created_at,conversation_id`;

/**
 * a neat query factory for getting user's tweets or mentions from twitter API
 * @param {'mentions'|'tweets'} type
 * @param {string} user
 * @returns {string}
 */
const getTweetsQuery = (type, user) =>
  `https://api.twitter.com/2/users/${user}/${type}?tweet.fields=id,text,created_at,conversation_id&max_results=${
    process.env.MAX_RESULT_COUNT ?? 20
  }`;

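Just to make the output of these factories concrete, here is what they produce for the user we'll query later (assuming MAX_RESULT_COUNT is not set, so the default of 20 is used):

// a quick sanity check of the generated URLs
console.log(getUserQuery("elonmusk"));
// https://api.twitter.com/2/users/by/username/elonmusk?user.fields=id,name,profile_image_url&tweet.fields=id,text,created_at,conversation_id

console.log(getTweetsQuery("tweets", "44196397"));
// https://api.twitter.com/2/users/44196397/tweets?tweet.fields=id,text,created_at,conversation_id&max_results=20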

The comments are JSDoc comments, quite useful in the Visual Studio Code IDE for documentation and type checking (better than TypeScript if you ask me).
I added a type parameter to getTweetsQuery so that we can get either the mentions or the tweets of a user from a single endpoint.
Let's use these in an endpoint:

// index.js
const axios = require("axios");

fastify.post("/tweets", async (req, reply) => {
  // the body may arrive as a raw string (e.g. text/plain) or as already-parsed JSON
  const { user, type } = typeof req.body === "string" ? JSON.parse(req.body) : req.body;

  if (!user) return new Error(`please add a user to the body of the request`);
  if (!type) return new Error(`please add the type of tweets to the body of the request`);

  const {
    data: { data: userData },
  } = await axios.get(getUserQuery(user), {
    headers: {
      Authorization: `Bearer ${process.env.TWITTER_BEARER_TOKEN}`,
    },
  });

  const {
    data: { data: tweets },
  } = await axios.get(getTweetsQuery(type, userData.id), {
    headers: {
      Authorization: `Bearer ${process.env.TWITTER_BEARER_TOKEN}`,
    },
  });

  return { user: userData, tweets };
});

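One thing the endpoint above doesn't cover is what happens when the Twitter API call fails or the username doesn't exist (in that case userData comes back undefined and the second request will throw). A minimal sketch of a global error handler you could register on the same fastify instance looks like this; you may also want to check userData before the second request:

// index.js (optional): a very basic global error handler for failed requests
fastify.setErrorHandler((error, req, reply) => {
  fastify.log.error(error);
  reply.status(500).send({ message: `something went wrong while talking to the Twitter API` });
});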

In order to test it, we should send a POST request, which can be done with curl or a tool like Postman.
Let's get the recent tweets of @elonmusk. For that, send a POST request to http://localhost:4000/tweets with this body payload:

{
    "user": "elonmusk",
    "type": "tweets"
}

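If you prefer the command line over Postman, the same request with curl looks something like this (assuming the API is running locally on port 4000):

curl -X POST http://localhost:4000/tweets \
  -H "Content-Type: application/json" \
  -d '{ "user": "elonmusk", "type": "tweets" }'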

And you should receive a result like this:

{
  "user": {
    "profile_image_url": "https://pbs.twimg.com/profile_images/1423663740344406029/l_-QOIHY_normal.jpg",
    "username": "elonmusk",
    "name": "Elon Musk",
    "id": "44196397"
  },
  "tweets": [
    {
      "created_at": "2021-08-17T14:19:59.000Z",
      "text": "@MKBHD Impressive",
      "conversation_id": "1427633063652102155",
      "id": "1427636326539608077"
    },
    {
      "created_at": "2021-08-16T01:54:52.000Z",
      "text": "@cleantechnica Robyn is great",
      "conversation_id": "1427084772220809216",
      "id": "1427086427674877952"
    },
    {
      "created_at": "2021-08-15T16:05:10.000Z",
      "text": "@OstynHyss @nickwhoward Beta 10 or maybe 10.1. Going to pure vision set us back initially. Vision plus (coarse) radar had us trapped in a local maximum, like a level cap.\n\nPure vision requires fairly advanced real-world AI, but that’s how our whole road system is designed to work: NN’s with vision.",
      "conversation_id": "1426713249710497797",
      "id": "1426938024018038786"
    }
  ]
}


but with many more tweets.

Conclusion

OK, we successfully received some tweets from the Twitter API and safely served them via our Node REST API. In the second part we are going to set up our web app, make a request to our API, process the tweets on the client using TensorflowJs sentiment analysis, and present the results.
