Emily

Progress Report

Progress

So far, my first PR, which added the route /feeds/invalid to the Posts service, has been merged, and I have also sent another PR for issue #2413. Everything is going well, and I have not had any significant challenges working on these issues.

Findings

The hardest part of implementing the route /feeds/delayed in the Posts service was testing my code. When the microservice finished downloading and processing feeds, I checked the Redis server for a delayed feed and found nothing. After asking a question on Telescope's Slack channel, I got an answer from prof. David that this case is unusual: a feed is marked delayed only when the processor receives an HTTP 429 response ("the user has sent too many requests in a given amount of time", i.e. rate limiting). This made it hard to test my new functions against the real-time database.
The workaround suggested by prof. David was to write an integration test that creates delayed feeds on the fly. Surprisingly, those feeds had no time to live when I checked with the ttl command. I found that the setDelayedFeed() function did not set an expire time on the key.

```javascript
// old code — no expiry ended up on the key
setDelayedFeed: (id, seconds) => redis.set(createDelayedFeedKey(id), seconds, 1),

// new code — 'EX', seconds gives the key a time to live in seconds
setDelayedFeed: (id, seconds) => redis.set(createDelayedFeedKey(id), seconds, 'EX', seconds),
```
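To see what the fixed call actually asks Redis to do, here is a minimal sketch (not Telescope code) with a stubbed client that just records its arguments. ioredis passes the trailing arguments through to the SET command, so 'EX', seconds is what attaches the time to live. The key prefix and helper below are hypothetical stand-ins:

```javascript
// Stub client: records the arguments each set() call would send to Redis.
const calls = [];
const redis = { set: (...args) => calls.push(args) };

// Hypothetical key helper, mirroring the createDelayedFeedKey used above.
const createDelayedFeedKey = (id) => `t:delayed:${id}`;

// The fixed version: 'EX', seconds sets an expiry on the key.
const setDelayedFeed = (id, seconds) =>
  redis.set(createDelayedFeedKey(id), seconds, 'EX', seconds);

setDelayedFeed('abc123', 120);
console.log(calls[0]); // ['t:delayed:abc123', 120, 'EX', 120]
```

Running ttl on such a key now returns a positive number of seconds instead of -1 (no expiry).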

Take-away

The biggest take-away for me this time is learning about Redis commands and Readable streams in Node.js.
Streams are one of the core concepts in Node.js, and they are often used when working with a large amount of data.
Calling the scanStream() method on a Redis client actually creates a Readable stream object.

ScanStream: a convenient class that converts the process of scanning keys into a readable stream

My code:

```javascript
const getFeedKeysUsingScanStream = (matchPattern) => {
  const keys = new Set();
  // create a Readable stream that emits batches of matching keys
  const stream = redis.scanStream({
    match: matchPattern,
  });
  return new Promise((resolve, reject) => {
    stream.on('data', (data = []) => {
      data.forEach((k) => keys.add(k));
    });
    stream.on('error', (err) => {
      logger.error({ err }, 'Error while scanning redis keys');
      reject(new Error('Error while scanning redis keys'));
    });
    stream.on('end', () => {
      resolve([...keys]);
    });
  });
};
```

A stream object emits many events: 'close', 'data', 'end', 'error', etc. Because the stream processes data piece by piece, we wrap it inside a Promise and resolve only when there is no more data to consume, that is, on the 'end' event.

In the end, I was able to make the route work, and I am happy with the progress.
