Bosco Kalinijabo

A Personalized News Summary AI Tool with me.

Have you ever wished you could read the top 5 latest articles all at once, on a single page, in just a few minutes instead of turning pages 🤔? Everything moves fast nowadays and there is a lot to read: newsletters we subscribed to flood our email inboxes every morning, and articles shared with us or popping up on our social media timelines keep piling on. Covering them all can be challenging.

I recently pondered how I could stay updated with the news from home 🇷🇼 while also exploring new interests and learning. There are many excuses for not finding time to read the news (honestly, I don't know mine 🤭). One of the popular online newspapers covering Rwandan news in English is The New Times.

With artificial intelligence (AI) booming in every corner of our lives, it seems [from my view] to be assisting us rather than taking over our jobs (so we are less worried, even though Devin is out there). That's why I have lately been interested in software engineering with AI/ML. A plethora of models are released daily, and platforms like Hugging Face offer free access to a vast collection of models, datasets, demo apps, and an AI/ML research community.

I wanted to read the latest articles, but in brief, summarized form. For summarization I used BART (facebook/bart-large-cnn), a machine learning (ML) model fine-tuned for news summarization.

from transformers import pipeline

# Load the pre-trained BART summarization pipeline
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

ARTICLE = 'put-all-your-contents-here'  # the article text to summarize
# Produce a summary between 30 and 130 tokens, deterministically (no sampling)
summarizer(ARTICLE, max_length=130, min_length=30, do_sample=False)

You can explore more here.
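One wrinkle worth noting: bart-large-cnn can only attend to roughly 1,024 tokens, so a long article has to be truncated or split before it is summarized. Below is a minimal, hypothetical chunking helper (not part of the project as described) that uses word counts as a rough stand-in for the tokenizer's token counts:

```python
def chunk_text(text, max_words=500):
    """Split text into chunks of at most max_words words.

    Word count is only an approximation of the tokenizer's token
    count, but it keeps each chunk safely under the model's limit.
    """
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]
```

Each chunk can then be passed to the summarizer separately, and the partial summaries joined together.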

The model mentioned above summarizes the 5 latest articles, which are extracted by web scraping.

A web scraper automates extracting information from other websites, quickly and accurately. One library that eases the job is BeautifulSoup4, which parses HTML and XML into text. Each extracted article is then summarized in at most 130 tokens.
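As a rough sketch of the scraping step, the snippet below pulls article titles and links out of a listing page with BeautifulSoup4. The tag and class selectors here are assumptions for illustration; the real New Times markup will differ, and in the project the HTML would first be fetched with an HTTP client:

```python
from bs4 import BeautifulSoup

def extract_latest_articles(html, limit=5):
    """Return up to `limit` (title, url) pairs from a listing page.

    The `article a[href]` selector is a placeholder; inspect the
    target site's markup to find the right one.
    """
    soup = BeautifulSoup(html, "html.parser")
    articles = []
    for link in soup.select("article a[href]")[:limit]:
        title = link.get_text(strip=True)
        articles.append((title, link["href"]))
    return articles
```

Each returned URL can then be fetched in turn, its body text parsed out the same way, and handed to the summarizer.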

[Check how it returns the articles with Postman]

To integrate everything, I built the backend API with Django (I wanted to keep it in Python) and the front end, for readers, with React.

Additionally, readers can opt to receive these summarized articles by email at a scheduled time, delivered over SMTP, the application-layer protocol for sending email.
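The email step can be sketched with Python's standard library alone. The addresses, subject line, and SMTP host below are placeholders for illustration, not the project's actual configuration:

```python
import smtplib
from email.mime.text import MIMEText

def build_digest_email(summaries, sender, recipient):
    """Assemble (title, summary) pairs into a plain-text digest message."""
    body = "\n\n".join(f"- {title}\n{summary}" for title, summary in summaries)
    msg = MIMEText(body)
    msg["Subject"] = "Your daily news summary"  # placeholder subject
    msg["From"] = sender
    msg["To"] = recipient
    return msg

def send_digest(msg, host="smtp.example.com", port=587, user=None, password=None):
    """Send the message over SMTP with STARTTLS."""
    with smtplib.SMTP(host, port) as server:
        server.starttls()           # upgrade the connection to TLS
        if user:
            server.login(user, password)
        server.send_message(msg)
```

A scheduler (cron, Celery beat, or similar) can call this at the reader's chosen time each day.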
[Check how they are sent by email: running with Postman, and the emails in the inbox]

In conclusion, many models and tools, such as ChatGPT or Claude, can accomplish similar tasks. This project was an exploration of what different tools can do when combined, and AI/ML made the job much easier. Numerous tools are being developed and released to assist us in our work, from generating graphics to composing audio to enhancing the articles we write. It is up to us to figure out how to integrate these tools into our projects and get the best out of them.

NB: Retrieving all the articles and summarizing them at once takes a few seconds, and a lot still needs to be improved 😉; remember, it was for fun. Here is the link to the project.
