
Zoo Codes

Posted on
Nairobi Stock Exchange Web Scraper (MongoDB Atlas Hackathon 2022 on DEV)

What I built

I built a web scraper using Python (specifically the Scrapy framework and the Beautiful Soup library), with MongoDB Atlas as the database, Atlas Charts for data visualization, and Africa's Talking as the SMS API. The scraper pulls the latest stock prices from the NSE website and stores them in a MongoDB Atlas database. The data is then retrieved from the database and sent to the user via SMS.
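As a rough sketch of the SMS step, here is how the Africa's Talking Python SDK can send a price summary. The username, API key, and recipient are placeholders, and `format_price_sms` is a hypothetical helper, not the project's actual code:

```python
def format_price_sms(prices):
    """Build a short SMS body from a list of (ticker, price) pairs."""
    lines = [f"{ticker}: {price:.2f}" for ticker, price in prices]
    return "NSE close:\n" + "\n".join(lines)


def send_price_sms(prices, recipient, username, api_key):
    """Send the summary via Africa's Talking (requires the africastalking package)."""
    # Imported lazily so the formatter above stays dependency-free
    import africastalking

    africastalking.initialize(username, api_key)
    sms = africastalking.SMS
    return sms.send(format_price_sms(prices), [recipient])
```

In the real project the pairs would come straight out of the Atlas collection before being formatted and sent.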

Additionally, I added CI/CD to the project using GitHub Actions. The pipeline runs the tests and lints the code against Python versions 3.8, 3.9, and 3.10, as well as MongoDB versions 4.4, 5.0, and 6.0. This ensures wide compatibility with different versions of Python and MongoDB.
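Such a test matrix can be sketched roughly like this (the step names and action versions are illustrative, not the project's exact workflow file):

```yaml
# .github/workflows/ci.yml — illustrative sketch of a Python/MongoDB matrix
name: CI
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.8", "3.9", "3.10"]
        mongodb-version: ["4.4", "5.0", "6.0"]
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Start MongoDB
        uses: supercharge/mongodb-github-action@1.8.0
        with:
          mongodb-version: ${{ matrix.mongodb-version }}
      - run: pip install -r requirements.txt
      - run: flake8 . && pytest
```

Each push then runs nine jobs, one per Python/MongoDB combination.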

Category Submission

Choose Your Own Adventure

App Link

GitHub Link

Screenshots

Scrapy running

Scrapy

Charts-1

Charts-3

Sample Chart

HeatMap Chart

Atlas DB

Github Actions

Description

NSE Stock Scraper is a web scraper that pulls the latest stock prices from the NSE website and stores them in a MongoDB Atlas database. It is meant to be the ultimate data-collection tool built with open-source tools.

Link to Source Code

KenMwaura1 / nse-stock-scraper

This is a web scraper utilizing the Scrapy framework, MongoDB, and Africa's Talking to get stock prices for companies listed on the Nairobi Stock Exchange. The project stores the ticker name and price, and can notify you via SMS once properly set up with Africa's Talking.

Daily Stock Price Scraper


Overview

A web scraper utilizing Scrapy to scrape live stock prices from the Nairobi Stock Exchange. The prices are saved in a MongoDB database after each scrape; we use PyMongo to connect to MongoDB Atlas, then use Atlas Charts to visualize the data.

The accompanying article can be found here

Screenshots

App Screenshot

Atlas DB

Charts Dashboard

The actual platform we are scraping is the afx website.

Getting Started

Prerequisites

  • Python and pip (I am currently using 3.9.2); any version above 3.7 should work.
  • An Africa's Talking account
    • An API key and username from your account: create an app and take note of the API key.
  • A MongoDB Atlas account; create a free account here
    • Create a cluster and take note of the connection string.
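These credentials are typically exported as environment variables before running the scraper. The variable names and values below are illustrative; match whatever the project's settings actually read:

```shell
# Illustrative names/values — substitute your own credentials
export AT_USERNAME="sandbox"
export AT_API_KEY="your-api-key"
export MONGODB_URI="mongodb+srv://user:password@cluster0.example.mongodb.net/"
```

Keeping these out of the source tree (e.g. in a `.env` file that is gitignored) avoids committing secrets.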

Installation

Clone this repo

  git clone https://github.com/KenMwaura1/nse-stock-scraper

Step 1

Change into the directory

cd nse-stock-scraper

Step 2

Create a virtual environment (venv) to hold all the…

Permissive License

Background

I have recently been learning more about financial markets and data analysis. I wanted to build something that would help me learn more about the stock market and also let me practice my web-scraping skills. I also wanted to learn more about MongoDB Atlas and how to use it to store data. Among my goals was to utilize Africa's Talking to send notification SMS messages.

How I built it

I built the web scraper using Python and the Scrapy framework, with Beautiful Soup to parse the HTML. MongoDB Atlas stores the data, Atlas Charts visualizes it, and Africa's Talking sends it to the user via SMS.

Scrapy as a framework enforces certain ways to structure your code, which makes it easier to maintain and extend. It also has many built-in features that make things like pagination, following links, and storing data easy. Using MongoDB Atlas was straightforward, as its free tier lets you store up to 512 MB of data. I used Atlas Charts to visualize the data collected into MongoDB Atlas.
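To sketch the storage step: a Scrapy item pipeline along these lines can write each scraped item into Atlas. The class, database, and collection names are illustrative, not the project's actual code, and real use requires `pymongo` and your own connection string:

```python
class MongoPipeline:
    """Sketch of a Scrapy item pipeline that stores scraped prices in MongoDB.

    The URI and database/collection names are placeholders.
    """

    def __init__(self, mongo_uri, mongo_db="nse_data"):
        self.mongo_uri = mongo_uri
        self.mongo_db = mongo_db
        self.client = None
        self.db = None

    def open_spider(self, spider):
        # pymongo is imported lazily so the class can be defined without it
        import pymongo

        self.client = pymongo.MongoClient(self.mongo_uri)
        self.db = self.client[self.mongo_db]

    def close_spider(self, spider):
        self.client.close()

    def process_item(self, item, spider):
        # Insert one document per scraped ticker/price pair
        self.db["stock_prices"].insert_one(dict(item))
        return item
```

Scrapy calls `process_item` for every item the spider yields, so each scrape run appends fresh price documents to the collection.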

In this case, we created a simple spider for the afx website. The spider is responsible for crawling the site, extracting data from it, and following links to other pages. Once data is extracted, it is stored in MongoDB Atlas.
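As an illustration of the extraction step, the per-row parsing can be sketched as a small function. The column order and field names below are assumptions for the sketch, not the real afx table layout:

```python
def parse_price_row(cells):
    """Turn one scraped table row (a list of cell strings) into an item dict.

    Assumes a [ticker, company name, price] column order, which may differ
    from the actual afx table.
    """
    ticker, name, price = cells[:3]
    return {
        "ticker": ticker.strip(),
        "name": name.strip(),
        # Prices may carry thousands separators, e.g. "1,234.50"
        "price": float(price.replace(",", "")),
    }
```

Inside the spider, each table row's cell texts would pass through a helper like this before the resulting item reaches the storage pipeline.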

We then use the data in Atlas Charts to generate different types of visualizations and collect them into a dashboard, as shown below:

Sample Dashboard

The Atlas dashboard provides a lot of useful real-time metrics about the currently running databases.

Atlas metrics

Additional Resources/Info

Scrapy

MongoDB Atlas

Atlas Charts

Africa's Talking

Beautiful Soup

Python

GitHub Actions

What's next for the Project?

  • Fix bugs, and set up a cron job to send the data to the user at a specific time of day.
  • Deploy the project to a cloud hosting platform, e.g. Google Cloud, Azure, etc.
  • Add more features, such as sending the data to the user via email.
  • Containerize the project using Docker.

Let me know what you think of the project. I would love to hear your feedback. Thanks!
Feel free to reach out to me on Twitter or LinkedIn.

Until next time!
