<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Evalyn Njagi</title>
    <description>The latest articles on DEV Community by Evalyn Njagi (@evlyn_njagi_a32f3ab6c984).</description>
    <link>https://dev.to/evlyn_njagi_a32f3ab6c984</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3155419%2Feb527339-2aeb-422b-8fb5-ef2d7ece2a91.png</url>
      <title>DEV Community: Evalyn Njagi</title>
      <link>https://dev.to/evlyn_njagi_a32f3ab6c984</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/evlyn_njagi_a32f3ab6c984"/>
    <language>en</language>
    <item>
      <title>Power BI and REST APIs: A Fun Guide on How to Turn Web Data into Beautiful Dashboards</title>
      <dc:creator>Evalyn Njagi</dc:creator>
      <pubDate>Tue, 11 Nov 2025 11:46:50 +0000</pubDate>
      <link>https://dev.to/evlyn_njagi_a32f3ab6c984/the-power-bi-and-rest-apis-a-fun-guide-on-how-to-turn-web-data-into-beautiful-dashboards-54o5</link>
      <guid>https://dev.to/evlyn_njagi_a32f3ab6c984/the-power-bi-and-rest-apis-a-fun-guide-on-how-to-turn-web-data-into-beautiful-dashboards-54o5</guid>
      <description>&lt;p&gt;Ever wonder how live Power BI dashboards display real-time data like weather, stock prices, or social media trends?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The solution is not complex: REST APIs&lt;/strong&gt; act as invisible bridges, connecting Power BI to an almost infinite world of data found online.&lt;/p&gt;

&lt;p&gt;This guide is intended for users who are new to APIs and Power BI. You will learn what REST APIs are, how Power BI connects to them, how to authenticate, what problems commonly occur, and how to build a real-time weather dashboard showing the current temperature in Nairobi, Mombasa, Kampala, and Kigali.&lt;/p&gt;

&lt;p&gt;By the end, you will have gone from API novice to data explorer.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What Is a REST API?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Imagine a restaurant. You are the customer, the kitchen is the data server, and the waiter is the API. You ask the waiter for something specific, say, the current temperature in Nairobi. The waiter carries your order to the kitchen, collects the information, and brings it back to you on a plate, ready to eat.&lt;/p&gt;

&lt;p&gt;That is precisely what an API does. It lets you query a data source and receive the results in a structured format, typically JavaScript Object Notation (JSON), which Power BI can easily read and convert into visuals.&lt;/p&gt;
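&lt;p&gt;As a quick illustration outside Power BI, here is a minimal Python sketch of that idea; the JSON payload is a made-up example, not a real API response:&lt;/p&gt;

```python
import json

# A simplified, hypothetical payload; a real weather API response has many more fields
payload = '{"name": "Nairobi", "main": {"temp": 22.5, "humidity": 60}}'

# json.loads turns the JSON text into a Python dictionary,
# much as Power BI's Json.Document turns it into a record
data = json.loads(payload)
print(data["name"], data["main"]["temp"])
```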

&lt;p&gt;&lt;strong&gt;Methods of integrating Power BI with a REST API.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Power BI offers several ways to access APIs, depending on the complexity of the API and the type of information required.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Using "Get Data - Web"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is the simplest option.&lt;/p&gt;

&lt;p&gt;Steps:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Home - Get Data - Web (in Power BI Desktop).

Paste your API link. For example:
https://api.openweathermap.org/data/2.5/weather?q=Nairobi&amp;amp;appid=YOURAPIKEY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Power BI will fetch the data and display it as JSON. You can then clean, transform, and visualize it.&lt;/p&gt;

&lt;p&gt;This technique is ideal for simple or public APIs that do not require intricate authentication.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Using Power Query M Code&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Want more control, such as adding parameters, setting headers, or chaining API calls? Power BI's Power Query language (M) lets you write your own script.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;let
    // === Settings ===
    ApiKey = "YOUR API KEY",
    Cities = {"Nairobi,KE", "Mombasa,KE", "Kampala,UG", "Kigali,RW"},

    // === Function to call OpenWeatherMap for a single city ===
    GetWeather = (city as text) =&amp;gt;
        let
            url = "https://api.openweathermap.org/data/2.5/weather?q=" &amp;amp; city &amp;amp; "&amp;amp;appid=" &amp;amp; ApiKey &amp;amp; "&amp;amp;units=metric",
            Source = Json.Document(Web.Contents(url)),
            // convert record to a table so it matches your original expansion logic
            #"Converted to Table" = Table.FromRecords({Source}),
            #"Expanded coord" = Table.ExpandRecordColumn(#"Converted to Table", "coord", {"lon", "lat"}, {"coord.lon", "coord.lat"}),
            #"Expanded weather" = Table.ExpandListColumn(#"Expanded coord", "weather"),
            #"Expanded weather1" = Table.ExpandRecordColumn(#"Expanded weather", "weather", {"id", "main", "description", "icon"}, {"weather.id", "weather.main", "weather.description", "weather.icon"}),
            #"Expanded main" = Table.ExpandRecordColumn(#"Expanded weather1", "main", {"temp", "feels_like", "temp_min", "temp_max", "pressure", "humidity", "sea_level", "grnd_level"}, {"main.temp", "main.feels_like", "main.temp_min", "main.temp_max", "main.pressure", "main.humidity", "main.sea_level", "main.grnd_level"}),
            #"Expanded wind" = Table.ExpandRecordColumn(#"Expanded main", "wind", {"speed", "deg", "gust"}, {"wind.speed", "wind.deg", "wind.gust"}),
            #"Expanded clouds" = Table.ExpandRecordColumn(#"Expanded wind", "clouds", {"all"}, {"clouds.all"}),
            #"Expanded sys" = Table.ExpandRecordColumn(#"Expanded clouds", "sys", {"country", "sunrise", "sunset"}, {"sys.country", "sys.sunrise", "sys.sunset"}),
            #"Changed Type" = Table.TransformColumnTypes(#"Expanded sys",{
                {"coord.lon", type number}, {"coord.lat", type number}, {"weather.id", Int64.Type},
                {"weather.main", type text}, {"weather.description", type text}, {"weather.icon", type text},
                {"main.temp", type number}, {"main.feels_like", type number}, {"main.temp_min", type number},
                {"main.temp_max", type number}, {"main.pressure", Int64.Type}, {"main.humidity", Int64.Type},
                {"wind.speed", type number}, {"wind.deg", Int64.Type}, {"name", type text}
            })
        in
            #"Changed Type",

    // === Call function for each city and combine ===
    WeatherTables = List.Transform(Cities, each GetWeather(_)),
    Combined = Table.Combine(WeatherTables),
    #"Reordered Columns" = Table.ReorderColumns(Combined,{"sys.country", "name", "coord.lon", "coord.lat", "weather.id", "weather.main", "weather.description", "weather.icon", "base", "main.temp", "main.feels_like", "main.temp_min", "main.temp_max", "main.pressure", "main.humidity", "main.sea_level", "main.grnd_level", "visibility", "wind.speed", "wind.deg", "wind.gust", "clouds.all", "dt", "sys.sunrise", "sys.sunset", "timezone", "id", "cod"}),
    #"Renamed Columns" = Table.RenameColumns(#"Reordered Columns",{{"sys.country", "country"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns",{{"country", type text}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"id", "cod", "timezone", "weather.icon"})
in
    #"Removed Columns"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This gives you the flexibility to automate calls, handle multiple cities, and control more sophisticated workflows.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Using a Custom Connector or Azure Function.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For more advanced setups, businesses often use custom Power BI connectors or Azure Functions to handle authentication and data processing before sending clean data to Power BI.&lt;/p&gt;

&lt;p&gt;This is ideal for sensitive, high-volume data requiring frequent updates.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Power BI Authentication.&lt;/strong&gt;&lt;br&gt;
To connect to an API, Power BI must be granted permission to access the data. Power BI supports the common authentication methods below.&lt;/p&gt;

&lt;p&gt;1) Anonymous&lt;/p&gt;

&lt;p&gt;Used for public APIs that do not require any login. You can connect without extra steps.&lt;/p&gt;

&lt;p&gt;2) API Key&lt;/p&gt;

&lt;p&gt;This is the most common technique. You append your key to the API URL or a request header. Power BI stores it securely in Data Source Settings.&lt;/p&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;https://api.openweathermap.org/data/2.5/weather?q=Nairobi&amp;amp;appid=YOURAPIKEY&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;
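&lt;p&gt;In code terms, appending the key is plain query-string construction; here is a small Python sketch using only the standard library (YOURAPIKEY is a placeholder, not a real key):&lt;/p&gt;

```python
from urllib.parse import urlencode

# Build the request URL by appending the key as a query parameter
base = "https://api.openweathermap.org/data/2.5/weather"
params = {"q": "Nairobi", "appid": "YOURAPIKEY", "units": "metric"}

# urlencode joins the parameters and escapes any special characters
url = base + "?" + urlencode(params)
print(url)
```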

&lt;p&gt;3) Basic Authentication&lt;/p&gt;

&lt;p&gt;Some APIs authenticate with a username and password. Power BI securely stores your credentials after you enter them once.&lt;/p&gt;

&lt;p&gt;4) OAuth 2.0&lt;/p&gt;

&lt;p&gt;APIs requiring user authorization, such as Google or Microsoft Graph, use a secure login window for authentication. Once you sign in, Power BI receives a temporary token granting access. Always consult the API documentation to confirm which authentication method is required.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Common Challenges When Importing API Data.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Integrating Power BI with an API can be exciting, but it is important to be aware of potential pitfalls.&lt;/p&gt;

&lt;p&gt;1) Pagination&lt;/p&gt;

&lt;p&gt;Most APIs return only a limited number of records per request. You will need Power Query loops or functions to page through all of the data.&lt;/p&gt;
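&lt;p&gt;The looping pattern looks like this in Python; the paged endpoint is simulated with an in-memory list, since the exact paging scheme varies between APIs:&lt;/p&gt;

```python
# Simulated paged API: each "page" returns at most two records
PAGES = [[{"id": 1}, {"id": 2}], [{"id": 3}, {"id": 4}], [{"id": 5}]]

def fetch_page(page_number):
    """Stand-in for a real HTTP call such as GET /items?page=N."""
    idx = page_number - 1
    if idx in range(len(PAGES)):
        return PAGES[idx]
    return []

# Keep requesting pages until an empty page signals the end
all_records = []
page = 1
while True:
    batch = fetch_page(page)
    if not batch:
        break
    all_records.extend(batch)
    page += 1

print(len(all_records))
```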

&lt;p&gt;2) Rate Limits&lt;/p&gt;

&lt;p&gt;Many APIs cap how frequently you can make requests, e.g. 60 calls per minute. Exceeding the limit can get your key blocked temporarily.&lt;/p&gt;
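&lt;p&gt;A simple client-side throttle helps you stay under such limits; a hedged Python sketch (the 60-calls-per-minute figure is just an example, check your API's documented limit):&lt;/p&gt;

```python
import time

CALLS_PER_MINUTE = 60
MIN_INTERVAL = 60.0 / CALLS_PER_MINUTE  # seconds between successive calls

def throttled(fetch, urls):
    """Call fetch(url) for each URL, sleeping between calls to respect the limit."""
    results = []
    last_call = 0.0
    for url in urls:
        # Sleep only for whatever part of the interval has not yet elapsed
        wait = MIN_INTERVAL - (time.monotonic() - last_call)
        time.sleep(max(0.0, wait))
        last_call = time.monotonic()
        results.append(fetch(url))
    return results
```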

&lt;p&gt;3) Authentication Expiration&lt;/p&gt;

&lt;p&gt;API tokens (such as OAuth tokens) can expire and must be renewed automatically, or scheduled refreshes will fail.&lt;/p&gt;

&lt;p&gt;4) Nested JSON&lt;/p&gt;

&lt;p&gt;API data is often nested in layers, like boxes within boxes. Power BI's Power Query Editor lets you expand and flatten these fields into a clean table.&lt;/p&gt;
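&lt;p&gt;The same flattening can be sketched in Python, producing the dotted column names (coord.lon, main.temp) used by the Expand steps in the M script earlier; the sample record is illustrative:&lt;/p&gt;

```python
def flatten(record, prefix=""):
    """Recursively flatten nested dictionaries into dotted column names,
    mirroring what the Expand steps do in Power Query."""
    flat = {}
    for key, value in record.items():
        name = prefix + "." + key if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

nested = {"coord": {"lon": 36.82, "lat": -1.29}, "main": {"temp": 22.5}, "name": "Nairobi"}
flat = flatten(nested)
print(flat)
```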

&lt;p&gt;5) Cloud Refresh&lt;/p&gt;

&lt;p&gt;When publishing to Power BI Service, ensure your credentials and gateways are properly configured for scheduled refresh.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Integrating OpenWeatherMap API data with Power BI.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To make this practical, let's connect Power BI to the OpenWeatherMap API and fetch the current temperature for Nairobi, Mombasa, Kampala, and Kigali.&lt;/p&gt;

&lt;p&gt;Step 1: Get Your API Key&lt;br&gt;
Create a free account on the OpenWeatherMap site to obtain your API key.&lt;/p&gt;

&lt;p&gt;Copy your unique API key (a long string of letters and numbers).&lt;/p&gt;

&lt;p&gt;Step 2: Develop a Power Query Function.&lt;/p&gt;

&lt;p&gt;In Power BI:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Home -Get Data - Blank Query - Advanced Editor. 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This opens the editor where you can write a reusable function that fetches weather information for any city you pass to it.&lt;/p&gt;

&lt;p&gt;Step 3: Write the code below in the Advanced Editor&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;et
    // === Settings ===
    ApiKey = "Api Key",
    Cities = {"Nairobi,KE", "Mombasa,KE", "Kampala,UG", "Kigali,RW"},

    // === Function to call OpenWeatherMap for a single city ===
    GetWeather = (city as text) =&amp;gt;
        let
            url = "https://api.openweathermap.org/data/2.5/weather?q=" &amp;amp; city &amp;amp; "&amp;amp;appid=" &amp;amp; ApiKey &amp;amp; "&amp;amp;units=metric",
            Source = Json.Document(Web.Contents(url)),
            // convert record to a table so it matches your original expansion logic
            #"Converted to Table" = Table.FromRecords({Source}),
            #"Expanded coord" = Table.ExpandRecordColumn(#"Converted to Table", "coord", {"lon", "lat"}, {"coord.lon", "coord.lat"}),
            #"Expanded weather" = Table.ExpandListColumn(#"Expanded coord", "weather"),
            #"Expanded weather1" = Table.ExpandRecordColumn(#"Expanded weather", "weather", {"id", "main", "description", "icon"}, {"weather.id", "weather.main", "weather.description", "weather.icon"}),
            #"Expanded main" = Table.ExpandRecordColumn(#"Expanded weather1", "main", {"temp", "feels_like", "temp_min", "temp_max", "pressure", "humidity", "sea_level", "grnd_level"}, {"main.temp", "main.feels_like", "main.temp_min", "main.temp_max", "main.pressure", "main.humidity", "main.sea_level", "main.grnd_level"}),
            #"Expanded wind" = Table.ExpandRecordColumn(#"Expanded main", "wind", {"speed", "deg", "gust"}, {"wind.speed", "wind.deg", "wind.gust"}),
            #"Expanded clouds" = Table.ExpandRecordColumn(#"Expanded wind", "clouds", {"all"}, {"clouds.all"}),
            #"Expanded sys" = Table.ExpandRecordColumn(#"Expanded clouds", "sys", {"country", "sunrise", "sunset"}, {"sys.country", "sys.sunrise", "sys.sunset"}),
            #"Changed Type" = Table.TransformColumnTypes(#"Expanded sys",{
                {"coord.lon", type number}, {"coord.lat", type number}, {"weather.id", Int64.Type},
                {"weather.main", type text}, {"weather.description", type text}, {"weather.icon", type text},
                {"main.temp", type number}, {"main.feels_like", type number}, {"main.temp_min", type number},
                {"main.temp_max", type number}, {"main.pressure", Int64.Type}, {"main.humidity", Int64.Type},
                {"wind.speed", type number}, {"wind.deg", Int64.Type}, {"name", type text}
            })
        in
            #"Changed Type",

    // === Call function for each city and combine ===
    WeatherTables = List.Transform(Cities, each GetWeather(_)),
    Combined = Table.Combine(WeatherTables),
    #"Reordered Columns" = Table.ReorderColumns(Combined,{"sys.country", "name", "coord.lon", "coord.lat", "weather.id", "weather.main", "weather.description", "weather.icon", "base", "main.temp", "main.feels_like", "main.temp_min", "main.temp_max", "main.pressure", "main.humidity", "main.sea_level", "main.grnd_level", "visibility", "wind.speed", "wind.deg", "wind.gust", "clouds.all", "dt", "sys.sunrise", "sys.sunset", "timezone", "id", "cod"}),
    #"Renamed Columns" = Table.RenameColumns(#"Reordered Columns",{{"sys.country", "country"}}),
    #"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns",{{"country", type text}}),
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type",{"id", "cod", "timezone", "weather.icon"})
in
    #"Removed Columns"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Step 4: Build the Dashboard&lt;/p&gt;

&lt;p&gt;Load the data into Power BI and build a bar chart, table, or map of the temperatures in the four cities. You can even schedule it to refresh automatically.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frfvazy4lim2ca332316x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frfvazy4lim2ca332316x.png" alt=" " width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Final Thoughts&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;REST APIs open up a world of live data for Power BI users. Whether you want to track weather, financial markets, or social media activity, APIs can feed real-time information directly into your dashboards.&lt;/p&gt;

&lt;p&gt;You’ve learned how Power BI connects to APIs, how it handles authentication, what challenges to watch out for, and how to build a real example using OpenWeatherMap.&lt;/p&gt;

&lt;p&gt;Once you get comfortable with APIs, you’ll realize that Power BI isn’t just about charts and tables; it’s your window into the living, breathing data of the internet.&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>microsoft</category>
      <category>api</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Collecting Africa’s Energy Insights</title>
      <dc:creator>Evalyn Njagi</dc:creator>
      <pubDate>Sun, 12 Oct 2025 17:15:16 +0000</pubDate>
      <link>https://dev.to/evlyn_njagi_a32f3ab6c984/building-the-africa-energy-coverage-data-extractor-2000-2022-22pk</link>
      <guid>https://dev.to/evlyn_njagi_a32f3ab6c984/building-the-africa-energy-coverage-data-extractor-2000-2022-22pk</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction&lt;/strong&gt;&lt;br&gt;
When working with data, access to well-structured and reliable information can make all the difference. I recently developed a project called Africa Energy Coverage Data Extractor, which focuses on collecting energy data for all African countries between 2000 and 2022. This project uses Python and Selenium to automatically gather data from the Africa Energy Portal and then stores it in a format that can be analyzed or uploaded to a cloud database.&lt;/p&gt;

&lt;p&gt;The goal was simple: create a tool that could automatically go through the Africa Energy Portal, capture yearly statistics for every country, and organize everything into one clean dataset.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What the Project Does&lt;/strong&gt;&lt;br&gt;
The Africa Energy Coverage Data Extractor uses Selenium WebDriver to browse through the website just like a human would. It navigates across multiple pages and year sliders to pull together information on different energy indicators.&lt;br&gt;
Once the data is collected, it is saved into a CSV file for local use. You can also choose to upload it to MongoDB Atlas, which is a cloud database that makes it easier to store and manage large datasets.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Main Features&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Here are some of the key things the extractor can do:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Collects energy data for all 54 African countries&lt;/li&gt;
&lt;li&gt;Extracts yearly data from 2000 to 2022&lt;/li&gt;
&lt;li&gt;Handles dynamic website elements using Selenium&lt;/li&gt;
&lt;li&gt;Gathers a wide range of energy indicators such as electricity access, generation, and consumption&lt;/li&gt;
&lt;li&gt;Saves data as a structured CSV file&lt;/li&gt;
&lt;li&gt;Optionally uploads the data to MongoDB Atlas&lt;/li&gt;
&lt;li&gt;Organizes code into simple, modular scripts&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;The Tools Behind the Project&lt;/em&gt;&lt;/strong&gt;&lt;br&gt;
The project was built with tools that are common in data work but powerful when combined:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Python 3.x for programming&lt;/li&gt;
&lt;li&gt;Selenium WebDriver for browser automation&lt;/li&gt;
&lt;li&gt;Pandas and NumPy for data handling&lt;/li&gt;
&lt;li&gt;webdriver-manager for managing browser drivers&lt;/li&gt;
&lt;li&gt;MongoDB Atlas and PyMongo for data storage and access&lt;/li&gt;
&lt;li&gt;Google Chrome as the browser used during automation&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This combination makes it easy to manage both scraping and storage without needing complicated setups.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;How the Project is Organized&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The code is divided into a few Python files that handle different parts of the process:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;scrape.py – contains the main scraping logic&lt;/li&gt;
&lt;li&gt;mongodb.py – manages the connection to MongoDB and data uploads&lt;/li&gt;
&lt;li&gt;main.py – runs the full process from scraping to saving&lt;/li&gt;
&lt;li&gt;requirements.txt – lists the Python packages you need&lt;/li&gt;
&lt;li&gt;README.md – contains documentation&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This structure keeps everything clear and easy to maintain.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Getting Started&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To run the project, you’ll need to have Python 3.8 or higher, Google Chrome, and Git installed.&lt;/p&gt;

&lt;p&gt;After cloning the project repository, you can set up your environment by creating a virtual environment and installing the dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python -m venv venv
source venv/bin/activate    # or venv\Scripts\activate on Windows
pip install -r requirements.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once that’s done, you can run the main script to start scraping:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python main.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will collect the data and save it into a CSV file on your computer.&lt;/p&gt;
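&lt;p&gt;The project itself uses pandas for this step; the final write can be sketched with just the standard library (the column names follow the dataset schema described below, and the figures are invented for illustration):&lt;/p&gt;

```python
import csv
import io

# Illustrative rows shaped like the scraper's output (made-up values)
rows = [
    {"country": "Kenya", "metric": "Access to electricity", "value": 76.5, "year": 2022},
    {"country": "Uganda", "metric": "Access to electricity", "value": 45.2, "year": 2022},
]

# Write the records with a header row, as the scraper does after collection
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["country", "metric", "value", "year"])
writer.writeheader()
writer.writerows(rows)
csv_text = buffer.getvalue()
print(csv_text)
```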

&lt;p&gt;&lt;strong&gt;Connecting to MongoDB Atlas&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you’d like to upload the extracted data to MongoDB, you’ll need to create a free cluster on MongoDB Atlas.&lt;/p&gt;

&lt;p&gt;After that, set up your environment variables with your database connection details:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;MONGO_URI="your_mongodb_connection_uri"
MONGO_DATABASE="AfricaEnergyData"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
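&lt;p&gt;In Python, those variables are typically read with os.environ; a minimal sketch (how mongodb.py actually consumes them is an assumption on my part):&lt;/p&gt;

```python
import os

# Read the connection settings from the environment,
# falling back to a sensible default for the database name
uri = os.environ.get("MONGO_URI", "")
database = os.environ.get("MONGO_DATABASE", "AfricaEnergyData")

if not uri:
    print("MONGO_URI is not set; skipping the MongoDB upload")
```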



&lt;p&gt;Then you can upload the data by running:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python main.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;em&gt;Data in MongoDB&lt;/em&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz55kc9idznwidikc40g1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz55kc9idznwidikc40g1.png" alt=" " width="800" height="284"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The data will be sent to your cloud database, where you can access it anytime.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding the Data&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Each record in the dataset contains information about a specific energy metric for a country and year.&lt;br&gt;
Here’s what each column represents:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;country – name of the country

country_serial – ISO3 country code

metric – energy metric (for example, access to electricity)

value – the numeric value of the metric

unit – the unit of measurement

sector – the main sector category

sub_sector – a more specific classification

sub_sub_sector – an even finer breakdown

source_link – link to the source on the Africa Energy Portal

source – the name of the source (default: Africa Energy Portal)

year – the year of the data (2000–2022)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This makes it easy to work with the data in analytics tools or dashboards later on.&lt;/p&gt;
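&lt;p&gt;For instance, one record might look like the Python dictionary below; only the column names come from the schema above, while every value is invented for illustration:&lt;/p&gt;

```python
# A single record shaped like the dataset schema (illustrative values)
record = {
    "country": "Kenya",
    "country_serial": "KEN",
    "metric": "Access to electricity",
    "value": 76.5,
    "unit": "% of population",
    "sector": "Power",
    "sub_sector": "Access",
    "sub_sub_sector": "National",
    "source_link": "https://africa-energy-portal.org",
    "source": "Africa Energy Portal",
    "year": 2022,
}

# Validate that the record carries exactly the expected columns
expected_columns = {
    "country", "country_serial", "metric", "value", "unit", "sector",
    "sub_sector", "sub_sub_sector", "source_link", "source", "year",
}
assert set(record) == expected_columns
```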

&lt;p&gt;&lt;strong&gt;How It Works in Practice&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The process starts with Selenium opening the Africa Energy Portal. It then loops through each year from 2000 to 2022, collecting data for all available countries. Once all the information is gathered, it’s organized into a CSV file.&lt;/p&gt;

&lt;p&gt;You can decide whether to stop there or upload the data to MongoDB Atlas for storage and sharing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Contact&lt;/em&gt;&lt;/strong&gt;&lt;br&gt;
Author: Evalyn Njagi&lt;br&gt;
Email: &lt;a href="mailto:evalynnjagi02@gmail.com"&gt;evalynnjagi02@gmail.com&lt;/a&gt;&lt;br&gt;
LinkedIn: Evalyn Njagi&lt;/p&gt;

&lt;p&gt;If you are interested in contributing, improving the scraper, or expanding it to new datasets, feel free to connect or open an issue on GitHub.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Final Thoughts&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Working on this project taught me a lot about data automation and cloud storage. It showed how a simple Python workflow can help collect large-scale information in a structured and reusable format.&lt;br&gt;
This tool can be extended for other kinds of regional or sectoral data, and it serves as a practical example of how data engineering can support research and development in different fields.&lt;/p&gt;

</description>
      <category>dataengineering</category>
      <category>data</category>
      <category>automation</category>
      <category>python</category>
    </item>
    <item>
      <title>E-commerce Price Comparison: Real-Time Price Checking with Python</title>
      <dc:creator>Evalyn Njagi</dc:creator>
      <pubDate>Fri, 04 Jul 2025 10:12:30 +0000</pubDate>
      <link>https://dev.to/evlyn_njagi_a32f3ab6c984/e-commerce-price-comparison-real-time-price-checking-with-python-in-2025-3n0l</link>
      <guid>https://dev.to/evlyn_njagi_a32f3ab6c984/e-commerce-price-comparison-real-time-price-checking-with-python-in-2025-3n0l</guid>
      <description>&lt;h3&gt;
  
  
  Introduction:
&lt;/h3&gt;

&lt;p&gt;Have you ever wondered if the “big discount” flashing on your screen is the best deal available?&lt;br&gt;
I have. And like many shoppers today, I was tired of guessing.&lt;/p&gt;

&lt;p&gt;Shopping in 2025 is a thrilling yet confusing experience. Prices bounce up and down, discounts come and go, and no one has time to compare dozens of pages every day.&lt;br&gt;
So I decided to let Python do the hard work for me.&lt;/p&gt;

&lt;h3&gt;
  
  
  Project Overview:
&lt;/h3&gt;

&lt;p&gt;It is no secret that online shopping platforms sometimes display wildly different prices for the same item. With dynamic pricing algorithms and personalized deals, prices for the same product often vary more than most people think. The truth is that comparing prices manually is time-consuming.&lt;/p&gt;

&lt;p&gt;What if we could:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Extract live prices from different stores&lt;/li&gt;
&lt;li&gt;Clean and match messy product names&lt;/li&gt;
&lt;li&gt;Compare them instantly&lt;/li&gt;
&lt;li&gt;See where the real deal is hiding&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This brief project demonstrates how Python can be used to retrieve actual data and perform a price comparison.&lt;/p&gt;

&lt;h3&gt;
  
  
  Project Code
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://github.com/EvalynTheAnalyst/E-Commerce-Price-Comparison.git" rel="noopener noreferrer"&gt;https://github.com/EvalynTheAnalyst/E-Commerce-Price-Comparison.git&lt;/a&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Price Comparison Overview
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://e-commerce-price-comparison-3v4hksca8brzaudtn2jkuc.streamlit.app/#e-commerce-price-tracker" rel="noopener noreferrer"&gt;https://e-commerce-price-comparison-3v4hksca8brzaudtn2jkuc.streamlit.app/#e-commerce-price-tracker&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcn49jx42p6ch5opwjvod.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcn49jx42p6ch5opwjvod.png" alt="Image description" width="800" height="413"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Methodology:
&lt;/h3&gt;

&lt;p&gt;The solution uses Python and a few powerful libraries. The goal is simple: collect product details from two well-known e-commerce sites, clean the data, match similar products, and display the price comparison with clear visuals.&lt;/p&gt;

&lt;p&gt;Here's a step-by-step breakdown: &lt;/p&gt;

&lt;h4&gt;
  
  
  1. Importing Necessary Libraries
&lt;/h4&gt;

&lt;p&gt;The following libraries will be used to retrieve data from e-commerce sites, clean it, and visualize the price comparison.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsvta6doe9xi6mj362hni.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsvta6doe9xi6mj362hni.png" alt="Image description" width="651" height="194"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  2. Scraping Data
&lt;/h4&gt;

&lt;p&gt;Using BeautifulSoup and requests, the script visits product pages and extracts information such as name, price, discount, and reviews.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffboc0ir8vgnt2sit917k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffboc0ir8vgnt2sit917k.png" alt="Image description" width="800" height="609"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  3. Cleaning Data
&lt;/h4&gt;

&lt;p&gt;Pandas is used to clean the price data and handle missing values, which is crucial for reliable analysis.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv4u449rgikyqxnlnf0m6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv4u449rgikyqxnlnf0m6.png" alt="Image description" width="800" height="254"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  4. Matching Products
&lt;/h4&gt;

&lt;p&gt;The same phone or accessory may appear with slightly different names on each site. RapidFuzz helps match them smartly using fuzzy matching.&lt;/p&gt;
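&lt;p&gt;The project uses RapidFuzz; the same idea can be sketched with Python's built-in difflib, which scores string similarity between 0 and 1 (the 0.8 threshold here is an assumed tuning value):&lt;/p&gt;

```python
from difflib import SequenceMatcher

def similarity(a, b):
    # Ratio of matching characters in the two names, case-insensitive
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def best_match(name, candidates, threshold=0.8):
    """Return the most similar candidate name, or None if nothing clears the threshold."""
    score, match = max((similarity(name, c), c) for c in candidates)
    return match if score >= threshold else None

print(best_match("Samsung Galaxy A15 128GB",
                 ["Samsung Galaxy A15 (128 GB)", "Tecno Spark 10"]))
```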

&lt;h4&gt;
  
  
  5. Comparing Prices
&lt;/h4&gt;

&lt;p&gt;Once matched, prices are compared to see which site offers a better deal for the same product.&lt;/p&gt;
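&lt;p&gt;Once the names are matched, the comparison itself is simple arithmetic; a sketch with invented prices and hypothetical site labels:&lt;/p&gt;

```python
# Matched products with one price per site (illustrative figures)
matched = [
    {"product": "Phone A", "site1_price": 1299.0, "site2_price": 1199.0},
    {"product": "Phone B", "site1_price": 850.0, "site2_price": 899.0},
]

# For each pair, record which site is cheaper and by how much
for row in matched:
    diff = row["site1_price"] - row["site2_price"]
    row["cheaper_site"] = "site2" if diff > 0 else "site1"
    row["gap"] = abs(diff)

print(matched[0]["cheaper_site"], matched[0]["gap"])
```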

&lt;h4&gt;
  
  
  6. Visualizing Insights
&lt;/h4&gt;

&lt;p&gt;Finally, the project wraps all this into a Streamlit app. With one click, it scrapes fresh data, cleans it, matches products, and shows a clear table and charts.&lt;/p&gt;

&lt;p&gt;Features include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Matched products with side-by-side prices&lt;/li&gt;
&lt;li&gt;Histogram of price differences&lt;/li&gt;
&lt;li&gt;Average price comparison&lt;/li&gt;
&lt;li&gt;Top 10 largest price gaps&lt;/li&gt;
&lt;li&gt;Option to download results as CSV&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Conclusion:
&lt;/h3&gt;

&lt;p&gt;In today’s market, algorithms quietly adjust prices all the time. Discounts can be real or just marketing. Reviews influence buying decisions, but sometimes the lowest price does not come with the best trust signals.&lt;/p&gt;

&lt;p&gt;By scraping data responsibly and presenting it clearly, shoppers get back control. The tool does not rely on guesswork. It checks actual listings, compares the same products, and shows where you can save money. For anyone who runs an online store, this same idea can help keep an eye on the competition.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Extracting Data from APIs Using Python — A Beginner-Friendly Guide</title>
      <dc:creator>Evalyn Njagi</dc:creator>
      <pubDate>Tue, 13 May 2025 12:04:02 +0000</pubDate>
      <link>https://dev.to/evlyn_njagi_a32f3ab6c984/extracting-data-from-apis-using-python-a-beginner-friendly-guide-4l48</link>
      <guid>https://dev.to/evlyn_njagi_a32f3ab6c984/extracting-data-from-apis-using-python-a-beginner-friendly-guide-4l48</guid>
      <description>&lt;p&gt;In today’s digital world, data is the heartbeat of modern applications and decision-making. Whether you're analyzing stock trends, tracking weather updates, or studying social media patterns, much of this data comes from external sources, often through APIs.&lt;/p&gt;

&lt;p&gt;This article will walk you through how to extract data from an API using Python’s requests library clearly and practically.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Understanding Data Sources&lt;/strong&gt;&lt;br&gt;
Before we dive into coding, let’s first understand where data comes from. Broadly, data sources fall into two categories:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Internal Sources&lt;/strong&gt;&lt;br&gt;
These are datasets generated within an organization — such as financial reports, employee records, or sales logs. For instance, a company’s profit and loss statement or payroll data would be considered internal.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;External Sources&lt;/strong&gt;&lt;br&gt;
These are obtained from outside the organization, often via the internet. Examples include weather updates, social media content, or real-time stock prices accessed using APIs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is an API?&lt;/strong&gt;&lt;br&gt;
API stands for Application Programming Interface. It acts like a bridge that allows two software applications to communicate. When you request information from a website via an API, the server responds with structured data, often in formats like JSON or XML, which you can then use in your application.&lt;/p&gt;

&lt;p&gt;Think of an API as a waiter at a restaurant: you (the client) place your order (the request), the waiter delivers it to the kitchen (the server), and brings back your food (the data).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Use the requests Module?&lt;/strong&gt;&lt;br&gt;
Python’s requests module makes it incredibly simple to send HTTP requests and receive responses. It's widely used for calling APIs because it’s beginner-friendly and powerful.&lt;/p&gt;

&lt;p&gt;To start, install the requests module:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fod41eneas4mnitfoq3uj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fod41eneas4mnitfoq3uj.png" alt="Image description" width="800" height="30"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once installed, import it into your code.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0d9nm34pfo5z68lw7hb9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0d9nm34pfo5z68lw7hb9.png" alt="Image description" width="800" height="32"&gt;&lt;/a&gt;&lt;/p&gt;
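&lt;p&gt;In text form, the two steps shown in the screenshots above are:&lt;/p&gt;

```python
# One-time setup, from a terminal:
#   pip install requests

import requests  # the module is now available to your script

print(requests.__version__)
```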

&lt;p&gt;&lt;strong&gt;Example: Extracting Stock Ticker Data&lt;/strong&gt;&lt;br&gt;
Here’s a sample script that uses the Polygon.io API to retrieve stock ticker data:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff4flux1ltl1ac93sutxo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff4flux1ltl1ac93sutxo.png" alt="Image description" width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Output&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fay6t24kr4034q2tgydx7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fay6t24kr4034q2tgydx7.png" alt="Image description" width="800" height="552"&gt;&lt;/a&gt;&lt;/p&gt;
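&lt;p&gt;A text version of the kind of call shown above (the endpoint follows Polygon.io’s reference-tickers API; the key is a placeholder you’d replace with your own, and the sample payload is abridged for illustration):&lt;/p&gt;

```python
import requests

BASE_URL = "https://api.polygon.io/v3/reference/tickers"

def fetch_tickers(api_key: str, limit: int = 5) -> dict:
    """Request a page of stock tickers and return the parsed JSON payload."""
    params = {"market": "stocks", "limit": limit, "apiKey": api_key}
    response = requests.get(BASE_URL, params=params, timeout=10)
    response.raise_for_status()  # fail loudly on 4xx/5xx instead of silently
    return response.json()

def ticker_symbols(payload: dict) -> list[str]:
    """Pull just the ticker symbols out of the response structure."""
    return [item["ticker"] for item in payload.get("results", [])]

# Shape of a typical response, abridged:
sample = {"status": "OK", "results": [{"ticker": "AAPL"}, {"ticker": "MSFT"}]}
symbols = ticker_symbols(sample)

# With a real key: symbols = ticker_symbols(fetch_tickers("YOUR_API_KEY"))
```

&lt;p&gt;Keeping the request and the parsing in separate functions makes the parsing half easy to test without touching the network.&lt;/p&gt;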

&lt;p&gt;&lt;strong&gt;What Next? Clean and Analyze the Data&lt;/strong&gt;&lt;br&gt;
Once you’ve extracted the data, the next step is to clean it, filter out what you need, and analyze it using tools like pandas, numpy, or even visualization libraries like matplotlib or seaborn.&lt;/p&gt;
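&lt;p&gt;For example, a list of JSON records from an API drops straight into a pandas DataFrame for filtering and analysis (the records below are invented):&lt;/p&gt;

```python
import pandas as pd

# Typical shape of API results: a list of dictionaries
records = [
    {"ticker": "AAPL", "name": "Apple Inc.", "market": "stocks"},
    {"ticker": "MSFT", "name": "Microsoft Corp.", "market": "stocks"},
]

df = pd.DataFrame(records)
stocks = df[df["market"] == "stocks"]  # filter to just what you need
```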

&lt;p&gt;&lt;strong&gt;Final Thoughts&lt;/strong&gt;&lt;br&gt;
Getting data via APIs is a powerful skill that opens up endless possibilities for analysis, automation, and app development. With just a few lines of Python, you can tap into live, real-world data — no spreadsheets required.&lt;/p&gt;

&lt;p&gt;If you're just starting out in data science, engineering, or backend development, working with APIs is one of the most valuable and practical skills you can learn.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
