<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Icefloqx Brian</title>
    <description>The latest articles on DEV Community by Icefloqx Brian (@icefloqx_brian).</description>
    <link>https://dev.to/icefloqx_brian</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3454352%2Fbdef0c02-88af-404a-9d91-bcc0e6332f1e.jpg</url>
      <title>DEV Community: Icefloqx Brian</title>
      <link>https://dev.to/icefloqx_brian</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/icefloqx_brian"/>
    <language>en</language>
    <item>
      <title>Connecting Power BI to REST API Endpoints.</title>
      <dc:creator>Icefloqx Brian</dc:creator>
      <pubDate>Wed, 12 Nov 2025 10:26:19 +0000</pubDate>
      <link>https://dev.to/icefloqx_brian/connecting-power-bi-to-rest-apis-endpoints-46oc</link>
      <guid>https://dev.to/icefloqx_brian/connecting-power-bi-to-rest-apis-endpoints-46oc</guid>
      <description>&lt;p&gt;APIs power almost everything today — from live weather dashboards to business systems.&lt;br&gt;
And when you combine them with Power BI, you unlock real-time, automated reporting from virtually any web data source.&lt;/p&gt;

&lt;p&gt;In this article, we'll explore:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Different ways to connect Power BI to a REST API endpoint.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;How Power BI handles authentication when connecting to a web API.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Challenges you might face when importing data from an API into Power BI.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Different ways to connect Power BI to a REST API endpoint.&lt;/em&gt;&lt;br&gt;
There are several approaches to connecting to APIs, ranging from simple one-off requests to fully dynamic connections.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;1. Simple GET requests using the Web connector.&lt;/em&gt;&lt;br&gt;
This is one of the easiest ways to fetch data from an endpoint, especially a public API.&lt;br&gt;
In Power BI:&lt;br&gt;
          - Go to Get Data → Web&lt;br&gt;
          - Paste your API endpoint (e.g. &lt;a href="https://api.openweathermap.org/data/2.5/weather?q=Nairobi&amp;amp;appid=YOUR_API_KEY&amp;amp;units=metric" rel="noopener noreferrer"&gt;https://api.openweathermap.org/data/2.5/weather?q=Nairobi&amp;amp;appid=YOUR_API_KEY&amp;amp;units=metric&lt;/a&gt;)&lt;br&gt;
          - Click OK&lt;br&gt;
Power BI should automatically detect and parse the JSON response.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;2. Power Query (M language)&lt;/em&gt;&lt;br&gt;
Here you fetch the JSON response and parse it yourself, dynamically.&lt;br&gt;
This works better when you want to loop through endpoints or build custom logic.&lt;br&gt;
&lt;code&gt;let&lt;br&gt;
    Source = Json.Document(&lt;br&gt;
        Web.Contents("https://api.openweathermap.org/data/2.5/weather",&lt;br&gt;
            [Query = [&lt;br&gt;
                q = "Nairobi",&lt;br&gt;
                appid = "YOUR_API_KEY",&lt;br&gt;
                units = "metric"&lt;br&gt;
            ]]&lt;br&gt;
        )&lt;br&gt;
    ),&lt;br&gt;
    main = Source[main],&lt;br&gt;
    temperature = main[temp]&lt;br&gt;
in&lt;br&gt;
    temperature&lt;br&gt;
&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;3. Using custom connectors.&lt;/em&gt;&lt;br&gt;
You can write your own connector with the Power Query SDK (in M) and package it, which is useful for complex APIs or for reuse across reports.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;4. Using Python/R scripts.&lt;/em&gt;&lt;br&gt;
Power BI also supports Python and R, so you can use them to make API calls.&lt;br&gt;
For example, in Python:&lt;br&gt;
&lt;code&gt;import requests&lt;br&gt;
import pandas as pd&lt;br&gt;
&lt;br&gt;
url = "https://api.openweathermap.org/data/2.5/weather"&lt;br&gt;
params = {"q": "Nairobi", "appid": "YOUR_API_KEY", "units": "metric"}&lt;br&gt;
&lt;br&gt;
response = requests.get(url, params=params)&lt;br&gt;
df = pd.json_normalize(response.json())&lt;br&gt;
&lt;/code&gt;&lt;br&gt;
You can then use this script as a data source in Power BI (Get Data → Python script).&lt;/p&gt;

&lt;h2&gt;
  
  
  How Power BI Handles Authentication When Connecting to APIs.
&lt;/h2&gt;

&lt;p&gt;When you connect Power BI to a web API, it prompts you to choose an authentication method. Supported types include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Anonymous: no credentials required; mostly for public APIs.&lt;/li&gt;
&lt;li&gt;Basic: a username and password are sent with each request.&lt;/li&gt;
&lt;li&gt;Web API: an API key or token is required, passed via a query parameter or header.&lt;/li&gt;
&lt;li&gt;Windows: uses your Windows credentials, mainly for on-premises services.&lt;/li&gt;
&lt;li&gt;Organizational account: signs in with a Microsoft work or school account (OAuth).&lt;/li&gt;
&lt;/ul&gt;
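
&lt;p&gt;Outside Power BI, the two most common key-passing patterns look like this in Python. This is a minimal sketch: the endpoint and key are made up, and it only builds the request pieces rather than calling a real service:&lt;/p&gt;

```python
from urllib.parse import urlencode

base_url = "https://api.example.com/v1/weather"  # hypothetical endpoint
api_key = "YOUR_API_KEY"                         # placeholder, never a real key

# Pattern 1: key in the query string (what many weather APIs expect).
# Note the key ends up visible in the URL itself.
query_url = base_url + "?" + urlencode({"q": "Nairobi", "appid": api_key})

# Pattern 2: key in a request header, which keeps it out of the URL.
headers = {"Authorization": "Bearer " + api_key}

print(query_url)
```

&lt;p&gt;Power BI's Web API authentication roughly corresponds to these patterns; header-based keys are generally safer because query URLs can end up in logs.&lt;/p&gt;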

&lt;h2&gt;
  
  
  Challenges faced when importing API data into Power BI.
&lt;/h2&gt;

&lt;p&gt;Working with APIs inside Power BI comes with a few hurdles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Authentication - API keys or tokens may expire between scheduled refreshes.&lt;/li&gt;
&lt;li&gt;Pagination - large datasets are often split across multiple pages, so you must loop through them.&lt;/li&gt;
&lt;li&gt;Rate limits - APIs may cap the number of requests allowed per hour or per day.&lt;/li&gt;
&lt;li&gt;API key exposure - keys embedded in query URLs can be visible in the query text, which is risky.&lt;/li&gt;
&lt;li&gt;Complex JSON structures - deeply nested objects can be tedious to expand into tables.&lt;/li&gt;
&lt;li&gt;Dynamic endpoints - some URLs change with each request.&lt;/li&gt;
&lt;/ul&gt;
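
&lt;p&gt;Of these, pagination usually just means looping until the API returns an empty page. Here is a minimal sketch of that loop in Python, with a stubbed &lt;code&gt;fetch_page&lt;/code&gt; standing in for the real HTTP call (the page numbering and page size are assumptions, since every API paginates differently):&lt;/p&gt;

```python
def fetch_page(page, page_size=3):
    """Stub for an HTTP call like GET /records?page=N; returns one page of rows."""
    all_rows = list(range(8))  # pretend the API holds 8 records in total
    start = page * page_size
    return all_rows[start:start + page_size]

def fetch_all():
    rows, page = [], 0
    while True:
        batch = fetch_page(page)
        if not batch:          # an empty page means we've read everything
            break
        rows.extend(batch)
        page += 1
    return rows

print(len(fetch_all()))  # 8
```

&lt;p&gt;The same while-loop shape can be written in Power Query with &lt;code&gt;List.Generate&lt;/code&gt; if you want to stay inside Power BI.&lt;/p&gt;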

&lt;h2&gt;
  
  
  Integrating the OpenWeather API the easy way.
&lt;/h2&gt;

&lt;p&gt;Step 1: Create a blank query.&lt;br&gt;
 In Power BI:&lt;/p&gt;

&lt;p&gt;Home -&amp;gt; Get Data -&amp;gt; Web -&amp;gt; Basic&lt;/p&gt;

&lt;p&gt;In the URL field under Basic, paste a URL like this: &lt;code&gt;https://api.openweathermap.org/data/2.5/weather?q=Nairobi&amp;amp;appid={API_KEY}&amp;amp;units=metric&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;This returns the current weather for Nairobi as a table.&lt;br&gt;
Repeat the same process for the other cities, then append the tables into one.&lt;br&gt;
After fetching all the data, clean it: remove unnecessary columns and rename columns where needed.&lt;/p&gt;
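
&lt;p&gt;If you prefer to see the flatten-and-append step outside Power BI, here is a small Python sketch that turns two OpenWeather-style JSON responses into the rows of a single table. The payloads are made-up samples, not live API data:&lt;/p&gt;

```python
import json

# Sample payloads shaped like OpenWeather responses (values are made up).
responses = [
    json.loads('{"name": "Nairobi", "main": {"temp": 21.5, "humidity": 60}}'),
    json.loads('{"name": "Mombasa", "main": {"temp": 28.1, "humidity": 75}}'),
]

def flatten(payload):
    """Pull the nested fields we care about into one flat row."""
    return {
        "city": payload["name"],
        "temp_c": payload["main"]["temp"],
        "humidity": payload["main"]["humidity"],
    }

table = [flatten(r) for r in responses]  # the "appended" table
print(table[0]["city"], table[1]["city"])  # Nairobi Mombasa
```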

&lt;h2&gt;
  
  
  Visualizing the weather data.
&lt;/h2&gt;

&lt;p&gt;Below is just a page of the data:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fua03tz2qizwzbamec3m4.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fua03tz2qizwzbamec3m4.jpg" alt=" " width="800" height="389"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Thoughts.
&lt;/h2&gt;

&lt;p&gt;Integrating APIs with Power BI unlocks live, dynamic dashboards powered by real-world data.&lt;/p&gt;

</description>
      <category>api</category>
      <category>restapi</category>
      <category>datascience</category>
      <category>data</category>
    </item>
    <item>
      <title>Understanding Central Tendencies in Data</title>
      <dc:creator>Icefloqx Brian</dc:creator>
      <pubDate>Mon, 22 Sep 2025 15:15:44 +0000</pubDate>
      <link>https://dev.to/icefloqx_brian/understanding-central-tendencies-in-data-3ebc</link>
      <guid>https://dev.to/icefloqx_brian/understanding-central-tendencies-in-data-3ebc</guid>
      <description>&lt;p&gt;Central tendency in statistical analysis involves summarizing datasets through key measures: the mean, median, and mode. These measures offer distinct methods for representing the "center" of data, but their utility depends on the nature of the data distribution. &lt;/p&gt;

&lt;p&gt;The mean, the arithmetic average of a dataset, is the most commonly used measure, but it can be heavily influenced by outliers and skewed distributions. &lt;/p&gt;

&lt;p&gt;The median, the middle value when the data are arranged in order, is robust against extreme values and particularly useful for skewed data.&lt;/p&gt;

&lt;p&gt;The mode, the value that occurs most often in a dataset, highlights the most frequent values and is especially useful for categorical data. &lt;/p&gt;
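
&lt;p&gt;All three measures are available in Python's built-in &lt;code&gt;statistics&lt;/code&gt; module. With a small made-up sample containing one outlier, you can see the mean get dragged upward while the median and mode stay put:&lt;/p&gt;

```python
import statistics

data = [2, 3, 3, 4, 100]  # small sample with one extreme outlier

print(statistics.mean(data))    # 22.4 -- pulled far up by the outlier
print(statistics.median(data))  # 3 -- unaffected by the extreme value
print(statistics.mode(data))    # 3 -- the most frequent value
```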

&lt;p&gt;We further examine how measures of central tendency interact with distribution shapes such as normal, skewed, and kurtotic distributions. Understanding these interactions is critical in selecting the appropriate measure for accurate data interpretation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Importance of the use of central tendencies&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;They help us detect outliers from our datasets.&lt;/p&gt;

&lt;p&gt;They help us simplify complex data.&lt;/p&gt;

&lt;p&gt;Central tendency measures make it easier to communicate findings to stakeholders.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Conclusion&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Central tendencies are vital because they summarize, simplify, and clarify data; put simply, they show the "normal" in the data. They provide a quick way to understand what is typical, allow for meaningful comparisons, reveal outliers, and support better decisions, whether in business, healthcare, education, or science.&lt;/p&gt;

</description>
      <category>learning</category>
      <category>datascience</category>
    </item>
    <item>
      <title>RELATIONSHIPS IN POWER BI</title>
      <dc:creator>Icefloqx Brian</dc:creator>
      <pubDate>Sat, 20 Sep 2025 14:29:33 +0000</pubDate>
      <link>https://dev.to/icefloqx_brian/relationships-in-power-bi-257b</link>
      <guid>https://dev.to/icefloqx_brian/relationships-in-power-bi-257b</guid>
      <description>&lt;p&gt;&lt;strong&gt;How relationships work in Power BI.&lt;/strong&gt;&lt;br&gt;
In Power BI, relationships connect different tables of data, allowing you to analyze them together as if they were one through keys.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Types of relationships.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;One-to-many (1:*)&lt;br&gt;
A single record on the "one" side (the lookup or dimension table) can relate to many records on the "many" side (the fact table). Example: one product → many sales transactions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;One-to-one (1:1)&lt;br&gt;
A single record in the first table is related to one and only one record in the second table.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Many-to-many (*:*)&lt;br&gt;
Multiple records in one table can relate to multiple records in the other. This type is generally handled using a "bridge" or "junction" table that links the two primary tables. &lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
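
&lt;p&gt;The same key-based idea can be sketched outside Power BI. Here is a tiny Python illustration, with hypothetical product and sales data, of a one-to-many relationship resolved through a key:&lt;/p&gt;

```python
# Dimension (lookup) table: one row per product.
products = {101: "Keyboard", 102: "Mouse"}

# Fact table: many sales rows, each carrying a product key.
sales = [
    {"sale_id": 1, "product_id": 101, "amount": 25.0},
    {"sale_id": 2, "product_id": 101, "amount": 27.5},
    {"sale_id": 3, "product_id": 102, "amount": 10.0},
]

# The "relationship": each fact row looks up its one matching dimension row.
joined = [
    {"sale_id": s["sale_id"], "product": products[s["product_id"]], "amount": s["amount"]}
    for s in sales
]
print([row["product"] for row in joined])  # ['Keyboard', 'Keyboard', 'Mouse']
```

&lt;p&gt;In Power BI terms, &lt;code&gt;products&lt;/code&gt; is the "one" side, &lt;code&gt;sales&lt;/code&gt; is the "many" side, and the product id is the key that defines the relationship.&lt;/p&gt;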

</description>
    </item>
    <item>
      <title>EXCEL IN REAL WORLD.</title>
      <dc:creator>Icefloqx Brian</dc:creator>
      <pubDate>Sat, 20 Sep 2025 13:43:06 +0000</pubDate>
      <link>https://dev.to/icefloqx_brian/excel-in-real-world-1cba</link>
      <guid>https://dev.to/icefloqx_brian/excel-in-real-world-1cba</guid>
      <description>&lt;p&gt;Excel is a universal known tool by how it handles data and also how it's used to analyze data. &lt;br&gt;
Below are some of the ways excel is used in real world;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Finance industry&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Tracking profits and losses to measure growth.&lt;/li&gt;
&lt;li&gt;Forecasting and monitoring a company's financial discipline.&lt;/li&gt;
&lt;/ul&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Human resources&lt;/p&gt;

&lt;p&gt;In companies, HR is tasked with tracking everything about employees, including how they are paid. HR teams commonly use Excel for this and are responsible for all that data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Sales &amp;amp; marketing&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Customer and sales tracking.&lt;/li&gt;
&lt;li&gt;Campaign performance dashboards.&lt;/li&gt;
&lt;li&gt;Market research survey data.&lt;/li&gt;
&lt;/ul&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;3 Excel functions/formulas&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;VLOOKUP&lt;br&gt;
Use VLOOKUP to pull related data when you have two lists that share a common identifier.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CONCATENATE formulas&lt;br&gt;
Suppose you want a worker's name and where they live together; you can combine the two in a separate cell using this formula.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Pivot tables&lt;br&gt;
Suppose you want to check how a product is doing across different states or cities; a pivot table lets you analyze that and see the results at a glance.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
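
&lt;p&gt;For readers more at home in code, the three formulas have rough Python analogues. This is only a sketch with made-up employee and sales data, not a replacement for the Excel versions:&lt;/p&gt;

```python
# VLOOKUP analogue: find a related value via a shared identifier.
salaries = {"E01": 50000, "E02": 62000}
print(salaries["E02"])  # like =VLOOKUP("E02", table, 2, FALSE)

# CONCATENATE analogue: combine a worker's name and city into one value.
name, city = "Amina", "Nairobi"
print(name + " - " + city)  # like =CONCATENATE(A2, " - ", B2)

# Pivot-table analogue: total sales per city.
sales = [("Nairobi", 100), ("Mombasa", 80), ("Nairobi", 50)]
totals = {}
for region, amount in sales:
    totals[region] = totals.get(region, 0) + amount
print(totals)  # {'Nairobi': 150, 'Mombasa': 80}
```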

&lt;p&gt;&lt;strong&gt;&lt;em&gt;PERSONAL REFLECTION&lt;/em&gt;&lt;/strong&gt;&lt;br&gt;
Before I enrolled in this course, I had a hint of how Excel is used but didn't know it was this broad. With the concepts of functions and visualization, I now see it all differently.&lt;/p&gt;

&lt;p&gt;Excel has taught me the importance of cleaning data, which will shape how I approach data in the future. And with the help of visualization, I can now surface the important takeaways instead of hunting for them manually in a dataset.&lt;/p&gt;

&lt;p&gt;In conclusion, I now see data differently. I used to be overwhelmed by the size of datasets, but now I approach them as an opportunity, since handling messy, large data is normal for me.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>RAG FOR DUMMIES</title>
      <dc:creator>Icefloqx Brian</dc:creator>
      <pubDate>Mon, 15 Sep 2025 10:39:28 +0000</pubDate>
      <link>https://dev.to/icefloqx_brian/rag-for-dummies-4bpm</link>
      <guid>https://dev.to/icefloqx_brian/rag-for-dummies-4bpm</guid>
      <description>&lt;p&gt;&lt;strong&gt;&lt;em&gt;What is RAG?&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Retrieval-augmented generation, or RAG, is a framework that combines the strengths of retrieval-based systems and generation-based models to produce more accurate &amp;amp; contextualized relevant responses. It does this through the following core components:&lt;/p&gt;

&lt;p&gt;Retrieval: relevant data is identified &amp;amp; retrieved from an external data source based on a user query.&lt;/p&gt;

&lt;p&gt;Augmentation: the retrieved data and the user query are combined into a prompt to provide the model with context for the generation step.&lt;/p&gt;

&lt;p&gt;Generation: the model generates output from the augmented prompt, using the context to drive a more accurate and relevant response.&lt;/p&gt;

&lt;h2&gt;
  
  
  How RAG functions.
&lt;/h2&gt;

&lt;p&gt;The first thing RAG does is take all the data and break it down into smaller pieces that we call “chunks”. &lt;/p&gt;

&lt;p&gt;These chunks are then converted into searchable vectors through a process known as embedding, and the vectors are stored in a vector database.&lt;/p&gt;

&lt;p&gt;Once in the database, which acts as our knowledge base, the system can always find the most similar content when asked a question.&lt;/p&gt;

&lt;p&gt;We then build a system that answers questions only from the given documents.&lt;/p&gt;

&lt;p&gt;The RAG magic: Instead of just guessing, the system will first search your documents for relevant information, then use that information to generate accurate answers.&lt;/p&gt;

&lt;p&gt;We'll then ask it questions. If it answers them from the given context, we can conclude that the RAG pipeline is functioning.&lt;/p&gt;
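
&lt;p&gt;To make the flow concrete, here is a toy retriever in Python. A bag-of-words similarity stands in for a real embedding model and vector database, and the chunks and query are made up; it is only meant to show the retrieve-then-augment shape:&lt;/p&gt;

```python
import math
from collections import Counter

# Our "knowledge base": already-chunked documents.
chunks = [
    "The refund policy allows returns within 30 days of purchase.",
    "Our office is open Monday to Friday, nine to five.",
    "Shipping is free for orders above fifty dollars.",
]

def embed(text):
    """Toy 'embedding': a bag-of-words vector (real RAG uses a neural model)."""
    return Counter(text.lower().replace(".", "").replace(",", "").split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

def retrieve(query):
    """Retrieval step: return the chunk most similar to the query."""
    q = embed(query)
    return max(chunks, key=lambda c: cosine(q, embed(c)))

# Augmentation step: the retrieved chunk becomes context for the generator.
context = retrieve("How many days for returns after purchase")
prompt = "Answer using only this context: " + context
print(context)  # the refund-policy chunk
```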

</description>
      <category>python</category>
      <category>beginners</category>
      <category>career</category>
      <category>vscode</category>
    </item>
    <item>
      <title>TRADING TYPE I &amp; TYPE II ERRORS</title>
      <dc:creator>Icefloqx Brian</dc:creator>
      <pubDate>Wed, 03 Sep 2025 07:33:37 +0000</pubDate>
      <link>https://dev.to/icefloqx_brian/trading-type-i-type-ii-errors-12fh</link>
      <guid>https://dev.to/icefloqx_brian/trading-type-i-type-ii-errors-12fh</guid>
      <description>&lt;p&gt;&lt;em&gt;&lt;u&gt;Where to Trade Off Type I and Type II Errors: A Medical Scenario&lt;/u&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In statistical decision-making, especially in medical research and diagnostics, two critical risks must be considered: &lt;strong&gt;Type I errors&lt;/strong&gt; (false positives) and &lt;strong&gt;Type II errors&lt;/strong&gt; (false negatives). Understanding when and how to trade one off against the other is essential for optimal decision-making, particularly when the stakes involve human health.&lt;/p&gt;

&lt;p&gt;&lt;u&gt;Defining Type I and Type II Errors&lt;/u&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Type I error (false positive)&lt;/strong&gt;: concluding that an effect or condition exists when it actually does not.&lt;br&gt;
&lt;em&gt;Example in medicine&lt;/em&gt;: diagnosing a patient with a disease they do not have.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Type II error (false negative)&lt;/strong&gt;: failing to detect an effect or condition that actually exists.&lt;br&gt;
&lt;em&gt;Example in medicine&lt;/em&gt;: missing a diagnosis in a patient who actually has the disease.&lt;/p&gt;

&lt;p&gt;&lt;u&gt;The Medical Use Case: Screening for a Life-Threatening Disease&lt;/u&gt;&lt;/p&gt;

&lt;p&gt;Imagine a screening test for &lt;strong&gt;early-stage pancreatic cancer&lt;/strong&gt;—a disease that is rare but deadly if undetected. The test’s accuracy can be adjusted by changing its &lt;strong&gt;threshold for detection&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;- Lower threshold → more likely to detect true cases (lower Type II error), but risks more false positives (higher Type I error).&lt;br&gt;
- Higher threshold → fewer false alarms (lower Type I error), but a greater chance of missing real cases (higher Type II error).&lt;/p&gt;
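
&lt;p&gt;This trade-off is easy to see numerically. A toy Python illustration with made-up biomarker scores (not real clinical data), counting the two error types at a low and a high threshold:&lt;/p&gt;

```python
# Hypothetical test scores: higher suggests disease.
diseased = [0.62, 0.70, 0.55, 0.81, 0.48]  # patients who truly have the disease
healthy = [0.30, 0.45, 0.52, 0.20, 0.38]   # patients who do not

def error_counts(threshold):
    """A score at or above the threshold is flagged 'positive'."""
    true_pos = sum(1 for s in diseased if s >= threshold)
    false_neg = len(diseased) - true_pos                   # Type II errors
    false_pos = sum(1 for s in healthy if s >= threshold)  # Type I errors
    return false_pos, false_neg

fp_low, fn_low = error_counts(0.40)    # low threshold: catch every case
fp_high, fn_high = error_counts(0.60)  # high threshold: avoid false alarms
print(fp_low, fn_low, fp_high, fn_high)  # 2 0 0 2
```

&lt;p&gt;Lowering the threshold removed the missed cases at the cost of two false alarms; raising it did the exact reverse.&lt;/p&gt;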

&lt;p&gt;&lt;u&gt;When to Trade Off&lt;/u&gt;&lt;br&gt;
The trade-off depends on "the consequences of each type of error".&lt;/p&gt;

&lt;p&gt;1. When to Accept More Type I Errors (False Positives)&lt;br&gt;
If the cost of missing a diagnosis is extremely high (e.g., delayed treatment leads to death), we tolerate more false positives to ensure almost no true cases are missed.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In our pancreatic cancer example, a false positive may lead to additional tests, anxiety, and medical expenses—but a false negative could mean a missed early diagnosis, drastically reducing survival chances.&lt;/li&gt;
&lt;li&gt;Here, we prioritize sensitivity(detecting all true cases) over specificity, meaning we accept more Type I errors.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;2. When to Accept More Type II Errors (False Negatives)&lt;br&gt;
If treatment is invasive, risky, or expensive, and a false positive could cause serious harm, we may prefer to miss some cases rather than risk unnecessary intervention.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Example: A surgery with high complication risk—only patients with strong evidence of disease should undergo it.&lt;/li&gt;
&lt;li&gt;In such cases, we prioritize specificity (avoiding false alarms) over sensitivity.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Balancing with Statistical Thresholds&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In hypothesis testing, the balance between these errors is often managed by:&lt;/p&gt;

&lt;p&gt;-Significance level (α) → Probability of Type I error (commonly 0.05).&lt;br&gt;
-Power (1 – β) → Probability of detecting a true effect (avoiding Type II error).&lt;/p&gt;

&lt;p&gt;In medical contexts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Life-threatening but treatable disease → higher α (e.g., 0.10) to increase sensitivity, accepting more false positives.&lt;/li&gt;
&lt;li&gt;Dangerous or costly treatment → lower α (e.g., 0.01) to reduce false positives.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;u&gt;Conclusion&lt;/u&gt;&lt;/p&gt;

&lt;p&gt;In medicine, the decision to trade off Type I and Type II errors is not purely statistical—it is ethical, clinical, and context-dependent. For a disease like pancreatic cancer, the cost of missing a true case is far greater than the cost of a false alarm. Therefore, clinicians and policymakers often choose to &lt;strong&gt;err on the side of caution&lt;/strong&gt;, accepting more false positives to save lives. However, in situations where treatment risks outweigh the benefits, the trade-off shifts, favoring fewer false positives even at the expense of missing some true cases.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>SUPERVISED LEARNING AND ITS RELATION TO CLASSIFICATION.</title>
      <dc:creator>Icefloqx Brian</dc:creator>
      <pubDate>Sun, 24 Aug 2025 21:56:07 +0000</pubDate>
      <link>https://dev.to/icefloqx_brian/supervised-learning-and-its-relation-to-classification-11l9</link>
      <guid>https://dev.to/icefloqx_brian/supervised-learning-and-its-relation-to-classification-11l9</guid>
      <description>&lt;p&gt;Imagine you're teaching a child to recognize different types of animals. You show them a picture of a cat and say, "That's a cat." You show them a dog and say, "That's a dog." After many examples, the child can look at a new picture they've never seen before and correctly identify whether it's a cat or a dog.&lt;br&gt;
This is the key concept to what machine learning is...&lt;br&gt;
&lt;strong&gt;What is supervised learning?&lt;/strong&gt;&lt;br&gt;
Supervised learning is a paradigm in ML where a model learns from labeled data and, in return, predicts the labels for new data. One of the main tasks under supervised learning is classification, which is used to predict categorical labels, e.g. "spam" or "not spam" for emails.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How classification works&lt;/strong&gt;&lt;br&gt;
We begin with a definition:&lt;br&gt;
Classification is a type of supervised learning whose main goal is to predict categories or classes. The model is trained on labeled data, with each example paired with its correct category, and it learns distinguishing features during training that it then applies to unseen data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The different models used for classification&lt;/strong&gt;&lt;br&gt;
A wide variety of algorithms are common for classification, including:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Logistic regression&lt;/strong&gt; - uses the sigmoid function to convert a linear output into a probability. It is simple, interpretable, and works well for binary classification.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;KNN (K-Nearest Neighbors)&lt;/strong&gt; - classifies a data point based on the majority class of its "neighbors". It is simple, but its performance depends heavily on the value of k.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Decision trees&lt;/strong&gt; - the model makes decisions by splitting the data into branches, which often makes it easy to interpret.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Random forest&lt;/strong&gt; - an ensemble method that combines multiple decision trees to improve performance and, in turn, reduce overfitting.&lt;/p&gt;
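
&lt;p&gt;As a small illustration of one of these, here is KNN from scratch in Python on made-up 2-D points (in practice you would use a library such as scikit-learn):&lt;/p&gt;

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training points."""
    # train: list of ((x, y), label) pairs; distance is squared Euclidean.
    by_distance = sorted(
        train,
        key=lambda item: (item[0][0] - query[0]) ** 2 + (item[0][1] - query[1]) ** 2,
    )
    nearest_labels = [label for _, label in by_distance[:k]]
    return Counter(nearest_labels).most_common(1)[0][0]

# Made-up points: a "cat" cluster near (1, 1) and a "dog" cluster near (5, 5).
train = [((1, 1), "cat"), ((1, 2), "cat"), ((2, 1), "cat"),
         ((5, 5), "dog"), ((6, 5), "dog"), ((5, 6), "dog")]
print(knn_predict(train, (1.5, 1.5)))  # a point near the cat cluster
```

&lt;p&gt;Try different values of k here and you can feel the overfitting/underfitting trade-off discussed below.&lt;/p&gt;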

&lt;p&gt;&lt;strong&gt;Personal Views and Insights&lt;/strong&gt;&lt;br&gt;
My view on supervised learning in relation to classification is that a clean, well-featured dataset yields better results, but sometimes simple models like logistic regression can outdo complex ones, such as ensemble techniques.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Challenges you’ve faced while working with classification&lt;/strong&gt;&lt;br&gt;
As a beginner, working with classification has been full of trial and error, but the most challenging parts have been:&lt;br&gt;
- Overfitting and underfitting - for example, with KNN, too high a value of k can smooth over patterns that may be key to your model.&lt;/p&gt;

&lt;p&gt;- Choosing the right algorithm to use.&lt;/p&gt;

&lt;p&gt;- Difficulty interpreting and explaining classifications.&lt;/p&gt;

&lt;p&gt;To conclude, supervised learning and classification prove the partnership between humans and computers: we teach computers to learn from data.&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
