<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Beatrice Njagi</title>
    <description>The latest articles on DEV Community by Beatrice Njagi (@beatrice_njagi).</description>
    <link>https://dev.to/beatrice_njagi</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3511245%2F7c69f830-ba2d-4029-ab98-bcb136d89b58.jpeg</url>
      <title>DEV Community: Beatrice Njagi</title>
      <link>https://dev.to/beatrice_njagi</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/beatrice_njagi"/>
    <language>en</language>
    <item>
      <title>Deploying a Customer Lifetime Value (CLV) Prediction Model Using FastAPI</title>
      <dc:creator>Beatrice Njagi</dc:creator>
      <pubDate>Tue, 10 Mar 2026 16:12:51 +0000</pubDate>
      <link>https://dev.to/beatrice_njagi/deploying-a-customer-lifetime-value-clv-prediction-model-using-fastapi-2al6</link>
      <guid>https://dev.to/beatrice_njagi/deploying-a-customer-lifetime-value-clv-prediction-model-using-fastapi-2al6</guid>
      <description>&lt;p&gt;Customer Lifetime Value (CLV) is one of the most practically useful metrics a data-driven business can track. At its core, CLV estimates the total revenue a business can expect from a single customer over the entire duration of their relationship. Rather than treating every customer the same, CLV helps businesses identify which customers are worth investing in, which are at risk of churning, and how to allocate marketing and retention budgets more intelligently. &lt;/p&gt;

&lt;p&gt;For ride-hailing platforms, e-commerce companies, and subscription services alike, predicting CLV accurately can be the difference between sustainable growth and expensive mistakes.&lt;/p&gt;

&lt;p&gt;This article walks through how to build a CLV prediction model and deploy it as a live API using FastAPI.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Data&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The dataset used for this project contains customer records with seven input features and one target variable:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Customer_Age&lt;/strong&gt; — the age of the customer, which can influence spending patterns and platform engagement&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Annual_Income&lt;/strong&gt; — the customer's yearly income, used as a proxy for overall purchasing power&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tenure_Months&lt;/strong&gt; — how long the customer has been active, measured in months&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Monthly_Spend&lt;/strong&gt; — the average amount the customer spends per month on the platform&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Visits_Per_Month&lt;/strong&gt; — how frequently the customer engages with the platform each month&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Avg_Basket_Value&lt;/strong&gt; — the average value of each transaction or order placed&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Support_Tickets&lt;/strong&gt; — the number of support or complaint tickets raised, which can signal dissatisfaction and churn risk&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The target variable is &lt;strong&gt;CLV&lt;/strong&gt; — a score representing the lifetime value of each customer. &lt;/p&gt;
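&lt;p&gt;For illustration, here is how the features and target might be separated in pandas. The rows below are made up; the article's actual dataset is not shown:&lt;/p&gt;

```python
import pandas as pd

# A few illustrative rows -- placeholders, not the project's real data.
df = pd.DataFrame({
    "Customer_Age": [34, 45, 29],
    "Annual_Income": [52000.0, 88000.0, 41000.0],
    "Tenure_Months": [18, 42, 7],
    "Monthly_Spend": [220.5, 410.0, 95.75],
    "Visits_Per_Month": [6, 11, 3],
    "Avg_Basket_Value": [36.75, 37.25, 31.90],
    "Support_Tickets": [1, 0, 3],
    "CLV": [39690.0, 68880.0, 11489.0],
})

# Everything except the target column is an input feature.
feature_cols = [c for c in df.columns if c != "CLV"]
X = df[feature_cols]
y = df["CLV"]
print(X.shape)
```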

&lt;p&gt;&lt;strong&gt;Model Selection: Linear Regression vs. Random Forest&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Two models were trained and evaluated: Linear Regression and a Random Forest Regressor.&lt;/p&gt;

&lt;p&gt;Linear Regression assumes a straight-line relationship between the input features and the CLV target. It is interpretable, fast to train, and performs reliably when the relationships in the data are consistent and proportional.&lt;/p&gt;

&lt;p&gt;Random Forest is an ensemble method that builds many decision trees during training and averages their outputs, making it well-suited for capturing complex, non-linear patterns in data.&lt;/p&gt;

&lt;p&gt;The models were evaluated using three common metrics: Mean Squared Error (MSE), Root Mean Squared Error (RMSE), and R-squared (R²). These metrics measure how closely the predicted values match the actual values. The results were:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Linear Regression — MSE: 272,939,026 | RMSE: 16,520.87 | R²: 0.9398&lt;/li&gt;
&lt;li&gt;Random Forest — MSE: 432,071,884 | RMSE: 20,786.34 | R²: 0.9047&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;After evaluation, Linear Regression performed better than Random Forest. It achieved a lower RMSE and a higher R² score, meaning its predictions were more accurate. Because of this performance advantage, Linear Regression was selected as the final model and saved for deployment along with the model features.&lt;/p&gt;
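&lt;p&gt;The evaluation step can be sketched as follows, with synthetic stand-in data in place of the project's real features and target:&lt;/p&gt;

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic stand-in data -- the article's real X and y are not reproduced here.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 7))
y = X @ rng.normal(size=7) * 10000 + rng.normal(scale=500, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

for name, model in [("Linear Regression", LinearRegression()),
                    ("Random Forest", RandomForestRegressor(random_state=42))]:
    model.fit(X_train, y_train)
    preds = model.predict(X_test)
    mse = mean_squared_error(y_test, preds)
    rmse = np.sqrt(mse)           # RMSE is just the square root of MSE
    r2 = r2_score(y_test, preds)  # share of variance explained
    print(f"{name}: MSE={mse:,.0f} | RMSE={rmse:,.2f} | R2={r2:.4f}")
```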

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjbd2inpvuuh4q3zsgp0h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjbd2inpvuuh4q3zsgp0h.png" alt=" " width="800" height="193"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deploying the Model with FastAPI&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;After training the model, the next step was to make it accessible for real-world use. This was done using FastAPI.&lt;/p&gt;

&lt;p&gt;The model was loaded into a Python script that defines an API. FastAPI uses Pydantic to validate input data. A class was created using BaseModel to define the expected input features such as customer age, income, tenure, and spending patterns. This ensures that any request sent to the API contains the correct data types and required fields.&lt;/p&gt;

&lt;p&gt;The API includes a /predict endpoint that accepts a POST request containing customer information in JSON format. When the request is received, the model processes the data and returns a predicted CLV value as a JSON response.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5zrc8erct2qnk6f3psl3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5zrc8erct2qnk6f3psl3.png" alt=" " width="800" height="435"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Running and Testing the API&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The API was run locally using Uvicorn, an ASGI server used to run FastAPI applications. The server can be started using the command:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;uvicorn main:app --reload&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;FastAPI automatically generates interactive API documentation that can be accessed at:&lt;/p&gt;

&lt;p&gt;&lt;a href="http://127.0.0.1:8000/docs" rel="noopener noreferrer"&gt;http://127.0.0.1:8000/docs&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This interface allows users to test the API directly from a web browser by entering sample customer data. &lt;/p&gt;

&lt;p&gt;For programmatic testing, a request using Python's requests library looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fytniyx3ouldzrcjf6zkv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fytniyx3ouldzrcjf6zkv.png" alt=" " width="800" height="194"&gt;&lt;/a&gt;&lt;/p&gt;
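&lt;p&gt;In plain code, such a request might look like the sketch below. The field values are made up, and the API must be running locally for the call to succeed:&lt;/p&gt;

```python
import requests

# Example payload -- field names match the model's input features.
payload = {
    "Customer_Age": 34,
    "Annual_Income": 52000.0,
    "Tenure_Months": 18,
    "Monthly_Spend": 220.5,
    "Visits_Per_Month": 6.0,
    "Avg_Basket_Value": 36.75,
    "Support_Tickets": 1,
}

try:
    response = requests.post("http://127.0.0.1:8000/predict",
                             json=payload, timeout=5)
    print(response.json())
except requests.exceptions.ConnectionError:
    print("API is not running -- start it with: uvicorn main:app --reload")
```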

&lt;p&gt;A successful response returns something like:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8wbdfd2b0j7gyxqvzqjr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8wbdfd2b0j7gyxqvzqjr.png" alt=" " width="800" height="53"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This output tells the business that, based on this customer's behaviour, they are predicted to generate $49,322.59 in lifetime value — a number that can directly inform decisions around discounts, loyalty rewards, or re-engagement campaigns.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Building a CLV prediction model is only half the work — getting it into a form that others can actually use is what makes it valuable. FastAPI makes that second half much more manageable than it might seem. With a trained Linear Regression model, a few lines of code, and Pydantic handling the input validation, the API was up and running without much friction. &lt;/p&gt;

</description>
      <category>datascience</category>
      <category>fastapi</category>
      <category>machinelearning</category>
      <category>python</category>
    </item>
    <item>
      <title>Ridge Regression vs Lasso Regression</title>
      <dc:creator>Beatrice Njagi</dc:creator>
      <pubDate>Fri, 30 Jan 2026 10:44:06 +0000</pubDate>
      <link>https://dev.to/beatrice_njagi/ridge-regression-vs-lassoregression-21hh</link>
      <guid>https://dev.to/beatrice_njagi/ridge-regression-vs-lassoregression-21hh</guid>
      <description>&lt;p&gt;&lt;strong&gt;Explained Through a House Price Prediction Problem&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Predicting house prices is a classic machine learning and statistics problem. Imagine you want to predict the price of a house using features such as its size, number of bedrooms, distance to the city center, nearby schools, and several other indicators. Linear regression is often the first model we try, but real-world data introduces challenges like overfitting and noise.&lt;/p&gt;

&lt;h2&gt;
  
  
  Ordinary Least Squares (OLS)
&lt;/h2&gt;

&lt;p&gt;Ordinary Least Squares (OLS) is the standard form of linear regression. It estimates the relationship between input features (e.g., house size, bedrooms) and the target variable (house price) by fitting a straight line (or hyperplane) that best explains the data. OLS minimizes the sum of squared residuals, where a residual is the difference between the actual house price and the predicted price. OLS tries to make the predicted prices as close as possible to the actual prices by minimizing squared errors.&lt;/p&gt;

&lt;p&gt;But this can lead to overfitting in real-world datasets. In real housing data, many features may be correlated (e.g., size and number of bedrooms), some features may be noisy or irrelevant, or the dataset may have limited samples.&lt;/p&gt;

&lt;p&gt;In such cases, OLS has no built-in mechanism to control model complexity. If the data includes many weak, irrelevant or noisy features, the model then learns the training data too closely—capturing noise and random patterns instead of the true relationship and patterns. The model performs very well on training data but poorly on unseen (test) data.&lt;/p&gt;
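&lt;p&gt;This failure mode is easy to reproduce with scikit-learn on synthetic house-style data that has more features than training samples. Everything below is illustrative:&lt;/p&gt;

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# 5 real signals plus 45 pure-noise features, and only 60 samples --
# the regime where plain OLS tends to memorize the training set.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 50))
true_coef = np.zeros(50)
true_coef[:5] = [120.0, 80.0, -60.0, 40.0, 25.0]
y = X @ true_coef + rng.normal(scale=50.0, size=60)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ols = LinearRegression().fit(X_train, y_train)
train_r2 = r2_score(y_train, ols.predict(X_train))
test_r2 = r2_score(y_test, ols.predict(X_test))

# Near-perfect training fit, much worse generalization.
print("train R2:", round(train_r2, 4))
print("test  R2:", round(test_r2, 4))
```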

&lt;p&gt;To address these limitations, regularization techniques such as &lt;strong&gt;Ridge Regression (L2 regularization)&lt;/strong&gt; and &lt;strong&gt;Lasso Regression (L1 regularization)&lt;/strong&gt; are used.&lt;/p&gt;

&lt;h2&gt;
  
  
  Regularization
&lt;/h2&gt;

&lt;p&gt;Regularization addresses the overfitting problem by discouraging overly complex models. In linear regression, complexity usually shows up as very large coefficients or high sensitivity to small changes in data.&lt;/p&gt;

&lt;p&gt;Regularization works by adding a penalty term to the model’s loss function that penalizes large coefficients. This penalty discourages the model from relying too heavily on any single feature, forcing it to learn simpler, more general patterns that perform better on new data. Let’s look at two types of regularization below:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ridge Regression (L2 Regularization)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ridge adds a penalty that makes the model keep coefficients small. The penalty shrinks all coefficients toward zero, but never makes them exactly zero. Therefore, every feature stays in the model, just with reduced strength. &lt;/p&gt;

&lt;p&gt;Because the penalty only shrinks coefficients, it doesn’t fully remove them. So Ridge is useful when you believe all features matter a little bit. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lasso Regression (L1 Regularization)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Lasso also adds a penalty, but this one can completely remove features. Lasso penalizes coefficients based on their absolute size. This penalty can force some coefficients to become exactly zero. Zero coefficients = feature removed from the model. &lt;/p&gt;

&lt;p&gt;The L1 penalty makes it “cheaper” for the model to get rid of a weak feature than to keep shrinking it. For example, if the number of trees in a compound doesn’t really affect the house price, Lasso sets its coefficient to 0 and removes it.&lt;br&gt;
So Lasso is useful when you think only a few features truly matter.&lt;/p&gt;
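&lt;p&gt;This difference in coefficient behavior is easy to see with scikit-learn on synthetic data: three real signals plus seventeen irrelevant features. The alpha values are illustrative:&lt;/p&gt;

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

# Three strong features, seventeen irrelevant ones.
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 20))
true_coef = np.zeros(20)
true_coef[:3] = [5.0, -3.0, 2.0]
y = X @ true_coef + rng.normal(scale=0.5, size=200)

ridge = Ridge(alpha=1.0).fit(X, y)   # L2: shrinks, never zeroes
lasso = Lasso(alpha=0.1).fit(X, y)   # L1: can zero out weak features

print("Ridge zero coefficients:", int(np.sum(ridge.coef_ == 0)))
print("Lasso zero coefficients:", int(np.sum(lasso.coef_ == 0)))
```

&lt;p&gt;Ridge keeps every coefficient nonzero (just smaller), while Lasso drives most of the irrelevant coefficients exactly to zero.&lt;/p&gt;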

&lt;p&gt;&lt;strong&gt;Summary: Ridge vs Lasso Regression&lt;/strong&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Method&lt;/th&gt;
&lt;th&gt;What It Does&lt;/th&gt;
&lt;th&gt;Effect on Features&lt;/th&gt;
&lt;th&gt;When to Use&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Ridge (L2)&lt;/td&gt;
&lt;td&gt;Shrinks coefficients&lt;/td&gt;
&lt;td&gt;Keeps all features&lt;/td&gt;
&lt;td&gt;When all features contribute a bit&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Lasso (L1)&lt;/td&gt;
&lt;td&gt;Shrinks + removes coefficients&lt;/td&gt;
&lt;td&gt;Drops unimportant features&lt;/td&gt;
&lt;td&gt;When only a few features matter&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Key Differences Between Ridge and Lasso Regression&lt;/strong&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Aspect&lt;/th&gt;
&lt;th&gt;Ridge Regression (L2)&lt;/th&gt;
&lt;th&gt;Lasso Regression (L1)&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Regularization type&lt;/td&gt;
&lt;td&gt;L2&lt;/td&gt;
&lt;td&gt;L1&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Feature selection&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Coefficient behavior&lt;/td&gt;
&lt;td&gt;Shrinks coefficients&lt;/td&gt;
&lt;td&gt;Shrinks and sets some to zero&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Best for&lt;/td&gt;
&lt;td&gt;Many useful features&lt;/td&gt;
&lt;td&gt;Few strong features&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Interpretability&lt;/td&gt;
&lt;td&gt;Moderate&lt;/td&gt;
&lt;td&gt;High&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Application Scenario: House Price Prediction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You are using features such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Size of the house&lt;/li&gt;
&lt;li&gt;Number of bedrooms&lt;/li&gt;
&lt;li&gt;Distance to the city&lt;/li&gt;
&lt;li&gt;Number of schools nearby&lt;/li&gt;
&lt;li&gt;Several noisy or weak features&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Choosing the Right Model
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;a. If all features are believed to contribute, choose Ridge Regression&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ridge reduces overfitting without removing features&lt;/li&gt;
&lt;li&gt;Ideal when many features have small but real effects&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;b. If only a few features are truly important, choose Lasso Regression&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Automatically removes noisy and irrelevant features&lt;/li&gt;
&lt;li&gt;Produces a simpler, more interpretable model&lt;/li&gt;
&lt;li&gt;Focuses on the strongest predictors of house price&lt;/li&gt;
&lt;/ul&gt;
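&lt;p&gt;Either choice still leaves the regularization strength (alpha) to pick. Scikit-learn's cross-validated estimators can select it automatically; here is a sketch on synthetic data, with all names and values illustrative:&lt;/p&gt;

```python
import numpy as np
from sklearn.linear_model import RidgeCV, LassoCV

# Same synthetic setup: a few strong features, many irrelevant ones.
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 20))
coef = np.zeros(20)
coef[:3] = [5.0, -3.0, 2.0]
y = X @ coef + rng.normal(scale=0.5, size=200)

# Cross-validation picks alpha from a candidate grid.
alphas = np.logspace(-3, 2, 30)
ridge = RidgeCV(alphas=alphas).fit(X, y)
lasso = LassoCV(alphas=alphas, random_state=0).fit(X, y)

print("Ridge best alpha:", ridge.alpha_)
print("Lasso best alpha:", lasso.alpha_)
print("Lasso kept features:", int(np.sum(lasso.coef_ != 0)))
```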

</description>
      <category>algorithms</category>
      <category>datascience</category>
      <category>machinelearning</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Health Care Data Analysis: Power BI Workflow</title>
      <dc:creator>Beatrice Njagi</dc:creator>
      <pubDate>Mon, 08 Dec 2025 18:14:27 +0000</pubDate>
      <link>https://dev.to/beatrice_njagi/health-care-data-analysis-power-bi-workflow-dli</link>
      <guid>https://dev.to/beatrice_njagi/health-care-data-analysis-power-bi-workflow-dli</guid>
      <description>&lt;p&gt;This document highlights the process of importing and cleaning data in Power Query, modeling, creating measures, and finally visualizing the report in Power BI.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Import and Cleaning in Power Query&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Import:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Start a Blank Report in Power BI.&lt;/li&gt;
&lt;li&gt;Navigate to Get Data → PostgreSQL database.&lt;/li&gt;
&lt;li&gt;Enter the connection details (server and database) and authentication credentials (username and password), then connect.&lt;/li&gt;
&lt;li&gt;In the navigation panel, select the views to import and click Transform Data to open Power Query.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Cleaning:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Renamed columns for clarity and consistency.&lt;/li&gt;
&lt;li&gt;Rounded decimal numbers to two decimal places for readability.&lt;/li&gt;
&lt;li&gt;Created a Date Table covering the full range of dates in the dataset to facilitate time-based visuals and analysis.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Modelling Choices&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Established relationships between the Date Table and the hospital_doctor_monthly_metrics2 and hospital_appointments_enriched tables via their date columns.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo9qdg0u28z94p7hs6mxb.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo9qdg0u28z94p7hs6mxb.jpeg" alt=" " width="800" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Dashboard and Visuals&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Measures were created to track appointments summary by doctors, YTD appointment trends, cancelled appointments, patient volume, overall revenue collection summary and billing summary per patient.&lt;/li&gt;
&lt;li&gt;Visualizations include time trends, cancellations proportion, patient billing, and doctor-level performance summary.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Executive Summary&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnpgn3189aeunved2xozq.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnpgn3189aeunved2xozq.jpeg" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Appointments Analysis&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F652syzz04jva8wuydpnl.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F652syzz04jva8wuydpnl.jpeg" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Financial Analysis&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx1ikcnh8io4m579a38c0.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx1ikcnh8io4m579a38c0.jpeg" alt=" " width="800" height="443"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>How to Connect PostgreSQL to Power BI Using Local PostgreSQL and Aiven</title>
      <dc:creator>Beatrice Njagi</dc:creator>
      <pubDate>Mon, 24 Nov 2025 17:47:42 +0000</pubDate>
      <link>https://dev.to/beatrice_njagi/how-to-connect-postgresql-to-power-bi-using-local-postgresql-and-aiven-1k5m</link>
      <guid>https://dev.to/beatrice_njagi/how-to-connect-postgresql-to-power-bi-using-local-postgresql-and-aiven-1k5m</guid>
      <description>&lt;p&gt;Power BI is one of the leading business intelligence tools for analyzing and visualizing data. It provides multiple ways to load data, including direct connections to databases. This article explains how to connect Power BI to both a local PostgreSQL database and a cloud-based PostgreSQL database hosted on Aiven, a fully managed data platform for open-source databases.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Connecting to a Local PostgreSQL Database&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before connecting Power BI to your local database, ensure that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;PostgreSQL is installed and running on your machine.&lt;/li&gt;
&lt;li&gt;You can successfully connect to the database using a tool like DBeaver.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Note down connection details&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open DBeaver.&lt;/li&gt;
&lt;li&gt;Right-click your database connection and select Edit Connection.&lt;/li&gt;
&lt;li&gt;Note the following details:&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Example&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Host&lt;/td&gt;
&lt;td&gt;localhost&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Port&lt;/td&gt;
&lt;td&gt;5432&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Database Name&lt;/td&gt;
&lt;td&gt;postgres&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Username&lt;/td&gt;
&lt;td&gt;postgres&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Password&lt;/td&gt;
&lt;td&gt;your_password&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhu0bc0dgz6zft5zgp9zv.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhu0bc0dgz6zft5zgp9zv.jpeg" alt=" " width="685" height="552"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You will use these details in Power BI.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Connect Power BI&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open Power BI Desktop and click Blank report.&lt;/li&gt;
&lt;li&gt;Navigate to Home → Get Data → PostgreSQL database.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7p4utoaks3u31btpxi2y.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7p4utoaks3u31btpxi2y.jpeg" alt=" " width="679" height="656"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Enter the server (localhost) and database name (postgres) and click OK.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxyvhaa69ogu13fah3w7m.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxyvhaa69ogu13fah3w7m.jpeg" alt=" " width="703" height="354"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Enter your authentication credentials (username and password) and click Connect.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the navigator pane, select the tables you wish to import (e.g., patients, appointments) and click Load.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6k8ct3tprw8s38vya4ux.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6k8ct3tprw8s38vya4ux.jpeg" alt=" " width="800" height="636"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You have now successfully made the connection.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Connecting to Aiven PostgreSQL&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Aiven is a cloud-based data platform that provides fully managed open-source databases, streaming systems, and analytics services. Unlike a local database, connecting to Aiven requires additional steps because of network and security configurations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before connecting, ensure you have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An active Aiven PostgreSQL service.&lt;/li&gt;
&lt;li&gt;Connection details: host, port, database name, username, password.&lt;/li&gt;
&lt;li&gt;The SSL certificate bundle (ca.pem) downloaded from Aiven and installed (steps described below).&lt;/li&gt;
&lt;li&gt;Power BI Desktop installed.&lt;/li&gt;
&lt;li&gt;Optional: DBeaver for testing the connection.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Retrieve Aiven Connection Details&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Log into your Aiven dashboard.&lt;/li&gt;
&lt;li&gt;Select your PostgreSQL service.&lt;/li&gt;
&lt;li&gt;Navigate to Connection Information and copy the host, port, database name, username, and password.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb41tv4trapbvsq5lpyk3.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb41tv4trapbvsq5lpyk3.jpeg" alt=" " width="800" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Test Connection (Optional but Recommended)&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open DBeaver.&lt;/li&gt;
&lt;li&gt;Click New Connection → PostgreSQL.&lt;/li&gt;
&lt;li&gt;Enter the connection details:&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Setting&lt;/th&gt;
&lt;th&gt;Example&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Host&lt;/td&gt;
&lt;td&gt;Aiven host&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Port&lt;/td&gt;
&lt;td&gt;Aiven port&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Database Name&lt;/td&gt;
&lt;td&gt;Aiven database name&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Username&lt;/td&gt;
&lt;td&gt;Aiven user&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Password&lt;/td&gt;
&lt;td&gt;Aiven password&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;Click Test Connection. If successful, your settings are correct.&lt;/li&gt;
&lt;/ul&gt;
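&lt;p&gt;If you prefer a scripted check instead of DBeaver, the same details can be assembled into a libpq-style connection string and tested from Python. The host, port, and credentials below are placeholders, and the actual connection attempt requires psycopg2:&lt;/p&gt;

```python
def aiven_dsn(host, port, dbname, user, password, sslrootcert="ca.pem"):
    """Build a libpq-style DSN that verifies the Aiven CA certificate."""
    return (
        f"host={host} port={port} dbname={dbname} "
        f"user={user} password={password} "
        f"sslmode=verify-ca sslrootcert={sslrootcert}"
    )

# Placeholder values -- copy the real ones from the Aiven console.
dsn = aiven_dsn("pg-xxxx.aivencloud.com", 12345, "defaultdb",
                "avnadmin", "your_password")
print(dsn)

# With psycopg2 installed, the connection test is one call:
#   import psycopg2
#   conn = psycopg2.connect(dsn)
```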

&lt;p&gt;&lt;strong&gt;Step 3: Connect to Power BI&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For the connection to work, you need to add the Aiven CA certificate to the Trusted Root Certification Authorities store on Windows so Power BI Desktop can connect successfully.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Add the Aiven CA Certificate to Trusted Root Certification Authorities (Windows)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1: Download the CA Certificate from Aiven&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Log in to Aiven Console&lt;/li&gt;
&lt;li&gt;Go to your PostgreSQL service&lt;/li&gt;
&lt;li&gt;Under Connection Info, download the CA Certificate (ca.pem)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffr0rktr72mjyp79f5txy.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffr0rktr72mjyp79f5txy.jpeg" alt=" " width="800" height="346"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Save it somewhere easy to find (e.g. Desktop or Downloads).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;2: Rename the File (optional but helps)&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Right-click the file → Rename&lt;/li&gt;
&lt;li&gt;Change the name from ca.pem to aiven-ca.cer&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;3: Install the Certificate&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Double-click the aiven-ca.cer file&lt;/li&gt;
&lt;li&gt;Click Install Certificate...&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd9idd9o2wo9woa8olvxi.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd9idd9o2wo9woa8olvxi.jpeg" alt=" " width="402" height="504"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select Local Machine (important). If prompted, click Yes to grant administrator rights&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvj8vz2nmflbmh2bl597w.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvj8vz2nmflbmh2bl597w.jpeg" alt=" " width="536" height="530"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Choose: Place all certificates in the following store&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzm6vtsk34yqokg2x322q.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzm6vtsk34yqokg2x322q.jpeg" alt=" " width="530" height="521"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click Browse... and select Trusted Root Certification Authorities&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpbtnvyhow4cw56s6sp3y.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpbtnvyhow4cw56s6sp3y.jpeg" alt=" " width="534" height="523"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click Next → Finish&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You should see: "The import was successful."&lt;/p&gt;

&lt;p&gt;After adding the CA certificate to the Trusted Root Certification Authorities store, Power BI Desktop will trust the connection to Aiven PostgreSQL and stop raising SSL validation errors.&lt;/p&gt;
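&lt;p&gt;If you prefer the command line, the same import can be done in one step with Windows' built-in certutil tool (run from an elevated prompt; the filename below assumes you renamed the CA file as described above):&lt;/p&gt;

```shell
# Command-line equivalent of the certificate-import wizard above (Windows).
# "Root" is the Trusted Root Certification Authorities store; -f overwrites
# an existing copy. Run from an elevated (Administrator) prompt in the
# folder that contains the downloaded CA file.
certutil -addstore -f Root aiven-ca.cer
```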

&lt;p&gt;&lt;strong&gt;Step 4: Connect Power BI&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open Power BI Desktop.&lt;/li&gt;
&lt;li&gt;Open a new or existing report.&lt;/li&gt;
&lt;li&gt;Navigate to Home → Get Data → PostgreSQL database.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F759983csgjdkvfvkciwr.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F759983csgjdkvfvkciwr.jpeg" alt=" " width="679" height="656"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the Server field, enter the Aiven host and port in the form host:port, enter the database name, and click Connect.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx58roa7p3kwqndoujubv.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx58roa7p3kwqndoujubv.jpeg" alt=" " width="699" height="351"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Enter your authentication credentials (username and password) and click Connect.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the navigator pane, select the tables you wish to import (e.g., patients, appointments) and click Load.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Femuojfyghf2hunxeevuq.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Femuojfyghf2hunxeevuq.jpeg" alt=" " width="800" height="638"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By following these steps, you can seamlessly connect Power BI to both local and cloud PostgreSQL databases for analytics and visualization.&lt;/p&gt;

</description>
      <category>postgres</category>
      <category>analytics</category>
      <category>microsoft</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Understanding DAX Functions in Power BI</title>
      <dc:creator>Beatrice Njagi</dc:creator>
      <pubDate>Fri, 10 Oct 2025 18:14:37 +0000</pubDate>
      <link>https://dev.to/beatrice_njagi/understanding-dax-functions-in-power-bi-59bm</link>
      <guid>https://dev.to/beatrice_njagi/understanding-dax-functions-in-power-bi-59bm</guid>
      <description>&lt;p&gt;Power BI stands for Power Business Intelligence—a powerful platform that enables users to transform raw data into clear, actionable, and interactive visualizations, supporting data-driven decision-making, collaboration through shared dashboards, and efficiency through automated data refreshes.&lt;/p&gt;

&lt;p&gt;At the heart of Power BI lies Data Analysis Expressions (DAX) — a formula language that powers calculations, aggregations, and data modeling. DAX allows users to create custom, powerful formulas that extend beyond what standard Excel functions can achieve.&lt;/p&gt;

&lt;p&gt;DAX functions are categorized by their purpose. In this article, we’ll explore four key categories:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Mathematical Functions (SUM, AVERAGE)&lt;/li&gt;
&lt;li&gt;Text Functions (LEFT, RIGHT, CONCATENATE)&lt;/li&gt;
&lt;li&gt;Date &amp;amp; Time Functions (YEAR, TOTALYTD, SAMEPERIODLASTYEAR)&lt;/li&gt;
&lt;li&gt;Logical Functions (IF, SWITCH)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each category plays a unique role in analyzing and managing data effectively.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Mathematical Functions (SUM, AVERAGE)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Mathematical DAX functions perform numeric calculations such as totals, averages, minimums, or maximums.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Common Functions:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;SUM() – Adds up all the values in a column.&lt;/p&gt;

&lt;p&gt;AVERAGE() – Calculates the mean value in a column.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example (Kenya Crop Dataset):&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The dataset includes columns such as County, Crop Type, Market Price (KES/Kg), Farmer Name, and Revenue (KES).&lt;/p&gt;

&lt;p&gt;To find total revenue and the average market price:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Total Revenue = SUM(Kenya_Crops_Dataset[Revenue (KES)])&lt;/li&gt;
&lt;li&gt;Average Market Price = AVERAGE(Kenya_Crops_Dataset[Market Price (KES/Kg)])&lt;/li&gt;
&lt;/ul&gt;
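&lt;p&gt;As a rough illustration of what these two measures compute, here is the same aggregation in plain Python over a small stand-in for the dataset (the crop names and figures below are invented for illustration):&lt;/p&gt;

```python
# Tiny made-up stand-in for the Kenya_Crops_Dataset table.
rows = [
    {"crop": "Maize",  "revenue": 12000, "price": 95.0},
    {"crop": "Tea",    "revenue": 30000, "price": 120.5},
    {"crop": "Coffee", "revenue": 18000, "price": 110.0},
]

# DAX: Total Revenue = SUM(Kenya_Crops_Dataset[Revenue (KES)])
total_revenue = sum(r["revenue"] for r in rows)

# DAX: Average Market Price = AVERAGE(Kenya_Crops_Dataset[Market Price (KES/Kg)])
average_price = sum(r["price"] for r in rows) / len(rows)
```

&lt;p&gt;In Power BI these would be measures evaluated inside the current filter context (a visual, a slicer, a row in a matrix); the Python version has no filter context and simply aggregates every row.&lt;/p&gt;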

&lt;p&gt;&lt;strong&gt;2. Text Functions (LEFT, RIGHT, CONCATENATE)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Text DAX functions manipulate string data — ideal for data cleaning, formatting, or combining text fields. These functions help structure data more clearly for reporting and analysis.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Common Functions:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;LEFT() – Extracts a specific number of characters from the start of a text string.&lt;br&gt;
RIGHT() – Extracts characters from the end of a text string.&lt;br&gt;
CONCATENATE() – Joins two or more text strings into one.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example (Kenya Crop Dataset):&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To extract the first and last three letters from the Crop Type column:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;First Three Letters = LEFT(Kenya_Crops_Dataset[Crop Type], 3)&lt;/li&gt;
&lt;li&gt;Last Three Letters = RIGHT(Kenya_Crops_Dataset[Crop Type], 3)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To combine Farmer Name and Crop Type into one field:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Crop and Farmer = CONCATENATE(Kenya_Crops_Dataset[Farmer Name], " " &amp;amp; Kenya_Crops_Dataset[Crop Type])&lt;/li&gt;
&lt;/ul&gt;
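&lt;p&gt;If you think of each DAX text function in terms of ordinary string operations, the behaviour is easy to predict. A sketch in plain Python, with hypothetical values standing in for the column contents:&lt;/p&gt;

```python
crop_type = "Avocado"
farmer_name = "Wanjiku"

# DAX: LEFT(Kenya_Crops_Dataset[Crop Type], 3)
first_three = crop_type[:3]

# DAX: RIGHT(Kenya_Crops_Dataset[Crop Type], 3)
last_three = crop_type[-3:]

# DAX: CONCATENATE of Farmer Name and Crop Type with a space between
crop_and_farmer = farmer_name + " " + crop_type
```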

&lt;p&gt;&lt;strong&gt;3. Date &amp;amp; Time Functions (YEAR, TOTALYTD, SAMEPERIODLASTYEAR)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Date and time DAX functions are essential for analyzing trends over time. They help track seasonal patterns, compare year-over-year performance, and forecast future outcomes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Common Functions:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;YEAR() – Extracts the year from a date field.&lt;br&gt;
TOTALYTD() – Calculates the year-to-date total for a given measure.&lt;br&gt;
SAMEPERIODLASTYEAR() – Compares performance for the same period in the previous year.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example (Kenya Crop Dataset):&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Year = YEAR(Kenya_Crops_Dataset[Date])&lt;/li&gt;
&lt;li&gt;YTD Revenue = TOTALYTD(SUM(Kenya_Crops_Dataset[Revenue (KES)]), Kenya_Crops_Dataset[Date])&lt;/li&gt;
&lt;li&gt;Previous Year Revenue = CALCULATE(SUM(Kenya_Crops_Dataset[Revenue (KES)]), SAMEPERIODLASTYEAR(Kenya_Crops_Dataset[Date]))&lt;/li&gt;
&lt;/ul&gt;
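&lt;p&gt;At a whole-year grain, the idea behind YEAR() and SAMEPERIODLASTYEAR() reduces to grouping revenue by calendar year and looking one year back. A simplified Python sketch over invented rows (real DAX time intelligence works at any date grain and respects the report's filter context):&lt;/p&gt;

```python
from collections import defaultdict
from datetime import date

# Invented rows standing in for Kenya_Crops_Dataset.
rows = [
    {"d": date(2023, 3, 1), "revenue": 100},
    {"d": date(2023, 6, 1), "revenue": 200},
    {"d": date(2024, 2, 1), "revenue": 150},
    {"d": date(2024, 5, 1), "revenue": 250},
]

# YEAR(): extract the calendar year from each date, then group by it.
revenue_by_year = defaultdict(int)
for r in rows:
    revenue_by_year[r["d"].year] += r["revenue"]

# Year-grain analogue of SAMEPERIODLASTYEAR: compare a year
# with the one before it.
current = revenue_by_year[2024]            # 150 + 250
previous = revenue_by_year[2024 - 1]       # 100 + 200
growth = (current - previous) / previous   # year-over-year growth rate
```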

&lt;p&gt;&lt;strong&gt;4. Logical Functions (IF, SWITCH)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Logical functions test conditions and return specific results depending on whether those conditions are TRUE or FALSE. They are useful for categorizing, flagging, or grouping data based on performance or attributes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Common Functions:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;IF() – Tests a condition and returns one value if true, another if false.&lt;/p&gt;

&lt;p&gt;SWITCH() – Evaluates an expression against multiple possible outcomes and returns a result for the first match.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example (Kenya Crop Dataset):&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To split Market Price into high and low around a threshold (100.68 here):&lt;/p&gt;

&lt;p&gt;Market Price Category = IF(Kenya_Crops_Dataset[Market Price (KES/Kg)] &amp;gt; 100.68, "High", "Low")&lt;/p&gt;

&lt;p&gt;Or using SWITCH() for multiple conditions:&lt;/p&gt;

&lt;p&gt;Market Category v2 =&lt;br&gt;
SWITCH(&lt;br&gt;
    TRUE(),&lt;br&gt;
    Kenya_Crops_Dataset[Market Price (KES/Kg)] &amp;lt; 100.68, "Low",&lt;br&gt;
    Kenya_Crops_Dataset[Market Price (KES/Kg)] = 100.68, "Median",&lt;br&gt;
    Kenya_Crops_Dataset[Market Price (KES/Kg)] &amp;gt; 100.68, "High",&lt;br&gt;
    "Invalid"&lt;br&gt;
)&lt;/p&gt;
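&lt;p&gt;The SWITCH(TRUE(), ...) pattern evaluates its conditions top to bottom and returns the result of the first one that holds, falling back to the final value. A plain-Python sketch of the same three-way split (operator.lt and operator.gt test less-than and greater-than):&lt;/p&gt;

```python
from operator import gt, lt

def market_category(price, threshold=100.68):
    # Conditions are checked in order, like SWITCH(TRUE(), ...):
    # the first match wins, and the last value is the fallback.
    if lt(price, threshold):      # price below the threshold
        return "Low"
    if price == threshold:
        return "Median"
    if gt(price, threshold):      # price above the threshold
        return "High"
    return "Invalid"              # e.g. NaN prices fall through here
```

&lt;p&gt;Note that an exact float comparison such as price = 100.68 is fragile in practice; in DAX as in Python you would usually band values into ranges rather than test a single midpoint.&lt;/p&gt;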

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Power BI and DAX together form a powerful analytical engine for turning data into actionable insights. For instance, farmers, agribusinesses, and policymakers can use DAX functions to identify trends, monitor performance, and make informed decisions. With these tools, users can predict harvest outcomes, allocate resources efficiently, and optimize profitability across seasons.&lt;/p&gt;

</description>
      <category>datascience</category>
      <category>learning</category>
      <category>productivity</category>
    </item>
  </channel>
</rss>
