Introduction
Databases are essential for storing and managing data in modern applications. Many organizations turn to PostgreSQL, a powerful, open-source object-relational database management system known for its reliability and data integrity.
By leveraging Structured Query Language (SQL), we can build an interconnected system that not only stores information but also reveals meaningful insights through advanced relationships and filtering.
Most DBMSs operate using two core components: Data Definition Language (DDL) and Data Manipulation Language (DML). DDL defines the structure of the database, while DML works with the data stored inside that structure.
Here are the key differences between DDL and DML:
| Aspect | DDL | DML |
|---|---|---|
| Purpose | Defines and manages the database schema and structure. | Manipulates and manages the actual data records. |
| Commands | CREATE, ALTER, DROP, TRUNCATE, RENAME | SELECT, INSERT, UPDATE, DELETE, MERGE |
| Effect | Changes the structure of tables, indexes, or views. | Changes only the rows or records within existing tables. |
| Auto-Commit | Changes are permanent immediately (auto-committed) in most databases. | Changes are not permanent until a COMMIT command is issued. |
| Rollback | Generally cannot be undone once executed. | Can be rolled back (undone) if not yet committed. |
| WHERE Clause | Cannot use a WHERE clause. | Can use a WHERE clause to target specific records. |
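The rollback difference is easiest to see in a short session. As a hedged sketch (assuming a `students` table already exists), DML inside a transaction can be undone, while in many databases a DDL statement commits immediately:

```sql
-- DML: wrapped in a transaction, the change can be undone
begin;
delete from students where student_id = 3;
rollback;  -- the deleted row is restored

-- DDL: in many databases (e.g. Oracle, MySQL) a statement like
-- this auto-commits and cannot be rolled back afterwards
drop table students;
```

Note that PostgreSQL is unusual here: its DDL is also transactional, so even a DROP TABLE can be rolled back if it is issued inside an open transaction. The table above describes the more common behavior across databases.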
In my recent database assignment, I developed a structured system to manage student records, curriculum details, and examination performance.
Using PostgreSQL, I implemented the four core SQL operations (CREATE, INSERT, UPDATE, and DELETE) to build and maintain a functional educational database.
- CREATE
The first step involved defining the architecture of the database. I began by creating a dedicated schema to ensure all school data remained organized.
create schema nairobi_academy;
Within this schema, I used the CREATE TABLE command to build three interconnected entities:
students
create table students(
student_id INT primary key,
first_name VARCHAR(50) not null,
last_name VARCHAR(50) not null,
gender VARCHAR(1),
date_of_birth DATE,
class VARCHAR(10),
city VARCHAR(50)
);
subjects
create table subjects(
subject_id INT primary key,
subject_name VARCHAR(100) unique not null,
department VARCHAR(50),
teacher_name VARCHAR(100),
credits INT
);
exam_results
create table exam_results(
result_id INT primary key,
student_id INT not null,
subject_id INT not null,
marks INT not null,
exam_date DATE,
grade VARCHAR(2)
);
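The student_id and subject_id columns in exam_results are meant to reference the other two tables. One possible refinement, not part of the original assignment, would be to declare them as foreign keys so PostgreSQL itself enforces those relationships:

```sql
-- hypothetical variant of exam_results with referential integrity
create table exam_results(
result_id INT primary key,
student_id INT not null references students(student_id),
subject_id INT not null references subjects(subject_id),
marks INT not null,
exam_date DATE,
grade VARCHAR(2)
);
```

With these constraints in place, inserting a result for a non-existent student or subject would be rejected with a foreign-key violation instead of silently creating an orphaned record.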
- INSERT
I used INSERT INTO to transform the empty tables into a live database. I successfully loaded data for 10 students and 10 core subjects. This phase also included recording initial exam marks and grades, providing a baseline of data to work with.
insert into exam_results (result_id, student_id, subject_id, marks, exam_date, grade)
values
(1, 1, 1, 78, '2024-03-15', 'B'),
(2, 1, 2, 85, '2024-03-16', 'A'),
(3, 2, 1, 92, '2024-03-01', 'A'),
(4, 2, 3, 55, '2024-03-17', 'C'),
(5, 3, 2, 49, '2024-03-16', 'D'),
(6, 3, 4, 71, '2024-03-18', 'B'),
(7, 4, 1, 88, '2024-03-15', 'A'),
(8, 4, 6, 63, '2024-03-19', 'C'),
(9, 5, 5, 39, '2024-03-20', 'F'),
(10, 6, 9, 95, '2024-03-21', 'A');
- UPDATE
I utilized the UPDATE command to ensure the records remained accurate over time.
update students
set city = 'Nairobi'
where student_id = 5;
update exam_results
set marks = 59
where result_id = 5;
- DELETE
The final part involved data cleanup. Using the DELETE statement, I removed obsolete entries from the system.
delete from exam_results
where result_id = 9;
Additionally, I used structural commands like ALTER TABLE to drop unnecessary columns.
alter table students
drop column phone_number;
Filtering with WHERE
The WHERE clause restricts the results to rows that meet specific criteria. To make these filters precise, I used several key SQL operators.
- The `=` operator: used here to retrieve specific records.
select * from students
where class = 'Form 4';
- The `>=` operator: comparison operators like this are essential for analyzing performance and numerical data.
select * from exam_results
where marks >= 70;
- The `BETWEEN` operator: `BETWEEN` provides a cleaner way to filter within a specific range, inclusive of both bounds.
select * from exam_results
where marks between 50 and 80;
- The `IN` operator: filters records where a column matches any of the specified values.
select * from students
where city in ('Nairobi', 'Mombasa', 'Kisumu');
- The `LIKE` operator: used within a `WHERE` clause to search for a specified pattern in a column. In PostgreSQL it is case-sensitive by default (`ILIKE` is the case-insensitive variant).
select * from subjects
where subject_name like '%Studies%';
I also combined these filters for deeper insights. For instance, using WHERE class = 'Form 3' AND city = 'Nairobi' allowed me to find a very specific subset of students: those in a particular year who also live in that city.
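Written out in full, that combined filter is simply two conditions joined with AND:

```sql
-- Form 3 students who live in Nairobi
select * from students
where class = 'Form 3'
and city = 'Nairobi';
```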
Using CASE WHEN to Transform Data
In a school setting, a score of "85" or "49" is more than just a number; it represents a student's progress. I used CASE WHEN to automatically categorize performance.
select marks,
case
when marks >= 80 then 'Distinction'
when marks >= 60 then 'Merit'
when marks >= 40 then 'Pass'
when marks < 40 then 'Fail'
end as performance
from exam_results;
Additionally, I used the CASE statement combined with the IN operator to define Student Levels:
select first_name, last_name, class,
case
when class in ('Form 3', 'Form 4') then 'Senior'
when class in ('Form 2', 'Form 1') then 'Junior'
end as student_level
from students;
A reflection on my study findings
What I found most interesting was the CASE WHEN statement: being able to take a column of raw numbers and instantly turn it into human-readable labels like "Distinction" or "Senior" without actually changing the underlying data.
The biggest challenge was the strictness of SQL syntax. A single missing semicolon or misspelled column name can cause an entire query to fail.
The transition from writing basic code to deploying a functional database was a rewarding challenge. This experience highlighted that SQL proficiency goes beyond technical syntax; it is about developing the logical framework required to communicate effectively with data and extract meaningful insights.