Introduction
Importing data into a database manually, especially when there are many tables, is both tiresome and time consuming. Python libraries can make the job much easier.
Download the painting dataset from Kaggle. The dataset is made up of 8 CSV files, which we will import into our database with a simple Python script instead of loading each table by hand.
Steps to import data
- Create a database in pgAdmin and call it painting

```sql
CREATE DATABASE painting;
```
- Open a Jupyter notebook and install the Python libraries

```shell
pip install sqlalchemy
pip install pandas
pip install psycopg2-binary
```

SQLAlchemy talks to PostgreSQL through a DBAPI driver, so psycopg2 is installed alongside it.
- Import the Python libraries

```python
import pandas as pd
from sqlalchemy import create_engine
```
- Create a connection to your PostgreSQL database

```python
conn_string = 'postgresql://postgres:1344@localhost/painting'
db = create_engine(conn_string)
conn = db.connect()
```
conn_string holds the URL of our database: postgresql is the SQLAlchemy dialect, postgres is the default PostgreSQL username, 1344 is the password we set in pgAdmin, localhost is the host name, and painting is the name of the database we created.
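To make the URL's parts easier to see, here is the same string assembled piece by piece (the values are the example credentials from this article; substitute your own):

```python
# Each component of the SQLAlchemy connection URL, shown separately.
dialect = "postgresql"   # tells SQLAlchemy which database dialect to use
user = "postgres"        # default PostgreSQL superuser
password = "1344"        # the password you chose in pgAdmin
host = "localhost"       # the server the database runs on
database = "painting"    # the database we created earlier

conn_string = f"{dialect}://{user}:{password}@{host}/{database}"
print(conn_string)  # postgresql://postgres:1344@localhost/painting
```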
- Load the files into your database

```python
files = ['artist', 'canvas_size', 'image_link', 'museum', 'museum_hours',
         'product_size', 'subject', 'work']

for file in files:
    df = pd.read_csv(fr"C:\Users\Admin\Desktop\famous painti\{file}.csv")
    df.to_sql(file, con=conn, if_exists='replace', index=False)
```
files lists the names we want to give our tables in the painting database. The for loop iterates over those names, reading each CSV file and writing it to a table of the same name. The fr prefix in pd.read_csv combines an f-string (so {file} is substituted) with a raw string (so the backslashes in the Windows path are taken literally). if_exists='replace' overwrites a table if it already exists, and index=False stops pandas from writing its default integer index as an extra column.
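The two keyword arguments to to_sql do the heavy lifting. A self-contained sketch of the same call, using an in-memory SQLite database so it runs without a PostgreSQL server (the sample rows are made up for illustration):

```python
import pandas as pd
from sqlalchemy import create_engine

# In-memory SQLite stands in for the PostgreSQL engine from the article.
engine = create_engine("sqlite://")

df = pd.DataFrame({"artist_id": [1, 2], "full_name": ["Monet", "Degas"]})

with engine.connect() as conn:
    df.to_sql("artist", con=conn, if_exists="replace", index=False)
    # Running it again replaces the table instead of raising an error.
    df.to_sql("artist", con=conn, if_exists="replace", index=False)
    back = pd.read_sql("SELECT * FROM artist", con=conn)

print(back.shape)  # (2, 2) -- no extra index column was written
```

Because index=False was passed, the table contains only the two data columns; with the default index=True you would see a third column holding the pandas row index.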
Conclusion
We have learned how to use Python with pandas and SQLAlchemy to load multiple CSV files into a PostgreSQL database as tables, instead of importing each one manually.