Overview
Expanding on the blog I did last week, I wanted to talk more about how you can explore and learn Keras. I was able to find a very simple dataset to explore Keras on and followed a few tutorials to get my feet wet with this sort of programming before I delved further into machine learning. To start us off I imported the very basics I needed to run a Keras program. The dataset I was working with was the Pima Indians diabetes dataset; I chose it because it had a clear set of categories I could work with, and they were short and simple:
- Number of times pregnant
- Plasma glucose concentration at 2 hours in an oral glucose tolerance test
- Diastolic blood pressure (mm Hg)
- Triceps skin fold thickness (mm)
- 2-Hour serum insulin (mu U/ml)
- Body mass index (weight in kg/(height in m)^2)
- Diabetes pedigree function
- Age (years)
- Class variable (0 or 1): the y variable, whether a patient had diabetes or not
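For reference, each line of the CSV holds those eight inputs followed by the class label. The first row of the standard copy of the file looks like this (shown purely for illustration):

6,148,72,35,0,33.6,0.627,50,1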
Imports
I imported the basic libraries I needed:
from numpy import loadtxt
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
Load the dataset
I then loaded the dataset I was working with:
dataset = loadtxt('pima-indians-diabetes.csv', delimiter=',')
Next, split the data into input (X) and output (y) variables:
X = dataset[:,0:8]
y = dataset[:,8]
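As a quick sanity check (my own addition, not part of the tutorials I followed), you can print the array shapes to confirm the split came out right:

print(X.shape)  # (768, 8): 768 patients, 8 input features each
print(y.shape)  # (768,): one 0/1 label per patient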
Define the Keras model
The first hidden layer has 12 nodes and uses the ReLU (Rectified Linear Unit) activation function.
The second hidden layer has 8 nodes and uses the ReLU activation function.
The output layer has one node and uses the sigmoid activation function.
model = Sequential()
model.add(Dense(12, input_shape=(8,), activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
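If you want to confirm the architecture, Keras can print a summary of the layers and their parameter counts with model.summary(). A quick sketch (the counts in the comments are just each layer's weights plus biases):

model.summary()
# Dense(12): 8 inputs x 12 nodes + 12 biases = 108 parameters
# Dense(8):  12 x 8 + 8 = 104 parameters
# Dense(1):  8 x 1 + 1 = 9 parameters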
Compile the Keras model
loss = binary_crossentropy
* Computes the cross-entropy loss between true labels and predicted labels.
optimizer = adam
* Adam is a popular variant of gradient descent because it tunes its learning rates automatically and gives good results across a wide range of problems.
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
Other metrics to look at: https://keras.io/api/metrics/
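As a sketch of what adding another metric looks like (this variation is my own, not from the tutorials I followed), you could track AUC alongside accuracy:

from tensorflow.keras.metrics import AUC

# same compile call as above, with a second metric tracked during training
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy', AUC()])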
Fit the Keras model on the dataset
epochs = the number of complete passes the training process makes over the whole dataset
batch_size = the number of samples propagated through the network before the weights are updated
model.fit(X, y, epochs=150, batch_size=10)
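One variation worth knowing about (my own addition): fit() returns a History object, and passing validation_split holds back a fraction of the data so you can watch for overfitting as training runs. A minimal sketch:

history = model.fit(X, y, epochs=150, batch_size=10, validation_split=0.2)
print(history.history['loss'][-1])      # final training loss
print(history.history['val_loss'][-1])  # final validation loss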
Evaluate the Keras model
You can evaluate your model on your training dataset using the evaluate() function. Note that evaluate() returns both the loss and the accuracy, so we unpack two values:
_, accuracy = model.evaluate(X, y)
print('Accuracy: %.2f' % (accuracy*100))
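Keep in mind that evaluating on the same data you trained on overstates performance. A more honest check (my own sketch, assuming scikit-learn is installed) is to hold some rows back before the fit step above and evaluate on those:

from sklearn.model_selection import train_test_split

# hold out 20% of the rows so the model never sees them during training
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model.fit(X_train, y_train, epochs=150, batch_size=10, verbose=0)
_, test_accuracy = model.evaluate(X_test, y_test)
print('Test accuracy: %.2f' % (test_accuracy * 100))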
Results
...
Epoch 146/150
768/768 [==============================] - 0s 63us/step - loss: 0.4817 - acc: 0.7708
Epoch 147/150
768/768 [==============================] - 0s 63us/step - loss: 0.4764 - acc: 0.7747
Epoch 148/150
768/768 [==============================] - 0s 63us/step - loss: 0.4737 - acc: 0.7682
Epoch 149/150
768/768 [==============================] - 0s 64us/step - loss: 0.4730 - acc: 0.7747
Epoch 150/150
768/768 [==============================] - 0s 63us/step - loss: 0.4754 - acc: 0.7799
768/768 [==============================] - 0s 38us/step
Accuracy: 76.56
Predictions
To generate predictions from our model we can use the predict() function, which returns a probability for each row of input data:
...
predictions = model.predict(X)
The sigmoid output is a probability between 0 and 1, so round it to get a 0/1 class label:
rounded = [round(x[0]) for x in predictions]
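If the list comprehension looks unfamiliar, the same rounding can be done with NumPy operations directly on the prediction array (an equivalent alternative, not what the tutorials used):

# predict() returns an (n, 1) array of probabilities; threshold and flatten it
rounded = (predictions > 0.5).astype(int).flatten()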
Prediction Results
These were the model's predictions on the first five rows of the dataset, compared with the true labels:
- 0 (expected 1)
- 0 (expected 0)
- 1 (expected 1)
- 0 (expected 0)
- 1 (expected 1)
Conclusion
In conclusion, I am still learning and am not yet proficient in Keras, and I don't know for certain that all of my data handling was perfect, but I wanted to write a rough tutorial on feeding data into Keras to show how easy it is to create models and predictions using this library for machine learning. The main takeaway I want people to get from this short tutorial is that Keras takes a lot of the convoluted coding, even within TensorFlow, and simplifies it, allowing for a much easier time taking data and creating models from it.