Neural Network Using ReLU Activation Function

Your accuracy is 0 because you forgot to add an output layer, so the loss is not computed properly. In addition, accuracy is not a relevant metric here, since you are doing regression rather than classification.

You need to modify your model like this:

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([
    Dense(32, activation='relu', input_shape=(7,)),  # hidden layer
    Dense(1, activation='linear')])                  # linear output layer for regression

Also, in your model.compile() you have to change the loss to 'mse' instead of 'binary_crossentropy', since you are doing regression and not classification.

model.compile(optimizer='sgd',
              loss='mse',
              metrics=['mean_squared_error'])
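
With these two fixes in place, a quick sanity check is to fit the model on some dummy data and confirm the MSE goes down. The 100-sample arrays below are placeholders standing in for your real 7-feature inputs and scalar targets:

import numpy as np

# Placeholder data: 100 samples, 7 input features, one scalar target each
X = np.random.rand(100, 7)
y = np.random.rand(100, 1)

history = model.fit(X, y, epochs=10, batch_size=16, verbose=0)
print(history.history['loss'][0], history.history['loss'][-1])  # loss should decrease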
