Zero accuracy training a neural network in Keras
I am not sure what your problem is, but your model looks a little weird to me.
This is your model:
from keras.models import Sequential
from keras.layers import Dense, Activation, LeakyReLU

lrelu = LeakyReLU(alpha=0.1)

model = Sequential()
model.add(Dense(126, input_dim=15))  # Dense(units, input_dim=...)
model.add(lrelu)                     # same activation instance reused below
model.add(Dense(252))
model.add(lrelu)
model.add(Dense(1))
model.add(Activation('linear'))
If you visualize this model, you will see that there are two layers which could be the output layer, because you never decided which one is your actual output. I guess that's the reason you cannot make correct predictions.
If you want to implement your model like this, you should add a separate activation layer after each Dense layer, rather than reusing the same LeakyReLU instance. For example:
model = Sequential()
model.add(Dense(126, input_dim=15))  # Dense(units, input_dim=...)
model.add(LeakyReLU(alpha=0.1))      # separate activation instance
model.add(Dense(252))
model.add(LeakyReLU(alpha=0.1))
model.add(Dense(1))
model.add(Activation('linear'))
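As a quick sanity check (just a sketch that relies on the corrected model built above), you can confirm that every layer is now a distinct object:

model.summary()  # each Dense and each LeakyReLU appears as its own layer
# no shared activation instance: every layer object is distinct
assert len(set(id(layer) for layer in model.layers)) == len(model.layers)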
The problem is that your final model output has a linear activation, which makes this a regression, not a classification problem. Accuracy measures how often the model predicts the correct class, so it is effectively undefined for a regression problem, where the output is continuous.
Either get rid of accuracy as a metric and switch over to full regression, or make your problem into a classification problem, using loss='categorical_crossentropy' and activation='softmax'.
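Here is a rough sketch of both options (the optimizer, the regression metric, and the dummy num_classes are placeholders of my own, not from the original question; the layer sizes are taken from it):

from keras.models import Sequential
from keras.layers import Dense, Activation, LeakyReLU

# Option 1: keep the linear output and treat it as regression --
# drop 'accuracy' and track a regression metric such as mean absolute error.
reg_model = Sequential()
reg_model.add(Dense(126, input_dim=15))
reg_model.add(LeakyReLU(alpha=0.1))
reg_model.add(Dense(1))
reg_model.add(Activation('linear'))
reg_model.compile(optimizer='adam', loss='mse', metrics=['mae'])

# Option 2: turn it into classification -- one output unit per class,
# softmax activation, categorical cross-entropy, and one-hot encoded labels.
num_classes = 3  # placeholder; use your actual number of classes
clf_model = Sequential()
clf_model.add(Dense(126, input_dim=15))
clf_model.add(LeakyReLU(alpha=0.1))
clf_model.add(Dense(num_classes))
clf_model.add(Activation('softmax'))
clf_model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])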
This is a similar problem to yours: Link
For more information see: StackExchange
I ran into a similar problem. After trying all the suggestions and none of them working, I figured something must be wrong somewhere else.
After looking at my data distribution, I realized that I was not shuffling my data, so my training data was mostly one class and my testing data was 100% another class. After shuffling the data, the accuracy was no longer 0.0000e+00; it was something more meaningful.
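If you suspect the same issue, here is a minimal sketch of shuffling before splitting (the features and labels below are random placeholder data; scikit-learn's train_test_split shuffles by default):

import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(1000, 15)             # placeholder features
y = np.random.randint(0, 2, size=1000)   # placeholder labels

# shuffle=True (the default) mixes the classes before splitting,
# so neither the training set nor the test set is dominated by a single class
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=True, random_state=42)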