
Answer by Abhishek Verma for Incompatible shapes: [128,1] vs. [128,3,3]

Put a Flatten layer before your Dense layers. Because you are not doing that, the tensor is not reduced to a single dimension per example before the layer that outputs the class, which is what causes the incompatible-shapes error.

It is a general pattern in TensorFlow to use a Flatten layer before the layer that outputs the class, as the sketch below illustrates.
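To see why this matters, here is a minimal sketch (mine, not part of the original answer) comparing the output shapes with and without Flatten. It assumes the (128, 3, 3, 8) feature map that this model's pooling layer produces for a batch of 128 images of shape 32x32x3:

import tensorflow as tf
from tensorflow.keras.layers import Dense, Flatten

# Stand-in for the output of the last pooling layer (batch of 128).
features = tf.zeros((128, 3, 3, 8))

print(Dense(10)(features).shape)             # (128, 3, 3, 10) -> loss compares (128, 3, 3) against (128, 1) labels, hence the error
print(Dense(10)(Flatten()(features)).shape)  # (128, 10)       -> what sparse_categorical_crossentropy expects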

Also, I have removed the BatchNormalization you had put after a single, arbitrary Dense layer. A BatchNormalization layer is generally placed after a Conv layer, although you can put one after a Dense layer too. If you use BatchNormalization, make sure the whole network (or at least the relevant part of it) has it; don't just put one BatchNormalization layer in a random place.

Here's how you change your code to do that.

from tensorflow.keras import regularizers
from tensorflow.keras.layers import (BatchNormalization, Conv2D, Dense,
                                     Dropout, Flatten, MaxPooling2D)
from tensorflow.keras.models import Sequential

model = Sequential([
    Conv2D(filters=8, kernel_size=(3, 3), activation='relu',
           input_shape=(32, 32, 3), name='conv_1'),
    BatchNormalization(),
    Conv2D(filters=8, kernel_size=(3, 3), activation='relu',
           padding='SAME', name='conv_2'),
    BatchNormalization(),
    MaxPooling2D(pool_size=(8, 8), name='pool_1'),
    Flatten(),  # reduces each example to a 1-D vector before the Dense layers
    Dense(64, kernel_regularizer=regularizers.l2(0.5),
          bias_initializer='ones', activation='relu', name='dense_1'),
    Dropout(0.3),
    Dense(64, kernel_regularizer=regularizers.l2(0.5),
          activation='relu', name='dense_2'),
    Dense(64, kernel_regularizer=regularizers.l2(0.5),
          activation='relu', name='dense_3'),
    Dense(10, activation='softmax', name='dense_4')
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

history = model.fit(train_images, train_labels, epochs=30)
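As a quick check (this verification step is mine, not part of the original answer), printing the model summary should now show the Flatten layer producing a rank-2 output, (None, 72) for this architecture, and the final dense_4 layer producing (None, 10), which is compatible with the (128, 1) sparse labels:

model.summary()  # Flatten -> (None, 72), dense_4 -> (None, 10)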
