Answer by Gi Yeon Shin for Incompatible shapes: [128,1] vs. [128,3,3]

I think there are two problems in your code.

First, check the shape of train_labels. The "Incompatible shapes" error appears when tensor shapes do not match, and a shape of [128, 1] suggests that train_labels holds integer class IDs rather than one-hot vectors. If train_labels looks like [1, 3, 8, ...], you should convert it to one-hot form such as [[0, 1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0, 0, 0, 0], ...].

Second, add a Flatten layer before the Dense layer, as mentioned above:

model = Sequential([
    Conv2D(filters=8, kernel_size=(3, 3), activation='relu', input_shape=(32, 32, 3), name='conv_1'),
    ...
    MaxPooling2D(pool_size=(8, 8), name='pool_1'),
    Flatten(),
    Dense(64, kernel_regularizer ...
])
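To see why Flatten matters, here is a minimal sketch (the filter counts and class count of 10 are assumptions): without Flatten, the MaxPooling2D output is still 4-D, so a following Dense layer produces a 4-D output whose trailing dimensions ([batch, 3, 3, ...]) clash with the 2-D labels.

```python
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# Sketch model: 32x32 input, 3x3 valid conv -> (30, 30, 8),
# 8x8 pooling -> (3, 3, 8), Flatten -> (72,), Dense -> (10,)
model = Sequential([
    Conv2D(filters=8, kernel_size=(3, 3), activation='relu', input_shape=(32, 32, 1)),
    MaxPooling2D(pool_size=(8, 8)),
    Flatten(),                       # collapses (3, 3, 8) into a vector of 72 values
    Dense(10, activation='softmax')  # per-class probabilities, shape (batch, 10)
])
model.summary()
```

With Flatten in place the model output is (batch, 10), which a loss function can compare against one-hot labels of the same shape.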

An input shape of (32, 32, 1) means the last dimension should be one, so change the input_shape of Conv2D to (32, 32, 1):

Conv2D(filters=8, kernel_size=(3, 3), activation='relu', input_shape=(32, 32, 1) ...

The train_images tensor should also be changed to shape (32, 32, 1), because the images have a single channel:

train_images = tf.expand_dims(train_images, -1)
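As a quick sketch of the effect (the batch of 128 zero-filled images is a stand-in for your data): tf.expand_dims adds the missing channel axis at the end.

```python
import numpy as np
import tensorflow as tf

# Hypothetical grayscale batch without a channel axis: (128, 32, 32)
train_images = np.zeros((128, 32, 32), dtype=np.float32)

# Append a channel axis: (128, 32, 32) -> (128, 32, 32, 1)
train_images = tf.expand_dims(train_images, -1)
print(train_images.shape)  # (128, 32, 32, 1)
```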

Additionally, you can convert train_labels to one-hot vectors like this:

train_labels = tf.squeeze(train_labels)
train_labels = tf.one_hot(train_labels, depth=10)
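Putting the label fix together (the random integer labels and 10 classes are assumptions matching the shapes in the error message): squeeze drops the trailing size-1 axis, then tf.one_hot expands each class ID into a 10-element vector.

```python
import numpy as np
import tensorflow as tf

# Hypothetical integer labels with shape (128, 1), as in the error message
train_labels = np.random.randint(0, 10, size=(128, 1))

train_labels = tf.squeeze(train_labels)            # (128, 1) -> (128,)
train_labels = tf.one_hot(train_labels, depth=10)  # (128,)   -> (128, 10)
print(train_labels.shape)  # (128, 10)
```

After this, the label shape (128, 10) matches the model's per-class output, resolving the shape mismatch.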
