
Conversation


@terryum terryum commented May 17, 2016

To be honest, it's hard to understand the role of the `if 0: else:` statements in the minibatch-learning for-loop. I think using only `mnist.train.next_batch(batch_size)` can produce decent results.

I realized that if I use *random batch sampling* only, the accuracy decreases from 91-ish to 87-ish. The reason is that sampling indices with replacement doesn't cover the whole training set: some samples are drawn repeatedly while others are never drawn at all (with uniform sampling, roughly a third of the set, about 1/e of it, is never seen in any single epoch-sized pass).

I changed this part to use `np.random.permutation(n_train)` so that every sample is visited exactly once per epoch.

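For reference, here is a minimal sketch of the epoch-wise permutation I mean. The names `X_train`/`Y_train`, the placeholder data, and the commented-out `sess.run` line are assumptions for illustration; in the actual example they would correspond to `mnist.train.images`/`mnist.train.labels` and the existing training step.

```python
import numpy as np

# Hypothetical stand-ins for mnist.train.images / mnist.train.labels.
n_train, batch_size, training_epochs = 55000, 100, 25
X_train = np.zeros((n_train, 784), dtype=np.float32)  # placeholder images
Y_train = np.zeros((n_train, 10), dtype=np.float32)   # placeholder one-hot labels

for epoch in range(training_epochs):
    # One shuffled pass over the full set: every sample appears exactly once,
    # unlike sampling with replacement, which repeats some samples and skips others.
    perm = np.random.permutation(n_train)
    for i in range(0, n_train, batch_size):
        idx = perm[i:i + batch_size]
        batch_xs, batch_ys = X_train[idx], Y_train[idx]
        # sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})  # run the training op here
```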
