When you train your neural network (or evaluate it), you don't feed it just one sample at a time — you feed it a group of samples at once, and that group is called a batch. You run the feedforward pass batch by batch until you've gone through all of them. Using all of your batches once (i.e. the whole dataset once) is 1 epoch. So if you train for 10 epochs, it means you use all your data 10 times, split into batches each time.
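The relationship can be sketched in plain Python (no framework needed — the dataset size, batch size, and epoch count below are just made-up numbers for illustration):

```python
# Illustrative numbers: 1000 samples, batches of 100, trained for 10 epochs.
dataset = list(range(1000))   # pretend these are 1000 training samples
batch_size = 100              # how many samples are fed forward together
epochs = 10                   # how many times we go through the whole dataset

iterations = 0
for epoch in range(epochs):
    # One epoch: walk through the dataset in chunks of `batch_size`
    for start in range(0, len(dataset), batch_size):
        batch = dataset[start:start + batch_size]
        # ... feed `batch` forward through the network here ...
        iterations += 1

batches_per_epoch = len(dataset) // batch_size
print(batches_per_epoch)  # 10 batches per epoch
print(iterations)         # 100 iterations total (10 batches x 10 epochs)
```

One pass over a single batch is usually called an iteration (or step), so iterations = batches per epoch × epochs.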
https://www.quora.com/What-is-epochs-in-machine-learning
http://stackoverflow.com/questions/4752626/epoch-vs-iteration-when-training-neural-networks