To start with, let's ask the question: what does it mean for something to be continuous? Let's make the problem more concrete by looking at a volume slider on your iPod. For the sake of discussion, let's assume this is a small bar on the screen that can slide up to 100mm from the bottom. In the bottom position, the sound is off, and the volume is 0 decibels. In the top position, the sound is fully on, for a volume of 100 decibels. Naturally, this position is reserved for listening to Math Mutation podcasts. What does it mean to say that this volume control is continuous? Well, as you slide the lever from the bottom to the top, you expect the sound to get louder smoothly and gradually as it rises. If some position caused a sudden jump to 100 dB and back down, or sliding past some other position caused the sound to suddenly drop to a much quieter level, you would say that it's not continuous.
The confusing part comes if you point at some spot on the control, say the exact center, and ask "Is this control continuous here?" It's hard to say whether the control is continuous at that point without wiggling it around a little: after all, when it's just sitting at the middle and playing at 50 dB, you don't know how the overall control behaves. This was one of the sticking points of early calculus: fundamentally, it tried to talk about motion or change at individual points, while every individual point is inherently static. But if you wiggle the lever a little, you can quickly see that moving it one way gets you a little more than 50 dB, and moving it the other way gets you a little less, with the size of the change related to how far you move the lever. We need to somehow make use of this fact to clearly define continuity.
To do this a little more rigorously, suppose you know it's hard to set the volume lever at a precise point, but you want to guarantee an error within 1 dB of that 50% mark: that is, you want the volume to be between 49 and 51 dB. We should be able to identify a range on the volume slider, in this case a distance of 1mm on either side of that center point, such that as long as you are at least that close to the center, your error will stay within that 1 dB range you want.
In other words, we have said that if the slider is truly continuous, then for any arbitrary point on the slider, whenever we want our error to be within some designated range, we can find a distance such that if we're at least that close to our target point, we'll be within that error range. And this is precisely the epsilon-delta definition: the Greek letter epsilon can be thought of as specifying the desired error range, and the delta represents the distance we're allowed to stray from the target point.
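For the programmers listening, the definition above can be sketched as a small check in Python. This is a hypothetical model, not anything from an actual iPod: I'm assuming a perfectly linear slider where the volume in dB simply equals the position in mm, and I'm only sampling points rather than proving anything, but it shows how epsilon (the error bound) and delta (the allowed distance) play off each other.

```python
def volume_db(position_mm):
    """Hypothetical linear slider: 0mm -> 0 dB, 100mm -> 100 dB."""
    return position_mm

def check_continuity_at(f, point, epsilon, delta, samples=1000):
    """Sample positions within delta of `point` and check that each one
    keeps the volume within epsilon of f(point).  A True result suggests
    (but doesn't prove) that this delta works for this epsilon."""
    target = f(point)
    for i in range(samples + 1):
        x = point - delta + (2 * delta) * i / samples
        # The definition only constrains points strictly closer than delta.
        if abs(x - point) < delta and abs(f(x) - target) >= epsilon:
            return False
    return True

# At the 50mm center, wanting a 1 dB error bound (epsilon), staying
# within 1mm (delta) is good enough for this linear slider:
print(check_continuity_at(volume_db, 50, epsilon=1, delta=1))
# But allowing yourself a 2mm wander lets the volume drift past 1 dB:
print(check_continuity_at(volume_db, 50, epsilon=1, delta=2))
```

The first check succeeds and the second fails, which is exactly the game the definition describes: for each epsilon someone demands, continuity means you can always find some delta small enough to win.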
In calculus class, this probably sounded a lot more complicated. I'm not sure why, when trying to describe basic concepts, many textbooks devolve into a muddle of arcane symbols. I think math stuff sounds a lot more impressive when you throw in lots of Greek letters, even if this is not optimal for the reader's comprehension. Sadly, I have seen many high school math teachers who never quite grasped this concept either. It's really just a precise statement that to discuss continuity and limits, we need to directly relate expected error to distance from each point. A basic intuition about this error-vs-distance, or epsilon-vs-delta, concept is invaluable in many areas of modern mathematics.
And this has been your math mutation for today.