Difference between continuity and uniform continuity

I understand the geometric differences between continuity and uniform continuity, but I don’t quite see how the differences between those two are apparent from their definitions. For example, my book defines continuity as:

Definition 4.3.1. A function $f : A \to \mathbb{R}$ is continuous at a point $c \in A$ if, for all $\epsilon > 0$, there exists a $\delta > 0$ such that whenever $|x - c| < \delta$ (and $x \in A$) it follows that $|f(x) - f(c)| < \epsilon$.

Uniform continuity is defined as:

Definition 4.4.5. A function $f : A \to \mathbb{R}$ is uniformly continuous on $A$ if for every $\epsilon > 0$ there exists a $\delta > 0$ such that $|x - y| < \delta$ implies $|f(x) - f(y)| < \epsilon$.

I know that in Definition 4.3.1, $\delta$ can depend on $c$, while in Definition 4.4.5, $\delta$ cannot depend on $x$ or $y$, but how is this apparent from the definitions? As far as I can see, the only difference between Definition 4.3.1 and Definition 4.4.5 is that the letter $c$ has been changed to a $y$.

My guess is that the first definition treats $c$ as a fixed point and only $x$ varies, so $\delta$ can depend on $c$ since $c$ doesn't change. In the second definition, neither $x$ nor $y$ is fixed; both can take values anywhere in the domain $A$. In that case, if we chose a $\delta$ that depended on $y$, then when we pick a different $y$ the same $\delta$ might no longer work. Is this interpretation roughly correct?
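If I write "continuous at every point of $A$" with the quantifiers made fully explicit (my own restatement of the book's definitions, so I may be off), the difference seems to be where $\delta$ is chosen relative to the point:

$$\text{Continuity on } A:\quad \forall c \in A \;\; \forall \epsilon > 0 \;\; \exists \delta > 0 \;\; \forall x \in A:\; |x - c| < \delta \implies |f(x) - f(c)| < \epsilon$$

$$\text{Uniform continuity on } A:\quad \forall \epsilon > 0 \;\; \exists \delta > 0 \;\; \forall x, y \in A:\; |x - y| < \delta \implies |f(x) - f(y)| < \epsilon$$

In the first statement $\delta$ is chosen after $c$, so it may depend on $c$; in the second it is chosen before $x$ and $y$, so it may not.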

Any further clarifications or examples would be appreciated.

Answer

First of all, continuity is defined at a point $c$, whereas uniform continuity is defined on a set $A$. That makes a big difference.
But your interpretation is essentially correct: the point $c$ is part of the data and is kept fixed, just like $f$ itself. Roughly speaking, uniform continuity requires the existence of a single $\delta > 0$ that works on the whole set $A$, not just near a single point $c$.
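For a concrete illustration (a standard example, sketched here rather than taken from the original answer), take $f(x) = x^2$ on $\mathbb{R}$. At a fixed point $c$ we have $|x^2 - c^2| = |x - c|\,|x + c|$, and if $|x - c| < 1$ then $|x + c| < 2|c| + 1$, so

$$\delta(c) = \min\!\left(1, \frac{\epsilon}{2|c| + 1}\right)$$

works for continuity at $c$, but this $\delta(c)$ shrinks to $0$ as $|c|$ grows. And indeed no single $\delta > 0$ works on all of $\mathbb{R}$: given any $\delta > 0$, take $x = n + \delta/2$ and $y = n$; then $|x - y| < \delta$, yet $|x^2 - y^2| = n\delta + \delta^2/4 \to \infty$ as $n \to \infty$, so it eventually exceeds any fixed $\epsilon$. Hence $f$ is continuous at every point of $\mathbb{R}$ but not uniformly continuous on $\mathbb{R}$.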

Attribution
Source: Link, Question Author: user124005, Answer Author: Siminore
