**Theorem:** Suppose that $f : A \to \mathbb{R}$ where $A \subseteq \mathbb{R}$. If $f$ is differentiable at $x \in A$, then $f$ is continuous at $x$.

By the contrapositive, this theorem is equivalent to the result that if $f$ is not continuous at $x \in A$, then $f$ is not differentiable at $x$.
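For completeness, here is the standard one-line argument for the theorem (using the product rule for limits), written in the notation of the difference quotient below:

```latex
% For t \neq 0 with x + t \in A, write the increment as a product:
%   f(x+t) - f(x) = t \cdot \frac{f(x+t) - f(x)}{t}.
% Letting t \to 0, the quotient tends to f'(x) while the factor t tends to 0:
\lim_{t \to 0} \bigl( f(x+t) - f(x) \bigr)
  = \Bigl( \lim_{t \to 0} t \Bigr)
    \cdot
    \Bigl( \lim_{t \to 0} \frac{f(x+t) - f(x)}{t} \Bigr)
  = 0 \cdot f'(x) = 0.
% Hence \lim_{t \to 0} f(x+t) = f(x), i.e. f is continuous at x.
```

Note that the argument *derives* continuity from differentiability, which is exactly why the Answer below calls a continuity hypothesis redundant.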

Why, then, do the authors of almost every analysis book not take continuity of $f$ as a requirement in the definition of the derivative of $f$, when we (seemingly) end up with equivalent results?

For example, I don’t see why the following wouldn’t be a good definition of the derivative of a function:

**Definition:** Let $A \subseteq \mathbb{R}$, let $a \in \operatorname{Int}(A)$, and let $f : A \to \mathbb{R}$ be a function continuous at $a$. We define the derivative of $f$ at $a$ to be $$f'(a) = \lim_{t \to 0}\frac{f(a+t)-f(a)}{t}$$

provided the limit exists.

I know this is probably a pedagogical issue, but why not take this instead as the definition of the derivative of a function?

**Answer**

Definitions tend to be minimalistic, in the sense that they don’t include unnecessary/redundant information that can be derived as a consequence.

It is the same reason why, for example, an equilateral triangle is defined as having all sides equal, rather than having all sides *and* all angles equal.

**Attribution**
*Source: Link, Question Author: Perturbative, Answer Author: dxiv*