# Finding a point along a line a certain distance away from another point!

Let’s say you have two points, $(x_0, y_0)$ and $(x_1, y_1)$.

The gradient of the line between them is:

$$m = \frac{y_1 - y_0}{x_1 - x_0}$$

And therefore the equation of the line between them is:

$$y = m(x - x_0) + y_0$$

Now, since I want another point along this line, but a distance $d$ away from $(x_0, y_0)$, I will get an equation of a circle with radius $d$ with a center $(x_0, y_0)$ then find the point of intersection between the circle equation and the line equation.

Circle equation with radius $d$ and center $(x_0, y_0)$:

$$(x - x_0)^2 + (y - y_0)^2 = d^2$$
Now, if I replace $y$ in the circle equation with $m(x - x_0) + y_0$, I get:

$$(x - x_0)^2 + m^2(x - x_0)^2 = d^2$$
I factor out $(x - x_0)^2$ and simplify, and I get:

$$(1 + m^2)(x - x_0)^2 = d^2 \quad\Longrightarrow\quad x = x_0 \pm \frac{d}{\sqrt{1 + m^2}}$$
However, upon testing this equation, it seems that it does not work! Is there an obvious error in my theory, or have I just been fluffing up my calculations?

Let $\mathbf v = (x_1,y_1)-(x_0,y_0)$. Normalize this to $\mathbf u = \frac{\mathbf v}{||\mathbf v||}$.
The point along your line at a distance $d$ from $(x_0,y_0)$ is then $(x_0,y_0)+d\mathbf u$, if you want it in the direction of $(x_1,y_1)$, or $(x_0,y_0)-d\mathbf u$, if you want it in the opposite direction. One advantage of doing the calculation this way is that you won’t run into a problem with division by zero in the case that $x_0 = x_1$.
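A minimal sketch of this vector approach in Python (the function name is my own choice):

```python
import math

def point_at_distance(x0, y0, x1, y1, d):
    """Return the point at distance d from (x0, y0) toward (x1, y1)."""
    vx, vy = x1 - x0, y1 - y0          # direction vector v
    norm = math.hypot(vx, vy)          # ||v||
    ux, uy = vx / norm, vy / norm      # unit vector u
    return (x0 + d * ux, y0 + d * uy)  # pass -d for the opposite direction

# Works even for a vertical line (x0 == x1), where the slope m is undefined:
print(point_at_distance(1.0, 1.0, 1.0, 5.0, 2.0))  # (1.0, 3.0)
```

Note that there is no division by a coordinate difference anywhere, only by $\|\mathbf v\|$, which is zero only when the two input points coincide.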