Finding a point along a line a certain distance away from another point!

Let’s say you have two points, $(x_0, y_0)$ and $(x_1, y_1)$.

The gradient of the line between them is:

$$m = \frac{y_1 - y_0}{x_1 - x_0}$$

And therefore the equation of the line between them is:

$$y = m(x - x_0) + y_0$$

Now, since I want another point along this line at a distance $d$ from $(x_0, y_0)$, I will write the equation of a circle with radius $d$ centred at $(x_0, y_0)$ and then find the points of intersection between the circle and the line.

Circle equation with radius $d$:

$$(x - x_0)^2 + (y - y_0)^2 = d^2$$

Now, if I replace $y$ in the circle equation with $m(x - x_0) + y_0$, I get:

$$(x - x_0)^2 + m^2(x - x_0)^2 = d^2$$

I factor out $(x - x_0)^2$ to get $(1 + m^2)(x - x_0)^2 = d^2$, and solving for $x$ gives:

$$x = x_0 \pm \frac{d}{\sqrt{1 + m^2}}$$

However, upon testing this equation out, it seems that it does not work! Is there an obvious error that I have made on the theoretical side, or have I just been fluffing up my calculations?
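
To make the derivation concrete, here is a minimal numeric sketch of the formula above; the coordinates and distance are made up for illustration, and the corresponding $y$ is taken from the line equation:

```python
import math

# Illustrative values (not from the original post)
x0, y0 = 1.0, 2.0
x1, y1 = 4.0, 6.0
d = 2.0

# Slope of the line through the two points (undefined when x0 == x1)
m = (y1 - y0) / (x1 - x0)

# x from the derived formula, then y from the line equation
for sign in (+1, -1):
    x = x0 + sign * d / math.sqrt(1 + m**2)
    y = m * (x - x0) + y0
    # Distance of each candidate point from (x0, y0)
    print((x, y), math.hypot(x - x0, y - y0))
```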

Answer

Another way, using vectors:

Let $\mathbf{v} = (x_1, y_1) - (x_0, y_0)$. Normalize this to $\mathbf{u} = \frac{\mathbf{v}}{\|\mathbf{v}\|}$.

The point along your line at a distance $d$ from $(x_0, y_0)$ is then $(x_0, y_0) + d\mathbf{u}$, if you want it in the direction of $(x_1, y_1)$, or $(x_0, y_0) - d\mathbf{u}$, if you want it in the opposite direction. One advantage of doing the calculation this way is that you won’t run into a problem with division by zero in the case that $x_0 = x_1$.
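
As a rough sketch, the vector approach might look like this in code; the function name and the sample coordinates are just illustrative choices:

```python
import math

def point_at_distance(x0, y0, x1, y1, d):
    """Return the point at distance d from (x0, y0) toward (x1, y1)."""
    vx, vy = x1 - x0, y1 - y0           # v = (x1, y1) - (x0, y0)
    length = math.hypot(vx, vy)         # ||v||
    ux, uy = vx / length, vy / length   # u = v / ||v||
    return x0 + d * ux, y0 + d * uy     # (x0, y0) + d*u; use -d for the opposite direction

# Example: works even for a vertical line (x0 == x1)
print(point_at_distance(3.0, 1.0, 3.0, 5.0, 2.0))  # (3.0, 3.0)
```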

Attribution
Source: Link, Question Author: Kel196, Answer Author: Théophile
