### How fast will 1g get you there?

• If you have the energy for a constant 1G thrust, how long would it take to get to the planets in our solar system? How long for the 5 nearest solar systems?

Assuming you flip over at the halfway point and decelerate the rest of the way.

"1g of thrust" pointed straight up will balance gravity and leave you floating. "1g" (as I read it) is the acceleration caused by the Earth's gravity; if that's how you actually define it, then your acceleration decreases as you get further from Earth (and feel less pull). Of course, you don't need to point straight up, and TidalWave's assumption that what you *meant is 9.8 m/s/s* is probably correct - but note that even so, his answer provides you with a *minimum*, e.g. assuming you could turn off gravity and the atmosphere (plus the assumptions he mentions at the top).

@hunter2, you are correct: 1g of thrust will not get you off the planet. The assumption is that the starting point is in orbit; 1g of thrust during a long trip provides both propulsion and simulated gravity.


Assuming acceleration is constant, $d=(1/2) a t^2$. So plotted over time, distance traveled is a nice parabola.

If you want the time it'd take for a specific distance, it's easy to manipulate $d=(1/2) a t^2$.

$t=\sqrt{2d/a}$

If you're using meters and seconds as your units, $a = 9.8\ \mathrm{m/s^2}$.

To travel half the distance to the moon would take about 1.75 hours. The other half distance spent decelerating would take the same amount of time.
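As a quick sanity check of the moon figure, here is a short sketch of the flip-and-burn calculation (the Earth-Moon distance used below is the mean value, an assumption not stated in the original):

```python
import math

g = 9.8                  # acceleration, m/s^2
d_moon = 3.844e8         # mean Earth-Moon distance, metres (assumed)

# Accelerate to the midpoint, then flip over and decelerate for the second half.
t_half = math.sqrt(2 * (d_moon / 2) / g)    # t = sqrt(2d/a) for each leg
t_total = 2 * t_half

print(t_half / 3600)     # hours per leg, roughly 1.74
```

Each leg comes out to roughly 1.74 hours, consistent with the "about 1.75 hours" quoted above.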

Using days and AU (astronomical units), we can see 3 days will get you about 2.2 AU (not far from halfway to Jupiter). 4.5 days will get you 5 AU (halfway to Saturn). 9 days will get you 20 AU (more than halfway to the Kuiper belt).
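The same formula reproduces these interplanetary figures; a minimal sketch, assuming continuous acceleration (no turnaround) as in the paragraph above:

```python
import math

g = 9.8                 # m/s^2
AU = 1.496e11           # one astronomical unit in metres
DAY = 86400.0           # seconds per day

def distance_au(days):
    """Distance covered from rest after `days` of constant 1 g acceleration."""
    t = days * DAY
    return 0.5 * g * t ** 2 / AU

for days in (3, 4.5, 9):
    print(days, round(distance_au(days), 1))   # ~2.2, ~5.0, ~19.8 AU
```

Note the quadratic growth: tripling the time (3 days to 9 days) multiplies the distance by nine.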

It gets trickier for interstellar distances. In Newtonian mechanics v = at, so it'd take a little less than a year to reach c at 1 g acceleration. But relativity won't allow that, we can only get close to c.

Our Newtonian model is okay for nearly a year of acceleration; after that, relativity wrecks this nice parabola.

After 1 year at 1 g we will have traveled about 0.5 light-years and our velocity will be close to maxed out. Thereafter we're moving at close to c, so add a little more than a year for each light-year of distance.
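For readers who want the exact relativistic numbers, the standard constant-proper-acceleration (relativistic rocket) formulas give both the Earth-frame time and the ship-clock time for a given distance. This is a sketch for the no-turnaround case, matching the continuous acceleration discussed above:

```python
import math

c = 299_792_458.0        # speed of light, m/s
g = 9.8                  # proper acceleration, m/s^2
LY = 9.4607e15           # one light-year in metres
YEAR = 365.25 * 86400    # one year in seconds

def coord_time(d):
    """Earth-frame time to cover distance d, starting from rest."""
    return math.sqrt((d / c) ** 2 + 2 * d / g)

def proper_time(d):
    """Ship-clock (proper) time for the same trip."""
    return (c / g) * math.acosh(1 + g * d / c ** 2)

# One light-year: ~1.71 years Earth-frame, ~1.29 years ship time
print(coord_time(LY) / YEAR, proper_time(LY) / YEAR)
```

For large d, coord_time approaches d/c plus a constant (a bit under a year), which is exactly the "add a little more than a year for each light-year" rule of thumb, while proper_time grows only logarithmically.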

Your "add a little more than a year for each lightyear distance" is correct for an outside observer, but for someone aboard the ship, the Newtonian model is correct for all distances (as measured before starting acceleration): Lorentz contraction will shrink the universe during travel to give the appearance of Newtonian physics.

Beautiful answer. I just want to point out that since the entire question is theoretical, why not ignore mass? If we allow ourselves to assume a = 9.8 m/s/s, then it's not dependent on mass, so relativity isn't a big problem.

@Mark I broke the travel into 35.4-day increments, each increment adding 0.1 c. After 354 days I got about 0.76 c, with the passengers perceiving 300 days. I'm not sure that's correct; I'm not comfortable with special relativity. I don't think either an outside observer or the accelerating passengers would see what appears to be a Newtonian universe.
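A quick check of the ~0.76 c figure: if each 0.1 c increment is measured in the ship's instantaneous rest frame, the increments combine via rapidity (which adds linearly under constant proper acceleration), not by simple addition. A minimal sketch, assuming 354 days is ship-clock time:

```python
import math

c = 299_792_458.0
g = 9.8
DAY = 86400.0

tau = 354 * DAY              # ship-clock (proper) time, seconds
rapidity = g * tau / c       # rapidities add linearly at constant proper acceleration
v = math.tanh(rapidity)      # final speed as a fraction of c

print(v)                     # about 0.76
```

So the incremental method lands on the correct relativistic answer, ~0.76 c, even though naive addition would have given 1.0 c.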

@Mark not quite true. After all, if you observe things falling in constant gravity, you still can't see them go beyond speed of light, meaning that Newtonian physics doesn't hold even as seen by the ship.

@MaudPieTheRocktorate, if you observe things falling in constant gravity, you're the outside observer, not the person aboard the ship.