How long does a rifle bullet take to travel a mile?

It could be useful for several things. For example, it would be necessary for hitting a moving target. It would also be useful in determining the distance you were being shot from, i.e. a bullet hits near you, then you hear the gun's report. If you knew the time between these two events, how far away was the shooter?
 
And there are going to be two solutions:

One for the shallowest angle at which the bullet can be shot to travel a mile, and one for the high-trajectory path in which the bullet is launched at a more nearly vertical angle and falls back to earth a mile away after its vertical velocity reaches zero at the apex.

I assume you are interested in the first, rather than the second.
Yes, I was not thinking of artillery.
 
It could be useful for several things. For example, it would be necessary for hitting a moving target. It would also be useful in determining the distance you were being shot from, i.e. a bullet hits near you, then you hear the gun's report. If you knew the time between these two events, how far away was the shooter?
Quigley Down Under, much?
 
As several have implied, what you're after is, strictly speaking, the result of an integral. However, you can get not only into the ballpark, but into the infield, without knowing any calculus.

Use any one of the available ballistic calculators (Hornady's is a good one). If the one you choose gives the exact flight time, you're done, but most don't. In that case you can dummy up the calculus by using the finest interval they offer, say 100 yards. Input the ballistic coefficient (you can get that from the loading manuals or from the manufacturer for your particular bullet), the muzzle velocity, and the distance (1760 yards). You can ignore wind, sight height, humidity, etc., and go with the defaults.

Once you have the output table, just divide each interval distance (300 feet if you chose 100-yard intervals) by the bullet velocity for that interval, then add up all the times you get. For the example here, you will do 17 easy divisions, plus a short one for the last 60 yards.

It's OK to assume the bullet's velocity is constant during each interval. That's essentially what the integral does, except that the intervals shrink toward zero length and their number grows toward infinity. The difference between this piecewise approximation and the true integral would be insignificant for your purposes.
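For anyone who would rather let a few lines of code do the adding, here is a minimal sketch of that interval method. The velocity table below is a made-up placeholder, not output from any particular calculator or load; substitute the 100-yard table your own ballistic calculator prints.

```python
# Sketch of the interval method described above: approximate flight time
# as the sum of (interval distance) / (velocity over that interval).
# The velocities are hypothetical placeholders; replace them with the
# 100-yard table from your ballistic calculator.

FEET_PER_YARD = 3
INTERVAL_YARDS = 100
MILE_YARDS = 1760

# Hypothetical downrange velocities (fps) at 0, 100, ..., 1700 yards.
velocities_fps = [2600, 2515, 2432, 2350, 2270, 2192, 2115, 2040,
                  1966, 1894, 1823, 1754, 1686, 1620, 1556, 1493,
                  1432, 1373]

time_s = 0.0
remaining_yards = MILE_YARDS
for v in velocities_fps:
    step_yards = min(INTERVAL_YARDS, remaining_yards)   # final piece is only 60 yards
    time_s += step_yards * FEET_PER_YARD / v            # time = distance / velocity
    remaining_yards -= step_yards
    if remaining_yards == 0:
        break

print(f"Approximate flight time to one mile: {time_s:.2f} s")
```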

Ok, I can see how to do this. Where does one find the ballistic coefficient of a 20mm round (and .50 BMG, .338 Lapua, etc.)?
 
That's just basic arithmetic.

5280 (feet in a mile) divided by 2600 (feet per second) equals 2.0307 seconds (at least).

Then add just a few milliseconds because the bullet does slow down just a bit.

It is that extra time that I am looking for.


It could be useful for several things. For example, it would be necessary for hitting a moving target. It would also be useful in determining the distance you were being shot from, i.e. a bullet hits near you, then you hear the gun's report. If you knew the time between these two events, how far away was the shooter?
How do you propose to usefully time such an event in milliseconds? :scrutiny:
 
The OP said that he was interested in the time down to the millisecond; that's an accuracy of 1/1000 of a second.
2.86 seconds is only down to 1/100 of a second, and 2.8 seconds is only accurate to 1/10 of a second, so neither of those answers works. ;)

Actually, I guess I could see this kind of question if you wanted to shoot a moving animal that is a mile away. An animal moving at 25 mph will move about 0.44 inches in a millisecond. Let's assume you have a kill zone of 8 inches; if you aim at the center of the kill zone, then you don't want an error of more than 4 inches from the point of aim. If your estimate of the flight time were off by more than about 9 milliseconds, you'd hit the target outside the kill zone. If the animal were moving faster than 25 mph, you would have to be even more accurate.

There are two problems, though: I doubt you'll ever find a ballistics calculator that is accurate down to the millisecond, and I think it would be pretty unethical to even take the shot.
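For anyone who wants to check the lead arithmetic above, here is a quick sketch using the post's assumed numbers (25 mph crossing speed, an 8-inch kill zone, aiming at its center); the figures are for illustration only.

```python
# Quick check of the lead arithmetic in the post above, using its own
# assumed numbers. This is a sketch of the reasoning, not a field-ready
# calculation.

speed_mph = 25
kill_zone_in = 8.0

# Convert mph to inches per millisecond: mi/h -> ft/h -> in/h -> in/s -> in/ms
speed_in_per_ms = speed_mph * 5280 * 12 / 3600 / 1000    # ~0.44 in/ms

allowed_miss_in = kill_zone_in / 2                        # 4 inches either side of center
timing_slack_ms = allowed_miss_in / speed_in_per_ms       # ~9 ms

print(f"Target motion: {speed_in_per_ms:.2f} in/ms")
print(f"Allowable flight-time error: {timing_slack_ms:.1f} ms")
```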

I recently watched the movie American Sniper, and when he finally gets the opposing sniper it's from some huge distance, I think over a mile. The movie emphasized it by ostensibly showing the bullet on its way, taking what seemed like a long time, and in the end hitting the opposing sniper exactly between the eyes. My knowledge of these things is very limited, and although I could imagine him hitting the target if the target didn't move, I was thinking, "Right between the eyes? Really???" Plus, in two-plus seconds, what are the odds the target would hold perfectly still?
 
Not really.

I have read about Carlos Hathcock making some very long range shots. At those distances time to impact is a factor.

Yes, but not down to milliseconds or centiseconds. Flight time, in fact, like many other variables, changes from one shot to the next.
So one should not worry about such minute variations; they have no relevance in the practical world.
A range estimation error of 1 or 5 yards, for example, has more effect, and we know that is unavoidable, yet even that has no real impact in practical terms.
 
Not really.

I have read about Carlos Hathcock making some very long range shots. At those distances time to impact is a factor.
It just struck me at the moment because the question, "how long from the time the bullet hit until you heard the shot?", was so similar. No offense meant.
 
According to JBM Ballistics, a .338 Lapua firing a 300-grain Sierra MatchKing bullet at sea level with a muzzle velocity of 2750 fps will travel a mile in about 3.1 seconds.
 
It could be useful for several things. For example, it would be necessary for hitting a moving target. It would also be useful in determining the distance you were being shot from, i.e. a bullet hits near you, then you hear the gun's report. If you knew the time between these two events, how far away was the shooter?

Flechette, that is just mean. I'll be up all night figuring that one out.
 
Flechette, those cameras are amazing. They can also record a bee flying so that one can see its wings in motion, yet it has nothing to do with the practical application of ballistics.

I guess I am wondering how much delay there is in the bullet arriving on target at such long distances. For military purposes, a target can walk a significant distance in 3 seconds. Since the bullet is slowing down all the way to the target, a simple calculation using a rangefinder and muzzle velocity might not work.
 
Flechette, that is just mean. I'll be up all night figuring that one out.

It's an old trick based on the speed of sound: you multiply the speed of sound by the number of seconds between the impact and the report of the shot. The problem is that your range estimate is only as accurate as your estimate of the speed of sound (specifically where you are) and of how much time elapsed from the impact to the moment you heard the report. If you assume the speed of sound to be 1100 fps, then the sound of the shot travels 1100 feet every second; if there were 2 seconds between the impact and the report, the shooter is approximately 2200 feet (733 yards) from the point of impact.
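Here is a minimal sketch of that sound-delay estimate, using the post's own example numbers (1100 fps for the speed of sound, a 2-second gap). The speed of sound varies with temperature, and this simple form also ignores the bullet's own flight time, so treat the answer as a rough estimate.

```python
# Sketch of the sound-delay range estimate described above, using the
# post's example numbers. The 1100 fps speed of sound is approximate
# (it varies with temperature), and this simple form ignores the
# bullet's own flight time, so the result is rough at best.

speed_of_sound_fps = 1100
seconds_impact_to_report = 2.0

range_ft = speed_of_sound_fps * seconds_impact_to_report
print(f"Estimated shooter distance: {range_ft:.0f} ft (~{range_ft / 3:.0f} yards)")
```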
 
That's just basic arithmetic.

5280 (feet in a mile) divided by 2600 (feet per second) equals 2.0307 seconds (at least).

Then add just a few milliseconds because the bullet does slow down just a bit.

Aarond
.
It's a bit more complicated than that....
Even sophisticated battleship fire control systems, which account for pitch and roll, humidity, firing platform movement, the Coriolis effect, and so on, still need spotters at long ranges most of the time.
Well, to be fair, they are shooting at a moving target that is 10 to 15 miles distant, and even so, they are a bit more accurate than you think.

During the Battle of Surigao Strait, the USS West Virginia opened fire at approximately 22,800 yards (13 miles), at night, on the IJN Yamashiro, which was closing at full steam, and achieved hits with the first salvo.

In one of the longest engagements on record, the Scharnhorst opened fire on the HMS Glorious at approximately 26,500 yards (15 miles) and scored a hit with the third salvo.

Even in WW1, naval gunfire was pretty darn good. During the opening action of the Battle of Jutland, at a range of about 7-1/2 miles, in heavy seas, at dusk, and while receiving fire from two battlecruisers, the SMS Moltke scored 9 hits on the HMS Tiger in 12 salvos...

The West Virginia engagement is the equivalent of shooting at an average-sized rat walking across the range at 400 yards, and hitting him, with the first shot.
 
The problem with all of the above calculations is that they assume the bullet will travel in a straight line and maintain muzzle velocity for the whole mile. The trajectory is going to be very arched, meaning that for a bullet to hit a target a mile away it will have to travel a much greater distance. I'm not smart enough with the math to figure it out, but you're going to have to aim about 100 feet high to hit a target 1 mile away; that's looking at a 300 WM with 180-grain bullets with a BC above .5, anyway.

Also, most bullets starting at 2700 fps will be down to about 1300 fps at a half mile and roughly 700 fps at a mile. Figuring in the extra distance the bullet travels in an arc vs. a straight line, impact speeds could be closer to 500 fps. I think it will be a bit more than 2-3 seconds; I'm guessing as much as 4-5 seconds.
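As a rough cross-check of that guess, here is a sketch using only the post's own figures (2700 fps at the muzzle, about 1300 fps at a half mile, about 700 fps at a mile) and a simple average velocity over each half mile; it ignores the arc entirely.

```python
# Rough cross-check of the 4-5 second guess above, using the post's own
# velocity figures and an average velocity over each half-mile segment.
# This ignores the arched path and is only a back-of-the-envelope sketch.

half_mile_ft = 5280 / 2

avg_first_half_fps = (2700 + 1300) / 2    # 2000 fps average, muzzle to 1/2 mile
avg_second_half_fps = (1300 + 700) / 2    # 1000 fps average, 1/2 mile to 1 mile

flight_time_s = half_mile_ft / avg_first_half_fps + half_mile_ft / avg_second_half_fps
print(f"Rough flight time: {flight_time_s:.1f} s")   # about 4 seconds
```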
 
The problem with all of the above calculations is that they assume the bullet will travel in a straight line and maintain muzzle velocity for the whole mile. The trajectory is going to be very arched, meaning that for a bullet to hit a target a mile away it will have to travel a much greater distance. I'm not smart enough with the math to figure it out, but you're going to have to aim about 100 feet high to hit a target 1 mile away; that's looking at a 300 WM with 180-grain bullets with a BC above .5, anyway.

Also, most bullets starting at 2700 fps will be down to about 1300 fps at a half mile and roughly 700 fps at a mile. Figuring in the extra distance the bullet travels in an arc vs. a straight line, impact speeds could be closer to 500 fps. I think it will be a bit more than 2-3 seconds; I'm guessing as much as 4-5 seconds.
So if the speed decreases so much, what effect does that have on lethality when the bullet finally hits?
 
So if the speed decreases so much, what effect does that have on lethality when the bullet finally hits?

The article I linked to in Post #20 says a bullet needs to be traveling at least 300 fps to be lethal. Of course, that general statement does not consider bullet placement, which would take this in another direction entirely!!
 
It could be useful for several things. For example, it would be necessary for hitting a moving target. It would also be useful in determining the distance you were being shot from, i.e. a bullet hits near you, then you hear the gun's report. If you knew the time between these two events, how far away was the shooter?

For this to be accurate, you would have to run around with your finger on the button of a stopwatch, and your reflexes and reaction time would have to be known and consistent.
 