svtruth
This cries out for empirical verification, with video.
Wouldn't that be fun?
> Why?

It could be useful for several things. For example, it would be necessary for hitting a moving target. It would also be useful in determining the distance you were being shot from, i.e. a bullet hits near you, then you hear the gun's report. If you knew the time between these two events, how far away was the shooter?
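For anyone who wants to play with that, here's a minimal Python sketch of the arithmetic, assuming a constant average bullet speed and a round number for the speed of sound (both illustrative guesses, not measured values):

```python
# Shooter distance from the bullet-hit-to-report gap.
# Both speeds are illustrative assumptions: sound travels at roughly
# 1125 fps near sea level, and 2000 fps is a guessed *average* bullet
# speed over the whole flight, not a muzzle velocity.

SPEED_OF_SOUND_FPS = 1125.0
AVG_BULLET_FPS = 2000.0

def shooter_distance_ft(gap_s: float) -> float:
    # Bullet arrives at d / v_avg, sound at d / c, so the observed gap
    # is d * (1/c - 1/v_avg); solve that for d.
    return gap_s / (1.0 / SPEED_OF_SOUND_FPS - 1.0 / AVG_BULLET_FPS)

print(f"{shooter_distance_ft(0.5):.0f} ft")  # a half-second gap -> ~1286 ft
```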
Yes, I was not thinking of artillery. And there are going to be two solutions:
One for the most shallow angle at which the bullet can be shot to travel a mile, and one for the high trajectory path in which the bullet is launched at a more vertical angle and falls back to earth a mile away after reaching zero velocity at the apex.
I assume you are interested in the first, rather than the second.
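For what it's worth, the two solutions fall straight out of the no-drag textbook range formula, R = v² sin(2θ)/g, which is satisfied by both θ and 90° − θ. Drag breaks the exact symmetry, so treat this Python sketch as an illustration only (the 2600 fps figure is just the number used elsewhere in the thread):

```python
import math

# No-drag range: R = v**2 * sin(2*theta) / g. Solving for theta gives two
# angles, theta and (90 - theta) degrees -- the shallow shot and the
# lofted one described above.

g = 32.174   # ft/s^2
v = 2600.0   # fps, muzzle velocity figure used elsewhere in the thread
R = 5280.0   # one mile in feet

two_theta_deg = math.degrees(math.asin(R * g / v**2))
shallow = two_theta_deg / 2.0
lofted = 90.0 - shallow
print(f"shallow: {shallow:.2f} deg, lofted: {lofted:.2f} deg")
# shallow: 0.72 deg, lofted: 89.28 deg (vacuum only)
```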
> It could be useful for several things. For example, it would be necessary for hitting a moving target. [...]

Quigley down under much?
As several have implied, what you're after is the result of an integral calculus function. However, you can get not only in the ball park, but in the infield without knowing calculus.
Use any one of the available ballistic calculators (Hornady's calculator is a good one). If the one you choose gives the exact flight time, then you're done, but most don't. In that case you can dummy up the calculus by using the most granular interval they have, say 100 yards. Input the Ballistic Coefficient (you can get that from the loading manuals or the manufacturer for your particular bullet), the muzzle velocity, and the distance (1760 yards). You can ignore wind, sight height, humidity, etc. - go with the defaults.
Once you have the output table, just divide each interval distance (300 feet if you chose 100 yard intervals) by the bullet velocity for that interval. Then add up all the time factors you get. For the example here, you will do 17 easy divisions.
It's ok to assume the bullet's velocity was a constant during the interval. That's the way an integral calculus function would do it, but the intervals would tend towards 0 feet with an almost infinite number of intervals. The difference between the pseudo way I showed and the real function would be insignificant for your purposes.
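If anyone would rather script those divisions than do them by hand, here's a minimal sketch of the method. The velocity table is made-up filler; swap in the numbers your calculator prints for your bullet:

```python
# The interval-summing method from the post above: divide each interval's
# distance by the velocity reported for that interval, then add up the
# times. The velocities below are made-up placeholders -- use the table
# your ballistic calculator prints for your actual load.

INTERVAL_FT = 300.0  # 100-yard intervals

velocities_fps = [  # one placeholder entry per 100-yard interval
    2600, 2510, 2420, 2330, 2240, 2160, 2080, 2000, 1920,
    1850, 1780, 1710, 1640, 1580, 1520, 1460, 1400,
]  # 17 intervals, matching the "17 easy divisions" above

time_of_flight = sum(INTERVAL_FT / v for v in velocities_fps)
print(f"approximate time of flight: {time_of_flight:.3f} s")
```

With real calculator numbers this lands very close to what a true integral would give, for exactly the reason the post explains.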
> Quigley down under much?

Not really.
That's just basic arithmetic.
5280 (feet in a mile) divided by 2600 (feet per second) equals 2.0307 seconds (at least).
Then add just a few milliseconds because the bullet does slow down just a bit.
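As the posts below point out, though, drag adds far more than a few milliseconds. A quick illustrative check, borrowing the ~700 fps retained-velocity figure from later in the thread and assuming, purely for illustration, a straight-line velocity fade over the mile:

```python
# No-drag floor vs. a crude drag-adjusted figure. The 700 fps retained
# velocity is borrowed from a later post; the linear velocity fade is an
# illustrative assumption, not real ballistics.

D, V0, V1 = 5280.0, 2600.0, 700.0

print(f"no-drag floor: {D / V0:.4f} s")   # 2.0308 s

steps = 10_000
dx = D / steps
# integrate dt = dx / v(x) with v falling linearly from V0 to V1 over D
t = sum(dx / (V0 + (V1 - V0) * ((i + 0.5) * dx / D)) for i in range(steps))
print(f"with a linear fade: {t:.2f} s")   # ~3.65 s, not "a few ms" extra
```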
It is that extra time that I am looking for.
Why?
> It could be useful for several things. For example, it would be necessary for hitting a moving target. [...]

How do you propose to usefully time such an event in milliseconds?
The OP said that he was interested in the time down to the millisecond; that's an accuracy of 1/1000 of a second.
2.86 seconds is only down to 1/100 of a second and 2.8 seconds is only accurate to 1/10 of a second, so neither of those answers works.
Actually, I guess I could see this kind of question if you wanted to shoot a moving animal that is a mile away. An animal moving at 25 mph will move about .44 inches in a millisecond. Let's assume that you have a kill zone of 8 inches: if you aim at the center of the kill zone, then you don't want an error of more than 4 inches from the point of aim. If your flight-time estimate were off by more than 9 milliseconds, you'd hit the target outside the kill zone. If the animal were moving faster than 25 mph you would have to be even more accurate.
There are two problems, though: I doubt you'll ever find a ballistics calculator that could be accurate down to the millisecond, and I think it would be pretty unethical to even take the shot.
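For what it's worth, here are those numbers run straight through; every input is a figure from the post above:

```python
# The moving-target numbers from the post above, run straight through.

MPH_TO_IN_PER_S = 5280.0 * 12.0 / 3600.0  # 17.6 inches/s per mph

animal_mph = 25.0
in_per_ms = animal_mph * MPH_TO_IN_PER_S / 1000.0
print(f"movement per millisecond: {in_per_ms:.2f} in")  # 0.44 in

kill_zone_in = 8.0
max_miss_in = kill_zone_in / 2.0          # aiming at the center
budget_ms = max_miss_in / in_per_ms
print(f"timing error budget: {budget_ms:.1f} ms")       # ~9.1 ms
```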
Not really.
I have read about Carlos Hathcock making some very long range shots. At those distances time to impact is a factor.
> Not really. I have read about Carlos Hathcock making some very long range shots. At those distances time to impact is a factor.

Just struck me at the moment because the question, "how long from the time the bullet hit until you heard the shot!?", was so similar; no offense meant.
Define "usefully".How do you propose to usefully time such an event in milliseconds?
Flechette, that is just mean. I'll be up all night figuring that one out.
> Just struck me at the moment [...] no offense meant.

None taken.
Define "usefully".
High speed cameras can record how RPGs explode.
Flechette, those cameras are amazing. They can also record a bee flying so one can see its wings in motion, yet that has nothing to do with the practical application of ballistics.
> That's just basic arithmetic. 5280 (feet in a mile) divided by 2600 (feet per second) equals 2.0307 seconds (at least). Then add just a few milliseconds because the bullet does slow down just a bit.

It's a bit more complicated than that....
Aarond
> Even sophisticated battleship fire control systems, which account for pitch and roll and humidity and firing platform movement and the Coriolis effect, etc., etc., still need spotters at long ranges most of the time.

Well, to be fair, they are shooting at a moving target that is 10 to 15 miles distant, and even so, they are a bit more accurate than you think.
> Define "usefully".

<shaking my head & walking away from the Thread>
> The problem with all of the above calculations is that they assume the bullet will travel in a straight line and maintain muzzle velocity for the whole mile. The trajectory is going to be very arched, meaning for a bullet to hit a target a mile away it will have to travel a much greater distance. I'm not smart enough with the math to figure it out, but you're going to have to aim about 100' high to hit a target 1 mile away, looking at .300 WM with 180 gr bullets with a BC above .5 anyway.
> Also, most bullets starting at 2700 fps will be down to about 1300 fps at 1/2 mile, about 700ish fps at a mile. Figuring in the extra distance the bullet travels in an arc vs. a straight line, impact speeds could be closer to 500 fps. I think it will be a bit more than 2-3 seconds. I'm guessing as much as 4-5 seconds.

So if the speed decreases so much, what effect does that have on lethality when the bullet finally hits?
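For a rough sense of it, here's a sketch using the standard grains-and-fps energy formula with the velocity figures from the quoted post; the 180 gr weight is the .300 WM bullet mentioned there:

```python
# Remaining energy at the quoted impact speeds, standard small-arms formula:
# E (ft-lbf) = weight_gr * v_fps**2 / 450240
# (450240 = 2 * 7000 gr/lb * 32.16 ft/s^2).

def energy_ftlbf(weight_gr: float, v_fps: float) -> float:
    return weight_gr * v_fps**2 / 450240.0

weight = 180.0  # grains, the .300 WM bullet from the quoted post
for v in (2700.0, 1300.0, 700.0, 500.0):  # fps figures quoted above
    print(f"{v:6.0f} fps -> {energy_ftlbf(weight, v):6.0f} ft-lbf")
# ~2914 ft-lbf at 2700 fps; down to ~196 ft-lbf by 700 fps
```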