I've noticed that the velocity my chronograph reports can change significantly depending on the exact angle the bullet travels through its screens. For example, I set up my .308 to shoot at 100 and 200 yards, with the chronograph placed ~10 feet in front of the rifle. The 200 yard target sits above the 100 yard target, so the bullet travels at a different angle when shooting at it (less parallel to the ground). The difference in calculated velocity was quite dramatic: for the 3 loads I tried, the average difference between the 100 and 200 yard readings ranged from 60 fps to over 100 fps for the same load (the 200 yard shots clocked slower, on the less parallel trajectory). That is a pretty big difference, especially when trying to calculate drop out to 600 yards and beyond.
How common is this phenomenon? I'd imagine what's happening is that the bullet takes slightly longer to pass between the screens because of the steeper angle when shooting at the longer distance. Is this something I should worry about, or should I just get a rough estimate of fps and worry about standard deviation instead?
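If that's the cause, the geometry should just be a cosine: the screens time the bullet over a fixed spacing along the unit's axis, so a bullet crossing at an angle off that axis covers a path longer by 1/cos(angle) and reads low by a factor of cos(angle). Here's a minimal sketch of that relationship (the 2,700 fps velocity and the angles are assumed example numbers, not my actual loads):

```python
import math

def measured_velocity(true_fps: float, angle_deg: float) -> float:
    """Velocity a screen chronograph would report when the bullet
    crosses its screens at angle_deg off the unit's axis. The path
    between the screens lengthens by 1/cos(angle), so the reported
    speed is the true speed scaled by cos(angle)."""
    return true_fps * math.cos(math.radians(angle_deg))

# Hypothetical example: a 2,700 fps load at various off-axis angles.
for angle in (0.0, 1.0, 3.0, 5.0, 10.0):
    v = measured_velocity(2700.0, angle)
    print(f"{angle:4.1f} deg off-axis -> {v:7.1f} fps "
          f"({2700.0 - v:5.1f} fps low)")
```

Running those assumed numbers, the cosine error only reaches tens of fps at fairly large angles, so I'm not sure the angle alone accounts for the whole 60-100 fps I'm seeing, unless I'm missing something about how the screens trigger.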