Say I am loading 3.1 grains of powder and I want to know whether my measure is throwing closer to 3.0 or 3.2. I just throw ten charges, weigh them all, add them up, and divide by 10, and I'll get "3.1x".
I don't think this is a good method to determine the "accuracy" of the charges thrown. Consider an example: say you throw the following ten charges (in any order):
3.0, 3.0, 3.0, 3.0, 3.1, 3.1, 3.2, 3.2, 3.2, 3.2
The total weight is 31.0 grains, so dividing by 10 gives an average of 3.10. But the standard deviation is 0.094 (close to 0.1), and only two of the ten charges were actually the desired 3.1. The average alone tells you nothing about the spread.
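To make that concrete, here's a quick sketch in Python using the standard `statistics` module, run on the example charges above:

```python
import statistics

# The ten example charges thrown, in grains
charges = [3.0, 3.0, 3.0, 3.0, 3.1, 3.1, 3.2, 3.2, 3.2, 3.2]

mean = statistics.mean(charges)    # 3.10 -- the average looks perfect
stdev = statistics.stdev(charges)  # ~0.094 -- sample standard deviation
on_target = charges.count(3.1)     # only 2 of 10 at the desired weight

print(f"mean = {mean:.2f} gr, stdev = {stdev:.3f} gr, "
      f"on target = {on_target}/10")
```

Two measures that both average 3.10 could be anywhere from dead-on every throw to alternating 3.0 and 3.2; only the spread tells them apart.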
That said, there are so many other variables in reloading and shooting that they completely negate +/- 0.1 grains of powder. If you're loading 4.0 grains, then 0.1 grain is 2.5% of the total charge. If you're loading 25.0 grains, 0.1 grain is 0.4% of the total. If you're loading 75.0 grains, 0.1 grain is 0.1% of the total. So 0.1 grains has the biggest impact on small pistol loads, which are intended for short-barreled, sub-50-yard firearms where 1/2" groups aren't the norm. For rifle loads such as .223 and up, you're looking at less than a 0.5% change in the total charge.

If you think that +/- 0.1 grains makes a "significant, measurable" difference to your POI, let alone +/- 0.01 grains, then you should have been in Beijing this week winning gold medals in every shooting event for the US.
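If you want to check that relative-impact arithmetic yourself, here's a minimal sketch using the same example charge weights as above:

```python
# Relative impact of a 0.1 grain variation at different charge weights
for charge in (4.0, 25.0, 75.0):
    pct = 0.1 / charge * 100
    print(f"{charge:5.1f} gr charge: 0.1 gr = {pct:.1f}% of the total")
```

The smaller the charge, the bigger the percentage swing from the same 0.1 grain of throw-to-throw variation.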