Not hardly.
What is revealed are the inner characteristics of the QuickLoad model -- which is a third-order (or higher) polynomial at its root.
The least-squares methodology shows that empirically... without ever having access to the multifactor equations themselves.
No, not "not hardly." Heartily!
You're one of the people here who has the background and skill to understand what is going on. So I'll take one more swing at explaining why your model is severely busted.
In normal algebra, we're trying to solve for the values of the Xs. In regression (curve fitting), we are trying to solve for the coefficients of an expression of the form Y = a0 + a1X1 + a2X2 + a3X3 + .... In your case, X1 is your X^3 term, X2 is your X^2 term, and X3 is your X term.
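To make that setup concrete, here is a minimal sketch of solving for those coefficients by least squares. This is Python with NumPy; the data values are invented for illustration and are not anyone's actual chronograph numbers:

```python
import numpy as np

# Illustrative data only: x could be charge weight, y could be velocity.
x = np.array([40.0, 41.0, 42.0, 43.0, 44.0, 45.0, 46.0, 47.0])
y = 2500.0 + 55.0 * x + np.array([3.0, -2.0, 1.0, 0.5, -1.5, 2.0, -0.5, 1.0])

# Design matrix: columns are X^3 (X1), X^2 (X2), X (X3), and the intercept.
A = np.column_stack([x**3, x**2, x, np.ones_like(x)])

# Least squares solves for the coefficients a3, a2, a1, a0 -- not for x.
coeffs, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(coeffs)
```

The point of the sketch is only the shape of the problem: the unknowns are the coefficients, and the X powers are just columns in a matrix.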
The required assumption is that X1, X2, and X3 are independent -- or, if you prefer, orthogonal, like the X, Y, and Z axes. The fundamental math of regression runs on variances, and if this condition is not met, then X1 contributes to the X2 and X3 variances, X2 to the X1 and X3 variances, and so on. One way to test for this is to calculate the Variance Inflation Factor (VIF). For small data sets, a VIF of 5 or more signals trouble. If your software calculates VIF, you might want to see what it is telling you. If it doesn't, it is leaving you adrift.
With your data set, the contributions of the higher-order terms are small. What that means is that your X^2 and X^3 terms practically overlay your linear term. In other words, they are far from independent; they are highly correlated, or collinear. That busts the tool you are trying to use, producing non-credible results, and it frequently produces negative coefficients -- such as a negative X^2 coefficient -- where there should be none.
For a plethora of reasons, most reputable curve-fitting software doesn't even offer terms above X^2. And if you already have an R^2 of 0.985, it is unwise to keep adding terms. More terms = a more wobbly model, and virtually no additional information. There are tools where that is not the case, but it is very much the case with regression.
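Here is what "virtually no additional information" looks like numerically. A sketch (Python/NumPy, synthetic data that is truly linear plus noise, so any cubic structure the fit finds is spurious):

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(40.0, 47.0, 8)
# Truly linear relationship plus measurement noise.
y = 2500.0 + 55.0 * x + rng.normal(0.0, 3.0, x.size)

def r_squared(x, y, degree):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    return 1.0 - resid.var() / y.var()

# The cubic's R^2 is barely higher than the line's, yet it costs
# two extra coefficients estimated from collinear columns.
print(r_squared(x, y, 1), r_squared(x, y, 3))
```

The cubic always nudges R^2 up a hair, because extra terms can only reduce the residual, but that hair is noise-chasing, not information.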
Trust me on this. I co-wrote a major statistical application, and wrote a couple of books on the topic.
In the real world, I have run many data sets like the 8mm data I showed earlier. I have never found curvature. The hardest thing is to find a black cat in a dark room, especially if there is no cat.
So is there curvature or not? There might be; we don't know. Under the circumstances, it is improper to assert that it is there. If it is there, it makes a very tiny contribution, and a simple linear model gives a more credible result containing practically all the information that can be extracted from the data. It is proper to say that the data are linear.