A question about data analysis?


bsanders149

Okay, so, for this project, I'm doing a regression between the data I measured and the data that was generated (an estimate made by a computer). I'm also trying to find the error between the two values. So, when I find the error, do I measure how far the generated value is from the measured one, or how far the ordered pair they make up is from the closest point on a 1:1 line? Or am I going about this completely wrong? x3 Thanks for any help you can give me!

Yes, I know about the r^2 value. It measures how well the data set as a whole fits the line of best fit. What I'm looking for is how well each individual data point fits, but I don't know if I should be comparing the x value to the y value (assuming the x value is completely without error), or comparing the point (x, y) to the 1:1 line (i.e. the line through (1,1), (2,2), (3,3), etc.).
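
For reference, r^2 is 1 minus the ratio of the residual sum of squares to the total sum of squares. A minimal sketch of that calculation in Python (the arrays here are made-up example values, not from the actual project):

```python
import numpy as np

y = np.array([1.1, 2.3, 2.8, 4.2, 4.9])      # measured data (hypothetical)
y_fit = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # values predicted by the fit line

# r^2 = 1 - SS_res / SS_tot
ss_res = np.sum((y - y_fit) ** 2)         # residual sum of squares
ss_tot = np.sum((y - np.mean(y)) ** 2)    # total sum of squares
r_squared = 1 - ss_res / ss_tot
print(r_squared)
```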

Well, you should compare values with the same x, since both are functions of x: f(x) is your experimental data and g(x) is your fit. The difference between f(x) and g(x) is the error of the fit at that point.
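
A minimal sketch of that per-point comparison in Python (assuming a straight-line fit via NumPy's polyfit; the data values are made up for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # x values (hypothetical)
f = np.array([1.1, 2.3, 2.8, 4.2, 4.9])   # experimental data f(x)

# Fit a line g(x) = a*x + b to the data
a, b = np.polyfit(x, f, 1)
g = a * x + b                              # fitted values at the same x

# Error of the fit at each point: f(x) - g(x)
residuals = f - g
print(residuals)
```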

Thank you for the response. I decided to go with just subtracting the estimated values from the measured ones (which I've treated as accurate, even though there's probably some error in them too :P). I think that's what you were all trying to explain, anyway. x3 Thanks.
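
For what it's worth, that subtraction is the same as the vertical distance from each (measured, estimated) pair to the 1:1 line, and the perpendicular distance to that line is just the same difference divided by sqrt(2). A quick sketch (NumPy again, with made-up numbers):

```python
import numpy as np

measured = np.array([1.0, 2.0, 3.0, 4.0, 5.0])    # treated as accurate
estimated = np.array([0.9, 2.2, 2.7, 4.3, 5.1])   # computer-generated values

# Per-point error: measured minus estimated
error = measured - estimated

# The perpendicular distance from (measured, estimated) to the 1:1 line
# is the same difference scaled by 1/sqrt(2)
perp_dist = np.abs(error) / np.sqrt(2)
print(error, perp_dist)
```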
