In D, 0.1 will be set as the "global min", but that is a mistake; 1500 should be the true global min.
To elaborate on ww’s response:
As LanX says, the standard test for outliers is Grubbs’s test. There is an online calculator for Grubbs’s test at http://www.graphpad.com/quickcalcs/grubbs2/, and entering your sample data for series “D” — with Alpha set to either 0.05 or 0.01 — produces the following result:
Row | Value  | Z     | Significant Outlier?
----+--------+-------+------------------------------------------------------------------
 1  |    0.1 | 1.471 | Furthest from the rest, but not a significant outlier (P > 0.05).
 2  | 1500.0 | 0.173 |
 3  | 1700.0 | 0.000 |
 4  | 2100.0 | 0.346 |
 5  | 3200.0 | 1.298 |
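The Z column can be reproduced directly: for each point, Z = |x - mean| / s, where s is the sample standard deviation (n - 1 denominator, which appears to be what the calculator uses). A minimal Python sketch, using the series “D” values from the table:

```python
# Reproduce the Grubbs Z column: distance of each value from the
# sample mean, measured in sample standard deviations.
from statistics import mean, stdev

def grubbs_z(data):
    """Return |x - mean| / s for each x, with s the sample SD (n - 1)."""
    m = mean(data)
    s = stdev(data)  # n - 1 denominator
    return [abs(x - m) / s for x in data]

d = [0.1, 1500.0, 1700.0, 2100.0, 3200.0]
print([round(z, 3) for z in grubbs_z(d)])  # → [1.471, 0.173, 0.0, 0.346, 1.298]
```

The largest Z (1.471, for 0.1) is the Grubbs statistic G; the tabulated critical value for n = 5 at alpha = 0.05 is about 1.715, so G falls short of it — which is why the calculator reports 0.1 as *not* a significant outlier.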
So, why do you identify 0.1 as an outlier?