Saturday, November 4, 2017

HP Prime: Best Regression Fit

The program BESTFIT compares a set of regressions to determine a best fit, simulating a feature of the Hewlett Packard HP 48S, HP 48G, HP 49G, and HP 50g.  BESTFIT compares the correlations of the following four regression models:

1.  Linear:  y = a * x + b
2.  Logarithmic:  y = a * ln x + b
3.  Exponential:  y = b * a^x   (y = b * e^(ln a * x))
4.  Power: y = b * x^a

The output is a two-element list:  a string of the best-fit equation and its corresponding correlation.
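
Each of the curved models is handled by transforming it into a straight line: taking logarithms turns the exponential model into ln y = ln b + (ln a)*x and the power model into ln y = ln b + a*ln x, while the logarithmic model is already linear in ln x.  BESTFIT then measures the ordinary (linear) correlation of the transformed data and keeps the model whose squared correlation is largest.  For instance, with the Example 1 data below, the exponential test amounts to evaluating the program's own correlation call with the data filled in:

approx(correlation({1.05,2.28,4.20,6.34},LN({10.45,11.33,16.38,28.87})))
// ≈ 0.981261724397, the correlation reported in Example 1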

HP Prime Program:  BESTFIT

EXPORT BESTFIT(L1,L2)
BEGIN
// 2017-11-02 EWS
// Simulate Best Fit
// HP 48GX,49G,50g

// initialize
LOCAL clist,m,c,v,s;
clist:={0,0,0,0};

// test each model:
// correlation() measures linear correlation only,
// so each model is tested on transformed (linearized) data;
// approx() forces a numeric result
// linear y=a*x+b
clist[1]:=approx(correlation(L1,L2));
// log y=a*LN(x)+b
c:=approx(correlation(LN(L1),L2));
IF IM(c)==0 THEN
clist[2]:=c;
END;
// exponential y=b*a^x
c:=approx(correlation(L1,LN(L2)));
IF IM(c)==0 THEN
clist[3]:=c;
END;

// power y=b*x^a
c:=approx(correlation(LN(L1),LN(L2)));
IF IM(c)==0 THEN
clist[4]:=c;
END;

// pick the model with the largest squared correlation
m:=POS(clist^2,MAX(clist^2));
c:=clist[m];

IF m==1 THEN
v:=linear_regression(L1,L2);
s:=STRING(v[2])+"+"+STRING(v[1])+"*X";
RETURN {s,c};
END;

IF m==2 THEN
v:=logarithmic_regression(L1,L2);
s:=STRING(v[2])+"+"+STRING(v[1])+"*LN(X)";
RETURN {s,c};
END;

IF m==3 THEN
v:=exponential_regression(L1,L2);
s:=STRING(v[2])+"*"+STRING(v[1])+
"^X";
RETURN {s,c};
END;

IF m==4 THEN
v:=power_regression(L1,L2);
s:=STRING(v[2])+"*X^"+
STRING(v[1]);
RETURN {v,s};
END;

END;
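
BESTFIT takes the x-data and y-data as two lists, so the first example below can be reproduced with a call along the lines of:

BESTFIT({1.05,2.28,4.20,6.34},{10.45,11.33,16.38,28.87})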

Examples – BESTFIT:

Example 1:

X       Y
1.05    10.45
2.28    11.33
4.20    16.38
6.34    28.87

Result: {“7.78250037344*1.21713132288^X”, 0.981261724397}

Example 2:

X       Y
-10     82
-5      41
5       -42
10      -79

Result:  {“0.5+ -8.1*X”, -0.999801918343}


Example 3:

X       Y
10      2.278
11      2.666
12      2.931
13      3.212

Result:  {“-5.79464870365+3.51429873838*LN(X)”, 0.998295735284}

BESTFIT2:  An Extended Version

Version 2 adds the following regressions: 

5.  Inverse:  y = b + a/x
6.  Simple Logistic:  y = 1/(b + a*e^(-x))
7.  Simple Quadratic:  y = b + a*x^2
8.  Square Root:  y = √(a*x + b)
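
As before, each added model is reduced to a straight line before its correlation is taken: the inverse model correlates y with 1/x, the simple logistic correlates 1/y with e^(-x), the simple quadratic correlates y with x^2, and the square-root model correlates y^2 with x; the guards in the program skip a model whose transform is undefined for the data (a zero x, a zero y, or a negative y).  With the Example 6 data below, for instance, the inverse test amounts to evaluating the program's own call with the data filled in:

approx(correlation(1/{1,2,3,4},{1,-0.5,-1,-1.25}))
// returns 1, since y = -2 + 3/x fits the data exactly (see Example 6)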

HP Prime Program:  BESTFIT2

EXPORT BESTFIT2(L1,L2)
BEGIN
// 2017-11-02 EWS
// Simulate Best Fit
// HP 48GX,49G,50g
// additional models

// initialize
LOCAL clist,m,c,v,s;
clist:={0,0,0,0,0,0,0,0};

// test each model:
// correlation() measures linear correlation only,
// so each model is tested on transformed (linearized) data;
// approx() forces a numeric result

// linear y=a*x+b
clist[1]:=approx(correlation(L1,L2));

// log y=a*LN(x)+b
c:=approx(correlation(LN(L1),L2));
IF IM(c)==0 THEN
clist[2]:=c;
END;

// exponential y=b*a^x
c:=approx(correlation(L1,LN(L2)));
IF IM(c)==0 THEN
clist[3]:=c;
END;

// power y=b*x^a
c:=approx(correlation(LN(L1),LN(L2)));
IF IM(c)==0 THEN
clist[4]:=c;
END;

// inverse y=b+a/x
IF POS(L1,0)==0 THEN
clist[5]:=approx(correlation(1/L1,L2));
END;

// simple logistic
IF POS(L2,0)==0 THEN
clist[6]:=approx(correlation(e^(−L1),1/L2));
END;

// simple quadratic
clist[7]:=approx(correlation(L1^2,L2));

// square root
IF ΣLIST(L2≥0)==SIZE(L2) THEN
clist[8]:=approx(correlation(L1,L2^2));
END;

// pick the model with the largest squared correlation
m:=POS(clist^2,MAX(clist^2));
c:=clist[m];

IF m==1 THEN
v:=linear_regression(L1,L2);
s:=STRING(v[2])+"+"+STRING(v[1])+"*X";
END;

IF m==2 THEN
v:=logarithmic_regression(L1,L2);
s:=STRING(v[2])+"+"+STRING(v[1])+"*LN(X)";
END;

IF m==3 THEN
v:=exponential_regression(L1,L2);
s:=STRING(v[2])+"*"+STRING(v[1])+
"^X";
END;

IF m==4 THEN
v:=power_regression(L1,L2);
s:=STRING(v[2])+"*X^"+
STRING(v[1]);
END;

// inverse
IF m==5 THEN
v:=linear_regression(1/L1,L2);
s:=STRING(v[2])+"+"+STRING(v[1])+"/X";
END;

// simple logistic
IF m==6 THEN
v:=linear_regression(e^(−L1),1/L2);
s:="1/("+STRING(v[2])+"+"+STRING(v[1])+"*e^(−X))";
END;

// simple quadratic
IF m==7 THEN
v:=linear_regression(L1^2,L2);
s:=STRING(v[2])+"+"+STRING(v[1])+"*X^2";
END;

// square root
IF m==8 THEN
v:=linear_regression(L1,L2^2);
s:="√("+STRING(v[2])+"+"+
STRING(v[1])+"*X)";
END;

RETURN {s,c};
END;
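
Like BESTFIT, BESTFIT2 is called with the x-list and y-list as its two arguments; Example 4 below, for instance, corresponds to a call of the form:

BESTFIT2({1.05,2.28,4.20,6.34},{10.45,11.33,16.38,28.87})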

Examples – BESTFIT2:

Example 4:

X       Y
1.05    10.45
2.28    11.33
4.20    16.38
6.34    28.87

Result:  {“9.0573970192+0.48023219108*X^2”, 0.994491728382}

Example 5:

X       Y
1       8.4853
2       8.9443
4       9.7980
7       10.9546

Result: {“√(63.9995089183+8.00048914762*X)”, 0.999999999759}

Example 6:

X       Y
1       1
2       -0.5
3       -1
4       -1.25

Result:  {“-2+3/X”, 1}

Eddie

This blog is property of Edward Shore, 2017
