ClearAll["Global`*"]
link1 = {84, 54, 36, 21};
klink1 = 4;
skinks = {56, 19, 28, 18, 24, 14, 9};
kskinks = 7;


(* log-likelihood of a zero-truncated beta-binomial capture model:
   data  = observed capture-frequency counts f1, ..., fK
   kdata = number of capture occasions K
   f0    = unobserved zero-count cell (a parameter to estimate) *)
mylikBetaBin[data_, kdata_, f0_, alpha_, beta_] := Module[
    {K, pj, fj, n, loglik, above, below},

    K = kdata;
    (* beta-binomial cell probabilities for j = 0, ..., K *)
    pj = Table[
        Binomial[K, j]*
        Beta[alpha + j, K - j + beta]/
        Beta[alpha, beta], {j, 0, K}
    ];

    fj = Prepend[data, f0];

    (* N is a Protected built-in symbol in Mathematica, so use n instead *)
    n = Total[fj];
    above = LogGamma[n + 1];
    below = Sum[LogGamma[fj[[j]] + 1], {j, 1, Length[fj]}];
    loglik = fj.Log[pj];

    above - below + loglik
]

(*mylikBetaBin[link1, klink1, 37, 0.8, 0.3]*)

testfun[f0_, alpha_, beta_] := 
mylikBetaBin[skinks, kskinks, f0, alpha, beta];

(*testfun[165.178,0.573,1.576]*)

NMaximize[
    {testfun[f0, alpha, beta], 
    f0 >= 0 && alpha >= 0 && beta >= 0},
    {f0, alpha, beta},
    Method -> {"RandomSearch", "SearchPoints" -> 250}
]

I am new to Mathematica. I need to do some (global) optimization with constraints. This is a simple example that I finally managed to get working.

I want to know whether I am doing the right thing. Is NMaximize the function to use?

But the main question is: is there an optimizer that will give me the standard errors (i.e., the Hessian evaluated at the optimum)?

If not, how would I get them?

Chen Stats Yu
  • Just in case you haven't read it http://reference.wolfram.com/mathematica/tutorial/ConstrainedOptimizationGlobalNumerical.html – Dr. belisarius Sep 30 '14 at 00:08
  • There's also: http://mathematica.stackexchange.com/questions/43513/how-to-work-with-experimentalnumericalfunction/44890#44890 – dr.blochwave Sep 30 '14 at 08:34
  • And (related) http://mathematica.stackexchange.com/a/54877/13162 – dr.blochwave Sep 30 '14 at 08:35
  • Thanks blochwave! I had read two of the links you provided. I was hoping that some optimizer would have an option I can just switch on to get the standard errors (or just the Hessian). But it seems not, even though the Hessian is most commonly desired when optimizing, I think. – Chen Stats Yu Sep 30 '14 at 12:25

1 Answer

ans=NMaximize[
    {testfun[f0, alpha, beta], 
    f0 >= 0 && alpha >= 0 && beta >= 0},
    {f0, alpha, beta},
    Method -> {"RandomSearch", "SearchPoints" -> 200}
]


(* symbolic Hessian of the log-likelihood with respect to the parameters *)
he = D[testfun[f0, alpha, beta], {{f0, alpha, beta}, 2}];

(* evaluate the Hessian at the maximizer; keep the raw matrix, since
   wrapping it in MatrixForm would break Inverse in the next step *)
hess = he /. ans[[2]];
hess // MatrixForm

(* standard errors: square roots of the diagonal of the inverse of the
   negative Hessian (the observed information matrix) *)
se = Sqrt[Diagonal[Inverse[-hess]]]

After some trial and error, I believe I have found a way to do this.

If anyone has a better solution, please let me know. Thanks!
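One possible refinement (a sketch only, with `mleStdErrs` as a made-up name, not tested against the original data) is to bundle the Hessian-based standard-error computation into a single helper, so it can be reused after any NMaximize run:

```
(* hypothetical helper: given a log-likelihood expression, its variable
   list, and the replacement rules returned by NMaximize, return the
   standard errors as Sqrt of the diagonal of the inverse negative
   Hessian evaluated at the optimum *)
mleStdErrs[loglik_, vars_List, rules_List] := Module[{hess},
    hess = D[loglik, {vars, 2}] /. rules;
    Sqrt[Diagonal[Inverse[-hess]]]
]

ans = NMaximize[{testfun[f0, alpha, beta],
    f0 >= 0 && alpha >= 0 && beta >= 0}, {f0, alpha, beta}];
mleStdErrs[testfun[f0, alpha, beta], {f0, alpha, beta}, ans[[2]]]
```

Note that this relies on `testfun` evaluating symbolically, so that `D` can differentiate the resulting expression before the rules are substituted in.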

Chen Stats Yu
  • 4,986
  • 2
  • 24
  • 50