How to define the parameters (a,b) of a gamma distribution

Hey everybody, I am looking for how to calculate the parameters of the gamma density when setting the priors in Bayesian estimation. For a beta(a,b) distribution the mean is E(X) = a/(a+b) and the variance is V(X) = a*b/((a+b)^2*(a+b+1)); since the mean and variance are taken from common values in the literature, I invert these formulas to calculate a and b. For a gamma(a,b) distribution with E(X) = 0.74 and std(X) = 0.0056, how do I find a and b? Many thanks in advance.
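For reference, the beta step mentioned above can be done with a method-of-moments inversion; here is a minimal MATLAB sketch, where the prior mean m and variance v are illustrative placeholder values, not taken from any paper:
% Method-of-moments inversion of beta(a,b) from a target mean and variance
m = 0.30;              % assumed prior mean (placeholder value)
v = 0.01;              % assumed prior variance (placeholder value)
k = m*(1 - m)/v - 1;   % common factor from the two moment equations
a = m*k                % gives a = 6 for these values
b = (1 - m)*k          % gives b = 14; check: a/(a+b) = 0.30, a*b/((a+b)^2*(a+b+1)) = 0.01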

Accepted Answer

Star Strider on 4 Jun 2017
The Wikipedia article on the Gamma distribution (link) indicates that:
gamma_mean = a*b;
gamma_var = a*b^2;
so with your data:
gamma_mean = 0.74
gamma_var = 0.0056^2 % Var is StDev^2
b = gamma_var/gamma_mean
a = gamma_mean/b
b =
4.2378e-05
a =
17462
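If the Statistics and Machine Learning Toolbox is available, a quick sanity check of the recovered shape/scale pair (a sketch assuming gamstat, which returns the mean and variance of a gamma distribution with shape a and scale b):
% Recover the parameters from the target moments and check them
a = 0.74^2/0.0056^2;       % shape, equal to mean^2/variance
b = 0.0056^2/0.74;         % scale, equal to variance/mean
[m, v] = gamstat(a, b);    % moments of gamma(a,b)
fprintf('mean = %.4f, std = %.4f\n', m, sqrt(v))   % should print 0.7400 and 0.0056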
  4 Comments
Houda on 4 Jun 2017
Thanks a lot. I tried to resolve this issue before but to no avail; thanks to you I now get the same results as the author. However, in many cases it was impossible. I attached the paper so you can check for yourself (Table 2, p. 1274). Take care.

