Figure 3.9 shows the graph of the score function S(θ) = d log L/dθ. The ML estimator θ̂ is the value of θ for which S(θ) = 0. This is shown as the point C, where S(θ) crosses the θ-axis. Since

∫ S(θ) dθ = log L(θ)

we have

∫_{θ₀}^{θ̂} S(θ) dθ = log L(θ̂) − log L(θ₀)

The LR statistic is

LR = −2[log L(θ₀) − log L(θ̂)] = 2[log L(θ̂) − log L(θ₀)]

Hence we get

LR = 2 × (area ABC)

where ABC is the region bounded by the score curve, the θ-axis, and the vertical line at θ = θ₀.
Now consider approximating area ABC by a triangle. Draw a tangent to S(θ) at point C (i.e., at θ = θ̂). As mentioned earlier, I(θ̂) is estimated by the slope of this tangent, that is, by AD/AC.

[Figure 3.9 here. The figure plots S(θ) against θ and labels I(θ̂) = AD/AC and var(θ̂) = 1/I(θ̂).]

Figure 3.9. Graph of the score function and geometric derivation of the LR, W, and LM test statistics. LR/2 = area ABC. W/2 = area DAC. LM/2 = area BAE.



The Wald statistic is W = (θ̂ − θ₀)² I(θ̂) = (AC)²(AD/AC) = AC × AD. Hence we get

W = 2 × (area of the triangle ADC)

Now draw a tangent to S(θ) at point B (i.e., at θ = θ₀). By a similar argument, I(θ₀) is estimated by AB/AE, and since LM = [S(θ₀)]²/I(θ₀) = (AB)²(AE/AB) = AB × AE, we get

LM = 2 × (area of the triangle BAE)

Thus the W and LM statistics amount to using two different triangle approximations to the area ABC under the curve S(θ). They differ in where they compute the value of I(θ). The W statistic computes it at θ = θ̂, the ML estimate. The LM statistic computes it at θ = θ₀, which is easier to calculate because we then estimate the model with fewer parameters (the restricted model with θ = θ₀).
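To see the three statistics side by side, here is a minimal numerical sketch. The Poisson model and the hypothesized value λ₀ are illustrative choices, not from the text; they are convenient because the score and information have simple closed forms, S(λ) = Σyₜ/λ − n and I(λ) = n/λ:

```python
import numpy as np

# LR, W, and LM statistics for H0: lam = lam0 in a Poisson(lam) sample.
# The ML estimate lam_hat = mean(y) solves the likelihood equation S(lam) = 0.

def loglik(lam, y):
    # log L(lam) up to the constant -sum(log y!), which cancels in LR
    return np.sum(y * np.log(lam) - lam)

def score(lam, y):
    return np.sum(y) / lam - len(y)

def info(lam, y):
    return len(y) / lam                          # expected information I(lam)

rng = np.random.default_rng(0)
y = rng.poisson(lam=2.3, size=200)

lam_hat, lam0 = y.mean(), 2.0

LR = 2 * (loglik(lam_hat, y) - loglik(lam0, y))
W = (lam_hat - lam0) ** 2 * info(lam_hat, y)     # AC x AD in the figure
LM = score(lam0, y) ** 2 / info(lam0, y)         # AB x AE in the figure

print(LR, W, LM)   # close to one another; each is chi-square(1) under H0
```

With a large sample and λ₀ near λ̂, the two triangle approximations are close to the curvilinear area, so the three printed values nearly agree.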

(8) Methods of Solving the Likelihood Equations

Earlier (see item 4 in this appendix) we discussed the maximum likelihood (ML) estimation of the parameters in the linear regression model. There we saw that the ML estimates of α and β were the same as the least squares estimates.

In many cases, however, the computation of the ML estimates is not as straightforward, and we need iterative techniques to compute them. Here we shall explain two of the iterative methods commonly used: the Newton-Raphson method and the method of scoring. For ease of exposition we shall consider the case of a single parameter θ. The generalization to several parameters involves writing down the corresponding expressions in matrix notation (using the Appendix to Chapter 2).

Consider a sample of n independent observations (y₁, y₂, . . . , yₙ) from a density function f(y, θ). Let L(θ) be the likelihood function. Then

log L(θ) = Σₜ log f(yₜ, θ)

A necessary condition for log L(θ) to attain a maximum at θ = θ̂ is

d log L/dθ = 0

The equation d log L/dθ = 0 is called the likelihood equation. Sometimes it can be solved easily, but often we have to use iterative methods.

Let θ₀ be a trial value of the estimate. Expanding d log L/dθ around θ₀ and retaining only the first power of δθ = θ̂ − θ₀, we get

d log L/dθ ≈ (d log L/dθ)₀ + δθ (d² log L/dθ²)₀

where the subscript 0 denotes evaluation at θ₀. At the maximum the left-hand side is zero, and solving for δθ gives the correction for the next iteration:

θ₁ − θ₀ = δθ = −(d log L/dθ)₀ / (d² log L/dθ²)₀



We now evaluate the first and second derivatives at θ₁ and get a new estimate θ₂. We continue this iterative procedure until convergence (we should also check the value of log L at each iteration). This is the Newton-Raphson method.

The quantity (d log L/dθ)₀ is sometimes called the score at θ₀ and is denoted by S(θ₀). Also, E(−d² log L/dθ²) is called the information on θ in the sample and is denoted by I(θ). In the method of scoring we substitute I(θ₀) for −(d² log L/dθ²)₀ in the Newton-Raphson method. The iteration in the method of scoring is thus

θ₁ = θ₀ + S(θ₀)/I(θ₀)

The difference between the Newton-Raphson method and the method of scoring is that the former depends on the observed second derivatives and the latter depends on the expected values of the second derivatives.
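As a sketch of the Newton-Raphson iteration just described, the following assumes a Cauchy location model, f(y, θ) = 1/[π(1 + (y − θ)²)], a standard example (not this text's) in which the likelihood equation has no closed-form solution:

```python
import numpy as np

# Newton-Raphson for the location parameter theta of a Cauchy sample,
# iterating theta_new = theta + delta with
#   delta = -(d log L / d theta) / (d^2 log L / d theta^2).

def d1_loglik(theta, y):
    # d log L / d theta = sum 2(y - theta) / (1 + (y - theta)^2)
    r = y - theta
    return np.sum(2 * r / (1 + r**2))

def d2_loglik(theta, y):
    # observed (not expected) second derivative of log L
    r = y - theta
    return np.sum(2 * (r**2 - 1) / (1 + r**2) ** 2)

rng = np.random.default_rng(1)
y = rng.standard_cauchy(200) + 3.0   # true location = 3

theta = np.median(y)                 # trial value theta_0 near the optimum
for _ in range(50):
    delta = -d1_loglik(theta, y) / d2_loglik(theta, y)
    theta += delta
    if abs(delta) < 1e-10:           # stop at convergence
        break

print(theta)
```

Starting from the sample median matters here: as the text notes, one should monitor log L at each iteration, since Newton-Raphson can wander if the trial value is poor.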

As an illustration of the scoring method, consider the regression model

yₜ = f(xₜ, β) + uₜ

where f is a known function of the regressor xₜ and a single unknown parameter β, and the uₜ are IN(0, 1). We assume the variance of uₜ to be unity, since we want to discuss the single-parameter case here. The log-likelihood function is

log L = const − ½ Σₜ (yₜ − f(xₜ, β))²

Hence

d log L/dβ = Σₜ (yₜ − f(xₜ, β)) ∂f/∂β

and d log L/dβ = 0 is the likelihood equation. This is a nonlinear equation in β. Since

d² log L/dβ² = Σₜ (yₜ − f(xₜ, β)) ∂²f/∂β² − Σₜ (∂f/∂β)²

and E(yₜ − f(xₜ, β)) = 0, we have

I(β) = E(−d² log L/dβ²) = Σₜ (∂f/∂β)²

Suppose we start with an initial value β₀. Then we calculate S(β₀) and I(β₀). The next value is given by

β₁ = β₀ + S(β₀)/I(β₀)

We next calculate S(β₁) and I(β₁) and proceed with this equation until convergence.
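A minimal sketch of this scoring iteration, assuming purely for illustration the mean function f(x, β) = exp(βx), so that ∂f/∂β = x·exp(βx); any smooth choice of f works the same way:

```python
import numpy as np

# Method of scoring for y_t = f(x_t, beta) + u_t with u_t ~ IN(0, 1),
# using the assumed mean function f(x, beta) = exp(beta * x):
#   S(beta) = sum (y - f) * x * f      (score)
#   I(beta) = sum (x * f)**2           (expected information)

def score(beta, x, y):
    f = np.exp(beta * x)
    return np.sum((y - f) * x * f)

def info(beta, x, y):
    f = np.exp(beta * x)
    return np.sum((x * f) ** 2)

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, size=100)
y = np.exp(0.8 * x) + rng.normal(size=100)   # true beta = 0.8

beta = 0.5                                   # initial trial value beta_0
for _ in range(50):
    step = score(beta, x, y) / info(beta, x, y)
    beta += step                             # beta_{k+1} = beta_k + S/I
    if abs(step) < 1e-10:                    # stop at convergence
        break

print(beta)
```

Replacing info() with the negative observed second derivative would turn this same loop into the Newton-Raphson method; the scoring version avoids computing ∂²f/∂β² because that term drops out in expectation.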


