




Tr(A + B) = Tr(A) + Tr(B)
Tr(AB) = Tr(BA) if both AB and BA are defined.

These results can be checked by writing down the appropriate expressions. As an example, let

    A = | 1  2 |        B = | 2  0 |
        | 0  1 |            | 1  1 |

Then

    AB = | 4  2 |       BA = | 2  4 |
         | 1  1 |            | 1  3 |

and Tr(AB) = Tr(BA) = 5. Another important result we shall be using (in the case of the multiple regression model) is the following:

If A is an idempotent matrix, then Rank(A) = Tr(A)
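These trace and rank results are easy to check numerically. A quick sketch using NumPy, with matrices chosen here only for illustration (the idempotent matrix is the familiar projection X(X'X)^-1 X'):

```python
import numpy as np

# Two arbitrary square matrices to check the trace identities.
A = np.array([[1.0, 2.0], [0.0, 1.0]])
B = np.array([[2.0, 0.0], [1.0, 1.0]])

# Tr(A + B) = Tr(A) + Tr(B)
assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))

# Tr(AB) = Tr(BA)
assert np.isclose(np.trace(A @ B), np.trace(B @ A))

# An idempotent matrix: the projection P = X(X'X)^{-1} X'
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # 3 x 2, full column rank
P = X @ np.linalg.inv(X.T @ X) @ X.T
assert np.allclose(P @ P, P)  # idempotent

# Rank(P) = Tr(P); here both equal 2, the column rank of X
assert np.isclose(np.trace(P), np.linalg.matrix_rank(P))
```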

Characteristic Roots and Vectors

This topic is discussed in the Appendix to Chapter 7.

Exercises on Matrix Algebra

1. Consider

       A = | 1  0  1 |      B = | 1  0 |      C = | 2  1 |
           | 2  1  1 |                            | 1  1 |

(a) Compute ABC, CAB, and BCA.

(b) Verify that (ABC)' = C'B'A'.

(c) Find the inverses of these matrices. Verify that (ABC)^-1 = C^-1(AB)^-1.

(d) Verify that Tr(BCA) = Tr(ABC) = Tr(CAB).

2. Solve the following set of equations using matrix methods.

       x1 + 2x2 + 2x3 = 1
       2x1 + 2x2 + 3x3 = 3
       x1 - 2x2 + 3x3 = 5
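In matrix form the system is Ax = b, and the matrix method amounts to computing x = A^-1 b. A sketch in NumPy, with the coefficient matrix read off one plausible rendering of the equations above (treat the numbers as illustrative):

```python
import numpy as np

# Coefficient matrix and right-hand side of the system Ax = b.
A = np.array([[1.0,  2.0, 2.0],
              [2.0,  2.0, 3.0],
              [1.0, -2.0, 3.0]])
b = np.array([1.0, 3.0, 5.0])

x = np.linalg.solve(A, b)      # numerically stable alternative to inv(A) @ b
assert np.allclose(A @ x, b)   # the solution satisfies all three equations
```

Note that np.linalg.solve factors A rather than forming A^-1 explicitly, which is both faster and more accurate.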

3. Determine those values of λ for which the following set of equations may possess a nontrivial solution.

       3x1 + x2 - λx3 = 0
       4x1 - 2x2 - 3x3 = 0
       2λx1 + 4x2 + λx3 = 0

For each permissible value of λ, determine the most general solution.
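A homogeneous system has a nontrivial solution exactly when its coefficient determinant vanishes, so the permissible values of λ are the roots of det(M) = 0. A sketch using SymPy; the coefficient matrix below is one reading of the system above, so treat it as illustrative:

```python
import sympy as sp

lam = sp.symbols('lambda')

# Coefficient matrix of the homogeneous system, with lambda symbolic.
M = sp.Matrix([[3,       1,  -lam],
               [4,      -2,    -3],
               [2 * lam, 4,   lam]])

# Nontrivial solutions exist only where det(M) = 0.
roots = sp.solve(sp.Eq(M.det(), 0), lam)
```

For each root, the null space of M (e.g. `M.subs(lam, root).nullspace()`) gives the general solution.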

For a proof, see Maddala, Econometrics, pp. 444-445.



4. If AB = AC, where A is a square matrix, when does it necessarily follow that B = C? Give an example in which this does not follow.

5. A and B are symmetric matrices. Show that AB is also symmetric if and only if A and B commute.

6. If A is a square matrix, the matrix obtained by replacing each element of A by its cofactor and then transposing the resulting matrix is known as the adjoint of A and is denoted by Adj(A). Note that A^-1 = (1/|A|) Adj(A). Show that if A and B are square matrices, Adj(AB) = Adj(B) Adj(A).
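For a nonsingular matrix the relation A^-1 = (1/|A|) Adj(A) gives Adj(A) = |A| A^-1, which makes the product rule easy to verify numerically. A sketch with NumPy, using matrices chosen only for illustration:

```python
import numpy as np

def adjoint(A):
    """Adjoint (adjugate) of a nonsingular matrix via Adj(A) = det(A) * inv(A)."""
    return np.linalg.det(A) * np.linalg.inv(A)

A = np.array([[1.0, 2.0], [0.0, 1.0]])
B = np.array([[2.0, 0.0], [1.0, 1.0]])

# Note the reversal of order, just as with transposes and inverses:
# Adj(AB) = Adj(B) Adj(A).
assert np.allclose(adjoint(A @ B), adjoint(B) @ adjoint(A))
```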

7. In Exercise 1, show that A(A'A)^-1 A' and B(B'B)^-1 B' are both idempotent. What are the ranks of these two matrices?

8. Determine whether the quadratic form

       Q = x1^2 + 2x2^2 + x3^2 - 2x1x2 + 2x1x3

is positive definite or not. Answer the same for

       Q = 2x1^2 + 3x2^2 + x3^2 + x4^2 + 2x1x2 - 2x1x3 + 8x2x3 + 4x2x4 + 4x3x4
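A convenient way to settle such questions is through eigenvalues: Q = x'Ax is positive definite if and only if every eigenvalue of the symmetric matrix A is positive. A sketch of that check in NumPy, using small illustrative forms rather than the ones in the exercise:

```python
import numpy as np

def is_positive_definite(A):
    """True iff the symmetric matrix A has all eigenvalues > 0."""
    return bool(np.all(np.linalg.eigvalsh(A) > 0))

# Q = 2x1^2 + 2x2^2 - 2x1x2  ->  symmetric matrix [[2, -1], [-1, 2]]
assert is_positive_definite(np.array([[2.0, -1.0], [-1.0, 2.0]]))

# Q = x1^2 - 4x1x2 + x2^2 is indefinite (eigenvalues -1 and 3)
assert not is_positive_definite(np.array([[1.0, -2.0], [-2.0, 1.0]]))
```

The symmetric matrix of a form is built by putting the coefficient of xi^2 on the diagonal and half the coefficient of each cross term xixj in positions (i, j) and (j, i).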

9. (a) Show that the set of equations

       2x1 - 2x2 + x3 = λx1
       2x1 - 3x2 + 2x3 = λx2
       -x1 + 2x2 = λx3

can possess a nontrivial solution only if λ = 1 or -3.

(b) Obtain the general solution in each case.

10. Construct a set of three mutually orthogonal vectors that are linear combinations of the vectors (1, 1, 0, 1), (1, 1, 0, 0), and (1, 0, 2, 2).
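Exercise 10 is an application of the Gram–Schmidt process: each vector in turn has its projections on the earlier vectors subtracted off, so every output is a linear combination of the inputs. A sketch in NumPy, applied to the three vectors given:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a list of vectors; each output is a linear
    combination of the inputs (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        # Subtract the projection of v on each vector already in the basis.
        w = v - sum((v @ u) / (u @ u) * u for u in basis)
        basis.append(w)
    return basis

v = [np.array([1.0, 1.0, 0.0, 1.0]),
     np.array([1.0, 1.0, 0.0, 0.0]),
     np.array([1.0, 0.0, 2.0, 2.0])]
u1, u2, u3 = gram_schmidt(v)

# Every pair of outputs is orthogonal.
assert abs(u1 @ u2) < 1e-12 and abs(u1 @ u3) < 1e-12 and abs(u2 @ u3) < 1e-12
```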



Simple Regression

3.1 Introduction

3.2 Specification of the Relationships

3.3 The Method of Moments

3.4 Method of Least Squares

3.5 Statistical Inference in the Linear Regression Model

3.6 Analysis of Variance for the Simple Regression Model

3.7 Prediction with the Simple Regression Model

3.8 Outliers

3.9 Alternative Functional Forms for Regression Equations

*3.10 Inverse Prediction in the Least Squares Regression Model

*3.11 Stochastic Regressors

*3.12 The Regression Fallacy

Summary

Exercises

Appendix to Chapter 3

3.1 Introduction

Regression analysis is one of the most commonly used tools in econometric work. We will, therefore, start our discussion with an outline of regression analysis. The subsequent chapters will deal with some modifications and ex-


