




Statistical Background and Matrix Algebra

2.1 Introduction

2.2 Probability

2.3 Random Variables and Probability Distributions

2.4 The Normal Probability Distribution and Related Distributions

2.5 Classical Statistical Inference

2.6 Properties of Estimators

2.7 Sampling Distributions for Samples from a Normal Population

2.8 Interval Estimation

2.9 Testing of Hypotheses

2.10 Relationship Between Confidence Interval Procedures and Tests of Hypotheses

Summary

Exercises

Appendix to Chapter 2



2.1 Introduction

In this chapter we review some basic results in probability and statistics that are used in the book. Most of these results are generally covered in an introductory statistics course that students take before entering a course in econometrics. Hence the discussion is concise, and proofs are omitted. However, the exercises at the end of the chapter provide practice in using the concepts and results presented here, and students should attempt them as part of this review. This chapter serves two purposes:

1. As a review of some material covered in previous courses in statistics.

2. As a reference for some results when needed.

In this chapter we omit discussion of methods of estimation (the method of moments, the method of least squares, and the method of maximum likelihood), as they are discussed in Chapter 3 when we cover simple regression.

The other thing this chapter provides is an introduction to matrix algebra. This is placed in an appendix because the main body of the text does not involve matrix notation and explains the basic concepts with simple models. Chapters 4, 5, 7, 9, and 12 also contain appendixes with proofs of some results stated without proof in the respective chapters. These proofs require matrix notation. Thus students wishing to learn the proofs and the matrix notation should start with this appendix and then go through the appendixes to the other chapters. The appendix to this chapter also contains exercises on matrix algebra that students are advised to work through.

2.2 Probability

The term probability is used to give a quantitative measure to the uncertainty associated with our statements. The earliest notions of probability were first applied to gambling. Backgammon was known thousands of years ago. Although most gamblers presumably calculated some probabilities, it is Gerolamo Cardano (1501-1576), an Italian physician, mathematician, and astrologer, who is credited with the first systematic computation of probabilities. He defined probability as

                  number of favorable outcomes
  probability = -----------------------------------
                 total number of possible outcomes

This is called the classical view of probability. Suppose that we roll a balanced die. What is the probability of getting a 6? The number of favorable outcomes is one, since there is only one 6. The total number of possible outcomes is six (1, 2, 3, 4, 5, 6). Hence the probability is 1/6.

Another definition of probability is that of the relative frequency of the occurrence of an event in a large number of repetitions. In the case of the die, we calculate the probability of a 6 by rolling the die a large number of times and




then observing the proportion of times we get a 6. This observed proportion gives us the probability of a 6. You might think that nobody would have the patience to compute probabilities this way. However, there is a well-known example of Kerrich, who, while interned in Denmark during World War II, performed several such experiments. For instance, he tossed a coin 10,000 times! Initially, the relative frequency of heads fluctuated widely, but it finally settled close to 0.5, standing at 0.507 after the final toss. The experiments are described in J. E. Kerrich, An Experimental Introduction to the Theory of Probability (Copenhagen: Jorgensen, 1946).

The frequency view of probability, first given by the French mathematician Poisson in 1837, says that if n is the number of trials and n(E) the number of occurrences of the event E, then the probability of E, denoted by P(E), is

                 n(E)
  P(E) = lim   -------
        n -> oo    n

According to the classical view, probability is a theoretical number defined as the ratio of the number of favorable cases to the total number of possible cases. According to the frequency view, it is the limit of the observed relative frequency as the number of repetitions becomes very large.
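The convergence that Kerrich observed by hand is easy to reproduce by simulation. The following sketch (not part of the text; the function name and trial counts are illustrative) estimates the probability of a 6 as the observed relative frequency n(E)/n, which settles near the classical value 1/6 as n grows:

```python
import random

def relative_frequency(event, trials, seed=0):
    """Estimate P(event) as n(E)/n over a given number of fair-die rolls."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    hits = sum(1 for _ in range(trials) if event(rng.randint(1, 6)))
    return hits / trials

# The relative frequency of a 6 fluctuates for small n but
# settles near the classical value 1/6 ~ 0.1667 for large n.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(lambda face: face == 6, n))
```

The same code with a two-sided "coin" reproduces Kerrich's experiment: the proportion of heads drifts toward 0.5.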

A third view is the subjective view of probability, which is based on personal beliefs. Suppose you say that the probability that the New York Giants will win is 3/4. Consider a bet where you are given $1 if the Giants win and you pay $3 if they lose. Your probability indicates that this is a "fair" bet: your expected gain is (3/4)($1) - (1/4)($3) = 0. If you are not willing to take this bet, your subjective probability that the Giants will win is < 3/4. If you are eager to take this bet, your subjective probability that the Giants will win is > 3/4. Because of the "betting" basis for subjective probability, it is also called "personal pignic probability." The word "pignic" comes from the Latin word pignus (a bet).
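The fairness calculation behind this betting argument can be checked directly. A minimal sketch (the helper function is my own, not from the text) computes the expected gain of the $1-versus-$3 bet at various subjective probabilities:

```python
from fractions import Fraction

def expected_gain(p_win, gain, loss):
    """Expected payoff of a bet: receive `gain` with probability p_win,
    pay `loss` with probability 1 - p_win."""
    return p_win * gain - (1 - p_win) * loss

# At p = 3/4 the bet is exactly fair (expected gain zero); at any
# smaller p you expect to lose money and would decline the bet.
print(expected_gain(Fraction(3, 4), 1, 3))  # prints 0
print(expected_gain(Fraction(1, 2), 1, 3))  # prints -1 (you would decline)
```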

Addition Rules of Probability

Fortunately, whatever view of probability we adopt, the rules for the calculation of probability are the same. Before we discuss these, we have to define some terms. Events A1, A2, A3, . . . are said to be mutually exclusive if when one occurs, the others do not. They are said to be exhaustive if they exhaust all the possibilities. In the case of a die, the events A1, A2, . . . , A6 that the die shows 1, 2, 3, 4, 5, and 6, respectively, are mutually exclusive as well as exhaustive.
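Both properties can be stated in terms of sets of outcomes and verified mechanically. The sketch below (illustrative, not from the text) checks that the six single-face events are pairwise disjoint (mutually exclusive) and that together they cover the whole sample space (exhaustive):

```python
from itertools import combinations

sample_space = set(range(1, 7))          # faces of the die
events = [{face} for face in sample_space]  # A1, ..., A6: "die shows i"

# Mutually exclusive: no two events share an outcome.
mutually_exclusive = all(a.isdisjoint(b) for a, b in combinations(events, 2))

# Exhaustive: the union of all events is the whole sample space.
exhaustive = set().union(*events) == sample_space

print(mutually_exclusive, exhaustive)  # prints True True
```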

We write P(A + B) for the probability that A or B (or both) occur. This is called the union of the two events. We write P(AB) for the probability of the joint occurrence of A and B. This is called the intersection of the events A and B.

For example, if we define A and B as:

A: The die shows 1, 3, or 5
B: The die shows 3
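For events defined as sets of outcomes, these probabilities follow directly from the classical definition. A small sketch of this example (my own illustration, assuming a balanced die so each face has probability 1/6):

```python
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}
A = {1, 3, 5}   # A: the die shows 1, 3, or 5
B = {3}         # B: the die shows 3

def prob(event):
    """Classical probability: favorable outcomes over possible outcomes."""
    return Fraction(len(event), len(sample_space))

print(prob(A | B))  # P(A + B): union; equals 1/2 here since B lies inside A
print(prob(A & B))  # P(AB): intersection; equals 1/6
```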


