



\[
|A| = \begin{vmatrix} 4 & 10 \\ 2 & 3 \end{vmatrix} = 12 - 20 = -8
\]
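This arithmetic can be checked numerically; the following NumPy snippet (not part of the original text) evaluates the same 2 × 2 determinant:

```python
import numpy as np

# |4 10; 2 3| = 4*3 - 10*2 = 12 - 20 = -8
A = np.array([[4, 10],
              [2, 3]])
det = np.linalg.det(A)
print(round(det))  # -8
```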

5. From property 4 we can deduce the important property that if a row (or column) of a matrix A can be expressed as a linear combination of the other rows (or columns), the determinant of A is zero. For example, consider

\[
A = \begin{bmatrix} 3 & -1 & 1 \\ 4 & 2 & 8 \\ 10 & 3 & 16 \end{bmatrix}
\]

Here the third column of A = (first column) + 2(second column). In such cases we say that the columns are linearly dependent. When we evaluate |A| we can subtract (column 1 + 2 · column 2) from column 3 and get all zeros in column 3. The expansion by the elements of column 3 then gives |A| = 0. From this property we can derive the following theorem:

Theorem: The determinant of a matrix A is nonzero if and only if there are no linear dependencies between the columns (or rows) of A. If |A| ≠ 0, A is said to be nonsingular. If |A| = 0, A is said to be singular.

Note that each row (or column) of a matrix is a vector. A set of vectors is said to be linearly independent if none can be expressed as a linear combination of the rest. For example, (1, -1, 0), (2, 3, 1), and (4, 1, 1) are not linearly independent because the third vector = 2(first vector) + (second vector). There cannot be more than n linearly independent vectors of order n.
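The theorem can be illustrated numerically. This NumPy sketch (mine, not the text's) confirms that the matrix above with linearly dependent columns has zero determinant, and that the three vectors satisfy the stated dependency:

```python
import numpy as np

# Third column = (first column) + 2*(second column)
A = np.array([[3, -1, 1],
              [4,  2, 8],
              [10, 3, 16]])
assert abs(np.linalg.det(A)) < 1e-9   # determinant is zero: A is singular
assert np.linalg.matrix_rank(A) == 2  # only two independent columns

# The three vectors from the text: v3 = 2*v1 + v2
v1, v2, v3 = np.array([1, -1, 0]), np.array([2, 3, 1]), np.array([4, 1, 1])
assert np.array_equal(2 * v1 + v2, v3)
```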

Determinants of the Third Order

There is a simple procedure for evaluating determinants of the third order. For higher-order determinants we have to follow the expansion by cofactors and the simplification rule 4. Consider

\[
A = \begin{bmatrix} 4 & 3 & 3 \\ 3 & 1 & 4 \\ 3 & 4 & 1 \end{bmatrix}
\]

What we do is append the first two columns to this and take products of the diagonal elements. There are three products going down and three products going up. The products going down are positive; the products going up are negative. The value of the determinant is 76 - 82 = -6.

Products going down: 4 · 1 · 1 + 3 · 4 · 3 + 3 · 3 · 4 = 4 + 36 + 36 = 76

Products going up: 3 · 1 · 3 + 4 · 4 · 4 + 3 · 3 · 1 = 9 + 64 + 9 = 82
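The diagonal procedure described above (often called the rule of Sarrus) is easy to implement; here is a minimal Python sketch, with the function name `det3_sarrus` my own:

```python
def det3_sarrus(a):
    """3x3 determinant by the diagonal rule: three 'down'
    products minus three 'up' products."""
    down = (a[0][0] * a[1][1] * a[2][2]
            + a[0][1] * a[1][2] * a[2][0]
            + a[0][2] * a[1][0] * a[2][1])
    up = (a[0][2] * a[1][1] * a[2][0]
          + a[0][0] * a[1][2] * a[2][1]
          + a[0][1] * a[1][0] * a[2][2])
    return down - up

A = [[4, 3, 3], [3, 1, 4], [3, 4, 1]]
print(det3_sarrus(A))  # 76 - 82 = -6
```

Remember that this shortcut works only for 3 × 3 determinants; for higher orders, expansion by cofactors is needed.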



As another example, consider

\[
A = \begin{bmatrix} 3 & -1 & 1 \\ 4 & 2 & 8 \\ 1 & 2 & 3 \end{bmatrix}
\]

We have

Products going down: 3 · 2 · 3 + (-1) · 8 · 1 + 1 · 4 · 2 = 18 - 8 + 8 = 18

Products going up: 1 · 2 · 1 + 3 · 8 · 2 + (-1) · 4 · 3 = 2 + 48 - 12 = 38

The value of the determinant is 18 - 38 = -20.

Finding the Inverse of a Matrix

To get the inverse of a matrix A, we first replace each element of A by its cofactor, transpose the resulting matrix, and then divide each element by |A|. For instance, with the 3 × 3 matrix A that we have been considering,

\[
A^{-1} = \frac{1}{|A|}
\begin{bmatrix} A_{11} & A_{21} & A_{31} \\ A_{12} & A_{22} & A_{32} \\ A_{13} & A_{23} & A_{33} \end{bmatrix}
\]

where \(A_{ij}\) denotes the cofactor of \(a_{ij}\).

Noting that expansion by own cofactors gives |A| and expansion by alien cofactors gives zero, we get

\[
A^{-1}A = \frac{1}{|A|}
\begin{bmatrix} A_{11} & A_{21} & A_{31} \\ A_{12} & A_{22} & A_{32} \\ A_{13} & A_{23} & A_{33} \end{bmatrix}
\begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}
= \frac{1}{|A|}
\begin{bmatrix} |A| & 0 & 0 \\ 0 & |A| & 0 \\ 0 & 0 & |A| \end{bmatrix}
= I
\]

We can also check that AA^{-1} = I. If |A| = 0, then A^{-1} does not exist. Thus for a singular matrix, the inverse does not exist. As an example of computing an inverse, consider the matrix

\[
A = \begin{bmatrix} 3 & -1 & 1 \\ 4 & 2 & 8 \\ 1 & 2 & 3 \end{bmatrix}
\]

To find the inverse, first we have to find the matrix of cofactors and also |A|. The matrix of cofactors is

\[
\begin{bmatrix} -10 & -4 & 6 \\ 5 & 8 & -7 \\ -10 & -20 & 10 \end{bmatrix}
\]



and |A| = -20. Hence

\[
A^{-1} = \frac{1}{-20}
\begin{bmatrix} -10 & 5 & -10 \\ -4 & 8 & -20 \\ 6 & -7 & 10 \end{bmatrix}
\]
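The cofactor-and-transpose recipe can be coded directly. A NumPy sketch (the loop structure and variable names are my own) that reproduces |A| = -20 and verifies A⁻¹A = I:

```python
import numpy as np

A = np.array([[3, -1, 1],
              [4,  2, 8],
              [1,  2, 3]], dtype=float)

# Cofactor C_ij = (-1)^(i+j) times the minor obtained by
# deleting row i and column j of A.
C = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
        C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)

detA = np.linalg.det(A)   # -20
A_inv = C.T / detA        # transpose of cofactor matrix, divided by |A|
assert np.allclose(A_inv @ A, np.eye(3))
assert np.allclose(A @ A_inv, np.eye(3))
```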

Reversal Law

Inverses follow a reversal law, just like transposes. If A and B both have inverses, then

\[
(AB)^{-1} = B^{-1}A^{-1}
\]
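A quick numerical check of the reversal law, using two small invertible matrices of my own choosing:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])   # det = 1, so invertible
B = np.array([[1., 2.],
              [0., 1.]])   # det = 1, so invertible

# (AB)^{-1} = B^{-1} A^{-1}  -- note the reversed order
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
assert np.allclose(lhs, rhs)
```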

Orthogonal Matrices

Two vectors \(b_i\) and \(b_j\) are said to be orthogonal if \(b_i'b_j = 0\). They are also of unit length if \(b_i'b_i = b_j'b_j = 1\). A matrix B is said to be an orthogonal matrix if its rows are orthogonal and of unit length. We have

\[
BB' =
\begin{bmatrix} b_1' \\ b_2' \\ \vdots \\ b_n' \end{bmatrix}
\begin{bmatrix} b_1 & b_2 & \cdots & b_n \end{bmatrix}
=
\begin{bmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & & & \vdots \\ 0 & 0 & \cdots & 1 \end{bmatrix}
= I
\]

the identity matrix. Postmultiplying both sides by \((B')^{-1}\), we get

\[
B = (B')^{-1}, \quad \text{that is,} \quad B^{-1} = B'
\]

Thus for an orthogonal matrix, the inverse is just the transpose. Since \(B^{-1} = B'\), we also have \(B'B = I\); thus for an orthogonal matrix the columns as well as the rows are orthogonal. Also, \(|B||B'| = |B|^2 = 1\). Hence \(|B| = \pm 1\).
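A rotation matrix is a standard concrete example of an orthogonal matrix; this NumPy sketch (my example, not the text's) verifies the properties just derived:

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal: its rows (and columns) are
# orthogonal and of unit length, so B^{-1} = B' and |B| = +1 or -1.
t = 0.7
B = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

assert np.allclose(B @ B.T, np.eye(2))      # rows orthonormal
assert np.allclose(B.T @ B, np.eye(2))      # columns orthonormal
assert np.allclose(np.linalg.inv(B), B.T)   # inverse is the transpose
assert round(np.linalg.det(B)) in (1, -1)   # determinant is +1 or -1
```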

Rank of a Matrix

A matrix of order m × n can be regarded as a set of m row vectors or a set of n column vectors. The row rank of the matrix is the number of linearly independent row vectors. The column rank of the matrix is the number of linearly independent column vectors. The row rank is ≤ m and the column rank is ≤ n. There are three important results on the rank of a matrix (which we shall state without proof).

1. Row rank = column rank.

2. If A and B are two matrices such that their product AB is defined, rank(AB) is not greater than rank(A) or rank(B).

3. The rank of a matrix is unaltered by pre- or postmultiplication by a nonsingular matrix. This says that rank is unaltered by taking linear combinations of rows (or columns).
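The three rank results can be checked numerically; the matrices here are illustrative choices of my own:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],    # row 2 = 2 * row 1: a linear dependency
              [0, 1, 1]])
B = np.array([[1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])   # det = -1, so B is nonsingular

rank = np.linalg.matrix_rank
assert rank(A) == rank(A.T) == 2              # 1. row rank = column rank
assert rank(A @ B) <= min(rank(A), rank(B))   # 2. rank(AB) bound
assert rank(A @ B) == rank(A)                 # 3. B nonsingular: rank unchanged
```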


