
Let $V_k$ be the vector space of polynomials of degree $k$ or less. Then $V_k$ has dimension $k+1$, since $1, t, t^2, \ldots, t^k$ is a basis. Multiplication by $f$ defines a linear transformation from $V_{m-1}$ to $V_{n+m-1}$. With respect to these standard bases for $V_{m-1}$ and $V_{n+m-1}$, the $(n+m) \times m$ matrix

$$\begin{pmatrix}
a_0     &         &        &         \\
a_1     & a_0     &        &         \\
a_2     & a_1     & \ddots &         \\
\vdots  & a_2     & \ddots & a_0     \\
a_{n-1} & \vdots  &        & a_1     \\
a_n     & a_{n-1} &        & a_2     \\
        & a_n     & \ddots & \vdots  \\
        &         & \ddots & a_{n-1} \\
        &         &        & a_n
\end{pmatrix}$$

represents multiplication by $f$. Similarly, multiplication by $g$ defines a linear transformation from $V_{n-1}$ to $V_{n+m-1}$, represented by the $(n+m) \times n$ matrix

$$\begin{pmatrix}
b_0     &         &        &         \\
b_1     & b_0     &        &         \\
b_2     & b_1     & \ddots &         \\
\vdots  & b_2     & \ddots & b_0     \\
b_{m-1} & \vdots  &        & b_1     \\
b_m     & b_{m-1} &        & b_2     \\
        & b_m     & \ddots & \vdots  \\
        &         & \ddots & b_{m-1} \\
        &         &        & b_m
\end{pmatrix}$$

with respect to the standard bases for $V_{n-1}$ and $V_{n+m-1}$. Now, let $V_{m-1} \oplus V_{n-1}$ be the vector space of pairs of polynomials $(u, v)$ where $\deg u \le m-1$ and $\deg v \le n-1$. The vector space $V_{m-1} \oplus V_{n-1}$ has dimension $m+n$, since

$$(1, 0),\; (t, 0),\; (t^2, 0),\; \ldots,\; (t^{m-1}, 0),\; (0, 1),\; (0, t),\; (0, t^2),\; \ldots,\; (0, t^{n-1})$$

is a basis; this is essentially the standard basis for $V_{m-1}$ followed by the standard basis for $V_{n-1}$. The function $T(u, v) = uf + vg$ defines a linear transformation from $V_{m-1} \oplus V_{n-1}$ to $V_{n+m-1}$ whose matrix, with respect to the above basis on $V_{m-1} \oplus V_{n-1}$ and the standard basis on $V_{n+m-1}$, is the $(n+m) \times (n+m)$ matrix

$$\operatorname{Syl}(f, g, t) =
\begin{pmatrix}
a_0    &        &        &        & b_0    &        &        &        \\
a_1    & a_0    &        &        & b_1    & b_0    &        &        \\
\vdots & a_1    & \ddots &        & \vdots & b_1    & \ddots &        \\
a_n    & \vdots & \ddots & a_0    & b_m    & \vdots & \ddots & b_0    \\
       & a_n    &        & a_1    &        & b_m    &        & b_1    \\
       &        & \ddots & \vdots &        &        & \ddots & \vdots \\
       &        &        & a_n    &        &        &        & b_m
\end{pmatrix}$$

which we call the Sylvester matrix: its first $m$ columns hold shifted copies of the coefficients of $f$, and its last $n$ columns hold shifted copies of the coefficients of $g$. The determinant of the Sylvester matrix is called the resultant of $f$ and $g$ with respect to the variable $t$:

$$\operatorname{Res}(f, g, t) = \det(\operatorname{Syl}(f, g, t)).$$

The resultant is a polynomial, with integer coefficients, in the coefficients of $f$ and $g$.
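To make the construction concrete, here is a small Python sketch (our own illustration, not part of the text; the function names and the lowest-degree-first coefficient convention are choices made here) that assembles the Sylvester matrix from coefficient lists and evaluates the resultant as an exact determinant using rational arithmetic:

```python
from fractions import Fraction

def sylvester(f, g):
    """Sylvester matrix of f = [a0, ..., an] and g = [b0, ..., bm]
    (coefficients listed lowest degree first).

    The first m columns are shifted copies of f's coefficients;
    the last n columns are shifted copies of g's coefficients.
    """
    n, m = len(f) - 1, len(g) - 1
    size = n + m
    M = [[Fraction(0)] * size for _ in range(size)]
    for col in range(m):                # columns representing u -> u*f
        for i, a in enumerate(f):
            M[col + i][col] = Fraction(a)
    for col in range(n):                # columns representing v -> v*g
        for i, b in enumerate(g):
            M[col + i][m + col] = Fraction(b)
    return M

def det(M):
    """Determinant by exact Gaussian elimination over the rationals."""
    M = [row[:] for row in M]
    n = len(M)
    sign = 1
    for k in range(n):
        piv = next((r for r in range(k, n) if M[r][k] != 0), None)
        if piv is None:
            return Fraction(0)          # a zero column: singular matrix
        if piv != k:
            M[k], M[piv] = M[piv], M[k]
            sign = -sign
        for r in range(k + 1, n):
            factor = M[r][k] / M[k][k]
            for c in range(k, n):
                M[r][c] -= factor * M[k][c]
    result = Fraction(sign)
    for k in range(n):
        result *= M[k][k]
    return result

def resultant(f, g):
    return det(sylvester(f, g))

print(resultant([1, 0, 1], [-1, 1]))   # t^2 + 1 and t - 1 are coprime -> 2
print(resultant([-1, 0, 1], [-1, 1]))  # t^2 - 1 and t - 1 share t - 1 -> 0
```

Exact `Fraction` arithmetic is used so that a zero determinant is detected reliably; floating-point elimination could return a tiny nonzero value instead.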

The determinant of a matrix is non-zero precisely when the corresponding linear transformation is one-to-one (equivalently, if and only if it is onto). Since solutions to $uf + vg = h$ with $u \in V_{m-1}$ and $v \in V_{n-1}$ exist (and are unique) for every $h \in V_{n+m-1}$ precisely when $f$ and $g$ have no common factor, we have the following:
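The connection between invertibility of the Sylvester matrix and solvability of $uf + vg = h$ can also be checked computationally. The sketch below (our own, under a lowest-degree-first coefficient convention; it assumes the matrix is invertible, i.e. that $f$ and $g$ are coprime) solves $uf + vg = 1$ by treating $\operatorname{Syl}(f,g,t)$ as the matrix of $T$:

```python
from fractions import Fraction

def sylvester(f, g):
    """Sylvester matrix; f = [a0, ..., an], g = [b0, ..., bm]."""
    n, m = len(f) - 1, len(g) - 1
    size = n + m
    M = [[Fraction(0)] * size for _ in range(size)]
    for col in range(m):
        for i, a in enumerate(f):
            M[col + i][col] = Fraction(a)
    for col in range(n):
        for i, b in enumerate(g):
            M[col + i][m + col] = Fraction(b)
    return M

def solve(M, rhs):
    """Solve M x = rhs by exact Gauss-Jordan elimination
    (assumes M is invertible)."""
    n = len(M)
    A = [row[:] + [rhs[i]] for i, row in enumerate(M)]
    for k in range(n):
        piv = next(r for r in range(k, n) if A[r][k] != 0)
        A[k], A[piv] = A[piv], A[k]
        A[k] = [x / A[k][k] for x in A[k]]
        for r in range(n):
            if r != k and A[r][k] != 0:
                A[r] = [x - A[r][k] * y for x, y in zip(A[r], A[k])]
    return [A[r][n] for r in range(n)]

# f = t^2 + 1 and g = t - 1 are coprime, so u f + v g = 1 is solvable
f, g = [1, 0, 1], [-1, 1]
n, m = len(f) - 1, len(g) - 1
rhs = [Fraction(1)] + [Fraction(0)] * (n + m - 1)   # h = 1 in V_{n+m-1}
coeffs = solve(sylvester(f, g), rhs)
u, v = coeffs[:m], coeffs[m:]   # deg u <= m-1, deg v <= n-1
print(u, v)   # u = 1/2 and v = -1/2 - t/2: (1/2)f - ((1+t)/2)g = 1
```

One can verify by hand that $\tfrac12(t^2+1) - \tfrac{1+t}{2}(t-1) = 1$.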

Theorem Suppose that $f, g \in k[t]$ are polynomials over a field $k$ of degrees $n > 0$ and $m > 0$ respectively. Then $\operatorname{Res}(f, g, t) = 0$ if and only if $f$ and $g$ have a common factor of positive degree in $k[t]$.

One thing we need to be careful of is that this theorem only applies when the degrees of $f$ and $g$ really are $n$ and $m$. If $a_n = b_m = 0$, applying the resultant as if $f$ and $g$ had degrees $n$ and $m$ will always yield zero (Why?), even though $f$ and $g$ may not have a common factor.
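This failure can be observed directly, and hints at the answer to the question above: when both leading coefficients vanish, the bottom row of the padded Sylvester matrix is identically zero, since its only entries are $a_n$ and $b_m$. A short sketch (our own construction, not from the text):

```python
def shifted_columns(coeffs, num_cols, size):
    """Columns of a Sylvester-style matrix: each column is a copy of
    coeffs (lowest degree first) shifted down by the column index."""
    cols = []
    for s in range(num_cols):
        col = [0] * size
        for i, c in enumerate(coeffs):
            col[s + i] = c
        cols.append(col)
    return cols

f = [1, 1, 0]    # t + 1, padded as if it had degree n = 2 (a2 = 0)
g = [-1, 1, 0]   # t - 1, padded as if it had degree m = 2 (b2 = 0)
n = m = 2
size = n + m
cols = shifted_columns(f, m, size) + shifted_columns(g, n, size)
rows = [[col[r] for col in cols] for r in range(size)]
# The bottom row contains only a_n and b_m, both zero here,
# which forces the determinant to be zero even though
# t + 1 and t - 1 have no common factor:
print(rows[-1])   # [0, 0, 0, 0]
```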

Remark There's another way to define the resultant. If we assume that $f, g \in \mathbf{C}[t]$ are monic polynomials, then by the fundamental theorem of algebra, we can factor them over the complex numbers as

$$f(t) = (t - \alpha_1)(t - \alpha_2)\cdots(t - \alpha_n), \qquad g(t) = (t - \beta_1)(t - \beta_2)\cdots(t - \beta_m).$$

Then if we form the product

$$R(f, g, t) = \prod_{j=1}^{n} \prod_{k=1}^{m} (\alpha_j - \beta_k)$$

then it will certainly have the property that $R(f, g, t) = 0$ if and only if $f$ and $g$ have a root (or equivalently, a factor) in common. Also, it's easy to see that permuting the $\alpha_j$ or permuting the $\beta_k$ has no effect on this product. It turns out that this means it's possible to rewrite $R(f, g, t)$ as a polynomial in the coefficients of $f$ and $g$. The general statement is that a symmetric polynomial in the roots $\alpha_1, \ldots, \alpha_n$ of a monic polynomial $f(t) = t^n + a_{n-1}t^{n-1} + \cdots + a_0$ can be written as a polynomial in the elementary symmetric polynomials

$$\begin{aligned}
-a_{n-1} &= \sigma_1 = \alpha_1 + \cdots + \alpha_n, \\
a_{n-2} &= \sigma_2 = \alpha_1\alpha_2 + \alpha_1\alpha_3 + \cdots + \alpha_1\alpha_n + \alpha_2\alpha_3 + \cdots + \alpha_2\alpha_n + \cdots + \alpha_{n-1}\alpha_n, \\
&\;\;\vdots \\
(-1)^r a_{n-r} &= \sigma_r = \sum_{i_1 < i_2 < \cdots < i_r} \alpha_{i_1}\alpha_{i_2}\cdots\alpha_{i_r}, \\
&\;\;\vdots \\
(-1)^n a_0 &= \sigma_n = \alpha_1\alpha_2\cdots\alpha_n,
\end{aligned}$$
and is thus a polynomial in the coefficients of $f$; see section 7.1 of Cox, Little, and O'Shea for details. Our original definition of the resultant $\operatorname{Res}(f, g, t)$ is, by construction, a polynomial in the coefficients $a_i$, $b_j$, and in fact $R(f, g, t) = \operatorname{Res}(f, g, t)$.
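As a sanity check of the identity $R(f, g, t) = \operatorname{Res}(f, g, t)$, one can compare the two definitions on a small example with known integer roots (our own example, not from the text):

```python
# f = (t - 1)(t - 2) = t^2 - 3t + 2 (roots 1, 2); g = t - 3 (root 3)
alphas, betas = [1, 2], [3]

# Product-of-roots definition: R = (1-3)*(2-3) = 2
R = 1
for a in alphas:
    for b in betas:
        R *= a - b

# Sylvester-determinant definition: n = 2, m = 1, so a 3x3 matrix
# with one shifted copy of f's coefficients (2, -3, 1) and
# two shifted copies of g's coefficients (-3, 1).
M = [[ 2, -3,  0],
     [-3,  1, -3],
     [ 1,  0,  1]]
det = (M[0][0] * (M[1][1]*M[2][2] - M[1][2]*M[2][1])
     - M[0][1] * (M[1][0]*M[2][2] - M[1][2]*M[2][0])
     + M[0][2] * (M[1][0]*M[2][1] - M[1][1]*M[2][0]))

print(R, det)   # both equal 2
```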

Source:  OpenStax, The art of the pfug. OpenStax CNX. Jun 05, 2013 Download for free at http://cnx.org/content/col10523/1.34