3 Linear Transformations of the Plane

Now that we're using matrices to represent linear transformations, we'll find ourselves encountering a wide range of transformations and matrices; it can become difficult to keep track of which transformations do what. In these notes we'll develop a tool box of basic transformations which can easily be remembered by their geometric properties.
We'll focus on linear transformations T : R^2 → R^2 of the plane to itself, and thus on the 2 × 2 matrices A corresponding to these transformations. Perhaps the most important fact to keep in mind as we determine the matrices corresponding to different transformations is that the first and second columns of A are given by T(e_1) and T(e_2), respectively, where e_1 and e_2 are the standard unit vectors in R^2.
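
To make this concrete, here is a minimal NumPy sketch (not part of the notes; the helper name matrix_of and the sample map are illustrative) that builds a matrix column by column from the images of the standard unit vectors:

    import numpy as np

    def matrix_of(T):
        """Return the 2x2 matrix of a linear map T: R^2 -> R^2.
        Its first column is T(e1) and its second column is T(e2)."""
        e1 = np.array([1.0, 0.0])
        e2 = np.array([0.0, 1.0])
        return np.column_stack([T(e1), T(e2)])

    # Example: the map that doubles the x-coordinate and negates the y-coordinate.
    A = matrix_of(lambda v: np.array([2 * v[0], -v[1]]))
    print(A)
    # [[ 2.  0.]
    #  [ 0. -1.]]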
3.1   Scaling

The first transformation of R^2 that we want to consider is that of scaling every vector by some factor k. That is, T(x) = kx for every x ∈ R^2. If k = 1, then T does nothing. In this case, T(e_1) = e_1 and T(e_2) = e_2, so the columns of the corresponding matrix A are e_1 and e_2:

    A = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}.

We call this the identity matrix (of size 2) and denote it either as I_2 or as I when the size is obvious. For any other scale factor k we have

    T(e_1) = \begin{pmatrix} k \\ 0 \end{pmatrix}   and   T(e_2) = \begin{pmatrix} 0 \\ k \end{pmatrix},

so the corresponding matrix is given by

    A = \begin{pmatrix} k & 0 \\ 0 & k \end{pmatrix}.
Now suppose we have two scaling maps: T_1, which scales by a factor of k_1, and T_2, which scales by a factor of k_2. Then T_2 ∘ T_1 is the transformation that scales by a factor of k_1 and then by k_2, which is to say that it scales by a factor of k_2 k_1. This means that its matrix representation should be given by

    A = \begin{pmatrix} k_2 k_1 & 0 \\ 0 & k_2 k_1 \end{pmatrix},

and indeed we can easily check that

    A_2 A_1 = \begin{pmatrix} k_2 & 0 \\ 0 & k_2 \end{pmatrix} \begin{pmatrix} k_1 & 0 \\ 0 & k_1 \end{pmatrix} = \begin{pmatrix} k_2 k_1 & 0 \\ 0 & k_2 k_1 \end{pmatrix} = A.

This agrees with our notion of matrix multiplication representing composition of linear transformations.
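
As a quick numerical sanity check (a sketch, not from the notes; the scale factors 2 and 3 are arbitrary), composing two scaling matrices gives the scaling matrix for the product of the factors:

    import numpy as np

    def scaling(k):
        """Matrix of the map T(x) = k*x on R^2."""
        return k * np.eye(2)

    A1, A2 = scaling(2.0), scaling(3.0)
    # Scaling by 2 and then by 3 is the same as scaling by 6.
    assert np.allclose(A2 @ A1, scaling(6.0))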
                                    Figure 2: Orthogonal projection of v onto w.
3.2   Orthogonal Projection

The next linear transformation we'd like to consider is that of projecting vectors onto a line in R^2. First we have to consider what it means to project one vector onto another. Take a look at Figure 2, where we're projecting the vector v onto w orthogonally. What we mean by orthogonal projection is that the displacement vector proj_w v − v is orthogonal to the vector w, as seen in Figure 2.

We can see that proj_w v will be a scalar multiple of w, so let's write proj_w v = kw. Since we require that proj_w v − v be orthogonal to w, we have

    (kw - v) \cdot w = 0.

That is,

    k(w \cdot w) - v \cdot w = 0,

so

    k = \frac{v \cdot w}{w \cdot w}.

This means that we have

    proj_w v = \frac{v \cdot w}{w \cdot w} w.
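
For example, the formula can be checked numerically; the particular vectors below are arbitrary choices for illustration (this sketch is not part of the notes):

    import numpy as np

    def proj(v, w):
        """Orthogonal projection of v onto a nonzero vector w."""
        return (np.dot(v, w) / np.dot(w, w)) * w

    v = np.array([3.0, 1.0])
    w = np.array([2.0, 2.0])
    p = proj(v, w)
    # The displacement proj_w(v) - v should be orthogonal to w.
    assert np.isclose(np.dot(p - v, w), 0.0)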
Now notice that if we project v onto any vector which is a nonzero scalar multiple of w, the resulting vector will be the same as proj_w v. So really we're projecting v onto the line L determined by w. For this reason, we write proj_L v for the projection of v onto the line L.

Given a line L, we can compute proj_L v by first selecting a unit vector u = ⟨u_1, u_2⟩ through which L passes and then projecting v onto u. We then have

    proj_L(e_1) = proj_u(e_1) = \frac{e_1 \cdot u}{u \cdot u} u = u_1 u = \begin{pmatrix} u_1^2 \\ u_1 u_2 \end{pmatrix}

and

    proj_L(e_2) = proj_u(e_2) = \frac{e_2 \cdot u}{u \cdot u} u = u_2 u = \begin{pmatrix} u_1 u_2 \\ u_2^2 \end{pmatrix},
so the matrix A corresponding to the projection onto L is

    A = \begin{pmatrix} u_1^2 & u_1 u_2 \\ u_1 u_2 & u_2^2 \end{pmatrix}.
As before, there's a matrix product that's worth considering here. If we apply the transformation of projecting onto L twice, this should be no different than applying it once. After the first projection, all the vectors in R^2 have been mapped onto L, and projecting a vector on L onto L does nothing. This is borne out by the fact that

    A^2 = \begin{pmatrix} u_1^2 & u_1 u_2 \\ u_1 u_2 & u_2^2 \end{pmatrix} \begin{pmatrix} u_1^2 & u_1 u_2 \\ u_1 u_2 & u_2^2 \end{pmatrix} = \begin{pmatrix} u_1^2 & u_1 u_2 \\ u_1 u_2 & u_2^2 \end{pmatrix} = A.

We can also verify our claim that projecting a vector which already lies on L onto L does nothing:

    A \begin{pmatrix} ku_1 \\ ku_2 \end{pmatrix} = \begin{pmatrix} ku_1^3 + ku_1 u_2^2 \\ ku_1^2 u_2 + ku_2^3 \end{pmatrix} = \begin{pmatrix} ku_1(u_1^2 + u_2^2) \\ ku_2(u_1^2 + u_2^2) \end{pmatrix} = \begin{pmatrix} ku_1 \\ ku_2 \end{pmatrix},

where we are using the fact that any vector on L has the form ⟨ku_1, ku_2⟩ for some k (and that u_1^2 + u_2^2 = 1, since u is a unit vector).
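
A short sketch (not from the notes; the angle used to pick the line is arbitrary) that builds the projection matrix from a unit vector u = ⟨u_1, u_2⟩ and checks both facts, A^2 = A and that vectors already on L are fixed:

    import numpy as np

    def projection_matrix(u):
        """Matrix of orthogonal projection onto the line through the unit vector u."""
        u1, u2 = u
        return np.array([[u1 * u1, u1 * u2],
                         [u1 * u2, u2 * u2]])

    theta = 0.7                                   # arbitrary direction for L
    u = np.array([np.cos(theta), np.sin(theta)])  # unit vector along L
    A = projection_matrix(u)

    assert np.allclose(A @ A, A)            # projecting twice = projecting once
    assert np.allclose(A @ (5 * u), 5 * u)  # a vector already on L is unchanged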
3.3   Rotation

Figure 3: Rotation by θ.

Next we'll consider rotating the plane through some angle θ, as depicted in Figure 3. Because the vector e_1 lies on the unit circle, so does T(e_1), and T(e_1) makes an angle of θ with the x-axis. As a result, its x- and y-components are cos θ and sin θ, respectively:

    T(e_1) = \begin{pmatrix} \cos\theta \\ \sin\theta \end{pmatrix}.
At the same time, since e_2 makes an angle of π/2 with e_1, the vectors T(e_2) and T(e_1) should also have an angle of π/2 between them. So the x- and y-components of T(e_2) are cos(θ + π/2) and sin(θ + π/2), respectively:

    T(e_2) = \begin{pmatrix} \cos(\theta + \pi/2) \\ \sin(\theta + \pi/2) \end{pmatrix} = \begin{pmatrix} -\sin\theta \\ \cos\theta \end{pmatrix}.

So the matrix corresponding to rotation by θ is

    A = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.
If we let A_θ and A_α be the matrices corresponding to rotation through an angle θ and an angle α, respectively, then one can compute (using angle-addition identities) that

    A_θ A_α = \begin{pmatrix} \cos(\theta + \alpha) & -\sin(\theta + \alpha) \\ \sin(\theta + \alpha) & \cos(\theta + \alpha) \end{pmatrix},

which is the matrix corresponding to rotation through the angle θ + α.
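
That identity is easy to check numerically as well (the two angles below are arbitrary; the sketch is not part of the notes):

    import numpy as np

    def rotation(theta):
        """Matrix of rotation of R^2 through the angle theta."""
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s],
                         [s,  c]])

    theta, alpha = 0.4, 1.1   # arbitrary angles
    # Rotating by alpha and then by theta is rotation by theta + alpha.
    assert np.allclose(rotation(theta) @ rotation(alpha), rotation(theta + alpha))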
3.4   Reflection

Figure 4: Reflection across a line.

Next we'll consider the linear transformation that reflects vectors across a line L that makes an angle θ with the x-axis, as seen in Figure 4. Computing T(e_1) isn't that bad: since L makes an angle θ with the x-axis, T(e_1) should make an angle θ with L, and thus an angle 2θ with the x-axis. So

    T(e_1) = \begin{pmatrix} \cos 2\theta \\ \sin 2\theta \end{pmatrix}.

Determining T(e_2) is only slightly less straightforward. First, e_2 makes an angle of π/2 − θ with L, so T(e_2) should make the same angle with L, but on the other side of the line. Since there's an angle of θ between the x-axis and L, the angle between T(e_2) and the x-axis is π/2 − 2θ. (We're assuming that 0 ≤ θ ≤ π/2, but similar computations can be done for
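
Although the capture of this derivation ends here, the formula for T(e_1) above can already be checked numerically. The sketch below (not from the notes) reflects e_1 across L using the standard identity refl(v) = 2 proj_L(v) − v and compares the result with (cos 2θ, sin 2θ); the angle is an arbitrary choice:

    import numpy as np

    theta = 0.3                                   # arbitrary angle for the line L
    u = np.array([np.cos(theta), np.sin(theta)])  # unit vector along L

    def reflect(v):
        """Reflect v across L, using refl(v) = 2*proj_L(v) - v."""
        return 2 * np.dot(v, u) * u - v

    e1 = np.array([1.0, 0.0])
    # T(e1) makes an angle 2*theta with the x-axis, as derived above.
    assert np.allclose(reflect(e1), [np.cos(2 * theta), np.sin(2 * theta)])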