Okay, here is lecture 10 in linear algebra. Two important things to do in this lecture: one is to correct an error from lecture 9 — the blackboard with that awful error is still with us — and the second, the big thing, is to tell you about the four subspaces that come with a matrix. We've seen two subspaces, the column space and the null space; there are two to go.

First of all — and this is a great way to recap and correct the previous lecture — you remember I was just doing R^3. I couldn't have taken a simpler example than R^3, and I wrote down the standard basis. That basis is the obvious basis for the whole three-dimensional space. Then I wanted to make the point that there was nothing special about that basis that another basis couldn't have: it could have linear independence, it could span the space. There are lots of other bases. So I started with the vectors (1, 1, 2) and (2, 2, 5), and those were independent. Then I said (3, 3, 7) wouldn't do, because (3, 3, 7) is the sum of those. So in my innocence I put in (3, 3, 8). I figured: if (3, 3, 7) is in the plane with these two — which I know it is — then probably (3, 3, 8) sticks a little bit out of the plane, it's independent, and it gives a basis.

But after class, to my sorrow, a student tells me: wait a minute, that third vector (3, 3, 8) is not independent. And why did she say that? She didn't actually take the time — didn't have to — to find what combination of this one and this one gives (3, 3, 8). She did something else: she looked ahead. She said, wait a minute, if I look at that matrix, it's not invertible. That third column can't be independent of the first two, because when I look at that matrix it's got two identical rows. I have a square matrix whose rows are obviously dependent, and that makes the columns dependent. So there's my error: when I look at the matrix A that has those three columns, those three columns can't be independent, because that matrix is not invertible — it has two equal rows.
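A quick numerical check of that correction — a sketch in NumPy, where the matrix below has the lecture's three candidate basis vectors (1, 1, 2), (2, 2, 5), (3, 3, 8) as its columns:

```python
import numpy as np

# Columns are the three candidate basis vectors from the lecture.
A = np.array([[1, 2, 3],
              [1, 2, 3],
              [2, 5, 8]])

# Two identical rows force the determinant to zero and the rank below 3,
# so the three columns cannot be independent.
print(np.linalg.det(A.astype(float)))  # 0 (up to roundoff)
print(np.linalg.matrix_rank(A))        # 2
```

The dependent rows reveal the dependent columns without ever finding the combination explicitly — exactly the student's shortcut.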
It's got two equal rows, and today's lecture will reach the conclusion — the great conclusion — that connects the column space with the row space. The row space is now going to be another one of my fundamental subspaces. Looking at the rows of that matrix, I'll have two equal rows, and the row space will be only two-dimensional. The rank of the matrix with these columns will only be two, so only two of those columns can be independent. The rows tell me something about the columns — something that I should have noticed, and I didn't.

Okay, so now let me pin down these four fundamental subspaces. Here are the four fundamental subspaces — this is really the heart of this approach to linear algebra, to see these four subspaces and how they're related. So what are they? The column space C(A); the null space N(A); and now comes the row space, something new. The row space — what's in that? It's all combinations of the rows. That's natural: we want a space, so we have to take all combinations, and we start with the rows. So the rows span the row space. Are the rows a basis for the row space? Maybe so, maybe no. The rows are a basis for the row space when they're independent, but if they're dependent — as in this example, my error from last time — they're not: those three rows are not a basis. The row space would only be two-dimensional; I only need two rows for a basis. So the row space — what's in it is all combinations of the rows of A. But I don't like working with row vectors; all my vectors have been column vectors, and I'd like to stay with column vectors. How can I get column vectors out of these rows? I transpose the matrix. So if that's okay with you, I'm going to transpose the matrix. I'm going to say: all combinations of the columns of A transpose. And that allows me to use the convenient notation: the
column space of A transpose, C(A^T). No mathematics went on there — we just got some vectors that were lying down to stand up. But it means that we can use this column space of A transpose; that's telling me, in a nice matrix notation, what the row space is.

And finally there's another null space. The fourth fundamental subspace will be the null space of A transpose, and of course my notation is N(A^T). Now, we don't have a perfect name for this space as connecting with A, but our usual name is the left null space, and I'll show you why in a moment. So often I call it — just to write that word — the left null space of A. Just the way we have the row space of A and we switch it to the column space of A transpose, we have this space of guys that I call the left null space of A, but the good notation is: it's the null space of A transpose.

Okay, those are the four spaces. Where are those spaces — what big space are they in — when A is m by n? In that case, the null space of A: what's in it? Vectors with n components, solutions to Ax = 0. So the null space of A is in R^n. What's in the column space of A? Well, columns — how many components do those columns have? m. So this column space is in R^m. What about the column space of A transpose, which is just a disguised way of saying the rows of A? The rows of A — in this 3 by 6 matrix, say — have six components, n components: this column space is in R^n. And the null space of A transpose — I see that this fourth space is already getting second-class citizen treatment, and it doesn't deserve it. It should be there, it is there, and it shouldn't be squeezed. If the null space of A had vectors with n components, the null space of A transpose will be in R^m. I want to draw a picture of the four spaces. Here are the four spaces: let me put n-dimensional space over on
this side. Then, which were the subspaces in R^n? The null space was one, and the row space was the other. So here I can make that picture of the row space, and this kind of picture of the null space — that's just meant to be a sketch to remind you which space they're in, what type of vectors are in them: vectors with n components. Over here, inside R^m — consisting of vectors with m components — is the column space, and what I'm calling the null space of A transpose; those are the ones with m components.

Okay. To understand these spaces is our job now, because by understanding those spaces we know everything about this half of linear algebra. What do I mean by understanding those spaces? I would like to know a basis for those spaces. For each one of those spaces: how would I construct a basis — what systematic way would produce a basis — and what's its dimension? So for each of the four spaces I have to answer those questions: how do I produce a basis, which has a somewhat long answer, and what's the dimension, which is just a number, so it has a real short answer.

Can I give you the short answer first? I shouldn't do it, but here it is. I can tell you the dimension of the column space — let me start with this guy. What's its dimension? I have an m by n matrix. The dimension of the column space is the rank, r. We actually got to that at the end of the last lecture, but only for an example, so I really have to say what's going on there — I should produce a basis, and then I just look to see how many vectors I needed in that basis, and the answer will be r. Actually, I'll do that before I get on to the others. What's a basis for the column space? We've done all the work of row reduction, identifying the pivot columns — the ones that end up with pivots. But now the pivot columns I'm interested in are columns of A, the original A, and those pivot columns — there are r of them; the rank r counts them — those are a basis. So if I answer this question
for the column space, the answer will be: a basis is the pivot columns, and the dimension is the rank r — there are r pivot columns, and everything's great. So that space we pretty well understand. I probably should go back a little to prove that this is the right answer, but you know it's the right answer.

Now let me look at the row space. Shall I tell you the dimension of the row space? Yes — before we do even an example, let me tell you. Its dimension is also r. The row space and the column space have the same dimension. That's a wonderful fact: the dimension of the column space of A transpose — that's the row space — is r. That space is r-dimensional, and so is this one.

That's the sort of insight that got used in this example. If those are the three columns of a matrix — let me make them the three columns of a matrix by just erasing some brackets — okay, those are the three columns of a matrix. The rank of that matrix: if I look at the columns, it wasn't obvious, to me anyway. But if I look at the rows, now it's obvious. The row space of that matrix obviously is two-dimensional, because I see a basis for the row space: this row and that row. Of course, strictly speaking, I'm supposed to transpose those guys, make them stand up. But the rank is two, and therefore the column space is two-dimensional, by this wonderful fact that the row space and the column space have the same dimension. Therefore there are only two pivot columns, not three, and the three columns are dependent.

Okay, now let me leave that error behind and talk about the row space — well, I'm going to give you the dimensions of all the spaces, because that's such a nice answer. So let me come back here: we have this great fact to establish, that the row space's dimension is also the rank. What about the null space? What's a basis for the null space, and what's its dimension? Let me put that answer up next.
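That equality of dimensions can be sketched numerically — again with the matrix from the start of the lecture, whose columns are (1, 1, 2), (2, 2, 5), (3, 3, 8):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [1, 2, 3],
              [2, 5, 8]])

r = np.linalg.matrix_rank(A)
print(r)                               # 2: dim C(A) = r

# The first two (pivot) columns already achieve the full rank,
# so they are a basis for the column space:
print(np.linalg.matrix_rank(A[:, :2])) # 2
# and the rank of A transpose is the same number:
print(np.linalg.matrix_rank(A.T))      # 2: dim C(A^T) = r too
```

Row rank equals column rank, so the two equal rows immediately cap the number of independent columns at two.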
Here, for the null space: well, how have we constructed the null space? We took the matrix A, we did those row operations to get it into a form U — or even further, we got it into the reduced form R — and then we read off special solutions. Every special solution came from a free variable, and those special solutions are in the null space, and the great thing is, they're a basis for it. So for the null space, a basis will be the special solutions, and there's one for every free variable: for each free variable, we give that variable the value 1 and the other free variables 0, we solve for the pivot variables, and we get a special solution. So we get, all together, n minus r of them, because that's the number of free variables: if r is the number of pivot variables, then n minus r is the number of free variables. So the beauty is that those special solutions form a basis and tell us immediately that the dimension of the null space is — I'd better write this well, because it's so nice — n minus r.

And you see the nice thing about the two dimensions: in this n-dimensional space, one subspace is r-dimensional (to be proved — that's the row space), the other subspace is (n minus r)-dimensional (that's the null space), and the two dimensions together give n. The sum of r and n minus r is n, and that's just great. It's really copying the fact that we have n variables, r of them pivot variables and n minus r of them free variables — n all together.

Okay, and now what's the dimension of this poor misbegotten fourth subspace? It's got to be m minus r. The dimension of this left null space — left out, practically — is m minus r. That's really just saying that, again, the sum of that plus that is m, and m is correct: it's the number of columns in A transpose. A transpose is just as good a matrix as A; it just happens to be n by m, so it happens to have m columns. So it will have m variables when I go to A transpose y = 0, and r of them will be pivot variables and m minus r will be free variables.
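As a concrete sketch of the special-solution count — this uses the 3 by 4 example matrix that shows up a little later in the lecture, which has rank 2, so there are n − r = 4 − 2 = 2 special solutions:

```python
import numpy as np

A = np.array([[1, 2, 3, 1],
              [1, 1, 2, 1],
              [1, 2, 3, 1]])

n = A.shape[1]
r = np.linalg.matrix_rank(A)
print(n - r)   # 2 free variables -> dim N(A) = 2

# The two special solutions: set the free variables (x3, x4)
# to (1, 0) and (0, 1) in turn and solve for the pivot variables.
s1 = np.array([-1, -1, 1, 0])
s2 = np.array([-1, 0, 0, 1])
print(A @ s1)  # [0 0 0]
print(A @ s2)  # [0 0 0]
```

Two independent solutions in R^4, and r + (n − r) = 2 + 2 = 4 accounts for every variable.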
A transpose is as good a matrix as A; it follows the same rule, that this dimension plus this dimension adds up to the number of columns — and over here, A transpose has m columns.

Okay, so I gave you the easy answer, the dimensions. Now can I go back to check on a basis — say, for the row space? Because we've got a basis for the column space: the pivot columns give a basis for the column space. Now I'm asking you about the row space. You could say: okay, I can produce a basis for the row space by transposing my matrix, making those rows into columns, then doing elimination — row reduction — and checking out the pivot columns of the transposed matrix. But that means you had to do all that row reduction on A transpose. It ought to be possible without that. Let me take a matrix A — maybe we had this matrix in the last lecture:

A = [ 1 2 3 1 ; 1 1 2 1 ; 1 2 3 1 ]

That matrix was so easy, we spotted its pivot columns, one and two, without actually doing row reduction. But now let's do the job properly. I subtract the first row from the second to produce a zero: 1 2 3 1 is fine, and subtracting it away leaves me 0 −1 −1 0, right? And subtracting the first row from the last row — well, that's easy. Okay, I'm doing row reduction; the first column is all set. In the second column I now see the pivot, and I can clean up. Actually, why don't I make the pivot into a 1? I'll multiply that row through by minus 1, and then I have 0 1 1 0 — that was an elementary operation I'm allowed: multiply a row by a number. And now I'll do elimination: two of that row away from the top row knocks this guy out, so the top row becomes 1 0 1 1. Done — that's R:

R = [ 1 0 1 1 ; 0 1 1 0 ; 0 0 0 0 ]

I'm seeing the identity matrix here, I'm seeing zeros below, and I'm seeing F there. Okay, what about its row space — what happened to its row space? Well, let me first ask — just because sometimes something does happen — about its column space. The column space changed: the column
space of R is not the column space of A, right? Because (1, 1, 1) is certainly in the column space of A and certainly not in the column space of R. I did row operations, and those row operations preserve the row space — so the column spaces are different, but I believe they have the same row space. I believe that the row space of that matrix and the row space of this matrix are identical: they have exactly the same vectors in them. Those vectors are vectors with four components; they're all combinations of those rows, and I believe you get the same thing by taking all combinations of these rows.

And if that's true, what's a basis? What's a basis for the row space of R? It'll be a basis for the row space of the original A too, but it's obviously a basis for the row space of R: the first two rows. So a basis for the row space of A, or of R, is the first r rows of R — not of A. Sometimes it's true for A, but not necessarily. But for R we definitely have a matrix whose row space we can identify. The row space is spanned by the three rows, but if we want a basis, we want independence, so out goes row three. The row space is also spanned by the first two rows; that third row didn't contribute anything, and of course over here the 1 2 3 1 at the bottom didn't contribute anything either — we had it already. So this here is a basis: (1, 0, 1, 1) and (0, 1, 1, 0). I believe those are in the row space, and I know they're independent. Why are they in the row space? Because all those operations we did started with these rows and took combinations of them. I took this row minus this row; that gave me something that's still in the row space — that's the point. When I take a row minus a multiple of another row, I'm staying in the row space. The row space is not changing; my little basis for it is changing, and I've ended up with a sort of best basis.
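That invariance is easy to check numerically — a sketch: if stacking A on top of R doesn't raise the rank beyond rank A = rank R, then neither set of rows adds any direction outside the other's span, so the two row spaces coincide:

```python
import numpy as np

A = np.array([[1, 2, 3, 1],
              [1, 1, 2, 1],
              [1, 2, 3, 1]])
R = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0],
              [0, 0, 0, 0]])

# Elimination preserves the row space: stacking the rows of A and R
# together gives no new directions beyond either one alone.
print(np.linalg.matrix_rank(A))                  # 2
print(np.linalg.matrix_rank(R))                  # 2
print(np.linalg.matrix_rank(np.vstack([A, R])))  # still 2 -> same row space
```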
Sort of the best basis: if the columns of the identity matrix are the best basis for R^3 or R^n, the rows of this matrix R are the best basis for the row space — best in the sense of being as clean as I can make it, starting off with the identity and then finishing up with whatever has to be in there.

Okay, do you see then that the dimension is r for sure? Because we've got r pivots, r nonzero rows — we've got the right number of vectors, r, they're in the row space, and they're independent. That's it: they are a basis for the row space. And we can even pin that down further. How do I know that every row of A is a combination of these — how do I know they span the row space? Well, somebody says: you've got the right number of them, so they must. And that's true. But let me just say how I know this row is a combination of those: by just reversing the steps of row reduction. If I reverse the steps and go from R back to A, what am I doing? I'm starting with these rows, taking combinations of them, and after a couple of steps — undoing the subtractions that I did before — I'm back to the original rows. So the rows of A are combinations of the rows of R, and the rows of R are combinations of the rows of A. The two row spaces are the same, and the natural basis is this guy, R. The row space is sitting there in R in its cleanest possible form.

Okay, now what about the fourth guy, the null space of A transpose? First of all, why do I call it the left null space? Let me show you. The fourth space is the null space of A transpose, so it has in it vectors — let me call them y — such that A transpose y equals zero. If A transpose y = 0, then y is in the null space of A transpose, of course. So this is a matrix times a column equaling zero. Now, because I want y to sit on the left, and I want A instead of A transpose, I'll just transpose that equation. Can I do that? On the right it makes the zero vector lie down, and on the left
it's a product, A transpose times y. If I take the transpose, they come in the opposite order, right? So it's y transpose times A transpose transpose — but nobody's going to leave it like that: A transpose transpose is just A, of course. When I transpose A transpose I get back to A. Now do you see what I have? I have a row vector, y transpose, multiplying A, and multiplying from the left: y transpose A = 0 transpose. That's why I call it the left null space. But by putting y on the left I had to make it into a row vector instead of a column vector, and so my convention is, I usually don't do that — I usually stay with A transpose y = 0.

Okay, and you might ask — or I might ask — how do we get a basis for this fourth space, the left null space? I'll do it for the example. The left null space is not jumping out at me here. For the null space, I knew which were the free variables and I had the special solutions — but those are special solutions to Ax = 0, and now I'm looking at A transpose, and I'm not seeing it here. But somehow you feel that the work you did, which simplified A to R, should have revealed the left null space too. It's slightly less immediate, but it's there. From A to R, I took some steps, and I'm interested in what those steps were — or rather, all of them together. I'm not interested in the particular steps; I'm interested in the whole matrix that took me from A to R.

How would you find that matrix? Do you remember Gauss-Jordan, where you tack on the identity matrix? Let's do that again. So this is the idea: I take the matrix A, which is m by n. In Gauss-Jordan, when we saw it before, A was a square invertible matrix and we were finding its inverse. Now the matrix isn't square — it's probably rectangular — but I'll still tack on the identity matrix, and of course, since these rows have length m, it had better be m by m. And now I'll take the reduced row echelon form of this matrix [A I]. And what do I get? The
reduced row echelon form starts with these first columns, works like mad, and produces R — of course still that same size, m by n; we did it before. And then, whatever it did to get R, something else is going to show up here; let me call it E, m by m. That E is just going to contain a record of what we did: we did whatever it took to make A become R, and at the same time we were doing it to the identity matrix. We started with the identity matrix and buzzed along. All this row reduction amounted to multiplying on the left by some matrix — some series of elementary matrices that all together gave us one matrix — and that matrix is E. So all this row reduction stuff amounted to multiplying by E. How do I know that? It certainly amounted to multiplying by something, and that something took I to E, so that something was E. So now look at the first part: E A is R. No big deal — all I've said is that the row reduction steps we all know, taking A to R, are collected in some matrix, and I can find out what that matrix is by tacking I on and seeing what comes out. What comes out is E.

Let's just review the invertible square case — what happened then? Because I was interested in it in chapter two also. When A was square and invertible, I took [A I], I did row elimination, and what was the R that came out? It was I. So in chapter two, R was I: the reduced row echelon form of a nice invertible square matrix is the identity. And if R was I in that case, then E was A inverse, because E A is I. That was good and easy. Now what I'm saying is that there still is an E. It's not A inverse any more, because A is rectangular — it hasn't got an inverse — but there is still some matrix E that connected this to this. Oh, I should have figured out in advance what it was — shoot, I didn't. I did those steps and sort of erased as I went, and I should have done them to the identity too. Can I do that now? I'll keep
the identity matrix like I'm supposed to, and I'll do the same operations on it and see what I end up with. So I'm starting with the identity, which I'll write in lightly. Okay, what did I do? I subtracted the first row from the second and from the third — I'll do that to the identity: subtract row one from row two and row three. Good. Then, do you remember, I multiplied row two by minus one; let me just do that. Then what did I do? I subtracted two of row two away from row one. I'd better do that: subtract two of this away from this — that gives a minus one, two of these away leaves a plus two, and a zero. I believe that's E. The way to check is to multiply that E by this A, just to see: did I do it right? So I believe E is

E = [ −1 2 0 ; 1 −1 0 ; −1 0 1 ]

That's my E, that's my A, and that's R: E A = R. All right — the reason I wanted this blasted E was so that I could figure out the left null space; not only its dimension, which I actually know. What is the dimension of the left null space here? Here's my matrix: what's the rank? Two. And the dimension of the left null space is supposed to be m minus r: three minus two. I believe the left null space is one-dimensional: there is one combination of those three rows that produces the zero row, so a basis for the left null space has only got one vector in it. And what is that vector? It's here, in the last row of E. But I could have seen it earlier: what combination of those rows gives the zero row? Minus one of that plus one of that. So, a basis for the left null space of this matrix — I'm looking for combinations of rows that give the zero row. (For the null space, I was looking at combinations of columns to get the zero column.) Now I'm looking at combinations of these three rows to get the zero row, and of course there is my zero row, and here is my vector that produced it: minus one of the first row and one of the third row.
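Those numbers can be checked directly — a NumPy sketch: E records the row operations, E A equals R, and the last row of E (the combination −1·row one + 1·row three) is a basis for the left null space:

```python
import numpy as np

A = np.array([[1, 2, 3, 1],
              [1, 1, 2, 1],
              [1, 2, 3, 1]])
E = np.array([[-1,  2, 0],
              [ 1, -1, 0],
              [-1,  0, 1]])
R = np.array([[1, 0, 1, 1],
              [0, 1, 1, 0],
              [0, 0, 0, 0]])

print(np.array_equal(E @ A, R))  # True: E records the elimination steps

y = E[2]                         # last row of E, the row of zeros in R
print(y @ A)                     # [0 0 0 0] -> y^T A = 0, y is in the left null space
print(A.T @ y)                   # same statement written as A^T y = 0
```

With rank 2 and m = 3, that single vector is the whole basis: dim N(A^T) = m − r = 1.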
Obvious. Okay, so in that example — and actually in all examples — we have seen how to produce a basis for the left null space. I won't ask you for that all the time, because it didn't come out immediately from R; we had to keep track of E for the left null space. But at least it didn't require us to transpose the matrix and start all over again.

Okay, those are the four subspaces. Can I review them? The row space and the null space are in R^n, and their dimensions add to n. The column space and the left null space are in R^m, and their dimensions add to m.

Okay, so let me close these last minutes by pushing you a little bit more, toward a new type of vector space. All our vector spaces — all the ones we took seriously — have been subspaces of some real n-dimensional space. Now I'm going to write down another vector space, a new vector space: say, all three by three matrices. My matrices are the vectors. Is that all right? I'm just naming them — you can put quotes around "vectors". Every three by three matrix is one of my vectors. Now how am I entitled to call those things vectors? They look very much like matrices, but they're vectors in my vector space because they obey the rules. All I'm supposed to be able to do with vectors is add them — I can add matrices — and multiply them by scalar numbers, like seven — well, I can multiply a matrix by seven — and take combinations: three of one matrix minus five of another matrix. There's a zero matrix, the matrix with all zeros in it; if I add it to another matrix, nothing changes. If I multiply a matrix by one, nothing changes. All the good stuff — all those eight rules for a vector space that we never wrote down — all easily satisfied. Now of course you can say: you can multiply those matrices. I don't care! For the moment, I'm only thinking of these matrices as forming a vector space, so I'm only doing A plus B and c times A. I'm not interested
in A B for now; the fact that I can multiply is not relevant to a vector space.

Okay, so I have three by three matrices — and how about subspaces? Tell me a subspace of this matrix space. Let me call this matrix space M; that's my space of all 3 by 3 matrices. Tell me a subspace of it. What about the upper triangular matrices? So, subspaces of M: all upper triangular matrices; and another subspace, all symmetric matrices. The intersection of two subspaces is supposed to be a subspace — we gave a little effort to the proof of that fact. If I look at the matrices that are in this subspace — they're symmetric — and also in this one — they're upper triangular — what do they look like? Well, if they're symmetric and they have zeros below the diagonal, they'd better have zeros above the diagonal too. So the intersection is the diagonal matrices. That's another subspace, smaller than those. How can I use the word smaller? Well, I'm now entitled to. One way to say it is: these are contained in those. But more precisely, I could give the dimensions of these spaces. So we can compute — let's do it next time — the dimension of the subspace of upper triangular 3 by 3 matrices, the dimension of the symmetric 3 by 3 matrices, the dimension of the diagonal 3 by 3 matrices.

Well, to produce a dimension, that means I'm supposed to produce a basis, and then I just count how many I needed in the basis. Let me give you the answer for this one. What's the dimension of this subspace — let me call it D, all diagonal matrices? The dimension of this subspace is — as I write, you're working it out — three. Because here's a matrix in it, a diagonal matrix; here's another one; here's another one — let me put a 7 there; that was not a very great choice, but it's three diagonal matrices. And I believe that they're a basis: I believe those three matrices are independent, and I
believe that any diagonal matrix is a combination of those three, so they span the subspace of diagonal matrices. Do you see that idea? It's like stretching the idea from R^n to the space of 3 by 3 matrices. We can still add, we can still multiply by numbers, and we just ignore the fact that we can multiply two matrices together. Okay, thank you — that's lecture 10.
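The lecture leaves the upper-triangular and symmetric counts for next time; as a numerical sketch working slightly ahead (the standard answers are 6, 6, and 3), each 3 by 3 matrix can be flattened into a vector with 9 components, so a subspace's dimension is the rank of the stack of its flattened spanning matrices:

```python
import numpy as np

def dim_of_span(mats):
    # View each 3x3 matrix as a vector in a 9-dimensional space;
    # the dimension of their span is the rank of the stacked flattenings.
    return np.linalg.matrix_rank(np.array([m.flatten() for m in mats]))

def unit(i, j):
    # Matrix with a single 1 in position (i, j).
    m = np.zeros((3, 3))
    m[i, j] = 1.0
    return m

upper = [unit(i, j) for i in range(3) for j in range(3) if i <= j]
symm  = [unit(i, j) + unit(j, i) for i in range(3) for j in range(3) if i <= j]
diag  = [unit(i, i) for i in range(3)]

print(dim_of_span(upper))  # 6
print(dim_of_span(symm))   # 6
print(dim_of_span(diag))   # 3
```

Note 6 + 6 − 3 = 9: the dimensions of the two subspaces, minus the dimension of their intersection (the diagonal matrices), give the dimension of the whole space M.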
Lec 10 | 18.06 Linear Algebra, Spring 2005