141 - Linear Combination: Examples


    Summary

    In this insightful lecture, Dr. Aliza Malek from the Technion continues her explanation of linear combinations in vector spaces. She begins by reiterating the definition of linear combinations, exemplified through solving exercises to clarify the concept. A key point is that in a vector space like \(R^3\), every vector is a linear combination of a given set of vectors, a notion explored using geometric representations and algebraic processes. The lecture delves into different scenarios involving vector spaces such as 2x2 matrices and polynomial functions, illustrating how linear combinations work in each context, and concludes with the topics of unique and infinitely many solutions for these combinations.

      Highlights

      • Dr. Malek simplifies linear combinations in \(R^3\), showing that each vector is expressible using basic operations. 🎓
      • The uniqueness of solutions in certain algebraic problems is demonstrated through straightforward examples. 🔍
      • Geometric illustration helps in grasping how vectors add up spatially. 🗺️
      • The lecture expands the concept to 2x2 matrices, demonstrating that some matrices cannot be written as linear combinations of a given set. This opens up a discussion about subspaces. 🏢
      • In polynomial terms, the lecture shows how guessing solutions can sometimes be effective, simplifying seemingly complex problems. 🤔
      • Shows how infinite solutions exist when solving for linear combinations, indicating endless possibilities within constraints. 🔄

      Key Takeaways

      • Understanding linear combinations in vector spaces is crucial for solving algebraic problems. 🧠
      • Every vector in \(R^3\) can be expressed as a linear combination of a given set of vectors, highlighting the power of vector spaces. ✨
      • The process involves simple algebraic equations, proving that even complex systems can boil down to basic arithmetic operations. ➗
      • Illustrating vector solutions geometrically helps visualize abstract mathematical concepts. 📐
      • Linear combinations extend beyond vectors to matrices and polynomials, showcasing the versatility of the concept. 🔄
      • Not every matrix can be a linear combination of others if they're not in the same vector subspace, which is key in understanding constraints in algebraic systems. 🚫

      Overview

      Dr. Aliza Malek delves into the intricacies of linear combinations within vector spaces, going beyond basic definitions to practical applications. By focusing on examples involving \(R^3\) vectors, she elucidates the idea of constructing any vector from a set of given vectors through linear combinations. This concept is vital in understanding vector spaces' flexibility and functionality.

        The lecture covers the handling of complex vector space problems, such as those involving 2x2 matrices and polynomials. Dr. Malek emphasizes the uniqueness of solution sets, where, under specific conditions, a single, distinct solution exists. This part of the lecture uses both algebraic techniques and geometric visualization to clarify the abstract nature of the topic.

          Furthermore, Dr. Malek tackles the situation of infinite solutions, particularly in polynomial systems, highlighting the sometimes surprising simplicity behind solving these equations. Her intuitive approach combined with methodical problem-solving strategies helps demystify the process, preparing students for more advanced concepts in linear algebra.

            Chapters

            • 00:00 - 00:16: Chapter 4 - Vector Spaces In Chapter 4 - Vector Spaces, Dr. Aliza Malek continues the discussion on linear combinations in the context of vector spaces. She begins with a brief recap of what a linear combination is and proceeds to solve several exercises to refine the understanding of this concept. The chapter emphasizes that given a vector space V over a field F and a set S consisting of k vectors in V, a vector w is a linear combination of the vectors in S if it can be expressed using those vectors.

            141 - Linear Combination: Examples (Transcription)

            • 00:00 - 00:30 Chapter 4 - Vector Spaces Examples Of Linear Combination Linear Algebra Course Dr. Aliza Malek Hello, thank you for coming back. Last lesson we learned what a linear combination is. In this lesson we will solve a few more exercises to refine the definition. So let's begin. I remind you what a linear combination is, I have V, a vector space over the field F, I have S a set of k vectors in V, vector w is a linear combination of the vectors in S, if w can be expressed using the vectors in S.
            • 00:30 - 01:00 What does it mean to express using the vectors in S? It means, with the operations we know how to do: addition and multiplication by a scalar only. That is, w is equal to alpha 1 v1, plus alpha 2 v2, plus alpha k vk, where alpha 1 to alpha k are scalars in the field F. So here is a first example: We have V being R3, a vector space over R, and we define S to be the three vectors that are written here in this set:
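The definition just stated can be sketched in a few lines of code. A minimal Python illustration (the helper name `linear_combination` is my own, not from the lecture):

```python
def linear_combination(scalars, vectors):
    """Compute alpha_1*v_1 + ... + alpha_k*v_k for vectors given as tuples in R^n."""
    n = len(vectors[0])
    return tuple(sum(a * v[i] for a, v in zip(scalars, vectors)) for i in range(n))

# Example: w = 2*(1, 0, 0) + 3*(0, 1, 0) = (2, 3, 0)
print(linear_combination([2, 3], [(1, 0, 0), (0, 1, 0)]))  # (2, 3, 0)
```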
            • 01:00 - 01:30 1, 1, 0, 1, 0, 0, and 0, 0, 1. And we ask, is 1, 2, 3 a linear combination? In this case, I am told that yes. Show that... Show that, means, we have already checked. Now show that we are right. Here, it is a linear combination of the terms in S. In how many ways can it be written? We saw this, remember? Sometimes there is a single solution, sometimes an infinity, and sometimes none at all. So let's see what we get here.
            • 01:30 - 02:00 We are looking for scalars, so that 1, 2, 3, will be equal to alpha times the first, plus beta times the second, plus gamma times the third. When we have only a few scalars, we call them alpha, beta, gamma, and not alpha 1, alpha 2, alpha 3. It is very easy to find the answer here; you don't need a system of equations, you see it right away. Alpha appears only in this element: alpha, plus zero beta, plus zero gamma. Therefore, alpha must be equal to 2. Same thing with gamma.
            • 02:00 - 02:30 Gamma only appears here, we don't have beta, and we don't have alpha, so gamma must be equal to 3. From all this it follows that alpha plus beta is equal to 1; when alpha is 2, beta must be -1. If we write the equations, we get the solution immediately; we don't need matrices, and we don't need to compute ranks. Also note that this is a single solution, the only solution that exists. Alpha is directly equal to 2, gamma is directly equal to 3,
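The solution found here can be checked numerically; a quick Python sanity check (my own, not part of the lecture):

```python
# Check: (1, 2, 3) = 2*(1, 1, 0) + (-1)*(1, 0, 0) + 3*(0, 0, 1)
alpha, beta, gamma = 2, -1, 3
v1, v2, v3 = (1, 1, 0), (1, 0, 0), (0, 0, 1)
w = tuple(alpha * a + beta * b + gamma * c for a, b, c in zip(v1, v2, v3))
print(w)  # (1, 2, 3)
```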
            • 02:30 - 03:00 therefore it follows that beta must be -1, and there is no other choice. So this time we got that there is a solution, and it is unique. Now we ask ourselves whether there exists any vector in V, I remind you, V is R3, which is not a linear combination of the terms in S. Will they all be linear combinations, or is there some vector that isn't? Let's check what we need to do. We actually want a general vector in R3, of the form x, y, z,
            • 03:00 - 03:30 will be a linear combination of the vectors in S. That is, it will hold that x, y, z, will be equal to some scalar multiplied by 1, 1, 0, plus a scalar times 1, 0, 0, plus a scalar times 0, 0, 1. According to the calculations we did earlier, it is very easy to guess the answer this time as well. y plus 0, plus 0, and therefore, the coefficient of 1, 1, 0, must be y.
            • 03:30 - 04:00 The same way, z must be over here: Because the only way to get z comes from the vector 0, 0, 1. Because here I don't have an element, and here I don't have an element, except 0. And so on... Again, we get equations which are very easy to solve, and here, no matter which x, y, z, you give me, I can always find scalars. I will choose y as the coefficient of 1, 1, 0, x minus y as the coefficient of 1, 0, 0, and z as the coefficient of 0, 0, 1.
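The general recipe above, coefficient y for 1, 1, 0, x minus y for 1, 0, 0, and z for 0, 0, 1, can be verified for any sample vector; a short Python sketch (the sample values are arbitrary, my own choice):

```python
def coefficients(x, y, z):
    # From the lecture: y for (1,1,0), x - y for (1,0,0), z for (0,0,1)
    return y, x - y, z

x, y, z = 5, -2, 7  # an arbitrary sample vector
a, b, c = coefficients(x, y, z)
w = tuple(a * u + b * v + c * t for u, v, t in zip((1, 1, 0), (1, 0, 0), (0, 0, 1)))
print(w)  # (5, -2, 7)
```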
            • 04:00 - 04:30 And therefore, every vector in R3 is a linear combination of the vectors in S. Note that this solution is unique. This will be the only solution that can be obtained when we write the linear combination. Let's take a look at the example we just saw, the algebraic view says, 1, 2, 3, remember what we got, equal to 2 times 1, 1, 0,
            • 04:30 - 05:00 minus 1 times 1, 0, 0, and 3 times 0, 0, 1. This was the algebraic calculation we did. Now let's take a geometric look. R3 is in fact the space, remember? We know how to draw vectors in a space, so we can illustrate the idea. Let's take a look... The vector 1, 2, 3 is represented here in black color, here:
            • 05:00 - 05:30 I want to write it using three vectors. 1, 1, 0 is some vector on the purple line, 1, 0, 0 is some vector on the red line, and 0, 0, 1 is a vector on the blue line. The vector 1, 1, 0 is approximately here; if I take it two times, I get here. If I take 1, 0, 0 times -1, here,
            • 05:30 - 06:00 and I take 0, 0, 1 three times, which is here, I add everything together, like adding vectors in space, and we will get exactly the black vector. In fact, you can see here that these are three vectors that are not on the same plane. You see the lines, they are not on the same plane. And those who have already studied this, and are familiar with it, know that three vectors that are not on the same plane give every vector in the space.
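The geometric claim, that three vectors not lying on one plane give all of space, is equivalent to the 3x3 determinant of the vectors being nonzero; this standard criterion is not stated explicitly in the lecture. A Python check:

```python
def det3(u, v, w):
    """Determinant of the 3x3 matrix with rows u, v, w (cofactor expansion)."""
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0]))

# Nonzero determinant => the three vectors are not coplanar and span R^3
print(det3((1, 1, 0), (1, 0, 0), (0, 0, 1)))  # -1
```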
            • 06:00 - 06:30 Here is another example, this time I will no longer be able to illustrate it geometrically. I take V to be 2 by 2 matrices, and S to be the three matrices that are here in this set. I want to know if there exists a matrix A in V, some kind of a real 2 by 2 matrix, where A is not a linear combination of the vectors in S. And again, if we look a little deeper, What will we see? Remember that in the previous example we had symmetric matrices?
            • 06:30 - 07:00 2 by 2 symmetric matrices were a vector subspace, closed under addition and multiplication by a scalar. Does this happen here as well? We can see this time that all the matrices in S are upper triangular. Here I have 0 below the diagonal, 0 below the diagonal, 0 below the diagonal. The upper triangular matrices are also a vector subspace. Therefore, they are also closed under addition and multiplication by a scalar, that is, under linear combinations.
            • 07:00 - 07:30 So for any matrix we choose that is not upper triangular, we will immediately see that we cannot get it as a linear combination of the matrices in S. For example, take A to be the lower triangular matrix 0, 0, 1, 0. It doesn't have to be lower triangular, but a lower triangular matrix will definitely work, and it will not be a linear combination of matrices that are upper triangular. In other words, this means that if I wanted to solve the system
            • 07:30 - 08:00 0, 0, 1, 0 equals alpha times 1, 2, 0, 3, plus beta times 0, 2, 0, 0, plus gamma times 0, 0, 0, 3, I will get that there is no solution. Indeed, look at the element below the diagonal: here it says 1, and here I will get 0 plus 0, plus 0. Therefore there is no solution, and this matrix is not a linear combination of the matrices in S.
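The obstruction described here, the entry below the diagonal, can be made concrete in code; a small Python sketch (the variable names are my own):

```python
# The three matrices of S, all upper triangular, and the lower triangular target
A1, A2, A3 = [[1, 2], [0, 3]], [[0, 2], [0, 0]], [[0, 0], [0, 3]]
target = [[0, 0], [1, 0]]

def combine(a, b, c):
    return [[a * A1[i][j] + b * A2[i][j] + c * A3[i][j] for j in range(2)]
            for i in range(2)]

# The entry below the diagonal is 0 + 0 + 0 for every choice of scalars,
# so it can never equal target[1][0] == 1: no solution exists.
for a, b, c in [(1, 1, 1), (5, -2, 7), (0, 0, 0)]:
    assert combine(a, b, c)[1][0] == 0
print("no choice of scalars reaches the target")
```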
            • 08:00 - 08:30 I am being asked here whether every upper triangular matrix will be a linear combination of these matrices. An interesting question, let's check. What are all the matrices that can be written as a linear combination of elements of S? That's what we want to know. So we will see that indeed each matrix A that is upper triangular, 2 by 2, will be a linear combination.
            • 08:30 - 09:00 It is clear to us that if it is not upper triangular, it is not a linear combination. But if it is upper triangular, is it surely a linear combination? That's what we are wondering about. What should be done? We should take a general upper triangular matrix, x, y, 0, z, and attempt to write it as a linear combination of the matrices in S. That is, to solve the system: x, y, 0, z equals alpha times 1, 2, 0, 3, plus beta times 0, 2, 0, 0, plus gamma times 0, 0, 0, 3.
            • 09:00 - 09:30 Again, the equations come out simple, not too complicated, and to solve the system means writing alpha, beta, gamma, which are our unknowns, using x, y, and z. Why? Because then, for every upper triangular x, y, z matrix that you give me, I will be able to substitute the expressions for alpha, beta, gamma here, and get the linear combination. This is what 'solving the system' means.
            • 09:30 - 10:00 It is very easy to solve and check that we indeed get that alpha is equal to x, beta is equal to half y minus x, and gamma is equal to one-third z minus x. Solve it by yourself, and see that this is the answer that comes out. Then we get A as a unique linear combination of the terms in S, because there is a unique solution here. In other words, any upper triangular matrix we take will be a linear combination of the matrices in the set S.
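These formulas, alpha equal to x, beta equal to half y minus x, gamma equal to one-third z minus x, can be checked for a sample upper triangular matrix; a Python verification using exact fractions (the sample values are my own):

```python
from fractions import Fraction as F

A1, A2, A3 = [[1, 2], [0, 3]], [[0, 2], [0, 0]], [[0, 0], [0, 3]]

def combine(a, b, c):
    return [[a * A1[i][j] + b * A2[i][j] + c * A3[i][j] for j in range(2)]
            for i in range(2)]

x, y, z = F(4), F(10), F(9)                   # a sample matrix [[x, y], [0, z]]
alpha, beta, gamma = x, y / 2 - x, z / 3 - x  # the formulas from the lecture
assert combine(alpha, beta, gamma) == [[x, y], [0, z]]
print(alpha, beta, gamma)  # 4 1 -1
```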
            • 10:00 - 10:30 Is the representation always unique? In this case yes, because that is what we got. No matter which x, y, z matrix we take, with a 0 in the bottom corner, we will get a unique solution for alpha, beta, gamma. But is it always like this? Of course not. It all depends on which vectors are in the set S. Let's see the following example: This time we will take V to be polynomials of degree at most 2,
            • 10:30 - 11:00 with real coefficients. Here is our S set: x plus 1, x squared plus 1, x squared, and x minus 1. Four polynomials in the space V. We want to know if it is possible to write any polynomial of the form ax plus b as a linear combination of the terms in S. That is, will every polynomial of degree at most 1 be a linear combination of the terms in S?
            • 11:00 - 11:30 That is the question, in other words. Let's check. So the answer is yes, here it is: I am guessing the solution for you. Is guessing what we usually do? No, we have already learned to solve equations: just plug in the coefficients, solve the equations, and get there. I'm just showing you a slightly different direction here. Sometimes, we can just guess. So look, I will take the polynomial ax plus b, and since the degree is at most 1, I will use just x plus 1 and x minus 1.
            • 11:30 - 12:00 I need neither x squared nor x squared plus 1, and here is what I will get: a plus b divided by 2, times x plus 1, plus a minus b divided by 2, times x minus 1. That's it, I managed to write this down. But that's not the only way to do it. Because don't forget that I have another two polynomials here, which I can also use. And here, I can write x plus 1,
            • 12:00 - 12:30 first and foremost, using the formula written for me over here, look, where ax plus b, in the polynomial x plus 1, a is 1, and b is 1. So what do you get here? 1 plus 1 divided by 2, is 1, meaning one time x plus 1, plus 1 minus 1 divided by 2, is 0, 0 times x minus 1, and we have already seen that x squared plus 1, and x squared, do not participate. That's why I take the 0 coefficients there as well. So how did I write x plus 1?
            • 12:30 - 13:00 One time x plus 1, and all the others I take 0 times. Do I have to write it like this? Not at all. We can write it a little differently, here: This time I don't want to use x plus 1 at all. I took it 0 times. I will take two times x squared plus 1, I'm going to take minus two times x squared, and one time x minus 1. I wrote it a different way and got the same thing.
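Both representations of x plus 1 can be checked by comparing coefficient lists; a Python sketch, encoding each polynomial as [constant, x, x squared] (my own encoding, not from the lecture):

```python
# S as coefficient lists [constant, x, x^2]: x+1, x^2+1, x^2, x-1
p1, p2, p3, p4 = [1, 1, 0], [1, 0, 1], [0, 0, 1], [-1, 1, 0]

def combine(a, b, c, d):
    return [a * p1[i] + b * p2[i] + c * p3[i] + d * p4[i] for i in range(3)]

assert combine(1, 0, 0, 0) == [1, 1, 0]   # 1*(x+1), all others taken 0 times
assert combine(0, 2, -2, 1) == [1, 1, 0]  # 2*(x^2+1) - 2*x^2 + 1*(x-1)
print("two different combinations give x + 1")
```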
            • 13:00 - 13:30 And you know, if I have more than one solution, I will automatically have infinite solutions. Let's check that we really get infinite solutions like we saw here. So we are basically looking for scalars, so that x plus 1 is equal to alpha times x plus 1, plus beta x squared plus 1, plus gamma times x squared, plus delta times x minus 1. That is, if we compare coefficients, we will get,
            • 13:30 - 14:00 that the coefficient of x squared here, is 0. What was the coefficient of x squared on the right side? Beta plus gamma. Therefore beta plus gamma is 0. Now, we will compare the coefficient of x. Here it is equal to 1, here. What is the coefficient of x on the right side? I have alpha from here, minus delta from there. Plus delta, alpha plus delta equals 1. And what is the free coefficient?
            • 14:00 - 14:30 I have 1 from here, and I have alpha plus beta, minus delta, on the right side. Look at what we got. We got, one, two, three equations. And how many unknowns? Four, alpha, beta, gamma, delta. Four unknowns, three equations, and what do we always know in theory? That we will have either a contradiction, or an infinity of solutions.
            • 14:30 - 15:00 Now, between us, quietly, we have already seen that there is a solution, because we have already managed to write x plus 1 as a linear combination. So we know there won't be a contradiction; therefore, we automatically get infinitely many solutions. But if we were really trying to solve this, here is the coefficient matrix of the system. We row reduce it, and we get that r(A) equals r(A|b) equals 3, which is smaller than n, which equals 4. And so we really got infinitely many solutions. That is, all these impressions that we formed before,
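Since the rank is 3 and there are 4 unknowns, one unknown stays free and the general solution can be parametrized. The parametrization below uses a free parameter t; the formulas are my own derivation from the three equations of the system, consistent with the lecture's conclusion:

```python
from fractions import Fraction as F

# S as coefficient lists [constant, x, x^2]: x+1, x^2+1, x^2, x-1
p1, p2, p3, p4 = [1, 1, 0], [1, 0, 1], [0, 0, 1], [-1, 1, 0]

def combine(a, b, c, d):
    return [a * p1[i] + b * p2[i] + c * p3[i] + d * p4[i] for i in range(3)]

# Solving beta + gamma = 0, alpha + delta = 1, alpha + beta - delta = 1
# with gamma = t free gives: alpha = 1 + t/2, beta = -t, gamma = t, delta = -t/2
for t in [F(0), F(1), F(-3), F(7, 2)]:
    assert combine(1 + t / 2, -t, t, -t / 2) == [1, 1, 0]
print("one solution for every value of t: infinitely many solutions")
```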
            • 15:00 - 15:30 without really solving in a general way with parameters and coefficients alpha, beta, gamma, and delta, all those impressions we had there, and all the results we got there, are also correct in general, as we would find if we solved the system in a general way. That is it: there is no contradiction, there are infinitely many solutions, and we are done. In the next lesson we will learn a new concept that describes the collection of all linear combinations that can be obtained from the vectors in a given set S.
            • 15:30 - 16:00 Thank you very much.