Eigen Theory from Scratch
In machine learning and data science, we often use eigen theory. It is widely used in the data dimensionality reduction technique PCA, and even Google’s search algorithm PageRank is based on this concept. Once you read this article, you will understand what exactly eigen theory is and how it can be used to solve problems.
In linear algebra, we can represent an object as a combination of multiple vectors. When we apply any kind of transformation to the object, we can observe its impact on all of those vectors.
When a transformation is applied, some vectors end up staying on their original line through the origin: they are only scaled, not knocked off their span. These vectors are called eigenvectors. The factor by which an eigenvector is scaled by the transformation is called its eigenvalue.
Now let us understand this with the example of the square shown below (figure 1). Although the square is made up of millions of vectors, for ease of understanding we consider only 3 vectors: r, s, and v, lying on plane1, plane3, and plane2 respectively.
[Figure 1: the square before the transformation]

[Transformation matrix A: columns (1, 0) and (0, 2)]

We apply a transformation, and let A be our transformation matrix.
Let’s have a very quick overview of what exactly a transformation matrix is.
When vectors are represented by a matrix, the matrix must be read column-wise to identify the vectors: each column is one vector.
Now, let’s see what exactly our transformation matrix A does to vector s. The x-coordinate of vector s is 1 and the y-coordinate is also 1. (Make sure you are reading matrix A column-wise!)
When transformation matrix A is applied, the x-coordinate of s moves 1 unit in the x-direction and 0 units in the y-direction, whereas the y-coordinate of s moves 0 units in the x-direction and 2 units in the y-direction.
[Transformation of s]

So we can see that after the transformation, the coordinates of s are (1, 2).
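Written out, this is just the ordinary matrix–vector product (a small worked check, using nothing beyond the description above):

$$A\,s = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1\cdot 1 + 0\cdot 1 \\ 0\cdot 1 + 2\cdot 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$$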
We apply the same transformation to all the vectors r, s, and v. Once the transformation is applied, the new coordinates are (0, 2), (1, 2), and (1, 0) respectively.
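To make this concrete, here is a minimal numpy sketch of the same transformation; the starting coordinates r = (0, 1), s = (1, 1), v = (1, 0) are read off the figure, and A is the matrix described above:

import numpy as np

A = np.array([[1, 0],
              [0, 2]])  # transformation matrix A (read column-wise: columns (1,0) and (0,2))
r = np.array([0, 1])
s = np.array([1, 1])
v = np.array([1, 0])

for name, vec in (("r", r), ("s", s), ("v", v)):
    print(name, "->", A @ vec)

# r -> [0 2]  same line, stretched by a factor of 2
# s -> [1 2]  knocked onto a different line
# v -> [1 0]  unchanged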
[Figure 2: the square after the transformation]

As you can see in figure 2, vectors r and v lie on the same planes as before (the planes are displayed with dotted green lines; that is, plane1 and plane2), whereas vector s does not: plane3 is the original plane of vector s, but the transformation has moved s off it.
Please note that when I say on the same plane, it means that those vectors are linearly dependent. Vectors are linearly independent if and only if the only c1 and c2 satisfying equation 0 below are both zero; otherwise they are linearly dependent.
For example, we saw that vector s is not on the same plane after the transformation. Initially the coordinates of s were (1, 1); afterwards they became (1, 2). Let’s call the new vector s’. Look at the formula below.
Equation 0: c1·a + c2·b = 0 (for two vectors a and b)

Therefore, we can conclude that vectors r and v are our eigenvectors, as they ended up staying on their original planes (check figure 2). The length of vector v is unchanged, so the eigenvalue for v is 1, while the length of vector r has doubled after the transformation, so the eigenvalue of r is 2.
Now, we will calculate all the possible eigenvectors and eigenvalues for the same example.
[Transformation matrix A]

Our transformation matrix is A. Let x be our set of eigenvectors and let λ represent the eigenvalue(s).
The relation between these 3 terms can be expressed with this equation (equation 1).
Equation 1: A·x = λ·x, i.e. (A − λI)·x = 0

Here there are 2 possibilities: either x = 0, or A − λI sends a nonzero x to zero. x is the set of eigenvectors we are supposed to calculate, so it cannot be zero. Hence A − λI must be singular, and we will consider det(A − λI) = 0.
Equation 2: det(A − λI) = 0

To calculate the eigenvectors and eigenvalues, follow the steps below:
- With the help of equation 2, calculate all the possible values of λ.
- Substitute each value of λ back into equation 1 and calculate x for that value of λ.
So now we get two eigenvalues, 1 and 2 (the eigenvalue step is worked out just below). For both cases, we will calculate x from equation 1.
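Since the eigenvalue calculation image is not reproduced here, this is the characteristic-equation step worked out for our A:

$$\det(A - \lambda I) = \begin{vmatrix} 1-\lambda & 0 \\ 0 & 2-\lambda \end{vmatrix} = (1-\lambda)(2-\lambda) = 0 \;\Rightarrow\; \lambda_1 = 1,\ \lambda_2 = 2$$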
[Calculations for eigenvectors (Calculation 1.2)]

For λ2 = 2, we calculated that (−x2, 0) = 0. This means that x2 must be zero; however, we do not care what y2 is; y2 can be anything (to verify, replace y2 with any random value and you get the same answer, 0, every time!). Hence (0, t) is the set of eigenvectors for eigenvalue 2, where t can be any number. Now check the vector r (figure 2) from the previous example: for r, λ = 2 and the eigenvector is (0, 2), which certainly verifies the calculation above.
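Spelled out, the λ2 = 2 case from Calculation 1.2 is:

$$(A - 2I)\,x = \begin{pmatrix} -1 & 0 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} x_2 \\ y_2 \end{pmatrix} = \begin{pmatrix} -x_2 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \;\Rightarrow\; x_2 = 0,\ y_2 \text{ free}$$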
The same explanation can be made for λ = 1, where the eigenvectors come out as (t, 0), and can be verified by vector v from the previous example (figure 2).
Exercise:
[Transformation matrix T]

Calculate the eigenvectors and eigenvalues for the shear operation whose transformation matrix is T.
Answer:
The set of eigenvectors is (t, 0), for λ = 1; there is only one eigenvalue and one set of eigenvectors. The answer (t, 0) says that when transformation matrix T is applied, any vector on the x-axis is an eigenvector with eigenvalue 1. In other words, whenever we apply transformation matrix T, vectors on the x-axis end up staying in the same position.
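A quick numerical check in numpy: the matrix image from the exercise is not reproduced here, so this sketch assumes the standard horizontal shear T = (1, 1; 0, 1), which is consistent with the stated answer:

import numpy as np

# assumed shear matrix (hypothetical: the exercise's image is not shown here)
T = np.array([[1.0, 1.0],
              [0.0, 1.0]])

v = np.array([3.0, 0.0])  # any vector of the form (t, 0) on the x-axis
s = np.array([1.0, 1.0])  # a vector off the x-axis

print(T @ v)  # [3. 0.] -> unchanged: an eigenvector with eigenvalue 1
print(T @ s)  # [2. 1.] -> sheared onto a different line: not an eigenvector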
Let’s see the graphical representation:
[Figure 3: before the shear] [Figure 4: after the shear]

As you can see in figure 4, after the transformation is applied, vectors r and s have changed their respective planes: vector r moved from plane 2 to plane 3 and vector s moved from plane 3 to plane 4. In terms of linear algebra, r and r’, and s and s’, are linearly independent. However, vector v is on the same plane (do check figure 4), so it is our eigenvector, with an eigenvalue of 1. Do check out this animation.
Real-World Example (Or Fantasy-World Example!?!)
Let’s assume that it is 2050 and, as per Elon Musk’s Mars plan, there are 1 million people on Mars, where the birth and death rates are 15% and 12% respectively and 1% of the people move back to Earth each year (maybe they don’t like Mars!). Also, there are 9 billion people on Earth, where the birth and death rates are 12% and 10% respectively and 5% of the people move to Mars every year. Now my question is: what will the populations of Earth and Mars be in 2100?
Firstly, we need to put the data into a proper format, so we calculate the transformation matrix as shown below:
[Forming the transformation matrix (Calculation 2.1)]

Here dE/dt means the change in Earth’s population per year and dM/dt means the change in Mars’s population per year.
Transformation matrix T represents the vectors E (Earth’s) and M (Mars’s). (Read column-wise!)
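Since the Calculation 2.1 image is not reproduced here, this is a sketch of how the entries of T follow from the stated rates (E and M are the populations in a given year):

$$E_{\text{next}} = (1 + 0.12 - 0.10 - 0.05)\,E + 0.01\,M = 0.97\,E + 0.01\,M$$
$$M_{\text{next}} = 0.05\,E + (1 + 0.15 - 0.12 - 0.01)\,M = 0.05\,E + 1.02\,M$$
$$\Rightarrow\; T = \begin{pmatrix} 0.97 & 0.01 \\ 0.05 & 1.02 \end{pmatrix}$$

This is the same matrix that appears in the Python section at the end of the article.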
Today (I mean in 2050!), the populations of Earth and Mars are 9 billion and 1 million respectively, so the vector P is (9B, 1M). To calculate the population in 2100 we need to compute T^50 × P.
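As a quick sketch, the brute-force way to do this in numpy (before bringing in eigen theory) looks like this; the printed numbers are approximate:

import numpy as np

T = np.array([[0.97, 0.01],
              [0.05, 1.02]])
P = np.array([9000.0, 1.0])  # populations in millions: 9 billion and 1 million

# apply 50 yearly transitions at once: T^50 x P
print(np.linalg.matrix_power(T, 50) @ P)
# roughly [5.8e+03  2.6e+04] in millions; close to the ~5.6B and ~26.4B quoted
# below, with the small gap coming from rounding in the hand calculation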
[Final population (Calculation 2.2)]

So in 2100 the populations of Earth and Mars would be about 5.6 billion and 26.4 billion respectively. Luckily, in this problem we dealt with just a 2×2 matrix; in the real world, however, the matrix can be so huge that this method would let our systems run out of space and time.
Now just imagine: what if we had a diagonal matrix? Then finding the nth power of the matrix would be easy, right?
[The nth power of a diagonal matrix: diag(a, b)^n = diag(a^n, b^n)]

Here eigen theory comes into the picture. To find the nth power of matrix T we will use eigenvectors and eigenvalues. This method is called diagonalization of the matrix. The steps are mentioned below (the combined formula follows the list):
- Change the basis of matrix T and let the eigenvectors be the new basis vectors or, in other words, move from the standard coordinate system (1, 0), (0, 1) to (x1, y1), (x2, y2). Call this matrix D.
- Let C be a diagonal matrix, which is nothing but the matrix of eigenvalues.
- Calculate the nth power of matrix C (in our case n is 50).
- Change the basis again and bring the vectors back to the standard coordinate system (or whatever coordinate system was used before).
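Putting the steps together (the standard diagonalization identity, in the D and C notation above):

$$T = D\,C\,D^{-1} \;\Rightarrow\; T^{n} = D\,C^{n}\,D^{-1}, \qquad C^{n} = \begin{pmatrix} \lambda_1^{n} & 0 \\ 0 & \lambda_2^{n} \end{pmatrix}$$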
 
We have already prepared our transformation matrix T. Now we will calculate its eigenvalues and eigenvectors using the same method mentioned above. (The eigenvector calculation is not repeated here, but you can calculate it directly or review Calculation 1.2 again.) The calculations for the matrix diagonalization are shown below:
[Matrix diagonalization (Calculation 2.3)]

Using this diagonalization, we have drastically reduced the number of computations. Now just multiply T^50 by the vector P (which holds the populations); a numpy sketch of the same computation is below, and the results follow it:
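This is a minimal sketch of the diagonalization route in numpy, reusing the D and C names from the steps above:

import numpy as np
import numpy.linalg as la

T = np.array([[0.97, 0.01],
              [0.05, 1.02]])
P = np.array([9000.0, 1.0])      # populations in millions

eigenValues, eigenVectors = la.eig(T)
D = eigenVectors                 # change-of-basis matrix: columns are eigenvectors
C_50 = np.diag(eigenValues**50)  # 50th power of the diagonal eigenvalue matrix
T_50 = D @ C_50 @ la.inv(D)      # T^50 = D C^50 D^-1

print(T_50 @ P)                  # matches the brute-force result up to rounding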
[The populations of Earth and Mars (Calculation 2.4)]

That is it! Now we know what the populations of Earth and Mars would be in 2100. The numbers are just fascinating, right? Also, there might be some minor differences between the answers from the two methods due to rounding, so just ignore that.
Also, look at figure 5 to see how the populations expand every year. (The x-axis is Earth’s population and the y-axis is Mars’s population.)
[Figure 5: the populations year by year]

How to calculate in Python?
In Python, using the numpy library, we can calculate eigenvalues and eigenvectors:
import numpy.linalg as la  # numpy's linear algebra library
import numpy as np

# defining the transformation matrix
T = np.array([[0.97, 0.01],
              [0.05, 1.02]])

# calculating eigenvalues and eigenvectors of matrix T
eigenValues, eigenVectors = la.eig(T)
print(eigenValues, eigenVectors)
# the eigenvalues come out at roughly 1.03 and 0.96
In case of any doubts or queries, feel free to contact me at 7mayurpshah@gmail.com.
Peace!
Translated from: https://medium.com/swlh/eigen-theory-from-the-scratch-a73e0b5a25da