Content On This Page

- Introduction to Matrices
- Types of Matrices
- Equality of Matrices
- Algebra of Matrices (Addition, Subtraction, Scalar Multiplication, Multiplication)
Matrices: Introduction, Types, and Basic Operations
Introduction to Matrices
Organizing Data in Rows and Columns
In various fields, including mathematics, science, engineering, and data analysis, we often encounter situations where data or mathematical quantities are naturally arranged in a structured, rectangular format. Think about tables of data: the marks obtained by several students in different subjects, the prices of various products in different stores, or the coefficients of variables in a system of linear equations. These can all be represented neatly in rows and columns.
In mathematics, such a rectangular arrangement of numbers or functions is formally called a matrix. Matrices provide a compact and organized way to represent and work with related information.
Definition of a Matrix
A matrix is defined as a rectangular array or arrangement of numbers, symbols, or expressions, arranged in rows and columns. These individual numbers or expressions within the matrix are referred to as the elements or entries of the matrix.
Matrices are typically denoted by capital letters of the English alphabet (e.g., $A, B, C, X, Y$). The elements of a matrix are enclosed within square brackets $[ ]$ or sometimes large parentheses $( )$.
Example:
$A = \begin{bmatrix} 2 & 3 & 7 \\ 1 & -1 & 5 \end{bmatrix} $
In this matrix $A$, the numbers $2, 3, 7, 1, -1,$ and $5$ are the elements. The horizontal lines of elements are called rows, and the vertical lines of elements are called columns.
Matrix $A$ has two rows: the first row is $(2, 3, 7)$ and the second row is $(1, -1, 5)$.
Matrix $A$ has three columns: the first column is $\begin{bmatrix} 2 \\ 1 \end{bmatrix}$, the second column is $\begin{bmatrix} 3 \\ -1 \end{bmatrix}$, and the third column is $\begin{bmatrix} 7 \\ 5 \end{bmatrix}$.
The elements of a matrix can be real numbers, complex numbers, functions, or even other matrices in more advanced contexts.
Order of a Matrix
The order or dimensions of a matrix specifies its size by stating the number of rows and the number of columns it contains. If a matrix has $m$ rows and $n$ columns, its order is denoted as $m \times n$ (read as "m by n"). When writing the order, the number of rows is always mentioned first, followed by the number of columns.
The total number of elements in an $m \times n$ matrix is the product $m \times n$.
For the matrix $A = \begin{bmatrix} 2 & 3 & 7 \\ 1 & -1 & 5 \end{bmatrix}$ discussed above, there are 2 rows and 3 columns. Therefore, the order of matrix $A$ is $2 \times 3$. The total number of elements is $2 \times 3 = 6$.
Example:
$B = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{bmatrix} $
Matrix $B$ has 3 rows and 2 columns. Its order is $3 \times 2$. The total number of elements is $3 \times 2 = 6$. Notice that a $2 \times 3$ matrix and a $3 \times 2$ matrix have the same number of elements but are different in shape and order.
Example:
$C = \begin{bmatrix} 5 & -1 \end{bmatrix} $
Matrix $C$ has 1 row and 2 columns. Its order is $1 \times 2$. This is also called a row matrix.
Example:
$D = \begin{bmatrix} 4 \\ 0 \\ 1 \end{bmatrix} $
Matrix $D$ has 3 rows and 1 column. Its order is $3 \times 1$. This is also called a column matrix.
Example:
$E = \begin{bmatrix} 5 & 2 \\ 1 & 9 \end{bmatrix} $
Matrix $E$ has 2 rows and 2 columns. Its order is $2 \times 2$. When the number of rows equals the number of columns ($m=n$), the matrix is called a square matrix.
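The orders discussed above can be checked with a short Python sketch using plain nested lists (one inner list per row); the helper name `order` is our own, not a standard function:

```python
def order(M):
    """Return the order (rows, columns) of a matrix given as a list of rows."""
    return (len(M), len(M[0]))

A = [[2, 3, 7],
     [1, -1, 5]]              # the 2x3 matrix A from above
B = [[1, 2], [3, 4], [5, 6]]  # the 3x2 matrix B

print(order(A))  # (2, 3)
print(order(B))  # (3, 2)
# Both contain 2*3 = 6 elements, but their shapes (orders) differ.
```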
Notation for Elements of a Matrix
To refer to a specific element within a matrix, we use subscript notation that indicates the element's position by its row number and column number. If $A$ is a matrix, the element located in the $i$-th row and the $j$-th column is denoted by $a_{ij}$ or $A_{i,j}$ or $A_{ij}$. The first subscript $i$ always represents the row number, and the second subscript $j$ always represents the column number.
For the matrix $A = \begin{bmatrix} 2 & 3 & 7 \\ 1 & -1 & 5 \end{bmatrix}$ (order $2 \times 3$):
- The element in the 1st row and 1st column is $a_{11} = 2$.
- The element in the 1st row and 2nd column is $a_{12} = 3$.
- The element in the 1st row and 3rd column is $a_{13} = 7$.
- The element in the 2nd row and 1st column is $a_{21} = 1$.
- The element in the 2nd row and 2nd column is $a_{22} = -1$.
- The element in the 2nd row and 3rd column is $a_{23} = 5$.
In general, an $m \times n$ matrix $A$ can be concisely represented by its generic element $[a_{ij}]_{m \times n}$, where $i$ ranges from 1 to $m$ and $j$ ranges from 1 to $n$.
$A = \begin{bmatrix} a_{11} & a_{12} & \ldots & a_{1j} & \ldots & a_{1n} \\ a_{21} & a_{22} & \ldots & a_{2j} & \ldots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots & \ddots & \vdots \\ a_{i1} & a_{i2} & \ldots & a_{ij} & \ldots & a_{in} \\ \vdots & \vdots & \ddots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \ldots & a_{mj} & \ldots & a_{mn} \end{bmatrix}_{m \times n} $
The element $a_{ij}$ is located at the intersection of the $i$-th row and the $j$-th column.
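In code, the same lookup works with one caveat: Python lists are 0-based, while the mathematical convention above is 1-based. A small (home-made) helper bridges the two:

```python
A = [[2, 3, 7],
     [1, -1, 5]]

def elem(M, i, j):
    """Return a_ij using the 1-based row/column convention of the text."""
    return M[i - 1][j - 1]  # shift by one: Python indexing starts at 0

print(elem(A, 1, 1))  # a_11 = 2
print(elem(A, 2, 2))  # a_22 = -1
print(elem(A, 2, 3))  # a_23 = 5
```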
Context and Uses of Matrices
Matrices are fundamental mathematical objects that originated from the study of systems of linear equations and linear transformations. They have evolved into an essential tool with widespread applications in numerous fields:
- Solving Systems of Linear Equations: A system of linear equations can be represented compactly using matrices. Matrix methods (like Gaussian elimination, Cramer's rule, or matrix inversion) provide systematic ways to solve such systems.
- Linear Transformations: Matrices are used to represent linear transformations, which are functions that map vectors from one space to another in a linear way. In geometry, this includes transformations like rotations, reflections, and scaling; translations, which are affine rather than linear, can also be handled with matrices via homogeneous coordinates.
- Computer Graphics: Matrices are extensively used in computer graphics to perform transformations on objects (like moving, rotating, and scaling), project 3D objects onto 2D screens, and handle camera views.
- Physics and Engineering: Matrices appear in quantum mechanics (representing operators), electrical engineering (circuit analysis), mechanical engineering (stress analysis), optics, and many other areas.
- Economics: Matrices are used in input-output models (Leontief models), game theory, and linear programming to model economic systems and optimize resource allocation.
- Statistics and Data Science: Matrices are used to organize datasets, perform statistical calculations (like covariance matrices), and are fundamental to algorithms in machine learning (e.g., regression, principal component analysis).
- Computer Science: Matrices are used to represent graphs (adjacency matrices), implement algorithms, and in areas like cryptography and error correction codes.
In essence, matrices provide a powerful and efficient framework for representing, manipulating, and solving problems involving linear relationships and structured data across diverse disciplines.
Types of Matrices
Matrices are classified into various types based on their structure, the values of their elements, and specific properties they satisfy. Understanding these different types is important because each type has particular characteristics and is used in specific mathematical contexts and applications.
Classification Based on Dimensions
Matrices are primarily categorized based on the number of rows and columns they have, which defines their order ($m \times n$).
1. Column Matrix
A matrix is called a column matrix if it has only one column, regardless of the number of rows.
The order of a column matrix is always of the form $m \times 1$, where $m$ is the number of rows.
Examples:
$$ \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}_{3 \times 1} \quad \begin{bmatrix} 5 \end{bmatrix}_{1 \times 1} \quad \begin{bmatrix} a \\ b \end{bmatrix}_{2 \times 1} $$

A column matrix is sometimes referred to as a column vector.
2. Row Matrix
A matrix is called a row matrix if it has only one row, regardless of the number of columns.
The order of a row matrix is always of the form $1 \times n$, where $n$ is the number of columns.
Examples:
$$ \begin{bmatrix} 1 & 2 & 3 \end{bmatrix}_{1 \times 3} \quad \begin{bmatrix} 5 \end{bmatrix}_{1 \times 1} \quad \begin{bmatrix} x & y \end{bmatrix}_{1 \times 2} $$

A row matrix is sometimes referred to as a row vector.
3. Square Matrix
A matrix is called a square matrix if the number of rows is equal to the number of columns.
If a square matrix has $n$ rows and $n$ columns, its order is $n \times n$, and it is often simply referred to as a "square matrix of order $n$".
Examples:
$$ \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}_{2 \times 2} \quad \begin{bmatrix} 5 \end{bmatrix}_{1 \times 1} \quad \begin{bmatrix} 1 & 0 & -1 \\ 2 & 3 & 4 \\ 0 & 1 & 5 \end{bmatrix}_{3 \times 3} $$

In a square matrix $A = [a_{ij}]_{n \times n}$, the elements $a_{ij}$ where the row index $i$ is equal to the column index $j$ (i.e., $a_{11}, a_{22}, a_{33}, \ldots, a_{nn}$) are called the diagonal elements. These elements lie on the main diagonal (or principal diagonal) of the matrix.
Classification Based on Elements/Properties
These classifications often apply specifically to square matrices or are based on the values that the elements take.
4. Diagonal Matrix
A diagonal matrix is a square matrix in which all the non-diagonal elements are zero. That is, a square matrix $A = [a_{ij}]$ is a diagonal matrix if $a_{ij} = 0$ for all $i \neq j$.
Examples:
$$ \begin{bmatrix} 5 & 0 \\ 0 & 2 \end{bmatrix}_{2 \times 2} \quad \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & -3 \end{bmatrix}_{3 \times 3} \quad \begin{bmatrix} 7 \end{bmatrix}_{1 \times 1} $$

Note that the diagonal elements ($a_{ii}$) in a diagonal matrix can be any value, including zero. It is the off-diagonal elements that must be zero.
5. Scalar Matrix
A scalar matrix is a special type of diagonal matrix where all the diagonal elements are equal to the same constant value (a scalar). That is, a diagonal matrix $A = [a_{ij}]$ is a scalar matrix if $a_{ij} = 0$ for $i \neq j$, and $a_{ii} = k$ for some constant scalar $k$ for all $i$.
Examples:
$$ \begin{bmatrix} 5 & 0 \\ 0 & 5 \end{bmatrix}_{2 \times 2} \quad \begin{bmatrix} -3 & 0 & 0 \\ 0 & -3 & 0 \\ 0 & 0 & -3 \end{bmatrix}_{3 \times 3} \quad \begin{bmatrix} 7 \end{bmatrix}_{1 \times 1} $$

A scalar matrix is essentially a scalar multiple of an identity matrix: $A = kI_n$, where $k$ is the common diagonal element and $I_n$ is the identity matrix of the same order $n$.
6. Identity Matrix (or Unit Matrix)
An identity matrix is a special type of scalar matrix where all the diagonal elements are equal to 1. It is a square matrix with ones on the main diagonal and zeros everywhere else. It is usually denoted by $I_n$, where $n$ is the order of the square matrix.
That is, $A = [a_{ij}]_{n \times n}$ is an identity matrix if $a_{ij} = 1$ when $i=j$ and $a_{ij} = 0$ when $i \neq j$.
Examples:
$$ I_1 = \begin{bmatrix} 1 \end{bmatrix}_{1 \times 1} \quad I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}_{2 \times 2} \quad I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}_{3 \times 3} $$

The identity matrix plays a crucial role in matrix algebra, similar to the number 1 in arithmetic. When an identity matrix (of a compatible order) is multiplied by another matrix $A$, the matrix $A$ remains unchanged: $A I = I A = A$.
7. Zero Matrix (or Null Matrix)
A zero matrix (or null matrix) is a matrix in which every single element is zero. It can be of any order ($m \times n$). It is usually denoted by $O$ or $0_{m \times n}$.
Examples:
$$ \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}_{2 \times 2} \quad \begin{bmatrix} 0 & 0 & 0 \end{bmatrix}_{1 \times 3} \quad \begin{bmatrix} 0 \\ 0 \end{bmatrix}_{2 \times 1} \quad \begin{bmatrix} 0 \end{bmatrix}_{1 \times 1} $$

The zero matrix plays a role similar to the number 0 in arithmetic. Adding a zero matrix (of compatible order) to a matrix $A$ results in $A$: $A + O = A$. Multiplying any matrix by a zero matrix (of compatible orders) results in a zero matrix: $A \times O = O$ and $O \times A = O$.
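The identity and zero matrix behaviors described above can be verified with a small pure-Python sketch (the `mat_add` and `mat_mul` helpers are our own minimal implementations):

```python
def mat_add(A, B):
    """Element-wise sum of two matrices of the same order."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_mul(A, B):
    """Row-by-column product; columns of A must equal rows of B."""
    n = len(B)
    return [[sum(A[i][j] * B[j][k] for j in range(n))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[2, 3], [1, 5]]
I = [[1, 0], [0, 1]]   # identity matrix I_2
O = [[0, 0], [0, 0]]   # zero matrix of order 2x2

print(mat_mul(A, I) == A)  # True: A I = A
print(mat_mul(I, A) == A)  # True: I A = A
print(mat_add(A, O) == A)  # True: A + O = A
```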
8. Upper Triangular Matrix (for square matrices)
An upper triangular matrix is a square matrix where all the elements located below the main diagonal are zero. That is, a square matrix $A = [a_{ij}]_{n \times n}$ is upper triangular if $a_{ij} = 0$ for all $i > j$ (when the row index is greater than the column index).
Example:
$$ \begin{bmatrix} 1 & 2 & 3 \\ 0 & 4 & 5 \\ 0 & 0 & 6 \end{bmatrix}_{3 \times 3} \quad \begin{bmatrix} a & b \\ 0 & c \end{bmatrix}_{2 \times 2} $$

The elements on or above the main diagonal can be any value (including zero).
9. Lower Triangular Matrix (for square matrices)
A lower triangular matrix is a square matrix where all the elements located above the main diagonal are zero. That is, a square matrix $A = [a_{ij}]_{n \times n}$ is lower triangular if $a_{ij} = 0$ for all $i < j$ (when the row index is less than the column index).
Example:
$$ \begin{bmatrix} 1 & 0 & 0 \\ 2 & 3 & 0 \\ 4 & 5 & 6 \end{bmatrix}_{3 \times 3} \quad \begin{bmatrix} a & 0 \\ b & c \end{bmatrix}_{2 \times 2} $$

The elements on or below the main diagonal can be any value (including zero).
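The two triangularity conditions ($a_{ij} = 0$ for $i > j$, respectively $i < j$) translate directly into code; these predicate names are our own:

```python
def is_upper_triangular(M):
    """True if every element below the main diagonal is zero (a_ij = 0 for i > j)."""
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i > j)

def is_lower_triangular(M):
    """True if every element above the main diagonal is zero (a_ij = 0 for i < j)."""
    n = len(M)
    return all(M[i][j] == 0 for i in range(n) for j in range(n) if i < j)

U = [[1, 2, 3], [0, 4, 5], [0, 0, 6]]
L = [[1, 0, 0], [2, 3, 0], [4, 5, 6]]

print(is_upper_triangular(U), is_lower_triangular(U))  # True False
print(is_lower_triangular(L))                          # True
# A diagonal matrix satisfies both predicates at once.
```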
10. Symmetric Matrix (for square matrices)
A symmetric matrix is a square matrix that is equal to its transpose. The transpose of a matrix $A$, denoted by $A^T$, is obtained by interchanging its rows and columns (i.e., the element $a_{ij}$ in $A$ becomes the element $a_{ji}$ in $A^T$). A square matrix $A$ is symmetric if $A^T = A$. This condition implies that $a_{ij} = a_{ji}$ for all possible values of $i$ and $j$.
Example:
$$ A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \end{bmatrix}_{3 \times 3} \quad A^T = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \end{bmatrix}_{3 \times 3} $$

Here, $a_{12}=2$ and $a_{21}=2$, $a_{13}=3$ and $a_{31}=3$, $a_{23}=5$ and $a_{32}=5$. Since $A^T = A$, the matrix is symmetric.
11. Skew-Symmetric Matrix (for square matrices)
A skew-symmetric matrix is a square matrix that is equal to the negative of its transpose. A square matrix $A$ is skew-symmetric if $A^T = -A$. This condition implies that $a_{ij} = -a_{ji}$ for all possible values of $i$ and $j$. If $i=j$ (for the diagonal elements), then $a_{ii} = -a_{ii}$, which means $2a_{ii} = 0$, and therefore $a_{ii} = 0$. Thus, the diagonal elements of a skew-symmetric matrix must all be zero.
Example:
$$ A = \begin{bmatrix} 0 & 2 & 3 \\ -2 & 0 & 5 \\ -3 & -5 & 0 \end{bmatrix}_{3 \times 3} \quad A^T = \begin{bmatrix} 0 & -2 & -3 \\ 2 & 0 & -5 \\ 3 & 5 & 0 \end{bmatrix}_{3 \times 3} \quad -A = \begin{bmatrix} 0 & -2 & -3 \\ 2 & 0 & -5 \\ 3 & 5 & 0 \end{bmatrix}_{3 \times 3} $$

Here, $a_{12}=2$ and $a_{21}=-2$, so $a_{12} = -a_{21}$. $a_{13}=3$ and $a_{31}=-3$, so $a_{13} = -a_{31}$. $a_{23}=5$ and $a_{32}=-5$, so $a_{23} = -a_{32}$. The diagonal elements $a_{11}, a_{22}, a_{33}$ are all 0. Since $A^T = -A$, the matrix is skew-symmetric.
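Both definitions, $A^T = A$ and $A^T = -A$, can be checked mechanically once we have a transpose; the helper names here are ours:

```python
def transpose(M):
    """Interchange rows and columns: element a_ij becomes a_ji."""
    return [list(row) for row in zip(*M)]

def is_symmetric(M):
    return transpose(M) == M   # A^T = A

def is_skew_symmetric(M):
    neg = [[-x for x in row] for row in M]
    return transpose(M) == neg  # A^T = -A

S = [[1, 2, 3], [2, 4, 5], [3, 5, 6]]     # the symmetric example above
K = [[0, 2, 3], [-2, 0, 5], [-3, -5, 0]]  # the skew-symmetric example above

print(is_symmetric(S), is_skew_symmetric(S))  # True False
print(is_skew_symmetric(K))                   # True
```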
These different types of matrices provide a framework for categorizing matrices based on their structural and elemental properties, which is essential for understanding their behavior under matrix operations and their applications in various mathematical and scientific domains.
Equality of Matrices
When are Two Matrices Considered Equal?
In arithmetic, two numbers are equal if they represent the same value (e.g., $5 = 5$, $2+3 = 5$). For matrices, the concept of equality is more structured because matrices are arrays of numbers with specific dimensions and element positions. For two matrices to be considered equal, they must match perfectly in terms of their shape and the values of their entries.
Definition of Equality of Matrices
Two matrices, say matrix $A$ and matrix $B$, are defined as being equal if and only if the following two conditions are simultaneously met:
- Same Order (Dimensions): They must have the exact same number of rows and the exact same number of columns. If matrix $A$ is of order $m \times n$, then matrix $B$ must also be of order $m \times n$. If their orders are different, they cannot be equal, even if they contain the same numbers.
- Equal Corresponding Elements: Every element in matrix $A$ must be equal to the corresponding element in matrix $B$ located at the same row and column position. That is, if $A = [a_{ij}]$ and $B = [b_{ij}]$ are two matrices of the same order $m \times n$, then $A = B$ if and only if $a_{ij} = b_{ij}$ for all possible values of $i$ (from 1 to $m$) and $j$ (from 1 to $n$).
The notation $A = B$ means that matrix $A$ is equal to matrix $B$. If $A$ and $B$ are not equal, we write $A \neq B$.
The first condition (same order) is a prerequisite for the second condition (equal corresponding elements) to even be checked.
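The two-step definition, order first, then corresponding elements, can be sketched directly in Python (the function name `matrices_equal` is our own):

```python
def matrices_equal(A, B):
    """Matrix equality: same order first, then element-by-element comparison."""
    if len(A) != len(B) or len(A[0]) != len(B[0]):
        return False  # different orders: cannot be equal
    return all(a == b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

print(matrices_equal([[2, 3], [4, 5]], [[2, 3], [4, 5]]))  # True
print(matrices_equal([[1, 2], [3, 4]], [[1, 3], [2, 4]]))  # False: elements differ
print(matrices_equal([[1, 2, 3]], [[1], [2], [3]]))        # False: 1x3 vs 3x1
```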
Examples of Matrix Equality
Let's look at some examples to illustrate the concept of matrix equality.
Example 1. Are the matrices $A = \begin{bmatrix} 2 & 3 \\ 4 & 5 \end{bmatrix}$ and $B = \begin{bmatrix} 2 & 3 \\ 4 & 5 \end{bmatrix}$ equal?
Answer:
Let's check the two conditions for equality:
1. Order of A is $2 \times 2$. Order of B is $2 \times 2$. The orders are the same.
2. Compare corresponding elements:
- Element in 1st row, 1st column: $a_{11} = 2$ and $b_{11} = 2$. $a_{11} = b_{11}$.
- Element in 1st row, 2nd column: $a_{12} = 3$ and $b_{12} = 3$. $a_{12} = b_{12}$.
- Element in 2nd row, 1st column: $a_{21} = 4$ and $b_{21} = 4$. $a_{21} = b_{21}$.
- Element in 2nd row, 2nd column: $a_{22} = 5$ and $b_{22} = 5$. $a_{22} = b_{22}$.
Since both conditions are met (same order and all corresponding elements are equal), the matrices $A$ and $B$ are equal.
Answer: Yes, the matrices A and B are equal, $A = B$.
Example 2. Are the matrices $C = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ and $D = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix}$ equal?
Answer:
Let's check the two conditions for equality:
1. Order of C is $2 \times 2$. Order of D is $2 \times 2$. The orders are the same.
2. Compare corresponding elements:
- Element in 1st row, 1st column: $c_{11} = 1$ and $d_{11} = 1$. $c_{11} = d_{11}$.
- Element in 1st row, 2nd column: $c_{12} = 2$ and $d_{12} = 3$. $c_{12} \neq d_{12}$.
Since at least one pair of corresponding elements is not equal ($c_{12} \neq d_{12}$), the matrices $C$ and $D$ are not equal, even though their orders are the same and some other elements are equal.
Answer: No, the matrices C and D are not equal, $C \neq D$.
Example 3. Find the values of $x, y,$ and $z$ if the matrix $\begin{bmatrix} x+1 & 2 \\ 3 & y-2 \end{bmatrix}$ is equal to the matrix $\begin{bmatrix} 5 & 2 \\ 3 & 4 \end{bmatrix}$.
Answer:
We are given that the two matrices are equal:
$$ \begin{bmatrix} x+1 & 2 \\ 3 & y-2 \end{bmatrix} = \begin{bmatrix} 5 & 2 \\ 3 & 4 \end{bmatrix} $$

Both matrices are of order $2 \times 2$. Since they are stated to be equal, their corresponding elements must be equal according to the definition of matrix equality.
Equating the corresponding elements, position by position:
- Equating elements in the 1st row, 1st column: $x+1 = 5$.
- Equating elements in the 1st row, 2nd column: $2 = 2$. (This equation is consistent and gives no information about variables).
- Equating elements in the 2nd row, 1st column: $3 = 3$. (This equation is also consistent).
- Equating elements in the 2nd row, 2nd column: $y-2 = 4$.
Now, solve the equations involving the variables:
From the first equation: $x+1 = 5$. Subtract 1 from both sides: $x = 5 - 1 \implies x = 4$.
From the fourth equation: $y-2 = 4$. Add 2 to both sides: $y = 4 + 2 \implies y = 6$.
The variable $z$ is not present in either of the given matrices, so the equality of these two matrices does not provide any information about the value of $z$. Its value is not determined by this matrix equation.
Answer: The values are $x = 4$ and $y = 6$. The value of $z$ is not determined by the given equality.
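As a sanity check, substituting the solved values back into the left-hand matrix should reproduce the right-hand matrix:

```python
# Substitute the solved values x = 4, y = 6 into the left-hand matrix
# and confirm it equals the right-hand matrix element by element.
x, y = 4, 6
lhs = [[x + 1, 2], [3, y - 2]]
rhs = [[5, 2], [3, 4]]
print(lhs == rhs)  # True: the solution satisfies the matrix equation
```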
The strict definition of matrix equality is fundamental for understanding and performing operations such as matrix addition, subtraction, and for solving matrix equations.
Algebra of Matrices (Addition, Subtraction, Scalar Multiplication, Multiplication)
Just as we can perform arithmetic operations like addition, subtraction, and multiplication on numbers, we can define analogous operations for matrices. However, the rules and conditions for performing these operations on matrices are specific and are primarily determined by the dimensions (order) of the matrices involved. These operations form the basic algebra of matrices.
1. Addition of Matrices
Matrix addition is a straightforward operation, but it is strictly defined only for matrices that have the exact same order. To add two matrices of the same order, we simply add their corresponding elements.
Let $A = [a_{ij}]_{m \times n}$ and $B = [b_{ij}]_{m \times n}$ be two matrices of the same order, $m \times n$. Their sum, denoted by $A + B$, is a new matrix $C = [c_{ij}]_{m \times n}$ of the same order. Each element $c_{ij}$ in the sum matrix $C$ is obtained by adding the element $a_{ij}$ from matrix $A$ to the corresponding element $b_{ij}$ from matrix $B$.
If $A = [a_{ij}]_{m \times n}$ and $B = [b_{ij}]_{m \times n}$, then $A + B = [a_{ij} + b_{ij}]_{m \times n} $
(Matrix Addition Rule)
If the orders of two matrices are different, their addition is not defined.
Example
Example 1. Add the matrices $A = \begin{bmatrix} 2 & 3 \\ 1 & 5 \end{bmatrix}$ and $B = \begin{bmatrix} -1 & 4 \\ 0 & 2 \end{bmatrix}$.
Answer:
Matrix $A$ is of order $2 \times 2$. Matrix $B$ is of order $2 \times 2$. Since they have the same order, their addition is possible.
Add the corresponding elements:
$$ A + B = \begin{bmatrix} a_{11}+b_{11} & a_{12}+b_{12} \\ a_{21}+b_{21} & a_{22}+b_{22} \end{bmatrix} = \begin{bmatrix} 2 + (-1) & 3 + 4 \\ 1 + 0 & 5 + 2 \end{bmatrix} $$

Perform the additions:

$$ A + B = \begin{bmatrix} 1 & 7 \\ 1 & 7 \end{bmatrix} $$

The resulting matrix has the same order, $2 \times 2$.
Answer: $A + B = \begin{bmatrix} 1 & 7 \\ 1 & 7 \end{bmatrix}$.
Properties of Matrix Addition:
Assuming $A, B,$ and $C$ are matrices of the same order $m \times n$, and $O$ is the zero matrix of the same order:
- Commutativity: Matrix addition is commutative, meaning the order of addition does not matter: $A + B = B + A$.
- Associativity: Matrix addition is associative, meaning the grouping of matrices in a sum does not affect the result: $(A + B) + C = A + (B + C)$.
- Existence of Additive Identity: There exists a zero matrix $O$ (of the same order) such that when added to any matrix $A$, the result is $A$: $A + O = A$.
- Existence of Additive Inverse: For every matrix $A$, there exists a matrix $-A$ (called the additive inverse of $A$) such that $A + (-A) = O$. The matrix $-A$ is obtained by negating each element of $A$, i.e., if $A = [a_{ij}]$, then $-A = [-a_{ij}]$.
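Example 1 and the commutativity/associativity properties can be confirmed with a short pure-Python sketch (the `mat_add` helper is our own minimal implementation):

```python
def mat_add(A, B):
    """Element-wise sum; defined only when A and B have the same order."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[2, 3], [1, 5]]
B = [[-1, 4], [0, 2]]
C = [[1, 1], [1, 1]]

print(mat_add(A, B))                  # [[1, 7], [1, 7]] — the result of Example 1
print(mat_add(A, B) == mat_add(B, A))  # True: A + B = B + A (commutativity)
print(mat_add(mat_add(A, B), C) == mat_add(A, mat_add(B, C)))  # True: associativity
```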
2. Subtraction of Matrices
Similar to addition, matrix subtraction is defined only for matrices of the same order. To subtract matrix $B$ from matrix $A$ (both of the same order), we subtract the corresponding elements of $B$ from the corresponding elements of $A$.
Let $A = [a_{ij}]_{m \times n}$ and $B = [b_{ij}]_{m \times n}$ be two matrices of the same order $m \times n$. Their difference, denoted by $A - B$, is a new matrix $D = [d_{ij}]_{m \times n}$ of the same order, where each element $d_{ij}$ is the difference between the corresponding elements $a_{ij}$ and $b_{ij}$.
If $A = [a_{ij}]_{m \times n}$ and $B = [b_{ij}]_{m \times n}$, then $A - B = [a_{ij} - b_{ij}]_{m \times n} $
(Matrix Subtraction Rule)
Matrix subtraction can also be viewed as adding the additive inverse of $B$ to $A$: $A - B = A + (-B)$.
Example
Example 2. Subtract matrix $B$ from matrix $A$, where $A = \begin{bmatrix} 5 & 2 \\ 6 & 1 \end{bmatrix}$ and $B = \begin{bmatrix} 3 & -1 \\ 0 & 4 \end{bmatrix}$.
Answer:
Matrix $A$ is of order $2 \times 2$. Matrix $B$ is of order $2 \times 2$. Since they have the same order, their subtraction is possible.
Subtract the corresponding elements of $B$ from $A$:
$$ A - B = \begin{bmatrix} a_{11}-b_{11} & a_{12}-b_{12} \\ a_{21}-b_{21} & a_{22}-b_{22} \end{bmatrix} = \begin{bmatrix} 5 - 3 & 2 - (-1) \\ 6 - 0 & 1 - 4 \end{bmatrix} $$

Perform the subtractions:

$$ A - B = \begin{bmatrix} 2 & 2 + 1 \\ 6 & -3 \end{bmatrix} = \begin{bmatrix} 2 & 3 \\ 6 & -3 \end{bmatrix} $$

The resulting matrix has the same order, $2 \times 2$.
Answer: $A - B = \begin{bmatrix} 2 & 3 \\ 6 & -3 \end{bmatrix}$.
Matrix subtraction is generally not commutative ($A-B \neq B-A$) or associative ($(A-B)-C \neq A-(B-C)$).
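Example 2, the identity $A - B = A + (-B)$, and the non-commutativity of subtraction can all be verified in a few lines (the `mat_sub` helper is our own):

```python
def mat_sub(A, B):
    """Element-wise difference; defined only for matrices of the same order."""
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[5, 2], [6, 1]]
B = [[3, -1], [0, 4]]

print(mat_sub(A, B))  # [[2, 3], [6, -3]] — the result of Example 2

# A - B = A + (-B): subtracting B is the same as adding its additive inverse
negB = [[-x for x in row] for row in B]
added = [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, negB)]
print(mat_sub(A, B) == added)          # True
print(mat_sub(A, B) == mat_sub(B, A))  # False: subtraction is not commutative
```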
3. Scalar Multiplication of a Matrix
Scalar multiplication involves multiplying a matrix by a single number, which is called a scalar in the context of matrix algebra. Unlike matrix addition and subtraction, scalar multiplication is defined for any matrix of any order.
Let $A = [a_{ij}]_{m \times n}$ be a matrix and $k$ be any scalar (a real number). The product of the scalar $k$ and the matrix $A$, denoted by $kA$, is a new matrix $B = [b_{ij}]_{m \times n}$ of the same order as $A$. Each element $b_{ij}$ in the resulting matrix is obtained by multiplying the corresponding element $a_{ij}$ of matrix $A$ by the scalar $k$.
If $A = [a_{ij}]_{m \times n}$ and $k$ is a scalar, then $kA = [k \cdot a_{ij}]_{m \times n} $
(Scalar Multiplication Rule)
Example
Example 3. If $A = \begin{bmatrix} 1 & -2 \\ 3 & 0 \end{bmatrix}$, find the matrices $3A$ and $-A$.
Answer:
The matrix $A$ is of order $2 \times 2$.
To find $3A$, we multiply each element of matrix $A$ by the scalar $k=3$:
$$ 3A = 3 \begin{bmatrix} 1 & -2 \\ 3 & 0 \end{bmatrix} = \begin{bmatrix} 3 \times 1 & 3 \times (-2) \\ 3 \times 3 & 3 \times 0 \end{bmatrix} = \begin{bmatrix} 3 & -6 \\ 9 & 0 \end{bmatrix} $$

Finding $-A$ is equivalent to multiplying $A$ by the scalar $k=-1$, so we multiply each element of matrix $A$ by $-1$:

$$ -A = -1 \times A = -1 \begin{bmatrix} 1 & -2 \\ 3 & 0 \end{bmatrix} = \begin{bmatrix} -1 \times 1 & -1 \times (-2) \\ -1 \times 3 & -1 \times 0 \end{bmatrix} = \begin{bmatrix} -1 & 2 \\ -3 & 0 \end{bmatrix} $$

Answer: $3A = \begin{bmatrix} 3 & -6 \\ 9 & 0 \end{bmatrix}$ and $-A = \begin{bmatrix} -1 & 2 \\ -3 & 0 \end{bmatrix}$.
Properties of Scalar Multiplication:
Assuming $A, B$ are matrices of the same order $m \times n$, and $k, l$ are scalars:
- $k(A + B) = kA + kB$ (Scalar multiplication is distributive over matrix addition).
- $(k + l)A = kA + lA$ (Scalar multiplication is distributive over the addition of scalars).
- $(kl)A = k(lA)$ (Scalar multiplication is associative).
- $1A = A$ (Multiplication by the scalar 1 leaves the matrix unchanged).
- $0A = O$ (Multiplication by the scalar 0 results in the zero matrix of the same order as $A$).
- $kO = O$ (Multiplication of a zero matrix by any scalar results in a zero matrix).
- $(-1)A = -A$ (Multiplication by -1 gives the additive inverse).
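Example 3 and several of these properties can be checked numerically with a small sketch (the helper names `scal` and `mat_add` are ours):

```python
def scal(k, A):
    """Multiply every element of A by the scalar k."""
    return [[k * x for x in row] for row in A]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, -2], [3, 0]]
B = [[2, 5], [-1, 4]]
k, l = 3, -2

print(scal(3, A))  # [[3, -6], [9, 0]] — the 3A of Example 3
print(scal(k, mat_add(A, B)) == mat_add(scal(k, A), scal(k, B)))  # True: k(A+B) = kA+kB
print(scal(k + l, A) == mat_add(scal(k, A), scal(l, A)))          # True: (k+l)A = kA+lA
print(scal(k * l, A) == scal(k, scal(l, A)))                      # True: (kl)A = k(lA)
```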
4. Multiplication of Matrices
Matrix multiplication is the most complex of the basic matrix operations and has specific conditions for when it is defined. Unlike scalar multiplication (number times matrix) or element-wise operations (addition/subtraction), matrix multiplication involves a combination of row-by-column operations.
For the product of two matrices $A$ and $B$, denoted as $AB$, to be defined, the number of columns in the first matrix ($A$) must be equal to the number of rows in the second matrix ($B$).
Let $A$ be a matrix of order $m \times n$ and $B$ be a matrix of order $n \times p$. The product matrix $C = AB$ will be of order $m \times p$. The number of columns of $A$ ($n$) must match the number of rows of $B$ ($n$). The resulting matrix $C$ has the number of rows of $A$ ($m$) and the number of columns of $B$ ($p$).
If $A = [a_{ij}]_{m \times n}$ and $B = [b_{jk}]_{n \times p}$, the element $c_{ik}$ located in the $i$-th row and $k$-th column of the product matrix $C = AB$ is calculated by taking the sum of the products of the corresponding elements of the $i$-th row of $A$ and the $k$-th column of $B$. This is often described as taking the dot product of the $i$-th row vector of $A$ and the $k$-th column vector of $B$.
$c_{ik} = a_{i1}b_{1k} + a_{i2}b_{2k} + \ldots + a_{in}b_{nk} = \sum_{j=1}^{n} a_{ij}b_{jk} $
(Matrix Multiplication Rule)
To calculate the element $c_{ik}$, you move across the $i$-th row of matrix $A$ and down the $k$-th column of matrix $B$, multiplying the first element of the row by the first element of the column, the second by the second, and so on, and then sum up all these $n$ products.
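The row-by-column rule $c_{ik} = \sum_{j=1}^{n} a_{ij}b_{jk}$ translates almost verbatim into code; here is a minimal pure-Python sketch (the function name `mat_mul` is our own):

```python
def mat_mul(A, B):
    """Product of an m x n matrix A and an n x p matrix B, giving an m x p matrix."""
    m, n, p = len(A), len(B), len(B[0])
    assert len(A[0]) == n, "columns of A must equal rows of B"
    # c_ik = sum over j of a_ij * b_jk  (i-th row of A 'dot' k-th column of B)
    return [[sum(A[i][j] * B[j][k] for j in range(n))
             for k in range(p)] for i in range(m)]

A = [[1, 2, 3], [4, 5, 6]]       # 2 x 3
B = [[7, 8], [9, 10], [11, 12]]  # 3 x 2
print(mat_mul(A, B))  # [[58, 64], [139, 154]] — a 2 x 2 result
```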
Examples
Example 4. If $A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$ and $B = \begin{bmatrix} 5 & 6 \\ 7 & 8 \end{bmatrix}$, find the product $AB$.
Answer:
Matrix $A$ is of order $2 \times 2$. Matrix $B$ is of order $2 \times 2$.
Check for compatibility: Number of columns in A is 2. Number of rows in B is 2. They match, so the product $AB$ is defined. The order of the resulting matrix $AB$ will be $2 \times 2$ (number of rows in A by number of columns in B).
Let $C = AB = [c_{ij}]_{2 \times 2}$. We need to calculate each element $c_{ij}$.
- $c_{11}$: Element in 1st row, 1st column of AB. Use 1st row of A and 1st column of B. $$ c_{11} = (1 \times 5) + (2 \times 7) = 5 + 14 = 19 $$
- $c_{12}$: Element in 1st row, 2nd column of AB. Use 1st row of A and 2nd column of B. $$ c_{12} = (1 \times 6) + (2 \times 8) = 6 + 16 = 22 $$
- $c_{21}$: Element in 2nd row, 1st column of AB. Use 2nd row of A and 1st column of B. $$ c_{21} = (3 \times 5) + (4 \times 7) = 15 + 28 = 43 $$
- $c_{22}$: Element in 2nd row, 2nd column of AB. Use 2nd row of A and 2nd column of B. $$ c_{22} = (3 \times 6) + (4 \times 8) = 18 + 32 = 50 $$
Assemble the elements into the product matrix $C=AB$:
$$ AB = \begin{bmatrix} 19 & 22 \\ 43 & 50 \end{bmatrix} $$

Answer: $AB = \begin{bmatrix} 19 & 22 \\ 43 & 50 \end{bmatrix}$.
Example 5. If $A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}$ and $B = \begin{bmatrix} 7 & 8 \\ 9 & 10 \\ 11 & 12 \end{bmatrix}$, find the product $AB$.
Answer:
Matrix $A$ is of order $2 \times 3$. Matrix $B$ is of order $3 \times 2$.
Check for compatibility: Number of columns in A is 3. Number of rows in B is 3. They match, so the product $AB$ is defined. The order of the resulting matrix $AB$ will be $2 \times 2$ (number of rows in A by number of columns in B).
Let $C = AB = [c_{ij}]_{2 \times 2}$. We need to calculate each element $c_{ij}$.
- $c_{11}$: 1st row of A, 1st column of B. $$ c_{11} = (1 \times 7) + (2 \times 9) + (3 \times 11) = 7 + 18 + 33 = 58 $$
- $c_{12}$: 1st row of A, 2nd column of B. $$ c_{12} = (1 \times 8) + (2 \times 10) + (3 \times 12) = 8 + 20 + 36 = 64 $$
- $c_{21}$: 2nd row of A, 1st column of B. $$ c_{21} = (4 \times 7) + (5 \times 9) + (6 \times 11) = 28 + 45 + 66 = 139 $$
- $c_{22}$: 2nd row of A, 2nd column of B. $$ c_{22} = (4 \times 8) + (5 \times 10) + (6 \times 12) = 32 + 50 + 72 = 154 $$
Assemble the elements into the product matrix $C=AB$:
$$ AB = \begin{bmatrix} 58 & 64 \\ 139 & 154 \end{bmatrix} $$

Answer: $AB = \begin{bmatrix} 58 & 64 \\ 139 & 154 \end{bmatrix}$.
Properties of Matrix Multiplication:
Assuming the matrix products and sums involved are defined (i.e., the matrices have compatible orders):
- Associativity: Matrix multiplication is associative: $(AB)C = A(BC)$.
- Distributivity: Matrix multiplication is distributive over matrix addition: $A(B + C) = AB + AC$ (left distributive law) and $(A + B)C = AC + BC$ (right distributive law).
- Existence of Multiplicative Identity: For an $m \times n$ matrix $A$, the identity matrix $I_m$ of order $m$ acts as a left identity ($I_m A = A$), and the identity matrix $I_n$ of order $n$ acts as a right identity ($A I_n = A$).
- Property of Zero Matrix: Multiplying by a zero matrix (of compatible order) results in a zero matrix: $A O = O$ and $O A = O$.
- Non-Commutativity: In general, matrix multiplication is not commutative: $AB \neq BA$. This is a crucial difference from multiplication of numbers. Even if both $AB$ and $BA$ are defined and are of the same order (which requires $A$ and $B$ to be square matrices of the same order), the resulting matrices $AB$ and $BA$ are usually not equal.
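Non-commutativity is easy to see concretely with the matrices of Example 4 (the `mat_mul` helper below is our own minimal implementation of the row-by-column rule):

```python
def mat_mul(A, B):
    n = len(B)
    return [[sum(A[i][j] * B[j][k] for j in range(n))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

AB = mat_mul(A, B)
BA = mat_mul(B, A)
print(AB)        # [[19, 22], [43, 50]] — as in Example 4
print(BA)        # [[23, 34], [31, 46]]
print(AB == BA)  # False: matrix multiplication is not commutative
```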
Understanding these operations and their specific rules is fundamental to working with matrices in various mathematical and applied contexts.