
Vector Norms for AI – Everything You Should Know About Norms

Norms are a very useful linear algebra concept for AI. Whether you work with ML, DL, or ANNs, you have to use norms for optimization and distance measuring (loss calculation). In this article, I will give you a solid understanding of norms and how to use them in AI, with practical examples.

What are Norms?

Norms are one of the important concepts in Linear Algebra that are used to measure the size or length of vectors and matrices. In other words, they provide a way to quantify the magnitude or distance of an object from a reference point or origin.

Imagine you have a vector and want to know how big it is. That’s where norms come in. A norm is a function that takes a vector as input and gives you a number as output. This number represents the length or size of the vector. Remember, size concerns the magnitude of a vector’s components, not its dimensionality (number of elements).

If you’re unfamiliar with these linear algebra concepts, refer to our comprehensive guide on Linear Algebra for a better understanding.

Different types of norms:

  • Euclidean Norm (L2 norm)
  • Manhattan Norm (L1 norm)
  • Maximum norm (Infinity norm or L∞ norm)
  • Spectral norm
  • Frobenius Norm

Norms are not limited to vectors. They can be used for other mathematical objects too. The most commonly used matrix norms are the induced norm, Spectral norm, and Frobenius norm.

Norms help us solve linear equations and understand vector spaces and subspaces. They are also used to analyze matrix properties and study transformations, so they play a big role in linear algebra calculations.

Why do we need Norms in AI?

Norms play a big role in AI. They are used for regularization, controlling model complexity, normalization, data preprocessing, feature engineering, similarity measuring, optimization, and more.

  • Regularization and model complexity control: Norms are used as regularization techniques to prevent overfitting (when a model becomes too specialized to the training data and performs poorly on unseen data) and improve generalization (the ability to perform accurately on unseen or new data).
  • Distance Metrics and Similarity Measures: Norms provide a way to measure the similarity or distance between data points, so they are used in clustering algorithms, nearest neighbor search, recommendation systems, and other AI applications that involve comparing and grouping data instances. The Euclidean norm, the Manhattan norm, and cosine similarity are commonly used to quantify the similarity or dissimilarity between data points.
  • Optimization Algorithms: Norms play a role in optimization algorithms used in AI. Gradient-based optimization methods like gradient descent use norms to compute and update the gradients of the objective function.
  • Loss Functions: Norms are used in defining loss functions (measuring the error or difference between predicted and true value) in various AI tasks, such as regression and classification. For example, the mean squared error (MSE) loss, which is commonly used in regression, measures the squared L2 norm between predicted and target values. The L1 norm is used in loss functions like mean absolute error (MAE), which measures the absolute difference between predicted and target values.

Overall, norms provide mathematical tools that enable AI algorithms to handle, process, and interpret data effectively. They contribute to model regularization, feature normalization, distance computations, optimization, loss functions, and evaluation, which are essential aspects of building and deploying AI systems.
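To make the loss-function point concrete, here is a minimal sketch of MSE and MAE expressed through the L2 and L1 norms in PyTorch. The prediction and target values are made-up illustration data, not from the article:

```python
import torch

# Hypothetical predictions and targets for a small regression task.
pred = torch.tensor([2.5, 0.0, 2.0, 8.0])
target = torch.tensor([3.0, -0.5, 2.0, 7.0])

err = pred - target  # element-wise errors: [-0.5, 0.5, 0.0, 1.0]

# MSE is the squared L2 norm of the error vector, divided by n.
mse = torch.norm(err, p=2) ** 2 / err.numel()

# MAE is the L1 norm of the error vector, divided by n.
mae = torch.norm(err, p=1) / err.numel()

print(mse.item())  # ~0.375
print(mae.item())  # ~0.5
```

In practice you would use torch.nn.MSELoss or torch.nn.L1Loss, but internally they compute exactly these norm-based quantities.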

Types of Norms

There are a lot of norms that we can use in AI, machine learning, and various mathematical applications. Here are some of the popular types of norms used in AI:

Throughout this article, I use Python with PyTorch for the code examples. You can follow our installation guide to set up your PC properly!

Euclidean Norm (L2 norm)

The Euclidean norm is defined as the square root of the sum of the squares of the vector components. For a vector x = [x1, x2, …, xn], the Euclidean norm is given by:

||x|| = √(x1^2 + x2^2 + … + xn^2)

For example, if you have the vector [3, -4], its Euclidean norm can be calculated as: ||[3, -4]|| = √(3^2 + (-4)^2) = √(9 + 16) = √25 = 5 (the Euclidean norm of [3, -4] is 5).

In PyTorch we use the torch.norm() function to calculate the Euclidean norm. Remember, the elements of the tensor should be float or complex values; otherwise, it raises an error (a condition used to avoid information loss).
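A minimal example with the same vector as above (the original code was shown as a screenshot, so this is my reconstruction):

```python
import torch

# Float dtype is required by torch.norm().
x = torch.tensor([3.0, -4.0])

# torch.norm() defaults to the L2 (Euclidean) norm.
l2 = torch.norm(x)
print(l2)  # tensor(5.)
```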


Manhattan Norm (L1 norm)

The Manhattan norm is defined as the sum of the absolute values of the vector components. This norm calculates the length of a vector by adding up the absolute values (the sign of each value is ignored) of its components. For a vector x = [x1, x2, …, xn], the Manhattan norm is given by:

||x|| = |x1| + |x2| + … + |xn|

For example, if you have the vector [3, -4], its Manhattan norm can be calculated as: ||[3, -4]|| = |3| + |-4| = 3 + 4 = 7 (the Manhattan norm of [3, -4] is 7).

In PyTorch we can calculate the Manhattan norm with torch.norm(x, p=1) for float tensors, or by using the abs() function to get the absolute values of the elements and then adding them up with sum(). With the abs()/sum() approach, the elements of the tensor do not need to be float or complex values.
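A sketch of both approaches, again reconstructing what the original screenshot likely showed:

```python
import torch

# Integer tensors work here because abs() and sum() do not require floats.
x = torch.tensor([3, -4])

l1 = x.abs().sum()
print(l1)  # tensor(7)

# For float tensors, torch.norm() with p=1 gives the same result.
l1_alt = torch.norm(x.float(), p=1)
print(l1_alt)  # tensor(7.)
```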


Frobenius Norm

The Frobenius norm is used to calculate the norm of matrices. It is defined as the square root of the sum of the squares of a matrix’s elements:

||X||_F = √(Σi Σj xij^2), where i = 1..m and j = 1..n

This behaves like the L2 norm does for vectors. We use the torch.norm() function in PyTorch to calculate the Frobenius norm of a matrix. Remember, the elements of the tensor should be float or complex values; otherwise, it raises an error (a condition used to avoid information loss).


torch.ones() – creates a tensor filled with the value 1. Here (3, 4) creates a 3-by-4 matrix of ones, stored as float values (1.).
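Putting that together, a minimal reconstruction of the screenshotted example:

```python
import torch

# A 3x4 matrix of float ones, as described above.
X = torch.ones((3, 4))

# torch.norm() on a matrix computes the Frobenius norm by default:
# the square root of the sum of squared elements = sqrt(12) ≈ 3.4641.
fro = torch.norm(X)
print(fro)
```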


Throughout this article, we looked at the concept of a vector norm, its properties, and some commonly used norms in AI, with practical examples and code. You will see how useful norms are when you start building AI models. Until then, keep practicing this linear algebra concept.
