
PyTorch – vstack()

In this PyTorch tutorial, we will discuss the vstack() function.

PyTorch is an open-source framework for the Python programming language. A tensor is a multidimensional array used to store data. To work with tensors, we first import the torch module; a tensor is created with the tensor() method.

Syntax:

torch.tensor(data)

where data is a multidimensional array (for example, a nested Python list).
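As a quick illustration (the values here are arbitrary), a tensor can be built from a nested Python list, and its element type is inferred from the data:

```python
import torch

# create a 2x3 tensor from a nested list; integer data infers dtype int64
t = torch.tensor([[1, 2, 3], [4, 5, 6]])
print(t.shape)  # torch.Size([2, 3])
print(t.dtype)  # torch.int64
```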

torch.vstack()

torch.vstack() joins two or more tensors vertically, i.e., row-wise along the first dimension.

Syntax:

torch.vstack((tensor_object1, tensor_object2, ...))

Parameters:

It takes a tuple or list of two or more tensors. Apart from 1-D inputs, which are treated as rows, the tensors must match in every dimension except the first.
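For 1-D inputs, vstack() first treats each tensor as a row and then concatenates the rows. A small sketch (with arbitrary values) showing this equivalence:

```python
import torch

a = torch.tensor([1, 2, 3])
b = torch.tensor([4, 5, 6])

# vstack on 1-D tensors is the same as reshaping each to one row
# and concatenating along dim 0
stacked = torch.vstack((a, b))
manual = torch.cat((a.reshape(1, -1), b.reshape(1, -1)), dim=0)

print(torch.equal(stacked, manual))  # True
print(stacked.shape)                 # torch.Size([2, 3])
```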

Example 1:

In this example, we will create two one-dimensional tensors and join them vertically using the torch.vstack() function.

#import torch module

import torch

 

#create 2 tensors

data1 = torch.tensor([10,20,40,50])

data2 = torch.tensor([2,3,4,5])

#display

print("Actual Tensors: ")

print(data1)

print(data2)

 

#join two tensors

print("Joined Tensor: ",torch.vstack((data1,data2)))

Output:

Actual Tensors:

tensor([10, 20, 40, 50])

tensor([2, 3, 4, 5])

Joined Tensor: tensor([[10, 20, 40, 50],
        [ 2,  3,  4,  5]])

Two tensors are joined vertically.

Example 2:

In this example, we will create five one-dimensional tensors and join them vertically using the torch.vstack() function.

#import torch module

import torch

 

 

#create 5 tensors

data1 = torch.tensor([10,20,40,50])

data2 = torch.tensor([2,3,4,5])

data3 = torch.tensor([12,45,67,89])

data4 = torch.tensor([100,32,45,67])

data5 = torch.tensor([120,456,1,1])

 

#display

print("Actual Tensors: ")

print(data1)

print(data2)

print(data3)

print(data4)

print(data5)

 

#join five tensors

print("Joined Tensor: ",torch.vstack((data1,data2,data3,data4,data5)))

Output:

Actual Tensors:

tensor([10, 20, 40, 50])

tensor([2, 3, 4, 5])

tensor([12, 45, 67, 89])

tensor([100, 32, 45, 67])

tensor([120, 456, 1, 1])

Joined Tensor: tensor([[ 10,  20, 40, 50],
        [  2,   3,  4,  5],
        [ 12,  45, 67, 89],
        [100,  32, 45, 67],
        [120, 456,  1,  1]])

Five tensors are joined vertically.
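A caveat not shown above: the 1-D tensors must all have the same length, because each one becomes a row of the result. Mismatched lengths raise a RuntimeError, as this quick check (with arbitrary values) demonstrates:

```python
import torch

a = torch.tensor([1, 2, 3, 4])
b = torch.tensor([5, 6])  # shorter tensor

try:
    torch.vstack((a, b))
except RuntimeError:
    # rows must all be the same width
    print("cannot vstack 1-D tensors of different lengths")
```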

Example 3:

In this example, we will create five two-dimensional tensors and join them vertically using the torch.vstack() function.

#import torch module

import torch

 

 

#create 5 tensors with 2 Dimensions each

data1 = torch.tensor([[10,20,40,50],[1,2,3,4]])

data2 = torch.tensor([[2,3,4,5],[20,70,89,0]])

data3 = torch.tensor([[12,4,5,6],[56,34,56,787]])

data4 = torch.tensor([[100,1,2,3],[67,87,6,78]])

data5 = torch.tensor([[120,33,56,78],[45,56,78,6]])

 

#display

print("Actual Tensors: ")

print(data1)

print(data2)

print(data3)

print(data4)

print(data5)

 

#join five tensors

print("Joined Tensor: ",torch.vstack((data1,data2,data3,data4,data5)))

Output:

Actual Tensors:

tensor([[10, 20, 40, 50],
        [ 1,  2,  3,  4]])

tensor([[ 2,  3,  4,  5],
        [20, 70, 89,  0]])

tensor([[12,  4,  5,   6],
        [56, 34, 56, 787]])

tensor([[100,  1, 2,  3],
        [ 67, 87, 6, 78]])

tensor([[120, 33, 56, 78],
        [ 45, 56, 78,  6]])

Joined Tensor: tensor([[ 10, 20, 40,  50],
        [  1,  2,  3,   4],
        [  2,  3,  4,   5],
        [ 20, 70, 89,   0],
        [ 12,  4,  5,   6],
        [ 56, 34, 56, 787],
        [100,  1,  2,   3],
        [ 67, 87,  6,  78],
        [120, 33, 56,  78],
        [ 45, 56, 78,   6]])

Five tensors are joined vertically.
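For 2-D inputs like these, vstack() behaves the same as concatenating along the first dimension with torch.cat(). A small check (with arbitrary values):

```python
import torch

x = torch.tensor([[1, 2], [3, 4]])
y = torch.tensor([[5, 6]])

# vstack on 2-D tensors == concatenation along dim 0
same = torch.equal(torch.vstack((x, y)), torch.cat((x, y), dim=0))
print(same)                        # True
print(torch.vstack((x, y)).shape)  # torch.Size([3, 2])
```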

Work with CPU

If you want to run the vstack() function on the CPU, the tensors it joins must live on the CPU. Tensors created with torch.tensor() are placed on the CPU by default, and the cpu() method returns the tensor in CPU memory (it is a no-op when the tensor is already there).

When we create a tensor, we can chain the cpu() method.

Syntax:

torch.tensor(data).cpu()
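We can confirm where a tensor lives through its device attribute; the result of vstack() stays on the same device as its inputs:

```python
import torch

t1 = torch.tensor([1, 2, 3]).cpu()
t2 = torch.tensor([4, 5, 6]).cpu()

joined = torch.vstack((t1, t2))
print(t1.device)      # cpu
print(joined.device)  # cpu
```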

Example 1:

In this example, we will create two one-dimensional tensors on the CPU and join them vertically using the torch.vstack() function.

#import torch module

import torch

 

#create 2 tensors

data1 = torch.tensor([10,20,40,50]).cpu()

data2 = torch.tensor([2,3,4,5]).cpu()

#display

print("Actual Tensors: ")

print(data1)

print(data2)

 

#join two tensors

print("Joined Tensor: ",torch.vstack((data1,data2)))

Output:

Actual Tensors:

tensor([10, 20, 40, 50])

tensor([2, 3, 4, 5])

Joined Tensor: tensor([[10, 20, 40, 50],
        [ 2,  3,  4,  5]])

Two tensors are joined vertically.

Example 2:

In this example, we will create five one-dimensional tensors on the CPU and join them vertically using the torch.vstack() function.

#import torch module

import torch

 

 

#create 5 tensors

data1 = torch.tensor([10,20,40,50]).cpu()

data2 = torch.tensor([2,3,4,5]).cpu()

data3 = torch.tensor([12,45,67,89]).cpu()

data4 = torch.tensor([100,32,45,67]).cpu()

data5 = torch.tensor([120,456,1,1]).cpu()

 

#display

print("Actual Tensors: ")

print(data1)

print(data2)

print(data3)

print(data4)

print(data5)

 

#join five tensors

print("Joined Tensor: ",torch.vstack((data1,data2,data3,data4,data5)))

Output:

Actual Tensors:

tensor([10, 20, 40, 50])

tensor([2, 3, 4, 5])

tensor([12, 45, 67, 89])

tensor([100, 32, 45, 67])

tensor([120, 456, 1, 1])

Joined Tensor: tensor([[ 10,  20, 40, 50],
        [  2,   3,  4,  5],
        [ 12,  45, 67, 89],
        [100,  32, 45, 67],
        [120, 456,  1,  1]])

Five tensors are joined vertically.

Example 3:

In this example, we will create five two-dimensional tensors on the CPU and join them vertically using the torch.vstack() function.

#import torch module

import torch

 

 

#create 5 tensors with 2 Dimensions each

data1 = torch.tensor([[10,20,40,50],[1,2,3,4]]).cpu()

data2 = torch.tensor([[2,3,4,5],[20,70,89,0]]).cpu()

data3 = torch.tensor([[12,4,5,6],[56,34,56,787]]).cpu()

data4 = torch.tensor([[100,1,2,3],[67,87,6,78]]).cpu()

data5 = torch.tensor([[120,33,56,78],[45,56,78,6]]).cpu()

 

#display

print("Actual Tensors: ")

print(data1)

print(data2)

print(data3)

print(data4)

print(data5)

 

#join five tensors

print("Joined Tensor: ",torch.vstack((data1,data2,data3,data4,data5)))

Output:

Actual Tensors:

tensor([[10, 20, 40, 50],
        [ 1,  2,  3,  4]])

tensor([[ 2,  3,  4,  5],
        [20, 70, 89,  0]])

tensor([[12,  4,  5,   6],
        [56, 34, 56, 787]])

tensor([[100,  1, 2,  3],
        [ 67, 87, 6, 78]])

tensor([[120, 33, 56, 78],
        [ 45, 56, 78,  6]])

Joined Tensor: tensor([[ 10, 20, 40,  50],
        [  1,  2,  3,   4],
        [  2,  3,  4,   5],
        [ 20, 70, 89,   0],
        [ 12,  4,  5,   6],
        [ 56, 34, 56, 787],
        [100,  1,  2,   3],
        [ 67, 87,  6,  78],
        [120, 33, 56,  78],
        [ 45, 56, 78,   6]])

Five tensors are joined vertically.

Conclusion

We learned how to join two or more tensors vertically in PyTorch using the vstack() function. In this article, we implemented several examples joining one- and two-dimensional tensors, and ran vstack() on the CPU using the cpu() method.

About the author

Gottumukkala Sravan Kumar

B.Tech (Hons) in Information Technology. Known programming languages: Python, R, PHP, MySQL. Has published 500+ articles in the computer science domain.