
PyTorch hstack()

PyTorch is an open-source machine learning framework for the Python programming language.

A tensor is a multidimensional array used to store data. To work with tensors, we first import the torch module.

To create a tensor, we use the tensor() method.

Syntax:
torch.tensor(data)

Here, data is a multidimensional array (a list or nested list of values).
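For example, a minimal sketch that creates a one-dimensional and a two-dimensional tensor (the values here are arbitrary):

#import torch module
import torch

#create a 1-D tensor from a list
t1 = torch.tensor([1, 2, 3])

#create a 2-D tensor from a nested list
t2 = torch.tensor([[1, 2], [3, 4]])

#check the shapes
print(t1.shape)   # torch.Size([3])
print(t2.shape)   # torch.Size([2, 2])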

torch.hstack()

torch.hstack() joins two or more tensors horizontally (column-wise). One-dimensional tensors are concatenated along the first axis; tensors with two or more dimensions are concatenated along the second axis (dim 1).

Syntax:
torch.hstack((tensor_object1, tensor_object2, ...))

Parameter:
It takes a tuple (or list) containing two or more tensors.
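Note that the tensors are passed together as a single tuple (or list), not as separate arguments. For tensors with two or more dimensions, hstack() joins along the column dimension (dim 1), which gives the same result as torch.cat() with dim=1. A minimal sketch (the tensor values are arbitrary):

#import torch module
import torch

#two 2-D tensors with the same number of rows
a = torch.tensor([[1, 2], [3, 4]])
b = torch.tensor([[5], [6]])

#hstack joins 2-D tensors along dim 1 (columns)
print(torch.hstack((a, b)))

#same result as torch.cat along dim 1
print(torch.cat((a, b), dim=1))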

Example 1:
In this example, we will create two one-dimensional tensors and join them horizontally using torch.hstack().

#import torch module
import torch
 
#create 2 tensors
data1 = torch.tensor([10,20,40,50])  
data2 = torch.tensor([2,3,4,5])
#display
print("Actual Tensors: ")
print(data1)
print(data2)
 
#join two tensors
print("Joined Tensor: ",torch.hstack((data1,data2)))

Output:

Actual Tensors:
tensor([10, 20, 40, 50])
tensor([2, 3, 4, 5])
Joined Tensor:  tensor([10, 20, 40, 50,  2,  3,  4,  5])

The two tensors are joined horizontally.

Example 2:
In this example, we will create five one-dimensional tensors and join them horizontally using torch.hstack().

#import torch module
import torch
 
 
#create 5 tensors
data1 = torch.tensor([10,20,40,50])  
data2 = torch.tensor([2,3,4,5])
data3 = torch.tensor([12])  
data4 = torch.tensor([100])
data5 = torch.tensor([120,456])  
 
#display
print("Actual Tensors: ")
print(data1)
print(data2)
print(data3)
print(data4)
print(data5)
 
#join five tensors
print("Joined Tensor: ",torch.hstack((data1,data2,data3,data4,data5)))

Output:

Actual Tensors:
tensor([10, 20, 40, 50])
tensor([2, 3, 4, 5])
tensor([12])
tensor([100])
tensor([120, 456])
Joined Tensor:  tensor([ 10,  20,  40,  50,   2,   3,   4,   5,  12, 100, 120, 456])

Five tensors are joined horizontally.

Example 3:
In this example, we will create five two-dimensional tensors and join them horizontally using torch.hstack().

#import torch module
import torch
 
 
#create 5 tensors with 2 Dimensions each
data1 = torch.tensor([[10,20,40,50],[1,2,3,4]])  
data2 = torch.tensor([[2,3,4,5],[20,70,89,0]])
data3 = torch.tensor([[12],[56]])  
data4 = torch.tensor([[100],[67]])
data5 = torch.tensor([[120],[456]])  
 
#display
print("Actual Tensors: ")
print(data1)
print(data2)
print(data3)
print(data4)
print(data5)
 
#join five  tensors
print("Joined Tensor: ",torch.hstack((data1,data2,data3,data4,data5)))

Output:

Actual Tensors:
tensor([[10, 20, 40, 50],
        [ 1,  2,  3,  4]])
tensor([[ 2,  3,  4,  5],
        [20, 70, 89,  0]])
tensor([[12],
        [56]])
tensor([[100],
        [ 67]])
tensor([[120],
        [456]])
Joined Tensor:  tensor([[ 10,  20,  40,  50,   2,   3,   4,   5,  12, 100, 120],
        [  1,   2,   3,   4,  20,  70,  89,   0,  56,  67, 456]])

Five tensors are joined horizontally.
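Note that joining two-dimensional tensors horizontally requires every tensor to have the same number of rows; otherwise, hstack() raises a RuntimeError. A small sketch illustrating this (the values are arbitrary):

#import torch module
import torch

#same number of rows: hstack() works
a = torch.tensor([[1, 2], [3, 4]])
b = torch.tensor([[5], [6]])
print(torch.hstack((a, b)))

#different number of rows: hstack() raises an error
c = torch.tensor([[7, 8]])
try:
    torch.hstack((a, c))
except RuntimeError as error:
    print("Cannot join:", error)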

Working with the CPU

If you want to make sure hstack() runs on the CPU, you can move the tensors to the CPU with the cpu() method. Tensors are created on the CPU by default, so the call usually changes nothing, but it guarantees CPU placement even if a tensor was previously moved to a GPU.

The cpu() method can be chained right after creating the tensor.

Syntax:
torch.tensor(data).cpu()
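The placement can be checked through the tensor's device attribute. A minimal sketch (the values are arbitrary):

#import torch module
import torch

#tensors are created on the CPU by default; cpu() guarantees CPU placement
data1 = torch.tensor([10, 20, 40, 50]).cpu()
data2 = torch.tensor([2, 3, 4, 5]).cpu()

#confirm the device before joining
print(data1.device, data2.device)   # cpu cpu

print(torch.hstack((data1, data2)))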

Example 1:
In this example, we will create two one-dimensional tensors on the CPU and join them horizontally using torch.hstack().

#import torch module
import torch
 
#create 2 tensors
data1 = torch.tensor([10,20,40,50]).cpu()  
data2 = torch.tensor([2,3,4,5]).cpu()
#display
print("Actual Tensors: ")
print(data1)
print(data2)
 
#join two tensors
print("Joined Tensor: ",torch.hstack((data1,data2)))

Output:

Actual Tensors:
tensor([10, 20, 40, 50])
tensor([2, 3, 4, 5])
Joined Tensor:  tensor([10, 20, 40, 50,  2,  3,  4,  5])

Two tensors are joined horizontally.

Example 2:

In this example, we will create five one-dimensional tensors on the CPU and join them horizontally using torch.hstack().

#import torch module
import torch
 
 
#create 5 tensors
data1 = torch.tensor([10,20,40,50]).cpu()  
data2 = torch.tensor([2,3,4,5]).cpu()
data3 = torch.tensor([12]).cpu()  
data4 = torch.tensor([100]).cpu()
data5 = torch.tensor([120,456]).cpu()  
 
#display
print("Actual Tensors: ")
print(data1)
print(data2)
print(data3)
print(data4)
print(data5)
 
#join five tensors
print("Joined Tensor: ",torch.hstack((data1,data2,data3,data4,data5)))

Output:

Actual Tensors:
tensor([10, 20, 40, 50])
tensor([2, 3, 4, 5])
tensor([12])
tensor([100])
tensor([120, 456])
Joined Tensor:  tensor([ 10,  20,  40,  50,   2,   3,   4,   5,  12, 100, 120, 456])

Five tensors are joined horizontally.

Example 3:
In this example, we will create five two-dimensional tensors on the CPU and join them horizontally using torch.hstack().

#import torch module
import torch
 
#create 5 tensors with 2 Dimensions each
data1 = torch.tensor([[10,20,40,50],[1,2,3,4]]).cpu()  
data2 = torch.tensor([[2,3,4,5],[20,70,89,0]]).cpu()
data3 = torch.tensor([[12],[56]]).cpu()  
data4 = torch.tensor([[100],[67]]).cpu()  
data5 = torch.tensor([[120],[456]]).cpu()  
 
#display
print("Actual Tensors: ")
print(data1)
print(data2)
print(data3)
print(data4)
print(data5)
 
#join five tensors
print("Joined Tensor: ",torch.hstack((data1,data2,data3,data4,data5)))

Output:

Actual Tensors:
tensor([[10, 20, 40, 50],
        [ 1,  2,  3,  4]])
tensor([[ 2,  3,  4,  5],
        [20, 70, 89,  0]])
tensor([[12],
        [56]])
tensor([[100],
        [ 67]])
tensor([[120],
        [456]])
Joined Tensor:  tensor([[ 10,  20,  40,  50,   2,   3,   4,   5,  12, 100, 120],
        [  1,   2,   3,   4,  20,  70,  89,   0,  56,  67, 456]])

Five tensors are joined horizontally.

Conclusion

We saw how to join two or more tensors horizontally in PyTorch using the hstack() function. In this article, we worked through several examples that join one-dimensional and two-dimensional tensors, and we also ran hstack() on CPU tensors by moving them with the cpu() method.

About the author

Gottumukkala Sravan Kumar

B.Tech (Hons) in Information Technology. Known programming languages: Python, R, PHP, MySQL. Published 500+ articles in the computer science domain.