A tensor is a multidimensional array used to store data. To work with tensors, we first have to import the torch module.
To create a tensor, we use the tensor() method.
Syntax:
torch.tensor(data)
Where data is a one-dimensional or multi-dimensional array, such as a Python list.
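As a quick illustration (a minimal sketch; the variable names are only examples), a tensor can be created from a Python list and its shape inspected:
import torch

# create a 1-D tensor from a Python list
t1 = torch.tensor([1, 2, 3])

# create a 2-D tensor from a nested list
t2 = torch.tensor([[1, 2], [3, 4]])

print(t1.shape)  # torch.Size([3])
print(t2.shape)  # torch.Size([2, 2])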
torch.column_stack()
torch.column_stack() joins two or more tensors horizontally.
Syntax:
torch.column_stack((tensor_object1, tensor_object2, …))
Parameter:
It takes a sequence (such as a tuple) of two or more tensors. One-dimensional tensors are stacked as columns, and two-dimensional tensors are joined along their columns.
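Before the full examples, here is a minimal sketch (the tensor names are only illustrative) showing how the input shapes relate to the output shape, and that mismatched sizes raise an error:
import torch

a = torch.tensor([1, 2, 3, 4])
b = torch.tensor([5, 6, 7, 8])

# two 1-D tensors of length 4 become the columns of a 4 x 2 tensor
result = torch.column_stack((a, b))
print(result.shape)  # torch.Size([4, 2])

# tensors with different lengths cannot be column-stacked
try:
    torch.column_stack((a, torch.tensor([1, 2])))
except RuntimeError as e:
    print("Size mismatch:", e)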
Example 1:
In this example, we will create two one-dimensional tensors and join them horizontally using torch.column_stack().
import torch
#create 2 tensors
data1 = torch.tensor([10,20,40,50])
data2 = torch.tensor([2,3,4,5])
#display
print("Actual Tensors: ")
print(data1)
print(data2)
#join two tensors
print("Joined Tensor: ",torch.column_stack((data1,data2)))
Output:
tensor([10, 20, 40, 50])
tensor([2, 3, 4, 5])
Joined Tensor: tensor([[10, 2],
[20, 3],
[40, 4],
[50, 5]])
Two tensors are joined horizontally.
Example 2:
In this example, we will create five one-dimensional tensors and join them horizontally using torch.column_stack().
import torch
#create 5 tensors
data1 = torch.tensor([10,20,40,50])
data2 = torch.tensor([2,3,4,5])
data3 = torch.tensor([12,45,67,89])
data4 = torch.tensor([100,32,45,67])
data5 = torch.tensor([120,456,1,1])
#display
print("Actual Tensors: ")
print(data1)
print(data2)
print(data3)
print(data4)
print(data5)
#join five tensors
print("Joined Tensor: ",torch.column_stack((data1,data2,data3,data4,data5)))
Output:
tensor([10, 20, 40, 50])
tensor([2, 3, 4, 5])
tensor([12, 45, 67, 89])
tensor([100, 32, 45, 67])
tensor([120, 456, 1, 1])
Joined Tensor: tensor([[ 10, 2, 12, 100, 120],
[ 20, 3, 45, 32, 456],
[ 40, 4, 67, 45, 1],
[ 50, 5, 89, 67, 1]])
Five tensors are joined horizontally.
Example 3:
In this example, we will create five two-dimensional tensors and join them horizontally using torch.column_stack().
import torch
#create 5 tensors with 2 Dimensions each
data1 = torch.tensor([[10,20,40,50],[1,2,3,4]])
data2 = torch.tensor([[2,3,4,5],[20,70,89,0]])
data3 = torch.tensor([[12,4,5,6],[56,34,56,787]])
data4 = torch.tensor([[100,1,2,3],[67,87,6,78]])
data5 = torch.tensor([[120,33,56,78],[45,56,78,6]])
#display
print("Actual Tensors: ")
print(data1)
print(data2)
print(data3)
print(data4)
print(data5)
#join five tensors
print("Joined Tensor: ",torch.column_stack((data1,data2,data3,data4,data5)))
Output:
tensor([[10, 20, 40, 50],
[ 1, 2, 3, 4]])
tensor([[ 2, 3, 4, 5],
[20, 70, 89, 0]])
tensor([[ 12, 4, 5, 6],
[ 56, 34, 56, 787]])
tensor([[100, 1, 2, 3],
[ 67, 87, 6, 78]])
tensor([[120, 33, 56, 78],
[ 45, 56, 78, 6]])
Joined Tensor: tensor([[ 10, 20, 40, 50, 2, 3, 4, 5, 12, 4, 5, 6, 100, 1,
2, 3, 120, 33, 56, 78],
[ 1, 2, 3, 4, 20, 70, 89, 0, 56, 34, 56, 787, 67, 87,
6, 78, 45, 56, 78, 6]])
Five tensors are joined horizontally.
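For two-dimensional inputs, column_stack() amounts to horizontal concatenation along the column dimension. The following is a minimal sketch (the tensor names are only illustrative) comparing it with torch.hstack():
import torch

x = torch.tensor([[1, 2], [3, 4]])
y = torch.tensor([[5, 6], [7, 8]])

# for 2-D tensors, column_stack is equivalent to hstack (concatenation along dim 1)
a = torch.column_stack((x, y))
b = torch.hstack((x, y))

print(torch.equal(a, b))  # True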
Work with CPU
If you want to run column_stack() on the CPU, the input tensors must be CPU tensors. New tensors are created on the CPU by default, and the cpu() method returns the tensor on the CPU (it is a no-op if the tensor already lives there), so we can chain cpu() onto the tensor at creation time.
Syntax:
torch.tensor(data).cpu()
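As a minimal sketch (assuming the default PyTorch setup, where new tensors are placed on the CPU), you can confirm where a tensor lives with its device attribute:
import torch

# cpu() returns the tensor on the CPU (a no-op if it is already there)
data = torch.tensor([10, 20, 40, 50]).cpu()

print(data.device)  # cpu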
Example 1:
In this example, we will create two one-dimensional tensors on the CPU and join them horizontally using torch.column_stack().
import torch
#create 2 tensors
data1 = torch.tensor([10,20,40,50]).cpu()
data2 = torch.tensor([2,3,4,5]).cpu()
#display
print("Actual Tensors: ")
print(data1)
print(data2)
#join two tensors
print("Joined Tensor: ",torch.column_stack((data1,data2)))
Output:
tensor([10, 20, 40, 50])
tensor([2, 3, 4, 5])
Joined Tensor: tensor([[10, 2],
[20, 3],
[40, 4],
[50, 5]])
Two tensors are joined horizontally.
Example 2:
In this example, we will create five one-dimensional tensors on the CPU and join them horizontally using torch.column_stack().
import torch
#create 5 tensors
data1 = torch.tensor([10,20,40,50]).cpu()
data2 = torch.tensor([2,3,4,5]).cpu()
data3 = torch.tensor([12,45,67,89]).cpu()
data4 = torch.tensor([100,32,45,67]).cpu()
data5 = torch.tensor([120,456,1,1]).cpu()
#display
print("Actual Tensors: ")
print(data1)
print(data2)
print(data3)
print(data4)
print(data5)
#join five tensors
print("Joined Tensor: ",torch.column_stack((data1,data2,data3,data4,data5)))
Output:
tensor([10, 20, 40, 50])
tensor([2, 3, 4, 5])
tensor([12, 45, 67, 89])
tensor([100, 32, 45, 67])
tensor([120, 456, 1, 1])
Joined Tensor: tensor([[ 10, 2, 12, 100, 120],
[ 20, 3, 45, 32, 456],
[ 40, 4, 67, 45, 1],
[ 50, 5, 89, 67, 1]])
Five tensors are joined horizontally.
Example 3:
In this example, we will create five two-dimensional tensors on the CPU and join them horizontally using torch.column_stack().
import torch
#create 5 tensors with 2 Dimensions each
data1 = torch.tensor([[10,20,40,50],[1,2,3,4]]).cpu()
data2 = torch.tensor([[2,3,4,5],[20,70,89,0]]).cpu()
data3 = torch.tensor([[12,4,5,6],[56,34,56,787]]).cpu()
data4 = torch.tensor([[100,1,2,3],[67,87,6,78]]).cpu()
data5 = torch.tensor([[120,33,56,78],[45,56,78,6]]).cpu()
#display
print("Actual Tensors: ")
print(data1)
print(data2)
print(data3)
print(data4)
print(data5)
#join five tensors
print("Joined Tensor: ",torch.column_stack((data1,data2,data3,data4,data5)))
Output:
tensor([[10, 20, 40, 50],
[ 1, 2, 3, 4]])
tensor([[ 2, 3, 4, 5],
[20, 70, 89, 0]])
tensor([[ 12, 4, 5, 6],
[ 56, 34, 56, 787]])
tensor([[100, 1, 2, 3],
[ 67, 87, 6, 78]])
tensor([[120, 33, 56, 78],
[ 45, 56, 78, 6]])
Joined Tensor: tensor([[ 10, 20, 40, 50, 2, 3, 4, 5, 12, 4, 5, 6, 100, 1,
2, 3, 120, 33, 56, 78],
[ 1, 2, 3, 4, 20, 70, 89, 0, 56, 34, 56, 787, 67, 87,
6, 78, 45, 56, 78, 6]])
Five tensors are joined horizontally.
Conclusion
We saw how to join two or more tensors horizontally in PyTorch using the column_stack() function. In this article, we implemented several examples joining one- and two-dimensional tensors, and we also applied column_stack() to CPU tensors created with the cpu() function.