Super Kai (Kazuya Ito)

hstack and column_stack in PyTorch

hstack() can create the 1D or more D tensor of zero or more elements by horizontally (column-wise) stacking the one or more 0D or more D tensors of zero or more elements, as shown below:

*Memos:

  • hstack() can be used with torch but not with a tensor.
  • The 1st argument with torch is tensors (Required-Type:tuple or list of tensor of int, float, complex or bool). *Basically, the sizes of the tensors must match except in the dimension along which they are stacked (dim 0 for 1D tensors, dim 1 otherwise).
  • There is out argument with torch (Optional-Default:None-Type:tensor): *Memos:
    • out= must be used (it is keyword-only). A minimal sketch of out= follows this list.
    • My post explains out argument.
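
As a minimal sketch of the out= argument (the preallocated tensor result below is my own example, not from the original post): preallocate a tensor whose shape and dtype match the stacked result, then pass it with out=. hstack() fills it in place and also returns it.

import torch

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])

# Preallocate a destination tensor with a matching shape and dtype,
# then pass it with the keyword-only out= argument.
result = torch.empty(6, dtype=torch.int64)

torch.hstack(tensors=(tensor1, tensor2), out=result)
# tensor([2, 7, 4, 8, 3, 2])

result
# tensor([2, 7, 4, 8, 3, 2])
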
import torch

tensor1 = torch.tensor(2)
tensor2 = torch.tensor(7)
tensor3 = torch.tensor(4)

torch.hstack(tensors=(tensor1, tensor2, tensor3))
# tensor([2, 7, 4])

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])
tensor3 = torch.tensor([5, 0, 8])

torch.hstack(tensors=(tensor1, tensor2, tensor3))
# tensor([2, 7, 4, 8, 3, 2, 5, 0, 8])

tensor1 = torch.tensor([[2, 7, 4], [8, 3, 2]])
tensor2 = torch.tensor([[5, 0, 8], [3, 6, 1]])
tensor3 = torch.tensor([[9, 4, 7], [1, 0, 5]])

torch.hstack(tensors=(tensor1, tensor2, tensor3))
# tensor([[2, 7, 4, 5, 0, 8, 9, 4, 7],
#         [8, 3, 2, 3, 6, 1, 1, 0, 5]])

tensor1 = torch.tensor([[2., 7., 4.], [8., 3., 2.]])
tensor2 = torch.tensor([[5., 0., 8.], [3., 6., 1.]])
tensor3 = torch.tensor([[9., 4., 7.], [1., 0., 5.]])

torch.hstack(tensors=(tensor1, tensor2, tensor3))
# tensor([[2., 7., 4., 5., 0., 8., 9., 4., 7.],
#         [8., 3., 2., 3., 6., 1., 1., 0., 5.]])

tensor1 = torch.tensor([[2.+0.j, 7.+0.j, 4.+0.j],
                        [8.+0.j, 3.+0.j, 2.+0.j]])
tensor2 = torch.tensor([[5.+0.j, 0.+0.j, 8.+0.j],
                        [3.+0.j, 6.+0.j, 1.+0.j]])
tensor3 = torch.tensor([[9.+0.j, 4.+0.j, 7.+0.j],
                        [1.+0.j, 0.+0.j, 5.+0.j]])
torch.hstack(tensors=(tensor1, tensor2, tensor3))
# tensor([[2.+0.j, 7.+0.j, 4.+0.j, 5.+0.j, 0.+0.j,
#          8.+0.j, 9.+0.j, 4.+0.j, 7.+0.j],
#         [8.+0.j, 3.+0.j, 2.+0.j, 3.+0.j, 6.+0.j,
#          1.+0.j, 1.+0.j, 0.+0.j, 5.+0.j]])

tensor1 = torch.tensor([[True, False, True], [False, True, False]])
tensor2 = torch.tensor([[False, True, False], [True, False, True]])
tensor3 = torch.tensor([[True, False, True], [False, True, False]])

torch.hstack(tensors=(tensor1, tensor2, tensor3))
# tensor([[True, False, True, False, True, False, True, False, True],
#         [False, True, False, True, False, True, False, True, False]])

tensor1 = torch.tensor([[[2, 7, 4]]])
tensor2 = torch.tensor([])
tensor3 = torch.tensor([[[5, 0, 8]]])

torch.hstack(tensors=(tensor1, tensor2, tensor3))
# tensor([[[2., 7., 4.],
#          [5., 0., 8.]]])
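
As a rough mental model (a minimal sketch, not from the original post): for 1D tensors, hstack() behaves like torch.cat() along dim 0; for 2D or more D tensors, it behaves like torch.cat() along dim 1 (0D tensors are first promoted to 1D, as torch.atleast_1d() would do).

import torch

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])

# For 1D tensors, hstack() matches torch.cat() along dim 0.
torch.equal(torch.hstack(tensors=(tensor1, tensor2)),
            torch.cat((tensor1, tensor2), dim=0))
# True

tensor1 = torch.tensor([[2, 7, 4], [8, 3, 2]])
tensor2 = torch.tensor([[5, 0, 8], [3, 6, 1]])

# For 2D or more D tensors, hstack() matches torch.cat() along dim 1.
torch.equal(torch.hstack(tensors=(tensor1, tensor2)),
            torch.cat((tensor1, tensor2), dim=1))
# True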

column_stack() can create the 2D or more D tensor of zero or more elements by horizontally (column-wise) stacking the one or more 0D or more D tensors of zero or more elements, as shown below:

*Memos:

  • column_stack() can be used with torch but not with a tensor.
  • The 1st argument with torch is tensors (Required-Type:tuple or list of tensor of int, float, complex or bool). *Basically, the sizes of the tensors must match except in the dimension along which they are stacked (dim 1, after 0D and 1D tensors are reshaped into columns).
  • There is out argument with torch (Optional-Default:None-Type:tensor): *Memos:
    • out= must be used (it is keyword-only).
    • My post explains out argument.
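
As a minimal sketch of how column_stack() relates to hstack() (my own illustration, not from the original post): each 0D or 1D tensor is first reshaped into a (numel, 1) column, and the columns are then stacked horizontally; 2D or more D tensors are passed through unchanged.

import torch

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])

# Reshape each 1D tensor into a (numel, 1) column, then hstack() the columns.
column1 = tensor1.reshape(-1, 1)
column2 = tensor2.reshape(-1, 1)

torch.equal(torch.column_stack(tensors=(tensor1, tensor2)),
            torch.hstack(tensors=(column1, column2)))
# True
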
import torch

tensor1 = torch.tensor(2)
tensor2 = torch.tensor(7)
tensor3 = torch.tensor(4)

torch.column_stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[2, 7, 4]])

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])
tensor3 = torch.tensor([5, 0, 8])

torch.column_stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[2, 8, 5],
#         [7, 3, 0],
#         [4, 2, 8]])

tensor1 = torch.tensor([[2, 7, 4], [8, 3, 2]])
tensor2 = torch.tensor([[5, 0, 8], [3, 6, 1]])
tensor3 = torch.tensor([[9, 4, 7], [1, 0, 5]])

torch.column_stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[2, 7, 4, 5, 0, 8, 9, 4, 7],
#         [8, 3, 2, 3, 6, 1, 1, 0, 5]])

tensor1 = torch.tensor([[2., 7., 4.], [8., 3., 2.]])
tensor2 = torch.tensor([[5., 0., 8.], [3., 6., 1.]])
tensor3 = torch.tensor([[9., 4., 7.], [1., 0., 5.]])

torch.column_stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[2., 7., 4., 5., 0., 8., 9., 4., 7.],
#         [8., 3., 2., 3., 6., 1., 1., 0., 5.]])

tensor1 = torch.tensor([[2.+0.j, 7.+0.j, 4.+0.j],
                        [8.+0.j, 3.+0.j, 2.+0.j]])
tensor2 = torch.tensor([[5.+0.j, 0.+0.j, 8.+0.j],
                        [3.+0.j, 6.+0.j, 1.+0.j]])
tensor3 = torch.tensor([[9.+0.j, 4.+0.j, 7.+0.j],
                        [1.+0.j, 0.+0.j, 5.+0.j]])
torch.column_stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[2.+0.j, 7.+0.j, 4.+0.j, 5.+0.j, 0.+0.j,
#          8.+0.j, 9.+0.j, 4.+0.j, 7.+0.j],
#         [8.+0.j, 3.+0.j, 2.+0.j, 3.+0.j, 6.+0.j,
#          1.+0.j, 1.+0.j, 0.+0.j, 5.+0.j]])

tensor1 = torch.tensor([[True, False, True], [False, True, False]])
tensor2 = torch.tensor([[False, True, False], [True, False, True]])
tensor3 = torch.tensor([[True, False, True], [False, True, False]])

torch.column_stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[True, False, True, False, True, False, True, False, True],
#         [False, True, False, True, False, True, False, True, False]])

tensor1 = torch.tensor([[]])
tensor2 = torch.tensor([8])
tensor3 = torch.tensor([[]])

torch.column_stack(tensors=(tensor1, tensor2, tensor3))
# tensor([[8.]])
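
Putting the two together, a minimal side-by-side sketch with the same 1D inputs (my own example, not from the original post): hstack() keeps 1D inputs 1D and concatenates them end to end, while column_stack() turns the same inputs into the columns of a 2D tensor.

import torch

tensor1 = torch.tensor([2, 7, 4])
tensor2 = torch.tensor([8, 3, 2])

torch.hstack(tensors=(tensor1, tensor2))
# tensor([2, 7, 4, 8, 3, 2])

torch.column_stack(tensors=(tensor1, tensor2))
# tensor([[2, 8],
#         [7, 3],
#         [4, 2]])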
