
Add tp.pad #203

Draft · wants to merge 2 commits into main

Conversation

yizhuoz004 (Collaborator)

No description provided.

@yizhuoz004 (Collaborator, Author)

Blocked until translation from stablehlo.dynamic_pad -> tensorrt.slice is fixed.

@yizhuoz004 yizhuoz004 self-assigned this Sep 11, 2024
Args:
    input: The input tensor.

    padding_sizes: A sequence of padding sizes of each dimension. Its length must equal to the rank
Collaborator:

Suggested change:
- padding_sizes: A sequence of padding sizes of each dimension. Its length must equal to the rank
+ padding_sizes: A sequence of padding sizes of each dimension. Its length must be equal to the rank

    input: The input tensor.

    padding_sizes: A sequence of padding sizes of each dimension. Its length must equal to the rank
    of `input`. Each element of `padding_size` is a tuple of integers or scalars `(low, high)`,
Collaborator:

padding_size can also be a Tensor.

Collaborator (Author):

I don't think passing padding_sizes as an Nx2 Tensor is a common use case; it is always more convenient for users to construct a sequence than a Tensor.

However, using ScalarShape in padding_sizes is supported, for example:

pad(inp, [(0, a.shape[0]), (0, a.shape[1])])
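For reference, the per-dimension `(low, high)` pair convention described in the docstring matches NumPy's `np.pad`, which can be used to sketch the expected semantics (this is an illustration only, not `tp.pad`'s actual lowering, which goes through stablehlo/TensorRT):

```python
import numpy as np

# Sketch of the (low, high) per-dimension padding semantics.
# np.pad takes the same sequence-of-pairs convention as the
# `padding_sizes` argument discussed above.
inp = np.arange(6).reshape(2, 3)

# Pad 0 rows before / 1 row after dim 0, and 2 cols before / 0 after dim 1.
padded = np.pad(inp, [(0, 1), (2, 0)])

print(padded.shape)  # (3, 5)
```

With one `(low, high)` pair per dimension, the output extent of each dimension is `low + extent + high`.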

sizes_1d = []
for size in padding_sizes:
    if isinstance(size, Tensor):
        assert size.rank == 0
Collaborator:

Suggested change:
-         assert size.rank == 0
+         assert size.rank == 0, f"Size expected to be of rank 0, got {size.rank}."
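A self-contained sketch of the validation loop above, using a hypothetical stand-in `Tensor` class (tripy's real class is not imported here), shows how the suggested assertion message would surface:

```python
# Stand-in for tripy's Tensor, for illustration only: just carries a rank.
class Tensor:
    def __init__(self, rank):
        self.rank = rank

def check_padding_sizes(padding_sizes):
    # Mirrors the reviewed loop: scalar (rank-0) Tensors are allowed
    # inside padding_sizes; anything of higher rank is rejected with
    # the descriptive message from the suggested change.
    sizes_1d = []
    for size in padding_sizes:
        if isinstance(size, Tensor):
            assert size.rank == 0, f"Size expected to be of rank 0, got {size.rank}."
        sizes_1d.append(size)
    return sizes_1d

check_padding_sizes([1, Tensor(0), 2])  # OK: a scalar Tensor passes

try:
    check_padding_sizes([Tensor(1)])
except AssertionError as e:
    print(e)  # Size expected to be of rank 0, got 1.
```

The f-string message costs nothing on the success path and makes failures immediately diagnosable, which is the point of the suggestion.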
