
Question about grid_scaling #58

Open
MuQoe opened this issue Jun 3, 2024 · 2 comments


MuQoe commented Jun 3, 2024

Regarding this code section:

```python
# post-process cov
scaling = scaling_repeat[:,3:] * torch.sigmoid(scale_rot[:,:3]) # * (1+torch.sigmoid(repeat_dist))
rot = pc.rotation_activation(scale_rot[:,3:7])

# post-process offsets to get centers for gaussians
offsets = offsets * scaling_repeat[:,:3]
xyz = repeat_anchor + offsets
```

What does `scaling_repeat[:,3:]` stand for? I found that it comes from `grid_scaling`, which is initialized by

```python
scales = torch.log(torch.sqrt(dist2))[...,None].repeat(1, 6)
```

Could you explain the role of each dimension in `scaling_repeat`, especially in the context of the slices `[:,3:]` and `[:,:3]`?
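For context, a minimal sketch of what that initialization produces, assuming `dist2` holds squared nearest-neighbor distances per anchor (dummy random values are used here in place of the real distances):

```python
import torch

# Sketch of the quoted initialization; dist2 is assumed to be squared
# nearest-neighbor distances per anchor (dummy values here).
N = 5
dist2 = torch.rand(N) + 1e-4                                   # [N]
scales = torch.log(torch.sqrt(dist2))[..., None].repeat(1, 6)  # [N, 6]
print(scales.shape)  # torch.Size([5, 6])
```

So each anchor starts with the same log-distance value copied into all 6 scale dimensions; the two halves only diverge through training.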

Thanks

@inspirelt (Collaborator)

The `[:,:3]` slice controls the step size of the offsets. The `[:,3:]` slice serves as the base scale for the neural Gaussians' shape, which means the cov MLP learns residual scales.

@renaissanceee

Therefore, the learnable scaling `l` takes charge of both the scales and the positions of the 10 neural Gaussians per anchor.
If the shape of the anchors is [N,3], then the shape of the scaling is [N,6].
Is my understanding correct? I just want to make sure.
