
training data also normalized? #12

Open
hli2020 opened this issue Apr 6, 2016 · 2 comments


hli2020 commented Apr 6, 2016

Hi G,
I'm new to Torch, so just a quick question about fetching the training data. The training data is supposed to be normalized too, but I don't see any such operation in the dataTrain:getBatch() call. Specifically, the code here does not seem to pass the input values back to batch. Can you point out where I misunderstood? Thanks!


hughperkins commented Apr 16, 2016

If I'm reading correctly:

(Edit: actually, input is not a reference variable; it's a brand-new Torch tensor, but that tensor has the exact same underlying storage as the original tensor, and any changes to the data in the input tensor write through to the exact same storage in the original tensor. This is probably the new information you are looking for?)
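
To make the aliasing concrete, here is a minimal, hypothetical Torch snippet (not the repository's code) showing that indexing a tensor returns a new Tensor object backed by the same Storage, so writes through the view land in the original:

```lua
require 'torch'

local batchInputs = torch.zeros(4, 3)   -- stand-in for batch.inputs
local input = batchInputs[2]            -- new Tensor object, but a view of row 2

-- Distinct Tensor objects, same underlying Storage:
print(torch.pointer(input:storage()) == torch.pointer(batchInputs:storage()))  -- true

-- Mutating the view writes through to the original tensor:
input:fill(7)
print(batchInputs[2][1])  -- 7
```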

gcr (Owner) commented Apr 17, 2016

Yes, that's right. Perhaps this should have been made clearer with a comment or something: input refers to the same memory as batch.inputs, so mutating the values at input will also propagate to batch.inputs.
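
As an illustration of why no explicit copy-back is needed, here is a hedged sketch; the field names, shapes, and normalization constants below are placeholders rather than the repository's actual getBatch code. Each per-sample view is normalized in place, and batch.inputs ends up normalized as a side effect:

```lua
require 'torch'

-- Placeholder per-channel statistics; real values would come from the dataset.
local mean = {0.5, 0.5, 0.5}
local std  = {0.25, 0.25, 0.25}

local batch = { inputs = torch.Tensor(8, 3, 32, 32) }
for i = 1, 8 do
   local input = batch.inputs[i]        -- view of sample i; shares storage with batch.inputs
   input:copy(torch.randn(3, 32, 32))   -- stand-in for loading one image
   for c = 1, 3 do
      -- Normalize channel c in place; this writes straight into batch.inputs.
      input:select(1, c):add(-mean[c]):div(std[c])
   end
end
-- batch.inputs is already normalized; nothing needs to be passed back.
```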
