
Adds batch/event shape test and fix failure #29

Open · wants to merge 3 commits into master
Conversation

@feynmanliang (Collaborator) commented Jan 13, 2021

NOTE: it may be incorrect to remove event_dim=1 from the bijector; @stefanwebb to confirm


@feynmanliang changed the title from "Adds batch/event shape test and fix fialure" to "Adds batch/event shape test and fix failure" on Jan 13, 2021
```diff
@@ -11,7 +11,7 @@
 class AffineAutoregressive(flowtorch.Bijector):
-    event_dim = 1
+    event_dim = 0
```
@feynmanliang (Collaborator Author) commented on the diff:

The batch_dim erroneously gets reinterpreted as an event_dim when this is 1
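For context, here is a minimal sketch of the shape semantics in question, written against plain torch.distributions rather than the flowtorch API: reinterpreting one trailing batch dimension as an event dimension (which is effectively what event_dim = 1 asks for) folds that dimension out of batch_shape and into event_shape.

```python
import torch
import torch.distributions as dist

# Base distribution with two batch dimensions and no event dimensions.
base = dist.Normal(torch.zeros(5, 3), torch.ones(5, 3))
print(base.batch_shape, base.event_shape)  # torch.Size([5, 3]) torch.Size([])

# Reinterpreting 1 trailing batch dim as an event dim (analogous to event_dim = 1):
# the rightmost batch dimension moves into event_shape.
reinterpreted = dist.Independent(base, reinterpreted_batch_ndims=1)
print(reinterpreted.batch_shape, reinterpreted.event_shape)  # torch.Size([5]) torch.Size([3])
```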

Owner:

Ah, shouldn't this be by design? The AffineAutoregressive makes the final dimension of a random variable dependent, so it will introduce correlations if the base distribution's final dimension is a batch dim
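A small illustration of that concern (illustrative torch code, not flowtorch itself): an autoregressive-style lower-triangular mixing of the last dimension makes its entries dependent, so if that dimension is actually a batch dimension, previously independent samples pick up correlations.

```python
import torch

x = torch.randn(10000, 3)                    # 3 "batch" instances, independent of each other
mix = torch.tensor([[1.0, 0.0, 0.0],
                    [0.5, 1.0, 0.0],
                    [0.5, 0.5, 1.0]])         # lower-triangular (autoregressive-style) mixing
y = x @ mix.T                                 # y[:, i] now depends on x[:, :i+1]
print(torch.corrcoef(y.T))                    # nonzero off-diagonal correlations appear
```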

@feynmanliang (Collaborator Author):

Should AffineAutoregressive (or any Bijector) continue across the batch dimension? Across the batch dimension, samples are independent (https://www.tensorflow.org/probability/api_docs/python/tfp/distributions/Distribution#shapes_2), so I'm inclined to say a completely separate bijector (which cannot use autoregressive predictors from the previous batch instance) should be instantiated, for a total of prod(batch_shape) bijectors.

This is similar to sample_dim, except there we re-use the same Bijector due to IID sampling, whereas here samples across batch_dim are not identically distributed.
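For reference, a hedged sketch of the kind of batch/event shape check this PR's title describes, written against plain torch.distributions since the flowtorch test API isn't shown here: a transform with event_dim 0 should leave batch_shape and event_shape untouched, and samples should come out with shape sample_shape + batch_shape + event_shape.

```python
import torch
import torch.distributions as dist

# Base distribution with batch_shape (4, 2) and empty event_shape.
base = dist.Normal(torch.zeros(4, 2), torch.ones(4, 2))

# An elementwise (event_dim = 0) transform; nothing should be reinterpreted.
transform = dist.AffineTransform(loc=0.0, scale=2.0)
flow = dist.TransformedDistribution(base, [transform])

# Shapes are preserved: batch dims stay batch dims, event dims stay event dims.
assert flow.batch_shape == base.batch_shape
assert flow.event_shape == base.event_shape
assert flow.sample((7,)).shape == torch.Size([7, 4, 2])  # sample + batch + event
```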

Repository owner deleted a comment from stefanwebb Jan 30, 2021