Author: Oops, I think I PRed from the wrong branch (should be main). Will send another later if this PR is seriously considered.
Collaborator: @jack-wu05 PRing from your
Member: Thanks Jack, really excellent work. I have triggered the CI for now; feel free to ping me during or after the workshop next week.
Here is the tree reference I worked on with Trevor over the summer. I also implemented a dense GaussianReference.
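For context on the dense variant: a full-covariance Gaussian reference is typically fit from moments and sampled by pushing standard normals through a Cholesky factor. The PR itself is Julia; the sketch below is a hypothetical, stdlib-only Python illustration of that standard approach, not the PR's code.

```python
import math
import random

def cholesky(cov):
    # Lower-triangular L with L @ L.T == cov (cov must be positive definite).
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(cov[i][i] - s)
            else:
                L[i][j] = (cov[i][j] - s) / L[j][j]
    return L

def sample_dense_gaussian(mu, L, rng):
    # x = mu + L z with z ~ N(0, I) has mean mu and covariance L @ L.T.
    z = [rng.gauss(0.0, 1.0) for _ in mu]
    return [mu[i] + sum(L[i][k] * z[k] for k in range(i + 1))
            for i in range(len(mu))]

L = cholesky([[4.0, 2.0], [2.0, 3.0]])
x = sample_dense_gaussian([0.0, 0.0], L, random.Random(0))
```

Passing an explicit rng object mirrors the PR's practice of drawing from a replica-owned generator rather than a global one.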
A quick summary of what the tree does: we fit each variable with a Gaussian, then construct a complete graph over the variables in which each edge is weighted by the mutual information between the two variables it connects. Because direction is needed later, instead of one undirected edge between two variables I push two directed edges, one going each way; mutual information is symmetric, so the two weights are equal. I then run a maximum spanning tree algorithm and store the resulting edge connections. We use these edge connections, in the form of conditional distributions, later on for sampling and density evaluation.

The "which_variable" and "which_index" structures in the struct definition keep track of which component of which variable a given node represents, so that component can be updated appropriately. In hindsight, this may not be strictly necessary: the only variable is :singleton_variable and the nodes are labelled with consecutive integers in the order they are pushed, so you may be able to update directly. I'll leave this part as is for now.
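To make the construction concrete, here is a small Python sketch of the same idea: Gaussian pairwise mutual information as edge weights, then a Kruskal-style maximum spanning tree. The PR's actual implementation is Julia; all names below are hypothetical.

```python
import math

def gaussian_mutual_information(rho):
    # For a jointly Gaussian pair with correlation rho,
    # I(X; Y) = -1/2 * log(1 - rho^2); symmetric in the two variables.
    return -0.5 * math.log(1.0 - rho * rho)

def max_spanning_tree(n_nodes, weighted_edges):
    # Kruskal's algorithm on descending weights; a union-find
    # rejects any edge that would close a cycle.
    parent = list(range(n_nodes))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    tree = []
    for w, u, v in sorted(weighted_edges, reverse=True):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append((u, v, w))
    return tree

# Three variables; edge weights are the pairwise mutual informations.
edges = [
    (gaussian_mutual_information(0.9), 0, 1),
    (gaussian_mutual_information(0.5), 1, 2),
    (gaussian_mutual_information(0.2), 0, 2),
]
tree = max_spanning_tree(3, edges)
```

Since mutual information is symmetric, an undirected spanning tree suffices at this stage; the pair of directed edges only matters once the tree is oriented into parent/child conditionals for sampling and density evaluation.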
Please note a couple of things:
I had to integrate the covariance matrix into a new recorder, as this wasn't supported before; hence all the changes to the recorder code. Note that I replaced the GaussianReference recorder with my new _transformed_online_full recorder as well, so _transformed_online was removed entirely. I wasn't sure how to dispatch this otherwise.
I have not yet tested the gradient or kernel invariance (which would exercise the sample_iid! and density evaluation of my new reference); this still needs to be done. I have already tested the correctness of the max spanning tree algorithm.
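For whoever picks up that test: both iid sampling and density evaluation on a Gaussian tree are a single pass over the nodes in root-first order, using the bivariate-Gaussian conditional for each child given its parent. A hypothetical Python sketch of that standard scheme (the real sample_iid! and log-density code live in the Julia PR):

```python
import math
import random

def sample_tree_gaussian(order, params, rng):
    """Ancestral sampling over a tree of univariate Gaussians.

    order:  nodes in root-first topological order
    params: node -> (parent, mu, sigma, rho), parent=None at the root;
            rho is the correlation with the parent.
    """
    x = {}
    for node in order:
        parent, mu, sigma, rho = params[node]
        if parent is None:
            x[node] = rng.gauss(mu, sigma)
        else:
            _, pmu, psigma, _ = params[parent]
            # conditional of a bivariate Gaussian given the parent's value
            cond_mu = mu + rho * sigma / psigma * (x[parent] - pmu)
            cond_sigma = sigma * math.sqrt(1.0 - rho * rho)
            x[node] = rng.gauss(cond_mu, cond_sigma)
    return x

def tree_log_density(x, order, params):
    # log p(x) = log N(root) + sum over children of log N(child | parent)
    logp = 0.0
    for node in order:
        parent, mu, sigma, rho = params[node]
        if parent is None:
            m, s = mu, sigma
        else:
            _, pmu, psigma, _ = params[parent]
            m = mu + rho * sigma / psigma * (x[parent] - pmu)
            s = sigma * math.sqrt(1.0 - rho * rho)
        logp += -0.5 * math.log(2 * math.pi * s * s) \
                - (x[node] - m) ** 2 / (2 * s * s)
    return logp

rng = random.Random(1)
order = [0, 1]
params = {0: (None, 0.0, 1.0, 0.0), 1: (0, 0.0, 1.0, 0.8)}
draws = [sample_tree_gaussian(order, params, rng) for _ in range(20000)]
```

A cheap sanity check here is that the empirical correlation of the draws recovers the rho stored on the edge; an invariance test in the actual codebase would compare against the target chain instead.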
I made sure to generate randomly from replica.rng, which should preserve SPI (strong parallelism invariance).
Here are some promising results from a 601-dimensional HMM test case (which I also added to the codebase) that make this update (and all the effort for full integration) worth pursuing:
Of course, feel free to ask me any questions if any part is confusing. And if my implementation is too poor, please do take inspiration and make an improved version :)