Replies: 2 comments 2 replies
Thanks for the report. What sort of cover are you using?
1 reply
Thank you. This means that the output of the covering step is a 100,000 × 1,000 boolean array. This by itself does not seem like it should be a problem, memory-wise. It would be good to better understand at what point of the pipeline you get your system crash. One simple way would be to pass …
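Another way to narrow this down is to run the Mapper stages outside the pipeline, one at a time. The sketch below assumes the standard `gtda.mapper` components; the cover choice and the cover/DBSCAN parameters are placeholders, not the ones used in the original post.

```python
# Run the Mapper stages one at a time to see where the crash happens.
# Placeholder parameters; substitute the real (scaled) data and settings.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN
from gtda.mapper import CubicalCover

X = np.random.random((100_000, 20)).astype(np.float32)  # stand-in for the real data

# 1. Filter step: project the data with PCA.
lens = PCA(n_components=2).fit_transform(X)
print("filter output:", lens.shape)

# 2. Cover step: boolean membership mask, one column per cover set.
cover = CubicalCover(n_intervals=10, overlap_frac=0.3)
masks = cover.fit_transform(lens)
print("cover output:", masks.shape, masks.dtype)

# 3. Clustering step: run DBSCAN on each pullback set serially, so a crash
#    here points at clustering itself rather than at the parallel workers.
for j in range(masks.shape[1]):
    pts = X[masks[:, j]]
    if len(pts):
        DBSCAN(eps=0.5, min_samples=5).fit_predict(pts)
```

If all three stages go through serially, the problem is more likely in the parallel execution of the clustering step than in any single stage's memory footprint.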
1 reply
Hi,
I am having issues running the giotto-tda Mapper implementation on a dataset of ~100,000 points with ~20 features. I used PCA as the filter function and DBSCAN as the clusterer, and set n_jobs=-1. The dataset is scaled. I get a terminated-worker error. Does giotto-tda have constraints on dataset size? Thanks.
JN
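For reference, the setup described above corresponds roughly to the sketch below, assuming the `gtda.mapper` API. The cover type and all numeric parameters are guesses; the post does not specify them.

```python
# Rough reconstruction of the reported setup (cover and parameters are guesses).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN
from gtda.mapper import make_mapper_pipeline, CubicalCover

X = np.random.random((100_000, 20)).astype(np.float32)  # stand-in for the scaled dataset

pipe = make_mapper_pipeline(
    filter_func=PCA(n_components=2),                      # PCA as the filter function
    cover=CubicalCover(n_intervals=10, overlap_frac=0.3),
    clusterer=DBSCAN(),                                   # DBSCAN as the clusterer
    n_jobs=-1,                                            # parallelise over pullback sets
)
graph = pipe.fit_transform(X)  # Mapper graph (an igraph Graph)
```

Since the error mentions terminated workers, one thing worth checking (a guess, not something confirmed in the thread) is whether the crash disappears with n_jobs=1: with n_jobs=-1 each worker process needs its own memory, so peak memory use can be much higher than in the serial case.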