Hi Authors!!
Thank you so much for this contribution which has really helped the community by leaps and bounds!
I have a basic question.
Since each multivariate dataset has its own number of features/channels, what input size is used when experimenting with deep learning models, where the dataloader must be fed data of a fixed size?
Basically, if the data is of shape (b, w, n), where b is the batch size, w is the window size, and n is the number of channels, what is the value of n?
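To illustrate what I mean, here is a minimal sketch (the datasets, channel counts, and the `make_windows` helper are made up for the example) showing how the last dimension ends up differing across datasets:

```python
import numpy as np

def make_windows(series, window):
    """Slice a (time, channels) series into overlapping windows
    of shape (num_windows, window, channels)."""
    t, _ = series.shape
    return np.stack([series[i:i + window] for i in range(t - window + 1)])

# Two hypothetical multivariate datasets with different channel counts n
dataset_a = np.random.rand(100, 38)  # n = 38 channels
dataset_b = np.random.rand(100, 25)  # n = 25 channels

wa = make_windows(dataset_a, window=10)
wb = make_windows(dataset_b, window=10)
print(wa.shape)  # (91, 10, 38)
print(wb.shape)  # (91, 10, 25) -- the channel dimension differs per dataset
```

So when a single model is trained across such datasets, I am unsure what fixed value of n the dataloader should assume.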
Once again, thank you so much for this amazing library!