Solved – Wavelets and machine learning

machine learning, wavelet

I am trying to learn features from a signal using the wavelet transform and then apply ML techniques to classify the signal. The problem I am facing is that, at each level of decomposition, my signal has a vector of coefficients, and I have more than 10,000 signals.

Or let me rephrase: suppose I want to classify whether a house will sell or not. Each house has a lot of features that help me decide. In my feature matrix, each row represents a single house, and each column represents a random variable pertaining to size, lawn, floor plan, etc.

But in the case of wavelets, I have a signal which I have decomposed, and every level of decomposition has a vector of coefficients associated with it. There are many signals. In my feature matrix, every row represents a single signal, and each column within a row is a random variable that is supposed to be a scalar, but in this case it is a vector. Does anyone know how to tackle this problem? I can't average the coefficients or assume they follow any particular distribution.

Best Answer

I agree with @TenaliRaman. You can simply flatten the 2D matrix of wavelet coefficients (shape m x n, one row per decomposition level) into a single 1D vector of shape (1, m x n), so that each signal contributes one row to the feature matrix.
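For example, here is a minimal sketch of that flattening, assuming PyWavelets and NumPy are available (the wavelet `db4`, the level count, and the random signals are illustrative choices, not part of the original question):

```python
# A minimal sketch: turn a multi-level wavelet decomposition into one flat
# feature row per signal, then stack all rows into a feature matrix.
import numpy as np
import pywt

def wavelet_feature_row(signal, wavelet="db4", level=3):
    """Decompose one signal and concatenate all coefficient vectors into a 1D row."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)  # [cA_n, cD_n, ..., cD_1]
    return np.concatenate(coeffs)

# Hypothetical data: 10,000 signals of length 1024 each.
rng = np.random.default_rng(0)
signals = rng.standard_normal((10_000, 1024))

# Feature matrix: one flattened coefficient vector per signal (one row each).
X = np.vstack([wavelet_feature_row(s) for s in signals])
print(X.shape)  # (10000, total number of coefficients across all levels)
```

As long as every signal has the same length and the same decomposition settings, each flattened row has the same length, so the resulting matrix `X` can be fed directly to a standard classifier.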
