[scikit-learn] Complex variables in Gaussian mixture models?
Jacob Schreiber
jmschreiber91 at gmail.com
Mon Jan 9 15:43:23 EST 2017
I'm not too familiar with how complex values are traditionally treated, but
is it possible to split each complex value into its real and imaginary parts
and treat them as real-valued components, so that you simply have twice as
many features?
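Something along these lines is what I have in mind (a rough, untested sketch
with made-up toy data, just to illustrate the idea):

import numpy as np
from sklearn import mixture

# Toy complex-valued data, purely for illustration.
X = np.random.randn(500) + 1j * np.random.randn(500)

# Stack the real and imaginary parts side by side so that each complex
# sample becomes two real-valued features, shape (n_samples, 2).
X_real = np.column_stack([X.real, X.imag])

dpgmm = mixture.BayesianGaussianMixture(
    n_components=4, covariance_type='full', n_init=1
).fit(X_real)

# Reassemble complex-valued means from the two real-valued columns.
complex_means = dpgmm.means_[:, 0] + 1j * dpgmm.means_[:, 1]

With covariance_type='full' the model fits an unconstrained covariance over
the stacked real/imaginary features, which (if I understand correctly) is
actually more general than a circularly-symmetric complex Gaussian, since
the real and imaginary parts aren't forced to have equal variance.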
On Mon, Jan 9, 2017 at 11:34 AM, Rory Smith <smith_r at ligo.caltech.edu>
wrote:
> Hi All,
>
> I’d like to set up a GMM using mixture.BayesianGaussianMixture to model a
> probability density of complex random variables (the learned means and
> covariances should also be complex valued). I wasn’t able to see any
> mention of how to handle complex variables in the documentation so I’m
> curious if it’s possible in the current implementation.
> I tried the obvious thing of first generating a 1D array of complex
> random numbers, but I see this warning when I try to fit the array X
> using
>
> dpgmm = mixture.BayesianGaussianMixture(n_components=4,
>                                         covariance_type='full',
>                                         n_init=1).fit(X)
>
> ~/miniconda2/lib/python2.7/site-packages/sklearn/utils/validation.py:382:
> ComplexWarning: Casting complex values to real discards the imaginary part
> array = np.array(array, dtype=dtype, order=order, copy=copy)
>
>
> And as might be expected from the warning, the learned means are real.
>
> Any advice on this problem would be greatly appreciated!
>
> Best,
> Rory
>
> _______________________________________________
> scikit-learn mailing list
> scikit-learn at python.org
> https://mail.python.org/mailman/listinfo/scikit-learn
>
>