[scikit-learn] Novel efficient one-shot optimizer for regression

pyformulas pyformulas at gmail.com
Sat Jun 2 01:13:52 EDT 2018


 Hi, I created an algorithm that may solve linear regression problems with
lower time complexity than singular value decomposition (SVD). It only
requires the gradient and the diagonal of the Hessian to compute the
optimal weights in a single step. The TensorFlow code is attached below. I
haven't been able to get it to work in pure NumPy yet (a tentative sketch
is at the end), but I'm sure someone will be able to port it properly if
it really does what it purports to do.
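
To sketch why a single step can suffice here: the loss below is the
squared *mean* residual, L(W) = mean(Y - X @ W)**2, which is quadratic in
each weight. Writing m = mean(Y - X @ W) and a_j for the mean of column j
of X:

    dL/dW_j       = -2 * m * a_j
    d^2L/dW_j^2   =  2 * a_j^2
    step_j        = -(dL/dW_j) / (d^2L/dW_j^2) / num_features
                  =  m / (a_j * num_features)

Adding the steps shifts m by -sum_j a_j * step_j = -m, so this particular
loss drops to zero in one shot (provided no column of X has a zero mean).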

import numpy as np

Y = np.arange(10).reshape(10,1)**0.5

bias_X = np.ones(10).reshape(10,1)
X_feature1 = Y**3
X_feature2 = Y**4
X_feature3 = Y**5
X = np.concatenate((bias_X, X_feature1, X_feature2, X_feature3), axis=1)

num_features = 4


import tensorflow as tf

X_in = tf.placeholder(tf.float64, [None,num_features])
Y_in = tf.placeholder(tf.float64, [None,1])

W = tf.placeholder(tf.float64, [num_features,1])

W_squeezed = tf.squeeze(W)

Y_hat = tf.expand_dims(tf.tensordot(X_in, W_squeezed, ([1],[0])), axis=1)

# NB: this is the squared mean of the residuals, not the mean of their
# squares (MSE); the placeholder Y_in is used, not the NumPy array Y
loss = tf.reduce_mean(Y_in - Y_hat)**2

gradient = tf.gradients(loss, [W_squeezed])[0]

gradient_2nd = tf.diag_part(tf.hessians(loss, [W_squeezed])[0])

# Diagonal-Newton step toward each coordinate's vertex, scaled by
# 1/num_features so the per-weight corrections add up to cancel the whole
# mean residual at once
vertex_offset = -gradient/gradient_2nd/num_features

W_star = W_squeezed + vertex_offset
W_star = tf.expand_dims(W_star, axis=1)

with tf.Session() as sess:
    random_W = np.random.normal(size=(num_features,1)).astype(np.float64)
    result1 = sess.run([loss, W_star, gradient, gradient_2nd],
                       feed_dict={X_in: X, Y_in: Y, W: random_W})
    random_loss = result1[0]
    optimal_W = result1[1]
    print('Random loss:', random_loss)
    print('Gradient:', result1[2])
    print('2nd-order gradient (Hessian diagonal):', result1[3])

    print("W:")
    print(random_W)
    print()
    print("W*:")
    print(optimal_W)
    print()

    optimal_loss = sess.run(loss, feed_dict={X_in: X, Y_in: Y, W: optimal_W})
    print('Optimal loss:', optimal_loss)
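
A pure-NumPy port of the same update should look something like the sketch
below, assuming the loss stays the squared mean residual exactly as above
(the helper name one_shot_update is mine):

import numpy as np

def one_shot_update(X, Y, W):
    # Same loss as above: mean(Y - X @ W)**2, the squared mean residual
    m = np.mean(Y - X @ W)                # scalar mean residual
    a = X.mean(axis=0, keepdims=True).T   # column means, shape (num_features, 1)
    grad = -2.0 * m * a                   # dL/dW
    hess_diag = 2.0 * a**2                # Hessian diagonal (needs nonzero column means)
    return W - grad / hess_diag / X.shape[1]

W0 = np.random.normal(size=(num_features, 1))
W_opt = one_shot_update(X, Y, W0)
print('Loss after one step:', np.mean(Y - X @ W_opt)**2)  # ~0 up to float error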