Is there a direct calculation for gradient descent?

Discussion of Common Lisp
Post Reply
wvxvw
Posts: 127
Joined: Sat Mar 26, 2011 6:23 am

Is there a direct calculation for gradient descent?

Post by wvxvw » Mon Sep 10, 2012 2:30 am

This question isn't strictly in CL's domain; if the moderators find a better place for it, please move it.
So, the wiki article and many examples of gradient descent suggest that the way to find the local minimum is to approach it iteratively, in the hope that the process converges after *enough* tries. This feels iffy, since it means there will always be some error, and it looks like a rather inefficient way to solve the problem.

I am convinced that there is always either a single best function or multiple equally good functions for reaching the local minimum, so there must be a way to find them without any error (within the reasonable limits of how precise our computations can be).

I suspect there must be some direct calculation to find such a function, rather than repeatedly trying different hypotheses.

To provide more data: suppose you have a list of conses, say

Code: Select all

(defvar *data* '((x_0 . y_0) (x_1 . y_1) ... (x_i . y_i)))
The task is to find a function f such that

Code: Select all

(reduce #'+ (mapcar #'f *data*))
would give you the minimum possible value, given that f is of the form:

Code: Select all

(defun f (x) (expt (- (car x) (+ b (* (cdr x) k))) 2)) ; b and k are the free parameters to fit
In other words, you need to find the b and k to plug into f so that the sum of the function's results over the testing set is as small as possible.
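For this particular template the question has a direct answer: minimizing a sum of squared residuals of a line is ordinary linear least squares, and the optimal b and k have a closed form (the normal equations), no iteration needed. A minimal sketch, assuming, as in f above, that each cons holds the target value in its car and the input in its cdr (fit-line is a hypothetical helper name, not from any library):

Code: Select all

(defun fit-line (data)
  "Return (values k b) minimizing the sum of squared residuals
for the model car = b + k * cdr over a list of conses DATA."
  (let* ((n   (length data))
         (xs  (mapcar #'cdr data))          ; inputs
         (ys  (mapcar #'car data))          ; targets
         (mx  (/ (reduce #'+ xs) n))        ; mean of inputs
         (my  (/ (reduce #'+ ys) n))        ; mean of targets
         ;; centered cross-products and squared deviations
         (sxy (reduce #'+ (mapcar (lambda (x y) (* (- x mx) (- y my))) xs ys)))
         (sxx (reduce #'+ (mapcar (lambda (x) (expt (- x mx) 2)) xs))))
    (let ((k (/ sxy sxx)))                  ; slope
      (values k (- my (* k mx))))))         ; slope and intercept

For example, data lying exactly on y = 2x + 1, i.e. '((3 . 1) (5 . 2) (7 . 3)), gives k = 2 and b = 1, and for noisy data the same formulas give the least-squares fit that gradient descent would only approach iteratively.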

Goheeca
Posts: 271
Joined: Thu May 10, 2012 12:54 pm
Contact:

Re: Is there a direct calculation for gradient descent?

Post by Goheeca » Mon Sep 10, 2012 2:48 am

It's only possible if the input is described by an analytically solvable function; otherwise you must use numerical methods.
And mathematically it's the Laplacian of the input, where the Laplacian is div grad (http://en.wikipedia.org/wiki/Laplace_operator).
cl-2dsyntax is my attempt to create a Python-like reader. My mirror of CLHS (and the dark themed version). Temporary mirrors of the aforementioned: CLHS and a dark version.

Goheeca
Posts: 271
Joined: Thu May 10, 2012 12:54 pm
Contact:

Re: Is there a direct calculation for gradient descent?

Post by Goheeca » Mon Sep 10, 2012 2:55 am

And I see you want to generate a function based on a template; that's a somewhat different and more difficult problem.
Search for material on the minimization of a functional.

wvxvw
Posts: 127
Joined: Sat Mar 26, 2011 6:23 am

Re: Is there a direct calculation for gradient descent?

Post by wvxvw » Mon Sep 10, 2012 3:15 am

Thanks, yeah, minimization is something that rings a bell for me. But what do you mean by "analytically solvable"? This must be a lapse in my technical vocabulary; I don't know what it means when applied to a function. Can you explain, or link to an article?

Goheeca
Posts: 271
Joined: Thu May 10, 2012 12:54 pm
Contact:

Re: Is there a direct calculation for gradient descent?

Post by Goheeca » Mon Sep 10, 2012 3:22 am

If we have some kind of nonlinear system, then it's not analytically solvable.
For example: a transcendental equation, the Navier-Stokes equations, the EFE (Einstein field equations).

wvxvw
Posts: 127
Joined: Sat Mar 26, 2011 6:23 am

Re: Is there a direct calculation for gradient descent?

Post by wvxvw » Mon Sep 10, 2012 3:26 am

Aha, I see, thanks for the clarification!

Post Reply