themantalope/opencl_fun

A repo for a small (py)opencl project

I will be implementing stochastic gradient descent for logistic regression using OpenCL.
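As a rough sketch of the pyopencl workflow this will involve (the kernel, array size, and variable names below are illustrative assumptions, not this repo's actual code), evaluating the logistic hypothesis $\sigma(z) = 1/(1+e^{-z})$ on the device could look like this:

```python
# Illustrative sketch only: a tiny pyopencl program that applies the logistic
# (sigmoid) function element-wise on the device. Kernel and variable names
# are assumptions, not taken from this repository.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

kernel_src = """
__kernel void sigmoid(__global const float *z, __global float *out) {
    int gid = get_global_id(0);
    out[gid] = 1.0f / (1.0f + exp(-z[gid]));
}
"""
prg = cl.Program(ctx, kernel_src).build()

z = np.random.randn(1024).astype(np.float32)
mf = cl.mem_flags
z_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=z)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, z.nbytes)

prg.sigmoid(queue, z.shape, None, z_buf, out_buf)

out = np.empty_like(z)
cl.enqueue_copy(queue, out, out_buf)  # copy the result back to the host
```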


OLS Gradient descent algorithm

  • OLS = ordinary least squares
  • The objective of this algorithm is to estimate the model parameters by minimizing the sum of squared differences between the model's estimates and the training targets
  • Cost function:

$$ J(x,y,\beta) = \frac{1}{2} \sum_{i=1}^{N} (h(x_{i}, \beta) - y_{i})^{2} $$

where $h(\cdot)$ is the hypothesis function (a.k.a. the model).

  • Since the cost is a convex function of the parameters $\beta$, it is minimized when its gradient with respect to $\beta$ vanishes (a sketch of the resulting update follows below):

$$ \frac{\partial J}{\partial \beta} = \sum_{i=1}^{N} \left( h(x_{i}, \beta) - y_{i} \right) \frac{\partial h(x_{i}, \beta)}{\partial \beta} = 0 $$
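For a linear hypothesis $h(x_{i}, \beta) = x_{i} \cdot \beta$, the gradient above becomes $X^{T}(X\beta - y)$, and gradient descent repeatedly steps $\beta$ against it. Below is a minimal NumPy sketch of that batch update; the variable names, learning rate, and test data are illustrative, not taken from this repo's code.

```python
# Minimal sketch of batch gradient descent for OLS with a linear hypothesis
# h(x_i, beta) = x_i . beta. All names here are illustrative.
import numpy as np

def ols_gradient_descent(X, y, lr=0.002, n_iters=2000):
    """Minimize J(beta) = 1/2 * sum_i (x_i . beta - y_i)^2 by gradient descent."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iters):
        residuals = X @ beta - y   # h(x_i, beta) - y_i for every example
        grad = X.T @ residuals     # dJ/dbeta = sum_i (h(x_i, beta) - y_i) * x_i
        beta -= lr * grad
    return beta

# Quick check: recover known coefficients from noisy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_beta = np.array([1.5, -2.0, 0.5])
y = X @ true_beta + 0.01 * rng.normal(size=200)
print(ols_gradient_descent(X, y))  # approximately [1.5, -2.0, 0.5]
```

The stochastic variant this repo targets replaces the full sum over all $N$ examples with the gradient of a single example (or a small minibatch) per update step.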
