Oracles API

First-Order Oracles for superquantile optimization

class spqr.OracleSubgradient(loss, loss_grad, p)[source]

Base class that instantiates the superquantile oracle for a non-differentiable loss

For an input oracle \(L\) given through two functions loss and loss_grad, this class is an interface to compute the value and a subgradient of the function \(w \mapsto Cvar \circ L(w)\) over a specified dataset

Parameters
  • loss – function associated to the oracle

  • loss_grad – gradient associated to the oracle

  • p – probability level (by default 0.8)

cost_function(w, x, y)[source]

Computes the value of \(w \mapsto Cvar \circ L(w)\) for the dataset \((x,y)\)

f(w, x, y)[source]

Same as OracleSubgradient.cost_function

g(w, x, y)[source]

Computes a subgradient of \(w \mapsto Cvar \circ L(w)\) for the dataset \((x,y)\)
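
Example: a minimal usage sketch for this class. The per-example squared loss, its gradient, and their (w, x, y) signatures are illustrative assumptions, not prescribed by this documentation.

    import numpy as np
    import spqr

    # Hypothetical per-example squared loss: one value per data point (assumed signature).
    def loss(w, x, y):
        return 0.5 * (x.dot(w) - y) ** 2

    # Matching per-example gradients: one row per data point (assumed signature).
    def loss_grad(w, x, y):
        return (x.dot(w) - y)[:, None] * x

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 5))   # toy dataset
    y = rng.normal(size=100)
    w = np.zeros(5)                 # candidate parameters

    oracle = spqr.OracleSubgradient(loss, loss_grad, p=0.8)
    value = oracle.cost_function(w, x, y)   # value of Cvar composed with L at w
    subgrad = oracle.g(w, x, y)             # one subgradient of the composition at w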

class spqr.OracleSmoothGradient(loss, loss_grad, p, smoothing_parameter=1000.0)[source]

Base class that instantiates the superquantile oracle for a differentiable loss

For an input oracle \(L\) given through two functions loss and loss_grad, this class is an interface to compute the value and the gradient of the function \(w \mapsto Cvar \circ L(w)\) over a specified dataset.

Parameters
  • loss – function associated to the oracle

  • loss_grad – gradient associated to the oracle

  • p – probability level (by default 0.8)

  • smoothing_parameter – smoothing parameter used for Nesterov’s smoothing (by default 1000.0)

cost_function(w, x, y)[source]

Computes the value of \(w \mapsto Cvar \circ L(w)\) for the dataset \((x,y)\)

f(w, x, y)[source]

Computes the value of the smooth approximation of \(w \mapsto Cvar \circ L(w)\)

g(w, x, y)[source]

Computes the gradient of the smooth approximation of \(w \mapsto Cvar \circ L(w)\)
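
Example: because this oracle returns gradients of a smooth approximation, it can drive a plain first-order method. The sketch below runs a fixed-step gradient descent; the loss pair, its signatures, and the step size are illustrative assumptions.

    import numpy as np
    import spqr

    # Assumed per-example squared loss and gradients (hypothetical signatures).
    def loss(w, x, y):
        return 0.5 * (x.dot(w) - y) ** 2

    def loss_grad(w, x, y):
        return (x.dot(w) - y)[:, None] * x

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 5))
    y = rng.normal(size=100)
    w = np.zeros(5)

    oracle = spqr.OracleSmoothGradient(loss, loss_grad, p=0.8, smoothing_parameter=1000.0)

    step_size = 0.01                            # illustrative constant step
    for _ in range(200):
        w = w - step_size * oracle.g(w, x, y)   # gradient of the smooth approximation

    print(oracle.cost_function(w, x, y))        # non-smoothed superquantile value at w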

First-Order Oracles for hyperquantile optimization

class spqr.IntergratedOracleSubgradient(loss, loss_grad, p)[source]

Base class that instantiates the hyperquantile oracle for a non-differentiable loss

For an input oracle \(L\) given through two functions loss and loss_grad, this class is an interface to compute the value and a subgradient of the function \(w \mapsto \bar{Cvar} \circ L(w)\) over a specified dataset

Parameters
  • loss – function associated to the oracle

  • loss_grad – gradient associated to the oracle

  • p – probability level (by default 0.8)

cost_function(w, x, y)[source]

Computes the value of \(w \mapsto \bar{Cvar} \circ L(w)\) for the dataset \((x,y)\)

f(w, x, y)[source]

Same as IntergratedOracleSubgradient.cost_function

g(w, x, y)[source]

Computes a subgradient of \(w \mapsto \bar{Cvar} \circ L(w)\) for the dataset \((x,y)\)
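
Example: the interface mirrors OracleSubgradient, so the same loss pair can be reused to compare the superquantile and hyperquantile objectives. The loss functions and their signatures are illustrative assumptions.

    import numpy as np
    import spqr

    # Assumed per-example squared loss and gradients (hypothetical signatures).
    def loss(w, x, y):
        return 0.5 * (x.dot(w) - y) ** 2

    def loss_grad(w, x, y):
        return (x.dot(w) - y)[:, None] * x

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 5))
    y = rng.normal(size=100)
    w = np.zeros(5)

    superquantile = spqr.OracleSubgradient(loss, loss_grad, p=0.8)
    hyperquantile = spqr.IntergratedOracleSubgradient(loss, loss_grad, p=0.8)

    print(superquantile.f(w, x, y))     # Cvar of the losses at w
    print(hyperquantile.f(w, x, y))     # \bar{Cvar} counterpart at w
    subgrad = hyperquantile.g(w, x, y)  # subgradient of the hyperquantile objective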

class spqr.IntegratedOracleSmoothGradient(loss, loss_grad, p, smoothing_parameter=1000.0)[source]

Base class that instantiates the hyperquantile oracle for a differentiable loss

For an input oracle \(L\) given through two functions loss and loss_grad, this class is an interface to compute the value and the gradient of the function \(w \mapsto \bar{Cvar} \circ L(w)\) over a specified dataset

Parameters
  • loss – function associated to the oracle

  • loss_grad – gradient associated to the oracle

  • p – probability level (by default 0.8)

  • smoothing_parameter – smoothing parameter used for Nesterov’s smoothing (by default 1000.0)

cost_function(w, x, y)[source]

Computes the value of \(w \mapsto \bar{Cvar} \circ L(w)\) for the dataset \((x,y)\)

f(w, x, y)[source]

Computes the value of the smooth approximation of \(w \mapsto \bar{Cvar} \circ L(w)\)

g(w, x, y)[source]

Computes the gradient of the smooth approximation of \(w \mapsto \bar{Cvar} \circ L(w)\)
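
Example: since this oracle exposes both a value (f) and a gradient (g) of the smooth approximation, it can be handed to a generic smooth solver. The sketch below uses scipy.optimize.minimize with L-BFGS-B; the loss pair, the assumption that f returns a scalar and g a vector, and the solver settings are illustrative.

    import numpy as np
    import spqr
    from scipy.optimize import minimize

    # Assumed per-example squared loss and gradients (hypothetical signatures).
    def loss(w, x, y):
        return 0.5 * (x.dot(w) - y) ** 2

    def loss_grad(w, x, y):
        return (x.dot(w) - y)[:, None] * x

    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 5))
    y = rng.normal(size=100)
    w0 = np.zeros(5)

    oracle = spqr.IntegratedOracleSmoothGradient(loss, loss_grad, p=0.8,
                                                 smoothing_parameter=1000.0)

    # f gives the smoothed objective value, g its gradient; both take (w, x, y).
    res = minimize(oracle.f, w0, args=(x, y), jac=oracle.g, method="L-BFGS-B")
    print(res.x, res.fun)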