Optimizing Optimization: The Next Generation of Optimization Applications and Theory (Quantitative Finance)




then a number of decreasing quantiles of these changes. The procedure is summarized in Algorithm 3.


Algorithm 3 Computing the threshold sequence — Variant 1.

1: initialize nRounds (# of thresholds), nDeltas (# of random solutions)
2: for i = 1 to nDeltas do
3:   randomly generate current solution x_c ∈ X
4:   generate x_n ∈ N(x_c)
5:   compute Δ_i = |Φ(x_n) − Φ(x_c)|
6: end for
7: sort Δ_1 ≤ Δ_2 ≤ … ≤ Δ_nDeltas
8: set τ = Δ_nRounds, …, Δ_1
The number of thresholds nRounds with this approach is usually large, hence the number of steps nSteps per threshold (in the inner loop of Algorithm 2) can be low; often it is only one step per threshold.
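As a concrete illustration, the following is a minimal Python sketch of Variant 1. The objective function Φ, the neighborhood definition, and all parameter values are toy assumptions for the example, not taken from the text; in practice Φ and N(·) would be those of the actual optimization problem.

```python
import random

def phi(x):
    # toy objective function (assumption): sum of squared components
    return sum(v * v for v in x)

def neighbour(x, step=0.1):
    # toy neighborhood (assumption): perturb one randomly chosen component
    y = list(x)
    i = random.randrange(len(y))
    y[i] += random.uniform(-step, step)
    return y

def thresholds_variant1(n_rounds, n_deltas, dim=5):
    """Variant 1: deltas from pairs of independent random solutions."""
    deltas = []
    for _ in range(n_deltas):
        xc = [random.uniform(-1, 1) for _ in range(dim)]  # random x_c in X
        xn = neighbour(xc)                                # x_n in N(x_c)
        deltas.append(abs(phi(xn) - phi(xc)))             # Δ_i
    deltas.sort()                  # Δ_1 <= Δ_2 <= ... <= Δ_nDeltas
    # τ = Δ_nRounds, ..., Δ_1: a decreasing sequence ending near zero
    return deltas[:n_rounds][::-1]

random.seed(42)
tau = thresholds_variant1(n_rounds=100, n_deltas=1000)
assert all(a >= b for a, b in zip(tau, tau[1:]))  # thresholds decrease
```

Since nRounds is usually large here, one would typically pair this sequence with a single step per threshold in the main Threshold Accepting loop.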
A variation of this method, described in Gilli, Këllezi, and Hysi (2006) and
stated in Algorithm 4, is to take a random walk through the data where the
steps are made according to the neighborhood definition. At every iteration,
the changes in the objective function value are recorded. The thresholds are
then a number of decreasing quantiles of these changes.


Algorithm 4 Computing the threshold sequence — Variant 2.

1: initialize nRounds (# of thresholds), nDeltas (# of random steps)
2: randomly generate current solution x_c ∈ X
3: for i = 1 : nDeltas do
4:   generate x_n ∈ N(x_c) and compute Δ_i = |Φ(x_n) − Φ(x_c)|
5:   x_c = x_n
6: end for
7: compute empirical distribution CDF of Δ_i, i = 1, …, nDeltas
8: compute threshold sequence τ_r = CDF⁻¹((nRounds − r)/nRounds), r = 1, …, nRounds
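A minimal Python sketch of this random-walk variant follows. As before, the objective Φ, the neighborhood function, and the parameter values are toy assumptions; the inverse empirical CDF is evaluated by simple index lookup into the sorted deltas.

```python
import random

def phi(x):
    # toy objective function (assumption): sum of squared components
    return sum(v * v for v in x)

def neighbour(x, step=0.1):
    # toy neighborhood (assumption): perturb one randomly chosen component
    y = list(x)
    i = random.randrange(len(y))
    y[i] += random.uniform(-step, step)
    return y

def thresholds_variant2(n_rounds, n_deltas, dim=5):
    """Variant 2: deltas recorded along a random walk through X."""
    xc = [random.uniform(-1, 1) for _ in range(dim)]
    deltas = []
    for _ in range(n_deltas):
        xn = neighbour(xc)
        deltas.append(abs(phi(xn) - phi(xc)))  # record the change
        xc = xn                                # walk on: x_c <- x_n
    deltas.sort()                              # empirical CDF of the Δ_i
    thresholds = []
    for r in range(1, n_rounds + 1):
        # τ_r = CDF^{-1}((nRounds - r) / nRounds)
        q = (n_rounds - r) / n_rounds
        idx = min(int(q * n_deltas), n_deltas - 1)
        thresholds.append(deltas[idx])
    return thresholds
```

Because the quantile level falls from (nRounds − 1)/nRounds to 0 as r runs from 1 to nRounds, the sequence is non-increasing and the final threshold is the smallest recorded change, so the algorithm ends close to a pure local search.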
