Neural network tutorial
It is possible to define a neural network as a surrogate using Flux. This is useful because we can then call optimization methods on it.
First of all, we will define the Schaffer function we are going to build a surrogate for.
using Plots
using Surrogates
using Flux
function schaffer(x)
    x1 = x[1]
    x2 = x[2]
    fact1 = x1^2
    fact2 = x2^2
    return fact1 + fact2
end
schaffer (generic function with 1 method)
Sampling
Let's define our bounds; this time we are working in two dimensions. In particular, we want our first dimension x1 to have bounds 0 and 8, and the same bounds, 0 and 8, for the second dimension. We take 60 samples of the space using Sobol sequences and then evaluate our function at all of the sampling points.
n_samples = 60
lower_bound = [0.0, 0.0]
upper_bound = [8.0, 8.0]
xys = sample(n_samples, lower_bound, upper_bound, SobolSample())
zs = schaffer.(xys);
60-element Vector{Float64}:
4.65625
56.65625
40.65625
22.65625
12.65625
80.65625
38.65625
52.65625
3.90625
55.90625
⋮
56.7578125
31.0078125
126.0078125
39.0078125
39.0078125
8.0078125
71.0078125
55.5078125
31.5078125
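Since Plots is loaded above, we can optionally visualize the sampled points, colored by their function value. This is just a minimal sketch; the chosen plot attributes are one option among many:
scatter(first.(xys), last.(xys), marker_z = zs, xlabel = "x1", ylabel = "x2", label = "Sobol samples")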
Building a surrogate
You can specify your own model, optimization algorithm, loss function, and number of epochs. As always, getting the model right is the hardest part; a sketch with a custom loss and optimizer is shown after the output below.
model1 = Chain(
    Dense(2, 5, σ),
    Dense(5, 2, σ),
    Dense(2, 1)
)
neural = NeuralSurrogate(xys, zs, lower_bound, upper_bound, model = model1, n_echos = 10)
(::NeuralSurrogate{Vector{Tuple{Float64, Float64}}, Vector{Float64}, Flux.Chain{Tuple{Flux.Dense{typeof(NNlib.σ), Matrix{Float32}, Vector{Float32}}, Flux.Dense{typeof(NNlib.σ), Matrix{Float32}, Vector{Float32}}, Flux.Dense{typeof(identity), Matrix{Float32}, Vector{Float32}}}}, Surrogates.var"#232#236"{Flux.Chain{Tuple{Flux.Dense{typeof(NNlib.σ), Matrix{Float32}, Vector{Float32}}, Flux.Dense{typeof(NNlib.σ), Matrix{Float32}, Vector{Float32}}, Flux.Dense{typeof(identity), Matrix{Float32}, Vector{Float32}}}}}, Flux.Optimise.Descent, Zygote.Params, Int64, Vector{Float64}, Vector{Float64}}) (generic function with 1 method)
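As noted above, the loss function, optimizer, and number of epochs can be customized as well. Below is a hedged sketch: the model and n_echos keywords appear in the call above, while loss and opt are assumed to be the keyword names accepted by the installed Surrogates/SurrogatesFlux version (they mirror the loss closure and Descent optimizer visible in the output).
# Hypothetical custom configuration; keyword names `loss` and `opt` are assumptions.
my_loss(x, y) = Flux.mse(model1(x), y)   # mean-squared-error on the model output
my_opt = Descent(0.01)                   # plain gradient descent
neural2 = NeuralSurrogate(xys, zs, lower_bound, upper_bound,
                          model = model1, loss = my_loss, opt = my_opt, n_echos = 10)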
Optimization
We can now call an optimization method on the neural network surrogate:
surrogate_optimize(schaffer, SRBF(), lower_bound, upper_bound, neural, SobolSample(), maxiters=20, num_new_samples=10)
((0.9375, 0.9375), 1.7578125)
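The returned tuple is the best point found and its objective value: indeed, schaffer((0.9375, 0.9375)) = 0.9375^2 + 0.9375^2 = 1.7578125, which is approaching the true minimum of 0 at (0.0, 0.0). The surrogate is itself callable, so we can also check its own prediction at that point:
neural((0.9375, 0.9375))   # surrogate prediction at the best point found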