Optimization

`Optim.Optim` — Module

The Optimization module provides tools and abstractions for defining, configuring, and executing optimization routines within the Planar.jl framework. It is designed to support a variety of optimization strategies, including parameter tuning, strategy selection, and performance evaluation for trading systems and related applications. The module integrates seamlessly with other Planar.jl components, ensuring type safety, extensibility, and efficient execution.
Main features:
- Flexible optimization workflows for trading strategies and system parameters
- Integration with Planar.jl's data, strategy, and execution layers
- Support for precompilation and dynamic loading
- Extensible design for custom optimization algorithms
Comparison of Search Methods
| Function | Data Segmentation | Parameter Selection | Main Use Case |
|---|---|---|---|
| progsearch | Segments by offset | Filters after each round | Robustness across data segments |
| broadsearch | Slices by fixed size | Filters after each slice | Adapting to changing regimes over time |
| slidetest | Slides by timeframe | No parameter search | Granular, rolling/walk-forward backtesting |
- `progsearch`: Progressive grid search with filtering and offsetting for robustness.
- `broadsearch`: Sequential grid search over contiguous slices, filtering at each step.
- `slidetest`: Sliding window backtest, moving by the smallest timeframe increment.
User-facing Optimization/Search Functions
- `gridsearch(s::Strategy; ...)`: Grid search over parameter combinations for a strategy.
- `progsearch(s::Strategy; ...)`: Progressive search, running multiple grid searches with filtering and resampling.
- `slidetest(s::Strategy; ...)`: Slides a window over the backtesting period, running optimizations at each step.
- `broadsearch(s::Strategy; ...)`: Performs a broad search by slicing the context and optimizing in each slice.
- `optimize(s::Strategy; ...)`: Black-box optimization using the Optimization.jl framework (supports global optimization algorithms).
- `boptimize!(s::Strategy; ...)`: Bayesian optimization using Gaussian Processes (requires BayesExt and BayesianOptimization.jl).
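A typical workflow chains these entry points: run a search, aggregate the results, and re-test the surviving parameters. The following is a minimal sketch, assuming a loaded `SimStrategy` bound to `s` and that the Optim exports are in scope; the keyword values are illustrative, and it is an assumption that `gridsearch` returns the populated `OptSession` and that `slidetest` accepts the named tuple produced by `get_params`.

```julia
# Grid search over the parameter ranges declared by the strategy's OptSetup hook.
sess = gridsearch(s; splits=3, random_search=true)

# Aggregate the repeated evaluations and pick the best row's parameters.
df = agg(sess; sort_by=:pnl_avg)
best = get_params(df, 1)

# Re-test the chosen combination with a rolling/walk-forward window.
slidetest(s; params=best)
```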
`Optim.DEFAULT_OBJ` — Constant

A constant representing the default objective value.

`Optim.RUNNING` — Constant

A constant instance of `OptRunning` initialized with `false`.

`Optim.disabled_methods` — Constant

A set of optimization methods that are disabled and not used with the BlackBoxOptim package.
`Optim.BestColumn` — Type

A column in the progress bar representing the best optimization result.

Fields: `job`, `segments`, `measure`, `best`.

This struct represents a column in the progress bar that displays the best result of the optimization job. It contains a `ProgressJob`, a vector of `Segment` objects, a `Measure` object, and a reference to the best result. The constructor creates a `Segment` with a string representation of the best result and sets the width of the measure to 15.
`Optim.ContextSpace` — Type

A named tuple representing the context and space in the optimization process.
`Optim.ETAColumn` — Type

A column in the progress bar representing the estimated time remaining.

Fields: `job`, `segments`, `measure`, `start_time`, `last_update`, `completed`, `total`.

This struct represents a column in the progress bar that displays the estimated time remaining for the optimization job. It contains a `ProgressJob`, a vector of `Segment` objects, a `Measure` object, and references to track progress timing. The constructor creates a `Segment` with a string representation of the ETA and sets the width of the measure to 15.
`Optim.OptRunning` — Type

A mutable structure representing the running state of an optimization process.

Fields: `value`.

This structure contains a single field, `value`, which is an atomic boolean. It is used to indicate whether the optimization process is currently running or not.
`Optim.OptSession` — Type

A structure representing an optimization session.

Fields: `s`, `ctx`, `params`, `attrs`, `results`, `best`, `lock`, `s_clones`, `ctx_clones`.

This structure stores all the parameter combinations evaluated during an optimization session. It contains fields for the strategy, context, parameters, attributes, results, best result, lock, and per-thread clones of the strategy and context. The `OptSession` constructor also takes an offset and a number of threads as optional parameters, with default values of 0 and the number of available threads, respectively.
`Optim.ParamsColumn` — Type

A column in the progress bar representing parameters.

Fields: `job`, `segments`, `measure`, `params`.

This struct represents a column in the progress bar that displays the parameters of the optimization job. It contains a `ProgressJob`, a vector of `Segment` objects, a `Measure` object, and a reference to the parameters. The constructor creates a `Segment` with a string representation of the parameters and sets the width of the measure to 15.
`Misc.call!` — Method

Applies parameters to strategy before backtest.

`call!(_::Strategies.Strategy, params, _::Executors.OptRun)`

`Misc.call!` — Method

Indicates if the optimization is a minimization problem.

`call!(_::Strategies.Strategy, _::Executors.OptMinimize) -> Bool`

`Misc.call!` — Method

Returns `Optim.ContextSpace` for backtesting.

`call!(_::Strategies.Strategy, _::Executors.OptSetup)`

The `ctx` field (`Executors.Context`) specifies the backtest time period, while `bounds` is a tuple of (lower, upper) bounds for the optimization parameters.
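For orientation, here is a hedged sketch of how a strategy might implement these two hooks. The `Context` constructor, the `Sim()` execution mode, the `dt"..."` date macro, the `s.timeframe` and `s.attrs` fields, and the parameter names `:window`/`:threshold` are all assumptions for illustration, not an API prescription.

```julia
# Sketch only: constructors, macros, and field names below are assumptions.
function call!(s::Strategy, ::OptSetup)
    (;
        # backtest time period (an Executors.Context); construction shown is illustrative
        ctx=Context(Sim(), s.timeframe, dt"2023-01-01", dt"2023-06-01"),
        # (lower, upper) bounds for the optimization parameters
        bounds=([5.0, 0.01], [50.0, 0.10]),
    )
end

function call!(s::Strategy, params, ::OptRun)
    # apply the candidate parameters to the strategy before the backtest runs
    s.attrs[:window] = round(Int, params[1])
    s.attrs[:threshold] = params[2]
end
```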
`Optim._get_color_and_update_best` — Method

Multi-threaded optimization function.

`_get_color_and_update_best(sess, obj, pnl) -> Tuple{String, String}`

The function takes four arguments: `splits`, `backtest_func`, `median_func`, and `obj_type`. `splits` is the number of splits for the optimization process, `backtest_func` is the backtest function, `median_func` is the function to calculate the median, and `obj_type` is the type of the objective. The function returns a function that performs a multi-threaded optimization for a given set of parameters.
`Optim._single_opt_func` — Method

Single-threaded optimization function.

`_single_opt_func(sess, splits, backtest_func, median_func, args...) -> Optim.var"#single_backtest_func#57"`

The function takes the optimization session `sess`, the number of splits `splits`, the backtest function `backtest_func`, and the median function `median_func`. It returns a function that performs a single-threaded optimization for a given set of parameters.
`Optim._spacedims` — Method

Returns the dimension of the search space.

`_spacedims(params) -> Any`

This function takes the parameters as input, which should include lower and upper bounds arrays as the second and third elements. It asserts that the lengths of these arrays are equal and returns their common length, which represents the dimension of the search space.
`Optim._tostring` — Method

Converts the provided parameters into a string representation.

`_tostring(prefix, params) -> String`

The function takes a prefix and a set of parameters as input. It joins the prefix and the parameters into a single string, with each parameter converted to a compact number representation. The resulting string is then truncated to fit the display size.
`Optim.agg` — Method

`agg(df::DataFrame; reduce_func=mean, agg_func=median)`

Aggregates the DataFrame `df` by grouping on all columns except `:obj`, `:cash`, `:pnl`, and `:trades`. Applies `reduce_func` to each group, then `agg_func` to the reduced results.
`Optim.agg` — Method

Groups session results by repeat and aggregates metrics columns.

`agg(sess::OptSession; sort_by, filter_zero_trades) -> DataFrames.DataFrame`

- `sess`: The optimization session containing results
- `sort_by`: Column to sort by (default: `:pnl_avg`)
- `filter_zero_trades`: Filter out rows with 0 trades (default: `true`)
Returns a DataFrame with one row per unique parameter combination, containing:
- Parameter columns (from first row of each group)
- Aggregated metrics: average, median, min, max for obj, cash, pnl, trades
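As a usage sketch (the `sess` variable is assumed to come from a prior search; the `_avg` column suffix follows the aggregation described above):

```julia
df = agg(sess; sort_by=:pnl_avg, filter_zero_trades=true)
first(df, 5)  # top five parameter combinations by average pnl
```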
`Optim.apply_precision` — Method

Applies precision constraints to optimization parameters.

`apply_precision(u, s::Strategies.Strategy) -> Any`

This function rounds parameters according to the precision specification stored in the strategy's attributes. If no precision is specified, the parameters are returned unchanged.
`Optim.bbo_fitness_scheme` — Method

Determines the fitness scheme for a given strategy and number of objectives.

`bbo_fitness_scheme(s::Strategies.Strategy, n_obj) -> BlackBoxOptim.ParetoFitnessScheme`

This function takes a strategy and a number of objectives as input. If the strategy defines a custom weights function in its attributes, that function is used as the aggregator in the `ParetoFitnessScheme`; otherwise a default `ParetoFitnessScheme` is returned.
`Optim.bbomethods` — Function

Returns a set of optimization methods supported by BlackBoxOptim.

`bbomethods() -> Set{Symbol}`
`bbomethods(multi) -> Set`

This function filters the methods based on the `multi` parameter and excludes the methods listed in `disabled_methods`. If `multi` is `true`, it returns multi-objective methods; otherwise it returns single-objective methods.
`Optim.broadsearch` — Method

Performs a broad search optimization that progressively moves through the context range.

`broadsearch(s::Strategies.Strategy; slice_size, sort_by, kwargs...)`

- `slice_size`: Size of each slice in terms of strategy timeframe periods. If a float between 0 and 1, it is interpreted as a fraction of the total steps (default: 0.2, i.e. 1/5 of the total steps)
- `sort_by`: Column to sort results by (`:pnl` or `:obj`, default: `:pnl`)
The search starts with the first slice of the context and at each iteration:
- Moves to the next contiguous slice
- Filters parameters based on filter_func
- Continues until reaching the end of the context
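A hedged usage sketch (keyword values are illustrative):

```julia
# Slice the context into fifths and rank each slice's results by pnl.
broadsearch(s; slice_size=0.2, sort_by=:pnl)
```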
`Optim.ctxfromstrat` — Method

Extracts the context, parameters, and bounds from a given strategy.

`ctxfromstrat(s)`

This function takes a strategy as input and returns the context, parameters, and bounds associated with that strategy. The bounds can be specified as:
- A tuple of (lower, upper) bounds
- A function that returns bounds
- A NamedTuple with `:bounds` and optional `:precision` and `:categorical` fields
`Optim.ctxsteps` — Method

Calculates the small and big steps for the optimization context.

`ctxsteps(ctx, splits, wp) -> NamedTuple{(:small_step, :big_step), <:Tuple{Any, Any}}`

The function takes the optimization context `ctx`, the number of splits `splits`, and `wp`. It returns a named tuple with `small_step` and `big_step`, which represent the step sizes for the optimization process.
`Optim.define_backtest_func` — Method

Defines the backtest function for an optimization session.

`define_backtest_func(sess, small_step, big_step; verbose) -> Optim.var"#opt_backtest_func#34"{Bool}`

The function takes the optimization session `sess`, the small step size `small_step`, and the big step size `big_step`. It returns a function that performs a backtest for a given set of parameters and a given iteration number.
`Optim.define_median_func` — Method

Defines the median function for multi-objective mode.

`define_median_func(splits) -> Union{Optim.var"#median_tuple#58", typeof(Statistics.median)}`

If the optimization is multi-objective, the returned function calculates the median over all the repeated iterations; otherwise it calculates the median of a given array.
`Optim.define_opt_func` — Method

Defines the optimization function for a given strategy.

`define_opt_func(s::Strategies.Strategy; backtest_func, split_test, splits, n_jobs, obj_type, isthreaded, sess)`

The function takes the strategy `s`, the backtest function `backtest_func`, the number of splits `splits`, the objective type `obj_type`, and `isthreaded`, which indicates whether the optimization is threaded. It returns the appropriate optimization function based on these parameters.
`Optim.delete_sessions!` — Method

Clears optimization sessions of a strategy.

`delete_sessions!(s_name::String; keep_by, zi)`

The function accepts a strategy name `s_name` and an optional `keep_by` dictionary. If `keep_by` is provided, sessions matching these attributes (`ctx`, `params`, or `attrs`) are not deleted. It checks each session and deletes it if it doesn't match `keep_by` or if `keep_by` is empty.
`Optim.extbayes!` — Method

Loads the BayesianOptimization extension.

The function checks if the BayesianOptimization package is installed in the current environment. If not, it prompts the user to add it to the main environment.
`Optim.filter_results` — Method

Filters the optimization results based on certain criteria.

`filter_results(::Strategies.Strategy, sess; cut, min_results) -> Any`

The function takes a strategy and a session as input, along with optional parameters for the cut and the minimum number of results. It filters the results based on the cut value and the minimum number of results.
`Optim.filtervecs` — Method

Filters a vector of vectors across dimension 2.

`filtervecs(vov::Array{Array{T, 1}, 1}; ...) -> Vector{Vector{Float64}}`
`filtervecs(vov::Array{Array{T, 1}, 1}, filter_func::Function; default_val) -> Vector{Vector{Float64}}`

This function takes a vector of vectors `vov` and a filter function `filter_func`. It iterates across dimension 2 (columns) and constructs a new vector of vectors where each element is a filtered list of the corresponding elements from the input vector of vectors.

Examples

```julia
vov = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
filter_func = x -> x > 2
result = filtervecs(vov, filter_func)
# Result: [[3], [4, 5, 6], [7, 8, 9]]
```
`Optim.get_params` — Function

Extracts parameter values from a specific row as a named tuple.

`get_params(df::DataFrames.DataFrame) -> NamedTuple`
`get_params(df::DataFrames.DataFrame, row_idx::Int64) -> NamedTuple`

- `df`: DataFrame containing aggregated results (from the `agg` function)
- `row_idx`: Row index to extract parameters from (default: 1 for the best result)

Returns a named tuple with parameter names as keys and their values.
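A short usage sketch, assuming `sess` holds a completed optimization session:

```julia
df = agg(sess)                 # aggregated results; row 1 is the best result
best = get_params(df)          # parameters of row 1
runner_up = get_params(df, 2)  # parameters of row 2
```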
`Optim.gridfromparams` — Method

Generates a grid from the provided parameters.

`gridfromparams(params) -> Any`

The function takes a set of parameters as input. It generates a grid by taking the product of the parameters and reshaping it to the length of the parameters.
`Optim.gridfromresults` — Method

Generates a grid from the optimization results.

`gridfromresults(sess::OptSession, results; kwargs...) -> Any`

The function takes an optimization session and results as input. It generates a grid by extracting the parameters from each row of the results.
`Optim.gridpbar!` — Method

Initializes a progress bar for grid optimization.

`gridpbar!(sess, first_params) -> Tuple{Base.RefValue, Tuple{Base.RefValue{Dates.DateTime}, Base.RefValue{Int64}, Base.RefValue{Int64}}}`

This function sets up a progress bar for the grid optimization process. It creates a `ParamsColumn`, a `BestColumn`, and an `ETAColumn` and adds them to the default columns. The function returns a tuple of `(currentparams, etarefs)`, where `etarefs` contains the ETA column references.
`Optim.gridsearch` — Method

Backtests the strategy across combinations of parameters.

`gridsearch(s::Strategies.SimStrategy; seed, splits, n_jobs, save_freq, resume, logging, random_search, zi, grid_itr, offset, ctx)`

- `seed`: random seed set before each backtest run.
- `splits`: the number of segments into which the context is split.
- `save_freq`: how frequently (`Period`) to save results; when `nothing` (default) saving is skipped.
- `logging`: enable logging.
- `random_search`: shuffle parameter combinations before iterating.

One parameter combination runs `splits` times, where each run uses a period that is a segment of the full period of the given `Context`. (The `Context` comes from the strategy's `call!(s, ::OptSetup)`.)
`Optim.isrunning` — Method

Checks if the optimization process is currently running.

`isrunning() -> Bool`

This function returns the `value` field of the `RUNNING` instance, indicating whether the optimization process is currently running.
`Optim.isthreadsafe` — Method

Tests if the strategy is thread safe by looking up the `THREADSAFE` global.
`Optim.load_session` — Function

Loads an optimization session from storage.

`load_session(name; ...) -> Any`
`load_session(name, startstop; ...) -> Any`
`load_session(name, startstop, params_k; ...) -> Any`
`load_session(name, startstop, params_k, code; as_z, results_only, s, zi) -> Any`

This function loads an optimization session from the provided zarr instance `zi` based on the given parameters. The parameters include the strategy name, the start and stop date of the backtesting context, the first letter of every parameter, and a hash of the parameters and attributes truncated to 4 characters. The function returns the loaded session, either as a zarr array if `as_z` is `true`, or as an `OptSession` object otherwise. If `results_only` is `true`, only the results DataFrame of the session is returned.
`Optim.log_path` — Function

Generates the path for the log file of a given strategy.

`log_path(s) -> Tuple{Any, Any}`
`log_path(s, name) -> Tuple{Any, Any}`

The function takes a strategy `s` and an optional `name` (defaulting to the current timestamp). It constructs a directory path based on the strategy's path, and ensures this directory exists. Then, it returns the full path to the log file within this directory, along with the directory path itself.
`Optim.logs` — Method

Returns the paths to all log files for a given strategy.

`logs(s) -> Any`

The function takes a strategy `s` as an argument. It retrieves the directory path for the strategy's log files and returns the full paths to all log files within this directory.
`Optim.logs_clear` — Method

Clears all log files for a given strategy.

`logs_clear(s)`

The function takes a strategy `s` as an argument. It retrieves the directory path for the strategy's log files and removes all files within this directory.
`Optim.lowerupper` — Method

Extracts the lower and upper bounds from a parameters dictionary.

`lowerupper(params) -> Tuple{Vector{Float64}, Vector{Float64}}`

The function takes a parameters dictionary `params` as an argument. It returns two arrays, `lower` and `upper`, containing the first and last values of each parameter range in the dictionary, respectively.
`Optim.metrics_func` — Method

Calculates the metrics for a given strategy.

`metrics_func(s; initial_cash)`

The function takes a strategy `s` and an initial cash amount as arguments. It calculates the objective score, the current total cash, the profit and loss ratio, and the number of trades. The function returns these metrics as a named tuple.
`Optim.objectives` — Method

Returns the number of objectives and their type.

`objectives(s)`

The function takes a strategy `s` as an argument. It returns a tuple containing the type of the objective and the number of objectives.
`Optim.optimize` — Method

Optimize parameters using the Optimization.jl framework.

`optimize(s::Strategies.SimStrategy; seed, splits, resume, save_freq, zi, maxiters, maxtime, opt_method, opt_method_kwargs, solve_method, solve_method_kwargs, split_test, multistart, n_jobs, early_threshold, max_failures, kwargs...)`

- `splits`: how many times to run the backtest for each step
- `seed`: random seed
- `method`: optimization method (defaults to `BBO_adaptive_de_rand_1_bin()`)
- `maxiters`: maximum number of iterations
- `maxtime`: maximum time budget for the optimization
- `kwargs`: the arguments to pass to the underlying Optimization.jl solve function
- `parallel`: if true, enables parallel evaluation of multiple parameter combinations (default: false)
- `early_threshold`: if specified, terminates evaluation early if the objective is below this threshold (default: -Inf)
- `max_failures`: maximum number of consecutive failures before stopping (default: Inf)

From within your strategy, define the following `call!` functions:

- `call!(::Strategy, ::OptSetup)`: for the period of time to evaluate and the bounds for the optimization.
- `call!(::Strategy, params, ::OptRun)`: called before running the backtest; should apply the parameters to the strategy.

For compatibility between optimization methods and solvers, read the Optimization.jl documentation carefully. Solvers that require auto differentiation might not work with your strategy.
Examples

```julia
# Optimize all parameters
optimize(s)
# Exclude signal_lifetime and trade_cooldown from optimization
optimize(s)
# Exclude multiple parameters
optimize(s)
```
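A further hedged sketch with explicit budgets (keyword values are illustrative, not defaults):

```julia
# Bound the search by iteration count and a time budget; evaluate each candidate on 3 splits.
optimize(s; splits=3, maxiters=1_000, maxtime=60.0)
```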
`Optim.optsession` — Method

Removes results that don't have all the `repeat`ed evaluations.

`optsession(s::Strategies.Strategy; seed, splits, offset)`

The function groups the results by session parameters and removes those groups that don't have a complete set of evaluations, as defined by the `splits` attribute of the session.
`Optim.optsessions` — Method

Returns the zarrays storing all the optimization sessions over the specified zarr instance.

`optsessions(s_name::String; zi) -> Union{Nothing, Dict{String, Zarr.ZArray}}`

The function takes a strategy name `s_name` and a zarr instance `zi`. It returns a dictionary mapping session keys to the zarr arrays that store the optimization sessions for that strategy.
`Optim.print_log` — Function

Prints the content of a specific log file for a given strategy.

`print_log(s)`
`print_log(s, idx)`

The function takes a strategy `s` and an optional index `idx` (defaulting to the last log file). It retrieves the directory path for the strategy's log files, selects the log file at the specified index, and prints its content.
`Optim.progsearch` — Method

A progressive search performs multiple grid searches with only 1 repetition per parameter combination.

`progsearch(s; sess, rounds, cut, kwargs...)`

After each search is completed, the results are filtered according to custom rules. The parameters from the results that match the filtering are backtested again with a different `offset`, which modifies the backtesting period.

- `rounds`: how many iterations (of grid searches) to perform
- `sess`: if a `Ref{<:OptSession}` is provided, the search will resume from the session's previous results

Additional kwargs are forwarded to the grid search.
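A hedged usage sketch (the `rounds` value is illustrative):

```julia
# Run three successive grid-search rounds, re-testing the filtered survivors at new offsets.
progsearch(s; rounds=3)
```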
`Optim.remove_incomplete!` — Method

Removes results that don't have all the `repeat`ed evaluations.
`Optim.result_params` — Function

Fetches the named tuple of a single parameter combination.

`result_params(sess::OptSession) -> Union{Nothing, NamedTuple}`
`result_params(sess::OptSession, idx) -> Union{Nothing, NamedTuple}`

The function takes an optimization session `sess` and an optional index `idx` (defaulting to the last row of the results). It returns the parameters of the optimization session at the specified index as a named tuple.
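A short usage sketch, assuming `sess` is an `OptSession` with results:

```julia
last_combo = result_params(sess)  # parameters of the last results row
tenth = result_params(sess, 10)   # parameters at row index 10
```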
`Optim.resume!` — Method

Resumes the optimization session from saved state.

`resume!(sess; zi) -> Bool`

The function attempts to load a saved session and resume it. If the saved session does not match the current session in terms of strategy, context, parameters, or attributes, an error is thrown. If the session is successfully resumed, the results from the saved session are appended to the current session's results.
`Optim.rgx_key` — Method

Generates a regular expression for matching optimization session keys.

`rgx_key(startstop, params_k, code) -> Regex`

The function takes three arguments: `startstop`, `params_k`, and `code`. These represent the start and stop date of the backtesting context, the first letter of every parameter, and a hash of the parameters and attributes truncated to 4 characters, respectively. The function returns a `Regex` object that matches the string representation of an optimization session key.
`Optim.running!` — Method

Sets the running state of the optimization process to `true`.

`running!() -> Bool`

This function changes the `value` field of the `RUNNING` instance to `true`, indicating that the optimization process is currently running.
`Optim.save_session` — Method

Save the optimization session over the provided zarr instance.

`save_session(sess::OptSession; from, to, zi) -> Union{Nothing, Dict}`

`sess` is the `OptSession` to be saved. The `from` parameter specifies the starting index for saving optimization results progressively, while `to` specifies the ending index. The function uses the provided zarr instance `zi` for storage. It first ensures that the zgroup for the strategy exists. Then, it writes various session attributes to zarr if we're starting from the beginning (`from == 0`). Finally, it saves the result data for the specified range (`from` to `to`).
`Optim.select_balanced_params` — Method

Selects parameter combinations that are both diverse and performant.

`select_balanced_params(sess::OptSession; n, sort_by) -> DataFrames.DataFrame`

- `sess`: The optimization session containing results
- `n`: Number of parameter combinations to select (default: 10)
- `sort_by`: Column to sort by for performance (`:pnl`, `:cash`, `:obj`; default: `:pnl`)

Returns a DataFrame with balanced diverse and performant parameter combinations that have at least 1 trade.
`Optim.select_best_params` — Method

Selects parameter combinations with the best performance.

`select_best_params(sess::OptSession; n, sort_by, ascending) -> DataFrames.DataFrame`

- `sess`: The optimization session containing results
- `n`: Number of parameter combinations to select (default: 10)
- `sort_by`: Column to sort by (`:pnl`, `:cash`, `:obj`; default: `:pnl`)
- `ascending`: Whether to sort in ascending order (default: false for best performance)

Returns a DataFrame with the best performing parameter combinations that have at least 1 trade.
`Optim.select_diverse_params` — Method

Selects the most different parameter combinations from optimization results.

`select_diverse_params(sess::OptSession; n, metric) -> DataFrames.DataFrame`

- `sess`: The optimization session containing results
- `n`: Number of parameter combinations to select (default: 10)
- `metric`: Distance metric to use (`:euclidean`, `:manhattan`, `:cosine`; default: `:euclidean`)

Returns a DataFrame with the most diverse parameter combinations that have at least 1 trade.
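A hedged sketch combining the three selectors on a finished session (keyword values are illustrative):

```julia
top      = select_best_params(sess; n=5, sort_by=:pnl)
diverse  = select_diverse_params(sess; n=5, metric=:euclidean)
balanced = select_balanced_params(sess; n=5, sort_by=:pnl)
```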
`Optim.session_key` — Method

Generates a unique key for an optimization session.

`session_key(sess::OptSession) -> Tuple{Union{Base.AnnotatedString{String}, String}, NamedTuple{(:s_part, :ctx_part, :params_part, :config_part), <:Tuple{Any, String, Union{Base.AnnotatedString{String}, String}, String}}}`

This function generates a unique key for an optimization session by combining various parts of the session's properties. The key is a combination of the session's strategy name, context range, parameters, and a hash of the parameters and attributes.
`Optim.setparams!` — Method

Override attributes in a strategy with values from a given parameters dictionary.

`overrides!(s::AbstractStrategy, params::Dict, pidx::Dict) -> AbstractStrategy`

Override attributes in `s` with values from the `params` dictionary using the parameter index `pidx`. This is useful for updating strategy attributes during an optimization run.
`Optim.slidetest` — Method

Backtests by sliding over the backtesting period, by the smallest timeframe (the strategy timeframe).

`slidetest(s::Strategies.Strategy; n_jobs, step_ratio, params)`

The window keeps sliding until a full range of timeframes is covered between the strategy timeframe and the backtesting context timeframe.

- `multiplier`: the steps count (total steps will be `multiplier * context_timeframe / s.timeframe`)
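A hedged usage sketch; whether `params` accepts the named tuple returned by `get_params` is an assumption:

```julia
# Walk-forward re-test of a previously selected parameter combination.
slidetest(s; n_jobs=4, params=get_params(agg(sess)))
```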
`Optim.stopcall!` — Method

Sets the running state of the optimization process to `false`.

`stopcall!() -> Bool`

This function changes the `value` field of the `RUNNING` instance to `false`, indicating that the optimization process is not currently running.
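These helpers can be used to inspect or toggle the flag manually, for example to signal a long-running optimization to stop; this sketch assumes the running loop consults the flag:

```julia
running!()           # mark the optimization as running
@assert isrunning()
stopcall!()          # flag the optimization as stopped
@assert !isrunning()
```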
`Optim.supports_parallel` — Method

Checks if a strategy supports parallel optimization.

`supports_parallel(s::Strategies.Strategy) -> Any`

This function checks if the strategy has the `THREADSAFE` flag set to `true`.
`Optim.zgroup_opt` — Method

Get the `Opt` group from the provided zarr instance.
`Optim.zgroup_strategy` — Method

Returns the zarr group for a given strategy.

`zgroup_strategy(zi, s_name::String) -> NamedTuple{(:s_group, :opt_group), <:Tuple{Zarr.ZGroup, Union{Zarr.ZArray, Zarr.ZGroup}}}`

This function checks if a zarr group exists for the given strategy name in the optimization group of the zarr instance. If it exists, the function returns the group; otherwise, it creates a new zarr group for the strategy.
`Optim.@optimize` — Macro

`@optimize strategy [options...]`

Macro for optimizing strategy parameters using the Optimization.jl framework.

Arguments

- `strategy`: The strategy to optimize
- `options`: Optional keyword arguments for the optimization

Examples

```julia
@optimize my_strategy maxiters=500
@optimize my_strategy method=BBO_adaptive_de_rand_1_bin() maxiters=1000
```