ad_optim_lbfgs_mod¶
- optim.ad_optim_lbfgs_mod.optimize_state(state, ctm_env_init, loss_fn, obs_fn=None, post_proc=None, main_args=<config.MAINARGS object>, opt_args=<config.OPTARGS object>, ctm_args=<config.CTMARGS object>, global_args=<config.GLOBALARGS object>)[source]¶
- Parameters:
state (IPEPS) – initial wavefunction
ctm_env_init (ENV) – initial environment corresponding to state
loss_fn (function(IPEPS,ENV,CTMARGS,OPTARGS,GLOBALARGS)->torch.tensor) – loss function
obs_fn (function) – optional function evaluating observables during the optimization
post_proc (function) – optional post-processing function
main_args (MAINARGS) – main configuration
opt_args (OPTARGS) – optimization configuration
ctm_args (CTMARGS) – CTM algorithm configuration
global_args (GLOBALARGS) – global configuration
Optimizes the initial wavefunction state with respect to loss_fn using the optim.lbfgs_modified.LBFGS_MOD optimizer. The main parameters influencing the optimization process are given in config.OPTARGS. Calls to the functions loss_fn, obs_fn, and post_proc pass the current configuration as the dictionary {"ctm_args": ctm_args, "opt_args": opt_args}. The optimizer saves the best energy state into the file main_args.out_prefix+"_state.json" and checkpoints the optimization at every step to main_args.out_prefix+"_checkpoint.p".
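A minimal usage sketch is shown below, assuming a peps-torch-style driver script. The imports follow the library layout; the input file name and energy_f are hypothetical placeholders, and the loss_fn return convention (loss, environment, CTM log) follows the example scripts rather than the bare torch.tensor annotation above:

```python
import config as cfg
from ipeps.ipeps import read_ipeps
from ctm.generic.env import ENV, init_env
from ctm.generic import ctmrg
from optim.ad_optim_lbfgs_mod import optimize_state

# read an initial iPEPS and build the corresponding CTM environment
state = read_ipeps("input_state.json")        # hypothetical input file
ctm_env = ENV(cfg.ctm_args.chi, state)
init_env(state, ctm_env)

# loss function: re-converge the CTM environment for the current tensors,
# then evaluate the energy. opt_context is the configuration dictionary
# {"ctm_args": ..., "opt_args": ...} described above.
def loss_fn(state, ctm_env_in, opt_context):
    ctm_args = opt_context["ctm_args"]
    ctm_env_out, *ctm_log = ctmrg.run(state, ctm_env_in, ctm_args=ctm_args)
    loss = energy_f(state, ctm_env_out)       # energy_f: hypothetical model energy
    return (loss, ctm_env_out, *ctm_log)

optimize_state(state, ctm_env, loss_fn)       # defaults: cfg.main_args, cfg.opt_args, ...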
- optim.ad_optim_lbfgs_mod.store_checkpoint(checkpoint_file, state, optimizer, current_epoch, current_loss, verbosity=0)[source]¶
- Parameters:
checkpoint_file (str or Path) – target file
state (IPEPS) – ipeps wavefunction
optimizer (torch.optim.Optimizer) – Optimizer
current_epoch (int) – current epoch
current_loss (float) – current value of a loss function
verbosity (int) – verbosity
Store the current state of the optimization in checkpoint_file.
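For illustration, a minimal sketch of calling store_checkpoint directly, assuming the LBFGS_MOD constructor follows torch.optim.LBFGS and that the IPEPS state exposes its trainable tensors via get_parameters():

```python
from optim.lbfgs_modified import LBFGS_MOD
from optim.ad_optim_lbfgs_mod import store_checkpoint

parameters = state.get_parameters()            # assumed IPEPS accessor for trainable tensors
optimizer = LBFGS_MOD(parameters, max_iter=1)  # assumed torch.optim.LBFGS-style constructor
store_checkpoint("run_checkpoint.p", state, optimizer,
                 current_epoch=0, current_loss=0.0, verbosity=1)
```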