Stochastic

The Stochastic Optimization method is a global optimization algorithm that explores the design space by generating random candidate solutions within specified bounds. Unlike deterministic methods such as Branch and Bound or Exhaustive Search, it relies on probability and repeated sampling to discover high-performing designs.

Each run begins by randomly assigning layer thicknesses between the defined lower and upper limits. These candidate designs are then refined locally with the Adam optimizer. Repeating this process across many independent runs increases the likelihood of finding a near-global optimum, even in highly complex design landscapes.
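As a minimal sketch of this strategy (not FilmOptima's actual implementation), the loop below draws random thickness candidates within the bounds, refines each one locally, and keeps the best design across runs. The merit function is a toy placeholder, and plain gradient descent with bound clipping stands in for the nested Adam refinement.

```python
import numpy as np

# Toy merit function standing in for the real spectral merit function:
# squared distance of the layer thicknesses (nm) from an arbitrary target stack.
TARGET = np.array([120.0, 85.0, 200.0])

def merit(t):
    return float(np.sum((t - TARGET) ** 2))

def grad(t):
    return 2.0 * (t - TARGET)

def refine(t, lower, upper, lr=0.1, steps=200):
    # Stand-in for the nested Adam refinement: plain gradient descent,
    # clipped back into the thickness bounds after every step.
    for _ in range(steps):
        t = np.clip(t - lr * grad(t), lower, upper)
    return t

lower, upper, n_layers, n_runs = 10.0, 300.0, 3, 20
rng = np.random.default_rng()
best_t, best_score = None, np.inf
for _ in range(n_runs):
    t0 = rng.uniform(lower, upper, n_layers)   # random candidate within bounds
    t_run = refine(t0, lower, upper)           # local refinement of the candidate
    score = merit(t_run)
    if score < best_score:                     # keep the best design across runs
        best_t, best_score = t_run, score
print(best_t, best_score)
```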


Advantages

  • Broad exploration: Random sampling helps escape local minima.
  • Simplicity: Easy to configure and extend with more runs.
  • Effective with refinement: Nested Adam ensures each random candidate is optimized locally.
  • Scalable: More runs increase the chance of finding high-quality solutions.

Limitations

  • No global guarantees: Results depend on the number of runs and randomness.
  • Potentially high computational cost: More runs and larger stacks increase runtime.
  • Reproducibility: Different random seeds may lead to different outcomes (see the seeding sketch after this list).
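On the reproducibility point, fixing the random seed makes the sequence of candidates, and therefore the outcome of a run, repeatable. Whether FilmOptima exposes a seed setting is not stated on this page; the snippet below is only a general NumPy illustration of the idea.

```python
import numpy as np

# With a fixed seed, the same sequence of random candidates is drawn every
# time, so a stochastic run can be repeated exactly.
rng = np.random.default_rng(seed=42)
candidate = rng.uniform(10.0, 300.0, size=3)  # thickness candidate within example bounds
print(candidate)
```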

In FilmOptima

In FilmOptima, Stochastic Optimization belongs to the Global Optimization category of algorithms and exposes the following parameters:

  • Thickness (min): The lower bound for candidate layer thicknesses.
  • Thickness (max): The upper bound for candidate layer thicknesses.
  • # Runs: The number of independent random initializations to perform. Higher values improve reliability but increase runtime.
  • LearningRate: Controls the step size of the Adam optimizer during refinement. Higher values make updates faster but risk overshooting, while lower values are more stable but slower to converge.
  • Patience: The number of iterations the Adam optimizer will continue without improvement in the merit function before the learning rate is halved.
  • MaxEpoch: The maximum number of training cycles for the Adam optimizer in each refinement step. Acts as a hard limit that keeps optimization runs bounded.
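For concreteness, the sketch below shows how these parameters could interact inside a single refinement run: Adam updates are applied until MaxEpoch is reached, and the learning rate is halved whenever the merit value has not improved for Patience consecutive epochs. The merit and gradient functions are toy placeholders, and the mapping is inferred from the parameter descriptions above rather than taken from FilmOptima's source.

```python
import numpy as np

def refine(t, merit_fn, grad_fn, learning_rate=1.0, patience=10, max_epoch=200):
    # One local refinement run with Adam. The arguments `learning_rate`,
    # `patience`, and `max_epoch` mirror LearningRate, Patience, and MaxEpoch.
    m, v = np.zeros_like(t), np.zeros_like(t)
    b1, b2, eps = 0.9, 0.999, 1e-8
    best, stall = merit_fn(t), 0
    for k in range(1, max_epoch + 1):              # MaxEpoch: hard iteration limit
        g = grad_fn(t)
        m = b1 * m + (1 - b1) * g                  # Adam first-moment estimate
        v = b2 * v + (1 - b2) * g ** 2             # Adam second-moment estimate
        t = t - learning_rate * (m / (1 - b1 ** k)) / (np.sqrt(v / (1 - b2 ** k)) + eps)
        current = merit_fn(t)
        if current < best:
            best, stall = current, 0
        else:
            stall += 1
            if stall >= patience:                  # Patience exhausted:
                learning_rate *= 0.5               # halve the learning rate
                stall = 0
    return t

# Toy usage: a quadratic merit function standing in for the real spectral merit.
merit = lambda t: float(np.sum((t - 100.0) ** 2))
grad = lambda t: 2.0 * (t - 100.0)
print(refine(np.array([250.0, 30.0]), merit, grad, learning_rate=5.0, patience=5, max_epoch=100))
```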