Meta recently unveiled 'Beyond A*', an approach that combines Transformers with search dynamics bootstrapping to improve planning algorithms. The method targets complex decision-making tasks traditionally dominated by symbolic planners such as the A* algorithm: the search is initialized with a Transformer, after which only a reduced number of iterations is needed. Concretely, a Transformer model called 'Searchformer' imitate-learns the dynamics of the A* search process, i.e., how the search unfolds step by step. After fine-tuning, the model still produces optimal plans roughly 94% of the time while emitting search dynamics about 27% shorter in applications such as the puzzle game Sokoban. This suggests a significant shift in planning algorithms, with learned planners potentially surpassing traditional methods in both effectiveness and efficiency.
Searchformers are eating symbolic planning. Beyond A*: Better Planning with Transformers via Search Dynamics Bootstrapping https://t.co/ZLBOMbzIF1 My TL;DR: 1. Generate synthetic data from A* search runs, tokenize the execution trace. Train "Searchformer" to predict these… https://t.co/Ch04R87FJP https://t.co/RRhU2arAcR
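The trace-generation step in the TL;DR above can be sketched in a few lines: run A* on a toy grid and log each frontier insertion ("create") and node expansion ("close") as tokens, which is the kind of execution trace the paper serializes into Searchformer training data. This is a minimal sketch; the token vocabulary and grid encoding here are illustrative, not the paper's exact format.

```python
import heapq
from itertools import count

def astar_trace(grid, start, goal):
    """A* on a 4-connected grid; returns (token trace, optimal plan)."""
    def h(n):  # Manhattan-distance heuristic to the goal
        return abs(n[0] - goal[0]) + abs(n[1] - goal[1])

    tie = count()  # tiebreaker so the heap never compares node tuples
    trace = [f"create {start[0]} {start[1]}"]
    frontier = [(h(start), next(tie), start)]
    g_cost = {start: 0}
    parents = {start: None}
    closed = set()
    while frontier:
        _, _, node = heapq.heappop(frontier)
        if node in closed:
            continue
        closed.add(node)
        trace.append(f"close {node[0]} {node[1]}")
        if node == goal:  # reconstruct the plan by walking parents back
            plan = []
            while node is not None:
                plan.append(node)
                node = parents[node]
            return trace, plan[::-1]
        x, y = node
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < len(grid) and 0 <= ny < len(grid[0]) and grid[nx][ny] == 0:
                ng = g_cost[node] + 1
                if ng < g_cost.get((nx, ny), float("inf")):
                    g_cost[(nx, ny)] = ng
                    parents[(nx, ny)] = node
                    heapq.heappush(frontier, (ng + h((nx, ny)), next(tie), (nx, ny)))
                    trace.append(f"create {nx} {ny}")
    return trace, None

grid = [[0, 0, 0], [1, 1, 0], [0, 0, 0]]  # 1 = wall
trace, plan = astar_trace(grid, (0, 0), (2, 0))
```

The trace is strictly longer than the plan it ends in; training on (task, trace, plan) triples is what lets the model imitate how A* searches, not just what it outputs.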
“Beyond A*: Better Planning with Transformers via Search Dynamics Bootstrapping” A spectacular paper Link: https://t.co/FTAT8Tcrty
Thanks @_akhaliq for promoting our work! We propose Searchformer, a Transformer that imitate-learns A* search dynamics (i.e., how the search is performed), and can be fine-tuned to still output optimal plans ~94% of the time, but with ~27% shorter search dynamics in Sokoban,… https://t.co/EYZxon3fi8
Transformers for planning better than A*: initialize with a Transformer and then do some (but fewer!) iterations. https://t.co/6LkFkEUpOi
Meta presents "Beyond A*: Better Planning with Transformers via Search Dynamics Bootstrapping". While Transformers have enabled tremendous progress in various application settings, such architectures still lag behind traditional symbolic planners for solving complex decision making… https://t.co/pLYR6cZwIT