Application of Large Language Models to Solve Differential Equations
Project leader: Sergey Koltsov
Project participants: Anton Surkov, Vera Ignatenko, Vladimir Zakharov
This work explores the feasibility of applying large language models to obtain analytical solutions of differential equations. Within this approach, a differential equation and its solution are treated as symbolic sequences, so predicting a solution reduces to a sequence-to-sequence (seq2seq) problem. The representation of differential equations and their solutions as symbol sequences is implemented with a range of Python libraries that transform formulas into text representations in LaTeX format.
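As an illustration, SymPy is one Python library that can both solve a simple ODE analytically and render the equation and its solution as LaTeX strings. This is a minimal sketch of how such equation–solution text pairs could be produced; the project's actual pipeline and library choices are not specified here:

```python
import sympy as sp

x = sp.symbols("x")
y = sp.Function("y")

# A simple first-order ODE: y'(x) = y(x)
ode = sp.Eq(y(x).diff(x), y(x))

# dsolve returns the analytical solution as an equation object
solution = sp.dsolve(ode, y(x))

# latex() converts both halves of the training pair to text
src = sp.latex(ode)       # e.g. "\frac{d}{d x} y{\left(x \right)} = y{\left(x \right)}"
tgt = sp.latex(solution)  # e.g. "y{\left(x \right)} = C_{1} e^{x}"
print(src)
print(tgt)
```

The `src`/`tgt` strings form one input–output pair for a seq2seq model.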
The development of seq2seq models for differential equations proceeds in two stages. The first stage examines recurrent models such as RNN, LSTM, and GRU, which serve as baselines. In the second stage, large language models built on the Transformer architecture, such as BERT, XLNet, and T5, are fine-tuned on the obtained symbolic representations of the equations.
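Both stages consume the LaTeX strings as sequences of integer token ids, so a vocabulary over symbolic tokens must be built before training. The sketch below is a hypothetical illustration of such a tokenizer; the regex and the special-token choices are assumptions, not the project's actual code:

```python
import re

# Split a LaTeX formula into symbolic tokens: commands like \frac,
# braces, common operators, digits, and single letters.
TOKEN_RE = re.compile(r"\\[A-Za-z]+|[{}^_=+\-*/()]|\d|[A-Za-z]")

def tokenize(latex: str) -> list[str]:
    return TOKEN_RE.findall(latex)

def build_vocab(formulas: list[str]) -> dict[str, int]:
    # Reserve ids for padding and the sequence delimiters seq2seq models expect
    vocab = {"<pad>": 0, "<sos>": 1, "<eos>": 2}
    for formula in formulas:
        for tok in tokenize(formula):
            vocab.setdefault(tok, len(vocab))
    return vocab

def encode(latex: str, vocab: dict[str, int]) -> list[int]:
    return [vocab["<sos>"]] + [vocab[t] for t in tokenize(latex)] + [vocab["<eos>"]]

formula = r"\frac{d}{dx} y = y"
vocab = build_vocab([formula])
ids = encode(formula, vocab)
print(ids)
```

Recurrent baselines and fine-tuned Transformers would then train on such id sequences (in practice, pretrained models like T5 ship with their own subword tokenizers, which could replace this step).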
Publications on the project:
Vladimir Zakharov, Anton Surkov, Sergei Koltcov. AGDES: a Python package and an approach to generating synthetic data for differential equation solving with LLMs. Procedia Computer Science, 2025, Volume 258, pp. 1169–1178. ISSN 1877-0509.