
Application of Large Language Models to Solve Differential Equations

Project leader: Sergey Koltsov

Project participants: Anton Surkov

This work explores the feasibility of applying large language models to obtain analytical solutions of differential equations. In this approach, differential equations and their solutions are viewed as symbolic sequences, so predicting a solution reduces to a seq2seq modeling problem. The symbolic representation of equations and their solutions is produced with Python libraries that convert formulas into textual form in LaTeX format.
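As a concrete illustration of this pipeline stage, the sketch below uses SymPy (one plausible choice of Python library; the project text does not name a specific one) to solve a simple ODE analytically and serialize both the equation and its solution as LaTeX strings, yielding a (source, target) text pair for a seq2seq model:

```python
import sympy as sp

x = sp.symbols("x")
f = sp.Function("f")

# A simple first-order ODE: f'(x) = f(x)
ode = sp.Eq(f(x).diff(x), f(x))

# Analytical solution via SymPy's ODE solver: f(x) = C1 * exp(x)
sol = sp.dsolve(ode, f(x))

# Serialize equation and solution as LaTeX text, forming a training pair
src_text = sp.latex(ode)   # e.g. \frac{d}{d x} f(x) = f(x)
tgt_text = sp.latex(sol)   # e.g. f(x) = C_{1} e^{x}
```

A dataset for fine-tuning could then be built by sampling many such equations, solving each with `dsolve`, and storing the LaTeX pairs.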

The development of seq2seq models for differential equations proceeds in two stages. The first stage examines recurrent architectures (RNN, LSTM, GRU), which serve as baselines. In the second stage, large language models based on the Transformer architecture, such as BERT, XLNet, and T5, are fine-tuned on the resulting symbolic representations of the equations.
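Before either stage, the LaTeX strings must be split into discrete tokens that a sequence model can consume. The project text does not specify a tokenizer, so the following is a minimal regex-based sketch (command names like `\frac` kept whole, everything else split into single symbols), shown here as an assumption about one reasonable preprocessing choice:

```python
import re

# LaTeX commands (\frac, \left, ...) stay intact; other symbols split one by one.
TOKEN_RE = re.compile(r"\\[a-zA-Z]+|[a-zA-Z0-9]|\S")

def tokenize(latex_src: str) -> list[str]:
    """Split a LaTeX formula into a flat token sequence for a seq2seq model."""
    return TOKEN_RE.findall(latex_src)

tokens = tokenize(r"\frac{d}{d x} f = f")
# tokens: ['\\frac', '{', 'd', '}', '{', 'd', 'x', '}', 'f', '=', 'f']
```

Token sequences like this can be mapped to integer IDs for the recurrent baselines, while Transformer models such as T5 would typically use their own subword tokenizers instead.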

 
