---
title: SSM Blog Posts
emoji: 📝
colorFrom: purple
colorTo: yellow
sdk: static
pinned: false
---
<b><p style="text-align: center; color:red">A French version is available on my [blog](https://lbourdois.github.io/blog/ssm/)</p></b>
<br>
On October 7, 2021, while wondering whether [AK](https://hf.co/akhaliq) was a bot or a human, I saw one of his [tweets](https://twitter.com/_akhaliq/status/1445931206030282756): a link to a publication on [open-review.net](https://openreview.net/forum?id=uYLFoz1vlAC), accompanied by the following image:
<center>
<img src="https://cdn-uploads.huggingface.co/production/uploads/613b0a62a14099d5afed7830/QMpNVGwdQV2jRw-jYalxa.png" alt="Benchmark results from the S3 paper" width="800" height="450">
</center> | |
Intrigued by the announced results, I decided to read about this S3 model, which would be renamed [S4](https://twitter.com/_albertgu/status/1456031299194470407) less than a month later (here is a [link](https://github.com/lbourdois/blog/blob/master/assets/efficiently_modeling_long_sequences_s3.pdf) to the version from when it was still called S3, for those interested).
This brilliant article impressed me. At the time, I was convinced that State Space Models (SSMs) were going to be a revolution, replacing transformers in the coming months. Two years later, I'm forced to admit that I was completely wrong, given the tsunami of LLMs making the news in NLP.
Nevertheless, on Monday, December 4, 2023, the announcement of Mamba by [Albert Gu](https://twitter.com/_albertgu/status/1731727672286294400) and [Tri Dao](https://twitter.com/tri_dao/status/1731728602230890895) revived interest in them. This was accentuated four days later with the announcement of [StripedHyena](https://twitter.com/togethercompute/status/1733213267185762411) by Together AI.
A good opportunity for me to write a few words about the developments in SSMs over the last two years.
To begin, I plan to write three articles whose aim is to illustrate the basics of SSMs with S4 (the "Attention is all you need" of the field), before surveying the evolution of SSMs since that first paper:
- [Introduction to SSM and S4](https://huggingface.co/blog/lbourdois/introduction-ssm) | |
- [SSM evolutions in 2022](WIP) (WIP)
- [SSM developments in 2023](WIP) (WIP)
Later on, I also hope to go into the details of the architectures of some specific SSMs, with animations ✨