Title
Self-Attention Algorithm.
Description
Self-Attention algorithm helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm. This is based on Vaswani et al. (2017) <doi:10.48550/arXiv.1706.03762>, Dan Jurafsky and James H. Martin (2022, ISBN:978-0131873216) "Speech and Language Processing (3rd ed.)" <https://web.stanford.edu/~jurafsky/slp3/>, and Alex Graves (2020) "Attention and Memory in Deep Learning" <https://www.youtube.com/watch?v=AIiwuClvH6k>.
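As a rough sketch of the algorithm the vignettes build up to, the base R snippet below computes scaled dot-product self-attention as defined in Vaswani et al. (2017). The object names (X, W_Q, W_K, W_V, Q, K, V) and the softmax helper are illustrative only and are not functions exported by the package.

# Minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017).
# Illustrative only; these names are not the package's exported functions.
softmax <- function(x) {
  e <- exp(x - max(x))  # subtract max for numerical stability
  e / sum(e)
}

# toy input: 3 tokens, each embedded in 4 dimensions
X <- matrix(rnorm(12), nrow = 3, ncol = 4)

# illustrative projection matrices for queries, keys, and values
W_Q <- matrix(rnorm(16), nrow = 4)
W_K <- matrix(rnorm(16), nrow = 4)
W_V <- matrix(rnorm(16), nrow = 4)

Q <- X %*% W_Q
K <- X %*% W_K
V <- X %*% W_V

d_k <- ncol(K)
scores  <- Q %*% t(K) / sqrt(d_k)         # similarity of every token to every other token
weights <- t(apply(scores, 1, softmax))   # row-wise softmax gives attention weights
weights %*% V                             # weighted sum of values: the attention output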
README.md
attention
Self-Attention algorithm helper functions and demonstration vignettes of increasing depth on how to construct the Self-Attention algorithm.
CRAN install
The package can be installed from CRAN using:
install.packages('attention')
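Once installed, the package can be loaded and its demonstration vignettes listed with base R tools; the vignette name in the commented line is illustrative, not a confirmed vignette title.

library(attention)
vignette(package = 'attention')                 # list the vignettes shipped with the package
# vignette('attention', package = 'attention') # open a vignette by name (illustrative name)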
Preview version
The development version, to be used at your own peril, can be installed from GitHub using the remotes package:
if (!require('remotes')) install.packages('remotes')
remotes::install_github('bquast/attention')
Development
Development takes place on the GitHub page: https://github.com/bquast/attention
Bugs can be filed on the issues page on GitHub.