Applicable Models: Recurrent

Techniques for RNNs, LSTMs, and GRUs (2 techniques)
Contextual Decomposition
  Goals: Algorithmic
  Models: Architecture / Neural Networks / Recurrent
  Requirements: White Box (+1)
  Data Types: Text
  Description: Contextual Decomposition explains LSTM and RNN predictions by decomposing the final hidden state into contributions from...
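The idea behind Contextual Decomposition can be illustrated in miniature on a *linear* RNN, where the decomposition is exact: the final hidden state splits into a sum of per-token contributions. This is a simplified sketch, not the full method (real Contextual Decomposition handles LSTM gates by linearizing the nonlinearities); all weights and inputs here are random stand-ins.

```python
import numpy as np

# Linear RNN: h_t = W @ h_{t-1} + U @ x_t  (no gating, no nonlinearity).
# Then h_T = sum_i W^(T-1-i) @ U @ x_i, so each input token's
# contribution to the final hidden state can be read off directly.

rng = np.random.default_rng(0)
d_h, d_x, T = 4, 3, 5
W = rng.normal(scale=0.3, size=(d_h, d_h))   # recurrent weights (toy)
U = rng.normal(scale=0.3, size=(d_h, d_x))   # input weights (toy)
xs = rng.normal(size=(T, d_x))               # a toy input sequence

# Forward pass: run the linear RNN to get the final hidden state.
h = np.zeros(d_h)
for x in xs:
    h = W @ h + U @ x

# Decomposition: contribution of token i is W^(T-1-i) @ U @ x_i.
contribs = []
for i, x in enumerate(xs):
    c = U @ x
    for _ in range(T - 1 - i):
        c = W @ c
    contribs.append(c)
contribs = np.array(contribs)

# Exactness check: the per-token contributions sum back to h.
assert np.allclose(contribs.sum(axis=0), h)
```

In the nonlinear (LSTM) case the split is only approximate, which is exactly what the published technique's gate-linearization handles; the white-box requirement above reflects that the method needs access to the recurrent weights.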
Classical Attention Analysis in Neural Networks
  Goals: Algorithmic
  Models: Architecture / Neural Networks / Recurrent
  Requirements: Architecture Specific (+1)
  Data Types: Any
  Description: Classical attention mechanisms in RNNs and CNNs create alignment matrices and temporal attention patterns that show how...
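The alignment matrix mentioned in this entry can be sketched with dot-product attention: each decoder step induces a softmax distribution over encoder states, and stacking these distributions gives the matrix that attention analysis inspects. This is a minimal illustration with random stand-ins for real encoder/decoder states.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax along the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(1)
src_len, tgt_len, d = 6, 4, 8
enc = rng.normal(size=(src_len, d))   # encoder hidden states, one per source token
dec = rng.normal(size=(tgt_len, d))   # decoder states, one per target step

# Dot-product scores and the (tgt_len x src_len) alignment matrix.
scores = dec @ enc.T
alignment = softmax(scores)

# Each row is a probability distribution over source positions.
assert np.allclose(alignment.sum(axis=1), 1.0)

# The argmax per row is the source token each decoder step attends to most,
# the kind of temporal pattern this analysis visualizes.
hard_alignment = alignment.argmax(axis=1)
```

Because the weights are read straight out of the attention layer, the technique is architecture specific, matching the requirement tag above.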