---
license: mit
---
# test rnn
The models in test_rnn are example test cases for [Huggingface Candle PR#2542](https://github.com/huggingface/candle/pull/2542).
The test models follow the PyTorch [LSTM](https://pytorch.org/docs/stable/generated/torch.nn.LSTM.html) and [GRU](https://pytorch.org/docs/stable/generated/torch.nn.GRU.html) modules.
They are generated with the following code (a short snippet for loading the saved files back is included after the list):
- lstm_test.pt: A simple LSTM model with 1 layer.
```python
import torch
import torch.nn as nn
# input_size=10, hidden_size=20; batch_first=True means tensors are (batch, seq, feature).
rnn = nn.LSTM(10, 20, num_layers=1, batch_first=True)
input = torch.randn(5, 3, 10)    # (batch=5, seq=3, feature=10)
output, (hn, cn) = rnn(input)
# Save the weights together with the input/output/state tensors as the test fixture.
state_dict = rnn.state_dict()
state_dict['input'] = input
state_dict['output'] = output.contiguous()
state_dict['hn'] = hn
state_dict['cn'] = cn
torch.save(state_dict, "lstm_test.pt")
```
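A quick way to sanity-check the saved fixture is to load it back and verify the tensor shapes. This is a minimal sketch, not part of the generation script; the reload step and the `d` variable are assumptions, while the expected shapes follow from `batch_first=True` with `input_size=10`, `hidden_size=20`, and a single layer.
```python
import torch

d = torch.load("lstm_test.pt")
assert d["input"].shape == (5, 3, 10)    # (batch, seq, input_size)
assert d["output"].shape == (5, 3, 20)   # (batch, seq, hidden_size)
assert d["hn"].shape == (1, 5, 20)       # (num_layers * num_directions, batch, hidden_size)
assert d["cn"].shape == (1, 5, 20)
```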
- gru_test.pt: A simple GRU model with 1 layer.
```python
import torch
import torch.nn as nn
rnn = nn.GRU(10, 20, num_layers=1, batch_first=True)
input = torch.randn(5, 3, 10)    # (batch=5, seq=3, feature=10)
output, hn = rnn(input)          # GRU has no cell state, so only hn is returned
state_dict = rnn.state_dict()
state_dict['input'] = input
state_dict['output'] = output.contiguous()
state_dict['hn'] = hn
torch.save(state_dict, "gru_test.pt")
```
- bi_lstm_test.pt: A bidirectional LSTM model with 1 layer.
```python
import torch
import torch.nn as nn
rnn = nn.LSTM(10, 20, num_layers=1, bidirectional=True, batch_first=True)
input = torch.randn(5, 3, 10)
output, (hn, cn) = rnn(input)    # output: (5, 3, 40), hn/cn: (2, 5, 20)
state_dict = rnn.state_dict()
state_dict['input'] = input
state_dict['output'] = output.contiguous()
state_dict['hn'] = hn
state_dict['cn'] = cn
torch.save(state_dict, "bi_lstm_test.pt")
```
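In the bidirectional case the forward and backward directions are concatenated along the last dimension of `output` and stacked along the first dimension of `hn`. A small check one might run against the saved file (a sketch; the reload and the variable names are assumptions, the slicing follows the PyTorch layout):
```python
import torch

d = torch.load("bi_lstm_test.pt")
out, hn = d["output"], d["hn"]                 # out: (5, 3, 40), hn: (2, 5, 20)
# Forward direction: its last time step matches hn[0].
assert torch.allclose(out[:, -1, :20], hn[0])
# Backward direction: its final state sits at time step 0, in the upper 20 channels.
assert torch.allclose(out[:, 0, 20:], hn[1])
```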
- bi_gru_test.pt: A bidirectional GRU model with 1 layer.
```python
import torch
import torch.nn as nn
rnn = nn.GRU(10, 20, num_layers=1, bidirectional=True, batch_first=True)
input = torch.randn(5, 3, 10)
output, hn = rnn(input)    # output: (5, 3, 40), hn: (2, 5, 20)
state_dict = rnn.state_dict()
state_dict['input'] = input
state_dict['output'] = output.contiguous()
state_dict['hn'] = hn
torch.save(state_dict, "bi_gru_test.pt")
```
- lstm_nlayer_test.pt: An LSTM model with 3 layers.
```python
import torch
import torch.nn as nn
rnn = nn.LSTM(10, 20, num_layers=3, batch_first=True)
input = torch.randn(5, 3, 10)
output, (hn, cn) = rnn(input)    # output: (5, 3, 20), hn/cn: (3, 5, 20)
state_dict = rnn.state_dict()
state_dict['input'] = input
state_dict['output'] = output.contiguous()
state_dict['hn'] = hn
state_dict['cn'] = cn
torch.save(state_dict, "lstm_nlayer_test.pt")
```
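With several layers, `output` only exposes the top layer, while `hn` holds the final hidden state of every layer. A quick check against the saved tensors (a sketch; the reload and variable names are assumptions):
```python
import torch

d = torch.load("lstm_nlayer_test.pt")
out, hn = d["output"], d["hn"]        # out: (5, 3, 20), hn: (3, 5, 20)
# The last time step of the output matches the top layer's final hidden state.
assert torch.allclose(out[:, -1, :], hn[-1])
```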
- bi_lstm_nlayer_test.pt: A bidirectional LSTM model with 3 layers.
```python
import torch
import torch.nn as nn
rnn = nn.LSTM(10, 20, num_layers=3, bidirectional=True, batch_first=True)
input = torch.randn(5, 3, 10)
output, (hn, cn) = rnn(input)    # output: (5, 3, 40), hn/cn: (6, 5, 20)
state_dict = rnn.state_dict()
state_dict['input'] = input
state_dict['output'] = output.contiguous()
state_dict['hn'] = hn
state_dict['cn'] = cn
torch.save(state_dict, "bi_lstm_nlayer_test.pt")
```
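For the bidirectional multi-layer case, `hn` is ordered layer by layer with the forward direction before the backward one, so the top layer's states are the last two entries. A sketch of the corresponding check (the reload and variable names are assumptions; the ordering follows the PyTorch documentation):
```python
import torch

d = torch.load("bi_lstm_nlayer_test.pt")
out, hn = d["output"], d["hn"]        # out: (5, 3, 40), hn: (6, 5, 20)
assert torch.allclose(out[:, -1, :20], hn[-2])   # top layer, forward direction
assert torch.allclose(out[:, 0, 20:], hn[-1])    # top layer, backward direction
```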
- gru_nlayer_test.pt: A GRU model with 3 layers.
```python
import torch
import torch.nn as nn
rnn = nn.GRU(10, 20, num_layers=3, batch_first=True)
input = torch.randn(5, 3, 10)
output, hn = rnn(input)    # output: (5, 3, 20), hn: (3, 5, 20)
state_dict = rnn.state_dict()
state_dict['input'] = input
state_dict['output'] = output.contiguous()
state_dict['hn'] = hn
torch.save(state_dict, "gru_nlayer_test.pt")
```
- bi_gru_nlayer_test.pt: A bidirectional GRU model with 3 layers.
```python
import torch
import torch.nn as nn
rnn = nn.GRU(10, 20, num_layers=3, bidirectional=True, batch_first=True)
input = torch.randn(5, 3, 10)
output, hn = rnn(input)    # output: (5, 3, 40), hn: (6, 5, 20)
state_dict = rnn.state_dict()
state_dict['input'] = input
state_dict['output'] = output.contiguous()
state_dict['hn'] = hn
torch.save(state_dict, "bi_gru_nlayer_test.pt")
```
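To inspect any of the generated files, the saved tensors can be loaded back with `torch.load`. A minimal sketch that prints the keys and shapes of every fixture (it assumes the files sit in the current directory):
```python
import torch

files = [
    "lstm_test.pt", "gru_test.pt", "bi_lstm_test.pt", "bi_gru_test.pt",
    "lstm_nlayer_test.pt", "bi_lstm_nlayer_test.pt",
    "gru_nlayer_test.pt", "bi_gru_nlayer_test.pt",
]
for name in files:
    d = torch.load(name)
    print(name)
    for key, tensor in d.items():
        print(f"  {key}: {tuple(tensor.shape)}")
```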