PP: Shallow RNNs: a method for accurate time-series classification on tiny devices

Problem: time series classification

Shallow RNNs: the first layer splits the input sequence into fixed-size bricks and runs a shared RNN on each brick independently. The second layer consumes the per-brick outputs of the first layer to capture long-range dependencies.
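To make the two-layer structure concrete, here is a minimal PyTorch sketch of the idea. This is not the authors' code: the class and argument names (ShaRNNSketch, brick_size, ...) are mine, plain nn.RNN cells stand in for whatever recurrent cell is actually used, and the sequence length is assumed to be a multiple of the brick size.

```python
import torch
import torch.nn as nn


class ShaRNNSketch(nn.Module):
    """Sketch of a two-layer shallow RNN.

    Layer 1: one shared RNN applied independently to fixed-size bricks.
    Layer 2: a second RNN over the per-brick summaries, capturing long-range
    dependencies; a linear head produces class logits.
    """

    def __init__(self, input_dim, hidden_dim, num_classes, brick_size):
        super().__init__()
        self.brick_size = brick_size
        self.lower = nn.RNN(input_dim, hidden_dim, batch_first=True)   # shared across bricks
        self.upper = nn.RNN(hidden_dim, hidden_dim, batch_first=True)  # runs over brick outputs
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, T, input_dim); assume T is a multiple of brick_size.
        batch, T, d = x.shape
        k = self.brick_size
        num_bricks = T // k
        # Split the sequence into independent bricks: (batch * num_bricks, k, d).
        bricks = x.reshape(batch * num_bricks, k, d)
        # The shared lower RNN runs on every brick independently, so the bricks
        # can be processed in parallel (or streamed one at a time on-device).
        _, h_lower = self.lower(bricks)                        # (1, batch * num_bricks, hidden)
        summaries = h_lower[-1].reshape(batch, num_bricks, -1)
        # The upper RNN consumes one summary per brick -> only T/k recurrent steps.
        _, h_upper = self.upper(summaries)                     # (1, batch, hidden)
        return self.head(h_upper[-1])                          # (batch, num_classes)
```

The point of the construction is that neither layer ever unrolls over the full sequence: the lower RNN sees k steps at a time, and the upper RNN sees only T/k summaries.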

Claimed benefit: inference time improves over standard RNNs without compromising accuracy.

Time series exhibit temporal dependencies, so sequential models such as RNNs are particularly well-suited in this context.

Directly leveraging RNNs for prediction in constrained scenarios is challenging: they incur large training and inference costs.

Open question: how long should the recurrence of the RNN be?
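A back-of-the-envelope answer, in my own notation (sequence length T, brick size k) and assuming the bricks of the first layer are evaluated in parallel: count the sequential, non-parallelizable recurrent steps at inference time.

```latex
\begin{align*}
  D_{\text{standard RNN}} &= T,\\
  D_{\text{shallow RNN}}  &= \underbrace{k}_{\text{lower layer (bricks in parallel)}}
                           + \underbrace{T/k}_{\text{upper layer (one step per brick)}},\\
  \min_{k}\left(k + \frac{T}{k}\right) &= 2\sqrt{T}, \quad \text{attained at } k = \sqrt{T}.
\end{align*}
```

So a brick size around \(\sqrt{T}\) keeps both recurrences short; for example, with T = 128, choosing k = 8 gives 8 + 16 = 24 sequential steps instead of 128.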

Each time series is divided into independent bricks, and a shared RNN operates on each brick independently, which keeps the model size small and the recurrence short.

Both recurrences are therefore short: the first layer unrolls only over a single brick, and the second layer unrolls only over the per-brick outputs.
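Continuing the sketch above with made-up sizes, a toy forward pass shows the shapes and the two short recurrences:

```python
import torch

# Toy run of the ShaRNNSketch defined earlier (all sizes are illustrative).
model = ShaRNNSketch(input_dim=9, hidden_dim=32, num_classes=6, brick_size=8)
x = torch.randn(4, 128, 9)   # 4 sequences, T = 128 steps, 9 features each
logits = model(x)
print(logits.shape)          # torch.Size([4, 6])
# Lower layer unrolls 8 steps per brick; upper layer unrolls 128 // 8 = 16 steps.
```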

Supplementary knowledge:

1. theoretical justification

2. weak/strong assumptions; model flow:

  • assumptions / environment ~ weak or strong
  • model
  • results
  • evaluation ~ baselines and metrics

3. Sequential models: RNNs.

Original post: https://www.cnblogs.com/dulun/p/12267468.html