Deep Learning For Sequential Data – Part V: Handling Long Term Temporal Dependencies

PERPETUAL ENIGMA

In the previous blog post, we learnt why we cannot use regular backpropagation to train a Recurrent Neural Network (RNN), and how backpropagation through time can be used instead. The next step is to understand how this training actually works in practice. Does the unrolling strategy hold up? If we can just unroll an RNN and turn it into a feedforward neural network, then what’s so special about the RNN in the first place? Let’s see how we tackle these issues.

View original post 867 more words
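Since the excerpt asks what unrolling actually buys us, here is a minimal NumPy sketch (my own illustration, not code from the original post) of a vanilla RNN unrolled over a few time steps. Unrolled, it is just a feedforward network with one "layer" per time step, except that every layer shares the same weight matrices, which is exactly what backpropagation through time has to account for. All sizes and variable names below are made up for the demo.

```python
import numpy as np

# Minimal sketch: unrolling a vanilla RNN over T time steps.
# The unrolled computation is a T-layer feedforward net whose layers
# all share the same weights W_xh, W_hh, W_hy.

rng = np.random.default_rng(0)
input_dim, hidden_dim, output_dim, T = 3, 5, 2, 4   # toy sizes (assumed)

W_xh = rng.standard_normal((hidden_dim, input_dim)) * 0.1
W_hh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1
W_hy = rng.standard_normal((output_dim, hidden_dim)) * 0.1

xs = [rng.standard_normal(input_dim) for _ in range(T)]  # toy input sequence
h = np.zeros(hidden_dim)                                  # initial hidden state

hs, ys = [], []
for x in xs:
    # Each iteration is one "layer" of the unrolled network,
    # reusing the same weight matrices every time.
    h = np.tanh(W_xh @ x + W_hh @ h)
    hs.append(h)
    ys.append(W_hy @ h)

# Backpropagation through time walks this loop in reverse, summing the
# gradient contributions from all T copies into the shared weights.
```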

Measuring The Memory Of Time Series Data

PERPETUAL ENIGMA

Time series data has memory. It remembers what happened in the past and avenges any wrongdoings! Can you believe it? Okay, the avenging part may not be true, but it definitely remembers the past. The “memory” refers to how strongly the past can influence the future in a given time series variable. If a series has a strong memory, then we know that analyzing the past would be really useful to us, because it can tell us what’s going to happen in the future. If you need a quick refresher, you can check out my blog post where I talked about memory in time series data. We have a high-level understanding of how we can classify time series data into short memory and long memory, but how do we actually measure the memory?

View original post 578 more words
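The excerpt stops before the actual measurement, so here is a rough illustration (not necessarily the method the original post uses, where something like the Hurst exponent may be the measure of choice): a first look at memory is the sample autocorrelation function. Autocorrelations that die out almost immediately suggest short memory, while correlations that linger over many lags suggest the past has a stronger pull on the future. The toy series below are assumptions for the demo.

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation of a 1-D series for lags 1..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(42)
n = 5000

# Weak memory: white noise -- autocorrelations are essentially zero at every lag.
white_noise = rng.standard_normal(n)

# Stronger memory: a persistent AR(1) process (phi close to 1) whose
# correlations linger across many lags. (Strictly speaking its memory still
# decays exponentially, but it shows the contrast with white noise.)
phi = 0.95
ar1 = np.zeros(n)
for t in range(1, n):
    ar1[t] = phi * ar1[t - 1] + rng.standard_normal()

print("white noise ACF (lags 1-5):", np.round(sample_acf(white_noise, 5), 3))
print("AR(1) ACF       (lags 1-5):", np.round(sample_acf(ar1, 5), 3))
```

Running this, the white-noise autocorrelations hover near zero while the AR(1) values start near 0.95 and fall off slowly, which is the kind of signal you would look for when deciding how much the past is worth analyzing.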