
trajo123 t1_j3qwftm wrote

13 periods of history to forecast another 13? That seems like a very atypical/extreme TS forecasting problem. Do these services actually handle so little data?

First, it's unlikely that this little data is enough for anything but the simplest models. Probably the best you could do with a domain-independent model is linear regression. Even so, calculating performance metrics - i.e. knowing how good the model actually is - will be challenging, because that requires setting aside part of your already tiny training data as a validation/"out of sample" set.
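For instance, a minimal linear-trend sketch with scikit-learn - the 13 observed values below are made-up placeholders, not real data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# 13 observed periods (placeholder numbers)
y = np.array([112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118, 126], dtype=float)
t = np.arange(len(y)).reshape(-1, 1)          # time index as the single feature

model = LinearRegression().fit(t, y)

# extrapolate the linear trend 13 periods ahead
t_future = np.arange(len(y), len(y) + 13).reshape(-1, 1)
forecast = model.predict(t_future)
print(forecast.round(1))

# Any holdout for validation (e.g. the last 3-4 points) leaves only ~9-10
# points to fit on, which is why judging accuracy is so hard here.
```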

Getting useful predictions with so little data will probably require a model with strong assumptions - e.g. come up with a set of domain-specific parametrized equations that govern the time series and then fit those few parameters to the data.
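As an illustration only - the logistic functional form and the numbers below are assumptions for the sake of the example, not a recommendation for any particular dataset - fitting such a parametrized equation with scipy could look like:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, cap, rate, midpoint):
    """Saturating growth: approaches `cap` at speed `rate`, centered at `midpoint`."""
    return cap / (1.0 + np.exp(-rate * (t - midpoint)))

t = np.arange(13, dtype=float)
y = np.array([5, 7, 10, 14, 20, 27, 35, 44, 52, 58, 63, 66, 68], dtype=float)  # made-up values

# Fit the 3 parameters; with so few points, sensible initial guesses (p0) matter.
params, _ = curve_fit(logistic, t, y, p0=[70.0, 0.5, 6.0])

# Forecast the next 13 periods from the fitted equation
t_future = np.arange(13, 26, dtype=float)
forecast = logistic(t_future, *params)
print(forecast.round(1))
```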

In any case, Deep Learning is far from the first approach that comes to mind for this problem. A solution is probably just a few lines of code using R or scipy.stats + sklearn - likely fewer lines than the cloud API calls themselves. The trick is to pick the right mathematical model.
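To make the "few lines of code" point concrete, here is the same linear-trend forecast done with scipy.stats.linregress (again with placeholder numbers):

```python
import numpy as np
from scipy.stats import linregress

t = np.arange(13)
y = np.array([112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118, 126], dtype=float)

fit = linregress(t, y)                                   # slope, intercept, r-value, ...
forecast = fit.intercept + fit.slope * np.arange(13, 26) # next 13 periods
print(forecast.round(1))
```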

2

trajo123 t1_j3qwrtb wrote

I understand that you want to use 13 rows of history for prediction, but do you have more than 13 rows to train the model? How many rows do you have in total?

1