tareumlaneuchie t1_j3ba1km wrote
Reply to comment by C0R0NA_CHAN in [D] Which ML model should I use to analyse and detect dip in time series sequence? by C0R0NA_CHAN
If you want a model, you will need to think about input variables... This is where it gets complicated.
As for minima detection, there are many ways to achieve it. For example: look for a sign change in viewers per unit of time. Or, if your data is evenly sampled, compute the difference between viewers at t[n+1] and t[n] and check whether the sign change is stable (no flip-flopping) over, say, 5 or 10 samples.
To get a robust detection you will need to play with your data. A lot.
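Here is a minimal sketch of the differencing idea above, assuming evenly sampled viewer counts in a NumPy array; the function name `detect_dips` and the `stable_window` parameter are illustrative, not from the original comment.

```python
import numpy as np

def detect_dips(viewers, stable_window=5):
    """Return indices of detected dips (local minima) in `viewers`.

    A dip is flagged when the first difference flips from negative
    (falling) to positive (rising) and the positive sign holds for
    `stable_window` consecutive samples (no flip-flopping).
    Assumes `viewers` is evenly sampled in time.
    """
    diff = np.diff(viewers)          # viewers[t+1] - viewers[t]
    sign = np.sign(diff)
    dips = []
    for i in range(1, len(sign) - stable_window + 1):
        # falling just before i, rising at i, and rising stays stable
        if sign[i - 1] < 0 and sign[i] > 0 and np.all(sign[i:i + stable_window] > 0):
            dips.append(i)           # index into `viewers` of the local minimum
    return dips

# Toy example with synthetic data
viewers = np.array([100, 90, 80, 70, 75, 80, 85, 90, 95, 100])
print(detect_dips(viewers, stable_window=3))   # -> [3]  (viewers[3] == 70)
```

The `stable_window` value (5 or 10 in the comment) is exactly the knob you would tune by playing with your data: too small and noise triggers false dips, too large and short-lived dips are missed.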