Submitted by GPUaccelerated t3_yf5jm3 in deeplearning
sckuzzle t1_iu2aa7o wrote
We use models to control things in real time. We need to be able to predict what is going to happen in 5 or 15 minutes and proactively take actions NOW. If it takes 5 minutes to predict what is going to happen 5 minutes in the future, the model is useless.
So yes. We care about speed. The faster it runs, the more we can include in the model (making it more accurate).
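(A minimal sketch of the timing constraint described above, assuming a generic control loop: the model forecasts some horizon ahead, but inference has to finish well inside the control period or the action arrives too late. The constants, placeholder model, and callback names are illustrative, not from the thread.)

```python
import time

HORIZON_S = 5 * 60        # how far ahead we predict (e.g. 5 minutes)
CONTROL_PERIOD_S = 30     # how often an action must be issued (illustrative)

def predict_future_state(current_state, horizon_s):
    """Stand-in for a learned model; must run in far less than CONTROL_PERIOD_S."""
    return current_state  # placeholder forecast

def control_loop(read_sensors, apply_action, choose_action):
    """Run prediction + decision once per control period, in real time."""
    while True:
        t0 = time.monotonic()
        state = read_sensors()
        forecast = predict_future_state(state, HORIZON_S)
        apply_action(choose_action(state, forecast))

        elapsed = time.monotonic() - t0
        if elapsed > CONTROL_PERIOD_S:
            # Inference + decision took longer than the control period:
            # the action arrives too late to be useful.
            raise RuntimeError(f"Missed real-time deadline by {elapsed - CONTROL_PERIOD_S:.1f}s")
        time.sleep(CONTROL_PERIOD_S - elapsed)
```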
GPUaccelerated OP t1_iu4w6oh wrote
The perspective of your use case makes so much sense. I appreciate you sharing that info!
Mind sharing which use case that would be? I'm also trying to pinpoint which industries care about model speed.
sckuzzle t1_iu5mxmx wrote
This kind of thing is likely applicable to digital twins in many fields. The idea is to create a digital representation of whatever you are trying to model and run it alongside the real thing. It has applications in control engineering and predictive/prescriptive analytics. Depending on the application, this could be done in many ways (not necessarily using neural nets at all) and be fast or slow to run.
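(A rough sketch of the "run it alongside the real thing" idea, assuming a toy plant with a single temperature state: the twin is periodically re-synchronised with measurements and then rolled forward to produce a look-ahead forecast. The `DigitalTwin` class, its toy dynamics, and the parameter names are illustrative assumptions, not from the thread; the model inside could just as well be a neural net or none at all.)

```python
from dataclasses import dataclass

@dataclass
class TwinState:
    temperature: float  # example state variable of the real system

class DigitalTwin:
    def __init__(self, state: TwinState, cooling_rate: float = 0.01):
        self.state = state
        self.cooling_rate = cooling_rate  # placeholder model parameter

    def sync(self, measured_temperature: float) -> None:
        """Correct the twin with the latest measurement from the real system."""
        self.state.temperature = measured_temperature

    def step(self, dt_s: float) -> None:
        """Advance the twin's internal model by dt_s seconds (toy dynamics)."""
        self.state.temperature -= self.cooling_rate * self.state.temperature * dt_s

    def forecast(self, horizon_s: float, dt_s: float = 1.0) -> float:
        """Roll a copy of the twin forward to predict the state horizon_s from now."""
        preview = DigitalTwin(TwinState(self.state.temperature), self.cooling_rate)
        t = 0.0
        while t < horizon_s:
            preview.step(dt_s)
            t += dt_s
        return preview.state.temperature

# Example: sync with a fresh reading, then ask what things look like 15 minutes ahead.
twin = DigitalTwin(TwinState(temperature=80.0))
twin.sync(measured_temperature=78.5)
print(twin.forecast(horizon_s=15 * 60))
```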
GPUaccelerated OP t1_iuilp7x wrote
Got it! Thank you.