
robertknight2 t1_j381r8e wrote

Have a look at exporting to ONNX and using ONNX Runtime or another runtime which supports that format: https://pytorch.org/docs/stable/onnx.html
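A minimal sketch of that flow, assuming a toy model and placeholder tensor shapes / file names (nothing here is specific to your setup):

```python
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort

class TinyNet(nn.Module):  # stand-in for the real model
    def forward(self, x):
        return x.relu()

model = TinyNet().eval()
dummy_input = torch.randn(1, 3, 224, 224)

# Export a traced graph of the model to a self-contained .onnx file.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    dynamic_axes={"input": {0: "batch"}},  # allow a variable batch size
)

# Run it with ONNX Runtime; the original Python source is not needed here.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
outputs = session.run(None, {"input": np.random.rand(1, 3, 224, 224).astype(np.float32)})
print(outputs[0].shape)
```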

3

robertknight2 t1_j3821me wrote

Ah wait, you said it might not be easily scriptable, so presumably not easily exportable as a graph either?

2

Atom_101 OP t1_j384ogh wrote

I haven't used ONNX before, but I have worked with TorchScript. With TorchScript I have had to change the models quite a bit to make them scriptable. If ONNX requires a similar amount of effort, I don't think it will be useful.
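To illustrate what I mean by scripting, here's a rough sketch (the toy model and file name are just placeholders, not our actual models):

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):  # toy stand-in; real models often need rewrites first
    def forward(self, x):
        # Data-dependent control flow like this survives torch.jit.script,
        # but unsupported Python constructs elsewhere are what force the
        # model changes mentioned above.
        if x.sum() > 0:
            return x.relu()
        return x

scripted = torch.jit.script(TinyNet())  # compile the model to TorchScript
scripted.save("model_scripted.pt")      # self-contained archive

# The archive can be loaded and run without the original class definition.
loaded = torch.jit.load("model_scripted.pt")
print(loaded(torch.randn(2, 3)))
```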

I don't want to go through the hassle of scripting because we might change the model architectures soon. I need a quick and possibly inefficient (space-wise, not performance-wise) way to package the models without exposing source code.

2