TorchScript Sample Inference Scripts

In the following pages we provide sample scripts that can be used to run TorchScript models in Python. Keep in mind that these models can also be run in C++ using the TorchScript API.
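As a starting point, the sketch below shows the general shape of such a script: load a TorchScript file with torch.jit.load and run a forward pass. The file path, input size, and the tiny model scripted here for demonstration are all illustrative, not part of any exported model; in practice you would load the .pt file you downloaded and use the input shape your model expects.

```python
import torch

# For demonstration only: script and save a tiny model so the example is
# self-contained. In practice you would skip this and load the TorchScript
# file exported from model playground (the path below is hypothetical).
class TinyNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.pool = torch.nn.AdaptiveAvgPool2d(1)
        self.fc = torch.nn.Linear(8, 2)

    def forward(self, x):
        x = self.pool(torch.relu(self.conv(x)))
        return self.fc(x.flatten(1))

torch.jit.script(TinyNet()).save("model.pt")

# --- inference, as you would do with your deployed model ---
model = torch.jit.load("model.pt", map_location="cpu")
model.eval()  # disable dropout/batch-norm training behaviour

batch = torch.rand(1, 3, 224, 224)  # NCHW float input (size is illustrative)
with torch.no_grad():
    logits = model(batch)
print(logits.shape)  # torch.Size([1, 2])
```

The same saved file can be loaded from C++ with torch::jit::load, which is why TorchScript is a convenient deployment format.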

Please also note that if you require smaller or faster models, or models made specifically for mobile devices, you may want to go back to model playground and choose a different architecture, use smaller images, or reduce the number of model parameters to optimize runtime and/or memory usage as needed.

If you notice a discrepancy between the metrics reported in model playground and those from your deployed model, the most likely cause is that you are not applying the same image transforms at inference time.

We recommend looking at the "config.yaml" file to see the transforms you used for validation/testing, and using the albumentations library, which provides almost all of them.

Please note that if you are deploying to an environment where albumentations is not available, you will have to replicate or implement these transforms yourself. You can read about using and building the transformations on the corresponding page.
