Pack a TorchScript model

Carton can run TorchScript models without requiring Python at runtime. This allows you to run your model completely in native code.

1. Create a ScriptModule

Use torch.jit.script or torch.jit.trace to turn your model into a ScriptModule.
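As a quick illustration of the tracing path (the rest of this guide uses torch.jit.script; the function below is only illustrative):

```python
import torch

# Tracing records the operations executed on example inputs and
# produces a ScriptFunction (illustrative function, not part of Carton).
def double(x: torch.Tensor) -> torch.Tensor:
    return x * 2

traced = torch.jit.trace(double, torch.ones(3))
traced(torch.tensor([1.0, 2.0]))  # tensor([2., 4.])
```

Note that tracing only captures the code path taken for the example inputs; models with data-dependent control flow should use torch.jit.script instead.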

import torch
from typing import Dict, Any, List

class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, inputs: Dict[str, Any]):
        a = inputs["a"]

        # Needed to help torch understand types
        assert isinstance(a, torch.Tensor)
        assert isinstance(inputs["b"], str)
        assert isinstance(inputs["c"], List[str])

        return {
            "doubled": a * 2,
            "string": "A string",
            "stringlist": ["A", "list", "of", "strings"],
        }

m = torch.jit.script(Model())
torch.jit.save(m, "/path/to/my/")

Your model must take a Dict[str, Any] or Dict[str, torch.Tensor] as input and return a dict. The value types will be torch.Tensor for tensors, str for string scalars, and List[str] for 1D string tensors. String tensors > 1D are not supported.

Note that the above code saves the model to /path/to/my/ using torch.jit.save.
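You can sanity-check the input/output contract described above by calling the scripted model directly from Python. A minimal sketch, using a pared-down variant of the model above that returns only a tensor:

```python
import torch
from typing import Dict, Any, List

# A pared-down variant of the model above, returning only a tensor
class Model(torch.nn.Module):
    def forward(self, inputs: Dict[str, Any]) -> Dict[str, torch.Tensor]:
        a = inputs["a"]

        # The asserts refine `Any` to concrete types for the compiler
        assert isinstance(a, torch.Tensor)
        assert isinstance(inputs["b"], str)
        assert isinstance(inputs["c"], List[str])

        return {"doubled": a * 2}

m = torch.jit.script(Model())

# Tensors map to torch.Tensor, string scalars to str,
# and 1D string tensors to List[str]
out = m({"a": torch.tensor([1.0, 2.0]), "b": "hi", "c": ["x", "y"]})
print(out["doubled"])  # tensor([2., 4.])
```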

2. Pack the model


To pack this model, we'd do the following:

import asyncio
import cartonml as carton

async def main():
    packed_model_path = await carton.pack(
        # Path to the model from above
        "/path/to/my/",

        # This model runs with the torchscript runner
        runner_name="torchscript",

        # `required_framework_version` is a semver version range.
        # We want to run this model with Torch 2.0.x.
        # The below value means any 2.0.x version is okay.
        required_framework_version="=2.0",
    )

asyncio.run(main())

The model at packed_model_path can now be loaded from any programming language supported by Carton!


Required arguments


runner_name

The name of the runner to process this model with. For TorchScript models, this is torchscript.


required_framework_version

This is a semver version range that specifies the versions of PyTorch that the model requires.

For now, it's recommended to specify a single major and minor version. For example, =2.0, which means any 2.0.x version is okay.

See the semver documentation for more details on version ranges.
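For intuition, the statement "=2.0 means any 2.0.x version is okay" can be sketched with a hypothetical matcher (this is not Carton's implementation, just an illustration of the pinning behavior):

```python
def matches(range_spec: str, version: str) -> bool:
    # Hypothetical helper: "=X.Y" pins major.minor and allows any patch level
    assert range_spec.startswith("=")
    want = range_spec[1:].split(".")
    return version.split(".")[:len(want)] == want

print(matches("=2.0", "2.0.1"))  # True
print(matches("=2.0", "2.1.0"))  # False
```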

Other options


You can specify the following values:

  • num_interop_threads: An integer value to set the number of interop threads
  • num_threads: An integer value to set the number of intraop threads

See the TorchScript docs for more detail.
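These options correspond to PyTorch's own thread-pool settings. A sketch of the equivalent calls in plain PyTorch, for intuition only (assuming the runner applies these for you; you do not call them through Carton):

```python
import torch

# num_threads controls intraop parallelism (work within a single op)
torch.set_num_threads(2)

# num_interop_threads controls interop parallelism (running ops concurrently);
# torch.set_num_interop_threads can only be called once, before any parallel work starts
print(torch.get_num_threads())  # 2
```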


There are several other options that can be used across all model types. They are not required, but may make it easier for others to use your model.

See here for more details.