Model formats TVM can load and compile
Published: 2021-05-04 21:02:14 · Views: 24 · Category: original article


Relay is essentially TVM's model-description IR: models from various frameworks are first converted into Relay, and then optimized further from there.

Example usage: tvm.relay.frontend.from_onnx(onnx_model) converts an ONNX model. Note that from_onnx takes an already-loaded onnx.ModelProto object (e.g. the result of onnx.load(...)), not a file path.

 

 

tvm.relay.frontend

Functions:

from_caffe(init_net, predict_net, …)
    Convert from Caffe model into compatible Relay Function.

from_caffe2(init_net, predict_net[, shape, …])
    Load a Caffe2 graph, which contains init_net and predict_net, into a Relay Function.

from_coreml(model[, shape])
    Convert from CoreML model into Relay Function.

from_darknet(net[, shape, dtype])
    Convert from Darknet's model into compatible Relay Function.

from_keras(model[, shape, layout])
    Convert Keras model to Relay Function.

from_mxnet(symbol[, shape, dtype, …])
    Convert from MXNet's model into compatible Relay Function.

from_onnx(model[, shape, dtype, opset, …])
    Convert an ONNX model into an equivalent Relay Function.

from_pytorch(script_module, input_infos[, …])
    Load a PyTorch model in the form of a scripted (TorchScript) module and convert it into Relay.

from_tensorflow(graph[, layout, shape, outputs])
    Load a TensorFlow graph (a Python tf.Graph object) into Relay.

from_tflite(model[, shape_dict, dtype_dict])
    Convert from TFLite model into compatible Relay Function.

quantize_conv_bias_mkldnn_from_var(bias_var, …)
    Quantized conv2d bias.

