[Bug] Error in compiling model after applying LazyGradientInit optimization #16869

Open

Jupiterghy opened this issue Apr 11, 2024 · 0 comments

Labels: needs-triage, type: bug
When compiling the model after applying the LazyGradientInit optimization pass in TVM, the following error is encountered:

tvm.error.InternalError: Check failed: (it != type_definitions.end()) is false: There is no definition of I.GlobalTypeVar("GradCell", "AdtHandle"). 

Additionally, an assertion error is triggered: the module produced by applying the pass once is not structurally equal to the module produced by applying it twice, indicating that the pass is not idempotent.
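The failing check is the module-level lookup of the GradCell ADT that LazyGradientInit introduces. As a quick diagnostic (a sketch, not from the report: it assumes the module_once variable from the script below and the public IRModule bindings get_global_type_var and indexing), one can check whether the GradCell definition is actually registered in the transformed module:

# Hypothetical diagnostic: check whether the GradCell ADT definition
# is registered in the module produced by LazyGradientInit.
gtv = module_once.get_global_type_var("GradCell")  # raises if the name is unbound
print(module_once[gtv])                            # the TypeData for GradCell, if present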

Actual behavior

Traceback information:

tvm.error.InternalError: Traceback (most recent call last):
  22: _ZN3tvm7runtime13PackedFun
  21: tvm::runtime::TypedPackedFunc<tvm::IRModule (tvm::RelayExpr const&, tvm::runtime::Map<tvm::GlobalVar, tvm::BaseFunc, void, void> const&, tvm::runtime::Map<tvm::GlobalTypeVar, tvm::TypeData, void, void> const&)>::AssignTypedLambda<tvm::IRModule (*)(tvm::RelayExpr const&, tvm::runtime::Map<tvm::GlobalVar, tvm::BaseFunc, void, void> const&, tvm::runtime::Map<tvm::GlobalTypeVar, tvm::TypeData, void, void> const&)>(tvm::IRModule (*)(tvm::RelayExpr const&, tvm::runtime::Map<tvm::GlobalVar, tvm::BaseFunc, void, void> const&, tvm::runtime::Map<tvm::GlobalTypeVar, tvm::TypeData, void, void> const&), std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}::operator()(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) const
  20: tvm::IRModule::FromExpr(tvm::RelayExpr const&, tvm::runtime::Map<tvm::GlobalVar, tvm::BaseFunc, void, void> const&, tvm::runtime::Map<tvm::GlobalTypeVar, tvm::TypeData, void, void> const&)
  19: tvm::IRModule::FromExprInContext(tvm::RelayExpr const&, tvm::runtime::Map<tvm::GlobalVar, tvm::BaseFunc, void, void> const&, tvm::runtime::Map<tvm::GlobalTypeVar, tvm::TypeData, void, void> const&, std::unordered_set<tvm::runtime::String, std::hash<tvm::runtime::String>, std::equal_to<tvm::runtime::String>, std::allocator<tvm::runtime::String> >)
  18: tvm::IRModuleNode::Add(tvm::GlobalVar const&, tvm::BaseFunc const&, bool)
  17: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<void (tvm::IRModule const&, tvm::BaseFunc const&)>::AssignTypedLambda<tvm::relay::{lambda(tvm::IRModule const&, tvm::BaseFunc const&)#3}>(tvm::relay::{lambda(tvm::IRModule const&, tvm::BaseFunc const&)#3}, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  16: tvm::relay::FreeTypeVars(tvm::RelayExpr const&, tvm::IRModule const&)
  15: tvm::relay::MixedModeVisitor::VisitExpr(tvm::RelayExpr const&)
  14: tvm::relay::MixedModeVisitor::VisitLeaf(tvm::RelayExpr const&)
  13: tvm::relay::TypeVarEVisitor::VisitExpr_(tvm::relay::FunctionNode const*)
  12: tvm::relay::ExprVisitor::VisitExpr_(tvm::relay::FunctionNode const*)
  11: tvm::relay::MixedModeVisitor::VisitExpr(tvm::RelayExpr const&)
  10: tvm::relay::MixedModeVisitor::VisitLeaf(tvm::RelayExpr const&)
  9: tvm::relay::TypeVarEVisitor::VisitExpr_(tvm::relay::LetNode const*)
  8: tvm::relay::ExpandANormalForm(tvm::relay::LetNode const*, std::function<void (tvm::relay::LetNode const*)>, std::function<void (tvm::relay::LetNode const*)>)
  7: tvm::relay::MixedModeVisitor::VisitExpr(tvm::RelayExpr const&)
  6: tvm::relay::MixedModeVisitor::VisitLeaf(tvm::RelayExpr const&)
  5: tvm::relay::TypeVarEVisitor::VisitExpr_(tvm::relay::FunctionNode const*)
  4: tvm::relay::ExprVisitor::VisitExpr_(tvm::relay::FunctionNode const*)
  3: tvm::relay::MixedModeVisitor::VisitExpr(tvm::RelayExpr const&)
  2: tvm::relay::MixedModeVisitor::VisitLeaf(tvm::RelayExpr const&)
  1: tvm::relay::TypeVarEVisitor::VisitExpr_(tvm::ConstructorNode const*)
  0: tvm::IRModuleNode::LookupTypeDef(tvm::GlobalTypeVar const&) const
  File "/home/shenqingchao/software/tvm/src/ir/module.cc", line 285
InternalError: Check failed: (it != type_definitions.end()) is false: There is no definition of I.GlobalTypeVar("GradCell", "AdtHandle")
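Reading the trace bottom-up, LookupTypeDef fails while create_executor rebuilds a module from the main function via IRModule::FromExpr, which only carries over the type definitions it is explicitly handed. A minimal sketch of that behaviour (hypothetical and unrelated to the model above, using the public Python binding tvm.IRModule.from_expr):

# Hypothetical illustration: rebuilding a module from a bare expression
# starts with an empty type table unless type_defs is passed explicitly.
import tvm
from tvm import relay

f = relay.Function([], relay.const(1.0))
rebuilt = tvm.IRModule.from_expr(f)    # no type_defs argument
print(rebuilt.get_global_type_vars())  # -> [] (no ADT definitions carried over)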

Environment

  • Operating System: Ubuntu 18.04.5
  • TVM version: 0.15.dev0
  • ONNX: 1.15.0

Steps to reproduce

  1. Download the ONNX model
  2. Execute the script:
import onnx
import tvm
from tvm import relay
import numpy as np

def compile_onnx(mod, params, inputs):
    # Type-check the module, then build and run it with the graph executor on CPU.
    mod = relay.transform.InferType()(mod)
    exec_mod = 'graph'
    target = 'llvm'
    ctx = tvm.cpu(0)

    with tvm.transform.PassContext(opt_level=0):
        executor = relay.build_module.create_executor(
            exec_mod, mod, ctx, target, params
        ).evaluate()
    output = executor(**inputs)
    # Normalize the result to a list of NumPy arrays.
    if isinstance(output, (tvm.runtime.container.ADT, list)):
        output = [r.numpy() for r in output]
    elif output is not None:
        output = [output.numpy()]
    return output


if __name__ == "__main__":
    onnx_file = "model.onnx"
    onnx_model = onnx.load(onnx_file)

    # Single input 'v2_0'.
    shape_dict = {'v2_0': []}
    inputs = {'v2_0': np.array([5.8400183], dtype=np.float32)}
    mod, params = relay.frontend.from_onnx(onnx_model, shape_dict, freeze_params=True)

    # Applying LazyGradientInit once and compiling triggers the InternalError above;
    # applying it a second time yields a module that is not structurally equal to
    # the single-application result, so the assertion below fails.
    opt = tvm.relay.transform.LazyGradientInit()
    module_once = opt(mod)
    res_once = compile_onnx(module_once, params, inputs)
    module_multiple = opt(module_once)
    assert tvm.ir.structural_equal(module_once, module_multiple)
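To locate where the two modules diverge once the assertion fires, a textual diff of their printed forms can help (a sketch, assuming module_once and module_multiple are in scope from the script above):

# Hypothetical: diff the textual forms of the two modules to locate
# the structural divergence behind the failed assertion.
import difflib

for line in difflib.unified_diff(
    module_once.astext().splitlines(),
    module_multiple.astext().splitlines(),
    lineterm="",
):
    print(line)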

Triage

  • needs-triage
Jupiterghy added the needs-triage and type: bug labels on Apr 11, 2024