17 Commits (3f5238fb3862d5db5cc848778def5f621f460f12)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Megvii Engine Team | 3f5238fb38 | feat(mgb/dnn): add accuracy shake checker | 4 years ago |
| Megvii Engine Team | ba2ad46e54 | feat(gopt): add deconv nchw4 int8 opt pass, add deconv nchw int8 | 4 years ago |
| Megvii Engine Team | 5d350fc843 | feat(dnn/cuda): add deconv int8 and fix cutlass conv wrapper base on modify cutlass 2.4 | 4 years ago |
| Megvii Engine Team | b04ad06f84 | refactor(megdnn): refactor matmul algo in conv backward filter | 4 years ago |
| Megvii Engine Team | 25089e520e | refactor(megdnn): refactor matmul algo in conv backward data | 4 years ago |
| Megvii Engine Team | 0d720653ac | refactor(megdnn): add default algo for convolution forward | 4 years ago |
| Megvii Engine Team | 659217acd2 | refactor(megdnn): refactor bfloat16 convbias to recursive inteface | 4 years ago |
| Megvii Engine Team | b8febaf91f | refactor(megdnn): refactor bfloat16 convolutionbackwardfilter to recursive inteface | 4 years ago |
| Megvii Engine Team | f14e0c17e7 | feat(mgb): add recursive for fastrun and megdnn test | 4 years ago |
| Megvii Engine Team | 364afec033 | chore(mge): update copyright years | 4 years ago |
| Megvii Engine Team | a1877ee0fa | refactor(dnn): refactor algo interface, use algoinfo instead of global algorithm | 4 years ago |
| Megvii Engine Team | f354724220 | fix(ci/megdnn_test/megbrain_test): split some | 5 years ago |
| Megvii Engine Team | 0293d58ade | feat(mge): add bfloat16 support | 5 years ago |
| Megvii Engine Team | 1c4a64b2af | test(megdnn): skip fp16 test if compute capability less than 60 | 5 years ago |
| luzzyzhang | 16f052e916 | fix(megdnn): change ver 60 to use cuda capability 50 | 5 years ago |
| Megvii Engine Team | f5833a5294 | fix(dnn/cuda): fix cublas matmul on sm60 | 5 years ago |
| Megvii Engine Team | f91881ffdc | MegEngine: Initial commit of MegEngine. | 5 years ago |