11 Commits (336761253deccab67eafb680f40186b30e973cf0)

Author | SHA1 | Message | Date
Megvii Engine Team | 336761253d | feat(dnn/cuda): add tensorcore matmul for fp16 data type | 3 years ago
Megvii Engine Team | ef9aa80074 | fix(mgb/dnn): fix cuda naive matmul algo | 4 years ago
Megvii Engine Team | 55974e8cf9 | feat(log): opt log | 4 years ago
Megvii Engine Team | 2de2222e46 | feat(dnn/cuda): add cutlass batched gemv kernel for matmul operator | 4 years ago
Megvii Engine Team | 973d2a0ac2 | feat(dnn/cuda): add cutlass matmul using split k parallel | 4 years ago
Megvii Engine Team | 03c921f7c4 | feat(dnn/cuda): add cutlass matmul impls | 4 years ago
Megvii Engine Team | 4a1d52c9c6 | refactor(megdnn): refactor bfloat16 matmul to recursive inteface | 4 years ago
Megvii Engine Team | 364afec033 | chore(mge): update copyright years | 4 years ago
Megvii Engine Team | a1877ee0fa | refactor(dnn): refactor algo interface, use algoinfo instead of global algorithm | 4 years ago
Megvii Engine Team | 0293d58ade | feat(mge): add bfloat16 support | 5 years ago
Megvii Engine Team | f91881ffdc | MegEngine: Initial commit of MegEngine. | 5 years ago
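
The most recent commit above ("add tensorcore matmul for fp16 data type") refers to half-precision GEMM on NVIDIA tensor cores. MegEngine's actual kernels are built on CUTLASS and the MegDNN algorithm framework (see commits 03c921f7c4 and a1877ee0fa), so the following is only a rough, standalone sketch of the underlying tensor core primitive via CUDA's WMMA API; the kernel name and launch shape are hypothetical and not taken from the repository, and it assumes M, N, K are multiples of 16 with row-major fp16 inputs and fp32 accumulation (requires sm_70 or newer).

```cuda
#include <cuda_fp16.h>
#include <mma.h>

using namespace nvcuda;

// Minimal sketch (not MegEngine code): one 32-thread block / warp computes a
// single 16x16 tile of C = A * B on tensor cores via the WMMA API.
__global__ void wmma_fp16_gemm(const half* A, const half* B, float* C,
                               int M, int N, int K) {
    // Each warp owns one 16x16 output tile, indexed by the block coordinates.
    int tile_row = blockIdx.y * 16;
    int tile_col = blockIdx.x * 16;
    if (tile_row >= M || tile_col >= N) return;

    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> c_frag;
    wmma::fill_fragment(c_frag, 0.0f);

    // March along K in steps of 16, accumulating into the fp32 fragment.
    for (int k = 0; k < K; k += 16) {
        wmma::load_matrix_sync(a_frag, A + tile_row * K + k, K);
        wmma::load_matrix_sync(b_frag, B + k * N + tile_col, N);
        wmma::mma_sync(c_frag, a_frag, b_frag, c_frag);
    }
    wmma::store_matrix_sync(C + tile_row * N + tile_col, c_frag, N,
                            wmma::mem_row_major);
}
```

A launch such as `wmma_fp16_gemm<<<dim3(N / 16, M / 16), 32>>>(A, B, C, M, N, K)` gives each warp one output tile. Production kernels like the CUTLASS-based ones referenced above tile far more aggressively through shared memory and registers, and also offer split-K parallel variants such as the one added in commit 973d2a0ac2.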