# MegEngine

<p align="center">
  <img width="202" height="118" src="logo.svg">
</p>
<h3> <a href="https://www.megengine.org.cn/doc/stable/en/user-guide/index.html"> Documentation </a> | <a href="https://www.megengine.org.cn/doc/stable/zh/user-guide/index.html"> 中文文档 </a> </h3>

[README (Chinese)](README_CN.md) | [Homepage](https://megengine.org.cn/) | [License](LICENSE) | [QQ Group](https://jq.qq.com/?_wv=1027&k=jJcBU1xi) | [Zhihu](https://www.zhihu.com/people/megengine-bot)

MegEngine is a fast, scalable, and easy-to-use deep learning framework with three key features.

* **Unified core for both training and inference**
  * You can represent quantization, dynamic shapes, image pre-processing, and even derivation in one model.
  * After training, just put everything into your model and run inference on any platform with ease. Speed and precision problems won't bother you anymore, because the same core runs everywhere. Check the usage [here](https://www.megengine.org.cn/doc/stable/zh/user-guide/model-development/traced_module/index.html).
* **Lowest hardware requirements, achieved with algorithms**
  * During training, GPU memory usage can drop to one third at the cost of only one additional line, which enables the [DTR algorithm](https://www.megengine.org.cn/doc/stable/zh/user-guide/model-development/dtr/index.html) (see the sketch after this list).
  * Get the lowest memory usage when running inference by leveraging our unique pushdown memory planner.
* **Efficient inference on all platforms**
  * Fast, high-precision inference on x86/Arm/CUDA/ROCm.
  * Supports Linux/Windows/iOS/Android/TEE...
  * Save more memory and optimize speed by leveraging [advanced usage](https://www.megengine.org.cn/doc/stable/zh/user-guide/deployment/lite/advance/index.html).

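To make "one additional line" concrete, here is a minimal sketch of enabling DTR in a training script. The `mge.dtr` calls and the threshold value follow the linked DTR guide, but treat them as assumptions to be checked against your MegEngine version:

```python
import megengine as mge

# Start evicting and rematerializing tensors once GPU memory usage
# crosses this threshold (illustrative value, not a recommendation).
mge.dtr.eviction_threshold = "5GB"

# The single extra line that turns DTR on; the model definition and
# training loop stay unchanged.
mge.dtr.enable()
```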

------

## Installation

**NOTE:** MegEngine now supports Python installation on Linux-64bit/Windows-64bit/macOS(CPU-Only)-10.14+/Android 7+(CPU-Only) platforms with Python 3.5 to 3.8. On Windows 10 you can either install the Linux distribution through [Windows Subsystem for Linux (WSL)](https://docs.microsoft.com/en-us/windows/wsl) or install the Windows distribution directly. Many other platforms are supported for inference.

### Binaries

To install the pre-built binaries via pip wheels:

```bash
python3 -m pip install --upgrade pip
python3 -m pip install megengine -f https://megengine.org.cn/whl/mge.html
```
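
As a quick sanity check after installing the wheel, you can import the package and run a small op. This snippet is a suggested smoke test, not part of the official guide:

```python
import megengine as mge
import megengine.functional as F

print(mge.__version__)            # the installed MegEngine version

x = mge.tensor([1.0, 2.0, 3.0])   # a small tensor on the default device
print(F.relu(x - 2.0).numpy())    # expected output: [0. 0. 1.]
```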

## Building from Source

* For CMake build details, please refer to [BUILD_README.md](scripts/cmake-build/BUILD_README.md).
* For Python binding build details, please refer to [BUILD_PYTHON_WHL_README.md](scripts/whl/BUILD_PYTHON_WHL_README.md).

## How to Contribute

* MegEngine adopts the [Contributor Covenant](https://contributor-covenant.org) as a guideline to run our community. Please read the [Code of Conduct](CODE_OF_CONDUCT.md).
* Every contributor of MegEngine must sign a [Contributor License Agreement (CLA)](CONTRIBUTOR_LICENSE_AGREEMENT.md) to clarify the intellectual property license granted with the contributions.
* You can help improve MegEngine in many ways:
  * Write code.
  * Improve [documentation](https://github.com/MegEngine/Docs).
  * Answer questions on the [MegEngine Forum](https://discuss.megengine.org.cn) or Stack Overflow.
  * Contribute new models to the [MegEngine Model Hub](https://github.com/megengine/hub).
  * Try a new idea on [MegStudio](https://studio.brainpp.com).
  * Report or investigate [bugs and issues](https://github.com/MegEngine/MegEngine/issues).
  * Review [Pull Requests](https://github.com/MegEngine/MegEngine/pulls).
  * Star the MegEngine repo.
  * Cite MegEngine in your papers and articles.
  * Recommend MegEngine to your friends.
  * Any other form of contribution is welcome.

We strive to build an open and friendly community. We aim to power humanity with AI.

## How to Contact Us

* Issues: [github.com/MegEngine/MegEngine/issues](https://github.com/MegEngine/MegEngine/issues)
* Email: [megengine-support@megvii.com](mailto:megengine-support@megvii.com)
* Forum: [discuss.megengine.org.cn](https://discuss.megengine.org.cn)
* QQ Group: 1029741705

## Resources

- [MegEngine](https://megengine.org.cn)
- [MegStudio](https://studio.brainpp.com)
- Mirror repositories:
  - OPENI: [openi.org.cn/MegEngine](https://www.openi.org.cn/html/2020/Framework_0325/18.html)
  - Gitee: [gitee.com/MegEngine/MegEngine](https://gitee.com/MegEngine/MegEngine)

## License

MegEngine is licensed under the Apache License, Version 2.0.

## Citation

If you use MegEngine in your publication, please cite it using the following BibTeX entry.

```bibtex
@Misc{MegEngine,
  institution = {megvii},
  title = {MegEngine: A fast, scalable and easy-to-use deep learning framework},
  howpublished = {\url{https://github.com/MegEngine/MegEngine}},
  year = {2020}
}
```

Copyright (c) 2014-2021 Megvii Inc. All rights reserved.