ISA: LoongArch
03-12 2024

ONNX Runtime, the renowned AI inference framework, officially supports LoongArch

Recently, the open-source community behind ONNX Runtime, the renowned AI inference framework, released official version 1.17.0, which includes support for LoongArch. Users can now take this release directly from the ONNX Runtime open-source community to develop and deploy AI inference applications on the Loongson platform, further strengthening the LoongArch AI ecosystem.

ONNX Runtime (ORT) has gained significant popularity as an AI inference framework in recent years. It serves as a foundational inference engine for numerous AI applications, running models exported from frameworks such as PyTorch, TensorFlow, and TFLite. In addition, it accommodates various computing backends, including CPU, GPU, NPU, and FPGA, as well as IoT devices.

During the development of ORT version 1.17.0, Loongson's technology team worked closely with the ONNX Runtime community. They contributed 7,697 lines of code to the community's code repository and carried out comprehensive vector optimization of core operators such as matrix multiplication, convolution, and transpose. With the community's support, the LoongArch-optimized code passed the project's quality assurance processes, including code review and test verification. As of version 1.17.0, the ONNX Runtime community officially provides native support for LoongArch.