Deep learning is a highly active research area with a wide range of applications. Among the remaining challenges, running deep neural networks on edge computing devices, such as mobile phones, is key to turning them into products.
This talk will introduce how to accelerate the inference of deep learning models on mobile platforms.
- Introduce open-source inference frameworks suited to mobile phones, such as TensorFlow Lite, NCNN, TNN, and MACE.
- Introduce hardware-accelerated computing on devices such as the GPU and DSP.
- Introduce how to use TensorFlow Lite to deploy deep neural networks on a mobile phone and accelerate them with hardware.
About Ida Chen
A software engineer interested in open source and accelerated computing.