Challenges and Insights for Deploying Large Models on Edge with MNN

The talk presents an overview of the MNN inference engine, outlines the end‑to‑end workflow for deploying large language models on mobile devices, discusses technical challenges and practical solutions, and concludes with future directions for edge AI deployment.

DataFunSummit

Speaker: Wang Zhaode, Technical Expert at Taobao Group. He holds a master's degree from the Institute of Computing Technology, Chinese Academy of Sciences, and works on the MNN team, where he is responsible for framework architecture design and performance optimization.

Talk Title: Challenges and Considerations for Deploying Large Models on Edge with MNN

Outline:

1. Overview of the MNN inference engine

2. End‑to‑end workflow for large‑model deployment on the device side

3. Technical challenges and practical experiences in deploying large models on edge devices

4. Summary and outlook

Audience Benefits:

1. Understanding of the mobile inference framework MNN

2. Knowledge of the deployment process for LLMs on edge devices

3. Practical experience with LLM deployment in on‑device inference frameworks

Free access to the talk is available via the QR code provided.

Tags: AI, Mobile AI, Large Models, MNN, Edge Deployment, Inference Engine
Written by

DataFunSummit

Official account of the DataFun community, dedicated to sharing big data and AI industry summit news and speaker talks, with regular downloadable resource packs.
