Fudan University's MOSS: China's First Conversational Large Language Model
Fudan University's Natural Language Processing Lab introduced MOSS, the country's first conversational large language model capable of dialogue generation, programming, factual QA and ethical reasoning, with plans for open‑source release despite current limitations in Chinese language proficiency.
Since the rise of ChatGPT, many tech companies have launched competing products, and in China Fudan University's Natural Language Processing Laboratory has released the nation's first conversational large language model, MOSS.
MOSS originates from the team of Professor Qiu Xipeng at Fudan University's School of Computer Science and Technology, and its name references the AI in the film "The Wandering Earth." The public testing address was https://moss.fastnlp.top/ (service has been paused for upgrades).
MOSS can perform dialogue generation, programming, factual question answering and a range of other tasks, covering the full technical pipeline through which a generative language model learns to understand human intent and hold conversations. The team plans to open-source the model in the future.
According to First Financial, Professor Qiu Xipeng stated at the 2023 World AI Developers Pioneer Conference that, if optimization proceeds smoothly, they plan to open‑source MOSS by the end of March.
The team also highlighted that MOSS’s current main shortcoming is insufficient Chinese proficiency, largely due to noisy web data such as advertisements, making data cleaning difficult. Demonstrations showed multi‑turn interaction, table generation, code generation and explanation, as well as ethical judgment and legal knowledge, with value‑aligned responses to harmful prompts.
Source: compiled from online reports; if any content infringes your rights, please contact the editor for removal.