
Key Insights from Prof. Zhou Zhihua’s Talk on Deep Learning, Model Complexity, and the Deep Forest Method

In his presentation at the JD AI Innovation Summit, Prof. Zhou Zhihua examined why deep neural networks have succeeded, identifying three essential conditions: layer‑wise processing, internal feature transformation, and sufficient model complexity. He also highlighted their limitations, introduced the gcForest (deep forest) alternative, and stressed the need for large data, powerful hardware, training tricks, and talent to advance AI research and education.


Prof. Zhou Zhihua, newly appointed academic advisor of JD AI’s Nanjing Institute, delivered a public talk titled “Some Thoughts on Deep Learning” at the JD Artificial Intelligence Innovation Summit on April 15.

He observed that deep neural networks dominate Kaggle competitions mainly on image, video, and audio tasks, while they often underperform on mixed, discrete, or symbolic modeling problems.

He attributed the success of deep learning to three key factors:

Layer‑wise processing

Internal feature transformation

Sufficient model complexity

He concluded that meeting these three conditions does not necessarily require deep neural networks; other models can also satisfy them.

To address the shortcomings of neural networks, Zhou introduced his team’s gcForest (deep forest) method, which performs well across many task types, adapts its model complexity to the data, and does not rely on back‑propagation.

The deep forest builds on ensemble‑learning ideas, works well on many tasks (especially non‑image tasks), and automatically adjusts its depth to the size of the data.
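To make the three properties concrete, here is a minimal sketch of a gcForest‑style cascade, written as a hypothetical simplification (this is not the authors' official implementation, and `grow_cascade` and its parameters are illustrative names). Each level trains a couple of forests, appends their class‑probability outputs to the raw features (internal feature transformation), and the cascade stops growing when validation accuracy stops improving (adaptive model complexity without back‑propagation):

```python
# Hypothetical, simplified cascade-forest sketch; not the official gcForest code.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def grow_cascade(X_tr, y_tr, X_val, y_val, max_levels=5):
    levels, best_acc = [], 0.0
    feats_tr, feats_val = X_tr, X_val
    for _ in range(max_levels):
        # Each cascade level is a small ensemble of different forest types.
        forests = [RandomForestClassifier(n_estimators=50, random_state=0),
                   ExtraTreesClassifier(n_estimators=50, random_state=0)]
        for f in forests:
            f.fit(feats_tr, y_tr)
        acc = forests[0].score(feats_val, y_val)
        if acc <= best_acc:   # adaptive depth: stop when validation stalls
            break
        best_acc = acc
        levels.append(forests)
        # Layer-wise processing: augment the raw features with this
        # level's class-probability outputs for the next level.
        feats_tr = np.hstack([X_tr] + [f.predict_proba(feats_tr) for f in forests])
        feats_val = np.hstack([X_val] + [f.predict_proba(feats_val) for f in forests])
    return levels, best_acc

levels, acc = grow_cascade(X_tr, y_tr, X_val, y_val)
print(f"cascade depth: {len(levels)}, validation accuracy: {acc:.3f}")
```

The depth of the resulting cascade is chosen by the data itself, which is the point of the talk: the three conditions can be met by tree ensembles, with no gradients involved.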

He discussed why deep models have become feasible now: the availability of massive datasets, powerful computing devices (e.g., GPUs), and a wealth of training tricks.

He also warned about deep learning’s challenges, such as over‑fitting, the difficulty of transferring hyper‑parameter settings across tasks, and the resulting reproducibility issues.

Beyond technical aspects, Zhou emphasized the importance of AI talent, announcing a collaboration between Nanjing University’s AI Institute and JD to foster scientific research and talent cultivation.

Overall, the talk provided a critical perspective on deep learning’s strengths and weaknesses, presented an alternative tree‑based deep model, and highlighted the broader ecosystem needed for AI advancement.

Tags: deep learning, Neural Networks, AI Education, deep forest, gcForest, model complexity
Written by

JD Tech

Official JD technology sharing platform. All the cutting‑edge JD tech, innovative insights, and open‑source solutions you’re looking for, all in one place.
