
JD Digital Science Unveils Fast Secure Federated Learning Framework and Two Industry‑First Techniques

JD Digital Science introduced its fast secure federated learning framework, highlighted two pioneering technologies—a kernel‑based nonlinear federated learning algorithm and a distributed fast homomorphic encryption method—both accepted at KDD 2020, and discussed their industrial applications, privacy benefits, and regulatory relevance.

JD Tech Talk

Recently, Bo Liefeng, chief scientist of JD Digital Science Group's AI Lab, disclosed JD's federated learning strategy and introduced its fast secure federated learning framework, sharing two industry‑first breakthroughs: a kernel‑based nonlinear federated learning algorithm and a distributed fast homomorphic encryption technique, both accepted at KDD 2020.

At an AI finance open class hosted by the Hong Kong AI & Robotics Society, Bo noted that data‑privacy regulations such as the EU's GDPR and China's draft Data Security Management Measures pose challenges for AI deployment, and that federated learning has emerged to address privacy and security at the source.

Federated learning lets AI systems jointly use data while satisfying privacy, security, and regulatory requirements: multiple enterprises train a shared model without exchanging raw data.
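The basic idea can be illustrated with a minimal FedAvg‑style sketch. This is a generic illustration, not JD's actual protocol (which the article does not detail): each client trains a linear model locally, and only the resulting weights, never the raw samples, reach the aggregating server.

```python
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    # One client's local training: plain gradient descent on a
    # least-squares loss. The raw data (X, y) never leaves the client.
    w = w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(w_global, clients):
    # The server aggregates only locally trained weights,
    # weighted by each client's sample count (FedAvg-style).
    total = sum(len(y) for _, y in clients)
    return sum(len(y) / total * local_update(w_global, X, y)
               for X, y in clients)

# Synthetic demo: three clients hold disjoint shards of the same
# underlying linear relationship.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(30):
    w = federated_round(w, clients)
# w now approximates true_w, yet no client ever shared its data.
```

Real systems add secure aggregation and encryption on top of this exchange, which is exactly where the techniques described below come in.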

JD Digital Science has applied federated learning to face‑recognition scenarios, achieving a 99.96% pass rate at a false‑positive rate of 1e‑5 and a 99.99% pass rate at 1e‑4, demonstrating the model gains obtained through multi‑party data collaboration.

Industry challenges include gradient‑based inference attacks, efficiency loss from encrypting gradients, and resources wasted by synchronous updates. JD's fast secure framework addresses these with three features: exchanging perturbed intermediate values rather than raw gradients, a centralized data‑exchange design that does not depend on individual participants, and an asynchronous computation model that substantially speeds up training.
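Perturbing exchanged intermediate values is commonly done with a clip‑and‑noise step in the style of differential privacy. The sketch below is a generic illustration, not JD's published mechanism; `clip_norm` and `noise_std` are hypothetical parameters.

```python
import numpy as np

rng = np.random.default_rng(42)

def perturb_gradient(grad, clip_norm=1.0, noise_std=0.1, rng=rng):
    # Clip the gradient to bound its norm, then add Gaussian noise
    # before sharing, so the exchanged intermediate value no longer
    # reveals the exact gradient (and hence less about the raw data).
    norm = np.linalg.norm(grad)
    if norm > clip_norm:
        grad = grad * (clip_norm / norm)
    return grad + rng.normal(scale=noise_std, size=grad.shape)

g = np.array([3.0, 4.0])      # raw local gradient, norm 5
g_shared = perturb_gradient(g)  # what actually crosses the wire
```

Clipping bounds any single participant's influence, and the noise scale trades off privacy against the accuracy of the aggregated update.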

The framework incorporates the kernel‑based nonlinear federated learning algorithm, which uses doubly stochastic gradient descent to avoid transmitting raw samples or gradients while improving training speed, and a distributed fast homomorphic encryption technique for large‑scale secure computation; both contributions were accepted as papers at KDD 2020.
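Doubly stochastic gradient methods for kernel machines build on random Fourier features, which replace an RBF kernel with an explicit finite‑dimensional feature map, so a linear model on the features behaves like a kernel machine without materializing a kernel matrix or exchanging raw samples. Below is a minimal sketch of the random‑feature approximation only; it is illustrative and not the KDD 2020 algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(1)

def rff_features(X, W, b):
    # Random Fourier features: z(x) = sqrt(2/D) * cos(W @ x + b)
    # approximates the RBF kernel k(x, x') = exp(-||x - x'||^2 / 2),
    # since E[2 cos(w.x + b) cos(w.x' + b)] equals the kernel when
    # w ~ N(0, I) and b ~ Uniform(0, 2*pi).
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

d, D = 3, 5000
W = rng.normal(size=(D, d))             # random frequencies for sigma = 1
b = rng.uniform(0, 2 * np.pi, size=D)   # random phases

x1, x2 = rng.normal(size=d), rng.normal(size=d)
approx = (rff_features(x1[None], W, b) @ rff_features(x2[None], W, b).T).item()
exact = np.exp(-np.linalg.norm(x1 - x2) ** 2 / 2)
# approx is close to exact; the error shrinks as O(1 / sqrt(D)).
```

Doubly stochastic gradient descent goes further by sampling both the training point and the random feature at each step, which is what makes the kernel model trainable at scale without shipping data between parties.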

AI is a key pillar of new infrastructure; JD Digital Science’s Industrial AI Center consolidates multiple AI labs to industrialize frontier AI technologies such as federated learning and to provide AI capabilities as infrastructure for digital transformation.

Federated Learning, AI infrastructure, homomorphic encryption, KDD 2020, kernel methods, secure AI
Written by

JD Tech Talk

Official JD Tech public account delivering best practices and technology innovation.
