Ant Group Unveils Dynamic Query-Aware User Modeling Framework
- Ant Group introduces Q-Anchor for dynamic, query-aware user representation
- Framework outperforms static models with 9.8% AUC gain on industrial benchmarks
- KV-cache optimization enables low-latency deployment across multiple business scenarios
Ant Group has introduced Q-Anchor, a novel framework that shifts how digital platforms understand their users. Traditional systems often rely on static embeddings—fixed mathematical representations of a user that remain the same whether recommending a product or assessing risk. By treating a natural language query as an "anchor," Q-Anchor allows Large Language Models (LLMs) to reinterpret the same user behavior logs through different lenses, creating dynamic profiles tailored to specific business needs.
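The contrast between static and query-anchored profiles can be illustrated with a minimal sketch. The article does not disclose Q-Anchor's architecture, so the attention-pooling mechanism, dimensions, and function names below are illustrative assumptions, not the published design:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def static_profile(behaviors):
    """Static embedding: one fixed vector, whatever the downstream task."""
    return behaviors.mean(axis=0)

def query_anchored_profile(behaviors, query):
    """Query-aware profile: attention re-weights the same behavior log
    depending on the query that 'anchors' the representation."""
    scores = behaviors @ query / np.sqrt(len(query))
    weights = softmax(scores)
    return weights @ behaviors

rng = np.random.default_rng(0)
behaviors = rng.normal(size=(5, 8))   # 5 logged actions, 8-dim embeddings
q_recommend = rng.normal(size=8)      # e.g. a product-recommendation query
q_risk = rng.normal(size=8)           # e.g. a risk-assessment query

p_rec = query_anchored_profile(behaviors, q_recommend)
p_risk = query_anchored_profile(behaviors, q_risk)
# Same behavior log, two different profiles under two different queries,
# while static_profile(behaviors) would return one vector for both.
```

The design point is that the user's raw logs are fixed; only the lens applied to them changes per query.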
To bridge the gap between raw behavioral data and high-level semantic understanding, the researchers developed UserU, a massive dataset featuring over 100 million multi-modal samples. This dataset combines transaction histories and app usage with AI-generated user reflections, teaching the model to perceive temporal trends and intent rather than just surface-level actions. The architecture uses a hierarchical approach, integrating diverse data types into the LLM's latent space—the internal mathematical environment where the model processes concepts.
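One common way to integrate diverse data types into a shared latent space is a learned projection per modality, after which the projected tokens form a single input stream. The modality names, dimensions, and linear projections below are illustrative stand-ins, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical feature dimensions per modality (names are illustrative):
DIMS = {"transactions": 16, "app_usage": 12, "reflection_text": 32}
LATENT = 8  # shared latent dimension of the model's input space

# One learned linear projection per modality (random stand-ins here)
projections = {m: rng.normal(size=(d, LATENT)) * 0.1 for m, d in DIMS.items()}

def to_latent(features):
    """Project each modality into the shared latent space, then
    concatenate along the sequence axis so the LLM sees one stream."""
    tokens = [features[m] @ projections[m] for m in DIMS]
    return np.concatenate(tokens, axis=0)

feats = {m: rng.normal(size=(3, d)) for m, d in DIMS.items()}  # 3 tokens each
latent_seq = to_latent(feats)
print(latent_seq.shape)  # (9, 8): 3 tokens x 3 modalities, all 8-dim
```

Once every modality lives in the same space, the LLM can attend across transaction, usage, and reflection tokens uniformly.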
Efficiency is vital for industrial deployment, so the team employed KV-cache-accelerated inference. By encoding a user's prefix behavior once and reusing it across multiple queries, the system avoids redundant computation and keeps serving latency low. Extensively tested on Alipay's systems, Q-Anchor delivered significant gains in user engagement, suggesting that adaptive, query-aware modeling is a strong path forward for personalized digital experiences.
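The prefix-reuse idea can be sketched in a few lines. This is a toy single-head attention model, assuming the published system's caching behaves analogously; the class and variable names are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
D = 8
W_k = rng.normal(size=(D, D))  # toy key projection
W_v = rng.normal(size=(D, D))  # toy value projection

class PrefixCache:
    """Encode a user's behavior prefix once; reuse its keys/values
    for every subsequent query (a sketch of KV-cache reuse)."""
    def __init__(self):
        self.store = {}
        self.encode_calls = 0

    def keys_values(self, user_id, prefix):
        if user_id not in self.store:
            self.encode_calls += 1  # the expensive pass happens only once
            self.store[user_id] = (prefix @ W_k, prefix @ W_v)
        return self.store[user_id]

def answer(cache, user_id, prefix, query):
    """Attend over the cached prefix with a fresh query vector."""
    K, V = cache.keys_values(user_id, prefix)
    scores = K @ query / np.sqrt(D)
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ V

prefix = rng.normal(size=(6, D))       # the user's behavior log
cache = PrefixCache()
for q in rng.normal(size=(4, D)):      # four different business queries
    answer(cache, "user42", prefix, q)
print(cache.encode_calls)  # 1 -- the prefix was encoded once, not four times
```

In a real transformer the cached objects are per-layer key/value tensors, but the cost structure is the same: prefix encoding amortizes across every query that shares it.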