CORE: Toward Ubiquitous 6G Intelligence Through Collaborative Orchestration of Large Language Model Agents Over Hierarchical Edge
- CORE framework orchestrates collaborative LLM agents across 6G mobile devices and hierarchical edge servers
- New role affinity scheduling algorithm matches specific AI functional roles with distributed computing resources
- System achieves higher efficiency and task completion rates in real-world edge-computing platform evaluations
The shift toward 6G connectivity promises a world of ubiquitous intelligence, but the heavy computational demands of Large Language Models often clash with the limited, fragmented resources found at the network's edge. To bridge this gap, researchers have introduced CORE, a framework designed to orchestrate a team of AI agents across a tiered network of mobile devices and servers. Instead of forcing a single device to carry the entire load, CORE breaks tasks down by assigning specific functional roles to different models, allowing them to work in harmony.
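This role-based decomposition can be pictured as a simple pipeline in which each stage is handled by a separate model instance. A minimal Python sketch follows; the role names and stub handlers are illustrative assumptions, not details taken from the paper:

```python
# Illustrative sketch of role-based task decomposition.
# Role names (planner, reasoner, summarizer) are assumed for illustration;
# in a real deployment each handler would be an LLM call on some device.

from typing import Callable, Dict, List

# Each "agent" is modeled as a function mapping intermediate text to new text.
RoleHandler = Callable[[str], str]

def run_pipeline(task: str, roles: List[str],
                 handlers: Dict[str, RoleHandler]) -> str:
    """Pass the task through each functional role in order."""
    result = task
    for role in roles:
        result = handlers[role](result)
    return result

# Stub handlers standing in for LLM invocations on different devices.
handlers: Dict[str, RoleHandler] = {
    "planner":    lambda t: f"plan({t})",
    "reasoner":   lambda t: f"reason({t})",
    "summarizer": lambda t: f"summary({t})",
}

output = run_pipeline("user query", ["planner", "reasoner", "summarizer"], handlers)
print(output)  # summary(reason(plan(user query)))
```

Because the stages only exchange text, each role can run on whichever device the orchestrator chooses, which is what makes the collaborative split practical.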
The system relies on three specialized modules to keep this collaboration seamless: real-time perception of the environment, dynamic orchestration of agent roles, and pipeline-parallel execution to speed up inference. A standout feature is the role affinity scheduling algorithm, which intelligently maps these specific AI roles to the most suitable hardware available. This ensures that a complex reasoning task isn't bottlenecked by a low-power device when a more robust edge server is accessible nearby.
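The summary does not spell out how role affinity scheduling works internally, but the core idea of matching roles to hardware can be sketched as a greedy assignment over an affinity score. The node attributes, role requirements, and scoring rule below are assumptions for illustration:

```python
# Hypothetical sketch of role-affinity scheduling: greedily assign each
# AI functional role to the available node whose resources best fit it.
# All attributes and the scoring function are illustrative assumptions.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Node:
    name: str
    flops: float       # relative compute capacity
    memory_gb: float
    latency_ms: float  # network latency from the task source

@dataclass
class Role:
    name: str
    min_flops: float
    min_memory_gb: float

def affinity(role: Role, node: Node) -> float:
    """Higher is better; -inf means the node cannot host the role."""
    if node.flops < role.min_flops or node.memory_gb < role.min_memory_gb:
        return float("-inf")
    # Prefer nodes with compute headroom and low latency.
    return (node.flops - role.min_flops) - 0.1 * node.latency_ms

def schedule(roles: List[Role], nodes: List[Node]) -> Dict[str, str]:
    """Greedy one-role-per-node assignment by best affinity."""
    assignment: Dict[str, str] = {}
    free = {n.name for n in nodes}
    for role in roles:
        best = max((n for n in nodes if n.name in free),
                   key=lambda n: affinity(role, n))
        if affinity(role, best) == float("-inf"):
            raise RuntimeError(f"no feasible node for {role.name}")
        assignment[role.name] = best.name
        free.remove(best.name)
    return assignment

nodes = [
    Node("phone", flops=1.0, memory_gb=6, latency_ms=1),
    Node("edge-server", flops=8.0, memory_gb=48, latency_ms=10),
]
roles = [
    Role("reasoner", min_flops=4.0, min_memory_gb=24),   # heavy reasoning
    Role("perceiver", min_flops=0.5, min_memory_gb=2),   # light sensing
]
print(schedule(roles, nodes))
# {'reasoner': 'edge-server', 'perceiver': 'phone'}
```

The feasibility check is what prevents the bottleneck the text describes: a heavy reasoning role is never pinned to a low-power device when a capable edge server is reachable.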
In practical tests and 6G application scenarios, the CORE framework delivered markedly higher system efficiency and task completion rates. Beyond simulation, the team deployed the architecture on a real-world edge-computing platform, showing that distributed AI agents can sustain robust performance even in unpredictable operational environments. This move toward collaborative, decentralized intelligence marks a crucial step in making advanced AI truly mobile and responsive.