A Prompt-Based Knowledge Graph Foundation Model for Universal In-Context Reasoning
Yuanning Cui, Zequn Sun, Wei Hu · October 16, 2024
Summary
KG-ICL is a prompt-based foundation model for universal in-context reasoning across diverse knowledge graphs (KGs). Given a query, it extracts a small-scale prompt graph and applies a unified tokenizer that maps entities and relations to tokens, enabling the model to generalize to entities and relations unseen during training. Two message passing neural networks then handle prompt encoding and KG reasoning, respectively, and the model outperforms baselines on 43 different knowledge graphs. Extracting small-scale prompt graphs also keeps the approach scalable and accommodates special KG facts. Future enhancements aim to improve scalability further through pruning and parallelization.
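To make the pipeline concrete, here is a minimal sketch of relational message passing over a small prompt graph. This is a hedged simplification, not the paper's implementation: the function name `message_passing`, the relation-specific weight matrices, and the mean-aggregation update rule are all illustrative assumptions standing in for the model's prompt-encoding network.

```python
import numpy as np

def message_passing(edges, num_nodes, dim=4, steps=2, seed=0):
    """Toy relational message passing over a prompt graph.

    edges: list of (head, relation, tail) triples with integer node ids.
    Each step, every node averages relation-transformed messages from
    its in-neighbors (a hypothetical stand-in for the paper's MPNN).
    """
    rng = np.random.default_rng(seed)
    relations = sorted({r for _, r, _ in edges})
    # one random projection per relation (illustrative, untrained)
    rel_w = {r: rng.normal(size=(dim, dim)) / np.sqrt(dim) for r in relations}
    h = rng.normal(size=(num_nodes, dim))  # initial node states
    for _ in range(steps):
        msg = np.zeros_like(h)
        count = np.zeros(num_nodes)
        for u, r, v in edges:
            msg[v] += h[u] @ rel_w[r]  # relation-specific transform
            count[v] += 1
        mask = count > 0
        # average incoming messages, then apply a nonlinearity
        h[mask] = np.tanh(msg[mask] / count[mask, None])
    return h

# toy prompt graph: 3 entities connected by 2 relations
edges = [(0, "r1", 1), (1, "r2", 2), (0, "r2", 2)]
states = message_passing(edges, num_nodes=3)
```

In the full model, the resulting node states would feed a second network for KG reasoning; here they simply illustrate how token-level states propagate along typed edges of the prompt graph.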
Advanced features