A quantum-like approach for text generation from knowledge graphs
De Meo P.
2023-01-01
Abstract
Recent text generation methods frequently learn node representations from graph-based data, such as knowledge graphs, via global or local aggregation. Global node representation encoding connects all nodes directly, enabling communication between distant nodes but disregarding the graph topology. Local node representation encoding captures the graph structure by considering the connections between nearby nodes, but misses long-range relations. We propose a quantum-like approach to learning better-contextualised node embeddings through a fusion model that combines both encoding strategies. In extensive experiments, our method significantly improves over state-of-the-art models on two graph-to-text datasets.
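The following is a minimal illustrative sketch of the two aggregation strategies the abstract contrasts: a global encoder where every node attends to every other node regardless of topology, and a local encoder where each node only aggregates its graph neighbours. The gated fusion at the end is a generic placeholder and does not reproduce the paper's quantum-like fusion, whose details are not given here; all function names and the toy graph are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_encode(X):
    """Global aggregation: every node attends to every other node,
    ignoring the graph topology (fully connected attention)."""
    scores = X @ X.T / np.sqrt(X.shape[1])   # pairwise similarity scores
    return softmax(scores, axis=-1) @ X      # attention-weighted mixture of all nodes

def local_encode(X, A):
    """Local aggregation: each node averages its graph neighbours
    (plus itself), so only the topology drives the update."""
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)
    return (A_hat / deg) @ X                 # mean over the local neighbourhood

def fuse(H_global, H_local):
    """Gated combination of the two encodings (illustrative stand-in
    for the paper's quantum-like fusion)."""
    gate = 1.0 / (1.0 + np.exp(-(H_global + H_local)))  # element-wise sigmoid gate
    return gate * H_global + (1.0 - gate) * H_local

# Toy knowledge graph: 4 nodes arranged in a chain 0-1-2-3.
X = np.random.default_rng(0).normal(size=(4, 8))   # node feature vectors
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)           # adjacency matrix

H = fuse(global_encode(X), local_encode(X, A))
print(H.shape)   # (4, 8): one fused embedding per node
```

In this sketch, nodes 0 and 3 influence each other only through the global encoder, while the local encoder keeps each update restricted to adjacent nodes, which is the trade-off the fusion model is meant to reconcile.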