Lab: Training Graph Embeddings

From info216

Revision as of 13:04, 14 April 2022

Lab 14: Training Graph Embeddings

Topics

Training knowledge graph embeddings with TorchKGE.


Classes and methods

The following TorchKGE class from the previous lab remains central:

  • Model - contains the embeddings (entity and relation vectors) for the KG

In addition, we will also use:

  • KnowledgeGraph - contains the knowledge graph (KG)

More classes will be suggested below.


Tasks

Pre-trained models:

Note: The Wikidata dataset loader returns two graphs. They are not train and test splits, but the same dataset with and without additional attributes. Start with the one without attributes. You need to split it into train/validation/test yourself using the KnowledgeGraph method split_kg().

Train your own:

  • Load the corresponding KG using a dataset loader.
  • Run the Shortest training example, but use a much lower number of epochs (for example 200).
  • Take note of the evaluation metrics and final loss, and re-run the example using different numbers of epochs. What happens when you increase the number?
  • Also run the Simplest training example. Use the documentation to make sure you have an idea of what the different parts of the algorithm do.

Train with early stopping:

  • Run the Training with Ignite example. Use the documentation to make sure you have an idea of what the different parts of the algorithm do. How do the results compare with your exploration of different epoch values?
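
Whatever framework drives the loop, the early-stopping rule itself is simple: stop once the validation score has not improved for a set number of epochs. A plain-Python sketch (the validation MRR values are invented):

```python
def early_stopping(val_scores, patience=3):
    """Return the epoch index at which training stops: the first epoch
    at which the best score has not improved for `patience` consecutive
    epochs (higher score = better)."""
    best, best_epoch = float('-inf'), 0
    for epoch, score in enumerate(val_scores):
        if score > best:
            best, best_epoch = score, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return len(val_scores) - 1

# Validation MRR improves until epoch 3, then plateaus:
# training stops 3 epochs after the last improvement.
scores = [0.10, 0.15, 0.22, 0.25, 0.24, 0.25, 0.24, 0.23]
print(early_stopping(scores))  # → 6
```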


If You Have More Time

  • Try this out on the other models supported by TorchKGE, both other TransX models and a deep model (ConvKB).
  • Try it out with different datasets, for example one you create yourself using SPARQL queries on an open KG.
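
If you go the SPARQL route, the query results need to end up in the triple format TorchKGE expects. A sketch with hand-written bindings standing in for what a SPARQL SELECT (e.g. via SPARQLWrapper) would return:

```python
import pandas as pd

# Invented example of the JSON-style bindings a SPARQL SELECT such as
# SELECT ?s ?p ?o WHERE { ?s ?p ?o } might return from an open KG.
bindings = [
    {'s': {'value': 'http://dbpedia.org/resource/Oslo'},
     'p': {'value': 'http://dbpedia.org/ontology/country'},
     'o': {'value': 'http://dbpedia.org/resource/Norway'}},
    {'s': {'value': 'http://dbpedia.org/resource/Bergen'},
     'p': {'value': 'http://dbpedia.org/ontology/country'},
     'o': {'value': 'http://dbpedia.org/resource/Norway'}},
]

# TorchKGE's KnowledgeGraph expects a DataFrame with the columns
# 'from', 'rel' and 'to'.
df = pd.DataFrame([
    {'from': b['s']['value'], 'rel': b['p']['value'], 'to': b['o']['value']}
    for b in bindings
])
print(df.shape)  # → (2, 3)
```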

Useful readings