Lab: Training Graph Embeddings
Revision as of 14:18, 14 April 2022 by Sinoa
Lab 14: Training Graph Embeddings
Training knowledge graph embeddings with TorchKGE.
Classes and methods
The following TorchKGE classes from the previous lab remain central:
- KG - holds the knowledge graph itself (its entities, relations, and triples)
- Model - contains the embeddings (entity and relation vectors) for the KG
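To make the roles of these two classes concrete: TransE represents each entity and relation as a vector and scores a triple (h, r, t) by the distance ||h + r - t||, where low energy means the triple is plausible. A minimal pure-Python sketch with toy, hand-picked 3-dimensional embeddings (the names and values are illustrative, not taken from any real model):

```python
import math

def transe_energy(h, r, t):
    """TransE energy of a triple (h, r, t): the distance ||h + r - t||.
    Low energy means the model considers the triple plausible."""
    return math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

# Toy 3-dimensional embeddings (hypothetical values, chosen by hand).
paris = [1.0, 0.0, 0.0]
france = [1.0, 1.0, 0.0]
capital_of = [0.0, 1.0, 0.0]  # relation vector

good = transe_energy(paris, capital_of, france)  # h + r lands exactly on t
bad = transe_energy(france, capital_of, paris)   # mismatched triple
print(good, bad)  # the plausible triple gets the lower energy
```

In TorchKGE the same computation runs over batched PyTorch tensors stored inside the model object, but the geometry is exactly this.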
Evaluate a pre-trained model:
- Choose a KG and a TransE model to work with; pick a combination that has a pre-trained model available.
- Load the pre-trained model (you do not need the KG yet) and evaluate it using the examples given here: https://torchkge.readthedocs.io/en/latest/tutorials/evaluation.html .
- Extra: You can also evaluate the model on relation prediction but, given the way TransE is pre-trained, it performs poorly on this task.
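The linked evaluation tutorial uses TorchKGE's LinkPredictionEvaluator, which ranks the true tail (or head) of each test triple against every other entity and reports metrics such as mean rank and hit@k. The sketch below shows what those metrics mean, using a toy, made-up set of candidate energies rather than a real model:

```python
def rank_of_true_tail(scores, true_tail):
    """Rank of the true tail among all candidates, where 1 is best.
    Lower energy is better, so count candidates scored strictly better."""
    true_score = scores[true_tail]
    return 1 + sum(1 for s in scores.values() if s < true_score)

# Toy energies for one test triple (h, r, ?): entity -> energy (made up).
scores = {"france": 0.1, "germany": 0.7, "spain": 0.4}
rank = rank_of_true_tail(scores, "france")

# Aggregate metrics over a (toy) set of ranks, one per test triple.
ranks = [rank, 2, 1, 5]
mean_rank = sum(ranks) / len(ranks)
hit_at_3 = sum(1 for x in ranks if x <= 3) / len(ranks)
print(rank, mean_rank, hit_at_3)
```

A good model drives the mean rank down and hit@k up; these are the numbers to record in the exercises below.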
Train your own:
- Load the corresponding KG using a dataset loader.
- Run the Shortest training example, but use a much lower number of epochs (for example 200).
- Take note of the evaluation metrics and final loss, and re-run the example using different numbers of epochs. What happens when you increase the number?
- Also run the Simplest training example. Use the documentation to make sure you have an idea of what the different parts of the algorithm do.
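The core of the Simplest training example is a loop that, for each true triple, samples a corrupted (negative) triple and applies a margin loss so that the true triple scores better by at least the margin. The following is a self-contained pure-Python sketch of that idea (toy KG, squared-L2 TransE energy, plain SGD; TorchKGE's own trainer adds batching, GPU tensors, embedding normalization, and proper negative samplers):

```python
import random

def sq_dist(h, r, t):
    """Squared TransE energy ||h + r - t||^2 for one triple."""
    return sum((a + b - c) ** 2 for a, b, c in zip(h, r, t))

def train_transe(triples, n_ent, n_rel, dim=8, n_epochs=200, lr=0.01,
                 margin=1.0, seed=0):
    """Minimal margin-based TransE trainer: for each positive triple,
    corrupt the tail, and if the margin is violated take one SGD step."""
    rng = random.Random(seed)
    ent = [[rng.uniform(-0.5, 0.5) for _ in range(dim)] for _ in range(n_ent)]
    rel = [[rng.uniform(-0.5, 0.5) for _ in range(dim)] for _ in range(n_rel)]
    for _ in range(n_epochs):
        for h, r, t in triples:
            t_neg = rng.randrange(n_ent)  # uniform tail corruption
            pos = sq_dist(ent[h], rel[r], ent[t])
            neg = sq_dist(ent[h], rel[r], ent[t_neg])
            if margin + pos - neg <= 0:   # already separated: zero loss
                continue
            for i in range(dim):
                # Gradients of the squared distances w.r.t. each vector.
                g_pos = 2 * (ent[h][i] + rel[r][i] - ent[t][i])
                g_neg = 2 * (ent[h][i] + rel[r][i] - ent[t_neg][i])
                ent[h][i] -= lr * (g_pos - g_neg)
                rel[r][i] -= lr * (g_pos - g_neg)
                ent[t][i] += lr * g_pos
                ent[t_neg][i] -= lr * g_neg
    return ent, rel

# Tiny hypothetical KG: entities 0..3, one relation; triples are (h, r, t).
triples = [(0, 0, 1), (2, 0, 3)]
ent, rel = train_transe(triples, n_ent=4, n_rel=1)
print(sq_dist(ent[0], rel[0], ent[1]))  # energy of a true triple after training
```

Increasing `n_epochs` here plays the same role as in the tutorial: the positive energies keep shrinking until the margin constraints are satisfied, after which extra epochs change little.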
Train with early stopping:
- Run the Training with Ignite example. Use the documentation to make sure you have an idea of what the different parts of the algorithm do. How do the results compare with your exploration of different epoch values?
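Early stopping, as wired up through Ignite's EarlyStopping handler in the tutorial, boils down to watching a validation metric and halting once it has not improved for a fixed number of evaluations (the patience). A pure-Python sketch with a made-up loss curve:

```python
def train_with_early_stopping(losses, patience=3):
    """Stop when the validation loss has not improved for `patience`
    consecutive evaluations; return (stopping epoch, best loss seen)."""
    best = float("inf")
    waited = 0
    for epoch, loss in enumerate(losses):
        if loss < best:
            best, waited = loss, 0  # improvement: reset the counter
        else:
            waited += 1
            if waited >= patience:  # no improvement for `patience` epochs
                return epoch, best
    return len(losses) - 1, best

# Made-up validation losses: improving, then plateauing.
curve = [0.9, 0.7, 0.6, 0.61, 0.62, 0.63, 0.5]
stop_epoch, best = train_with_early_stopping(curve, patience=3)
print(stop_epoch, best)  # stops at epoch 5, never seeing the late 0.5
```

Note that the run stops before the late improvement at the end of the curve: patience trades a small risk of stopping too early against not wasting epochs, which is worth comparing to your fixed-epoch runs above.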
If You Have More Time
- Try this out on the other models supported by TorchKGE, both other TransX models and a deep model (ConvKB).
- Try it out with different datasets, for example one you create yourself using SPARQL queries on an open KG.
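If you go the SPARQL route: a query returning (subject, object) pairs for one property can be turned into (head, relation, tail) triples, which TorchKGE's KnowledgeGraph can then be built from (e.g. via a pandas DataFrame of triples). The query and parsing below are a hypothetical sketch: the endpoint call is mocked, and the entity URIs are placeholders:

```python
import json

# Hypothetical query against Wikidata: (film, director) pairs, where
# wdt:P57 is Wikidata's "director" property.
QUERY = """
SELECT ?film ?director WHERE {
  ?film wdt:P57 ?director .
} LIMIT 1000
"""

def bindings_to_triples(result_json, relation="directedBy"):
    """Turn a SPARQL JSON result into (head, relation, tail) triples."""
    rows = json.loads(result_json)["results"]["bindings"]
    return [(b["film"]["value"], relation, b["director"]["value"])
            for b in rows]

# Mocked endpoint response (in practice you would POST QUERY to a SPARQL
# endpoint, e.g. with the requests or SPARQLWrapper packages).
mock = json.dumps({"results": {"bindings": [
    {"film": {"value": "http://example.org/film1"},
     "director": {"value": "http://example.org/director1"}},
]}})
triples = bindings_to_triples(mock)
print(triples)
```

Querying one property at a time like this keeps the relation column uniform; repeat with several properties to build a multi-relation KG.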