Lab: Training Graph Embeddings

Lab 14: Training Graph Embeddings

Topics

Training knowledge graph embeddings with TorchKGE.


Classes and methods

The following TorchKGE classes from the previous lab remain central (a short loading sketch follows the list):

  • KG - contains the knowledge graph itself (entities, relations and triples)
  • Model - contains the embeddings (entity and relation vectors) learned for the KG
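
As a reminder of how these fit together, here is a minimal loading sketch. It assumes TorchKGE is installed and uses the built-in FB15k loader with TransE as the example model; parameter names may differ slightly between TorchKGE versions.

    from torchkge.models import TransEModel
    from torchkge.utils.datasets import load_fb15k

    # A dataset loader returns KnowledgeGraph objects for the
    # training, validation and test splits.
    kg_train, kg_val, kg_test = load_fb15k()

    # A model object holds one embedding vector per entity and per relation.
    # Arguments: embedding dimension, number of entities, number of relations.
    model = TransEModel(100, kg_train.n_ent, kg_train.n_rel,
                        dissimilarity_type='L2')

    print(kg_train.n_ent, kg_train.n_rel, kg_train.n_facts)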


Tasks

Pre-trained models:

Train your own:

  • Load the corresponding KG using one of TorchKGE's dataset loaders (for example load_fb15k).
  • Run the Shortest training example, but use a much lower value for n_epochs (for example 200); a sketch of this setup follows after this list.
  • Take note of the evaluation metrics and final loss, and re-run the example using different numbers of epochs. What happens when you increase the number?
  • Also run the Simplest training example (a sketch of its manual training loop also follows below). Use the documentation to make sure you have an idea of what the different parts of the algorithm do.
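
Here is a minimal sketch of the Shortest training setup with a reduced number of epochs. It follows the structure of the TorchKGE tutorial, but the hyper-parameter values are only illustrative and should be adjusted:

    from torch.optim import Adam

    from torchkge.evaluation import LinkPredictionEvaluator
    from torchkge.models import TransEModel
    from torchkge.utils import MarginLoss, Trainer
    from torchkge.utils.datasets import load_fb15k

    # Load the KG with a dataset loader (training/validation/test splits)
    kg_train, kg_val, kg_test = load_fb15k()

    # Model, loss and optimizer
    model = TransEModel(100, kg_train.n_ent, kg_train.n_rel,
                        dissimilarity_type='L2')
    criterion = MarginLoss(0.5)
    optimizer = Adam(model.parameters(), lr=0.0004, weight_decay=1e-5)

    # The Trainer wraps the whole training loop;
    # 200 is the (deliberately low) number of epochs, 32768 the batch size.
    trainer = Trainer(model, criterion, kg_train, 200, 32768,
                      optimizer=optimizer, sampling_type='bern', use_cuda=None)
    trainer.run()

    # Link prediction on the test split: note MRR, Hit@k and mean rank
    evaluator = LinkPredictionEvaluator(model, kg_test)
    evaluator.evaluate(b_size=32)
    evaluator.print_results()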

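For comparison, the core of the Simplest training example is a hand-written loop: negative sampling, forward pass, loss, optimizer step. The sketch below reuses model, criterion and optimizer from the sketch above. Note that the argument order of the model's forward call has changed between TorchKGE versions, so check the Simplest training page of the documentation for your installed version.

    from torchkge.sampling import BernoulliNegativeSampler
    from torchkge.utils import DataLoader

    sampler = BernoulliNegativeSampler(kg_train)
    dataloader = DataLoader(kg_train, batch_size=32768)

    for epoch in range(200):
        running_loss = 0.0
        for batch in dataloader:
            h, t, r = batch[0], batch[1], batch[2]
            # Corrupt each positive triple into a negative one
            n_h, n_t = sampler.corrupt_batch(h, t, r)

            optimizer.zero_grad()
            # Scores for positive and negative triples
            # (argument order as in recent TorchKGE tutorials; older versions
            # may expect model(h, t, n_h, n_t, r) instead)
            pos, neg = model(h, t, r, n_h, n_t)
            loss = criterion(pos, neg)
            loss.backward()
            optimizer.step()
            running_loss += loss.item()
        print('Epoch {} | mean loss: {:.5f}'.format(
            epoch + 1, running_loss / len(dataloader)))

    # Project the embeddings back onto the unit sphere, as TransE assumes
    model.normalize_parameters()
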
Train with early stopping:

  • Run the Training with Ignite example (a compact early-stopping sketch follows below). Use the documentation to make sure you have an idea of what the different parts of the algorithm do. How do the results compare with your exploration of different numbers of epochs?
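
The Training with Ignite example wraps the same training step in a pytorch-ignite Engine and stops when a validation score stops improving. The skeleton below is only a rough simplification of that idea; it reuses model, criterion, optimizer, sampler, dataloader, kg_val and LinkPredictionEvaluator from the sketches above, and the patience and batch sizes are illustrative.

    from ignite.engine import Engine, Events
    from ignite.handlers import EarlyStopping

    def process_batch(engine, batch):
        # One optimization step on a single batch, as in the manual loop above
        h, t, r = batch[0], batch[1], batch[2]
        n_h, n_t = sampler.corrupt_batch(h, t, r)
        optimizer.zero_grad()
        pos, neg = model(h, t, r, n_h, n_t)
        loss = criterion(pos, neg)
        loss.backward()
        optimizer.step()
        return loss.item()

    trainer = Engine(process_batch)

    def score_function(engine):
        # EarlyStopping expects "higher is better"; use MRR on the validation split
        evaluator = LinkPredictionEvaluator(model, kg_val)
        evaluator.evaluate(b_size=256)
        return evaluator.mrr()[1]   # filtered MRR

    handler = EarlyStopping(patience=3, score_function=score_function,
                            trainer=trainer)
    trainer.add_event_handler(Events.EPOCH_COMPLETED, handler)

    # max_epochs is only an upper bound; early stopping may terminate sooner
    trainer.run(dataloader, max_epochs=1000)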


If You Have More Time

  • Try this out on the other models supported by TorchKGE, both other translational (TransX) models and a deep model (ConvKB); a model-swapping sketch follows after this list.
  • Try it out with different datasets, for example one you create yourself using SPARQL queries against an open KG; a sketch of turning query results into a TorchKGE KnowledgeGraph follows below.
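
Swapping in another model usually only changes the constructor call; the loss, sampler and training loop can stay as before. The constructor arguments below follow the TorchKGE model reference, but double-check the parameter lists (and the recommended loss for deep models) for your installed version.

    from torchkge.models import TransHModel, ConvKBModel

    # Another translational model (TransH):
    # embedding dimension, number of entities, number of relations
    model = TransHModel(100, kg_train.n_ent, kg_train.n_rel)

    # A deep model (ConvKB) additionally takes a number of convolutional filters:
    # embedding dimension, number of filters, number of entities, number of relations
    model = ConvKBModel(100, 64, kg_train.n_ent, kg_train.n_rel)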

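One way to build your own dataset is to query an open endpoint with SPARQL and turn the result into a pandas DataFrame with the columns 'from', 'rel' and 'to' that TorchKGE's KnowledgeGraph constructor expects. The endpoint and query below are only illustrative; any SELECT query that returns subject, predicate and object variables will do.

    import pandas as pd
    from SPARQLWrapper import SPARQLWrapper, JSON
    from torchkge.data_structures import KnowledgeGraph

    # Illustrative query: triples about cities from DBpedia
    sparql = SPARQLWrapper('https://dbpedia.org/sparql')
    sparql.setQuery('''
        PREFIX dbo: <http://dbpedia.org/ontology/>
        SELECT ?s ?p ?o WHERE {
            ?s a dbo:City ;
               ?p ?o .
            FILTER(isIRI(?o))
        } LIMIT 10000
    ''')
    sparql.setReturnFormat(JSON)
    results = sparql.query().convert()

    # TorchKGE expects a DataFrame with columns 'from', 'rel' and 'to'
    rows = [{'from': b['s']['value'],
             'rel': b['p']['value'],
             'to': b['o']['value']}
            for b in results['results']['bindings']]
    df = pd.DataFrame(rows)

    kg = KnowledgeGraph(df=df)
    kg_train, kg_val, kg_test = kg.split_kg(share=0.8, validation=True)
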
Useful readings