Question about the loss #42

@aks1207

Description

What does the term lap_loss indicate intuitively? What are we trying to optimize or capture here?

# Degree matrix of the masked adjacency (column sums on the diagonal)
D = torch.diag(torch.sum(self.masked_adj[0], 0))
m_adj = self.masked_adj if self.graph_mode else self.masked_adj[self.graph_idx]
# Graph Laplacian L = D - A of the masked graph
L = D - m_adj
pred_label_t = torch.tensor(pred_label, dtype=torch.float)
if self.args.gpu:
    pred_label_t = pred_label_t.cuda()
    L = L.cuda()
if self.graph_mode:
    lap_loss = 0
else:
    # Laplacian quadratic form y^T L y over the predicted labels,
    # scaled by coeffs["lap"] and normalized by the number of adjacency entries
    lap_loss = (self.coeffs["lap"]
        * (pred_label_t @ L @ pred_label_t)
        / self.adj.numel()
    )
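
For context on what that quadratic form computes: with L = D - A, the standard identity x^T L x = 0.5 * sum_ij A_ij * (x_i - x_j)^2 holds, so the term grows when nodes connected by the masked adjacency carry different label values. Below is a minimal standalone sketch checking that identity numerically; the tensors A and x are hypothetical toy data, not the explainer's actual masked_adj or pred_label.

import torch

# Toy symmetric adjacency and per-node values (illustrative only)
A = torch.tensor([[0., 1., 0.],
                  [1., 0., 1.],
                  [0., 1., 0.]])
D = torch.diag(A.sum(dim=0))   # degree matrix
L = D - A                      # graph Laplacian
x = torch.tensor([1., 1., 0.])

quad = x @ L @ x
edge_sum = 0.5 * (A * (x.unsqueeze(1) - x.unsqueeze(0)) ** 2).sum()
print(quad.item(), edge_sum.item())  # both print 1.0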
