LibMTL.weighting.GradNorm

class GradNorm[source]

Bases: LibMTL.weighting.abstract_weighting.AbsWeighting

Gradient Normalization (GradNorm).

This method was proposed in GradNorm: Gradient Normalization for Adaptive Loss Balancing in Deep Multitask Networks (ICML 2018) and implemented by us.

Parameters

alpha (float, default=1.5) – The strength of the restoring force that pulls each task back toward a common training rate; its role is formalized below.
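
For context, alpha enters the auxiliary gradient loss that GradNorm minimizes with respect to the task weights (notation follows the original paper):

\[
\mathcal{L}_{\text{grad}}\big(t; w_i(t)\big) = \sum_i \Big| G_W^{(i)}(t) - \bar{G}_W(t)\,\big[r_i(t)\big]^{\alpha} \Big|_1
\]

where \(G_W^{(i)}(t) = \lVert \nabla_W\, w_i(t) L_i(t) \rVert_2\) is the gradient norm of the weighted loss of task \(i\) with respect to the shared weights \(W\), \(\bar{G}_W(t)\) is the mean of these norms over tasks, and \(r_i(t)\) is the relative inverse training rate of task \(i\). A larger alpha penalizes deviations from the common training rate more strongly.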

init_param(self)[source]

Define and initialize the trainable parameters required by the weighting method.
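
A minimal sketch of such an initialization for GradNorm, assuming one learnable weight per task stored in an attribute named loss_scale (the attribute name and the use of self.task_num / self.device are illustrative, not necessarily LibMTL's exact code):

```python
import torch
import torch.nn as nn

def init_param(self):
    # One learnable scalar per task; starting all weights at 1.0 gives every
    # task equal influence before GradNorm begins rebalancing them.
    self.loss_scale = nn.Parameter(torch.ones(self.task_num, device=self.device))
```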

backward(self, losses, **kwargs)[source]
Parameters
  • losses (list) – A list containing the loss of each task.

  • kwargs (dict) – A dictionary of hyperparameters for the weighting method; for GradNorm this includes alpha (see the sketch below).
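
To make the balancing step concrete, the following self-contained sketch shows the kind of GradNorm update a backward pass performs. It is written independently of LibMTL's internals: gradnorm_step, loss_scale, shared_params, and initial_losses are illustrative names, and alpha is the hyperparameter expected to arrive through kwargs.

```python
import torch

def gradnorm_step(losses, loss_scale, shared_params, initial_losses, alpha=1.5):
    # losses: list of per-task scalar losses (each requiring grad)
    # loss_scale: trainable tensor of per-task logits for the task weights
    # shared_params: list of shared-layer parameters
    # initial_losses: tensor of the task losses recorded at the first step
    task_num = len(losses)
    # Reparameterize so the task weights stay positive and sum to task_num.
    weights = task_num * torch.softmax(loss_scale, dim=-1)

    # Gradient norm of each weighted task loss w.r.t. the shared parameters.
    norms = []
    for i, loss in enumerate(losses):
        grads = torch.autograd.grad(weights[i] * loss, shared_params,
                                    retain_graph=True, create_graph=True)
        norms.append(torch.cat([g.reshape(-1) for g in grads]).norm())
    norms = torch.stack(norms)

    # Relative inverse training rate r_i and its target gradient norm.
    loss_ratio = torch.stack([l.detach() for l in losses]) / initial_losses
    inverse_rate = loss_ratio / loss_ratio.mean()
    target = (norms.mean() * inverse_rate ** alpha).detach()

    # L_grad pulls each task's gradient norm toward the common target;
    # backpropagating it through loss_scale rebalances the task weights.
    return torch.abs(norms - target).sum()
```

Backpropagating the returned value (alongside the weighted sum of the task losses) updates loss_scale, raising the weight of tasks that are training too slowly and lowering it for tasks that are training too quickly.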