The loss functions for Neural Networks in PyTorch

This content originally appeared on DEV Community and was authored by Super Kai (Kazuya Ito)

A loss function measures the difference (gap) between a model's predictions and the true values, to evaluate how good a model is. *A loss function is also called a Cost Function or an Error Function.

The popular loss functions are shown below:

(1) L1 Loss:

  • can compute the mean of the absolute differences between a model's predictions and the true values.
  • has the formula: L = (1/n) Σᵢ |ŷᵢ − yᵢ|, where ŷᵢ is a prediction and yᵢ is a true value.
  • is also called Mean Absolute Error(MAE).
  • is L1Loss() in PyTorch.
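A minimal sketch of L1Loss() with toy tensors (the values here are made up for illustration):

```python
import torch
import torch.nn as nn

# Toy predictions and true values.
pred = torch.tensor([2.0, 0.0, 3.0])
true = torch.tensor([1.0, 0.0, 5.0])

# L1Loss averages the absolute differences (reduction='mean' by default).
mae = nn.L1Loss()(pred, true)
print(mae.item())  # (|2-1| + |0-0| + |3-5|) / 3 = 1.0
```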

(2) L2 Loss:

  • can compute the mean of the squared differences between a model's predictions and the true values.
  • has the formula: L = (1/n) Σᵢ (ŷᵢ − yᵢ)².
  • is also called Mean Squared Error(MSE).
  • is MSELoss() in PyTorch.
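The same toy tensors with MSELoss() (again, the values are illustrative only):

```python
import torch
import torch.nn as nn

pred = torch.tensor([2.0, 0.0, 3.0])
true = torch.tensor([1.0, 0.0, 5.0])

# MSELoss averages the squared differences.
mse = nn.MSELoss()(pred, true)
print(mse.item())  # (1 + 0 + 4) / 3 ≈ 1.6667
```

Because the differences are squared, MSE penalizes large errors much more heavily than MAE does.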

(3) Huber Loss:

  • can compute an L2 Loss-like value or an L1 Loss-like value for each element, depending on whether the absolute difference between a model's prediction and the true value is within delta. *Memos:
    • delta is 1.0 by default.
    • Be careful: according to the formulas below, the computation is not exactly the same as L1 Loss or L2 Loss.
  • has the formula below. *The 1st case is the L2 Loss-like one and the 2nd case is the L1 Loss-like one:
      lᵢ = 0.5 (ŷᵢ − yᵢ)²  if |ŷᵢ − yᵢ| ≤ delta
      lᵢ = delta (|ŷᵢ − yᵢ| − 0.5 delta)  otherwise
  • is HuberLoss() in PyTorch.
  • with delta of 1.0 is the same as Smooth L1 Loss, which is SmoothL1Loss() in PyTorch.
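A small sketch showing both branches of HuberLoss() and its equivalence to SmoothL1Loss() at delta=1.0 (the tensors are toy values chosen so one element falls in each branch):

```python
import torch
import torch.nn as nn

pred = torch.tensor([0.5, 3.0])
true = torch.tensor([0.0, 0.0])

# |0.5| <= 1.0 -> L2-like: 0.5 * 0.5**2 = 0.125
# |3.0| >  1.0 -> L1-like: 1.0 * (3.0 - 0.5) = 2.5
huber = nn.HuberLoss(delta=1.0)(pred, true)
print(huber.item())  # (0.125 + 2.5) / 2 = 1.3125

# SmoothL1Loss (with its default beta=1.0) gives the same result.
smooth = nn.SmoothL1Loss()(pred, true)
print(smooth.item())  # 1.3125
```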

(4) BCE(Binary Cross Entropy) Loss:

  • can compute the difference between a model's binary predictions (probabilities) and the true binary values.
  • has the formula: L = −(1/n) Σᵢ [yᵢ log(ŷᵢ) + (1 − yᵢ) log(1 − ŷᵢ)].
  • is also called Log(Logarithmic) Loss.
  • is BCELoss() in PyTorch. *There is also BCEWithLogitsLoss(), which is the combination of BCE Loss and the Sigmoid activation function in PyTorch.
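A sketch of BCELoss() and BCEWithLogitsLoss() on toy values (the probabilities here are made up; BCELoss expects inputs already in [0, 1], while BCEWithLogitsLoss takes raw logits and applies Sigmoid internally):

```python
import torch
import torch.nn as nn

probs = torch.tensor([0.9, 0.2])     # predicted probabilities (after Sigmoid)
targets = torch.tensor([1.0, 0.0])   # true binary labels

bce = nn.BCELoss()(probs, targets)
print(bce.item())  # -(log(0.9) + log(0.8)) / 2 ≈ 0.1643

# BCEWithLogitsLoss takes logits; the inverse-sigmoid below is just for
# illustration, to show both losses agree on equivalent inputs.
logits = torch.log(probs / (1 - probs))
bce_logits = nn.BCEWithLogitsLoss()(logits, targets)
print(bce_logits.item())  # ≈ 0.1643
```

BCEWithLogitsLoss() is generally preferred in practice because fusing Sigmoid into the loss is more numerically stable than applying it separately.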

