This article is about 1,733 words and takes roughly 5 minutes to read.

Hinge Loss
Hinge loss is a loss function used in machine learning for training classifiers. It is a maximum-margin classification loss and a core part of the SVM algorithm.
The hinge loss function is given by:

Loss_H = max(0, 1 - Y*y)
where Y is the label and

y = θ·x
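As a quick sanity check of the formula, the snippet below evaluates the hinge loss for a single point. The numbers for θ, x, and Y are made up purely for illustration:

```python
import numpy as np

theta = np.array([0.5, -0.25])   # parameter vector (assumed for this example)
x = np.array([2.0, 4.0])         # one data point with two features
Y = 1                            # its label, either +1 or -1

y = np.dot(theta, x)             # y = theta . x = 0.5*2 + (-0.25)*4 = 0.0
loss = max(0.0, 1 - Y * y)       # max(0, 1 - 1*0.0) = 1.0
print(loss)                      # prints 1.0
```

A margin of 0 means the point sits exactly on the decision boundary, so it is penalized with the full loss of 1; only points with margin ≥ 1 incur zero loss.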
This is the general hinge loss function. In this tutorial, we define a function that computes the total hinge loss over multiple data points for a given θ and offset θ₀.
# Linear Algebra Learning Sequence
# Hinge loss for multiple points

import numpy as np

def hinge_loss_single(feature_vector, label, theta, theta_0):
    # Margin of one point: label * (theta . x + theta_0)
    margin = label * (np.matmul(theta, feature_vector) + theta_0)
    # Hinge loss: max(0, 1 - margin)
    return max(0.0, 1 - margin)

def hinge_loss_full(feature_matrix, labels, theta, theta_0):
    # Sum the single-point hinge losses over the whole data set
    total = 0.0
    for i in range(len(feature_matrix)):
        total += hinge_loss_single(feature_matrix[i], labels[i], theta, theta_0)
    return total

feature_matrix = np.array([[2, 2], [3, 3], [7, 0], [14, 47]])
theta = np.array([0.002, 0.6])
theta_0 = 0
labels = np.array([1, -1, 1, -1])  # 1-D labels, so the result is a scalar

hingell = hinge_loss_full(feature_matrix, labels, theta, theta_0)
print('Data points:\n', feature_matrix)
print('\nCorresponding labels:', labels)
print('\nHinge loss for given data:', round(hingell, 3))
Output:
Data points:
 [[ 2  2]
 [ 3  3]
 [ 7  0]
 [14 47]]

Corresponding labels: [ 1 -1  1 -1]

Hinge loss for given data: 33.02
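The Python loop above can also be replaced by a fully vectorized NumPy computation, which is shorter and faster on large data sets. A sketch using the same data as the tutorial:

```python
import numpy as np

feature_matrix = np.array([[2, 2], [3, 3], [7, 0], [14, 47]])
theta = np.array([0.002, 0.6])
theta_0 = 0
labels = np.array([1, -1, 1, -1])

# Margins for all points at once: label_i * (theta . x_i + theta_0)
margins = labels * (feature_matrix @ theta + theta_0)

# Per-point hinge losses, then the total
total = np.maximum(0.0, 1 - margins).sum()
print(round(total, 3))  # ~33.02, matching the loop-based version
```

`np.maximum` applies the max(0, ·) clamp elementwise, so one expression covers every data point without an explicit Python loop.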
Translated from: 合页损失,铰链损失 (Hinge Loss)
Reprinted from: http://ibazd.baihongyu.com/