create_dl_layer_loss_cross_entropy (Operator)
Name
create_dl_layer_loss_cross_entropy — Create a cross entropy loss layer.
Signature
void CreateDlLayerLossCrossEntropy(const HTuple& DLLayerInput, const HTuple& DLLayerTarget, const HTuple& DLLayerWeights, const HTuple& LayerName, const HTuple& LossWeight, const HTuple& GenParamName, const HTuple& GenParamValue, HTuple* DLLayerLossCrossEntropy)
HDlLayer HDlLayer::CreateDlLayerLossCrossEntropy(const HDlLayer& DLLayerTarget, const HDlLayer& DLLayerWeights, const HString& LayerName, double LossWeight, const HTuple& GenParamName, const HTuple& GenParamValue) const
HDlLayer HDlLayer::CreateDlLayerLossCrossEntropy(const HDlLayer& DLLayerTarget, const HDlLayer& DLLayerWeights, const HString& LayerName, double LossWeight, const HString& GenParamName, const HString& GenParamValue) const
HDlLayer HDlLayer::CreateDlLayerLossCrossEntropy(const HDlLayer& DLLayerTarget, const HDlLayer& DLLayerWeights, const char* LayerName, double LossWeight, const char* GenParamName, const char* GenParamValue) const
HDlLayer HDlLayer::CreateDlLayerLossCrossEntropy(const HDlLayer& DLLayerTarget, const HDlLayer& DLLayerWeights, const wchar_t* LayerName, double LossWeight, const wchar_t* GenParamName, const wchar_t* GenParamValue) const   (Windows only)
static void HOperatorSet.CreateDlLayerLossCrossEntropy(HTuple DLLayerInput, HTuple DLLayerTarget, HTuple DLLayerWeights, HTuple layerName, HTuple lossWeight, HTuple genParamName, HTuple genParamValue, out HTuple DLLayerLossCrossEntropy)
HDlLayer HDlLayer.CreateDlLayerLossCrossEntropy(HDlLayer DLLayerTarget, HDlLayer DLLayerWeights, string layerName, double lossWeight, HTuple genParamName, HTuple genParamValue)
HDlLayer HDlLayer.CreateDlLayerLossCrossEntropy(HDlLayer DLLayerTarget, HDlLayer DLLayerWeights, string layerName, double lossWeight, string genParamName, string genParamValue)
Description
The operator create_dl_layer_loss_cross_entropy creates a cross entropy loss layer whose handle is returned in DLLayerLossCrossEntropy.
This layer computes the two-dimensional cross entropy loss on the input (provided by DLLayerInput), given the corresponding target (provided by DLLayerTarget) and weight (provided by DLLayerWeights).
Cross entropy is commonly used to measure the similarity between two
vectors.
Example: Consider a pixel-level classification problem with three classes.
The input vector for a single pixel is x = (0.7, 0.1, 0.2) (e.g., the output of a softmax layer), which means that the predicted value (e.g., probability) is 0.7 for the class at index 0, 0.1 for the class at index 1, and 0.2 for the class at index 2.
The target vector is t = (1.0, 0.0, 0.0), with a probability of 1.0 for the actual class and 0.0 else.
The cross entropy is the negative dot product of the target vector with the element-wise logarithm of the input vector:

    H(x, t) = - Σ_c t_c · ln(x_c)

Since the target vector has only one non-zero entry, it can be given by the index of the actual class instead of a vector, in this case t = 0.
The cross entropy then reduces to the negative logarithm of the input value at the target class index, hence H = -ln(x(t)) = -ln(0.7).
Using this simplification, the cross entropy loss function over an input image can be defined by

    loss(x, t, w) = - (1 / W) · Σ_{i=1..N} w_i · ln(x_i(t_i)),    with    W = Σ_{i=1..N} w_i,

where the input x consists of one prediction vector x_i for each pixel, the target t and the weight w consist of one value t_i and w_i, respectively, for each input pixel, N is the number of pixels, and W is the sum over all weights.
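For concreteness, a short worked evaluation of this formula with purely illustrative numbers (two pixels, predicted values 0.7 and 0.5 at the respective target classes, weights 1 and 2):

    N = 2,  x_1(t_1) = 0.7,  x_2(t_2) = 0.5,  w = (1, 2),  W = 3
    loss = - (1/3) · (1 · ln 0.7 + 2 · ln 0.5) ≈ - (1/3) · (-0.357 - 1.386) ≈ 0.58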
Hence, this layer expects multiple incoming layers: one providing the prediction (DLLayerInput), one providing the target values (DLLayerTarget), and one providing the weights (DLLayerWeights).
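A minimal HDevelop sketch of wiring these three incoming layers into the loss layer when assembling a model from scratch; the layer names, the input shape, and the use of plain input layers as a stand-in for the prediction branch are illustrative assumptions, not prescribed by this reference entry:

* Prediction branch: in a real network this would be, e.g., the output
* of a softmax layer; an input layer stands in for it here.
create_dl_layer_input ('prediction', [224,224,3], [], [], DLLayerPrediction)
* Target class indices and per-pixel weights are provided as input layers.
create_dl_layer_input ('target', [224,224,1], [], [], DLLayerTarget)
create_dl_layer_input ('weights', [224,224,1], [], [], DLLayerWeights)
* Combine the three incoming layers into the cross entropy loss layer.
create_dl_layer_loss_cross_entropy (DLLayerPrediction, DLLayerTarget, DLLayerWeights, 'loss_cross_entropy', 1.0, [], [], DLLayerLossCE)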
The parameter LayerName sets an individual layer name. Note that when creating a model using create_dl_model, each layer of the created network must have a unique name.
The parameter LossWeight determines the scalar weight factor with which the loss calculated in this layer is multiplied. This parameter can be used to specify the contribution of the cross entropy loss to the overall network loss in case multiple loss layers are used.
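As a sketch (handles and layer names are hypothetical), two loss branches whose contributions are balanced via LossWeight, with the auxiliary branch contributing only 30 percent of the main branch:

* Main loss branch with full weight.
create_dl_layer_loss_cross_entropy (DLLayerPredMain, DLLayerTargetMain, DLLayerWeightsMain, 'loss_main', 1.0, [], [], DLLayerLossMain)
* Auxiliary loss branch with reduced contribution to the total loss.
create_dl_layer_loss_cross_entropy (DLLayerPredAux, DLLayerTargetAux, DLLayerWeightsAux, 'loss_aux', 0.3, [], [], DLLayerLossAux)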
The following generic parameters GenParamName and the corresponding values GenParamValue are supported:
'is_inference_output':
Determines whether apply_dl_model will include the output of this layer in the dictionary DLResultBatch even without specifying this layer in Outputs ('true') or not ('false').
Default: 'false'
'num_trainable_params':
Number of trainable parameters (weights and biases) of the layer.
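A sketch of passing a generic parameter at creation time, continuing the hypothetical handles from the sketch above, so that apply_dl_model also reports the loss value during inference:

* Request the loss output in DLResultBatch during inference.
create_dl_layer_loss_cross_entropy (DLLayerPrediction, DLLayerTarget, DLLayerWeights, 'loss_cross_entropy', 1.0, 'is_inference_output', 'true', DLLayerLossCE)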
Certain parameters of layers created using the operator create_dl_layer_loss_cross_entropy can be set and retrieved using further operators. The following table gives an overview of which parameters can be set using set_dl_model_layer_param and which ones can be retrieved using get_dl_model_layer_param or get_dl_layer_param. Note that the operators set_dl_model_layer_param and get_dl_model_layer_param require a model created by create_dl_model.
Generic Layer Parameters      set   get
'is_inference_output'          x     x
'num_trainable_params'               x
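A sketch of setting and retrieving these parameters; the model creation, layer name, and handles continue the hypothetical example above:

* With a model created from the layer graph:
create_dl_model (DLLayerLossCE, DLModelHandle)
set_dl_model_layer_param (DLModelHandle, 'loss_cross_entropy', 'is_inference_output', 'true')
* A loss layer has no weights or biases, so this typically returns 0.
get_dl_model_layer_param (DLModelHandle, 'loss_cross_entropy', 'num_trainable_params', NumTrainableParams)
* Directly on the layer handle, without a model:
get_dl_layer_param (DLLayerLossCE, 'is_inference_output', IsInferenceOutput)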
Execution Information
Multithreading type: reentrant (runs in parallel with non-exclusive operators).
Multithreading scope: global (may be called from any thread).
Processed without parallelization.
Parameters
DLLayerInput (input_control)  dl_layer → (handle)
Input layer.
DLLayerTarget (input_control)  dl_layer → (handle)
Target layer.
DLLayerWeights (input_control)  dl_layer → (handle)
Weights layer.
LayerName (input_control)  string → (string)
Name of the output layer.
LossWeight (input_control)  number → (real)
Overall loss weight if there are multiple losses in the network.
Default value: 1.0
GenParamName (input_control)  attribute.name(-array) → (string)
Generic input parameter names.
Default value: []
List of values: 'is_inference_output', 'num_trainable_params'
GenParamValue (input_control)  attribute.value(-array) → (string / integer / real)
Generic input parameter values.
Default value: []
Suggested values: 'true', 'false'
DLLayerLossCrossEntropy (output_control)  dl_layer → (handle)
Cross entropy loss layer.
Module
Deep Learning Training