The softmax layer applies the softmax function, which is defined for each input
value x_i as follows:

    y_i = exp(x_i) / (exp(x_1) + ... + exp(x_K)),

where K is the number of inputs. During training, the result of the softmax
function is transformed by a logarithm function, such that the values are
suitable as input to, e.g., a cross entropy loss layer.
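The following NumPy sketch is purely illustrative of the formula above (it is
not HALCON's implementation): the softmax turns the input values into a
probability distribution, and taking the logarithm of that result yields the
log-softmax values that a cross entropy loss consumes during training.

    import numpy as np

    def softmax(x):
        # Subtract the maximum for numerical stability; this does not change the result.
        e = np.exp(x - np.max(x))
        return e / e.sum()

    x = np.array([2.0, 1.0, 0.1])
    y = softmax(x)        # approx. [0.659, 0.242, 0.099], sums to 1
    log_y = np.log(y)     # log-softmax, suitable as input to a cross entropy loss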
This behavior can be changed by setting the generic parameter
'output_mode', see below.
This parameter determines whether, and in which phase, the output is
transformed by a logarithm function (see the sketch after the list):
'default': During inference, the result of the softmax function is returned
as output, while during training the softmax is further transformed by a
logarithm function.
'no_log_training': During training, the result of the softmax function is
not transformed by a logarithm function.
'log_inference': The logarithm of the softmax is
calculated during inference in the same way as during training.
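As an illustration of these three modes, the following plain-Python sketch
(the function name and arguments are chosen here for illustration, this is
not HALCON code) shows which transformation is applied depending on
'output_mode' and on whether the layer runs in training or inference:

    import numpy as np

    def softmax_layer_output(x, output_mode='default', training=True):
        # Illustrative only: mimics the 'output_mode' behavior described above.
        e = np.exp(x - np.max(x))
        y = e / e.sum()
        if output_mode == 'default':
            # Log-softmax during training, plain softmax during inference.
            return np.log(y) if training else y
        if output_mode == 'no_log_training':
            # Plain softmax in both training and inference.
            return y
        if output_mode == 'log_inference':
            # Log-softmax in both training and inference.
            return np.log(y)
        raise ValueError('unknown output_mode: ' + output_mode)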
The generic parameter 'is_inference_output' determines whether apply_dl_model
will include the output of this layer in the dictionary DLResultBatch even
without specifying this layer in Outputs ('true') or not ('false').
List of values: 'is_inference_output', 'num_trainable_params', 'output_mode'