activation_relu {keras}    R Documentation

Activation functions

Description

Activation functions can be used either through layer_activation() or through the activation argument supported by all forward layers.
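
For example, a minimal sketch of both usages (the layer sizes and input shape below are illustrative, not taken from this page):

library(keras)

# Hypothetical model: layer sizes are for illustration only.
model <- keras_model_sequential() %>%
  # pass the activation by name to a forward layer ...
  layer_dense(units = 32, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10) %>%
  # ... or attach it as a standalone activation layer
  layer_activation("softmax")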

Usage

activation_relu(x, alpha = 0, max_value = NULL, threshold = 0)

activation_elu(x, alpha = 1)

activation_selu(x)

activation_hard_sigmoid(x)

activation_linear(x)

activation_sigmoid(x)

activation_softmax(x, axis = -1)

activation_softplus(x)

activation_softsign(x)

activation_tanh(x)

activation_exponential(x)
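
These functions can also be applied directly to a tensor. A sketch using the backend helpers k_constant() and k_eval(), with made-up input values:

library(keras)

x <- k_constant(c(-3, -1, 0.5, 2, 6))

# Leaky, capped ReLU: slope 0.1 below the threshold, output clipped at 4.
y <- activation_relu(x, alpha = 0.1, max_value = 4, threshold = 0)

k_eval(y)  # evaluates the tensor back to an R array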

Arguments

x

Tensor

alpha

Numeric alpha value. For activation_relu(), the slope of the negative section; for activation_elu(), the scale for the negative section.

max_value

Maximum output value; outputs are clipped at this value (no upper bound if NULL).

threshold

Threshold value for thresholded activation.

axis

Integer, axis along which the softmax normalization is applied
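
As an illustration of the axis argument, a sketch that normalizes each row of a 2-d tensor (the scores are made up):

library(keras)

# axis = -1 (the default) normalizes over the last dimension,
# so each row of the result sums to 1.
scores <- k_constant(matrix(c(1, 2, 3,
                              1, 1, 1),
                            nrow = 2, byrow = TRUE))

probs <- activation_softmax(scores, axis = -1)
k_eval(probs)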

Value

Tensor with the same shape and dtype as x.

[Package keras version 2.2.4 Index]