TFLearn estimator layers

TFLearn offers only one layer in the tflearn.layers.estimator module: the regression layer.

While creating the regression layer, you can specify the optimizer, the loss function, and the metric function, as shown in the sketch below.
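A minimal sketch of how this looks, assuming a small fully connected network built with the TFLearn layers API (the layer sizes and hyperparameter values are illustrative only):

```python
import tflearn

# Illustrative two-layer network; shapes and sizes are arbitrary examples.
net = tflearn.input_data(shape=[None, 784])
net = tflearn.fully_connected(net, 64, activation='relu')
net = tflearn.fully_connected(net, 10, activation='softmax')

# The estimator layer: the optimizer, loss, and metric are specified here.
net = tflearn.regression(net,
                         optimizer='adam',
                         loss='categorical_crossentropy',
                         metric='accuracy',
                         learning_rate=0.001)

model = tflearn.DNN(net)
```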

TFLearn offers the following optimizer functions as classes in the tflearn.optimizers module:

  • SGD
  • RMSProp
  • Adam
  • Momentum
  • AdaGrad
  • Ftrl
  • AdaDelta
  • ProximalAdaGrad
  • Nesterov

You can create custom optimizers by extending the tflearn.optimizers.Optimizer base class.
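As a sketch, an optimizer class can also be instantiated with its own hyperparameters and passed to the regression layer in place of a name string (the learning rate and decay values below are arbitrary examples):

```python
import tflearn

net = tflearn.input_data(shape=[None, 784])
net = tflearn.fully_connected(net, 10, activation='softmax')

# Instantiate the optimizer explicitly to control the learning-rate decay.
sgd = tflearn.optimizers.SGD(learning_rate=0.01, lr_decay=0.96, decay_step=100)

# Pass the optimizer object instead of a name string such as 'sgd'.
net = tflearn.regression(net, optimizer=sgd, loss='categorical_crossentropy')
model = tflearn.DNN(net)
```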

TFLearn offers the following metric functions as classes or ops in the tflearn.metrics module:

  • Accuracy or accuracy_op
  • Top_k or top_k_op
  • R2 or r2_op
  • WeightedR2 or weighted_r2_op
  • binary_accuracy_op

You can create custom metrics by extending the tflearn.metrics.Metric base class.
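Similarly, a metric class can be instantiated and passed through the regression layer's metric argument, as in the following sketch that reports top-3 accuracy (the value of k is arbitrary):

```python
import tflearn

net = tflearn.input_data(shape=[None, 784])
net = tflearn.fully_connected(net, 10, activation='softmax')

# Report top-3 accuracy instead of the default plain accuracy.
top3 = tflearn.metrics.Top_k(k=3)

net = tflearn.regression(net,
                         optimizer='adam',
                         loss='categorical_crossentropy',
                         metric=top3)
model = tflearn.DNN(net)
```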

TFLearn provides the following loss functions, known as objectives, in the tflearn.objectives module:

  • softmax_categorical_crossentropy
  • categorical_crossentropy
  • binary_crossentropy
  • weighted_crossentropy
  • mean_square
  • hinge_loss
  • roc_auc_score
  • weak_cross_entropy_2d
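
A loss function is usually selected by passing its name to the regression layer's loss argument. The following sketch uses softmax_categorical_crossentropy, which applies the softmax internally and therefore expects the preceding layer to produce raw (linear) outputs:

```python
import tflearn

net = tflearn.input_data(shape=[None, 784])
# Linear outputs: softmax_categorical_crossentropy applies softmax itself.
net = tflearn.fully_connected(net, 10, activation='linear')

net = tflearn.regression(net,
                         optimizer='adam',
                         loss='softmax_categorical_crossentropy')
model = tflearn.DNN(net)
```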

While specifying the input, hidden, and output layers, you can specify the activation functions to be applied to the output. TFLearn provides the following activation functions in the tflearn.activations module:

  • linear
  • tanh
  • sigmoid
  • softmax
  • softplus
  • softsign
  • relu
  • relu6
  • leaky_relu
  • prelu
  • elu
  • crelu
  • selu
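
As a sketch, an activation is typically selected per layer through the activation argument, although the ops in tflearn.activations can also be applied to a tensor directly (the layer sizes are illustrative):

```python
import tflearn

net = tflearn.input_data(shape=[None, 784])

# Choose the activation by name when building a layer ...
net = tflearn.fully_connected(net, 64, activation='leaky_relu')

# ... or apply an activation op to a tensor explicitly.
net = tflearn.fully_connected(net, 64, activation='linear')
net = tflearn.activations.relu(net)

net = tflearn.fully_connected(net, 10, activation='softmax')
net = tflearn.regression(net, optimizer='adam', loss='categorical_crossentropy')
model = tflearn.DNN(net)
```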