# softmax

Softmax transfer function

## Syntax

```
A = softmax(N,FP)
```

## Description

`softmax` is a neural transfer function. Transfer functions calculate a layer’s output from its net input.

`A = softmax(N,FP)` takes `N` and optional function parameters,

- `N`: `S`-by-`Q` matrix of net input (column) vectors
- `FP`: Struct of function parameters (ignored)

and returns `A`, the `S`-by-`Q` matrix of the softmax competitive function applied to each column of `N`.

`info = softmax('code')` returns information about this function. The following codes are defined:

`softmax('name')` returns the name of this function.

`softmax('output',FP)` returns the `[min max]` output range.

`softmax('active',FP)` returns the `[min max]` active input range.

`softmax('fullderiv')` returns 1 or 0, depending on whether `dA_dN` is `S`-by-`S`-by-`Q` or `S`-by-`Q`.

`softmax('fpnames')` returns the names of the function parameters.

`softmax('fpdefaults')` returns the default function parameters.
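The information codes above can be queried directly. The snippet below is a sketch of such queries; the exact strings and values returned depend on the toolbox version.

```matlab
name   = softmax('name')        % name of the transfer function
range  = softmax('output')      % [min max] output range, i.e. [0 1]
fpdefs = softmax('fpdefaults')  % default function parameters
```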

## Examples

Here you define a net input vector `N`, calculate the output, and plot both with bar graphs.

```
n = [0; 1; -0.5; 0.5];
a = softmax(n);
subplot(2,1,1), bar(n), ylabel('n')
subplot(2,1,2), bar(a), ylabel('a')
```

Assign this transfer function to layer `i` of a network.

```
net.layers{i}.transferFcn = 'softmax';
```

## Algorithms

```
a = softmax(n) = exp(n)/sum(exp(n))
```
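For a single net input column, the formula can be evaluated by hand, and the entries of each output column sum to 1. The following is a minimal sketch of that computation for a vector input, not a replacement for the toolbox function:

```matlab
n = [0; 1; -0.5; 0.5];
a = exp(n) ./ sum(exp(n));  % equivalent to softmax(n) for a single column
% sum(a) is 1: softmax normalizes the exponentiated inputs
```

Note that practical implementations typically subtract `max(n)` from `n` before exponentiating to avoid overflow; this leaves the result unchanged because the factor cancels in the ratio.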