I added support for the Flux style of defining models/layers as callable objects.

```
struct Linear; w; b; end
(m::Linear)(x) = m.w * x .+ m.b
```

This way a model/layer acts both as a predict function and as a collection of parameters:

```
m = Linear(randn(10,784), zeros(10)) # define model
y = m(x) # gives the prediction
for p in (m.w, m.b)                  # iterate over parameters
    # ... use p ...
end
```

For training, the parameters should be marked as `AutoGrad.Param` objects, which makes it
possible to use the four interface functions `@diff`, `grad`, `params`, and `value`:

```
m = Linear(Param(randn(10,784)), Param(zeros(10)))
for p in params(m)                   # iterate over parameters
    # ... use p ...
end
y = m(x)       # returns the same y value as above (test mode)
y = @diff m(x) # gives an object with both value and grad info
value(y) # gives the prediction value
grad(y, m.w) # gives the gradient of value(y) wrt m.w
```
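Putting the pieces together, a single SGD training step might look like the following sketch. The loss function, learning rate, and data are hypothetical placeholders; the definitions from above are repeated so the snippet is self-contained:

```
using AutoGrad

struct Linear; w; b; end
(m::Linear)(x) = m.w * x .+ m.b

m = Linear(Param(randn(10,784)), Param(zeros(10)))
loss(m, x, ygold) = sum(abs2, m(x) .- ygold)   # hypothetical squared-error loss

x, ygold = randn(784, 100), randn(10, 100)     # hypothetical data
t = @diff loss(m, x, ygold)        # record the computation
for p in params(m)
    g = grad(t, p)                 # gradient of the loss wrt each Param
    value(p) .-= 0.01 .* g         # in-place SGD update (sketch)
end
```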

This interface is not mandatory: everything should be backward compatible, and old Knet
code should continue to work (as long as it is upgraded to Julia 1.0). However, the new
interface should allow people to easily define their layer/model collections and thus
address Knet issues #144, #147, #341.
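For instance, a layer collection can be built with the same callable-object pattern. Here is a sketch of a hypothetical `Chain` container (the `Dense` layer, `Chain` type, and `relu` helper are illustrative names, not part of Knet):

```
struct Dense; w; b; f; end                 # hypothetical dense layer
(d::Dense)(x) = d.f.(d.w * x .+ d.b)

struct Chain; layers; end                  # hypothetical container of layers
Chain(layers...) = Chain(layers)
(c::Chain)(x) = (for l in c.layers; x = l(x); end; x)   # apply layers in order

relu(x) = max(zero(x), x)
m = Chain(Dense(randn(64,784), zeros(64), relu),
          Dense(randn(10,64), zeros(10), identity))
y = m(randn(784))                          # forward pass through all layers
```

Because `Chain` is itself a callable object holding its layers, `params(m)` would find all the `Param`s nested inside it, so the same training loop works unchanged.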

For more in-depth examples, check out the new tutorial.