Knet v0.9.0 supports Windows, beats new benchmarks



Knet v0.9.0 has been released with significant performance improvements, new benchmarks, and Windows support.

Compatibility

  • Windows GPU support implemented (tested with VS-2015 and CUDA 9.1).
  • macOS GPU support improved: NVML is only used when available.
  • CUDA up to v"9.1" and cuDNN up to v"7.0.5" are tested.
  • Pre-0.6 Julia versions no longer supported.

General

  • rnninit and rnnforw implement cuDNN RNNs (with @cangumeli).
  • conv4 performance significantly improved using cudnnFind.
  • batchnorm implemented using cuDNN (@cangumeli).
  • logp performance significantly improved using cudnnSoftmaxForward.
  • DBGFLAGS and PROFILING constants defined in Knet.jl.
  • optimizers creates optimization structs for the whole model.
  • dropout now detects training mode automatically.
  • nll returns negative log likelihood given score matrix and answer index vector.
  • accuracy returns ratio of correct answers given score matrix and answer index vector.
  • minibatch(x,y,b) returns a batch iterator.
  • knetgc is now exported; it calls cudaFree on garbage-collected pointers.
  • randn!, mean(a,dims), and reshape with Colon are now supported by KnetArray (@CarloLucibello).
  • build.jl now uses CUDAapi and CUDAdrv if they are installed.
  • Removed the Combinatorics dependency from the tests.
  • curandInit is called at initialization to prevent memory fill before the first dropout.
  • deconv4 bug fixed (@ilkerkesen).
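The new training utilities above (nll, accuracy, minibatch, optimizers, and the training-mode-aware dropout) fit together in a short loop. The sketch below is illustrative, not from the release; the model definition, layer sizes, and random data are our own assumptions, standing in for something like MNIST:

```julia
using Knet

# Hypothetical dataset: 1000 samples of 784 features, 10 classes.
xtrn = rand(784, 1000)
ytrn = rand(1:10, 1000)

# A small MLP; w is a plain array of weight arrays (use KnetArray on GPU).
w = Any[0.1*randn(64,784), zeros(64,1), 0.1*randn(10,64), zeros(10,1)]
predict(w,x) = w[3]*max.(0, w[1]*x .+ w[2]) .+ w[4]

# nll takes a score matrix and an answer index vector directly.
loss(w,x,y) = nll(predict(w,x), y)
lossgrad = grad(loss)

data = minibatch(xtrn, ytrn, 100)   # new: returns a batch iterator
opts = optimizers(w, Adam)          # new: one optimizer struct per weight array

for (x,y) in data
    g = lossgrad(w,x,y)
    update!(w, g, opts)
end

# accuracy also takes a score matrix and an answer index vector.
acc = accuracy(predict(w,xtrn), ytrn)
```

With dropout in predict, no explicit training flag would be needed: dropout now detects automatically whether it is being called inside a gradient computation and becomes a no-op otherwise.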

Documentation and Examples

  • New benchmarking notebooks under examples/DeepLearningFrameworks (with @kirnap, @ilkarman).
  • Knet/data now has download utilities: cifar.jl, fashion-mnist.jl, gutenberg.jl, housing.jl, imagenet.jl, imdb.jl, mikolovptb.jl, mnist.jl, treebank.jl, wikiner.jl.
  • All examples updated to use the new RNNs and supplemented with IJulia notebooks.
  • New variational-autoencoder example (@CarloLucibello).
  • DyNet benchmark examples added (@ilkerkesen).
  • Deep Convolutional Generative Adversarial Networks example added (@ilkerkesen).