Knet v0.9.0 released with significant performance improvements, new benchmarks, and Windows support.
## Compatibility
- Windows GPU support implemented (tested with VS 2015 and CUDA 9.1).
- macOS GPU support improved: NVML only used when available.
- CUDA up to v"9.1" and cuDNN up to v"7.0.5" are tested.
- Pre-0.6 Julia versions no longer supported.
## General
- `rnninit` and `rnnforw` implement CUDNN RNNs (with @cangumeli).
- `conv4` performance significantly improved using cudnnFind.
- `batchnorm` implemented using CUDNN (@cangumeli).
- `logp` performance significantly improved using cudnnSoftmaxForward.
- `DBGFLAGS` and `PROFILING` constants defined in Knet.jl.
- `optimizers` creates optimization structs for the whole model.
- `dropout` now detects training mode automatically.
- `nll` returns the negative log likelihood given a score matrix and an answer index vector.
- `accuracy` returns the ratio of correct answers given a score matrix and an answer index vector.
- `minibatch(x,y,b)` returns a batch iterator.
- `knetgc` is now exported to cudaFree garbage collected pointers.
- `randn!`, `mean(a,dims)`, and `reshape` with `Colon` are now supported by KnetArray (@CarloLucibello).
- Using CUDAapi and CUDAdrv in build.jl if installed.
- Got rid of the Combinatorics dependency in test.
- `curandInit` called at initialization to prevent memory fill before first dropout.
- `deconv4` bug fixed (@ilkerkesen).
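Several of the new utilities (`minibatch`, `nll`, `accuracy`, `optimizers`) compose into a short training loop. A minimal sketch, assuming MNIST-like arrays `xtrn`/`ytrn`/`xtst`/`ytst` are already loaded; the linear model `predict` and its weights `w` are illustrative, not part of Knet:

```julia
using Knet

# Illustrative linear model; w and predict are hypothetical, not Knet API.
w = Any[0.1 * randn(10, 784), zeros(10)]
predict(w, x) = w[1] * x .+ w[2]

# nll: negative log likelihood from a score matrix and an answer index vector.
loss(w, x, y) = nll(predict(w, x), y)
lossgrad = grad(loss)

# optimizers: creates one optimization struct per weight array in the model.
opts = optimizers(w, Adam)

# minibatch(x, y, b): iterate over the data in batches of size b.
for (x, y) in minibatch(xtrn, ytrn, 100)
    g = lossgrad(w, x, y)
    update!(w, g, opts)
end

# accuracy: ratio of correct answers given scores and gold labels.
println(accuracy(predict(w, xtst), ytst))
```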
## Documentation and Examples
- New benchmarking notebooks under examples/DeepLearningFrameworks (with @kirnap, @ilkarman).
- Knet/data now has download utilities: cifar.jl, fashion-mnist.jl, gutenberg.jl, housing.jl, imagenet.jl, imdb.jl, mikolovptb.jl, mnist.jl, treebank.jl, wikiner.jl
- All examples updated to use the new RNN functions and either replaced with or accompanied by IJulia notebooks.
- New variational-autoencoder example (@CarloLucibello).
- DyNet benchmark examples added (@ilkerkesen).
- Deep Convolutional Generative Adversarial Networks example added (@ilkerkesen).