Neural network models

This module implements building blocks for larger neural network models in the Keras style. It does not implement a general autograd system, in order to emphasize conceptual understanding over flexibility.
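
As a rough sketch of what this Keras-style, no-autograd design looks like in practice, the toy example below composes a couple of hand-written layers and propagates gradients through explicit `backward` calls. The class names and signatures here are illustrative only, not the module's actual API.

```python
import numpy as np

class Dense:
    """Toy fully connected layer with an explicit, hand-written backward pass."""
    def __init__(self, n_in, n_out, lr=0.01):
        self.W = np.random.randn(n_in, n_out) * np.sqrt(2.0 / n_in)
        self.b = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.X = X                        # cache the input for the backward pass
        return X @ self.W + self.b

    def backward(self, dOut):
        dX = dOut @ self.W.T              # gradient passed to the previous layer
        self.W -= self.lr * (self.X.T @ dOut)   # plain SGD update
        self.b -= self.lr * dOut.sum(axis=0)
        return dX

class ReLU:
    """Toy ReLU activation with its derivative."""
    def forward(self, X):
        self.mask = X > 0
        return X * self.mask

    def backward(self, dOut):
        return dOut * self.mask

# Compose the layers by hand: forward pass, then backward pass in reverse order.
X, y = np.random.randn(8, 4), np.random.randn(8, 1)
fc1, act, fc2 = Dense(4, 16), ReLU(), Dense(16, 1)

pred = fc2.forward(act.forward(fc1.forward(X)))
grad = 2 * (pred - y) / len(y)            # gradient of mean squared error
fc1.backward(act.backward(fc2.backward(grad)))
```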

  1. Activations. Common activation nonlinearities.

  2. Losses. Common loss functions.

  3. Wrappers. Layer wrappers.

  4. Layers. Common layers / layer-wise operations that can be composed to create larger neural networks.

  5. Optimizers. Common modifications to stochastic gradient descent.

  6. Learning Rate Schedulers. Common learning rate decay schedules.

  7. Initializers. Common weight initialization strategies.

  8. Modules. Common multi-layer blocks that appear across many deep networks.

  9. Models. Well-known network architectures.

  10. Utils. Common helper functions, primarily for dealing with CNNs (see the convolution sketch after this list). Includes:

    • im2col
    • col2im
    • conv1D
    • conv2D
    • dilate
    • deconv2D
    • minibatch
    • Various weight initialization utilities
    • Various padding and convolution arithmetic utilities
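
To make the convolution utilities concrete, here is a minimal im2col-based 2D convolution (stride 1, no padding). It is an illustrative approximation written against the names above; the module's own `im2col` / `conv2D` may differ in signature, padding, and stride handling.

```python
import numpy as np

def im2col(X, kh, kw):
    """Unroll (N, H, W, C) image patches into rows (stride 1, no padding).

    Illustrative helper only; not the module's actual implementation.
    """
    N, H, W, C = X.shape
    out_h, out_w = H - kh + 1, W - kw + 1
    cols = np.empty((N * out_h * out_w, kh * kw * C))
    row = 0
    for n in range(N):
        for i in range(out_h):
            for j in range(out_w):
                cols[row] = X[n, i:i + kh, j:j + kw, :].ravel()
                row += 1
    return cols, (N, out_h, out_w)

def conv2D(X, W):
    """Convolve X (N, H, W, C_in) with kernels W (kh, kw, C_in, C_out)."""
    kh, kw, c_in, c_out = W.shape
    cols, (N, out_h, out_w) = im2col(X, kh, kw)
    out = cols @ W.reshape(-1, c_out)     # convolution as one matrix multiply
    return out.reshape(N, out_h, out_w, c_out)

# Example: two 5x5 single-channel images, a 3x3 kernel with 2 output channels.
X = np.random.randn(2, 5, 5, 1)
W = np.random.randn(3, 3, 1, 2)
print(conv2D(X, W).shape)                 # -> (2, 3, 3, 2)
```

Unrolling patches so that the convolution becomes a single matrix multiply is the usual motivation for pairing im2col/col2im with the conv2D and deconv2D routines.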