Configuration (Data as Code)
Welcome to the jutsu.ai configuration wiki! Here you can learn how to use the configuration options provided by jutsu.ai. The most important thing to remember is that jutsu.ai translates Clojure keywords into Java instance method calls:
:optimization-algo => (.optimizationAlgo network-instance ...)
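Under the hood this amounts to converting the kebab-case keyword into a camelCase method name and invoking it on the underlying builder. The sketch below only illustrates the idea; keyword->method-name is a hypothetical helper, not part of jutsu.ai's public API:

(require '[clojure.string :as str])

;; Hypothetical helper: turn :optimization-algo into "optimizationAlgo".
(defn keyword->method-name [k]
  (let [[head & tail] (str/split (name k) #"-")]
    (apply str head (map str/capitalize tail))))

(keyword->method-name :optimization-algo) ;;=> "optimizationAlgo"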
jutsu.ai also provides some convenient global variables that can be accessed in the config.
:sgd => stochastic gradient descent
You can find a list of these options here
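For a sense of what these globals expand to, here is an illustrative lookup map in the spirit of what jutsu.ai does internally (the map itself is an assumption made for this example; only the DL4J enums are real):

(import '[org.deeplearning4j.nn.api OptimizationAlgorithm]
        '[org.nd4j.linalg.activations Activation])

;; Illustrative only: config keywords and the DL4J values they stand for.
(def example-translations
  {:sgd     OptimizationAlgorithm/STOCHASTIC_GRADIENT_DESCENT
   :relu    Activation/RELU
   :softmax Activation/SOFTMAX})

(example-translations :sgd) ;; returns OptimizationAlgorithm/STOCHASTIC_GRADIENT_DESCENT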
A jutsu.ai configuration is a vector of keywords, each paired with a piece of data. A configuration can be broken down into three pieces: the header, the body, and the footer. The minimal allowable configuration is just the body. jutsu.ai splits a configuration vector on the special :layers keyword: the header is everything before :layers, the body is the data associated with :layers, and the footer is everything after it. A :layers keyword is therefore required in every jutsu.ai config; otherwise an error is thrown.
A good rule of thumb is that most options go in the header. If you run into an error, try moving the offending option to the footer section (after :layers). Here is an example configuration:
(def network-config
  [:optimization-algo :sgd
   :learning-rate 0.5
   :momentum 0.9
   :layers [[:dense [:n-in 4 :n-out 4 :activation :relu]]
            [:dense [:n-in 4 :n-out 4 :activation :relu]]
            [:output :negative-log-likelihood [:n-in 4 :n-out 10
                                               :activation :softmax]]]
   :pretrain false
   :backprop true])
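As a rough sketch, a config vector like this is then handed to jutsu.ai's builder functions. The names below (jutsu.ai.core/network and initialize-net) follow the project README; double-check them against the version you are using:

(require '[jutsu.ai.core :as ai])

;; Build the underlying DL4J configuration from the vector above,
;; then initialize the resulting network.
(def net
  (-> (ai/network network-config)
      (ai/initialize-net)))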
This configuration creates a simple feedforward neural net with 4 inputs, 10 outputs, and two hidden :dense layers. The layers data is itself a vector, like the rest of the config, and each layer is represented as its own vector. The first keyword in a layer vector specifies the layer type. The first two layers in the example above are :dense layers, which is the default layer type. The last layer is an :output layer, the default output type; it also accepts an argument specifying the loss function to use with that layer. Some other layer types likewise accept one or more arguments before the internal layer configuration vector. If you are confused as to why this is, or about how to set up a configuration, give the dl4j-examples repo a look.
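In other words, the two layer shapes used in the example above are:

;; [layer-type [inner-config]]
[:dense [:n-in 4 :n-out 4 :activation :relu]]

;; [layer-type extra-arg [inner-config]]
;; Here the extra argument names the loss function for the output layer.
[:output :negative-log-likelihood [:n-in 4 :n-out 10 :activation :softmax]]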
Please note that the :pretrain and :backprop options are located in the footer section. The :layers portion is the body, and everything above it is the header. Once again, if you are getting errors in your configuration, try rearranging the options according to this design.
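For reference, here is the example configuration again with the three sections marked (the layer vectors are elided for brevity):

[:optimization-algo :sgd                         ;; header (before :layers)
 :learning-rate 0.5                              ;; header
 :momentum 0.9                                   ;; header
 :layers [#_"layer vectors elided for brevity"]  ;; body
 :pretrain false                                 ;; footer (after :layers)
 :backprop true]                                 ;; footer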