org.apache.clojure-mxnet.module
Module API for Clojure package.
arg-params
(arg-params mod)
aux-params
(aux-params mod)
backward
(backward mod out-grads)
(backward mod)
Backward computation.
`out-grads`: collection of NDArrays
Gradient on the outputs to be propagated back. This parameter is only
needed when bind is called on outputs that are not a loss function.
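Ex (a minimal sketch; assumes `mod` was bound on a loss symbol, so no
`out-grads` are needed, and `batch` is a DataBatch from a data iterator):
(forward mod batch)
(backward mod)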
bind
(bind mod {:keys [data-shapes label-shapes for-training inputs-need-grad force-rebind shared-module grad-req], :as opts, :or {for-training true, inputs-need-grad false, force-rebind false, grad-req "write"}})
Bind the symbols to construct executors. This is necessary before one
can perform computation with the module.
`mod`: module
`opts-map` {
`data-shapes`: collection of maps of `:name`, `:shape`, `:dtype`, and `:layout`
Typically is `(provide-data-desc data-iter)`. Data shape must be in the
form of `io/data-desc`
`label-shapes`: collection of maps of `:name`, `:shape`, `:dtype`, and `:layout`
Typically is `(provide-label-desc data-iter)`.
`for-training`: boolean - Default is `true`
Whether the executors should be bound for training.
`inputs-need-grad`: boolean - Default is `false`.
Whether the gradients to the input data need to be computed.
Typically this is not needed. But this might be needed when
implementing composition of modules.
`force-rebind`: boolean - Default is `false`.
This function does nothing if the executors are already bound. But
with this `true`, the executors will be forced to rebind.
`shared-module`: Default is nil.
This is used in bucketing. When not `nil`, the shared module
essentially corresponds to a different bucket -- a module with
different symbol but with the same sets of parameters (e.g. unrolled
RNNs with different lengths).
`grad-req`: string - Default is "write"
Requirement for gradient accumulation: "write", "add", or "null".
}
Ex:
(bind mod {:data-shapes (mx-io/provide-data train-iter)
           :label-shapes (mx-io/provide-label train-iter)})
borrow-optimizer
(borrow-optimizer mod shared-module)
Borrow optimizer from a shared module. Used in bucketing, where exactly the
same optimizer (esp. kvstore) is used.
`mod`: Module
`shared-module`: Module - The module to borrow the optimizer from.
data-names
(data-names mod)
data-shapes
(data-shapes mod)
exec-group
(exec-group mod)
fit
(fit mod {:keys [train-data eval-data num-epoch fit-params], :as opts, :or {num-epoch 1, fit-params (new FitParams)}})
Train the module parameters.
`mod`: Module
`opts-map` {
`train-data`: DataIter
`eval-data`: DataIter
If not nil, it will be used as a validation set to evaluate
performance after each epoch.
`num-epoch`: int
Number of epochs to run training.
`fit-params`: FitParams
Extra parameters for training (see fit-params).
}
Ex:
(fit mod {:train-data train-iter :eval-data test-iter :num-epoch 100})
(fit mod {:train-data train-iter
          :eval-data test-iter
          :num-epoch 5
          :fit-params
          (fit-params {:batch-end-callback (callback/speedometer 128 100)
                       :initializer (initializer/xavier)
                       :optimizer (optimizer/sgd {:learning-rate 0.01})
                       :eval-metric (eval-metric/mse)})})
fit-params
(fit-params {:keys [eval-metric kvstore optimizer initializer arg-params aux-params allow-missing force-rebind force-init begin-epoch validation-metric monitor batch-end-callback], :as opts, :or {eval-metric (eval-metric/accuracy), kvstore "local", optimizer (optimizer/sgd), initializer (initializer/uniform 0.01), allow-missing false, force-rebind false, force-init false, begin-epoch 0}})
(fit-params)
Initialize FitParams with provided parameters.
`eval-metric`: EvalMetric - Default is `accuracy`
`kvstore`: String - Default is "local"
`optimizer`: Optimizer - Default is `sgd`
`initializer`: Initializer - Default is `uniform`
Called to initialize parameters if needed.
`arg-params`: map
If not nil, should be a map of existing `arg-params`. Initialization
will be copied from that.
`aux-params`: map
If not nil, should be a map of existing `aux-params`. Initialization
will be copied from that.
`allow-missing`: boolean - Default is `false`
If `true`, params could contain missing values, and the initializer will
be called to fill those missing params.
`force-rebind`: boolean - Default is `false`
This function does nothing if the executors are already bound. But with
this `true`, the executors will be forced to rebind.
`force-init`: boolean - Default is `false`
If `true`, will force re-initialize even if already initialized.
`begin-epoch`: int - Default is 0
`validation-metric`: EvalMetric
`monitor`: Monitor
Ex:
(fit-params {:force-init true :force-rebind true :allow-missing true})
(fit-params
{:batch-end-callback (callback/speedometer batch-size 100)
:initializer (initializer/xavier)
:optimizer (optimizer/sgd {:learning-rate 0.01})
:eval-metric (eval-metric/mse)})
forward
(forward mod data-batch is-train)
(forward mod data-batch-map)
Forward computation.
`data-batch`: Either a map or a DataBatch
Input data in the form of `io/data-batch`.
`is-train`: boolean - Default is nil,
meaning `is-train` takes the value of `for-training` given at bind time.
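Ex (a sketch; `batch` is assumed to be a DataBatch taken from a data
iterator):
(forward mod batch)
(forward mod batch false) ; run explicitly in inference mode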
forward-backward
(forward-backward mod data-batch)
A convenient function that calls both `forward` and `backward`.
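Ex (a sketch; `batch` is assumed to be a DataBatch):
(forward-backward mod batch)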
get-params
(get-params mod)
grad-arrays
(grad-arrays mod)
init-optimizer
(init-optimizer mod {:keys [kvstore optimizer reset-optimizer force-init], :as opts, :or {kvstore "local", optimizer (optimizer/sgd), reset-optimizer true, force-init false}})
(init-optimizer mod)
Install and initialize optimizers.
`mod`: Module
`opts-map` {
`kvstore`: string - Default is "local"
`optimizer`: Optimizer - Default is `sgd`
`reset-optimizer`: boolean - Default is `true`
Whether to set `rescaleGrad` and `idx2name` for the optimizer
according to the executor group.
`force-init`: boolean - Default is `false`
Whether to force re-initializing the optimizer when one is already
installed.
}
Ex:
(init-optimizer mod {:optimizer (optimizer/sgd {:learning-rate 0.1})})
init-params
(init-params mod {:keys [initializer arg-params aux-params allow-missing force-init allow-extra], :as opts, :or {initializer (initializer/uniform 0.01), allow-missing false, force-init false, allow-extra false}})
(init-params mod)
Initialize the parameters and auxiliary states.
`opts-map` {
`initializer`: Initializer - Default is `uniform`
Called to initialize parameters if needed.
`arg-params`: map
If not nil, should be a map of existing arg-params. Initialization
will be copied from that.
`aux-params`: map
If not nil, should be a map of existing aux-params. Initialization
will be copied from that.
`allow-missing`: boolean - Default is `false`
If true, params could contain missing values, and the initializer will
be called to fill those missing params.
`force-init`: boolean - Default is `false`
If true, will force re-initialize even if already initialized.
`allow-extra`: boolean - Default is `false`
Whether to allow extra parameters that are not needed by the symbol.
If this is `true`, no error will be thrown when `arg-params` or
`aux-params` contain extra parameters that are not needed by the
executor.
}
Ex:
(init-params mod {:initializer (initializer/xavier)})
(init-params mod {:force-init true :allow-extra true})
install-monitor
(install-monitor mod monitor)
Install monitor on all executors.
label-shapes
(label-shapes mod)
load-checkpoint
(load-checkpoint {:keys [prefix epoch load-optimizer-states data-names label-names contexts workload-list fixed-param-names], :as opts, :or {load-optimizer-states false, data-names ["data"], label-names ["softmax_label"], contexts [(context/cpu)], workload-list nil, fixed-param-names nil}})
(load-checkpoint prefix epoch)
Create a model from a previously saved checkpoint.
`opts-map` {
`prefix`: string
Path prefix of saved model files. You should have `prefix-symbol.json`,
`prefix-xxxx.params`, and optionally `prefix-xxxx.states`, where `xxxx`
is the epoch number.
`epoch`: int
Epoch to load.
`load-optimizer-states`: boolean - Default is false
Whether to load optimizer states. The checkpoint needs to have been
saved with `save-opt-states` = `true`.
`data-names`: vector of strings - Default is ["data"]
Input data names.
`label-names`: vector of strings - Default is ["softmax_label"]
Input label names.
`contexts`: Context - Default is `context/cpu`
`workload-list`: Default is nil
Indicating a uniform workload.
`fixed-param-names`: Default is nil
Indicating that no network parameters are fixed.
}
Ex:
(load-checkpoint {:prefix "my-model" :epoch 1 :load-optimizer-states true})
load-optimizer-states
(load-optimizer-states mod fname)
Load optimizer (updater) state from file.
`mod`: Module
`fname`: string - Path to input states file.
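Ex (the states filename is illustrative):
(load-optimizer-states mod "my-model-0001.states")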
module
(module sym {:keys [data-names label-names contexts workload-list fixed-param-names], :as opts, :or {data-names ["data"], label-names ["softmax_label"], contexts [(context/default-context)]}})
(module sym data-names label-names contexts)
(module sym)
Module is a basic module that wraps a `symbol`.
`sym`: Symbol definition.
`opts-map` {
`data-names`: vector of strings - Default is ["data"]
Input data names
`label-names`: vector of strings - Default is ["softmax_label"]
Input label names
`contexts`: Context - Default is `(context/default-context)`.
`workload-list`: Default is nil
Indicating a uniform workload.
`fixed-param-names`: Default is nil
Indicating that no network parameters are fixed.
}
Ex:
(module sym)
(module sym {:data-names ["data"]
             :label-names ["linear_regression_label"]})
output-names
(output-names mod)
output-shapes
(output-shapes mod)
outputs
(outputs mod)
Get outputs of the previous forward computation.
When data parallelism is used, the outputs will be collected from
multiple devices. The results will look like
`[[out1_dev1, out1_dev2], [out2_dev1, out2_dev2]]`.
Those `NDArray`s might live on different devices.
outputs-merged
(outputs-merged mod)
Get outputs of the previous forward computation.
When data parallelism is used, the outputs will be merged from
multiple devices, so they appear as if produced by a single executor.
The results will look like `[out1, out2]`.
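Ex (a sketch; assumes `forward` has already been run on `mod` with two
devices and two outputs):
(outputs mod)        ; => [[out1-dev1 out1-dev2] [out2-dev1 out2-dev2]]
(outputs-merged mod) ; => [out1 out2]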
predict
(predict mod {:keys [eval-data num-batch reset], :as opts, :or {num-batch -1, reset true}})
Run prediction and collect the outputs.
`mod`: Module
`opts-map` {
`eval-data`: DataIter
`num-batch`: int - Default is `-1`,
meaning run all the batches in the data iterator.
`reset`: boolean - Default is `true`
Whether to reset the data iterator before starting prediction.
}
returns: vector of NDArrays `[out1, out2, out3]` where each element is the
concatenation of the outputs for all the mini-batches.
Ex:
(predict mod {:eval-data test-iter})
(predict mod {:eval-data test-iter :num-batch 10 :reset false})
predict-batch
(predict-batch mod data-batch)
Run prediction on a data batch.
`mod`: Module
`data-batch`: data-batch
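Ex (a sketch; the map form follows `io/data-batch`, and the input
NDArray shape is illustrative):
(predict-batch mod {:data [(ndarray/ones [10 784])]})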
predict-every-batch
(predict-every-batch mod {:keys [eval-data num-batch reset], :as opts, :or {num-batch -1, reset true}})
Run prediction and collect the outputs.
`mod`: Module
`opts-map` {
`eval-data`: DataIter
`num-batch`: int - Default is `-1`,
meaning run all the batches in the data iterator.
`reset`: boolean - Default is `true`
Whether to reset the data iterator before starting prediction.
}
returns: a nested collection like
`[[out1_batch1, out2_batch1, ...], [out1_batch2, out2_batch2, ...]]`
Note: This mode is useful because in some cases (e.g. bucketing), the module
does not necessarily produce the same number of outputs.
Ex:
(predict-every-batch mod {:eval-data test-iter})
reshape
(reshape mod data-shapes label-shapes)
(reshape mod data-shapes)
Reshapes the module for new input shapes.
`mod`: Module
`data-shapes`: Typically is `(provide-data data-iter)`
`label-shapes`: Typically is `(provide-label data-iter)`
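Ex (a sketch; `new-data-iter` is assumed to be a DataIter with a
different batch size than the one `mod` was bound with):
(reshape mod (mx-io/provide-data new-data-iter))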
save-checkpoint
(save-checkpoint mod {:keys [prefix epoch save-opt-states], :as opts, :or {save-opt-states false}})
(save-checkpoint mod prefix epoch)
Save current progress to checkpoint.
Use `mx.callback.module_checkpoint` as `epoch_end_callback` to save
during training.
`mod`: Module
`opts-map` {
`prefix`: string
The file prefix to checkpoint to
`epoch`: int
The current epoch number
`save-opt-states`: boolean - Default is `false`
Whether to save optimizer states to allow continued training.
}
Ex:
(save-checkpoint mod {:prefix "saved_model" :epoch 0 :save-opt-states true})
save-optimizer-states
(save-optimizer-states mod fname)
Save optimizer (updater) state to file.
`mod`: Module
`fname`: string - Path to output states file.
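Ex (the states filename is illustrative):
(save-optimizer-states mod "my-model-0001.states")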
score
(score mod {:keys [eval-data eval-metric num-batch reset epoch], :as opts, :or {num-batch Integer/MAX_VALUE, reset true, epoch 0}})
Run prediction on `eval-data` and evaluate the performance according to
`eval-metric`.
`mod`: module
`opts-map` {
`eval-data`: DataIter
`eval-metric`: EvalMetric
`num-batch`: int - Default is `Integer/MAX_VALUE`
Number of batches to run. The default means run until the `DataIter`
finishes.
`batch-end-callback`: not supported yet.
`reset`: boolean - Default is `true`
Whether to reset `eval-data` before starting evaluation.
`epoch`: int - Default is 0
For compatibility, this will be passed to callbacks (if any). During
training, this will correspond to the training epoch number.
}
Ex:
(score mod {:eval-data data-iter :eval-metric (eval-metric/accuracy)})
(score mod {:eval-data data-iter
:eval-metric (eval-metric/mse) :num-batch 10})
set-params
(set-params mod {:keys [arg-params aux-params allow-missing force-init allow-extra], :as opts, :or {allow-missing false, force-init true, allow-extra false}})
Assign parameters and aux state values.
`mod`: Module
`opts-map` {
`arg-params`: map - map of name to value (`NDArray`) mapping.
`aux-params`: map - map of name to value (`NDArray`) mapping.
`allow-missing`: boolean - Default is `false`
If `true`, params could contain missing values, and the initializer will
be called to fill those missing params.
`force-init`: boolean - Default is `true`
If `true`, will force re-initialize even if already initialized.
`allow-extra`: boolean - Default is `false`
Whether to allow extra parameters that are not needed by the symbol.
If this is `true`, no error will be thrown when `arg-params` or
`aux-params` contain extra parameters that are not needed by the
executor.
}
Ex:
(set-params mod
            {:arg-params {"fc_0_weight" (ndarray/array [0.15 0.2 0.25 0.3] [2 2])}
             :allow-missing true})
update
(update mod)
Update parameters according to the installed optimizer and the gradients
computed in the previous forward-backward batch.
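Ex (a minimal sketch of one epoch of the low-level training loop;
assumes `mod` has already been bound and had its params and optimizer
initialized, and that `train-iter` is a DataIter):
(let [metric (eval-metric/accuracy)]
  (mx-io/reduce-batches train-iter
                        (fn [i batch]
                          (-> mod
                              (forward batch)
                              (update-metric metric (mx-io/batch-label batch))
                              (backward)
                              (update))
                          (inc i))))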
update-metric
(update-metric mod eval-metric labels)
Evaluate and accumulate evaluation metric on outputs of the last forward
computation.
`mod`: module
`eval-metric`: EvalMetric
`labels`: collection of NDArrays
Ex:
(update-metric mod (eval-metric/mse) labels)