org.apache.mxnet.javaapi

adam_updateParam


class adam_updateParam extends AnyRef

This Param object holds the arguments of the adam_update operator.
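For context, adam_update applies one step of the Adam optimizer to weight, using mean and vari as the running first and second moment estimates. A sketch of the usual update rule, in the same pseudocode style the parameter descriptions below use, showing where the constructor arguments and the optional beta1, beta2, epsilon and wd parameters enter (rescale_grad and clip_gradient are applied to grad beforehand; exact operator semantics may differ in edge cases):

    grad   = grad + wd * weight                            // weight decay folded into the gradient
    mean   = beta1 * mean + (1 - beta1) * grad             // moving mean (1st moment)
    vari   = beta2 * vari + (1 - beta2) * grad^2           // moving variance (2nd moment)
    weight = weight - lr * mean / (sqrt(vari) + epsilon)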

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new adam_updateParam(weight: NDArray, grad: NDArray, mean: NDArray, vari: NDArray, lr: Float)

    weight: Weight
    grad: Gradient
    mean: Moving mean
    vari: Moving variance
    lr: Learning rate

    A construction sketch follows below.
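A minimal construction sketch in Java, assuming the NDArray(float[], Shape, Context) constructor and Context.cpu() from org.apache.mxnet.javaapi; the shapes and values are made up for illustration, and only the required constructor arguments are passed (all optional fields keep the operator defaults):

    import org.apache.mxnet.javaapi.Context;
    import org.apache.mxnet.javaapi.NDArray;
    import org.apache.mxnet.javaapi.Shape;

    Context ctx = Context.cpu();
    Shape shape = new Shape(new int[]{2, 2});

    // Hypothetical optimizer state for a single 2x2 weight tensor.
    NDArray weight = new NDArray(new float[]{1f, 2f, 3f, 4f}, shape, ctx);
    NDArray grad   = new NDArray(new float[]{0.1f, 0.2f, 0.3f, 0.4f}, shape, ctx);
    NDArray mean   = new NDArray(new float[]{0f, 0f, 0f, 0f}, shape, ctx);
    NDArray vari   = new NDArray(new float[]{0f, 0f, 0f, 0f}, shape, ctx);

    // Required arguments: weight, grad, mean, vari and the learning rate.
    adam_updateParam param = new adam_updateParam(weight, grad, mean, vari, 0.001f);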

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. def getBeta1(): Float

  10. def getBeta2(): Float

  11. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  12. def getClip_gradient(): Float

  13. def getEpsilon(): Float

  14. def getGrad(): NDArray

  15. def getLazy_update(): Boolean

  16. def getLr(): Float

  17. def getMean(): NDArray

  18. def getOut(): mxnet.NDArray

  19. def getRescale_grad(): Float

  20. def getVari(): NDArray

  21. def getWd(): Float

  22. def getWeight(): NDArray

  23. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  24. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  25. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  26. final def notify(): Unit

    Definition Classes
    AnyRef
  27. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  28. def setBeta1(beta1: Float): adam_updateParam

    beta1: The decay rate for the 1st moment estimates.

  29. def setBeta2(beta2: Float): adam_updateParam

    beta2: The decay rate for the 2nd moment estimates.

  30. def setClip_gradient(clip_gradient: Float): adam_updateParam

    clip_gradient: Clip gradient to the range [-clip_gradient, clip_gradient], i.e. grad = max(min(grad, clip_gradient), -clip_gradient). If clip_gradient <= 0, gradient clipping is turned off.

  31. def setEpsilon(epsilon: Float): adam_updateParam

    epsilon: A small constant for numerical stability.

  32. def setLazy_update(lazy_update: Boolean): adam_updateParam

    lazy_update: If true, lazy updates are applied if the gradient's stype is row_sparse and all of w, m and v have the same stype.

  33. def setOut(out: NDArray): adam_updateParam

  34. def setRescale_grad(rescale_grad: Float): adam_updateParam

    rescale_grad: Rescale gradient to grad = rescale_grad * grad.

  35. def setWd(wd: Float): adam_updateParam

    wd: Weight decay augments the objective function with a regularization term that penalizes large weights. The penalty scales with the square of the magnitude of each weight.

  36. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  37. def toString(): String

    Definition Classes
    AnyRef → Any
  38. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  40. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
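Every setter returns the adam_updateParam instance itself, so the optional fields can be configured fluently. A sketch continuing the construction example above; the values are illustrative, and the commented-out final call assumes a generated NDArray.adam_update entry point that is not documented on this page:

    // Builder-style configuration of the optional fields; each setter returns `this`.
    param.setBeta1(0.9f)
         .setBeta2(0.999f)
         .setEpsilon(1e-8f)
         .setWd(0.0001f)
         .setClip_gradient(5.0f)
         .setLazy_update(false)
         .setOut(weight);   // write the updated weight back to the weight NDArray

    // Hypothetical invocation of the operator this Param object configures:
    // NDArray.adam_update(param);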
