org.apache.mxnet

FactorScheduler

Related Doc: package mxnet

class FactorScheduler extends LRScheduler

Class for reducing the learning rate by a fixed factor at regular update intervals.

Assuming the weight has been updated n times, the learning rate will be base_lr * factor^(floor(n/step)).
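The schedule can be checked in isolation. The following standalone Scala sketch reproduces the formula above without depending on MXNet (factorLR is an illustrative helper name, not part of the API):

```scala
// Standalone sketch of the factor schedule: lr = baseLR * factor^floor(n / step).
// Integer division numUpdate / step gives floor(n / step) for non-negative n.
def factorLR(baseLR: Float, factor: Float, step: Int, numUpdate: Int): Float =
  (baseLR * math.pow(factor.toDouble, (numUpdate / step).toDouble)).toFloat

// With baseLR = 0.1f, factor = 0.5f, step = 100:
//   updates   0..99  -> 0.1
//   updates 100..199 -> 0.05
//   updates 200..299 -> 0.025
```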

Linear Supertypes
LRScheduler, AnyRef, Any

Instance Constructors

  1. new FactorScheduler(step: Int, factor: Float)

    step

Int, the number of updates after which the learning rate is reduced

    factor

    Float, the factor for reducing the learning rate

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def apply(numUpdate: Int): Float

    Computes the learning rate for the given training progress.

    The training progress is represented by numUpdate, which can be roughly viewed as the number of minibatches executed so far. Its value is non-decreasing and increases by at most one.

    The exact value is the upper bound of the number of updates applied to a weight/index.

    numUpdate

    Int, the maximal number of updates applied to a weight.

    Definition Classes
    FactorScheduler → LRScheduler
  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. var baseLR: Float

    Definition Classes
    LRScheduler
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. var count: Int

    Attributes
    protected
  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. var factor: Float

    Float, the factor for reducing the learning rate

    Attributes
    protected
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. var step: Int

    Int, the number of updates after which the learning rate is reduced

    Attributes
    protected
  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. def toString(): String

    Definition Classes
    AnyRef → Any
  22. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
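
To illustrate how apply is driven by a non-decreasing numUpdate, here is a minimal standalone mimic of the documented contract (a sketch based only on the formula above; SimpleFactorScheduler is a hypothetical stand-in, not the MXNet class, which may additionally mutate internal state such as count):

```scala
// Hypothetical stand-in modelling lr = baseLR * factor^floor(numUpdate / step);
// this is NOT MXNet's implementation, only the behaviour its docs describe.
class SimpleFactorScheduler(step: Int, factor: Float) {
  var baseLR: Float = 0.01f // LRScheduler likewise exposes a mutable baseLR

  def apply(numUpdate: Int): Float =
    (baseLR * math.pow(factor.toDouble, (numUpdate / step).toDouble)).toFloat
}

val sched = new SimpleFactorScheduler(step = 2, factor = 0.5f)
sched.baseLR = 0.1f
// numUpdate is non-decreasing across calls, as the apply contract requires.
val lrs = (0 until 6).map(sched.apply) // 0.1, 0.1, 0.05, 0.05, 0.025, 0.025
```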

Inherited from LRScheduler

Inherited from AnyRef

Inherited from Any
