1. Test Modules
  1. Training Characteristics
    1. Input Learning
      1. Gradient Descent
      2. Conjugate Gradient Descent
      3. Limited-Memory BFGS
    2. Results
  2. Results

Subreport: Logs for com.simiacryptus.ref.lang.ReferenceCountingBase

Test Modules

Using Seed 2738985069112613888

Training Characteristics

Input Learning

In this test, we use a network to learn a target input, given its pre-evaluated output:

TrainingTester.java:332 executed in 0.01 seconds (0.000 gc):

    return RefArrays.stream(RefUtil.addRef(input_target)).flatMap(RefArrays::stream).map(x -> {
      try {
        return x.prettyPrint();
      } finally {
        x.freeRef();
      }
    }).reduce((a, b) -> a + "\n" + b).orElse("");

Returns

    [
    	[ [ 0.08 ], [ -0.384 ], [ -1.72 ], [ 0.7 ] ],
    	[ [ -0.128 ], [ -0.608 ], [ 0.496 ], [ -0.804 ] ],
    	[ [ 1.912 ], [ -1.688 ], [ 1.524 ], [ 0.048 ] ],
    	[ [ -0.852 ], [ 1.208 ], [ -1.028 ], [ 1.764 ] ]
    ]
    [
    	[ [ 0.496 ], [ 0.7 ], [ -1.72 ], [ -0.804 ] ],
    	[ [ 1.208 ], [ 0.048 ], [ -1.028 ], [ -0.608 ] ],
    	[ [ 1.912 ], [ -1.688 ], [ -0.128 ], [ 1.524 ] ],
    	[ [ 1.764 ], [ -0.384 ], [ -0.852 ], [ 0.08 ] ]
    ]
    [
    	[ [ -1.028 ], [ 0.496 ], [ 1.208 ], [ 1.912 ] ],
    	[ [ -0.804 ], [ -0.384 ], [ -0.608 ], [ -1.688 ] ],
    	[ [ -0.128 ], [ 0.08 ], [ 1.764 ], [ 0.048 ] ],
    	[ [ -1.72 ], [ 0.7 ], [ 1.524 ], [ -0.852 ] ]
    ]
    [
    	[ [ 0.7 ], [ -0.852 ], [ 1.764 ], [ 0.08 ] ],
    	[ [ -1.688 ], [ -0.384 ], [ -1.72 ], [ -0.128 ] ],
    	[ [ 1.208 ], [ 0.496 ], [ -1.028 ], [ -0.608 ] ],
    	[ [ 0.048 ], [ 1.912 ], [ 1.524 ], [ -0.804 ] ]
    ]
    [
    	[ [ -1.72 ], [ -1.688 ], [ 1.912 ], [ 0.048 ] ],
    	[ [ -0.384 ], [ 1.524 ], [ 0.496 ], [ -0.608 ] ],
    	[ [ 1.764 ], [ -0.852 ], [ -0.804 ], [ 1.208 ] ],
    	[ [ 0.08 ], [ 0.7 ], [ -1.028 ], [ -0.128 ] ]
    ]
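
Conceptually, this "input learning" setup freezes the network and applies gradient descent to the input tensors themselves until the network reproduces the pre-evaluated target output shown above. A minimal, self-contained sketch of the idea, using a toy frozen linear model rather than the MindsEye API (all names and values below are illustrative):

    // Input learning in miniature: the model (a fixed linear map W) is frozen and
    // gradient descent is applied to the input x until W*x matches a recorded target y.
    public final class InputLearningSketch {
      public static void main(String[] args) {
        double[][] W = {{1.0, 2.0}, {3.0, 4.0}}; // frozen "network"
        double[] y = {1.0, 0.5};                 // pre-evaluated target output
        double[] x = {0.0, 0.0};                 // trainable input
        double lr = 0.05;
        for (int iter = 0; iter < 2000; iter++) {
          double[] out = new double[y.length];   // forward pass: out = W * x
          for (int i = 0; i < W.length; i++)
            for (int j = 0; j < x.length; j++)
              out[i] += W[i][j] * x[j];
          for (int j = 0; j < x.length; j++) {   // gradient of mean squared error w.r.t. x
            double grad = 0;
            for (int i = 0; i < W.length; i++)
              grad += W[i][j] * (out[i] - y[i]);
            x[j] -= lr * 2.0 / y.length * grad;
          }
        }
        System.out.println(java.util.Arrays.toString(x)); // recovered input, approx. [-1.5, 1.25]
      }
    }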

Gradient Descent

First, we train using the basic gradient descent method with weak line search conditions.
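
The ArmijoWolfeSearch used here accepts a step size t when it satisfies a sufficient-decrease (Armijo) condition and a curvature (Wolfe) condition on the directional derivative. A minimal sketch of the textbook acceptance test, not the library's implementation (the method name and the constants c1, c2 are illustrative):

    // Weak Wolfe acceptance test for a candidate step t along a descent direction.
    // f0, g0: objective value and directional derivative at t = 0 (g0 < 0)
    // ft, gt: objective value and directional derivative at the candidate step t
    // Typical constants: c1 = 1e-4, c2 = 0.9, with 0 < c1 < c2 < 1.
    static boolean wolfeAccept(double f0, double g0, double ft, double gt,
                               double t, double c1, double c2) {
      boolean armijo = ft <= f0 + c1 * t * g0; // sufficient decrease
      boolean curvature = gt >= c2 * g0;       // weak Wolfe curvature condition
      return armijo && curvature;
    }

The strong variant additionally requires |gt| <= c2 * |g0|; the "WOLF (strong)" lines in the log below appear to mark steps that met that stricter condition.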

TrainingTester.java:480 executed in 0.49 seconds (0.000 gc):

    IterativeTrainer iterativeTrainer = new IterativeTrainer(trainable.addRef());
    try {
      iterativeTrainer.setLineSearchFactory(label -> new ArmijoWolfeSearch());
      iterativeTrainer.setOrientation(new GradientDescent());
      iterativeTrainer.setMonitor(TrainingTester.getMonitor(history));
      iterativeTrainer.setTimeout(30, TimeUnit.SECONDS);
      iterativeTrainer.setMaxIterations(250);
      iterativeTrainer.setTerminateThreshold(0);
      return iterativeTrainer.run();
    } finally {
      iterativeTrainer.freeRef();
    }
Logging
Reset training subject: 2280841330291
Reset training subject: 2280892593824
Constructing line search parameters: GD
th(0)=2.5372736;dx=-0.50745472
New Minimum: 2.5372736 > 1.5617653557073885
END: th(2.154434690031884)=1.5617653557073885; dx=-0.3981269147611584 evalInputDelta=0.9755082442926113
Fitness changed from 2.5372736 to 1.5617653557073885
Iteration 1 complete. Error: 1.5617653557073885 Total: 0.1476; Orientation: 0.0057; Line Search: 0.0555
th(0)=1.5617653557073885;dx=-0.31235307114147776
New Minimum: 1.5617653557073885 > 0.4484229745550457
END: th(4.641588833612779)=0.4484229745550457; dx=-0.1673716184259836 evalInputDelta=1.1133423811523429
Fitness changed from 1.5617653557073885 to 0.4484229745550457
Iteration 2 complete. Error: 0.4484229745550457 Total: 0.0699; Orientation: 0.0017; Line Search: 0.0467
th(0)=0.4484229745550457;dx=-0.08968459491100916
New Minimum: 0.4484229745550457 > 1.9971893523295636E-32
WOLF (strong): th(10.000000000000002)=1.9971893523295636E-32; dx=1.6973057021074658E-17 evalInputDelta=0.4484229745550457
END: th(5.000000000000001)=0.11210574363876138; dx=-0.04484229745550457 evalInputDelta=0.3363172309162843
Fitness changed from 0.4484229745550457 to 1.9971893523295636E-32
Iteration 3 complete. Error: 1.9971893523295636E-32 Total: 0.0830; Orientation: 0.0015; Line Search: 0.0605
Zero gradient: 6.320109733746027E-17
th(0)=1.9971893523295636E-32;dx=-3.994378704659128E-33
New Minimum: 1.9971893523295636E-32 > 2.407412430484045E-36
WOLF (strong): th(10.772173450159421)=2.407412430484045E-36; dx=8.666684749742562E-36 evalInputDelta=1.9969486110865152E-32
END: th(5.386086725079711)=2.24370838521113E-33; dx=-1.0303725202471715E-33 evalInputDelta=1.7728185138084505E-32
Fitness changed from 1.9971893523295636E-32 to 2.407412430484045E-36
Iteration 4 complete. Error: 2.407412430484045E-36 Total: 0.0810; Orientation: 0.0021; Line Search: 0.0589
Zero gradient: 6.938893903907229E-19
th(0)=2.407412430484045E-36;dx=-4.8148248609680906E-37
New Minimum: 2.407412430484045E-36 > 0.0
END: th(11.60397208403195)=0.0; dx=0.0 evalInputDelta=2.407412430484045E-36
Fitness changed from 2.407412430484045E-36 to 0.0
Iteration 5 complete. Error: 0.0 Total: 0.0498; Orientation: 0.0021; Line Search: 0.0345
Zero gradient: 0.0
th(0)=0.0;dx=0.0 (ERROR: Starting derivative negative)
Fitness changed from 0.0 to 0.0
Static Iteration Total: 0.0437; Orientation: 0.0017; Line Search: 0.0287
Iteration 6 failed. Error: 0.0
Previous Error: 0.0 -> 0.0
Optimization terminated 6
Final threshold in iteration 6: 0.0 (> 0.0) after 0.476s (< 30.000s)

Returns

    0.0

Training Converged

Conjugate Gradient Descent

Next, we use a conjugate gradient descent method, which converges fastest on purely quadratic functions.
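
The QuadraticSearch line search seen below first expands the trial step geometrically to bracket the minimum, then refines it by quadratic interpolation. One interpolation step can be sketched as follows; this is the standard formula, not the library's code, and the names are illustrative:

    // One quadratic-interpolation step: fit a parabola q(s) with q(0) = f0, q'(0) = g0,
    // and q(t) = ft, then return its minimizer. Assumes g0 < 0 and ft > f0 + g0 * t,
    // so the parabola opens upward and the minimizer lies in (0, t).
    static double quadraticStep(double f0, double g0, double ft, double t) {
      return -g0 * t * t / (2.0 * (ft - f0 - g0 * t));
    }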

TrainingTester.java:452 executed in 0.39 seconds (0.000 gc):

    IterativeTrainer iterativeTrainer = new IterativeTrainer(trainable.addRef());
    try {
      iterativeTrainer.setLineSearchFactory(label -> new QuadraticSearch());
      iterativeTrainer.setOrientation(new GradientDescent());
      iterativeTrainer.setMonitor(TrainingTester.getMonitor(history));
      iterativeTrainer.setTimeout(30, TimeUnit.SECONDS);
      iterativeTrainer.setMaxIterations(250);
      iterativeTrainer.setTerminateThreshold(0);
      return iterativeTrainer.run();
    } finally {
      iterativeTrainer.freeRef();
    }
Logging
Reset training subject: 2281323024023
Reset training subject: 2281337768217
Constructing line search parameters: GD
F(0.0) = LineSearchPoint{point=PointSample{avg=2.5372736}, derivative=-0.50745472}
New Minimum: 2.5372736 > 2.5372735999492546
F(1.0E-10) = LineSearchPoint{point=PointSample{avg=2.5372735999492546}, derivative=-0.5074547199949255}, evalInputDelta = -5.0745185831146955E-11
New Minimum: 2.5372735999492546 > 2.5372735996447817
F(7.000000000000001E-10) = LineSearchPoint{point=PointSample{avg=2.5372735996447817}, derivative=-0.5074547199644781}, evalInputDelta = -3.552180771748681E-10
New Minimum: 2.5372735996447817 > 2.5372735975134715
F(4.900000000000001E-9) = LineSearchPoint{point=PointSample{avg=2.5372735975134715}, derivative=-0.5074547197513473}, evalInputDelta = -2.486528316580916E-9
New Minimum: 2.5372735975134715 > 2.537273582594303
F(3.430000000000001E-8) = LineSearchPoint{point=PointSample{avg=2.537273582594303}, derivative=-0.5074547182594303}, evalInputDelta = -1.7405696883798782E-8
New Minimum: 2.537273582594303 > 2.537273478160123
F(2.4010000000000004E-7) = LineSearchPoint{point=PointSample{avg=2.537273478160123}, derivative=-0.5074547078160122}, evalInputDelta = -1.2183987685432385E-7
New Minimum: 2.537273478160123 > 2.537272747120924
F(1.6807000000000003E-6) = LineSearchPoint{point=PointSample{avg=2.537272747120924}, derivative=-0.5074546347120852}, evalInputDelta = -8.528790758077776E-7
New Minimum: 2.537272747120924 > 2.5372676298494765
F(1.1764900000000001E-5) = LineSearchPoint{point=PointSample{avg=2.5372676298494765}, derivative=-0.5074541229845966}, evalInputDelta = -5.970150523282314E-6
New Minimum: 2.5372676298494765 > 2.5372318090938366
F(8.235430000000001E-5) = LineSearchPoint{point=PointSample{avg=2.5372318090938366}, derivative=-0.5074505408921753}, evalInputDelta = -4.179090616318604E-5
New Minimum: 2.5372318090938366 > 2.536981070884372
F(5.764801000000001E-4) = LineSearchPoint{point=PointSample{avg=2.536981070884372}, derivative=-0.5074254662452269}, evalInputDelta = -2.9252911562771544E-4
New Minimum: 2.536981070884372 > 2.535226250338966
F(0.004035360700000001) = LineSearchPoint{point=PointSample{avg=2.535226250338966}, derivative=-0.5072499437165883}, evalInputDelta = -0.002047349661033593
New Minimum: 2.535226250338966 > 2.5229595056422536
F(0.028247524900000005) = LineSearchPoint{point=PointSample{avg=2.5229595056422536}, derivative=-0.5060212860161178}, evalInputDelta = -0.014314094357746221
New Minimum: 2.5229595056422536 > 2.437925249700972
F(0.19773267430000002) = LineSearchPoint{point=PointSample{avg=2.437925249700972}, derivative=-0.4974206821128243}, evalInputDelta = -0.099348350299028
New Minimum: 2.437925249700972 > 1.883500347961436
F(1.3841287201) = LineSearchPoint{point=PointSample{avg=1.883500347961436}, derivative=-0.4372164547897696}, evalInputDelta = -0.6537732520385637
New Minimum: 1.883500347961436 > 0.0024556384071461796
F(9.688901040700001) = LineSearchPoint{point=PointSample{avg=0.0024556384071461796}, derivative=-0.015786863528387207}, evalInputDelta = -2.5348179615928537
F(67.8223072849) = LineSearchPoint{point=PointSample{avg=84.83169320002756}, derivative=2.93422027530129}, evalInputDelta = 82.29441960002755
F(5.217100560376924) = LineSearchPoint{point=PointSample{avg=0.5804299323306081}, derivative=-0.24271048959220853}, evalInputDelta = -1.9568436676693917
F(36.51970392263847) = LineSearchPoint{point=PointSample{avg=17.84451065547224}, derivative=1.3457548928545409}, evalInputDelta = 15.307237055472243
F(2.809207994049113) = LineSearchPoint{point=PointSample{avg=1.311960484691878}, derivative=-0.3649001343958046}, evalInputDelta = -1.2253131153081218
F(19.66445595834379) = LineSearchPoint{point=PointSample{avg=2.369856903664098}, derivative=0.4904273792293681}, evalInputDelta = -0.16741669633590162
2.369856903664098 <= 2.5372736
New Minimum: 0.0024556384071461796 > 3.4685998298414117E-32
F(10.0) = LineSearchPoint{point=PointSample{avg=3.4685998298414117E-32}, derivative=3.5706437806481976E-17}, evalInputDelta = -2.5372736
Right bracket at 10.0
Converged to right
Fitness changed from 2.5372736 to 3.4685998298414117E-32
Iteration 1 complete. Error: 3.4685998298414117E-32 Total: 0.3382; Orientation: 0.0014; Line Search: 0.3017
Zero gradient: 8.328985328167427E-17
F(0.0) = LineSearchPoint{point=PointSample{avg=3.4685998298414117E-32}, derivative=-6.937199659682826E-33}
New Minimum: 3.4685998298414117E-32 > 0.0
F(10.0) = LineSearchPoint{point=PointSample{avg=0.0}, derivative=0.0}, evalInputDelta = -3.4685998298414117E-32
0.0 <= 3.4685998298414117E-32
Converged to right
Fitness changed from 3.4685998298414117E-32 to 0.0
Iteration 2 complete. Error: 0.0 Total: 0.0307; Orientation: 0.0008; Line Search: 0.0214
Zero gradient: 0.0
F(0.0) = LineSearchPoint{point=PointSample{avg=0.0}, derivative=0.0}
Fitness changed from 0.0 to 0.0
Static Iteration Total: 0.0219; Orientation: 0.0009; Line Search: 0.0122
Iteration 3 failed. Error: 0.0
Previous Error: 0.0 -> 0.0
Optimization terminated 3
Final threshold in iteration 3: 0.0 (> 0.0) after 0.391s (< 30.000s)

Returns

    0.0

Training Converged

Limited-Memory BFGS

Next, we apply the same optimization using L-BFGS, which is nearly ideal for smooth, approximately quadratic functions.
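
L-BFGS builds its search direction from a short history of recent steps and gradient changes, applied through the standard two-loop recursion. A compact, self-contained sketch of that recursion (illustrative only, not the library's implementation):

    // Standard L-BFGS two-loop recursion: given the current gradient g and a history of
    // s[i] = x_{i+1} - x_i and y[i] = grad_{i+1} - grad_i (oldest first, each dot(y[i], s[i]) > 0),
    // compute the quasi-Newton descent direction -H*g.
    static double[] lbfgsDirection(double[] g, double[][] s, double[][] y) {
      int m = s.length;
      double[] q = g.clone();
      double[] alpha = new double[m];
      double[] rho = new double[m];
      for (int i = m - 1; i >= 0; i--) {            // first loop: newest to oldest
        rho[i] = 1.0 / dot(y[i], s[i]);
        alpha[i] = rho[i] * dot(s[i], q);
        axpy(-alpha[i], y[i], q);                   // q -= alpha[i] * y[i]
      }
      double gamma = dot(s[m - 1], y[m - 1]) / dot(y[m - 1], y[m - 1]);
      double[] z = scale(gamma, q);                 // initial scaling H0 = gamma * I
      for (int i = 0; i < m; i++) {                 // second loop: oldest to newest
        double beta = rho[i] * dot(y[i], z);
        axpy(alpha[i] - beta, s[i], z);             // z += (alpha[i] - beta) * s[i]
      }
      return scale(-1.0, z);                        // descent direction
    }

    static double dot(double[] a, double[] b) {
      double sum = 0;
      for (int i = 0; i < a.length; i++) sum += a[i] * b[i];
      return sum;
    }

    static void axpy(double a, double[] x, double[] acc) { // acc += a * x
      for (int i = 0; i < x.length; i++) acc[i] += a * x[i];
    }

    static double[] scale(double a, double[] x) {
      double[] r = new double[x.length];
      for (int i = 0; i < x.length; i++) r[i] = a * x[i];
      return r;
    }

When the stored curvature information becomes unusable (here, once the gradients are essentially zero), the computed direction is rejected and history entries are dropped, which appears to be what the "Orientation rejected. Popping history element" lines in the log below record.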

TrainingTester.java:509 executed in 0.50 seconds (0.000 gc):

    IterativeTrainer iterativeTrainer = new IterativeTrainer(trainable.addRef());
    try {
      iterativeTrainer.setLineSearchFactory(label -> new ArmijoWolfeSearch());
      iterativeTrainer.setOrientation(new LBFGS());
      iterativeTrainer.setMonitor(TrainingTester.getMonitor(history));
      iterativeTrainer.setTimeout(30, TimeUnit.SECONDS);
      iterativeTrainer.setIterationsPerSample(100);
      iterativeTrainer.setMaxIterations(250);
      iterativeTrainer.setTerminateThreshold(0);
      return iterativeTrainer.run();
    } finally {
      iterativeTrainer.freeRef();
    }
Logging
Reset training subject: 2281721209337
Reset training subject: 2281730243949
Adding measurement 7af7fc65 to history. Total: 0
LBFGS Accumulation History: 1 points
Constructing line search parameters: GD
Non-optimal measurement 2.5372736 < 2.5372736. Total: 1
th(0)=2.5372736;dx=-0.50745472
Adding measurement 602321b to history. Total: 1
New Minimum: 2.5372736 > 1.5617653557073885
END: th(2.154434690031884)=1.5617653557073885; dx=-0.3981269147611584 evalInputDelta=0.9755082442926113
Fitness changed from 2.5372736 to 1.5617653557073885
Iteration 1 complete. Error: 1.5617653557073885 Total: 0.0544; Orientation: 0.0042; Line Search: 0.0255
Non-optimal measurement 1.5617653557073885 < 1.5617653557073885. Total: 2
LBFGS Accumulation History: 2 points
Non-optimal measurement 1.5617653557073885 < 1.5617653557073885. Total: 2
th(0)=1.5617653557073885;dx=-0.31235307114147776
Adding measurement 1f95d80d to history. Total: 2
New Minimum: 1.5617653557073885 > 0.4484229745550457
END: th(4.641588833612779)=0.4484229745550457; dx=-0.1673716184259836 evalInputDelta=1.1133423811523429
Fitness changed from 1.5617653557073885 to 0.4484229745550457
Iteration 2 complete. Error: 0.4484229745550457 Total: 0.0356; Orientation: 0.0023; Line Search: 0.0245
Non-optimal measurement 0.4484229745550457 < 0.4484229745550457. Total: 3
LBFGS Accumulation History: 3 points
Non-optimal measurement 0.4484229745550457 < 0.4484229745550457. Total: 3
th(0)=0.4484229745550457;dx=-0.08968459491100916
Adding measurement 6d56409e to history. Total: 3
New Minimum: 0.4484229745550457 > 1.9971893523295636E-32
WOLF (strong): th(10.000000000000002)=1.9971893523295636E-32; dx=1.6973057021074658E-17 evalInputDelta=0.4484229745550457
Non-optimal measurement 0.11210574363876138 < 1.9971893523295636E-32. Total: 4
END: th(5.000000000000001)=0.11210574363876138; dx=-0.04484229745550457 evalInputDelta=0.3363172309162843
Fitness changed from 0.4484229745550457 to 1.9971893523295636E-32
Iteration 3 complete. Error: 1.9971893523295636E-32 Total: 0.0390; Orientation: 0.0018; Line Search: 0.0297
Non-optimal measurement 1.9971893523295636E-32 < 1.9971893523295636E-32. Total: 4
Rejected: LBFGS Orientation magnitude: 6.320e-16, gradient 6.320e-17, dot -1.000; [341d5d46-166d-4cc4-94dc-63aae33d20dc = 1.000/1.000e+00, 9a4dfdf6-a559-4379-b48b-7b51ba767122 = 1.000/1.000e+00, 91e52dfe-b115-46a3-840b-e78c50fe6eb9 = 1.000/1.000e+00, 0761f47f-1e21-4529-80d3-6430b53b4c34 = 1.000/1.000e+00, a9e136a3-84cc-45d3-a62c-e4acfba95887 = 1.000/1.000e+00]
Orientation rejected. Popping history element from 1.9971893523295636E-32, 0.4484229745550457, 1.5617653557073885, 2.5372736
LBFGS Accumulation History: 3 points
Removed measurement 6d56409e to history. Total: 3
Adding measurement 76d4cf50 to history. Total: 3
th(0)=1.9971893523295636E-32;dx=-3.994378704659128E-33
Adding measurement 5244d055 to history. Total: 4
New Minimum: 1.9971893523295636E-32 > 2.407412430484045E-36
WOLF (strong): th(10.772173450159421)=2.407412430484045E-36; dx=8.666684749742562E-36 evalInputDelta=1.9969486110865152E-32
Non-optimal measurement 2.24370838521113E-33 < 2.407412430484045E-36. Total: 5
END: th(5.386086725079711)=2.24370838521113E-33; dx=-1.0303725202471715E-33 evalInputDelta=1.7728185138084505E-32
Fitness changed from 1.9971893523295636E-32 to 2.407412430484045E-36
Iteration 4 complete. Error: 2.407412430484045E-36 Total: 0.1030; Orientation: 0.0633; Line Search: 0.0331
Non-optimal measurement 2.407412430484045E-36 < 2.407412430484045E-36. Total: 5
Rejected: LBFGS Orientation magnitude: 6.939e-18, gradient 6.939e-19, dot -1.000; [91e52dfe-b115-46a3-840b-e78c50fe6eb9 = 0.000e+00, a9e136a3-84cc-45d3-a62c-e4acfba95887 = 1.000/1.000e+00, 0761f47f-1e21-4529-80d3-6430b53b4c34 = 0.000e+00, 9a4dfdf6-a559-4379-b48b-7b51ba767122 = 0.000e+00, 341d5d46-166d-4cc4-94dc-63aae33d20dc = 0.000e+00]
Orientation rejected. Popping history element from 2.407412430484045E-36, 1.9971893523295636E-32, 0.4484229745550457, 1.5617653557073885, 2.5372736
Rejected: LBFGS Orientation magnitude: 6.939e-18, gradient 6.939e-19, dot -1.000; [a9e136a3-84cc-45d3-a62c-e4acfba95887 = 1.000/1.000e+00, 341d5d46-166d-4cc4-94dc-63aae33d20dc = 0.000e+00, 9a4dfdf6-a559-4379-b48b-7b51ba767122 = 0.000e+00, 91e52dfe-b115-46a3-840b-e78c50fe6eb9 = 0.000e+00, 0761f47f-1e21-4529-80d3-6430b53b4c34 = 0.000e+00]
Orientation rejected. Popping history element from 2.407412430484045E-36, 1.9971893523295636E-32, 0.4484229745550457, 1.5617653557073885
LBFGS Accumulation History: 3 points
Removed measurement 5244d055 to history. Total: 4
Removed measurement 76d4cf50 to history. Total: 3
Adding measurement 601852b2 to history. Total: 3
th(0)=2.407412430484045E-36;dx=-4.8148248609680906E-37
Adding measurement 728b2faa to history. Total: 4
New Minimum: 2.407412430484045E-36 > 0.0
END: th(11.60397208403195)=0.0; dx=0.0 evalInputDelta=2.407412430484045E-36
Fitness changed from 2.407412430484045E-36 to 0.0
Iteration 5 complete. Error: 0.0 Total: 0.1445; Orientation: 0.1141; Line Search: 0.0231
Non-optimal measurement 0.0 < 0.0. Total: 5
Rejected: LBFGS Orientation magnitude: 0.000e+00, gradient 0.000e+00, dot NaN; [91e52dfe-b115-46a3-840b-e78c50fe6eb9 = 0.000e+00, 341d5d46-166d-4cc4-94dc-63aae33d20dc = 0.000e+00, a9e136a3-84cc-45d3-a62c-e4acfba95887 = 0.000e+00, 9a4dfdf6-a559-4379-b48b-7b51ba767122 = 0.000e+00, 0761f47f-1e21-4529-80d3-6430b53b4c34 = 0.000e+00]
Orientation rejected. Popping history element from 0.0, 2.407412430484045E-36, 0.4484229745550457, 1.5617653557073885, 2.5372736
Rejected: LBFGS Orientation magnitude: 0.000e+00, gradient 0.000e+00, dot NaN; [a9e136a3-84cc-45d3-a62c-e4acfba95887 = 0.000e+00, 91e52dfe-b115-46a3-840b-e78c50fe6eb9 = 0.000e+00, 0761f47f-1e21-4529-80d3-6430b53b4c34 = 0.000e+00, 341d5d46-166d-4cc4-94dc-63aae33d20dc = 0.000e+00, 9a4dfdf6-a559-4379-b48b-7b51ba767122 = 0.000e+00]
Orientation rejected. Popping history element from 0.0, 2.407412430484045E-36, 0.4484229745550457, 1.5617653557073885
LBFGS Accumulation History: 3 points
Removed measurement 728b2faa to history. Total: 4
Removed measurement 601852b2 to history. Total: 3
Adding measurement 1b69baef to history. Total: 3
th(0)=0.0;dx=0.0 (ERROR: Starting derivative negative)
Non-optimal measurement 0.0 < 0.0. Total: 4
Fitness changed from 0.0 to 0.0
Static Iteration Total: 0.1196; Orientation: 0.0960; Line Search: 0.0159
Iteration 6 failed. Error: 0.0
Previous Error: 0.0 -> 0.0
Optimization terminated 6
Final threshold in iteration 6: 0.0 (> 0.0) after 0.496s (< 30.000s)

Returns

    0.0

Training Converged

TrainingTester.java:432 executed in 0.15 seconds (0.000 gc):

    return TestUtil.compare(title + " vs Iteration", runs);
Logging
Plotting range=[1.0, -35.61844950135784], [5.0, 0.19361578474208638]; valueStats=DoubleSummaryStatistics{count=9, sum=4.020377, min=0.000000, average=0.446709, max=1.561765}
Plotting 5 points for GD
Plotting 2 points for CjGD
Plotting 5 points for LBFGS

Returns

Result

TrainingTester.java:435 executed in 0.02 seconds (0.000 gc):

    return TestUtil.compareTime(title + " vs Time", runs);
Logging
Plotting range=[0.0, -35.61844950135784], [0.322, 0.19361578474208638]; valueStats=DoubleSummaryStatistics{count=9, sum=4.020377, min=0.000000, average=0.446709, max=1.561765}
Plotting 5 points for GD
Plotting 2 points for CjGD
Plotting 5 points for LBFGS

Returns

Result

Results

TrainingTester.java:255 executed in 0.00 seconds (0.000 gc):

    return grid(inputLearning, modelLearning, completeLearning);

Returns

Result

TrainingTester.java:258 executed in 0.00 seconds (0.000 gc):

    return new ComponentResult(null == inputLearning ? null : inputLearning.value,
        null == modelLearning ? null : modelLearning.value, null == completeLearning ? null : completeLearning.value);

Returns

    {"input":{ "LBFGS": { "type": "Converged", "value": 0.0 }, "CjGD": { "type": "Converged", "value": 0.0 }, "GD": { "type": "Converged", "value": 0.0 } }, "model":null, "complete":null}

LayerTests.java:425 executed in 0.00 seconds (0.000 gc):

    throwException(exceptions.addRef());

Results

details: {"input":{ "LBFGS": { "type": "Converged", "value": 0.0 }, "CjGD": { "type": "Converged", "value": 0.0 }, "GD": { "type": "Converged", "value": 0.0 } }, "model":null, "complete":null}
result: OK
  {
    "result": "OK",
    "performance": {
      "execution_time": "2.408",
      "gc_time": "0.284"
    },
    "created_on": 1586736912461,
    "file_name": "trainingTest",
    "report": {
      "simpleName": "Bottom",
      "canonicalName": "com.simiacryptus.mindseye.layers.cudnn.ImgCropLayerTest.Bottom",
      "link": "https://github.com/SimiaCryptus/mindseye-cudnn/tree/59d5b3318556370acb2d83ee6ec123ce0fc6974f/src/test/java/com/simiacryptus/mindseye/layers/cudnn/ImgCropLayerTest.java",
      "javaDoc": ""
    },
    "training_analysis": {
      "input": {
        "LBFGS": {
          "type": "Converged",
          "value": 0.0
        },
        "CjGD": {
          "type": "Converged",
          "value": 0.0
        },
        "GD": {
          "type": "Converged",
          "value": 0.0
        }
      }
    },
    "archive": "s3://code.simiacrypt.us/tests/com/simiacryptus/mindseye/layers/cudnn/ImgCropLayer/Bottom/trainingTest/202004131512",
    "id": "369e162e-b265-4a38-83cd-ac8ba9d07298",
    "report_type": "Components",
    "display_name": "Comparative Training",
    "target": {
      "simpleName": "ImgCropLayer",
      "canonicalName": "com.simiacryptus.mindseye.layers.cudnn.ImgCropLayer",
      "link": "https://github.com/SimiaCryptus/mindseye-cudnn/tree/59d5b3318556370acb2d83ee6ec123ce0fc6974f/src/main/java/com/simiacryptus/mindseye/layers/cudnn/ImgCropLayer.java",
      "javaDoc": ""
    }
  }