1. Test Modules
  2. Training Characteristics
    1. Input Learning
      1. Gradient Descent
      2. Conjugate Gradient Descent
      3. Limited-Memory BFGS
    2. Results
  3. Results

Subreport: Logs for com.simiacryptus.ref.lang.ReferenceCountingBase

Test Modules

Using Seed 2278304637576738816

Training Characteristics

Input Learning

In this test, we use a network to learn this target input, given its pre-evaluated output (a minimal sketch of the input-learning idea appears after the target values below):

TrainingTester.java:332 executed in 0.01 seconds (0.000 gc):

    return RefArrays.stream(RefUtil.addRef(input_target)).flatMap(RefArrays::stream).map(x -> {
      try {
        return x.prettyPrint();
      } finally {
        x.freeRef();
      }
    }).reduce((a, b) -> a + "\n" + b).orElse("");

Returns

    [
    	[ [ 0.7 ], [ -1.72 ], [ -0.804 ], [ -1.028 ] ],
    	[ [ 0.048 ], [ 0.496 ], [ 1.912 ], [ -0.852 ] ],
    	[ [ 1.524 ], [ -0.384 ], [ -1.688 ], [ 0.08 ] ],
    	[ [ 1.764 ], [ -0.608 ], [ 1.208 ], [ -0.128 ] ]
    ]
    [
    	[ [ -1.028 ], [ 1.912 ], [ 1.524 ], [ -0.384 ] ],
    	[ [ -0.608 ], [ -1.72 ], [ 0.7 ], [ -0.804 ] ],
    	[ [ 0.08 ], [ -0.128 ], [ 1.764 ], [ -0.852 ] ],
    	[ [ 1.208 ], [ 0.048 ], [ -1.688 ], [ 0.496 ] ]
    ]
    [
    	[ [ 1.764 ], [ -0.384 ], [ -0.608 ], [ 1.524 ] ],
    	[ [ -1.688 ], [ 0.496 ], [ -0.852 ], [ 1.208 ] ],
    	[ [ 0.08 ], [ 0.048 ], [ 0.7 ], [ -1.72 ] ],
    	[ [ 1.912 ], [ -1.028 ], [ -0.128 ], [ -0.804 ] ]
    ]
    [
    	[ [ -1.72 ], [ -0.608 ], [ -0.852 ], [ 1.208 ] ],
    	[ [ -0.128 ], [ -1.028 ], [ -0.384 ], [ 0.048 ] ],
    	[ [ 1.912 ], [ 0.08 ], [ -0.804 ], [ 0.496 ] ],
    	[ [ 0.7 ], [ -1.688 ], [ 1.524 ], [ 1.764 ] ]
    ]
    [
    	[ [ 0.048 ], [ 1.524 ], [ -0.128 ], [ -1.688 ] ],
    	[ [ 1.208 ], [ -1.028 ], [ 0.7 ], [ -1.72 ] ],
    	[ [ 0.08 ], [ -0.608 ], [ -0.384 ], [ 1.912 ] ],
    	[ [ 1.764 ], [ -0.852 ], [ 0.496 ], [ -0.804 ] ]
    ]
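
The training subject here is the input itself: the network's weights stay fixed while the input tensor is adjusted until the network reproduces the pre-evaluated output. The following is a minimal, self-contained sketch of that idea, using a frozen scalar map in place of the real network; all names are illustrative and none of this is MindsEye API.

    // Input learning in miniature: the "network" is a frozen scalar map f(x) = w * x,
    // and gradient descent adjusts the input x (never w) until f(x) matches the
    // pre-evaluated target output.
    public class InputLearningSketch {
      public static void main(String[] args) {
        double w = 1.7;                         // frozen model parameter
        double targetInput = -0.804;            // input value we hope to recover
        double targetOutput = w * targetInput;  // pre-evaluated output of the frozen model

        double x = 0.0;                         // trainable input, arbitrary start
        double lr = 0.1;                        // fixed learning rate
        for (int i = 0; i < 200; i++) {
          double residual = w * x - targetOutput;  // f(x) - y
          double loss = residual * residual;       // squared error
          double grad = 2 * residual * w;          // d(loss)/dx
          x -= lr * grad;                          // gradient step on the input only
          if (loss < 1e-30) break;
        }
        System.out.printf("recovered input = %.6f (target %.6f)%n", x, targetInput);
      }
    }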

Gradient Descent

First, we train using the basic gradient descent method with weak (Armijo-Wolfe) line search conditions.
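
The ArmijoWolfeSearch configured below nominally enforces a sufficient-decrease (Armijo) test and a weak Wolfe curvature test before accepting a step. The sketch below shows those two acceptance predicates; the tolerances C1 and C2 are conventional textbook defaults and may differ from the library's actual settings.

    // Sketch of the (weak) Armijo-Wolfe acceptance test behind the line search.
    // f0, g0: loss and directional derivative at step size 0; fT, gT: same at trial step t.
    final class WolfeConditions {
      static final double C1 = 1e-4;  // sufficient-decrease tolerance (assumed value)
      static final double C2 = 0.9;   // curvature tolerance (assumed value)

      // Armijo condition: the step reduced the loss at least proportionally to the slope.
      static boolean armijo(double f0, double g0, double t, double fT) {
        return fT <= f0 + C1 * t * g0;
      }

      // Weak Wolfe condition: the slope along the search direction has flattened enough.
      static boolean weakWolfe(double g0, double gT) {
        return gT >= C2 * g0;
      }

      public static void main(String[] args) {
        // Values taken from the first gradient-descent iteration in the log below.
        double f0 = 2.3768856, g0 = -0.47537712;
        double t = 2.154434690031884, fT = 1.4630418984219005, gT = -0.37296022418245506;
        System.out.println("armijo=" + armijo(f0, g0, t, fT) + " weakWolfe=" + weakWolfe(g0, gT));
      }
    }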

TrainingTester.java:480 executed in 0.55 seconds (0.000 gc):

    IterativeTrainer iterativeTrainer = new IterativeTrainer(trainable.addRef());
    try {
      iterativeTrainer.setLineSearchFactory(label -> new ArmijoWolfeSearch());
      iterativeTrainer.setOrientation(new GradientDescent());
      iterativeTrainer.setMonitor(TrainingTester.getMonitor(history));
      iterativeTrainer.setTimeout(30, TimeUnit.SECONDS);
      iterativeTrainer.setMaxIterations(250);
      iterativeTrainer.setTerminateThreshold(0);
      return iterativeTrainer.run();
    } finally {
      iterativeTrainer.freeRef();
    }
Logging
Reset training subject: 2351888579479
Reset training subject: 2351940123745
Constructing line search parameters: GD
th(0)=2.3768856;dx=-0.47537712
New Minimum: 2.3768856 > 1.4630418984219005
END: th(2.154434690031884)=1.4630418984219005; dx=-0.37296022418245506 evalInputDelta=0.9138437015780996
Fitness changed from 2.3768856 to 1.4630418984219005
Iteration 1 complete. Error: 1.4630418984219005 Total: 0.1521; Orientation: 0.0055; Line Search: 0.0628
th(0)=1.4630418984219005;dx=-0.29260837968438014
New Minimum: 1.4630418984219005 > 0.420076932550378
END: th(4.641588833612779)=0.420076932550378; dx=-0.15679160090792538 evalInputDelta=1.0429649658715225
Fitness changed from 1.4630418984219005 to 0.420076932550378
Iteration 2 complete. Error: 0.420076932550378 Total: 0.0594; Orientation: 0.0017; Line Search: 0.0385
th(0)=0.420076932550378;dx=-0.08401538651007559
New Minimum: 0.420076932550378 > 2.2803010541544874E-32
WOLF (strong): th(10.000000000000002)=2.2803010541544874E-32; dx=1.796556931309086E-17 evalInputDelta=0.420076932550378
END: th(5.000000000000001)=0.10501923313759445; dx=-0.042007693255037795 evalInputDelta=0.3150576994127835
Fitness changed from 0.420076932550378 to 2.2803010541544874E-32
Iteration 3 complete. Error: 2.2803010541544874E-32 Total: 0.1310; Orientation: 0.0017; Line Search: 0.1135
Zero gradient: 6.75322301446426E-17
th(0)=2.2803010541544874E-32;dx=-4.560602108308976E-33
New Minimum: 2.2803010541544874E-32 > 3.851859888774472E-35
WOLF (strong): th(10.772173450159421)=3.851859888774472E-35; dx=6.162975822039156E-35 evalInputDelta=2.2764491942657128E-32
END: th(5.386086725079711)=4.005934284325451E-33; dx=-1.6023737137301806E-33 evalInputDelta=1.8797076257219424E-32
Fitness changed from 2.2803010541544874E-32 to 3.851859888774472E-35
Iteration 4 complete. Error: 3.851859888774472E-35 Total: 0.0981; Orientation: 0.0113; Line Search: 0.0677
Zero gradient: 2.7755575615628915E-18
th(0)=3.851859888774472E-35;dx=-7.703719777548945E-36
New Minimum: 3.851859888774472E-35 > 0.0
END: th(11.60397208403195)=0.0; dx=0.0 evalInputDelta=3.851859888774472E-35
Fitness changed from 3.851859888774472E-35 to 0.0
Iteration 5 complete. Error: 0.0 Total: 0.0567; Orientation: 0.0013; Line Search: 0.0346
Zero gradient: 0.0
th(0)=0.0;dx=0.0 (ERROR: Starting derivative negative)
Fitness changed from 0.0 to 0.0
Static Iteration Total: 0.0427; Orientation: 0.0014; Line Search: 0.0269
Iteration 6 failed. Error: 0.0
Previous Error: 0.0 -> 0.0
Optimization terminated 6
Final threshold in iteration 6: 0.0 (> 0.0) after 0.541s (< 30.000s)

Returns

    0.0

Training Converged

Conjugate Gradient Descent

Next, we use a conjugate gradient descent method, which converges the fastest for purely quadratic functions.
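
The convergence claim comes from the classic linear conjugate gradient method, which minimizes a strongly convex quadratic in at most n steps. The following small, self-contained Java sketch is unrelated to the TrainingTester code and is shown only to illustrate that behavior on a 2-D quadratic.

    // Linear conjugate gradient on 0.5*x'Ax - b'x with a symmetric positive definite A;
    // for an n-dimensional quadratic it terminates in at most n iterations.
    public class ConjugateGradientSketch {
      public static void main(String[] args) {
        double[][] A = {{4, 1}, {1, 3}};   // symmetric positive definite
        double[] b = {1, 2};
        double[] x = {0, 0};
        double[] r = b.clone();            // residual r = b - A x (x starts at 0)
        double[] p = r.clone();            // first search direction
        double rsOld = dot(r, r);
        for (int i = 0; i < b.length; i++) {
          double[] Ap = mul(A, p);
          double alpha = rsOld / dot(p, Ap);
          for (int j = 0; j < x.length; j++) { x[j] += alpha * p[j]; r[j] -= alpha * Ap[j]; }
          double rsNew = dot(r, r);
          if (Math.sqrt(rsNew) < 1e-12) break;
          double beta = rsNew / rsOld;
          for (int j = 0; j < p.length; j++) p[j] = r[j] + beta * p[j];
          rsOld = rsNew;
        }
        System.out.printf("x = [%.6f, %.6f]%n", x[0], x[1]); // expected ~[0.090909, 0.636364]
      }
      static double dot(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += a[i] * b[i];
        return s;
      }
      static double[] mul(double[][] A, double[] v) {
        double[] out = new double[A.length];
        for (int i = 0; i < A.length; i++) out[i] = dot(A[i], v);
        return out;
      }
    }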

TrainingTester.java:452 executed in 0.40 seconds (0.000 gc):

    IterativeTrainer iterativeTrainer = new IterativeTrainer(trainable.addRef());
    try {
      iterativeTrainer.setLineSearchFactory(label -> new QuadraticSearch());
      iterativeTrainer.setOrientation(new GradientDescent());
      iterativeTrainer.setMonitor(TrainingTester.getMonitor(history));
      iterativeTrainer.setTimeout(30, TimeUnit.SECONDS);
      iterativeTrainer.setMaxIterations(250);
      iterativeTrainer.setTerminateThreshold(0);
      return iterativeTrainer.run();
    } finally {
      iterativeTrainer.freeRef();
    }
Logging
Reset training subject: 2352434625490
Reset training subject: 2352445913072
Constructing line search parameters: GD
F(0.0) = LineSearchPoint{point=PointSample{avg=2.3768856}, derivative=-0.47537712}
New Minimum: 2.3768856 > 2.376885599952462
F(1.0E-10) = LineSearchPoint{point=PointSample{avg=2.376885599952462}, derivative=-0.47537711999524623}, evalInputDelta = -4.75379735576098E-11
New Minimum: 2.376885599952462 > 2.376885599667236
F(7.000000000000001E-10) = LineSearchPoint{point=PointSample{avg=2.376885599667236}, derivative=-0.47537711996672366}, evalInputDelta = -3.327640385464292E-10
New Minimum: 2.376885599667236 > 2.376885597670652
F(4.900000000000001E-9) = LineSearchPoint{point=PointSample{avg=2.376885597670652}, derivative=-0.47537711976706526}, evalInputDelta = -2.3293478257357947E-9
New Minimum: 2.376885597670652 > 2.3768855836945644
F(3.430000000000001E-8) = LineSearchPoint{point=PointSample{avg=2.3768855836945644}, derivative=-0.4753771183694565}, evalInputDelta = -1.6305435668328983E-8
New Minimum: 2.3768855836945644 > 2.376885485861955
F(2.4010000000000004E-7) = LineSearchPoint{point=PointSample{avg=2.376885485861955}, derivative=-0.4753771085861953}, evalInputDelta = -1.1413804523741078E-7
New Minimum: 2.376885485861955 > 2.3768848010337416
F(1.6807000000000003E-6) = LineSearchPoint{point=PointSample{avg=2.3768848010337416}, derivative=-0.47537704010336745}, evalInputDelta = -7.98966258486189E-7
New Minimum: 2.3768848010337416 > 2.3768800072390106
F(1.1764900000000001E-5) = LineSearchPoint{point=PointSample{avg=2.3768800072390106}, derivative=-0.47537656072357215}, evalInputDelta = -5.59276098943684E-6
New Minimum: 2.3768800072390106 > 2.376846450811252
F(8.235430000000001E-5) = LineSearchPoint{point=PointSample{avg=2.376846450811252}, derivative=-0.47537320506500463}, evalInputDelta = -3.9149188748144326E-5
New Minimum: 2.376846450811252 > 2.376611562449412
F(5.764801000000001E-4) = LineSearchPoint{point=PointSample{avg=2.376611562449412}, derivative=-0.4753497154550325}, evalInputDelta = -2.740375505880799E-4
New Minimum: 2.376611562449412 > 2.374967668907556
F(0.004035360700000001) = LineSearchPoint{point=PointSample{avg=2.374967668907556}, derivative=-0.4751852881852273}, evalInputDelta = -0.0019179310924442028
New Minimum: 2.374967668907556 > 2.363476338674785
F(0.028247524900000005) = LineSearchPoint{point=PointSample{avg=2.363476338674785}, derivative=-0.47403429729659097}, evalInputDelta = -0.013409261325215027
New Minimum: 2.363476338674785 > 2.2838173304962632
F(0.19773267430000002) = LineSearchPoint{point=PointSample{avg=2.2838173304962632}, derivative=-0.46597736107613685}, evalInputDelta = -0.09306826950373681
New Minimum: 2.2838173304962632 > 1.7644391423394492
F(1.3841287201) = LineSearchPoint{point=PointSample{avg=1.7644391423394492}, derivative=-0.40957880753295756}, evalInputDelta = -0.6124464576605508
New Minimum: 1.7644391423394492 > 0.002300410790839708
F(9.688901040700001) = LineSearchPoint{point=PointSample{avg=0.002300410790839708}, derivative=-0.01478893273070307}, evalInputDelta = -2.3745851892091605
F(67.8223072849) = LineSearchPoint{point=PointSample{avg=79.46924998185587}, derivative=2.748740190885079}, evalInputDelta = 77.09236438185587
F(5.217100560376924) = LineSearchPoint{point=PointSample{avg=0.5437393696783812}, derivative=-0.2273680960857632}, evalInputDelta = -1.8331462303216188
F(36.51970392263847) = LineSearchPoint{point=PointSample{avg=16.716510358220148}, derivative=1.2606860473996582}, evalInputDelta = 14.339624758220149
F(2.809207994049113) = LineSearchPoint{point=PointSample{avg=1.2290278761553917}, derivative=-0.34183379943079556}, evalInputDelta = -1.1478577238446084
F(19.66445595834379) = LineSearchPoint{point=PointSample{avg=2.2200517706800644}, derivative=0.45942612398443117}, evalInputDelta = -0.1568338293199356
2.2200517706800644 <= 2.3768856
New Minimum: 0.002300410790839708 > 6.779273404243071E-33
F(10.0) = LineSearchPoint{point=PointSample{avg=6.779273404243071E-33}, derivative=3.814726312612038E-18}, evalInputDelta = -2.3768856
Right bracket at 10.0
Converged to right
Fitness changed from 2.3768856 to 6.779273404243071E-33
Iteration 1 complete. Error: 6.779273404243071E-33 Total: 0.3509; Orientation: 0.0015; Line Search: 0.3149
Zero gradient: 3.682193206295148E-17
F(0.0) = LineSearchPoint{point=PointSample{avg=6.779273404243071E-33}, derivative=-1.3558546808486142E-33}
New Minimum: 6.779273404243071E-33 > 0.0
F(10.0) = LineSearchPoint{point=PointSample{avg=0.0}, derivative=0.0}, evalInputDelta = -6.779273404243071E-33
0.0 <= 6.779273404243071E-33
Converged to right
Fitness changed from 6.779273404243071E-33 to 0.0
Iteration 2 complete. Error: 0.0 Total: 0.0319; Orientation: 0.0009; Line Search: 0.0217
Zero gradient: 0.0
F(0.0) = LineSearchPoint{point=PointSample{avg=0.0}, derivative=0.0}
Fitness changed from 0.0 to 0.0
Static Iteration Total: 0.0179; Orientation: 0.0007; Line Search: 0.0092
Iteration 3 failed. Error: 0.0
Previous Error: 0.0 -> 0.0
Optimization terminated 3
Final threshold in iteration 3: 0.0 (> 0.0) after 0.401s (< 30.000s)

Returns

    0.0

Training Converged

Limited-Memory BFGS

Next, we apply the same optimization using L-BFGS, which is nearly ideal for purely quadratic (second-order) functions.
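
L-BFGS builds its search direction from a short history of parameter differences s = x_{k+1} - x_k and gradient differences y = g_{k+1} - g_k via the two-loop recursion. The sketch below shows the textbook recursion under standard assumptions; it is not the exact bookkeeping of the LBFGS orientation class used in the run below.

    import java.util.List;

    // Textbook L-BFGS two-loop recursion: maps the current gradient g to an
    // approximate Newton direction -H~ * g using the stored (s, y) history.
    final class LbfgsTwoLoop {
      /** history lists: oldest pair first, newest last; returns the descent direction. */
      static double[] direction(double[] g, List<double[]> sHist, List<double[]> yHist) {
        int m = sHist.size();
        double[] q = g.clone();
        double[] alpha = new double[m];
        // First loop: newest to oldest.
        for (int i = m - 1; i >= 0; i--) {
          double rho = 1.0 / dot(yHist.get(i), sHist.get(i));
          alpha[i] = rho * dot(sHist.get(i), q);
          axpy(-alpha[i], yHist.get(i), q);
        }
        // Initial Hessian scaling from the most recent pair (identity if no history).
        double gamma = m > 0
            ? dot(sHist.get(m - 1), yHist.get(m - 1)) / dot(yHist.get(m - 1), yHist.get(m - 1))
            : 1.0;
        double[] r = q;
        for (int j = 0; j < r.length; j++) r[j] *= gamma;
        // Second loop: oldest to newest.
        for (int i = 0; i < m; i++) {
          double rho = 1.0 / dot(yHist.get(i), sHist.get(i));
          double beta = rho * dot(yHist.get(i), r);
          axpy(alpha[i] - beta, sHist.get(i), r);
        }
        for (int j = 0; j < r.length; j++) r[j] = -r[j]; // descend, not ascend
        return r;
      }
      static double dot(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += a[i] * b[i];
        return s;
      }
      static void axpy(double a, double[] x, double[] y) {
        for (int i = 0; i < y.length; i++) y[i] += a * x[i];
      }
      public static void main(String[] args) {
        // With an empty history the recursion degenerates to steepest descent: d = -g.
        double[] d = direction(new double[]{0.5, -0.25}, List.of(), List.of());
        System.out.println(java.util.Arrays.toString(d)); // [-0.5, 0.25]
      }
    }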

TrainingTester.java:509 executed in 0.53 seconds (0.000 gc):

    IterativeTrainer iterativeTrainer = new IterativeTrainer(trainable.addRef());
    try {
      iterativeTrainer.setLineSearchFactory(label -> new ArmijoWolfeSearch());
      iterativeTrainer.setOrientation(new LBFGS());
      iterativeTrainer.setMonitor(TrainingTester.getMonitor(history));
      iterativeTrainer.setTimeout(30, TimeUnit.SECONDS);
      iterativeTrainer.setIterationsPerSample(100);
      iterativeTrainer.setMaxIterations(250);
      iterativeTrainer.setTerminateThreshold(0);
      return iterativeTrainer.run();
    } finally {
      iterativeTrainer.freeRef();
    }
Logging
Reset training subject: 2352840539161
Reset training subject: 2352849030805
Adding measurement 3b01ab0f to history. Total: 0
LBFGS Accumulation History: 1 points
Constructing line search parameters: GD
Non-optimal measurement 2.3768856 < 2.3768856. Total: 1
th(0)=2.3768856;dx=-0.47537712
Adding measurement 53360749 to history. Total: 1
New Minimum: 2.3768856 > 1.4630418984219005
END: th(2.154434690031884)=1.4630418984219005; dx=-0.37296022418245506 evalInputDelta=0.9138437015780996
Fitness changed from 2.3768856 to 1.4630418984219005
Iteration 1 complete. Error: 1.4630418984219005 Total: 0.0640; Orientation: 0.0056; Line Search: 0.0283
Non-optimal measurement 1.4630418984219005 < 1.4630418984219005. Total: 2
LBFGS Accumulation History: 2 points
Non-optimal measurement 1.4630418984219005 < 1.4630418984219005. Total: 2
th(0)=1.4630418984219005;dx=-0.29260837968438014
Adding measurement 4d303931 to history. Total: 2
New Minimum: 1.4630418984219005 > 0.420076932550378
END: th(4.641588833612779)=0.420076932550378; dx=-0.15679160090792538 evalInputDelta=1.0429649658715225
Fitness changed from 1.4630418984219005 to 0.420076932550378
Iteration 2 complete. Error: 0.420076932550378 Total: 0.0369; Orientation: 0.0024; Line Search: 0.0262
Non-optimal measurement 0.420076932550378 < 0.420076932550378. Total: 3
LBFGS Accumulation History: 3 points
Non-optimal measurement 0.420076932550378 < 0.420076932550378. Total: 3
th(0)=0.420076932550378;dx=-0.0840153865100756
Adding measurement 13672bf6 to history. Total: 3
New Minimum: 0.420076932550378 > 2.2803010541544874E-32
WOLF (strong): th(10.000000000000002)=2.2803010541544874E-32; dx=1.796556931309086E-17 evalInputDelta=0.420076932550378
Non-optimal measurement 0.10501923313759445 < 2.2803010541544874E-32. Total: 4
END: th(5.000000000000001)=0.10501923313759445; dx=-0.04200769325503779 evalInputDelta=0.3150576994127835
Fitness changed from 0.420076932550378 to 2.2803010541544874E-32
Iteration 3 complete. Error: 2.2803010541544874E-32 Total: 0.0450; Orientation: 0.0020; Line Search: 0.0356
Non-optimal measurement 2.2803010541544874E-32 < 2.2803010541544874E-32. Total: 4
Rejected: LBFGS Orientation magnitude: 6.753e-16, gradient 6.753e-17, dot -1.000; [1754890b-87b4-449f-8120-447e65d5e377 = 1.000/1.000e+00, 63c81fed-2149-45de-82c9-ddaed59e7449 = 1.000/1.000e+00, 9012789a-8b52-482f-bc35-510f43e56246 = 1.000/1.000e+00, 61e39831-36a5-4b22-afd3-49f179d6bedd = 1.000/1.000e+00, ef0ffb0b-e79b-46af-9cf9-019d1610d565 = 1.000/1.000e+00]
Orientation rejected. Popping history element from 2.2803010541544874E-32, 0.420076932550378, 1.4630418984219005, 2.3768856
LBFGS Accumulation History: 3 points
Removed measurement 13672bf6 to history. Total: 3
Adding measurement e93f5d3 to history. Total: 3
th(0)=2.2803010541544874E-32;dx=-4.560602108308976E-33
Adding measurement 6fb8e4c7 to history. Total: 4
New Minimum: 2.2803010541544874E-32 > 3.851859888774472E-35
WOLF (strong): th(10.772173450159421)=3.851859888774472E-35; dx=6.162975822039156E-35 evalInputDelta=2.2764491942657128E-32
Non-optimal measurement 4.005934284325451E-33 < 3.851859888774472E-35. Total: 5
END: th(5.386086725079711)=4.005934284325451E-33; dx=-1.6023737137301806E-33 evalInputDelta=1.8797076257219424E-32
Fitness changed from 2.2803010541544874E-32 to 3.851859888774472E-35
Iteration 4 complete. Error: 3.851859888774472E-35 Total: 0.1035; Orientation: 0.0625; Line Search: 0.0328
Non-optimal measurement 3.851859888774472E-35 < 3.851859888774472E-35. Total: 5
Rejected: LBFGS Orientation magnitude: 2.776e-17, gradient 2.776e-18, dot -1.000; [9012789a-8b52-482f-bc35-510f43e56246 = 0.000e+00, 61e39831-36a5-4b22-afd3-49f179d6bedd = 0.000e+00, 63c81fed-2149-45de-82c9-ddaed59e7449 = 0.000e+00, 1754890b-87b4-449f-8120-447e65d5e377 = 1.000/1.000e+00, ef0ffb0b-e79b-46af-9cf9-019d1610d565 = 0.000e+00]
Orientation rejected. Popping history element from 3.851859888774472E-35, 2.2803010541544874E-32, 0.420076932550378, 1.4630418984219005, 2.3768856
Rejected: LBFGS Orientation magnitude: 2.776e-17, gradient 2.776e-18, dot -1.000; [9012789a-8b52-482f-bc35-510f43e56246 = 0.000e+00, 63c81fed-2149-45de-82c9-ddaed59e7449 = 0.000e+00, ef0ffb0b-e79b-46af-9cf9-019d1610d565 = 0.000e+00, 1754890b-87b4-449f-8120-447e65d5e377 = 1.000/1.000e+00, 61e39831-36a5-4b22-afd3-49f179d6bedd = 0.000e+00]
Orientation rejected. Popping history element from 3.851859888774472E-35, 2.2803010541544874E-32, 0.420076932550378, 1.4630418984219005
LBFGS Accumulation History: 3 points
Removed measurement 6fb8e4c7 to history. Total: 4
Removed measurement e93f5d3 to history. Total: 3
Adding measurement 1e9b9fd4 to history. Total: 3
th(0)=3.851859888774472E-35;dx=-7.703719777548945E-36
Adding measurement 2a0511fe to history. Total: 4
New Minimum: 3.851859888774472E-35 > 0.0
END: th(11.60397208403195)=0.0; dx=0.0 evalInputDelta=3.851859888774472E-35
Fitness changed from 3.851859888774472E-35 to 0.0
Iteration 5 complete. Error: 0.0 Total: 0.1588; Orientation: 0.1274; Line Search: 0.0240
Non-optimal measurement 0.0 < 0.0. Total: 5
Rejected: LBFGS Orientation magnitude: 0.000e+00, gradient 0.000e+00, dot NaN; [63c81fed-2149-45de-82c9-ddaed59e7449 = 0.000e+00, ef0ffb0b-e79b-46af-9cf9-019d1610d565 = 0.000e+00, 1754890b-87b4-449f-8120-447e65d5e377 = 0.000e+00, 9012789a-8b52-482f-bc35-510f43e56246 = 0.000e+00, 61e39831-36a5-4b22-afd3-49f179d6bedd = 0.000e+00]
Orientation rejected. Popping history element from 0.0, 3.851859888774472E-35, 0.420076932550378, 1.4630418984219005, 2.3768856
Rejected: LBFGS Orientation magnitude: 0.000e+00, gradient 0.000e+00, dot NaN; [9012789a-8b52-482f-bc35-510f43e56246 = 0.000e+00, ef0ffb0b-e79b-46af-9cf9-019d1610d565 = 0.000e+00, 61e39831-36a5-4b22-afd3-49f179d6bedd = 0.000e+00, 63c81fed-2149-45de-82c9-ddaed59e7449 = 0.000e+00, 1754890b-87b4-449f-8120-447e65d5e377 = 0.000e+00]
Orientation rejected. Popping history element from 0.0, 3.851859888774472E-35, 0.420076932550378, 1.4630418984219005
LBFGS Accumulation History: 3 points
Removed measurement 2a0511fe to history. Total: 4
Removed measurement 1e9b9fd4 to history. Total: 3
Adding measurement 458ea4cc to history. Total: 3
th(0)=0.0;dx=0.0 (ERROR: Starting derivative negative)
Non-optimal measurement 0.0 < 0.0. Total: 4
Fitness changed from 0.0 to 0.0
Static Iteration Total: 0.1214; Orientation: 0.0979; Line Search: 0.0156
Iteration 6 failed. Error: 0.0
Previous Error: 0.0 -> 0.0
Optimization terminated 6
Final threshold in iteration 6: 0.0 (> 0.0) after 0.530s (< 30.000s)

Returns

    0.0

Training Converged

TrainingTester.java:432 executed in 0.15 seconds (0.000 gc):

    return TestUtil.compare(title + " vs Iteration", runs);
Logging
Plotting range=[1.0, -34.41432951870191], [5.0, 0.16525676357772248]; valueStats=DoubleSummaryStatistics{count=9, sum=3.766238, min=0.000000, average=0.418471, max=1.463042}
Plotting 5 points for GD
Plotting 2 points for CjGD
Plotting 5 points for LBFGS

Returns

Result

TrainingTester.java:435 executed in 0.02 seconds (0.000 gc):

    return TestUtil.compareTime(title + " vs Time", runs);
Logging
Plotting range=[0.0, -34.41432951870191], [0.344, 0.16525676357772248]; valueStats=DoubleSummaryStatistics{count=9, sum=3.766238, min=0.000000, average=0.418471, max=1.463042}
Plotting 5 points for GD
Plotting 2 points for CjGD
Plotting 5 points for LBFGS

Returns

Result

Results

TrainingTester.java:255 executed in 0.00 seconds (0.000 gc):

    return grid(inputLearning, modelLearning, completeLearning);

Returns

Result

TrainingTester.java:258 executed in 0.00 seconds (0.000 gc):

    return new ComponentResult(null == inputLearning ? null : inputLearning.value,
        null == modelLearning ? null : modelLearning.value, null == completeLearning ? null : completeLearning.value);

Returns

    {"input":{ "LBFGS": { "type": "Converged", "value": 0.0 }, "CjGD": { "type": "Converged", "value": 0.0 }, "GD": { "type": "Converged", "value": 0.0 } }, "model":null, "complete":null}

LayerTests.java:425 executed in 0.00 seconds (0.000 gc):

    throwException(exceptions.addRef());

Results

details: {"input":{ "LBFGS": { "type": "Converged", "value": 0.0 }, "CjGD": { "type": "Converged", "value": 0.0 }, "GD": { "type": "Converged", "value": 0.0 } }, "model":null, "complete":null}
result: OK
  {
    "result": "OK",
    "performance": {
      "execution_time": "2.460",
      "gc_time": "0.270"
    },
    "created_on": 1586736983504,
    "file_name": "trainingTest",
    "report": {
      "simpleName": "Left",
      "canonicalName": "com.simiacryptus.mindseye.layers.cudnn.ImgCropLayerTest.Left",
      "link": "https://github.com/SimiaCryptus/mindseye-cudnn/tree/59d5b3318556370acb2d83ee6ec123ce0fc6974f/src/test/java/com/simiacryptus/mindseye/layers/cudnn/ImgCropLayerTest.java",
      "javaDoc": ""
    },
    "training_analysis": {
      "input": {
        "LBFGS": {
          "type": "Converged",
          "value": 0.0
        },
        "CjGD": {
          "type": "Converged",
          "value": 0.0
        },
        "GD": {
          "type": "Converged",
          "value": 0.0
        }
      }
    },
    "archive": "s3://code.simiacrypt.us/tests/com/simiacryptus/mindseye/layers/cudnn/ImgCropLayer/Left/trainingTest/202004131623",
    "id": "c18029cd-3b3d-46e5-8c0e-178e4d708ae8",
    "report_type": "Components",
    "display_name": "Comparative Training",
    "target": {
      "simpleName": "ImgCropLayer",
      "canonicalName": "com.simiacryptus.mindseye.layers.cudnn.ImgCropLayer",
      "link": "https://github.com/SimiaCryptus/mindseye-cudnn/tree/59d5b3318556370acb2d83ee6ec123ce0fc6974f/src/main/java/com/simiacryptus/mindseye/layers/cudnn/ImgCropLayer.java",
      "javaDoc": ""
    }
  }