
徐国庆 -- VGG -- train and evaluate -- results

 木俊 2018-09-27
/usr/bin/python3.5 "/home/mj/04 VGG Tensorflow/training_and_val.py"
2018-09-26 20:14:58.687916: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
Step: 0, loss: 2.2878, accuracy: 18.7500%
**  Step 0, val loss = 2.27, val accuracy = 9.38%  **
Step: 50, loss: 2.0949, accuracy: 28.1250%
Step: 100, loss: 1.9311, accuracy: 37.5000%
Step: 150, loss: 2.1226, accuracy: 31.2500%
Step: 200, loss: 1.7765, accuracy: 37.5000%
**  Step 200, val loss = 2.07, val accuracy = 31.25%  **
Step: 250, loss: 1.8251, accuracy: 43.7500%
Step: 300, loss: 1.5262, accuracy: 46.8750%
Step: 350, loss: 1.4536, accuracy: 56.2500%
Step: 400, loss: 1.9600, accuracy: 31.2500%
**  Step 400, val loss = 2.16, val accuracy = 15.62%  **
Step: 450, loss: 1.3603, accuracy: 53.1250%
Step: 500, loss: 1.2914, accuracy: 59.3750%
Step: 550, loss: 1.1540, accuracy: 59.3750%
Step: 600, loss: 1.2201, accuracy: 62.5000%
**  Step 600, val loss = 1.36, val accuracy = 53.12%  **
Step: 650, loss: 1.1941, accuracy: 53.1250%
Step: 700, loss: 1.5037, accuracy: 50.0000%
Step: 750, loss: 1.1634, accuracy: 62.5000%
Step: 800, loss: 1.3095, accuracy: 53.1250%
**  Step 800, val loss = 1.43, val accuracy = 50.00%  **
Step: 850, loss: 1.1210, accuracy: 68.7500%
Step: 900, loss: 1.2529, accuracy: 59.3750%
Step: 950, loss: 1.0447, accuracy: 71.8750%
Step: 1000, loss: 0.7034, accuracy: 75.0000%
**  Step 1000, val loss = 0.97, val accuracy = 68.75%  **
Step: 1050, loss: 1.4092, accuracy: 56.2500%
Step: 1100, loss: 1.3699, accuracy: 46.8750%
Step: 1150, loss: 0.8433, accuracy: 78.1250%
Step: 1200, loss: 1.0779, accuracy: 65.6250%
**  Step 1200, val loss = 1.69, val accuracy = 53.12%  **
Step: 1250, loss: 1.0841, accuracy: 71.8750%
Step: 1300, loss: 1.0031, accuracy: 65.6250%
Step: 1350, loss: 1.0336, accuracy: 62.5000%
Step: 1400, loss: 0.8833, accuracy: 68.7500%
**  Step 1400, val loss = 1.04, val accuracy = 68.75%  **
Step: 1450, loss: 0.9168, accuracy: 71.8750%
Step: 1500, loss: 0.6315, accuracy: 78.1250%
Step: 1550, loss: 0.9870, accuracy: 75.0000%
Step: 1600, loss: 1.1243, accuracy: 53.1250%
**  Step 1600, val loss = 0.84, val accuracy = 75.00%  **
Step: 1650, loss: 0.8808, accuracy: 71.8750%
Step: 1700, loss: 0.7827, accuracy: 68.7500%
Step: 1750, loss: 0.9195, accuracy: 68.7500%
Step: 1800, loss: 0.6955, accuracy: 75.0000%
**  Step 1800, val loss = 0.84, val accuracy = 71.88%  **
Step: 1850, loss: 1.1907, accuracy: 50.0000%
Step: 1900, loss: 1.1771, accuracy: 56.2500%
Step: 1950, loss: 0.8817, accuracy: 68.7500%
Step: 2000, loss: 1.1340, accuracy: 62.5000%
**  Step 2000, val loss = 0.99, val accuracy = 59.38%  **
Step: 2050, loss: 0.5817, accuracy: 78.1250%
Step: 2100, loss: 1.0990, accuracy: 53.1250%
Step: 2150, loss: 0.6999, accuracy: 78.1250%
Step: 2200, loss: 0.8845, accuracy: 71.8750%
**  Step 2200, val loss = 1.46, val accuracy = 50.00%  **
Step: 2250, loss: 1.1791, accuracy: 56.2500%
Step: 2300, loss: 0.8422, accuracy: 65.6250%
Step: 2350, loss: 1.1622, accuracy: 68.7500%
Step: 2400, loss: 0.7933, accuracy: 68.7500%
**  Step 2400, val loss = 0.74, val accuracy = 71.88%  **
Step: 2450, loss: 0.6325, accuracy: 75.0000%
Step: 2500, loss: 0.8189, accuracy: 75.0000%
Step: 2550, loss: 0.8709, accuracy: 71.8750%
Step: 2600, loss: 0.8839, accuracy: 71.8750%
**  Step 2600, val loss = 0.77, val accuracy = 71.88%  **
Step: 2650, loss: 0.8068, accuracy: 75.0000%
Step: 2700, loss: 1.0660, accuracy: 59.3750%
Step: 2750, loss: 0.8019, accuracy: 65.6250%
Step: 2800, loss: 0.6053, accuracy: 78.1250%
**  Step 2800, val loss = 0.89, val accuracy = 75.00%  **
Step: 2850, loss: 0.9991, accuracy: 65.6250%
Step: 2900, loss: 0.8593, accuracy: 71.8750%
Step: 2950, loss: 1.0939, accuracy: 53.1250%
Step: 3000, loss: 0.6514, accuracy: 78.1250%
**  Step 3000, val loss = 1.13, val accuracy = 59.38%  **
Step: 3050, loss: 0.7261, accuracy: 78.1250%
Step: 3100, loss: 1.0023, accuracy: 62.5000%
Step: 3150, loss: 0.6202, accuracy: 81.2500%
Step: 3200, loss: 0.8993, accuracy: 71.8750%
**  Step 3200, val loss = 0.41, val accuracy = 87.50%  **
Step: 3250, loss: 1.0066, accuracy: 68.7500%
Step: 3300, loss: 0.9523, accuracy: 59.3750%
Step: 3350, loss: 0.6585, accuracy: 71.8750%
Step: 3400, loss: 0.7210, accuracy: 71.8750%
**  Step 3400, val loss = 0.82, val accuracy = 71.88%  **
Step: 3450, loss: 0.7825, accuracy: 71.8750%
Step: 3500, loss: 0.6582, accuracy: 81.2500%
Step: 3550, loss: 0.5372, accuracy: 78.1250%
Step: 3600, loss: 0.7473, accuracy: 71.8750%
**  Step 3600, val loss = 0.52, val accuracy = 81.25%  **
Step: 3650, loss: 1.0434, accuracy: 65.6250%
Step: 3700, loss: 0.7560, accuracy: 75.0000%
Step: 3750, loss: 0.6632, accuracy: 75.0000%
Step: 3800, loss: 0.7088, accuracy: 71.8750%
**  Step 3800, val loss = 0.85, val accuracy = 71.88%  **
Step: 3850, loss: 0.5815, accuracy: 78.1250%
Step: 3900, loss: 0.4702, accuracy: 84.3750%
Step: 3950, loss: 0.8895, accuracy: 68.7500%
Step: 4000, loss: 0.6602, accuracy: 78.1250%
**  Step 4000, val loss = 0.67, val accuracy = 81.25%  **
Step: 4050, loss: 0.6596, accuracy: 81.2500%
Step: 4100, loss: 0.9502, accuracy: 62.5000%
Step: 4150, loss: 1.0146, accuracy: 59.3750%
Step: 4200, loss: 1.2704, accuracy: 59.3750%
**  Step 4200, val loss = 0.98, val accuracy = 62.50%  **
Step: 4250, loss: 0.8964, accuracy: 62.5000%
Step: 4300, loss: 0.9040, accuracy: 68.7500%
Step: 4350, loss: 0.7033, accuracy: 75.0000%
Step: 4400, loss: 0.7327, accuracy: 75.0000%
**  Step 4400, val loss = 1.24, val accuracy = 53.12%  **
Step: 4450, loss: 0.7180, accuracy: 71.8750%
Step: 4500, loss: 0.5717, accuracy: 87.5000%
Step: 4550, loss: 0.9558, accuracy: 62.5000%
Step: 4600, loss: 0.5935, accuracy: 75.0000%
**  Step 4600, val loss = 0.81, val accuracy = 68.75%  **
Step: 4650, loss: 0.3839, accuracy: 84.3750%
Step: 4700, loss: 1.1722, accuracy: 56.2500%
Step: 4750, loss: 0.5053, accuracy: 84.3750%
Step: 4800, loss: 0.7333, accuracy: 71.8750%
**  Step 4800, val loss = 1.07, val accuracy = 65.62%  **
Step: 4850, loss: 0.9031, accuracy: 65.6250%
Step: 4900, loss: 0.8386, accuracy: 68.7500%
Step: 4950, loss: 0.6524, accuracy: 71.8750%
Step: 5000, loss: 0.5686, accuracy: 78.1250%
**  Step 5000, val loss = 1.13, val accuracy = 62.50%  **
Step: 5050, loss: 0.7360, accuracy: 68.7500%
Step: 5100, loss: 1.0724, accuracy: 56.2500%
Step: 5150, loss: 0.5258, accuracy: 78.1250%
Step: 5200, loss: 0.7757, accuracy: 65.6250%
**  Step 5200, val loss = 1.01, val accuracy = 65.62%  **
Step: 5250, loss: 0.4880, accuracy: 84.3750%
Step: 5300, loss: 0.5747, accuracy: 78.1250%
Step: 5350, loss: 0.7243, accuracy: 68.7500%
Step: 5400, loss: 0.5165, accuracy: 78.1250%
**  Step 5400, val loss = 0.76, val accuracy = 71.88%  **
Step: 5450, loss: 1.1489, accuracy: 59.3750%
Step: 5500, loss: 0.8461, accuracy: 71.8750%
Step: 5550, loss: 0.8353, accuracy: 68.7500%
Step: 5600, loss: 0.5715, accuracy: 84.3750%
**  Step 5600, val loss = 0.87, val accuracy = 71.88%  **
Step: 5650, loss: 0.4826, accuracy: 81.2500%
Step: 5700, loss: 0.3974, accuracy: 84.3750%
Step: 5750, loss: 0.6625, accuracy: 75.0000%
Step: 5800, loss: 0.6289, accuracy: 75.0000%
**  Step 5800, val loss = 0.93, val accuracy = 68.75%  **
Step: 5850, loss: 0.6702, accuracy: 71.8750%
Step: 5900, loss: 0.5701, accuracy: 81.2500%
Step: 5950, loss: 0.4144, accuracy: 87.5000%
Step: 6000, loss: 0.5227, accuracy: 87.5000%
**  Step 6000, val loss = 1.08, val accuracy = 68.75%  **
Step: 6050, loss: 0.7968, accuracy: 68.7500%
Step: 6100, loss: 0.6206, accuracy: 78.1250%
Step: 6150, loss: 0.6986, accuracy: 68.7500%
Step: 6200, loss: 0.7680, accuracy: 71.8750%
**  Step 6200, val loss = 0.49, val accuracy = 81.25%  **
Step: 6250, loss: 0.8155, accuracy: 68.7500%
Step: 6300, loss: 0.6705, accuracy: 71.8750%
Step: 6350, loss: 0.4824, accuracy: 81.2500%
Step: 6400, loss: 0.7511, accuracy: 68.7500%
**  Step 6400, val loss = 1.26, val accuracy = 56.25%  **
Step: 6450, loss: 0.6250, accuracy: 75.0000%
Step: 6500, loss: 0.9068, accuracy: 59.3750%
Step: 6550, loss: 0.7184, accuracy: 78.1250%
Step: 6600, loss: 0.6448, accuracy: 75.0000%
**  Step 6600, val loss = 1.08, val accuracy = 65.62%  **
Step: 6650, loss: 0.7620, accuracy: 65.6250%
Step: 6700, loss: 0.7411, accuracy: 78.1250%
Step: 6750, loss: 0.5249, accuracy: 78.1250%
Step: 6800, loss: 0.5505, accuracy: 81.2500%
**  Step 6800, val loss = 0.90, val accuracy = 62.50%  **
Step: 6850, loss: 0.6473, accuracy: 78.1250%
Step: 6900, loss: 1.0988, accuracy: 56.2500%
Step: 6950, loss: 0.3969, accuracy: 84.3750%
Step: 7000, loss: 0.7694, accuracy: 75.0000%
**  Step 7000, val loss = 1.10, val accuracy = 65.62%  **
Step: 7050, loss: 0.2977, accuracy: 87.5000%
Step: 7100, loss: 0.7764, accuracy: 78.1250%
Step: 7150, loss: 0.3636, accuracy: 87.5000%
Step: 7200, loss: 0.8170, accuracy: 65.6250%
**  Step 7200, val loss = 0.93, val accuracy = 65.62%  **
Step: 7250, loss: 1.0851, accuracy: 56.2500%
Step: 7300, loss: 0.8692, accuracy: 65.6250%
Step: 7350, loss: 0.9418, accuracy: 65.6250%
Step: 7400, loss: 0.5270, accuracy: 78.1250%
**  Step 7400, val loss = 0.80, val accuracy = 65.62%  **
Step: 7450, loss: 0.8623, accuracy: 65.6250%
Step: 7500, loss: 0.9371, accuracy: 62.5000%
Step: 7550, loss: 0.5857, accuracy: 78.1250%
Step: 7600, loss: 0.6169, accuracy: 75.0000%
**  Step 7600, val loss = 0.81, val accuracy = 68.75%  **
Step: 7650, loss: 0.5693, accuracy: 78.1250%
Step: 7700, loss: 0.5812, accuracy: 78.1250%
Step: 7750, loss: 0.5141, accuracy: 81.2500%
Step: 7800, loss: 0.5303, accuracy: 84.3750%
**  Step 7800, val loss = 1.15, val accuracy = 59.38%  **
Step: 7850, loss: 0.7054, accuracy: 71.8750%
Step: 7900, loss: 0.5825, accuracy: 71.8750%
Step: 7950, loss: 0.4360, accuracy: 81.2500%
Step: 8000, loss: 0.8744, accuracy: 65.6250%
**  Step 8000, val loss = 0.70, val accuracy = 78.12%  **
Step: 8050, loss: 0.4669, accuracy: 84.3750%
Step: 8100, loss: 0.4940, accuracy: 78.1250%
Step: 8150, loss: 0.5472, accuracy: 75.0000%
Step: 8200, loss: 0.6027, accuracy: 71.8750%
**  Step 8200, val loss = 0.69, val accuracy = 75.00%  **
Step: 8250, loss: 0.5125, accuracy: 78.1250%
Step: 8300, loss: 0.8783, accuracy: 65.6250%
Step: 8350, loss: 0.7112, accuracy: 71.8750%
Step: 8400, loss: 0.6570, accuracy: 75.0000%
**  Step 8400, val loss = 0.82, val accuracy = 68.75%  **
Step: 8450, loss: 1.0580, accuracy: 56.2500%
Step: 8500, loss: 0.6043, accuracy: 75.0000%
Step: 8550, loss: 0.9825, accuracy: 59.3750%
Step: 8600, loss: 0.8117, accuracy: 68.7500%
**  Step 8600, val loss = 0.72, val accuracy = 78.12%  **
Step: 8650, loss: 0.8820, accuracy: 65.6250%
Step: 8700, loss: 0.4686, accuracy: 81.2500%
Step: 8750, loss: 0.7427, accuracy: 68.7500%
Step: 8800, loss: 0.6334, accuracy: 71.8750%
**  Step 8800, val loss = 0.69, val accuracy = 75.00%  **
Step: 8850, loss: 0.6973, accuracy: 71.8750%
Step: 8900, loss: 0.7907, accuracy: 68.7500%
Step: 8950, loss: 0.5579, accuracy: 75.0000%
Step: 9000, loss: 0.6692, accuracy: 71.8750%
**  Step 9000, val loss = 1.08, val accuracy = 71.88%  **
Step: 9050, loss: 0.7515, accuracy: 68.7500%
Step: 9100, loss: 0.7300, accuracy: 65.6250%
Step: 9150, loss: 0.7632, accuracy: 65.6250%
Step: 9200, loss: 0.5393, accuracy: 75.0000%
**  Step 9200, val loss = 1.59, val accuracy = 53.12%  **
Step: 9250, loss: 0.4207, accuracy: 84.3750%
Step: 9300, loss: 0.1736, accuracy: 93.7500%
Step: 9350, loss: 0.5300, accuracy: 81.2500%
Step: 9400, loss: 0.6697, accuracy: 68.7500%
**  Step 9400, val loss = 0.83, val accuracy = 68.75%  **
Step: 9450, loss: 0.5604, accuracy: 78.1250%
Step: 9500, loss: 0.7283, accuracy: 75.0000%
Step: 9550, loss: 0.6662, accuracy: 75.0000%
Step: 9600, loss: 0.4128, accuracy: 81.2500%
**  Step 9600, val loss = 0.91, val accuracy = 68.75%  **
Step: 9650, loss: 0.7216, accuracy: 71.8750%
Step: 9700, loss: 0.4810, accuracy: 81.2500%
Step: 9750, loss: 0.7910, accuracy: 71.8750%
Step: 9800, loss: 0.4659, accuracy: 81.2500%
**  Step 9800, val loss = 0.82, val accuracy = 75.00%  **
Step: 9850, loss: 0.6369, accuracy: 71.8750%
Step: 9900, loss: 0.9221, accuracy: 65.6250%
Step: 9950, loss: 0.7184, accuracy: 71.8750%
Step: 10000, loss: 0.3525, accuracy: 84.3750%
**  Step 10000, val loss = 0.38, val accuracy = 81.25%  **
Step: 10050, loss: 0.8405, accuracy: 65.6250%
Step: 10100, loss: 0.5309, accuracy: 78.1250%
Step: 10150, loss: 0.7731, accuracy: 65.6250%
Step: 10200, loss: 0.9846, accuracy: 59.3750%
**  Step 10200, val loss = 0.45, val accuracy = 81.25%  **
Step: 10250, loss: 0.5364, accuracy: 78.1250%
Step: 10300, loss: 1.1560, accuracy: 68.7500%
Step: 10350, loss: 0.5301, accuracy: 78.1250%
Step: 10400, loss: 0.9455, accuracy: 62.5000%
**  Step 10400, val loss = 0.97, val accuracy = 62.50%  **
Step: 10450, loss: 0.7467, accuracy: 68.7500%
Step: 10500, loss: 0.3621, accuracy: 87.5000%
Step: 10550, loss: 0.6153, accuracy: 75.0000%
Step: 10600, loss: 0.4037, accuracy: 84.3750%
**  Step 10600, val loss = 0.59, val accuracy = 81.25%  **
Step: 10650, loss: 0.6775, accuracy: 75.0000%
Step: 10700, loss: 0.5818, accuracy: 78.1250%
Step: 10750, loss: 0.6682, accuracy: 71.8750%
Step: 10800, loss: 0.5280, accuracy: 78.1250%
**  Step 10800, val loss = 1.60, val accuracy = 53.12%  **
Step: 10850, loss: 0.6873, accuracy: 71.8750%
Step: 10900, loss: 0.2613, accuracy: 87.5000%
Step: 10950, loss: 0.1860, accuracy: 93.7500%
Step: 11000, loss: 0.4113, accuracy: 84.3750%
**  Step 11000, val loss = 0.55, val accuracy = 81.25%  **
Step: 11050, loss: 0.6611, accuracy: 75.0000%
Step: 11100, loss: 0.6092, accuracy: 75.0000%
Step: 11150, loss: 0.7894, accuracy: 65.6250%
Step: 11200, loss: 0.7373, accuracy: 68.7500%
**  Step 11200, val loss = 1.39, val accuracy = 59.38%  **
Step: 11250, loss: 0.4162, accuracy: 87.5000%
Step: 11300, loss: 0.6013, accuracy: 75.0000%
Step: 11350, loss: 0.2277, accuracy: 90.6250%
Step: 11400, loss: 0.4418, accuracy: 81.2500%
**  Step 11400, val loss = 0.94, val accuracy = 65.62%  **
Step: 11450, loss: 0.5806, accuracy: 78.1250%
Step: 11500, loss: 0.7453, accuracy: 68.7500%
Step: 11550, loss: 0.8404, accuracy: 65.6250%
Step: 11600, loss: 0.6232, accuracy: 75.0000%
**  Step 11600, val loss = 1.10, val accuracy = 65.62%  **
Step: 11650, loss: 0.6895, accuracy: 71.8750%
Step: 11700, loss: 0.4599, accuracy: 81.2500%
Step: 11750, loss: 0.5318, accuracy: 78.1250%
Step: 11800, loss: 0.7051, accuracy: 68.7500%
**  Step 11800, val loss = 1.13, val accuracy = 62.50%  **
Step: 11850, loss: 0.6968, accuracy: 71.8750%
Step: 11900, loss: 0.6814, accuracy: 71.8750%
Step: 11950, loss: 0.5256, accuracy: 78.1250%
Step: 12000, loss: 0.6113, accuracy: 84.3750%
**  Step 12000, val loss = 0.91, val accuracy = 68.75%  **
Step: 12050, loss: 0.5788, accuracy: 75.0000%
Step: 12100, loss: 0.5805, accuracy: 75.0000%
Step: 12150, loss: 0.8733, accuracy: 62.5000%
Step: 12200, loss: 0.5943, accuracy: 75.0000%
**  Step 12200, val loss = 1.21, val accuracy = 65.62%  **
Step: 12250, loss: 0.9078, accuracy: 71.8750%
Step: 12300, loss: 0.6909, accuracy: 71.8750%
Step: 12350, loss: 0.7239, accuracy: 68.7500%
Step: 12400, loss: 0.5929, accuracy: 75.0000%
**  Step 12400, val loss = 0.47, val accuracy = 78.12%  **
Step: 12450, loss: 0.5465, accuracy: 75.0000%
Step: 12500, loss: 0.3165, accuracy: 87.5000%
Step: 12550, loss: 0.5684, accuracy: 78.1250%
Step: 12600, loss: 0.5479, accuracy: 78.1250%
**  Step 12600, val loss = 1.96, val accuracy = 53.12%  **
Step: 12650, loss: 0.5613, accuracy: 75.0000%
Step: 12700, loss: 0.4612, accuracy: 78.1250%
Step: 12750, loss: 0.6830, accuracy: 71.8750%
Step: 12800, loss: 0.4370, accuracy: 81.2500%
**  Step 12800, val loss = 1.42, val accuracy = 65.62%  **
Step: 12850, loss: 0.5652, accuracy: 78.1250%
Step: 12900, loss: 0.8104, accuracy: 62.5000%
Step: 12950, loss: 0.2526, accuracy: 90.6250%
Step: 13000, loss: 0.9200, accuracy: 62.5000%
**  Step 13000, val loss = 0.77, val accuracy = 71.88%  **
Step: 13050, loss: 0.6037, accuracy: 75.0000%
Step: 13100, loss: 0.4571, accuracy: 81.2500%
Step: 13150, loss: 0.5222, accuracy: 78.1250%
Step: 13200, loss: 0.5065, accuracy: 78.1250%
**  Step 13200, val loss = 1.53, val accuracy = 65.62%  **
Step: 13250, loss: 0.4879, accuracy: 84.3750%
Step: 13300, loss: 0.3689, accuracy: 84.3750%
Step: 13350, loss: 0.4545, accuracy: 81.2500%
Step: 13400, loss: 0.3731, accuracy: 84.3750%
**  Step 13400, val loss = 1.20, val accuracy = 59.38%  **
Step: 13450, loss: 0.6981, accuracy: 71.8750%
Step: 13500, loss: 0.3649, accuracy: 84.3750%
Step: 13550, loss: 0.5855, accuracy: 75.0000%
Step: 13600, loss: 0.6844, accuracy: 71.8750%
**  Step 13600, val loss = 1.61, val accuracy = 56.25%  **
Step: 13650, loss: 0.4586, accuracy: 84.3750%
Step: 13700, loss: 0.4526, accuracy: 81.2500%
Step: 13750, loss: 0.7377, accuracy: 68.7500%
Step: 13800, loss: 0.2531, accuracy: 90.6250%
**  Step 13800, val loss = 1.62, val accuracy = 62.50%  **
Step: 13850, loss: 0.4134, accuracy: 84.3750%
Step: 13900, loss: 0.6587, accuracy: 71.8750%
Step: 13950, loss: 0.2366, accuracy: 93.7500%
Step: 14000, loss: 0.3619, accuracy: 84.3750%
**  Step 14000, val loss = 1.06, val accuracy = 71.88%  **
Step: 14050, loss: 0.6089, accuracy: 75.0000%
Step: 14100, loss: 0.6247, accuracy: 71.8750%
Step: 14150, loss: 0.4694, accuracy: 78.1250%
Step: 14200, loss: 0.3693, accuracy: 84.3750%
**  Step 14200, val loss = 0.92, val accuracy = 71.88%  **
Step: 14250, loss: 0.5170, accuracy: 78.1250%
Step: 14300, loss: 0.5106, accuracy: 78.1250%
Step: 14350, loss: 0.8761, accuracy: 62.5000%
Step: 14400, loss: 0.2418, accuracy: 90.6250%
**  Step 14400, val loss = 0.96, val accuracy = 65.62%  **
Step: 14450, loss: 0.5069, accuracy: 78.1250%
Step: 14500, loss: 0.4083, accuracy: 81.2500%
Step: 14550, loss: 0.7300, accuracy: 68.7500%
Step: 14600, loss: 0.0377, accuracy: 100.0000%
**  Step 14600, val loss = 0.80, val accuracy = 68.75%  **
Step: 14650, loss: 0.7653, accuracy: 75.0000%
Step: 14700, loss: 0.6505, accuracy: 71.8750%
Step: 14750, loss: 0.6095, accuracy: 75.0000%
Step: 14800, loss: 0.5087, accuracy: 78.1250%
**  Step 14800, val loss = 0.72, val accuracy = 75.00%  **
Step: 14850, loss: 0.4392, accuracy: 81.2500%
Step: 14900, loss: 0.6882, accuracy: 71.8750%
Step: 14950, loss: 0.3780, accuracy: 84.3750%
Step: 14999, loss: 0.5979, accuracy: 75.0000%
**  Step 14999, val loss = 1.69, val accuracy = 59.38%  **

Process finished with exit code 0
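A note on reading the log above: batch metrics are printed every 50 steps and validation metrics every 200 steps, and every batch accuracy is a multiple of 3.1250% (1/32), which suggests a batch size of 32. The metric and formatting logic can be sketched as follows (hypothetical helper names, not the contents of the author's actual training_and_val.py):

```python
import numpy as np

BATCH_SIZE = 32  # inferred: every logged batch accuracy is a multiple of 1/32

def batch_accuracy(logits, labels):
    """Fraction of samples whose arg-max prediction matches the true label."""
    preds = np.argmax(logits, axis=1)
    return float(np.mean(preds == labels))

def format_train_line(step, loss, acc):
    """Reproduce the per-batch line, e.g. 'Step: 0, loss: 2.2878, accuracy: 18.7500%'."""
    return "Step: %d, loss: %.4f, accuracy: %.4f%%" % (step, loss, acc * 100)

def format_val_line(step, loss, acc):
    """Reproduce the validation line, e.g. '**  Step 0, val loss = 2.27, val accuracy = 9.38%  **'."""
    return "**  Step %d, val loss = %.2f, val accuracy = %.2f%%  **" % (step, loss, acc * 100)
```

For example, 6 correct predictions out of 32 gives exactly the 18.7500% seen at step 0, and 10 out of 32 gives the 31.25% validation accuracy seen at step 200.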
////////////////////////////////////////////////
/usr/bin/python3.5 "/home/mj/04 VGG Tensorflow/training_and_val.py"
WARNING:tensorflow:From /home/mj/04 VGG Tensorflow/tools.py:131: arg_max (from tensorflow.python.ops.gen_math_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use `argmax` instead
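The warning above says tools.py line 131 calls tf.arg_max, which TensorFlow has renamed to tf.argmax; the replacement takes the same tensor and axis arguments, so it is a drop-in rename. A small illustration of the operation involved (the tf.equal/tf.arg_max lines below are a hypothetical reconstruction of what an accuracy op at that location typically looks like, not the file's confirmed contents):

```python
import numpy as np

# Hypothetical shape of the deprecated call in tools.py:
#   correct = tf.equal(tf.arg_max(logits, 1), tf.arg_max(labels, 1))
# Drop-in fix with identical semantics:
#   correct = tf.equal(tf.argmax(logits, 1), tf.argmax(labels, 1))
# Either way, the core operation is a per-row arg-max:
logits = np.array([[0.1, 0.7, 0.2],
                   [0.6, 0.3, 0.1]])
predicted_classes = np.argmax(logits, axis=1)  # index of the largest logit per row
print(predicted_classes)  # -> [1 0]
```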
Reading checkpoints...
2018-09-27 15:42:38.141566: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA

Evaluating......
Total testing samples: 9984
Total correct predictions: 6903
Average accuracy: 69.14%
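The reported average is simply correct predictions over total samples; note also that 9984 = 312 × 32, i.e. the test set was evaluated in complete batches of 32, consistent with the batch size inferred from the training log. A quick check of the arithmetic using the two values printed above:

```python
total_samples = 9984  # "Total testing samples" from the log
correct = 6903        # "Total correct predictions" from the log
avg_accuracy = 100.0 * correct / total_samples
print("Average accuracy: %.2f%%" % avg_accuracy)  # -> Average accuracy: 69.14%
```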

Process finished with exit code 0
