Why are all the predicted probabilities 0? #61
Comments
Same problem here:
===============Eval a batch=======================

Retraining with the approach from #17 fixes this.

Same issue for me:
===============Eval a batch=======================
the step 6801.0 test accuracy: 0.0
===============Eval a batch=======================
the step 6802.0 takes 5.253191947937012 loss 0.7621740102767944
the step 6803.0 takes 5.179174900054932 loss 0.836201548576355
the step 6804.0 takes 5.21918511390686 loss 0.7611073851585388
the step 6805.0 takes 5.158170461654663 loss 0.7541408538818359
the step 6806.0 takes 5.20117974281311 loss 0.7468845844268799
the step 6807.0 takes 5.162172317504883 loss 0.827800989151001
the step 6808.0 takes 5.18017578125 loss 0.7844552397727966
the step 6809.0 takes 5.195178985595703 loss 0.7899702191352844
the step 6810.0 takes 5.188177824020386 loss 0.8137649297714233
the step 6811.0 takes 5.229186773300171 loss 0.7540019750595093
the step 6812.0 takes 5.173173904418945 loss 0.7108616828918457
the step 6813.0 takes 5.21818470954895 loss 0.785955548286438
the step 6814.0 takes 5.249190807342529 loss 0.7812217473983765
the step 6815.0 takes 5.238188028335571 loss 0.8018498420715332
the step 6816.0 takes 5.2111828327178955 loss 0.8020637631416321
the step 6817.0 takes 5.189177513122559 loss 0.8187956809997559
the step 6818.0 takes 5.226185083389282 loss 0.6946130990982056
the step 6819.0 takes 5.19417929649353 loss 0.8060418367385864
the step 6820.0 takes 5.136166334152222 loss 0.7476954460144043
the step 6821.0 takes 5.186176776885986 loss 0.7232500910758972
the step 6822.0 takes 5.16417121887207 loss 0.7656339406967163
the step 6823.0 takes 5.187177419662476 loss 0.7978050112724304
the step 6824.0 takes 5.1751744747161865 loss 0.7152007818222046
the step 6825.0 takes 5.178175210952759 loss 0.822441041469574
the step 6826.0 takes 5.1611714363098145 loss 0.851509690284729
the step 6827.0 takes 5.2411887645721436 loss 0.7786080241203308
the step 6828.0 takes 5.161171913146973 loss 0.8658627271652222
the step 6829.0 takes 5.20918345451355 loss 0.8724192380905151
the step 6830.0 takes 5.199178695678711 loss 0.7804743051528931
the step 6831.0 takes 5.165170669555664 loss 0.7231662273406982
the step 6832.0 takes 5.1601715087890625 loss 0.7696665525436401
the step 6833.0 takes 5.186176776885986 loss 0.7821593284606934
the step 6834.0 takes 5.163172721862793 loss 0.8162339329719543
the step 6835.0 takes 5.160170316696167 loss 0.8535160422325134
the step 6836.0 takes 5.14216685295105 loss 0.8528579473495483
the step 6837.0 takes 5.1971800327301025 loss 0.717275857925415
the step 6838.0 takes 5.241189479827881 loss 0.6899185180664062
the step 6839.0 takes 5.163172245025635 loss 0.7475823163986206
the step 6840.0 takes 5.212183237075806 loss 0.7100281715393066
the step 6841.0 takes 5.185176372528076 loss 0.7437551021575928
the step 6842.0 takes 5.214183568954468 loss 0.8610426187515259
the step 6843.0 takes 5.196179628372192 loss 0.8723220825195312
the step 6844.0 takes 5.184176206588745 loss 0.7162365913391113
the step 6845.0 takes 5.203181028366089 loss 0.7437829971313477
the step 6846.0 takes 5.197179079055786 loss 0.7429450154304504
the step 6847.0 takes 5.218184232711792 loss 0.7336486577987671
the step 6848.0 takes 5.2151830196380615 loss 0.7843906283378601
the step 6849.0 takes 5.240188837051392 loss 0.7857959270477295
the step 6850.0 takes 5.143167972564697 loss 0.7797924280166626
the step 6851.0 takes 5.222185134887695 loss 0.7834088802337646
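In case it helps others debugging this: below is a minimal, repo-agnostic sanity check (a sketch, not this project's code; `probs` is a hypothetical NumPy array holding one eval batch of predicted probabilities) to confirm whether the outputs are genuinely all zero rather than just rounded away in the print, before retraining.

```python
import numpy as np

def check_probabilities(probs, eps=1e-6):
    """Basic sanity checks on a (batch, num_classes) array of predicted probabilities."""
    probs = np.asarray(probs)
    row_sums = probs.sum(axis=1)
    zero_rows = np.sum(np.all(probs < eps, axis=1))
    print("min / max probability:", probs.min(), probs.max())
    print("rows summing to ~1:", np.sum(np.abs(row_sums - 1.0) < 1e-3), "/", len(probs))
    print("rows that are entirely ~0:", zero_rows, "/", len(probs))
    # A softmax output should sum to ~1 per row. Rows that are entirely ~0
    # usually point to the checkpoint not being restored for evaluation, or
    # the wrong tensor being fetched, rather than a training problem.

# hypothetical usage with one eval batch of model outputs:
# check_probabilities(model_predictions)
```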