- Refactored the code structure; cleaned up import order and formatting
- Improved the model training loop; added early stopping and a learning-rate scheduler
- Added code for model testing and visualization
- Optimized the implementation of the quantum convolution layer and the model
- Tuned the training hyperparameters and the data preprocessing
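The early-stopping mechanism mentioned above can be sketched as follows. This is a minimal illustration under assumed names (`EarlyStopping`, `patience`, `min_delta` are not taken from the repository): training stops once the validation loss has failed to improve for `patience` consecutive epochs.

```python
class EarlyStopping:
    """Stop training when validation loss stops improving (illustrative sketch)."""

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience      # epochs to wait without improvement
        self.min_delta = min_delta    # minimum change that counts as improvement
        self.best = float("inf")      # best validation loss seen so far
        self.bad_epochs = 0           # consecutive epochs without improvement

    def step(self, valid_loss):
        """Record one epoch's validation loss; return True when training should stop."""
        if valid_loss < self.best - self.min_delta:
            self.best = valid_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
        return self.bad_epochs >= self.patience


# Toy usage: the loss improves for three epochs, then plateaus.
stopper = EarlyStopping(patience=3)
losses = [0.99, 0.89, 0.85, 0.86, 0.87, 0.88]
stopped_at = None
for epoch, loss in enumerate(losses, start=1):
    if stopper.step(loss):
        stopped_at = epoch
        break
print(stopped_at, stopper.best)  # → 6 0.85
```

In a real loop this check would run once per epoch after validation, alongside the learning-rate scheduler.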
| epoch | train_acc | valid_acc | train_loss | valid_loss |
|---|---|---|---|---|
| 1 | 0.561125 | 0.6476814516129032 | 1.271985122203827 | 0.9911825291572078 |
| 2 | 0.676375 | 0.6668346774193549 | 0.9134431419372558 | 0.8937955517922679 |
| 3 | 0.699125 | 0.6824596774193549 | 0.8243759956359863 | 0.8520721235582905 |
| 4 | 0.716625 | 0.6995967741935484 | 0.7818560593128204 | 0.8005917687569896 |
| 5 | 0.725125 | 0.7101814516129032 | 0.745261854171753 | 0.8037946531849522 |
| 6 | 0.733375 | 0.7061491935483871 | 0.7344184167385102 | 0.7732808570707997 |
| 7 | 0.737875 | 0.7253024193548387 | 0.7150477197170257 | 0.74149090051651 |
| 8 | 0.74325 | 0.7253024193548387 | 0.7022454526424408 | 0.7516527022084882 |
| 9 | 0.742625 | 0.7298387096774194 | 0.7041031284332275 | 0.7247120468847214 |
| 10 | 0.747875 | 0.7116935483870968 | 0.6772661633491516 | 0.7338270487323884 |
| 11 | 0.757875 | 0.7212701612903226 | 0.6562292041778565 | 0.7387931039256435 |
| 12 | 0.761625 | 0.7328629032258065 | 0.6542983632087708 | 0.725050816612859 |
| 13 | 0.75925 | 0.7318548387096774 | 0.6493379819393158 | 0.7001086867624714 |
| 14 | 0.767375 | 0.7379032258064516 | 0.6460576868057251 | 0.6988139988914612 |
| 15 | 0.765875 | 0.7288306451612904 | 0.6339491362571716 | 0.7128441218406923 |
| 16 | 0.763125 | 0.7368951612903226 | 0.6262373595237732 | 0.714297366719092 |
| 17 | 0.76925 | 0.7227822580645161 | 0.6279029569625855 | 0.7181522269402781 |
| 18 | 0.77025 | 0.7449596774193549 | 0.6159816448688507 | 0.6757595616002237 |
| 19 | 0.7745 | 0.7283266129032258 | 0.6136245548725128 | 0.6998546681096477 |
| 20 | 0.775375 | 0.7469758064516129 | 0.6000997524261474 | 0.6749713065162781 |
| 21 | 0.7805 | 0.748991935483871 | 0.5928600332736969 | 0.6656959902855658 |
| 22 | 0.77525 | 0.7444556451612904 | 0.599046837568283 | 0.6857193170055267 |
| 23 | 0.785 | 0.7384072580645161 | 0.5875316572189331 | 0.6785462142959717 |
| 24 | 0.778375 | 0.765625 | 0.588378502368927 | 0.627939272311426 |
| 25 | 0.78575 | 0.7459677419354839 | 0.5700427904129028 | 0.6502643679418871 |
| 26 | 0.7805 | 0.75 | 0.5785817315578461 | 0.6712307555060233 |
| 27 | 0.78675 | 0.7474798387096774 | 0.5676946561336518 | 0.6555941143343526 |
| 28 | 0.787875 | 0.7505040322580645 | 0.575938116312027 | 0.6507430134281036 |
| 29 | 0.789 | 0.7605846774193549 | 0.5655779435634612 | 0.6520830373610219 |
| 30 | 0.790625 | 0.7434475806451613 | 0.5578647639751434 | 0.6789213486256138 |
| 31 | 0.787375 | 0.7565524193548387 | 0.5687701859474182 | 0.649224087115257 |
| 32 | 0.79575 | 0.7560483870967742 | 0.5432633152008056 | 0.6388471222692921 |
| 33 | 0.7895 | 0.7595766129032258 | 0.5557173223495483 | 0.6391933618053314 |
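In the log above, validation accuracy peaks at epoch 24 while training accuracy keeps climbing, which is exactly the gap early stopping is meant to exploit. A short sketch for locating the best checkpoint from such a log (the validation accuracies are copied verbatim from the table; the selection logic itself is illustrative, not the repository's code):

```python
# valid_acc[i] is the validation accuracy at epoch i + 1 (epochs are 1-based).
valid_acc = [
    0.6476814516129032, 0.6668346774193549, 0.6824596774193549,
    0.6995967741935484, 0.7101814516129032, 0.7061491935483871,
    0.7253024193548387, 0.7253024193548387, 0.7298387096774194,
    0.7116935483870968, 0.7212701612903226, 0.7328629032258065,
    0.7318548387096774, 0.7379032258064516, 0.7288306451612904,
    0.7368951612903226, 0.7227822580645161, 0.7449596774193549,
    0.7283266129032258, 0.7469758064516129, 0.748991935483871,
    0.7444556451612904, 0.7384072580645161, 0.765625,
    0.7459677419354839, 0.75, 0.7474798387096774,
    0.7505040322580645, 0.7605846774193549, 0.7434475806451613,
    0.7565524193548387, 0.7560483870967742, 0.7595766129032258,
]

# Pick the epoch whose validation accuracy is highest.
best_epoch = max(range(1, len(valid_acc) + 1), key=lambda e: valid_acc[e - 1])
best_acc = valid_acc[best_epoch - 1]
print(best_epoch, round(best_acc, 4))  # → 24 0.7656
```

Selecting the checkpoint by validation accuracy (rather than the final epoch) is what an early-stopping callback automates during training.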