Mamba for Cardiac Disease Diagnosis

1. MedMamba reproduction

First, reproduce the MedMamba experiments to use as the base architecture for later work.

(figure: network architecture diagram)

(figure: experimental results)

MedMamba is a 2D model and awkward to adapt, so I will instead modify my earlier segmentation network.

2. echo-mi experiments

2.1 Idea

To train a Mamba model on dual-view (two-chamber + four-chamber) echocardiography for early myocardial-infarction (MI) diagnosis, I designed a Dual-Path Spatiotemporal Fusion Mamba (DPSF-Mamba) architecture. It specifically targets the spatiotemporal feature-fusion problem of multi-view cardiac ultrasound. The core ideas are as follows:

I. Core design ideas

Dual-path heterogeneous feature extraction

Independent encoding paths: dedicated Mamba blocks are designed for the two-chamber (2C) and four-chamber (4C) views, matching the local structural characteristics of each view.

2C path: focuses on motion abnormalities of the left-ventricular anterior wall and apex (regions sensitive to early MI).

(figure: image-20251011130229693)

4C path: captures septal and lateral-wall motion and overall ventricular synchrony (key diagnostic indicators).

(figure: image-20251011130312755)

Dynamic Gated Fusion Module (DGFM): learnable gating weights adaptively fuse the two path features (example formula):

$$F_{fused} = \sigma\!\left(W_g \cdot [F_{2C}, F_{4C}]\right) \odot F_{2C} + \left(1 - \sigma\!\left(W_g \cdot [F_{2C}, F_{4C}]\right)\right) \odot F_{4C}$$

where $W_g$ is a trainable weight and $\sigma$ is the sigmoid function, so feature importance is allocated dynamically between the two views.
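
A minimal PyTorch sketch of the DGFM gate above; the module name, the pooled feature shapes, and the concatenation-then-linear form of $W_g$ are my assumptions, only the formula itself comes from the design note.

```python
import torch
import torch.nn as nn

class DualGateFusion(nn.Module):
    """Gated fusion of pooled 2C and 4C features, following the DGFM formula."""

    def __init__(self, feat_dim: int):
        super().__init__()
        # W_g: maps the concatenated [F_2C, F_4C] to per-channel gate logits.
        self.w_g = nn.Linear(2 * feat_dim, feat_dim)

    def forward(self, f_2c: torch.Tensor, f_4c: torch.Tensor) -> torch.Tensor:
        # f_2c, f_4c: (batch, feat_dim) pooled features from each view path.
        gate = torch.sigmoid(self.w_g(torch.cat([f_2c, f_4c], dim=-1)))
        # F_fused = g ⊙ F_2C + (1 - g) ⊙ F_4C
        return gate * f_2c + (1.0 - gate) * f_4c

# usage: fused = DualGateFusion(256)(f_2c, f_4c)
```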

Multi-task jointly optimized classification head

//todo

II. Technical advantages

| Module | Limitation of vanilla Mamba | DPSF-Mamba improvement | Clinical value |
| --- | --- | --- | --- |
| Multi-view handling | A single path ignores view differences | Dual-path heterogeneous encoding + dynamic fusion | Less view bias, better sensitivity to small lesions |
| Spatiotemporal features | Long-sequence modeling but weak spatial coupling | Spatiotemporal slice reorganization + coordinate attention | Jointly captures motion abnormality and structural deformation |
| Data efficiency | Requires large amounts of labeled data | Multi-task learning with shared features | Mitigates the scarcity of ultrasound annotations |
| Interpretability | Black-box decisions | Gate weights visualize per-view contribution | Helps clinicians understand the model's diagnostic basis |

III. Planned validation of expected effects

Interpretability analysis

Plot gate-weight heatmaps (example below) to verify how strongly the model attends to the affected view:

Anterior-wall infarction: the 4C gate weight peaks at 0.83 (dominates the diagnosis)
Apical infarction: the 2C gate weight rises to 0.79

IV. Potential challenges and solutions

Challenge 1: incomplete dual-view data (some patients are missing one view)
Solution: cross-view knowledge distillation — train a teacher network on complete data to guide a single-view student network (a sketch follows this list).
Challenge 2: ultrasound artifact interference
Solution: add an adversarial denoising module (e.g., a conditional GAN) in front of the Mamba backbone.
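
For the cross-view distillation idea in Challenge 1, a hedged sketch of the loss: a teacher trained on both views supplies soft targets for a single-view student. The weighting `alpha`, temperature `T`, and binary formulation are assumptions, not settings from this log.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logit: torch.Tensor,
                      teacher_logit: torch.Tensor,
                      label: torch.Tensor,
                      alpha: float = 0.5,
                      T: float = 2.0) -> torch.Tensor:
    # Hard-label term: ordinary BCE against the ground-truth MI label.
    hard = F.binary_cross_entropy_with_logits(student_logit, label.float())
    # Soft-label term: match the teacher's temperature-softened probability.
    soft_target = torch.sigmoid(teacher_logit / T).detach()
    soft = F.binary_cross_entropy_with_logits(student_logit / T, soft_target)
    return alpha * hard + (1.0 - alpha) * soft
```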

2.2 faec_advance experiments

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| faec | 0.71875 | 0.7013 | 0.3077 | 1.0 | 0.1818 | 1.0 |

CAMUS, multi-view

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_6 fold_1 | 0.8125 | 0.858333 | 0.769230 | 0.714286 | 0.833333 | 0.8 |
| Experiment_6 fold_2 | 0.75 | 0.741666 | 0.555555 | 0.833333 | 0.416666 | 0.949999 |
| Experiment_6 fold_3 | 0.8125 | 0.899999 | 0.699999 | 0.875 | 0.583333 | 0.949999 |
| Experiment_6 fold_4 | 0.78125 | 0.852814 | 0.588235 | 0.833333 | 0.454545 | 0.952380 |
| Experiment_6 fold_5 | 0.71875 | 0.701298 | 0.307692 | 1.0 | 0.1818 | 1.0 |
| Experiment_6 avg | 0.775 | 0.8108 | 0.5841 | 0.8512 | 0.4939 | 0.9305 |

3. MIMamba experiments

Adapt the Mamba from the earlier segmentation network for classification.

3.1 CAMUS dataset

CAMUS A2C experiments

| name | acc | auc | f1 | precision | recall |
| --- | --- | --- | --- | --- | --- |
| Experiment_4 fold_1 | 0.375 | 0.533333 | 0.545454 | 0.375 | 1.0 |
| Experiment_4 fold_2 | 0.34375 | 0.558441 | 0.511628 | 0.34375 | 1.0 |
| Experiment_5 fold_1 | 0.75 | 0.699999 | 0.636364 | 0.699999 | 0.583333 |
| Experiment_5 fold_2 | 0.71875 | 0.670995 | 0.571428 | 0.6 | 0.545454 |
| Experiment_5 fold_3 | 0.71875 | 0.636363 | 0.470588 | 0.571428 | 0.4 |
| Experiment_5 fold_4 | 0.71875 | 0.590909 | 0.470588 | 0.571428 | 0.4 |
| Experiment_5 fold_5 | 0.625 | 0.490909 | 0.4 | 0.4 | 0.4 |
| Experiment_5 avg | 0.70625 | 0.617835 | 0.509794 | 0.568571 | 0.425757 |

CAMUS, multi-view

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_7 fold_1 | 0.84375 | 0.837499 | 0.761904 | 0.888888 | 0.666666 | 0.949999 |
| Experiment_7 fold_2 | 0.75 | 0.708333 | 0.555555 | 0.833333 | 0.416666 | 0.949999 |
| Experiment_7 fold_3 | 0.8125 | 0.8375 | 0.727272 | 0.8 | 0.666666 | 0.899999 |
| Experiment_7 fold_4 | 0.8125 | 0.818181 | 0.75 | 0.692307 | 0.818181 | 0.809523 |
| Experiment_7 fold_5 | 0.71875 | 0.714285 | 0.307692 | 1.0 | 0.181818 | 1.0 |
| Experiment_7 (monitor: acc) | 0.7875 | 0.7832 | 0.6205 | 0.8429 | 0.5500 | 0.9219 |
| Experiment_8 fold_1 | 0.65625 | 0.875 | 0.266666 | 0.666666 | 0.166666 | 0.949999 |
| Experiment_8 fold_2 | 0.75 | 0.816666 | 0.666666 | 0.666666 | 0.666666 | 0.8 |
| Experiment_8 fold_3 | 0.71875 | 0.858333 | 0.571428 | 0.666666 | 0.5 | 0.85 |
| Experiment_8 fold_4 | 0.65625 | 0.8658 | 0.0 | 0 | 0 | 1 |
| Experiment_8 fold_5 | 0.6875 | 0.796536 | 0.583333 | 0.538461 | 0.636363 | 0.714285 |
| Experiment_8 (monitor: auc) | | | | | | |
| Experiment_9 fold_1 | 0.75 | 0.879166 | 0.5 | 1.0 | 0.333333 | 1.0 |
| Experiment_9 fold_2 | 0.65625 | 0.695833 | 0.153846 | 1.0 | 0.083333 | 1.0 |
| Experiment_9 fold_3 | 0.65625 | 0.770833 | 0.153846 | 1.0 | 0.083333 | 1.0 |
| Experiment_9 fold_4 | 0.6875 | 0.649350 | 0.166666 | 1.0 | 0.090909 | 1.0 |
| Experiment_9 fold_5 | 0.75 | 0.761904 | 0.5 | 0.8 | 0.363636 | 0.952380 |
| Experiment_9 (monitor: precision) | | | | | | |
| Experiment_10 fold_1 | 0.84375 | 0.875 | 0.814814 | 0.733333 | 0.916666 | 0.8 |
| Experiment_10 fold_2 | 0.78125 | 0.75 | 0.695652 | 0.727272 | 0.666666 | 0.85 |
| Experiment_10 fold_3 | 0.84375 | 0.85 | 0.761904 | 0.888888 | 0.666666 | 0.949999 |
| Experiment_10 fold_4 | 0.75 | 0.779220 | 0.692307 | 0.6 | 0.818181 | 0.714285 |
| Experiment_10 fold_5 | 0.625 | 0.675324 | 0.625 | 0.476190 | 0.909090 | 0.476190 |
| Experiment_10 (monitor: f1) | 0.7688 | 0.7859 | 0.7179 | 0.6851 | 0.7955 | 0.7581 |
| Experiment_11 fold_1 | 0.90625 | 0.85 | 0.869565 | 0.909090 | 0.833333 | 0.949999 |
| Experiment_11 fold_2 | 0.75 | 0.754166 | 0.714285 | 0.625 | 0.833333 | 0.699999 |
| Experiment_11 fold_3 | 0.78125 | 0.820833 | 0.740740 | 0.666666 | 0.833333 | 0.75 |
| Experiment_11 fold_4 | 0.75 | 0.744588 | 0.714285 | 0.588235 | 0.909090 | 0.666666 |
| Experiment_11 fold_5 | 0.65625 | 0.679653 | 0.645161 | 0.5 | 0.909090 | 0.523809 |
| Experiment_best | 0.8063 | 0.8129 | 0.7321 | 0.7512 | 0.7242 | 0.8547 |
| Experiment_15 fold_1 | 0.875 | 0.870833 | 0.846153 | 0.785714 | 0.916666 | 0.85 |
| Experiment_15 fold_2 | 0.8125 | 0.762499 | 0.769230 | 0.714285 | 0.833333 | 0.8 |
| Experiment_15 fold_3 | 0.84375 | 0.854166 | 0.761904 | 0.888888 | 0.666666 | 0.949999 |
| Experiment_15 fold_4 | 0.75 | 0.709956 | 0.6 | 0.666666 | 0.545454 | 0.857142 |
| Experiment_15 fold_5 | 0.75 | 0.718614 | 0.692307 | 0.6 | 0.818181 | 0.714285 |
| Experiment_15 | | | | | | |
| Experiment_18 fold_1 | 0.78125 | 0.725 | 0.72 | 0.6923 | 0.75 | 0.8 |
| Experiment_23 fold_1 | 0.78125 | 0.7625 | 0.6666 | 0.7777 | 0.5833 | 0.8999 |
| Experiment_24 fold_1 | 0.8125 | 0.8166 | 0.6999 | 0.875 | 0.5833 | 0.949999 |
| Experiment_25 fold_1 | 0.84375 | 0.862499 | 0.761904 | 0.888888 | 0.666666 | 0.949999 |

Using AdamW

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_28 fold_1 | 0.8125 | 0.791666 | 0.75 | 0.75 | 0.75 | 0.85 |
| Experiment_29 fold_1 | 0.8125 | 0.7958 | 0.75 | 0.75 | 0.75 | 0.85 |

CAMUS, mi_mamba with separate A2C and A4C paths

1) d_state: 16 / d_state: 32

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_26 fold_1 | 0.78125 | 0.766666 | 0.72 | 0.692307 | 0.75 | 0.8 |

Switched to AdamW

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_27 fold_1 | 0.8125 | 0.75 | 0.75 | 0.75 | 0.75 | 0.85 |

CAMUS, multi-view, MultiStageFusion

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_12 fold_1 | 0.8125 | 0.866666 | 0.699999 | 0.875 | 0.583333 | 0.949999 |
| Experiment_12 fold_2 | 0.75 | 0.720833 | 0.666666 | 0.666666 | 0.666666 | 0.8 |
| Experiment_12 fold_3 | 0.78125 | 0.75 | 0.631579 | 0.857142 | 0.5 | 0.949 |
| Experiment_13 fold_1 | 0.8125 | 0.754166 | 0.699999 | 0.875 | 0.583333 | 0.949999 |
| Experiment_14 fold_1 | 0.8125 | 0.891666 | 0.785714 | 0.6875 | 0.916666 | 0.75 |

CAMUS, multi-scale fusion model

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_17 fold_1 | 0.71875 | 0.695833 | 0.666666 | 0.6 | 0.75 | 0.699999 |

CAMUS, hierarchical fusion model (mi_mamba_hierarchical_model)

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_16 fold_1 | 0.78125 | 0.8208 | 0.7407 | 0.666666 | 0.833333 | 0.75 |
| Experiment_16 fold_2 | 0.6875 | 0.5208 | 0.375 | 0.75 | 0.25 | 0.9499 |
| Experiment_16 fold_3 | 0.75 | 0.8 | 0.6 | 0.75 | 0.5 | 0.8999 |
| Experiment_16 fold_4 | 0.75 | 0.6883 | 0.6 | 0.666666 | 0.545454 | 0.857142 |
| Experiment_16 fold_5 | 0.6875 | 0.580086 | 0.375 | 0.6 | 0.272727 | 0.904762 |
| Experiment_19 fold_1 | 0.75 | 0.758333 | 0.5555 | | | |
| Experiment_20 fold_1 | 0.75 | 0.7333 | 0.6666 | 0.666666 | 0.666666 | 0.8 |
| Experiment_21 fold_1 | 0.78125 | 0.75 | 0.695652 | 0.727272 | 0.666666 | 0.85 |
| Experiment_22 fold_1 | 0.75 | 0.6875 | 0.5555 | 0.8333 | 0.416666 | 0.9499 |
| | 0.75 | 0.6958 | 0.6 | 0.75 | 0.5 | 0.899999 |

Switched to AdamW

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_30 fold_1 | 0.78125 | 0.708333 | 0.666666 | 0.777777 | 0.583333 | 0.899999 |

cross-ssm

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_32 fold_1 | 0.625 | 0.475 | 0 | 0 | 0 | 1 |
| Experiment_33 fold_1 | 0.78125 | 0.783333 | 0.666666 | 0.777777 | 0.583333 | 0.899999 |
| Experiment_34 fold_1 (gate) | 0.75 | 0.754166 | 0.636363 | 0.699999 | 0.583333 | 0.85 |
| Experiment_35 fold_1 (gate w) | 0.8125 | 0.804166 | 0.727272 | 0.8 | 0.666666 | 0.899999 |
| Experiment_36 fold_1 (attn w) | 0.75 | 0.720833 | 0.6 | 0.75 | 0.5 | 0.899999 |
| Experiment_37 fold_1 (gate attn w) | 0.8125 | 0.825 | 0.727272 | 0.8 | 0.666666 | 0.899999 |
| Experiment_ fold_1 (attn gate w) | 0.75 | 0.774999 | 0.636363 | 0.699999 | 0.583333 | 0.85 |

Judging from the results above, cross-ssm works reasonably well, so I keep using cross-ssm and compare the CAMUS frame-padding strategies (a sketch of two of the strategies follows).
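
A hedged sketch of what the "repeat" and "interpolate" padding strategies compared below might look like; the function names and the `(T, C, H, W)` clip layout are my assumptions.

```python
import torch
import torch.nn.functional as F

def pad_repeat(clip: torch.Tensor, target_t: int) -> torch.Tensor:
    # clip: (T, C, H, W); repeat the last frame until target_t frames.
    t = clip.shape[0]
    if t >= target_t:
        return clip[:target_t]
    tail = clip[-1:].expand(target_t - t, -1, -1, -1)
    return torch.cat([clip, tail], dim=0)

def pad_interpolate(clip: torch.Tensor, target_t: int) -> torch.Tensor:
    # Resample the clip to target_t frames with linear interpolation over time.
    x = clip.permute(1, 0, 2, 3).unsqueeze(0)           # (1, C, T, H, W)
    x = F.interpolate(x, size=(target_t, clip.shape[2], clip.shape[3]),
                      mode="trilinear", align_corners=False)
    return x.squeeze(0).permute(1, 0, 2, 3)             # (target_t, C, H, W)
```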

cross-ssm, gate attn w

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_39 fold_1 (repeat) | 0.78125 | 0.729166 | 0.666666 | 0.777777 | 0.583333 | 0.899999 |
| Experiment_40 fold_1 (interpolate) | 0.84375 | 0.845833 | 0.8 | 0.769230 | 0.833333 | 0.85 |
| Experiment_40 fold_1 (random_repeat) | 0.75 | 0.754166 | 0.666666 | 0.666666 | 0.666666 | 0.8 |
| Experiment_41 fold_1 (cyclic) | 0.8125 | 0.791666 | 0.75 | 0.75 | 0.75 | 0.85 |
| Experiment_42 fold_1 (reflect) | 0.71875 | 0.745833 | 0.666666 | 0.6 | 0.75 | 0.699999 |
| Experiment_43 fold_1 (noise) | 0.75 | 0.745833 | 0.692307 | 0.642857 | 0.75 | 0.75 |
| Experiment_44 fold_1 (random) | 0.8125 | 0.783333 | 0.727272 | 0.8 | 0.666666 | 0.899999 |

Try lowering the hidden size to 256

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_60 fold_1 (repeat) | 0.75 | 0.766666 | 0.636363 | 0.699999 | 0.583333 | 0.85 |
| Experiment_61 fold_1 (interpolate) | 0.78125 | 0.829166 | 0.666666 | 0.777777 | 0.583333 | 0.899999 |
| Experiment_62 fold_1 (random_repeat) | 0.8125 | 0.8083 | 0.7692 | 0.714285 | 0.833333 | 0.8 |
| Experiment_63 fold_1 (cyclic) | 0.78125 | 0.716666 | 0.72 | 0.6923 | 0.75 | 0.8 |
| Experiment_64 fold_1 (reflect) | 0.78125 | 0.795833 | 0.666666 | 0.777777 | 0.583333 | 0.899999 |
| Experiment_65 fold_1 (noise) | 0.8125 | 0.795833 | 0.75 | 0.75 | 0.75 | 0.85 |
| Experiment_66 fold_1 (random) | 0.78125 | 0.7875 | 0.695652 | 0.727272 | 0.666666 | 0.85 |

cross-ssm

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_45 fold_1 | 0.84375 | 0.858333 | 0.761904 | 0.888888 | 0.666666 | 0.949999 |
| Experiment_45 fold_1 | 0.78125 | 0.8 | 0.72 | 0.692307 | 0.75 | 0.8 |
| Experiment_46 fold_1 | 0.71875 | 0.704166 | 0.666666 | 0.6 | 0.75 | 0.699999 |
| Experiment_47 fold_1 | 0.78125 | 0.758333 | 0.695652 | 0.727272 | 0.666666 | 0.85 |
| Experiment_48 fold_1 (b8) | 0.78125 | 0.779166 | 0.72 | 0.692307 | 0.75 | 0.8 |
| Experiment_49 fold_1 (b16) | 0.78125 | 0.758333 | 0.740740 | 0.666666 | 0.833333 | 0.75 |

Increased model features to [32, 64, 128, 256]:

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_50 fold_1 (b8) | 0.78125 | 0.741666 | 0.740740 | 0.666666 | 0.833333 | 0.75 |
| Experiment_51 fold_1 (b8) | 0.78125 | 0.7875 | 0.758620 | 0.647058 | 0.916666 | 0.699999 |
| Experiment_52 fold_1 (b8) | 0.75 | 0.804166 | 0.714285 | 0.625 | 0.833333 | 0.699999 |

Switched back to padding to 32 frames:

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_53 fold_1 | 0.78125 | 0.779166 | 0.72 | 0.6923 | 0.75 | 0.8 |
| Experiment_54 fold_1 | 0.78125 | 0.7625 | 0.666666 | 0.7777 | 0.583333 | 0.899999 |

Increased hidden size to 512:

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_56 fold_1 | 0.71875 | 0.666666 | 0.666666 | 0.6 | 0.75 | 0.699999 |
| Experiment_57 fold_1 | 0.78125 | 0.729166 | 0.666666 | 0.777777 | 0.583333 | 0.899999 |

Hidden size 256:

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_58 fold_1 | 0.78125 | 0.754166 | 0.740740 | 0.666666 | 0.833333 | 0.75 |

Hidden size 768:

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_59 fold_1 | 0.78125 | 0.7875 | 0.6315 | 0.8571 | 0.5 | 0.9499 |

All of the runs above used BCE loss.

Loss functions

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_67 fold_1 (focal) | 0.6875 | 0.770833 | 0.545454 | 0.6 | 0.5 | 0.8 |
| Experiment_68 fold_1 (dice) | 0.75 | 0.779166 | 0.6 | 0.75 | 0.5 | 0.899999 |
| Experiment_69 fold_1 (bce_dice) | 0.75 | 0.729166 | 0.636363 | 0.699999 | 0.583333 | 0.85 |
| Experiment_70 fold_1 (tversky) | 0.875 | 0.829166 | 0.846153 | 0.785714 | 0.916666 | 0.85 |
| Experiment_71 fold_1 (asymmetric) | 0.78125 | 0.766666 | 0.72 | 0.6923 | 0.75 | 0.8 |
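
For reference, a minimal sketch of a Tversky loss applied to the binary MI probability, as in the tversky run above; the `alpha`/`beta` values and the batch-level formulation are assumptions.

```python
import torch

def tversky_loss(logits: torch.Tensor, targets: torch.Tensor,
                 alpha: float = 0.3, beta: float = 0.7,
                 eps: float = 1e-6) -> torch.Tensor:
    # logits, targets: (batch,) for binary classification; soft set overlap.
    p = torch.sigmoid(logits)
    t = targets.float()
    tp = (p * t).sum()            # soft true positives
    fp = (p * (1 - t)).sum()      # soft false positives
    fn = ((1 - p) * t).sum()      # soft false negatives
    return 1.0 - (tp + eps) / (tp + alpha * fp + beta * fn + eps)
```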

Full experiment with Tversky loss

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_70 fold_1 | 0.875 | 0.829166 | 0.846153 | 0.785714 | 0.916666 | 0.85 |
| Experiment_70 fold_2 | 0.75 | 0.675 | 0.666666 | 0.666666 | 0.666666 | 0.8 |
| Experiment_70 fold_3 | 0.71875 | 0.691666 | 0.526315 | 0.714285 | 0.416666 | 0.899999 |
| Experiment_70 fold_4 | 0.71875 | 0.718614 | 0.608695 | 0.583333 | 0.636363 | 0.761904 |
| Experiment_70 fold_5 | 0.5625 | 0.597402 | 0.533333 | 0.421052 | 0.727272 | 0.476190 |

Switch to 16 frames instead

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_74 fold_1 | 0.84375 | 0.770833 | 0.782608 | 0.818181 | 0.75 | 0.899999 |
| Experiment_74 fold_2 | 0.6875 | 0.645833 | 0.375 | 0.75 | 0.25 | 0.949999 |
| Experiment_74 fold_3 | 0.6875 | 0.6875 | 0.5454 | 0.6 | 0.5 | 0.8 |
| Experiment_74 fold_4 | 0.78125 | 0.744588 | 0.666666 | 0.699999 | 0.636363 | 0.857142 |
| Experiment_74 fold_5 | 0.6875 | 0.670995 | 0.583333 | 0.538461 | 0.636363 | 0.714285 |

BCE loss

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_75 fold_1 | 0.8125 | 0.7666 | 0.75 | 0.75 | 0.75 | 0.85 |
| Experiment_75 fold_2 | 0.6875 | 0.637499 | 0.444444 | 0.666666 | 0.333333 | 0.899999 |
| Experiment_75 fold_3 | 0.6875 | 0.612499 | 0.583333 | 0.583333 | 0.583333 | 0.75 |
| Experiment_75 fold_4 | 0.75 | 0.766233 | 0.555555 | 0.714285 | 0.454545 | 0.904761 |
| Experiment_75 fold_5 | 0.59375 | 0.683982 | 0.580645 | 0.449999 | 0.818181 | 0.476190 |

Module ablation: remove the extra modules

BCE loss

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_76 fold_1 | 0.8125 | 0.8292 | 0.7273 | 0.8 | 0.6667 | 0.8999 |
| Experiment_76 fold_2 | 0.75 | 0.7083 | 0.6923 | 0.6429 | 0.75 | 0.75 |
| Experiment_76 fold_3 | 0.75 | 0.7916 | 0.6364 | 0.6999 | 0.5833 | 0.85 |
| Experiment_76 fold_4 | 0.78125 | 0.7359 | 0.6666 | 0.6999 | 0.6363 | 0.8571 |
| Experiment_76 fold_5 | 0.4375 | 0.5757 | 0.5263 | 0.3703 | 0.9090 | 0.1904 |

With only the gated module added

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_77 fold_1 | 0.6875 | 0.7375 | 0.6666 | 0.5555 | 0.8333 | 0.6 |
| Experiment_77 fold_2 | 0.7187 | 0.6666 | 0.64 | 0.6153 | 0.6666 | 0.75 |
| Experiment_77 fold_3 | 0.5937 | 0.5916 | 0.6285 | 0.4782 | 0.9166 | 0.4 |
| Experiment_77 fold_4 | 0.5625 | 0.6147 | 0.5882 | 0.4347 | 0.9090 | 0.3809 |
| Experiment_77 fold_5 | | | | | | |

With only the attention module added

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_78 fold_1 | 0.75 | 0.7541 | 0.6666 | 0.6666 | 0.6666 | 0.8 |
| Experiment_78 fold_2 | 0.71875 | 0.6916 | 0.64 | 0.6153 | 0.6666 | 0.75 |
| Experiment_78 fold_3 | 0.625 | 0.6208 | 0.6 | 0.5 | 0.75 | 0.55 |
| Experiment_78 fold_4 | 0.7187 | 0.5930 | 0.5263 | 0.625 | 0.4545 | 0.8571 |
| Experiment_78 fold_5 | | | | | | |

3.1.1 mamba+echoprime video

Two branches extract features from the two views, one with echo_prime and one with Mamba, and the features are fed jointly into the classifier.

mamba: A2C view data

echo_prime: A4C view data

Fusion: channel concatenation (a minimal sketch of this setup follows)
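
A sketch of the dual-branch setup described above, assuming each encoder returns a pooled feature vector that is concatenated for the classifier; the encoder interfaces, feature sizes, and classifier head are placeholders, not the actual model code.

```python
import torch
import torch.nn as nn

class DualBranchClassifier(nn.Module):
    """Mamba branch on the A2C clip, EchoPrime branch on the A4C clip."""

    def __init__(self, mamba_encoder: nn.Module, echoprime_encoder: nn.Module,
                 d_mamba: int, d_echo: int):
        super().__init__()
        self.mamba_encoder = mamba_encoder          # processes the A2C clip
        self.echoprime_encoder = echoprime_encoder  # processes the A4C clip
        self.classifier = nn.Sequential(
            nn.Linear(d_mamba + d_echo, 256), nn.ReLU(), nn.Linear(256, 1))

    def forward(self, a2c_clip: torch.Tensor, a4c_clip: torch.Tensor):
        f_a2c = self.mamba_encoder(a2c_clip)        # (B, d_mamba)
        f_a4c = self.echoprime_encoder(a4c_clip)    # (B, d_echo)
        fused = torch.cat([f_a2c, f_a4c], dim=-1)   # channel concatenation
        return self.classifier(fused)               # MI logit
```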

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_79 fold_1 | 0.7813 | 0.7333 | 0.6667 | 0.7777 | 0.5833 | 0.90 |
| Experiment_79 fold_2 | 0.78125 | 0.774999 | 0.5882 | 1.0 | 0.4166 | 1.0 |
| Experiment_79 fold_3 | 0.875 | 0.9208 | 0.818181 | 0.899999 | 0.75 | 0.949999 |
| Experiment_79 fold_4 | 0.84375 | 0.839826 | 0.761904 | 0.8 | 0.727272 | 0.904761 |
| Experiment_79 fold_5 | 0.78125 | 0.757575 | 0.6666 | 0.699999 | 0.636363 | 0.857142 |
| Experiment_79 | 0.8125 | | | | | |

Preserving a 2×2×2 spatial structure (a possible pooling sketch follows)
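
One way to "preserve a 2×2×2 spatial structure" before fusion would be to adaptively pool the (B, C, T, H, W) feature map to (2, 2, 2) and flatten, instead of global pooling to a single vector; this is my reading of the log, not the actual implementation.

```python
import torch
import torch.nn as nn

pool = nn.AdaptiveAvgPool3d((2, 2, 2))  # swap in (4, 4, 4) for the next variant

def pooled_tokens(feat: torch.Tensor) -> torch.Tensor:
    # feat: (B, C, T, H, W) -> (B, C * 2 * 2 * 2), keeping coarse spatial layout.
    return pool(feat).flatten(1)
```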

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_80 fold_1 | 0.78125 | 0.695833 | 0.666666 | 0.777777 | 0.5833 | 0.8999 |
| Experiment_80 fold_2 | 0.78125 | 0.787499 | 0.695652 | 0.727272 | 0.666666 | 0.85 |
| Experiment_80 fold_3 | 0.875 | 0.8958 | 0.846153 | 0.785714 | 0.916666 | 0.85 |
| Experiment_80 fold_4 | 0.84375 | 0.839826 | 0.782608 | 0.75 | 0.818181 | 0.857142 |
| Experiment_80 fold_5 | 0.8125 | 0.787878 | 0.6666 | 0.8571 | 0.5454 | 0.9523 |
| Experiment_80 | 0.8188 | 0.8014 | 0.7315 | 0.7796 | 0.7060 | 0.8818 |

Preserving a 4×4×4 spatial structure

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_81 fold_1 | 0.8125 | 0.875 | 0.6999 | 0.875 | 0.5833 | 0.9499 |
| Experiment_81 fold_2 | 0.8125 | 0.775 | 0.7272 | 0.8 | 0.666666 | 0.899999 |
| Experiment_81 fold_3 | 0.875 | 0.879166 | 0.8333 | 0.8333 | 0.8333 | 0.8999 |
| Experiment_81 fold_4 | 0.84375 | 0.848484 | 0.7826 | 0.75 | 0.818181 | 0.857142 |
| Experiment_81 fold_5 | 0.84375 | 0.7922 | 0.7619 | 0.8 | 0.727272 | 0.904761 |
| Experiment_81 | 0.8375 | 0.8139 | 0.7610 | 0.8117 | 0.7257 | 0.9023 |

Swap the feature extractors

mamba: A4C view data

echo_prime: A2C view data

Fusion: channel concatenation

Preserving a 4×4×4 spatial structure

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_82 fold_1 | 0.90625 | 0.879166 | 0.879999 | 0.846153 | 0.916666 | 0.899999 |
| Experiment_82 fold_2 | 0.78125 | 0.741666 | 0.695652 | 0.727272 | 0.666666 | 0.85 |
| Experiment_82 fold_3 | 0.8125 | 0.8666 | 0.7692 | 0.714285 | 0.8333 | 0.8 |
| Experiment_82 fold_4 | 0.90625 | 0.9220 | 0.842105 | 1.0 | 0.727272 | 1.0 |
| Experiment_82 fold_5 | 0.75 | 0.6666 | 0.5 | 0.8 | 0.363636 | 0.9523 |
| Experiment_82 | 0.8312 | 0.8152 | 0.7373 | 0.8175 | 0.6976 | 0.8805 |

Learning rate

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_83 fold_5 (1e-4) | 0.71875 | 0.6320 | 0.3077 | 1.0 | 0.1818 | 1.0 |
| Experiment_84 fold_5 (1e-6) | 0.78125 | 0.7445 | 0.5882 | 0.8333 | 0.454545 | 0.952380 |

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_84 fold_1 | 0.9375 | 0.904166 | 0.916666 | 0.916666 | 0.916666 | 0.949999 |
| Experiment_84 fold_2 | 0.78125 | 0.699999 | 0.72 | 0.692307 | 0.75 | 0.8 |
| Experiment_84 fold_3 | 0.84375 | 0.816666 | 0.8 | 0.769230 | 0.833333 | 0.85 |
| Experiment_84 fold_4 | 0.90625 | 0.8961 | 0.8571 | 0.8999 | 0.818181 | 0.952380 |
| Experiment_84 fold_5 | 0.78125 | 0.7445 | 0.5882 | 0.8333 | 0.454545 | 0.952380 |
| Experiment_84 (95%) | 0.8500 | 0.8249 | 0.7857 | 0.8148 | 0.7586 | 0.9020 |

3.1.2 mamba+echoprime text video

Add knowledge vectors at three layers of the Mamba decoder (a hedged sketch of one possible injection mechanism follows this block).

mamba: A4C view data

echo_prime: A2C view data

Fusion: channel concatenation

Preserving a 4×4×4 spatial structure; loss: BCE
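
One possible way to inject a "knowledge vector" (e.g., an EchoPrime text embedding) into a decoder feature map: project it to the feature channels and add it as a broadcast bias. The real model may use a different mechanism (e.g., cross-attention); this only illustrates the idea.

```python
import torch
import torch.nn as nn

class KnowledgeInjection(nn.Module):
    """Adds a projected knowledge/text embedding to a 3D decoder feature map."""

    def __init__(self, text_dim: int, feat_channels: int):
        super().__init__()
        self.proj = nn.Linear(text_dim, feat_channels)

    def forward(self, feat: torch.Tensor, knowledge: torch.Tensor) -> torch.Tensor:
        # feat: (B, C, T, H, W); knowledge: (B, text_dim)
        bias = self.proj(knowledge)[:, :, None, None, None]  # (B, C, 1, 1, 1)
        return feat + bias
```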

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_102 fold_1 | 0.9375 | 0.895833 | 0.916666 | 0.916666 | 0.916666 | 0.949999 |
| Experiment_102 fold_2 | 0.78125 | 0.7374 | 0.6666 | 0.7777 | 0.5833 | 0.8999 |
| Experiment_102 fold_3 | 0.78125 | | | | | |
| Experiment_102 fold_4 | 0.875 | | | | | |
| Experiment_102 fold_5 | 0.75 | | | | | |
| Experiment_102 | 0.816 | | | | | |

Add knowledge vectors both to the Mamba decoder and to the EchoPrime video extractor.

Learning rate 1e-5

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_103 fold_1 | 0.90625 | 0.8791 | 0.8799 | 0.8461 | 0.9166 | 0.8999 |
| Experiment_103 fold_2 | 0.78125 | | | | | |
| Experiment_103 fold_3 | 0.8125 | | | | | |
| Experiment_103 fold_4 | 0.875 | 0.8744 | 0.8 | 0.8888 | 0.7272 | 0.9523 |
| Experiment_103 fold_5 | 0.75 | | | | | |
| Experiment_103 | 0.825 | | | | | |

Loss: asymmetric, learning rate 1e-5

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_104 fold_1 | 0.90625 | | | | | |
| Experiment_104 fold_2 | 0.78125 | | | | | |
| Experiment_104 fold_5 | 0.75 | | | | | |

Learning rate sweep

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_105 fold_5 (1e-6) | 0.71875 | | | | | |
| Experiment_106 fold_5 (1e-5) | 0.75 | | | | | |
| Experiment_107 fold_5 (1e-4) | 0.71875 | | | | | |
| Experiment_108 fold_5 (3e-4) | 0.75 | | | | | |
| Experiment_109 fold_5 (3e-5) | 0.6875 | | | | | |
| Experiment_110 fold_5 (3e-6) | 0.71875 | | | | | |

The results are poor; fall back to adding knowledge vectors at a single layer.

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_111 fold_5 (1e-6) | 0.75 | | | | | |

Add knowledge vectors at the bottleneck of the A4C Mamba and also in the A2C video encoder.

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_112 fold_1 | 0.9375 | 0.925 | 0.916666 | 0.916666 | 0.916666 | 0.9499 |
| Experiment_112 fold_2 | 0.78125 | 0.7374 | 0.6666 | 0.7777 | 0.5833 | 0.8999 |
| Experiment_112 fold_3 | 0.78125 | 0.7958 | 0.72 | 0.6923 | 0.75 | 0.8 |
| Experiment_112 fold_4 | 0.875 | 0.8701 | 0.8181 | 0.818181 | 0.818181 | 0.9047 |
| Experiment_112 fold_5 | | | | | | |
| Experiment_112 | | | | | | |

Batch size 16

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_113 fold_1 | 0.90625 | 0.9208 | 0.8799 | 0.8461 | 0.9166 | 0.8999 |
| Experiment_113 fold_2 | 0.78125 | 0.725 | 0.72 | 0.6923 | 0.75 | 0.8 |
| Experiment_113 fold_3 | 0.75 | 0.8125 | 0.7142 | 0.625 | 0.8333 | 0.6999 |
| Experiment_113 fold_4 | 0.90625 | 0.8658 | 0.8571 | 0.8999 | 0.8181 | 0.9523 |
| Experiment_113 fold_5 | 0.78125 | 0.6666 | 0.6666 | 0.6999 | 0.6363 | 0.8571 |
| Experiment_113 | | | | | | |

Swapping A2C and A4C performs very poorly

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_114 fold_1 | 0.75 | 0.8458 | 0.7333 | 0.6111 | 0.9166 | 0.6499 |
| Experiment_114 fold_2 | 0.8125 | 0.875 | 0.7692 | 0.7142 | 0.8333 | 0.8 |
| Experiment_114 fold_3 | 0.78125 | 0.825 | 0.7586 | 0.6470 | 0.9166 | 0.6999 |
| Experiment_114 fold_4 | 0.6875 | 0.7748 | 0.6428 | 0.5294 | 0.8181 | 0.6190 |
| Experiment_114 fold_5 | 0.75 | 0.6926 | 0.6666 | 0.6153 | 0.7272 | 0.7619 |

Knowledge vectors added to the Mamba decoder and to the EchoPrime video encoder.

mamba: A4C view data

echo_prime: A2C view data

Fusion: channel concatenation

Preserving a 4×4×4 spatial structure; loss: BCE

Learning rate 1e-6, batch size 8

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_115 fold_1 | 0.9375 | 0.9125 | 0.9166 | 0.9166 | 0.9166 | 0.9499 |
| Experiment_115 fold_2 | 0.8125 | 0.7333 | 0.75 | 0.75 | 0.75 | 0.85 |
| Experiment_115 fold_3 | 0.8125 | 0.8041 | 0.7272 | 0.8 | 0.6666 | 0.8999 |
| Experiment_115 fold_4 | 0.90625 | 0.9004 | 0.8571 | 0.8999 | 0.8181 | 0.9523 |
| Experiment_115 fold_5 | 0.78125 | 0.7835 | 0.5882 | 0.8333 | 0.4545 | 0.9523 |
| Experiment_115 (95%) | 0.8500 | 0.8315 | 0.7778 | 0.8400 | 0.7241 | 0.9216 |

3.2 HMC dataset

A quick first attempt; the results are quite poor.

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_73 fold_1 | 0.84375 | 0.884057 | 0.888888 | 0.909090 | 0.869565 | 0.777777 |
| Experiment_73 fold_2 | 0.75 | 0.744588 | 0.826086 | 0.759999 | 0.904761 | 0.454545 |
| Experiment_73 fold_3 | 0.6875 | 0.647058 | 0.75 | 0.652174 | 0.882352 | 0.466666 |
| Experiment_73 fold_4 | 0.6875 | 0.536796 | 0.807692 | 0.677419 | 1.0 | 0.090909 |
| Experiment_73 fold_5 | | | | | | |

3.2.1 mamba+echoprime video

mamba: A4C view data

echo_prime: A2C view data

Fusion: channel concatenation

Preserving a 4×4×4 spatial structure

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_85 fold_1 | 0.90625 | 0.913043 | 0.936170 | 0.916666 | 0.956521 | 0.777777 |
| Experiment_85 fold_2 | 0.78125 | 0.787878 | 0.829268 | 0.85 | 0.8095 | 0.7272 |
| Experiment_85 fold_3 | 0.8125 | 0.8784 | 0.8235 | 0.8235 | 0.8235 | 0.8 |
| Experiment_85 fold_4 | 0.8125 | 0.8095 | 0.85 | 0.8947 | 0.8095 | 0.8182 |
| Experiment_85 fold_5 | 0.84375 | 0.878787 | 0.87179 | 0.9444 | 0.8095 | 0.9090 |
| Experiment_85 | 0.83125 | | | | | |

Loss changed to Tversky

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_86 fold_2 | 0.78125 | 0.813852 | 0.820512 | 0.888888 | 0.761904 | 0.818181 |

Loss changed to focal

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_87 fold_2 | 0.78125 | 0.8 | 0.8108 | 0.9375 | 0.714285 | 0.9090 |

Loss changed to dice

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_88 fold_2 | 0.78125 | 0.783549 | 0.820512 | 0.888888 | 0.761904 | 0.818181 |

Loss changed to bce_dice

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_89 fold_2 | 0.78125 | 0.805194 | 0.8205 | 0.8888 | 0.7619 | 0.818181 |

Loss changed to asymmetric

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_90 fold_1 | 0.90625 | 0.932367 | 0.936170 | 0.916666 | 0.956521 | 0.777777 |
| Experiment_90 fold_2 | 0.8125 | 0.8484 | 0.8571 | 0.8571 | 0.8571 | 0.7272 |
| Experiment_90 fold_3 | 0.84375 | 0.843137 | 0.864864 | 0.8 | 0.941176 | 0.733333 |
| Experiment_90 fold_4 | 0.8125 | 0.835497 | 0.85 | 0.894736 | 0.809523 | 0.818181 |
| Experiment_90 fold_5 | 0.84375 | 0.861471 | 0.883720 | 0.863636 | 0.904761 | 0.727272 |
| Experiment_90 | 0.84375 | 0.864175 | 0.888351 | 0.866628 | 0.891956 | 0.753332 |

Changed to cross-attention fusion (a minimal sketch follows; results in the table below)
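
A minimal sketch of the cross-attention fusion variant, assuming the A2C tokens attend to the A4C tokens; the actual query/key assignment, head count, and residual scheme are not recorded in this log.

```python
import torch
import torch.nn as nn

class CrossAttentionFusion(nn.Module):
    """A2C tokens query the A4C tokens; output is a pooled fused feature."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, f_a2c: torch.Tensor, f_a4c: torch.Tensor) -> torch.Tensor:
        # f_a2c, f_a4c: (B, N, dim) token sequences from the two branches.
        attended, _ = self.attn(query=f_a2c, key=f_a4c, value=f_a4c)
        fused = self.norm(f_a2c + attended)   # residual + layer norm
        return fused.mean(dim=1)              # (B, dim) pooled fused feature
```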

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_95 fold_1 | 0.84375 | 0.903381 | 0.883720 | 0.949999 | 0.826086 | 0.888888 |
| Experiment_95 fold_2 | 0.78125 | 0.779220 | 0.851063 | 0.769230 | 0.952380 | 0.454545 |
| Experiment_95 fold_3 | 0.8125 | | | | | |
| Experiment_95 fold_4 | 0.78125 | | | | | |
| Experiment_95 fold_5 | 0.8125 | 0.818181 | 0.857142 | 0.857142 | 0.857142 | 0.727272 |

Experiment_95 performs poorly overall.

Weighted fusion (a possible form is sketched below; results in the table that follows)
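
A possible form of the "weighted fusion" variant: a single learnable scalar, passed through a sigmoid, blends the two branch features. The real implementation is not recorded in the log; this is only an illustration.

```python
import torch
import torch.nn as nn

class WeightedFusion(nn.Module):
    """Blends two branch features with one learnable mixing weight."""

    def __init__(self):
        super().__init__()
        self.alpha = nn.Parameter(torch.zeros(1))  # learnable mixing logit

    def forward(self, f_a2c: torch.Tensor, f_a4c: torch.Tensor) -> torch.Tensor:
        w = torch.sigmoid(self.alpha)
        return w * f_a2c + (1.0 - w) * f_a4c
```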

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_96 fold_1 | 0.90625 | 0.937198 | 0.936170 | 0.916666 | 0.9565 | 0.7777 |
| Experiment_96 fold_2 | 0.8125 | 0.826839 | 0.857142 | 0.857142 | 0.857142 | 0.727272 |
| Experiment_96 fold_3 | 0.8125 | 0.882352 | 0.849999 | 0.739130 | 1.0 | 0.6 |
| Experiment_96 fold_4 | 0.8125 | 0.826839 | 0.85 | 0.894736 | 0.809523 | 0.818181 |
| Experiment_96 fold_5 | 0.84375 | 0.857142 | 0.883720 | 0.863636 | 0.904761 | 0.727272 |
| Experiment_96 | | | | | | |

AdamW optimizer

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_97 fold_1 | 0.90625 | 0.927536 | 0.930232 | 1.0 | 0.869565 | 1.0 |
| Experiment_97 fold_2 | 0.78125 | 0.831168 | 0.810810 | 0.9375 | 0.714285 | 0.909090 |
| Experiment_97 fold_3 | 0.78125 | 0.882352 | 0.774193 | 0.857142 | 0.705882 | 0.866666 |
| Experiment_97 fold_4 | 0.8125 | 0.844155 | 0.842105 | 0.941176 | 0.761904 | 0.909090 |
| Experiment_97 fold_5 | 0.8125 | 0.883116 | 0.85 | 0.894736 | 0.809523 | 0.818181 |
| Experiment_97 | | | | | | |

SGD

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_98 fold_1 | 0.8125 | 0.816425 | 0.869565 | 0.869565 | 0.869565 | 0.666666 |
| Experiment_98 fold_2 | 0.65625 | | | | | |

RMSprop

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_99 fold_1 | 0.90625 | 0.913043 | 0.930232 | 1.0 | 0.869565 | 1.0 |
| Experiment_99 fold_2 | 0.78125 | 0.822510 | 0.810810 | 0.9375 | 0.714285 | 0.909090 |

Adagrad

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_100 fold_1 | 0.78125 | | | | | |
| Experiment_100 fold_2 | 0.65625 | | | | | |

Switch back to the MADGRAD optimizer

Added some data augmentation

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_101 fold_1 | 0.875 | 0.9565 | 0.916666 | 0.879999 | 0.956521 | 0.666666 |
| Experiment_101 fold_2 | 0.8125 | 0.7922 | 0.85 | 0.8947 | 0.8095 | 0.818181 |
| Experiment_101 fold_3 | 0.84375 | 0.886274 | 0.871794 | 0.772727 | 1.0 | 0.666666 |
| Experiment_101 fold_4 | 0.75 | | | | | |
| Experiment_101 fold_5 | 0.8125 | 0.8311 | 0.875 | 0.7777 | 1.0 | 0.4545 |

mamba: A2C view data

echo_prime: A4C view data

Fusion: channel concatenation

Preserving a 4×4×4 spatial structure

Loss is still asymmetric

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_92 fold_1 | 0.90625 | 0.971014 | 0.936170 | 0.916666 | 0.956521 | 0.7777 |
| Experiment_92 fold_2 | 0.84375 | 0.805194 | 0.888888 | 0.833333 | 0.952380 | 0.636363 |
| Experiment_92 fold_3 | 0.75 | 0.7960 | 0.7647 | 0.7647 | 0.7647 | 0.7333 |
| Experiment_92 fold_4 | 0.78125 | 0.826839 | 0.810810 | 0.9375 | 0.714285 | 0.909090 |
| Experiment_92 fold_5 | 0.84375 | 0.878787 | 0.878048 | 0.899999 | 0.857142 | 0.818181 |
| Experiment_92 | 0.825 | | | | | |

Overall, this variant underperforms.

mamba: A4C view data

echo_prime: A2C view data

Fusion: channel concatenation

Preserving a 4×4×4 spatial structure

Loss is still asymmetric

But dec0 is used as the Mamba output

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_93 fold_1 | 0.90625 | 0.942028 | 0.938775 | 0.884615 | 1.0 | 0.666666 |
| Experiment_93 fold_2 | 0.8125 | 0.792207 | 0.85 | 0.894736 | 0.809523 | 0.818181 |
| Experiment_93 fold_3 | 0.8125 | 0.870588 | 0.823529 | 0.823529 | 0.823529 | 0.8 |
| Experiment_93 fold_4 | 0.8125 | 0.805194 | 0.857142 | 0.857142 | 0.857142 | 0.727272 |
| Experiment_93 fold_5 | 0.84375 | 0.874458 | 0.878048 | 0.89999 | 0.857142 | 0.818181 |
| Experiment_93 | 0.8375 | | | | | |

Preserving a 6×6×6 spatial structure

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_93 fold_1 | 0.90625 | | | | | |
| Experiment_93 fold_3 | 0.75 | | | | | |

mamba: A4C view data

echo_prime: A2C view data

Fusion: channel concatenation

Preserving a 4×4×4 spatial structure

Loss is still asymmetric

But dec2 is used as the Mamba output

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_94 fold_1 | 0.90625 | 0.946859 | 0.936170 | 0.916666 | 0.956521 | 0.777777 |
| Experiment_94 fold_2 | 0.78125 | 0.796536 | 0.829268 | 0.85 | 0.809523 | 0.727272 |
| Experiment_94 fold_3 | 0.84375 | 0.925490 | 0.848484 | 0.875 | 0.823529 | 0.866666 |
| Experiment_94 fold_4 | 0.75 | 0.779220 | 0.809523 | 0.809523 | 0.809523 | 0.636363 |
| Experiment_94 fold_5 | 0.8125 | 0.861471 | 0.857142 | 0.857142 | 0.857142 | 0.727272 |

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_95 fold_1 (1e-8) | 0.90625 | 0.9565 | 0.938775 | 0.884615 | 1.0 | 0.666666 |
| Experiment_95 fold_3 (1e-8) | 0.8125 | 0.921568 | 0.849999 | 0.739130 | 1.0 | 0.6 |

3.2.2 mamba+echoprime text video

Add knowledge vectors

mamba: A4C view data

echo_prime: A2C view data

Knowledge vectors are added to both branches

Fusion: channel concatenation

Preserving a 4×4×4 spatial structure

Learning rate 1e-6, batch size 8, loss: BCE

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_116 fold_1 | 0.90625 | 0.8067 | 0.9387 | 0.8846 | 1.0 | 0.6666 |
| Experiment_116 fold_2 | 0.78125 (0.81) | 0.818181 | 0.8108 | 0.9375 | 0.7142 | 0.9090 |
| Experiment_116 fold_3 | 0.8125 (0.84) | 0.8509 | 0.8 | 0.9230 | 0.7058 | 0.9333 |
| Experiment_116 fold_4 | 0.8125 | 0.8614 | 0.85 | 0.8947 | 0.8095 | 0.8181 |
| Experiment_116 fold_5 | 0.84375 | 0.8484 | 0.8837 | 0.8636 | 0.9047 | 0.7272 |
| Experiment_116 | 0.83125 | | | | | |

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_127 fold_1 | 1 | | | | | |
| Experiment_127 fold_2 | | | | | | |
| Experiment_127 fold_3 | | | | | | |
| Experiment_127 fold_4 | | | | | | |
| Experiment_127 fold_5 | | | | | | |
| Experiment_127 | | | | | | |

Using dec2 as the output

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_126 fold_1 | 0.90625 | 0.8937 | 0.9387 | 0.8846 | 1.0 | 0.6666 |
| Experiment_126 fold_2 | 0.8125 | 0.8571 | 0.8571 | 0.8571 | 0.8571 | 0.7272 |
| Experiment_126 fold_3 | 0.75 | 0.8117 | 0.75 | 0.8 | 0.7058 | 0.8 |
| Experiment_126 fold_4 | 0.6875 | 0.7662 | 0.75 | 0.7894 | 0.7142 | 0.6363 |
| Experiment_126 fold_5 | 0.84375 | 0.8354 | 0.8718 | 0.9444 | 0.8095 | 0.9090 |
| Experiment_126 | | | | | | |

Learning rate 1e-5, batch size 8, loss: BCE

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_125 fold_1 | 0.875 | 0.8743 | 0.9166 | 0.8799 | 0.9565 | 0.6666 |
| Experiment_125 fold_2 | 0.75 | 0.8051 | 0.7777 | 0.9333 | 0.6666 | 0.9090 |
| Experiment_125 fold_3 | 0.78125 | 0.8705 | 0.7407 | 1.0 | 0.5882 | 1.0 |
| Experiment_125 fold_4 | 0.71875 | 0.7445 | 0.7567 | 0.875 | 0.6666 | 0.8181 |
| Experiment_125 fold_5 | 0.84375 | 0.8311 | 0.8837 | 0.8636 | 0.9047 | 0.7272 |
| Experiment_125 | | | | | | |

Loss: asymmetric

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_117 fold_1 | 0.90625 | 0.8405 | 0.9387 | 0.8846 | 1.0 | 0.6666 |
| Experiment_117 fold_2 | 0.75 | 0.7229 | 0.7777 | 0.9333 | 0.6666 | 0.9090 |
| Experiment_117 fold_3 | 0.8125 (0.84) | 0.8823 | 0.8421 | 0.7619 | 0.9411 | 0.6666 |
| Experiment_117 fold_4 | 0.8125 | 0.8528 | 0.8421 | 0.9411 | 0.7619 | 0.9090 |
| Experiment_117 fold_5 | 0.84375 | 0.8354 | 0.8837 | 0.8636 | 0.9047 | 0.7272 |
| Experiment_117 | 0.825 | | | | | |

Loss: weighted_bce with weight 2.0 (a minimal sketch follows; results in the table below)
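
A minimal sketch of a weighted BCE, reading the 2.0 / 0.75 / 0.5 values in these runs as the positive-class weight; that interpretation is an assumption.

```python
import torch
import torch.nn as nn

# pos_weight scales the positive (MI) term of the BCE-with-logits loss.
weighted_bce = nn.BCEWithLogitsLoss(pos_weight=torch.tensor(2.0))
# usage: loss = weighted_bce(logits, labels.float())
```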

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_118 fold_1 | 0.875 | 0.8937 | 0.9166 | 0.8799 | 0.9565 | 0.6666 |
| Experiment_118 fold_2 | 0.78125 | 0.818181 | 0.8205 | 0.8888 | 0.7619 | 0.8181 |
| Experiment_118 fold_3 | 0.8125 | 0.8431 | 0.8 | 0.9230 | 0.7058 | 0.9333 |
| Experiment_118 fold_4 | 0.75 | 0.8354 | 0.7894 | 0.8823 | 0.7142 | 0.8181 |
| Experiment_118 fold_5 | 0.84375 | 0.8441 | 0.8837 | 0.8636 | 0.9047 | 0.7272 |
| Experiment_118 | 0.8125 | 0.8469 | 0.8420 | 0.8875 | 0.7886 | 0.7931 |

Loss: weighted_bce with weight 0.75

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_119 fold_1 | 0.90625 | 0.8550 | 0.9387 | 0.8846 | 1.0 | 0.6666 |
| Experiment_119 fold_2 | 0.78125 | 0.8138 | 0.8108 | 0.9375 | 0.7142 | 0.9090 |
| Experiment_119 fold_3 | 0.84375 | 0.9098 | 0.8571 | 0.8333 | 0.8823 | 0.8 |
| Experiment_119 fold_4 | 0.8125 | 0.8528 | 0.8571 | 0.8571 | 0.8571 | 0.7272 |
| Experiment_119 fold_5 | 0.8125 | 0.8528 | 0.8571 | 0.8571 | 0.8571 | 0.7272 |
| Experiment_119 | 0.83125 | | | | | |

Loss: weighted_bce with weight 0.5

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_120 fold_1 | 0.84375 | 0.8550 | 0.8936 | 0.875 | 0.9130 | 0.6666 |
| Experiment_120 fold_2 | 0.75 | 0.8268 | 0.7777 | 0.9333 | 0.6666 | 0.9090 |
| Experiment_120 fold_3 | 0.875 | 0.9058 | 0.8888 | 0.8421 | 0.9411 | 0.8 |
| Experiment_120 fold_4 | 0.75 | 0.8398 | 0.7894 | 0.8823 | 0.7142 | 0.8181 |
| Experiment_120 fold_5 | 0.78125 | 0.8398 | 0.8292 | 0.85 | 0.8095 | 0.7272 |
| Experiment_120 | 0.8 | | | | | |

Using dec0 as the Mamba output

Learning rate 1e-6, batch size 8, loss: BCE

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_121 fold_1 | 0.875 | 0.9227 | 0.9166 | 0.8799 | 0.9565 | 0.6666 |
| Experiment_121 fold_2 | 0.8125 | 0.8051 | 0.875 | 0.7777 | 1.0 | 0.4545 |
| Experiment_121 fold_3 | 0.875 | 0.8862 | 0.8823 | 0.8823 | 0.8823 | 0.8666 |
| Experiment_121 fold_4 | 0.6875 | 0.6839 | 0.7222 | 0.8666 | 0.6190 | 0.8181 |
| Experiment_121 fold_5 | 0.875 | 0.8917 | 0.9047 | 0.9047 | 0.9047 | 0.8181 |
| Experiment_121 | 0.825 | | | | | |

Using dec0 as the Mamba output

Learning rate 1e-6, batch size 8, loss: asymmetric

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_122 fold_4 | 0.6875 | | | | | |

Learning rate: 1e-4

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_123 fold_1 | 0.875 | 0.8550 | 0.9166 | 0.8799 | 0.9565 | 0.6666 |
| Experiment_123 fold_2 | 0.75 | | | | | |
| Experiment_123 fold_3 | 0.84375 | | | | | |
| Experiment_123 fold_4 | 0.71875 | 0.7532 | 0.7428 | 0.9285 | 0.6190 | 0.9090 |
| Experiment_123 fold_5 | 0.8125 | | | | | |
| Experiment_123 | | | | | | |

Learning rate 1e-4, batch size 8, loss: BCE, with fewer knowledge-vector layers

| name | acc | auc | f1 | precision | recall | specificity |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment_124 fold_1 | 0.90625 | 0.8599 | 0.9387 | 0.8846 | 1.0 | 0.6666 |
| Experiment_124 fold_2 | 0.78125 | 0.8 | 0.8205 | 0.8888 | 0.7619 | 0.8181 |
| Experiment_124 fold_3 | 0.84375 | 0.9568 | 0.8387 | 0.9285 | 0.7647 | 0.9333 |
| Experiment_124 fold_4 | 0.71875 | 0.7532 | 0.7428 | 0.9285 | 0.6190 | 0.9090 |
| Experiment_124 fold_5 | 0.8125 | 0.8311 | 0.8571 | 0.8571 | 0.8571 | 0.7272 |
| Experiment_124 | 0.8125 | | | | | |

4. SOTA comparison

4.1 CAMUS

| name | acc (%) | auc (%) | f1 (%) | precision (%) | recall (%) | specificity (%) |
| --- | --- | --- | --- | --- | --- | --- |
| BI-Mamba | 78.75 | 73.45 | 64.87 | 85.01 | 54.85 | / |
| CARL | 83.75 | 60.33 | 73.42 | 85.56 | 67.42 | / |
| MV-Swin-T | 62.50 | 57.14 | 60.42 | 61.65 | 62.17 | / |
| CTT-Net | 62.50 | 70.04 | 70.70 | 62.96 | 82.62 | / |
| MIMamba | 85.00 | 83.15 | 77.78 | 84.00 | 72.41 | 92.16 |

4.2 HMC

5. Figures

ROC visualization for Exp_115

(figure: image-20251029165752231)

