Published in 2022 in "IEEE Signal Processing Letters"
DOI: 10.1109/lsp.2022.3178673
Abstract: The non-local self-attention mechanism can significantly improve the capability of feature representation with long-range dependencies, at the cost of high computational complexity. To address this issue, the self-attention-based autoregressive axial transformer has been proposed to…
Keywords: network; efficient axial; feature maps; axial attention
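The abstract contrasts full non-local self-attention, which is quadratic in the number of pixels, with axial attention, which attends along one spatial axis at a time. A minimal NumPy sketch of that idea follows; the single-head, unprojected dot-product formulation here is an illustrative assumption, not the exact model of the cited paper. For an H×W feature map, full attention over all H·W positions costs O((HW)²·D), while attending along rows and columns separately costs O(HW·(H+W)·D).

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_1d(x):
    # x: (batch, length, dim) — plain dot-product self-attention along `length`.
    # (No learned Q/K/V projections here; this is a simplification.)
    d = x.shape[-1]
    scores = x @ x.transpose(0, 2, 1) / np.sqrt(d)   # (batch, length, length)
    return softmax(scores, axis=-1) @ x

def axial_attention(feat):
    # feat: (H, W, dim). Attend along the height axis, then the width axis,
    # so each position mixes with its column and then its row.
    cols = feat.transpose(1, 0, 2)                    # (W, H, D): columns as sequences
    feat = attention_1d(cols).transpose(1, 0, 2)      # back to (H, W, D)
    feat = attention_1d(feat)                         # rows as sequences: (H, W, D)
    return feat

out = axial_attention(np.random.default_rng(0).normal(size=(8, 8, 4)))
print(out.shape)  # (8, 8, 4)
```

Two 1D attention passes give every position an (indirect) view of the whole map, which is how axial attention recovers long-range context without the quadratic cost of non-local attention.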
Published in 2022 in "Brain Sciences"
DOI: 10.3390/brainsci13010012
Abstract: Accurately identifying tumors in MRI scans is of the utmost importance for clinical diagnosis and brain tumor treatment planning. However, manual segmentation is a challenging and time-consuming process in practice and exhibits…
Keywords: brain tumor; brain; axial attention; segmentation