
Feature/residual attenton unet#7

Open
perctrix wants to merge 11 commits into ProjectNeura:main from perctrix:feature/residual_attenton_unet

Conversation


@perctrix perctrix commented Sep 18, 2025

This pull request extends the UNet implementation with new modular components and factory functions, enabling greater flexibility and more advanced architectures. The most significant changes are the introduction of attention gates, residual convolutional blocks, and attention-based upsampling, along with new factory functions that make it easy to create UNet variants from these components.

New UNet Components

  • Added AttentionGate, UNetResidualConv, and UNetAttentionUpsample modules in components.py to provide attention mechanisms, residual connections, and attention-based upsampling for UNet architectures.
  • Updated the __init__.py to export these new components for external use.
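For orientation, the attention gate described above typically follows the additive-attention pattern from the Attention U-Net literature. The sketch below is a hedged illustration of that pattern, not the actual `AttentionGate` from `components.py`; its constructor arguments (`gate_channels`, `skip_channels`, `inter_channels`) are assumptions.

```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Additive attention gate sketch: gates encoder skip features
    with a decoder signal. Signature is assumed, not the PR's."""

    def __init__(self, gate_channels, skip_channels, inter_channels):
        super().__init__()
        # Project gating signal and skip features into a shared space.
        self.w_g = nn.Conv2d(gate_channels, inter_channels, kernel_size=1)
        self.w_x = nn.Conv2d(skip_channels, inter_channels, kernel_size=1)
        # Collapse to a single-channel attention map in [0, 1].
        self.psi = nn.Sequential(
            nn.Conv2d(inter_channels, 1, kernel_size=1),
            nn.Sigmoid(),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, g, x):
        # g: gating signal from the decoder; x: encoder skip features.
        attn = self.psi(self.relu(self.w_g(g) + self.w_x(x)))
        return x * attn  # attenuate skip features by the attention map

gate = AttentionGate(gate_channels=64, skip_channels=32, inter_channels=16)
g = torch.randn(1, 64, 28, 28)
x = torch.randn(1, 32, 28, 28)
out = gate(g, x)
print(out.shape)  # torch.Size([1, 32, 28, 28])
```

The gate multiplies the skip connection elementwise by a learned map, so the output keeps the skip connection's shape.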

UNet Factory Functions

  • Added make_unet2d_resconv and make_unet2d_attnupsample functions to quickly instantiate UNet models with residual convolution blocks or attention-based upsampling, respectively.
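The factory functions named above presumably pre-bind a block choice into a generic constructor. This is a minimal, framework-free sketch of that pattern with placeholder stand-ins; `make_model` and `residual_block` are hypothetical names, and the real `make_unet2d_resconv` signature may differ.

```python
from functools import partial

def make_model(conv_block, in_channels, out_channels):
    # Hypothetical stand-in for a UNet constructor that accepts a
    # conv_block factory; returns a plain dict for illustration.
    return {"conv_block": conv_block, "in": in_channels, "out": out_channels}

def residual_block(in_c, out_c):
    # Placeholder for a residual double-conv block factory.
    return ("residual", in_c, out_c)

# A factory is just the constructor with the block choice baked in.
make_unet2d_resconv = partial(make_model, residual_block)

m = make_unet2d_resconv(in_channels=3, out_channels=2)
print(m["conv_block"] is residual_block)  # True
```

The benefit is ergonomic: callers get a one-line way to build a variant without knowing which block class to import.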

UNet Class Flexibility

  • Modified the UNet class to accept a conv_block parameter, allowing users to specify the type of convolutional block (e.g., standard double conv or residual conv) used in the initial layer.
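The `conv_block` parameter amounts to dependency injection of a block factory. The toy module below illustrates the pattern with an assumed signature; `TinyUNetStem` and `double_conv` are hypothetical stand-ins, not the PR's `UNet` class.

```python
import torch
import torch.nn as nn

def double_conv(in_c, out_c):
    # Standard UNet double conv: two 3x3 convs with ReLU.
    return nn.Sequential(
        nn.Conv2d(in_c, out_c, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_c, out_c, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNetStem(nn.Module):
    """Hypothetical stand-in showing how a conv_block factory is
    injected into the initial layer."""

    def __init__(self, in_channels, base_channels, conv_block=double_conv):
        super().__init__()
        self.inc = conv_block(in_channels, base_channels)

    def forward(self, x):
        return self.inc(x)

stem = TinyUNetStem(3, 16)  # defaults to the standard double conv
out = stem(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 16, 32, 32])
```

Passing a different callable (e.g. a residual block factory) swaps the block type without subclassing.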

Testing and Example Usage

  • Expanded the example usage in unet.py to demonstrate instantiation and sanity checks for the new UNet variants, including residual and attention-upsampled models.
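The sanity checks mentioned above are presumably shape checks of the usual kind: forward a random batch and confirm the output keeps the input's spatial size with `n_classes` channels. A 1x1 conv stands in for the actual UNet variants here, since their constructors are not shown in this description.

```python
import torch
import torch.nn as nn

# Stand-in for a UNet variant mapping 3 input channels to 2 classes.
model = nn.Conv2d(3, 2, kernel_size=1)

x = torch.randn(2, 3, 64, 64)
y = model(x)

# A UNet should preserve spatial dims and emit n_classes channels.
assert y.shape == (2, 2, 64, 64)
print("sanity check passed")
```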

fixes #4

@ATATC ATATC self-requested a review September 25, 2025 14:27
@ATATC ATATC added enhancement New feature or request todo New task or assignment labels Sep 25, 2025
@perctrix perctrix force-pushed the feature/residual_attenton_unet branch from 1dbd8aa to 2bab412 Compare September 25, 2025 15:19
perctrix and others added 2 commits September 25, 2025 11:19
Signed-off-by: Steven Chen <117523987+perctrix@users.noreply.github.com>

@ATATC ATATC left a comment


No blank lines.

@perctrix perctrix requested a review from ATATC October 15, 2025 22:10

Labels

enhancement (New feature or request), todo (New task or assignment)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Add support for Residual/Attention blocks in U-Net

2 participants