
Bug fix for ELU PyTorch implementation by multiplying with self.config["alpha"]#161

Open
MaungThantAI wants to merge 1 commit into phlippe:master from MaungThantAI:ELU_bug_fix
Conversation

@MaungThantAI

Bug description

In the original code of Tutorial 3: Activation Functions (PyTorch), ELU is implemented as

class ELU(ActivationFunction):

    def forward(self, x):
        return torch.where(x > 0, x, torch.exp(x)-1)

I may be mistaken, but I believe there is a small issue in this implementation: the output of the negative branch is not multiplied by self.config["alpha"]. Perhaps this was intentional, implicitly assuming self.config["alpha"] = 1, but in that case any other configured alpha is silently ignored and the output no longer matches the ELU definition. I hope bringing this to your attention is useful.

I fixed the implementation of ELU to

class ELU(ActivationFunction):
    
    def forward(self, x):
        return torch.where(x > 0, x, self.config["alpha"] * (torch.exp(x)-1))
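As a quick sanity check, the fixed version can be compared against PyTorch's built-in torch.nn.functional.elu for several alpha values. The ActivationFunction base class below is a minimal stand-in written for this sketch (it simply stores constructor keyword arguments in self.config); the tutorial's actual base class may differ.

```python
import torch
import torch.nn as nn

class ActivationFunction(nn.Module):
    # Minimal stand-in for the tutorial's base class: store
    # constructor kwargs in self.config (assumption for this sketch).
    def __init__(self, **kwargs):
        super().__init__()
        self.config = kwargs

class ELU(ActivationFunction):
    def forward(self, x):
        # Fixed version: the negative branch is scaled by alpha.
        return torch.where(x > 0, x, self.config["alpha"] * (torch.exp(x) - 1))

x = torch.linspace(-3, 3, steps=101)
for alpha in [0.5, 1.0, 2.0]:
    out = ELU(alpha=alpha)(x)
    ref = torch.nn.functional.elu(x, alpha=alpha)
    assert torch.allclose(out, ref, atol=1e-6)
```

Without the alpha factor, the check above only passes for alpha = 1, which is exactly the bug this PR fixes.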

