
Changed activation function but it returns the same accuracy value #11

bakigkgz1 opened this issue Dec 17, 2023 · 4 comments

@bakigkgz1

Hello, when I change the activation function inside the formula.cpp file (for example, when I write the formula for the tanh function), it consistently returns an accuracy value of 9.80%. Do you know why this might be happening?

[Screenshot from 2023-12-17 17-46-44]

Thank you for your help.

@leo-c-ling99

Hi,

Disclaimer: it's been a few years since I've looked at this code. It seems that the gradient for the activation is hardcoded within train.cpp. You might need to update their backpropagation code if you want to change the activation.

// Backpropagation
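
For anyone who lands here later: below is a minimal, hedged sketch of how the activation derivative typically enters such a hardcoded backpropagation step for a squared-error loss. The function name outputSensitivity and the useTanh flag are made up for illustration; the actual code in train.cpp may be organized differently.

	#include <cstddef>
	#include <vector>

	// Illustrative sketch only (not the repository's train.cpp): for a
	// squared-error loss E = sum_j (t_j - a_j)^2, the output-layer sensitivity is
	//   s_j = -2 * f'(n_j) * (t_j - a_j),
	// where f' is the activation derivative written in terms of the output a_j.
	std::vector<double> outputSensitivity(const std::vector<double> &a2,
	                                      const std::vector<double> &target,
	                                      bool useTanh) {
		std::vector<double> s2(a2.size());
		for (std::size_t j = 0; j < a2.size(); ++j) {
			// sigmoid: f'(n) = a * (1 - a)      tanh: f'(n) = 1 - a * a
			const double deriv = useTanh ? (1.0 - a2[j] * a2[j])
			                             : (a2[j] * (1.0 - a2[j]));
			s2[j] = -2.0 * deriv * (target[j] - a2[j]);
		}
		return s2;
	}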

@bakigkgz1
Author

bakigkgz1 commented Dec 19, 2023 via email

@bakigkgz1
Author

Hello, first of all, thank you for your reply. I changed the part you mentioned as follows, but after a certain number of iterations it gets stuck at 10.35%. (I want to use tanh as the activation function.)


		// Backpropagation
		/* Second layer (hidden layer to the output layer) */
		for (int j = 0; j < param->nOutput; j++) {
			s2[j] = -2*a2[j] * (1 - a2[j]*a2[j]) * (Output[i][j] - a2[j]);
		}

		/* First layer (input layer to the hidden layer) */
		std::fill_n(s1, param->nHide, 0);
		#pragma omp parallel for
		for (int j = 0; j < param->nHide; j++) {
			for (int k = 0; k < param->nOutput; k++) {
				s1[j] += a1[j] * (1 - a1[j]*a1[j]) * weight2[k][j] * s2[k];
			}
		}
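
Editorial note, hedged: assuming a1 and a2 already hold the tanh outputs, the derivative of tanh written in terms of its output is simply 1 - a*a, with no extra leading factor of a (that leading factor belongs to the sigmoid form a*(1 - a)). A sketch of the two gradient terms above with that derivative, reusing the snippet's own variables, is below; whether this alone removes the 10.35% plateau also depends on the forward pass in formula.cpp and on the weight-update code.

		// Backpropagation with tanh: f'(n) = 1 - a*a
		/* Second layer (hidden layer to the output layer) */
		for (int j = 0; j < param->nOutput; j++) {
			s2[j] = -2 * (1 - a2[j]*a2[j]) * (Output[i][j] - a2[j]);
		}

		/* First layer (input layer to the hidden layer) */
		std::fill_n(s1, param->nHide, 0);
		#pragma omp parallel for
		for (int j = 0; j < param->nHide; j++) {
			for (int k = 0; k < param->nOutput; k++) {
				s1[j] += (1 - a1[j]*a1[j]) * weight2[k][j] * s2[k];
			}
		}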

@bakigkgz1
Author

bakigkgz1 commented Jan 21, 2024 via email
