
Moved sigmoid, tanh and relu to neural_network/activation_functions from math/ #9050

Closed

AdarshAcharya5 wants to merge 1 commit into TheAlgorithms:master from AdarshAcharya5:move-tanh-sigmoid-relu

Conversation

@AdarshAcharya5
Contributor

Describe your change:

Fixes #9048

  • Documentation change?

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
  • All new Python files are placed inside an existing directory.
  • All filenames are in all lowercase characters with no spaces or dashes.
  • All functions and variable names follow Python naming conventions.
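
For context, these are the three activation functions being relocated. The snippet below is a minimal sketch assuming NumPy-based vectorized signatures in the repository's usual style; the actual function names, docstrings, and file layout under maths/ and neural_network/activation_functions may differ:

```python
import numpy as np


def sigmoid(vector: np.ndarray) -> np.ndarray:
    """Squash each input into the range (0, 1)."""
    return 1 / (1 + np.exp(-vector))


def tanh(vector: np.ndarray) -> np.ndarray:
    """Hyperbolic tangent, squashing each input into (-1, 1)."""
    return np.tanh(vector)


def relu(vector: np.ndarray) -> np.ndarray:
    """Rectified linear unit: zero out negative inputs."""
    return np.maximum(0, vector)


if __name__ == "__main__":
    x = np.array([-1.0, 0.0, 1.0])
    print(sigmoid(x))  # approx [0.269, 0.5, 0.731]
    print(tanh(x))     # approx [-0.762, 0.0, 0.762]
    print(relu(x))     # [0.0, 0.0, 1.0]
```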



Development

Successfully merging this pull request may close these issues.

sigmoid, tanh and relu should be moved to neural_network/activation_functions from maths/

2 participants