
Make it easier to set activations #283

Open

EssamWisam opened this issue Nov 9, 2024 · 2 comments

@EssamWisam (Collaborator)

Motivation and description

[screenshot from the original issue omitted]

It would be easier if I did not have to explicitly import Flux and could just specify :relu, since I am coming to MLJFlux precisely because I am not interested in using or importing Flux.

Possible Implementation

Something like:

# Map a symbol such as :relu to the corresponding function in Flux.
function get_activation(func_symbol::Symbol)
    if hasproperty(Flux, func_symbol)
        return getproperty(Flux, func_symbol)
    else
        error("Function $func_symbol not found in Flux.")
    end
end
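
With such a helper defined inside MLJFlux (where Flux is already loaded), usage might look like this hypothetical sketch:

act = get_activation(:relu)      # resolves to Flux.relu
act.([-1.0, 0.0, 2.0])           # == [0.0, 0.0, 2.0]
get_activation(:not_a_function)  # errors: "Function not_a_function not found in Flux."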
@ablaom (Collaborator)

ablaom commented Nov 13, 2024

I think to keep things simple we either re-export the entire Flux namespace, or we re-export none of it. I think the latter, and current, choice gives the user more control, and it's not too burdensome to run using Flux to get direct access to Flux's exported methods.

In case it was not clear to you: after running using Flux you can write relu, Dense, etc. without the Flux. qualification.
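
For example (a minimal sketch of the point above):

using Flux                   # brings Flux's exported names into scope

layer = Dense(4 => 2, relu)  # works without the Flux. qualification
relu === Flux.relu           # true: the same function, qualified or not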

In the docs we always try to include the Flux. qualifier, because we cannot assume the user has run using Flux.

Even if we did re-export Flux's namespace, it wouldn't help a user who never runs using MLJFlux, which they can indeed avoid with @load NeuralNetworkClassifier or similar, since that macro only does import MLJFlux.
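
For context, the workflow being referred to is roughly the following (a sketch; the exact @load incantation may differ slightly):

using MLJ  # note: no `using MLJFlux` anywhere

# @load only imports MLJFlux behind the scenes, so neither Flux's nor
# MLJFlux's exported names ever enter the user's namespace.
NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
clf = NeuralNetworkClassifier()  # default builder, activation, optimiser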

My vote would be for the status quo. Anyone else prefer we re-export the Flux namespace?

@EssamWisam (Collaborator, Author)

To my understanding, one of MLJFlux's objectives is

Provide a user-friendly and high-level interface to fundamental Flux deep learning models while still being extensible by supporting custom models written with Flux

This gives me the impression that for standard deep learning models, I can achieve my task more simply by using MLJFlux in lieu of (and not in conjunction with) Flux.

Under the status quo, even when I use a standard network (i.e., one of those supported by MLJFlux by default), I still need to import Flux just to choose the activation (a single, minor decision among all the others in the network) and to import Optimisers just to choose the optimization method. Selecting these hyperparameters is basic rather than advanced functionality, which is why I would prefer it to be possible solely via MLJFlux (or for both syntaxes to be supported).
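
To make this concrete, the current workflow looks roughly like this (a sketch; exact keyword names depend on the MLJFlux version):

using MLJ
using Flux        # needed only to name the activation
using Optimisers  # needed only to name the optimisation rule

NeuralNetworkClassifier = @load NeuralNetworkClassifier pkg=MLJFlux
clf = NeuralNetworkClassifier(
    builder   = MLJFlux.MLP(hidden = (32, 32), σ = Flux.relu),
    optimiser = Optimisers.Adam(0.001),
)

What this issue asks for is something closer to σ = :relu, with the Symbol resolved to the Flux function internally.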

For reference, in Keras, both syntaxes are supported.
