Feature/prophet #11
Conversation
if not os.path.exists("bench"):
    os.mkdir("bench")

na.set_log_scale(50)
@jcabrero this is indeed stupidly high - the reason is that some Prophet parameters are extremely small (i.e. 1e-13), which means they get rounded to zero - and zero secrets are not yet supported.
Instead of going into export_state and patching this (which will be unnecessary once zero secrets are allowed), I opted to put this here temporarily.
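The rounding issue can be sketched as follows. `quantize` is a hypothetical stand-in (not the actual nada-algebra encoding function) for what a fixed-point rational encoding with a given log scale does:

```python
# Hypothetical fixed-point encoding, illustrating why tiny Prophet
# parameters force a large log scale (names are illustrative only).
def quantize(x: float, log_scale: int) -> int:
    """Encode a float as a fixed-point integer at the given log scale."""
    return round(x * 2**log_scale)

# A parameter like 1e-13 rounds to zero at a typical log scale of 16:
print(quantize(1e-13, 16))  # 0 -> would require (unsupported) zero secrets

# ...but survives at log scale 50, since 2**50 is about 1.1e15:
print(quantize(1e-13, 50))  # nonzero
```

This is only meant to show the trade-off: raising the log scale keeps tiny parameters representable at the cost of much larger encoded values.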
@@ -19,8 +19,7 @@ def __init__(self, shape: _ShapeLike) -> None:
         Args:
             shape (_ShapeLike, optional): Parameter array shape.
         """
-        zeros = na.zeros(shape)
-        super().__init__(inner=zeros.inner)
+        super().__init__(inner=np.empty(shape))
this is a better way imo
--> really low overhead & ensures no one can perform operations with empty parameters
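A minimal sketch of the design choice (illustrative, not the actual nada-ai code; the `dtype=object` detail is an assumption about how a NadaArray's inner array would hold Nada values):

```python
import numpy as np

# np.empty allocates without initializing, so building a placeholder
# Parameter is near-free. With object dtype, the uninitialized slots are
# None, so arithmetic on an unloaded Parameter fails loudly instead of
# silently computing with zeros (as the old na.zeros approach would).
placeholder = np.empty((2, 3), dtype=object)
print(placeholder[0, 0])  # None until real parameter values are loaded
```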
Very cool 👍 I also prefer it. Raising errors is how you realise you did something wrong, and with this approach we get exactly that.
def __init__(
    self,
    n_changepoints: int,
@jcabrero idk what you think of this, but - even though it is a hyperparameter that Prophet learns - we do need to receive this information explicitly and in plain text, since it's used to size the parameter arrays.
This is likely a bit bad UX, but I couldn't find a decent way around it.
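To illustrate why the value must be plaintext, here is a hedged sketch (not the actual nada-ai Prophet class; the attribute names `delta` and `changepoints_t` are borrowed from Prophet's usual parameterization and may differ from the real implementation):

```python
import numpy as np

# Illustrative only: the shapes of Prophet's learned parameters depend on
# n_changepoints, so the value must be known in plain text when the
# parameter arrays are allocated - a secret-shared integer could not be
# used as an array dimension.
class ProphetSketch:
    def __init__(self, n_changepoints: int):
        # One rate adjustment per changepoint
        self.delta = np.empty((n_changepoints,))
        # One (plaintext) location per changepoint
        self.changepoints_t = np.empty((n_changepoints,))

m = ProphetSketch(n_changepoints=25)
print(m.delta.shape)  # (25,)
```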
"complex_model", | ||
"linear_regression", | ||
"neural_net", | ||
# "time_series", |
commented out b/c the stupidly high log scale breaks this test - should be fixed once zero secrets are supported
@@ -189,7 +189,13 @@ def forward(self, x: torch.Tensor) -> torch.Tensor:
     )

     # Sort & rescale the obtained results by the quantization scale (here: 16)
-    outputs = [result[1] / 2**16 for result in sorted(result.items())]
+    outputs = [
+        result[1] / 2**16
Probably not the way we want to do it, but (if you consider it relevant) we could use na_client.float_from_rational(result[1]) here.
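For context, a sketch of what both options compute. The `float_from_rational` definition below is an assumed stand-in for the behaviour of `na_client.float_from_rational` (dividing out the fixed-point scale), not its actual implementation:

```python
# Assumed behaviour of na_client.float_from_rational: decode a
# fixed-point integer back to a float by dividing out the scale.
LOG_SCALE = 16  # matches the hard-coded 2**16 in the diff above

def float_from_rational(rational: int, log_scale: int = LOG_SCALE) -> float:
    return rational / 2**log_scale

# Hypothetical result dict as returned from the computation, keyed by
# output name; sorting by key restores the output order.
quantized = {"out_0": 3 * 2**16, "out_1": 2**15}
outputs = [float_from_rational(v) for _, v in sorted(quantized.items())]
print(outputs)  # [3.0, 0.5]
```

Using the helper would avoid duplicating the magic `2**16` at every call site.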
@@ -167,7 +167,13 @@ def forward(self, x: na.NadaArray) -> na.NadaArray:
     )

     # Sort & rescale the obtained results by the quantization scale (here: 16)
-    outputs = [result[1] / 2**16 for result in sorted(result.items())]
+    outputs = [
+        result[1] / 2**16
Same comment as above: na_client.float_from_rational(result[1])
Wow. 👏 Amazing job here. I have been reading this with the paper side-by-side, otherwise I could not have understood how you did it.
Amazing job. I mean such a complex model and you got it working. 🥇
Adds Prophet model functionality
(note: builds on a pre-release version of nada-algebra)