
add change log for 6.3 #3386

Open

bghimireamd wants to merge 3 commits into develop

Conversation

bghimireamd
Contributor

No description provided.

bghimireamd requested a review from a team as a code owner on November 15, 2024 at 11:21.
bghimireamd removed the request for review from a team on November 15, 2024 at 11:22.
CHANGELOG.md Outdated
Comment on lines 6 to 20
## MIOpen 3.3.0 for ROCm 6.3.0
### Added
- [RNN] LSTM fwd
- [Mha] Mask is added for Forward pass
- [GLU] Gated Linear Unit (experimental)
- [PReLU] Implement PReLU backward (experimental)

### Fixed
- Fixed stream not being set when calling hipMemsetAsync
- Fixed memory leak issue caused by incorrect transpose in find 2.0 (#3285)
- Fixed memcopy data race by replacing hipMemcpy with hipMemcpyWithStream
## Perfomance
- MI300 TunaNet Update: CK FWD and WRW Solvers Updated

## MIOpen 3.2.0 for ROCm 6.2.0
Contributor


Suggested change
Original:
## MIOpen 3.3.0 for ROCm 6.3.0
### Added
- [RNN] LSTM fwd
- [Mha] Mask is added for Forward pass
- [GLU] Gated Linear Unit (experimental)
- [PReLU] Implement PReLU backward (experimental)
### Fixed
- Fixed stream not being set when calling hipMemsetAsync
- Fixed memory leak issue caused by incorrect transpose in find 2.0 (#3285)
- Fixed memcopy data race by replacing hipMemcpy with hipMemcpyWithStream
## Perfomance
- MI300 TunaNet Update: CK FWD and WRW Solvers Updated
## MIOpen 3.2.0 for ROCm 6.2.0
Suggested:
## MIOpen 3.3.0 for ROCm 6.3.0
### Added
* [RNN] LSTM fwd
* [Mha] Mask is added for Forward pass
* [GLU] Gated Linear Unit (this is an experimental feature)
* [PReLU] Implemented PReLU backward (this is an experimental feature)
### Optimized
- MI300 TunaNet Update: CK FWD and WRW Solvers Updated
### Resolved issues
- Fixed unset stream when calling `hipMemsetAsync`
- Fixed a memory leak issue caused by an incorrect transpose in find 2.0 (see issue #3285 on GitHub)
- Fixed a `memcopy` data race by replacing `hipMemcpy` with `hipMemcpyWithStream`
## MIOpen 3.2.0 for ROCm 6.2.0
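
For readers unfamiliar with the two stream-related entries under "Resolved issues", here is a minimal, hypothetical HIP/C++ sketch (not taken from MIOpen's sources) of the pattern those fixes describe: passing an explicit stream to `hipMemsetAsync`, and using `hipMemcpyWithStream` instead of `hipMemcpy` so copies are ordered against kernels queued on a non-default stream.

```cpp
// Illustrative sketch only (assumed example, not MIOpen code): why memsets and
// copies should carry the same stream as the kernels they must be ordered against.
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

__global__ void scale(float* data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if(i < n)
        data[i] *= factor;
}

int main()
{
    constexpr int n = 1 << 20;
    std::vector<float> host(n, 1.0f);

    float* dev = nullptr;
    hipMalloc(&dev, n * sizeof(float));

    // A non-blocking stream does not synchronize with the legacy default stream,
    // so stream-less calls such as plain hipMemcpy are not ordered against it.
    hipStream_t stream;
    hipStreamCreateWithFlags(&stream, hipStreamNonBlocking);

    // Passing the stream here is the "stream not being set" fix: the memset is
    // queued on the same stream as the later kernel instead of the default one.
    hipMemsetAsync(dev, 0, n * sizeof(float), stream);

    // Upload the input on the same stream.
    hipMemcpyWithStream(dev, host.data(), n * sizeof(float),
                        hipMemcpyHostToDevice, stream);

    hipLaunchKernelGGL(scale, dim3((n + 255) / 256), dim3(256), 0, stream,
                       dev, 2.0f, n);

    // Plain hipMemcpy would not be ordered after the kernel running on `stream`
    // and could read the buffer while it is still being written; the
    // stream-aware copy avoids that data race.
    hipMemcpyWithStream(host.data(), dev, n * sizeof(float),
                        hipMemcpyDeviceToHost, stream);
    hipStreamSynchronize(stream);

    std::printf("host[0] = %f\n", host[0]); // expect 2.0

    hipStreamDestroy(stream);
    hipFree(dev);
    return 0;
}
```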

CHANGELOG.md Outdated
## MIOpen-3.2.0 for ROCm 6.2.0
## MIOpen 3.3.0 for ROCm 6.3.0
### Added
- [RNN] LSTM fwd
Contributor


Should this be forward or forwarding? Or is fwd okay?

Contributor Author


I think fwd is ok

CHANGELOG.md Outdated
- [RNN] LSTM fwd
- [Mha] Mask is added for Forward pass
- [GLU] Gated Linear Unit (experimental)
- [PReLU] Implement PReLU backward (experimental)
Contributor


I believe PReLU Backward is an official name, so Backward should be capitalized. Can you confirm?

Contributor

amd-jnovotny left a comment


looks good
