
attention layer #4

Open
YuQuankun opened this issue May 10, 2021 · 1 comment

YuQuankun commented May 10, 2021

Hello, for my graduation project I have recently been using an attention mechanism combined with a BiLSTM to forecast time-series data. When I copied your attention-layer construction code into my program, I got an array-dimension error, and that dimension error then triggered a large number of other problems. With my graduation deadline approaching, I now have to turn to you for help. In my test data, each input sample is a 60 × 1 matrix and the output is a single scalar. There are 1952 training samples in total, so the input data has shape 1952 × 60 × 1 and the output data has shape 1952 × 1. Could you advise me on how to construct an attention layer for data with these dimensions?
I would be very grateful if you could reply to this message.
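For concreteness, arrays with these shapes are typically built with a sliding window over a series. A minimal sketch, assuming a 60-step lookback over a single 1-D series; the `series` name and the window construction are illustrative and not taken from the issue:

```python
import numpy as np

# Hypothetical placeholder data: a sliding 60-step window over a 1-D
# series yields inputs of shape (N, 60, 1) and targets of shape (N, 1).
series = np.random.rand(2012).astype(np.float32)  # 1952 windows + 60-step lookback
window = 60
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:, None]
print(X.shape, y.shape)  # (1952, 60, 1) (1952, 1)
```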
Here is my model-construction code:

Build the model:

```python
from tensorflow.keras.layers import Input, Bidirectional, LSTM, Dense
from tensorflow.keras.models import Model
# AttentionLayer is copied from the blog post linked below

inputs = Input(shape=(60, 1), dtype='int32')
print(inputs)
x = Bidirectional(LSTM(units=50, dropout=0.2, return_sequences=True))(inputs)
x = Bidirectional(LSTM(units=500, dropout=0.5, return_sequences=True))(x)
x = Bidirectional(LSTM(units=500, dropout=0.1))(x)  # return_sequences defaults to False: output is 2-D here
x = AttentionLayer(attention_size=50)(x)

outputs = Dense(1, activation='softmax')(x)
model = Model(inputs=inputs, outputs=outputs)
```
The AttentionLayer() function in it is copied from your code in the blog post at https://blog.csdn.net/huanghaocs/article/details/95752379.
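For reference, here is a minimal sketch of an attention layer that works with inputs of this shape, assuming TensorFlow 2.x Keras and a Bahdanau-style additive attention; `SimpleAttention` is a hypothetical stand-in, not the blog post's AttentionLayer. Attention layers of this kind expect a 3-D (batch, timesteps, features) sequence, so the last BiLSTM must keep `return_sequences=True`; the model above leaves it at the default `False`, which produces a 2-D output and is a likely source of the dimension error.

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer, Input, Bidirectional, LSTM, Dense
from tensorflow.keras.models import Model

class SimpleAttention(Layer):
    """Additive (Bahdanau-style) attention pooling a sequence to one vector."""
    def __init__(self, attention_size, **kwargs):
        super().__init__(**kwargs)
        self.attention_size = attention_size

    def build(self, input_shape):
        feature_dim = int(input_shape[-1])
        self.W = self.add_weight(name="W", shape=(feature_dim, self.attention_size),
                                 initializer="glorot_uniform")
        self.b = self.add_weight(name="b", shape=(self.attention_size,),
                                 initializer="zeros")
        self.u = self.add_weight(name="u", shape=(self.attention_size,),
                                 initializer="glorot_uniform")
        super().build(input_shape)

    def call(self, x):
        # x: (batch, timesteps, features)
        v = tf.tanh(tf.tensordot(x, self.W, axes=1) + self.b)  # (batch, T, attention_size)
        scores = tf.tensordot(v, self.u, axes=1)               # (batch, T)
        alphas = tf.nn.softmax(scores, axis=-1)                # attention weights over time
        return tf.reduce_sum(x * tf.expand_dims(alphas, -1), axis=1)  # (batch, features)

inputs = Input(shape=(60, 1), dtype="float32")  # continuous series, so float32 rather than int32
x = Bidirectional(LSTM(50, dropout=0.2, return_sequences=True))(inputs)
x = Bidirectional(LSTM(50, dropout=0.2, return_sequences=True))(x)  # keep 3-D output for attention
x = SimpleAttention(attention_size=50)(x)  # (batch, 60, 100) -> (batch, 100)
outputs = Dense(1)(x)                      # scalar regression target: linear output
model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer="adam", loss="mse")
# model.fit(X, y) with X of shape (1952, 60, 1) and y of shape (1952, 1)
```

Note also that softmax over a single Dense unit always outputs 1.0, so the sketch uses a linear output for the scalar regression target.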

YuQuankun (Author) commented:

Also, since my GitHub connection is unreliable and messages sometimes do not arrive promptly, I hope you can add me on WeChat (yu13032869232) or QQ (1921292683) to give me some guidance.
