[QST] Custom dataset with interaction only #741
Comments
@dcy0577 you don't need extra features. You can group your data by user or session id. Another option is to create some temporal features, since you already have timestamp data. We showcase some ways of creating temporal features, but those are just examples; you can be creative and create your own.
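As a minimal sketch of what such temporal feature engineering could look like with plain pandas (the file name is hypothetical, and the specific features here, hour of day, weekday, and recency, are illustrative assumptions, not the only options):

import pandas as pd

df = pd.read_csv("interactions.csv", sep="\t")  # hypothetical file name
df.columns = ["user_id", "item_id", "timestamp"]

ts = pd.to_datetime(df["timestamp"], unit="s")

# Temporal features derived purely from the timestamp column.
df["hour_of_day"] = ts.dt.hour      # daily usage pattern
df["weekday"] = ts.dt.weekday       # weekly usage pattern

# Recency: seconds since the same user's previous interaction.
df = df.sort_values(["user_id", "timestamp"])
df["time_since_last"] = df.groupby("user_id")["timestamp"].diff().fillna(0)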
@rnyak If I want to extract both the long-term and short-term interests of a user from their interactions, can I still follow a similar approach, using two time windows to define "long-term" and "short-term"? My question is where to specify this threshold value when I am trying to extract users' long-term and short-term interests separately using XLNet.
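One way such a split could be realized is at preprocessing time, before building the two sequence inputs. A sketch, purely illustrative; the 30-day cutoff is an assumed threshold, not something prescribed by the library:

import pandas as pd

SHORT_TERM_WINDOW = 30 * 24 * 3600  # assumed threshold: 30 days, in seconds

df = pd.read_csv("interactions.csv", sep="\t")  # hypothetical file name
df.columns = ["user_id", "item_id", "timestamp"]
df = df.sort_values(["user_id", "timestamp"])

# Everything newer than (user's latest interaction - window) is "short-term".
latest = df.groupby("user_id")["timestamp"].transform("max")
recent = df["timestamp"] > latest - SHORT_TERM_WINDOW

short_term = df[recent]    # recent interactions -> short-term interest sequence
long_term = df[~recent]    # older interactions  -> long-term interest sequence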
@rnyak thanks for the answer. Could you please elaborate more on the slicing step?
Is the slicing a must? If I understand correctly, the max length should be the maximum sequence length that appears in the grouped data. Also, in the model configuration part there are some max-sequence-length settings.
Do these values need to be the same?
Yes, we expect them to be consistent for the data loader and for the input block. |
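For concreteness, a sketch of what that consistency could look like in Transformers4Rec. The value 20, the output path, and the parquet file name are assumptions; TabularSequenceFeatures.from_schema and T4RecTrainingArguments are the library's entry points, but the setup is abbreviated and assumes the schema tags item_id appropriately:

from merlin.io import Dataset
from transformers4rec import torch as tr
from transformers4rec.config.trainer import T4RecTrainingArguments

MAX_SEQ_LEN = 20  # assumed value; use the same length in both places

# Schema from the preprocessed (grouped) parquet output.
train = Dataset("processed_nvt/part_0.parquet")  # hypothetical path

# Input block: pads/truncates each sequence feature to MAX_SEQ_LEN.
inputs = tr.TabularSequenceFeatures.from_schema(
    train.schema,
    max_sequence_length=MAX_SEQ_LEN,
    masking="mlm",
)

# Data loader side: the trainer's loader is configured with the same length.
training_args = T4RecTrainingArguments(
    output_dir="./tmp",
    max_sequence_length=MAX_SEQ_LEN,
    data_loader_engine="merlin",
    per_device_train_batch_size=128,
)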
❓ Questions & Help
Hi, my dataset contains only the user-item interactions and timestamps. I noticed that the datasets used in the session-based examples all contain additional information as features, such as category. Can I use the same logic as in the example code to preprocess my data without adding any additional feature columns? Can the model accept such a data format? A sample of my data (see the sketch after the sample):
user_id:token item_id:token timestamp:float
0 0 1681314649
0 0 1681314664
0 0 1681314674
0 0 1681314688
0 1 1681322022
0 1 1681322023
0 1 1681322024
0 1 1681322026
0 1 1681322027
0 1 1681322029
0 1 1681322030
0 1 1681322032
0 1 1681322033
0 1 1681322034
...
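A sketch of how this interaction-only format could be grouped into per-user, time-ordered sequences with NVTabular. Categorify and Groupby are the library's real ops, while the file name and the aggregation choices are assumptions:

import nvtabular as nvt
import pandas as pd

# Raw interactions: only user_id, item_id and timestamp (hypothetical file name).
df = pd.read_csv("interactions.csv", sep="\t")
df.columns = ["user_id", "item_id", "timestamp"]

# Encode item ids to contiguous integers, then group each user's
# interactions into a time-ordered sequence.
item_id = ["item_id"] >> nvt.ops.Categorify()
features = item_id + ["user_id", "timestamp"]

grouped = features >> nvt.ops.Groupby(
    groupby_cols=["user_id"],
    sort_cols=["timestamp"],
    aggs={"item_id": ["list", "count"], "timestamp": ["first"]},
    name_sep="-",
)

workflow = nvt.Workflow(grouped)
sessions = workflow.fit_transform(nvt.Dataset(df)).to_ddf().compute()
# Result: one row per user with an item_id-list column usable as model input.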