Questions about data formatting #3
Hi Song, sorry for the delayed reply. You are looking into the file for the Story Model. During training: I hope this answers your question. Feel free to ask me any further questions.
Thank you for the feedback! However, it is still hard to understand the given example. Considering both files below:
https://github.com/Heidelberg-NLP/COINS/blob/main/model/src/data/conceptnet.py
https://github.com/Heidelberg-NLP/COINS/blob/main/model/src/train/batch.py
in the for loop of the batch_conceptnet_generate function (line 92):
when i==0
when i==1
So does that mean i1, o1, i3, o3 correspond to Incomplete Story, Output_Effect_S2/Output_Cause_S5, Incomplete Story, Output_Effect_S3/Output_Cause_S5, and i2, o2, i4, o4 to Incomplete Story, Output_S3, Incomplete Story, Output_S4? Also, when splitting the example given at line 94 of conceptnet.py (make_tensors), the list would be
It doesn’t seem to match the 11 sequences the code expects.
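For what it's worth, a quick way to check a data line against the field count the loader expects is to split it and count. This is only a sketch: the tab separator and the helper name are assumptions, and the 11 comes from the discussion above, not from reading make_tensors itself.

```python
# Sanity-check sketch: does a data line split into the number of fields
# the loader expects? The separator ("\t") is an assumption -- adjust it
# to match how conceptnet.py's make_tensors actually splits the example.
EXPECTED_FIELDS = 11  # the count mentioned in the discussion above

def check_line(line: str, sep: str = "\t") -> bool:
    """Return True if the line splits into EXPECTED_FIELDS fields."""
    return len(line.rstrip("\n").split(sep)) == EXPECTED_FIELDS

# A well-formed 11-field line passes; a short line fails.
sample = "\t".join(f"field{i}" for i in range(11))
print(check_line(sample))                 # True
print(check_line("only\tthree\tfields"))  # False
```

Running this against the example from line 94 would show directly whether the mismatch is in the separator or in the number of fields.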
Hi, I am currently trying to reproduce your work (specifically COINS GR) and have a few questions about the training data.
From your paper, it seems the training data for the Knowledge Model would be
and
for the Story Model would be
But looking at the part where you load the data
(https://github.com/Heidelberg-NLP/COINS/blob/main/model/src/data/conceptnet.py)
it is unclear which corresponds to which.
Also, the data downloaded with the given script doesn't match the format used in the rest of the code.
It would be nice if you could provide a data sample for each of the Knowledge and Story Models, or the model weights if possible.
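To pin down where the downloaded data diverges from what the code expects, one option is to tally the field count of every line in the file. This is a hypothetical diagnostic, not part of the COINS codebase; the path and the tab separator are assumptions.

```python
import collections

def field_count_histogram(path: str, sep: str = "\t") -> collections.Counter:
    """Tally how many lines have each field count.

    A file that the loader can consume should show a single dominant
    count; multiple counts point to a format mismatch between the
    downloaded data and the code's expectations.
    """
    counts: collections.Counter = collections.Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            counts[len(line.rstrip("\n").split(sep))] += 1
    return counts
```

For example, `field_count_histogram("train.tsv")` (a hypothetical filename) returning `Counter({11: 9998, 7: 2})` would mean two lines are malformed rather than the whole file being in a different format.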
Thank you