[FTheoryTools] Add model 1511.03209 #3978
Conversation
Adding a file of this size to the repository is not acceptable. We really need a separate means for data storage. At the same time, I wonder how efficient the encoding is. How many terms do your polynomials have? Are they homogeneous? What is the coefficient ring?
Fully agreed. I have alerted @antonydellavecchia on Slack, and we have already had a brief conversation on the matter. I am sure we can find a good solution. For instance, for another set of F-theory models, the required data is downloaded when needed. But here, the files are big/massive...
A few numbers regarding the largest polynomial (a6), i.e. the one whose file I could not even temporarily attach to this PR due to GitHub's file size restrictions:
Maybe that helps to gauge the situation better?
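For reference, the quantities asked about above (number of terms, degree, coefficient ring) can be read off in OSCAR roughly as follows. This is a minimal sketch with a toy polynomial, not the actual a6 from the model:

```julia
using Oscar

# Toy stand-in; the actual a6 lives in the model's .mrdi data.
R, (x, y, z) = polynomial_ring(QQ, ["x", "y", "z"])
f = x^3 + 2*x*y*z - z^3

length(f)             # number of terms (3 here)
total_degree(f)       # total degree of f (3 here)
coefficient_ring(R)   # the coefficient ring, QQ in this sketch
```

Homogeneity can then be checked by comparing the degrees of the individual terms.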
Force-pushed from cb41c81 to 662e893.
Force-pushed from 662e893 to 859ef64.
After thinking this over a bit more, I have "minimized" this PR. It now includes just the build instructions, which means that creating the model will take a long, long time. But as such, I believe this PR could be merged (provided the tests do not fail). I have the .mrdi files (and so do @apturner and @emikelsons), so we can continue our work by loading from those files. Whenever @antonydellavecchia is ready, we can enhance those .mrdi files by using the attribute saving framework as well as the serialization of the Lie algebras. With this, I believe we can then create a useful artifact. Does that seem to make sense? (Note that this PR includes the changes in #4002, since mutual changes to the metadata index files happened and we want to avoid a merge conflict.)
Force-pushed from 859ef64 to 1fb82bf.
Force-pushed from 3e75bfd to 2ff3b37.
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@           Coverage Diff           @@
##           master    #3978   +/-  ##
======================================
  Coverage   84.55%   84.55%
======================================
  Files         596      596
  Lines       81994    81994
======================================
  Hits        69330    69330
  Misses      12664    12664
======================================
Looks good, and most of the tests passed.
Force-pushed from 2ff3b37 to d2d159c.
@antonydellavecchia I will open a new PR shortly to discuss/implement the future serialization improvements for the
Force-pushed from d2d159c to dd526a4.
Force-pushed from dd526a4 to 2fa3002.
@antonydellavecchia This model requires a couple of HUGE polynomials. It is a lot faster to save them as .mrdi (thanks to your hard work!) and load them than to compute them anew. Drawback: the .mrdi files are large. The largest file that we need is about 130 MB in size, and I could not check it into this PR due to GitHub's file size restriction of at most 100 MB.
I will try to add the file here in a moment. It seems GitHub does not allow me to upload a .mrdi file, so I will share it via Slack with you. Any suggestions on how to proceed?
(It goes without saying that this PR will fail without that very file in experimental/FTheoryTools/src/LiteratureModels/Models/1511.03209.)
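The save/load round trip discussed above looks, in OSCAR, roughly like this. This is a sketch with a small stand-in polynomial and a hypothetical file name, not the actual model data:

```julia
using Oscar

R, (x, y) = polynomial_ring(QQ, ["x", "y"])
f = x^5 - 3*x*y + y^2   # stand-in for one of the huge polynomials

save("a6.mrdi", f)      # serialize to the .mrdi format
g = load("a6.mrdi")     # deserialize; far faster than recomputing
f == g                  # the round trip preserves the polynomial
```

For polynomials with millions of terms, loading the serialized file avoids repeating the expensive construction, at the cost of the large on-disk footprint discussed above.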