Thank you for sharing. This is a very impressive project, but I have a few questions I'd like to ask you.
1. Regarding the generation granularity of the digest: since the code currently recognizes both /A1/ and /1A/ as the same class /0-9a-z/, would a finer granularity make the URL generation model more accurate?
2. In the Maker generation model, a huge number of URLs (say, tens of millions) could run out of memory. Is there a way to shard the URLs and generate a corresponding model for each shard?
3. Have you considered using GPUs in the future to speed up generation and matching in the Maker and Matcher?
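To make question 1 concrete, here is a minimal sketch of the distinction I mean. I don't know the project's internals, so `coarse_digest` and `fine_digest` are hypothetical names illustrating the two granularities, not the project's actual functions:

```python
import re

def coarse_digest(segment: str) -> str:
    # Current behavior as I understand it: any alphanumeric run
    # collapses to one class, so "A1" and "1A" become identical.
    return re.sub(r"[0-9A-Za-z]+", "[0-9a-z]", segment)

def fine_digest(segment: str) -> str:
    # A finer granularity keeps one class per character position,
    # so "A1" and "1A" remain distinguishable.
    classes = []
    for ch in segment:
        if ch.isdigit():
            classes.append("[0-9]")
        elif ch.isupper():
            classes.append("[A-Z]")
        elif ch.islower():
            classes.append("[a-z]")
        else:
            classes.append(re.escape(ch))
    return "".join(classes)

# "A1" and "1A" collapse to the same coarse class, but stay
# distinguishable under the finer scheme:
assert coarse_digest("A1") == coarse_digest("1A") == "[0-9a-z]"
assert fine_digest("A1") == "[A-Z][0-9]"
assert fine_digest("1A") == "[0-9][A-Z]"
```

The trade-off I'm curious about is whether the finer classes improve matching accuracy enough to justify the larger number of distinct digests they produce.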
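For question 2, this is the kind of sharding I have in mind: partition URLs by a stable hash of the host, so each shard's model fits in memory and the Maker can be run per shard. The shard count and routing key here are purely illustrative assumptions, not anything from the project:

```python
import hashlib
from urllib.parse import urlsplit

NUM_SHARDS = 64  # illustrative; tuned so each shard's model fits in memory

def shard_of(url: str) -> int:
    # Route a URL to a shard by a stable hash of its host, so all URLs
    # from one site land in the same shard and can share one model.
    host = urlsplit(url).netloc
    digest = hashlib.md5(host.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS

def shard_urls(urls):
    # Stream URLs into per-shard buckets; in practice these would be
    # per-shard files on disk, each fed to its own Maker run.
    shards: dict[int, list[str]] = {}
    for url in urls:
        shards.setdefault(shard_of(url), []).append(url)
    return shards

urls = [
    "https://example.com/a/1",
    "https://example.com/b/2",
    "https://other.org/x",
]
buckets = shard_urls(urls)
# URLs from the same host always land in the same shard:
assert shard_of(urls[0]) == shard_of(urls[1])
```

Would an approach like this fit the Maker's current design, or does model generation need a global view of all URLs at once?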
I would greatly appreciate it if you could answer my questions. Best wishes for your success!