BulkInsert #102
Sounds good. This would be nice to add.
I think there are some limitations on SQLite bulk inserts (e.g. SQLITE_ERROR: too many SQL variables) when you insert many rows at a time. Is there any basis for setting the default chunk value to 100?
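For context, chunking is usually handled in application code. Here is a minimal sketch against a plain knex connection; the table and row shapes are placeholders, and knex also ships its own `batchInsert(tableName, rows, chunkSize)` helper that does essentially this:

```js
import knex from 'knex'

const db = knex({
  client: 'sqlite3',
  connection: { filename: './storage.db' },
  useNullAsDefault: true
})

// insert `rows` into `table` in slices of `chunkSize`, so each statement
// binds at most chunkSize * columnCount variables
async function bulkInsert (table, rows, chunkSize = 100) {
  await db.transaction(async trx => {
    for (let i = 0; i < rows.length; i += chunkSize) {
      await trx(table).insert(rows.slice(i, i + chunkSize))
    }
  })
}
```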
Yeah, but I don't think that is required for bulk inserts.
None at all :) Probably 500 or even 1,000 is doable; I just don't know what the typical limit is and will have to look into it.
Agreed, I just wanted to point out that the return value would be different between the two.
I just tried with a table with 4 columns, and it fails at a chunk size of 240 with this error. When the chunks are smaller, the total time taken for the query is higher compared with larger chunks. Is it possible to optimize the chunk count dynamically? 🤔
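One possible approach, sketched below: derive the chunk size from the column count, since each inserted row binds one variable per column and SQLite caps bound variables at SQLITE_MAX_VARIABLE_NUMBER (999 by default before SQLite 3.32.0, 32766 after). The constant and helper name here are illustrative, not trilogy API:

```js
// default bound-variable cap for SQLite builds before 3.32.0;
// newer builds default to 32766 (check your build's SQLITE_MAX_VARIABLE_NUMBER)
const MAX_VARIABLES = 999

// each inserted row binds one variable per column, so the largest safe
// chunk is the variable cap divided by the column count
function maxChunkSize (columnCount) {
  return Math.floor(MAX_VARIABLES / columnCount)
}

maxChunkSize(4) // => 249
```

By that arithmetic a 4-column table tops out around 249 rows per chunk, which is roughly consistent with the failure reported near 240 once any extra bound values per statement are counted, so leaving some headroom below the computed maximum seems prudent.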
Context
Option to insert bulk data in batches
Proposed solution
In Knex, the insert function (ref) takes either a hash of properties to be inserted into the row, or an array of inserts. Can that be implemented in trilogy as well?
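For illustration, the two forms knex's insert accepts (the `users` table here is hypothetical):

```js
import knex from 'knex'

const db = knex({
  client: 'sqlite3',
  connection: { filename: ':memory:' },
  useNullAsDefault: true
})

// hypothetical table for the example
await db.schema.createTable('users', t => {
  t.string('name')
  t.integer('age')
})

// a single row as a hash of properties
await db('users').insert({ name: 'alice', age: 30 })

// or an array of rows, inserted in one statement
await db('users').insert([
  { name: 'bob', age: 25 },
  { name: 'carol', age: 41 }
])

await db.destroy()
```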
Alternatives
It isn't possible to insert multiple rows at once using trilogy, but as a workaround, knex can be used directly.
e.g.:
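A sketch of that workaround, assuming trilogy exposes its underlying knex instance as `db.knex` and using a hypothetical `users` model (check your trilogy version's docs for the exact API):

```js
import { connect } from 'trilogy'

const db = connect('./storage.db')

// ensure the table exists first (trilogy creates it from a model definition)
await db.model('users', { name: String })

async function insertMany (rows) {
  // drop down to knex for the multi-row insert trilogy doesn't provide
  await db.knex('users').insert(rows)
}

await insertMany([
  { name: 'alice' },
  { name: 'bob' }
])
```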