Error when attempting to create partitions on large Teradata tables (2B+ rows) #1280
Comments
Looks like the numeric overflow is occurring when DVT attempts to execute this query in Teradata, so it's something related to the ROW_NUMBER calculation:

```sql
WITH t0 AS (
  SELECT
    t2.EDW_ACCT_NB,
    t2.EFF_DT,
    (row_number() OVER (ORDER BY t2.EDW_ACCT_NB ASC, t2.EFF_DT ASC) - 1) + 1 AS dvt_pos_num
  FROM `<DATASET_NAME>.<TABLE_NAME>` t2
)
```
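A possible overflow-safe variant (a sketch only, not the actual DVT fix) would cast the window function's result to BIGINT before the surrounding arithmetic, so the value is no longer bound by Teradata's 32-bit INTEGER range:

```sql
WITH t0 AS (
  SELECT
    t2.EDW_ACCT_NB,
    t2.EFF_DT,
    -- CAST before the -1/+1 arithmetic so intermediate values
    -- can exceed 2,147,483,647 without raising error 2616
    (CAST(row_number() OVER (ORDER BY t2.EDW_ACCT_NB ASC, t2.EFF_DT ASC) AS BIGINT) - 1) + 1 AS dvt_pos_num
  FROM `<DATASET_NAME>.<TABLE_NAME>` t2
)
-- (outer SELECT elided, as in the original snippet)
```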
Adding to this - based on the Teradata docs, row_number() is similar to count(*) and returns an INT value. In a table with over 2 billion rows, once you reach row number 2,147,483,648 you have exceeded the upper limit of the INT data type and will see the numeric overflow error. It would be good to review whether there are any other places where DVT outputs an INT that depends on the size of the table; any other instances of count(*) or similar functions will fail with the same error.
I wonder if doing something similar to the following would work:
We would have to see if count(*) can replace row_number() in all databases.
Hi, Thanks. Sundar Mudupalli |
When creating partitions for a large table in a Teradata to BigQuery migration, the following error was encountered:
teradatasql.OperationalError: [Version 20.0.0.15] [Session 64925450] [Teradata Database] [Error 2616] Numeric overflow occurred during computation
DVT is failing when generating the partitions as it is trying to do count(*) before partition creation.
According to the documentation, a Teradata INTEGER can only hold values between -2,147,483,648 and 2,147,483,647. The table in question has 2,234,011,695 rows, which is outside the INTEGER range (https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/SQL-Data-Types-and-Literals/Numeric-Data-Types/INTEGER-Data-Type).
The suggestion is to change the count check to cast(count(1) as BIGINT), which should resolve the error.
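The suggested change can be sketched as follows (a hypothetical example, not the actual DVT patch; the table name is a placeholder):

```sql
-- Instead of a bare COUNT(*), which aggregates into a 32-bit INTEGER
-- by default in Teradata session mode, cast to BIGINT so tables with
-- more than 2,147,483,647 rows do not trigger error 2616:
SELECT CAST(COUNT(1) AS BIGINT) AS row_count
FROM <TABLE_NAME>;
```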