I've got a problem with overflowing timestamps. I'm consuming messages from Kafka that contain some fake content, and some transactionDate values lie very far in the future. The values are plain dates without a time component, for example "2010-05-14".
At some point the pipeline fails, and the error log shows: Deserialization error ParseError "2262-04-13T00:00:00+00:00 would overflow 64-bit signed nanoseconds".
What can I do to solve this 🤔
CREATE TABLE kafka (
    amount INT,
    transactionDate TIMESTAMP,
    event_time TIMESTAMP GENERATED ALWAYS AS (CAST("transactionDate" AS TIMESTAMP)) STORED,
    watermark TIMESTAMP GENERATED ALWAYS AS (CAST("transactionDate" AS TIMESTAMP)) STORED
) WITH (
    connector = 'kafka',
    format = 'json',
    bootstrap_servers = 'broker:29092',
    topic = 'in',
    type = 'source',
    'source.offset' = 'earliest',
    'source.read_mode' = 'read_committed',
    event_time_field = 'event_time',
    watermark_field = 'watermark'
);
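For context, the error message suggests event timestamps are stored as signed 64-bit nanoseconds since the Unix epoch, which only reaches up to 2262-04-11T23:47:16.854775807Z. A quick sanity check of my fake data (my own sketch, unrelated to the connector code) shows the far-future dates are indeed past that limit:

from datetime import datetime

# A signed 64-bit integer tops out at 9_223_372_036_854_775_807 nanoseconds
# after the Unix epoch, i.e. 2262-04-11T23:47:16.854775807Z.
I64_MAX_NS = 2**63 - 1

def ns_since_epoch(ts: str) -> int:
    """Whole seconds since the epoch, expressed in nanoseconds."""
    dt = datetime.fromisoformat(ts)
    return int(dt.timestamp()) * 1_000_000_000

for ts in ("2010-05-14T00:00:00+00:00", "2262-04-13T00:00:00+00:00"):
    ns = ns_since_epoch(ts)
    print(ts, "overflows 64-bit nanoseconds" if ns > I64_MAX_NS else "fits")

So it looks like any record with a date past 2262-04-11 will trigger the error as soon as it is deserialized into the TIMESTAMP column.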