Before submitting
Please check the following:
I have described the situation in which the bug arose, including what code was executed, information about my environment, and any applicable data others will need to reproduce the problem.
I have included available evidence of the unexpected behavior (including error messages, screenshots, and/or plots) as well as a description of what I expected instead.
If I have a solution in mind, I have provided an explanation and/or pseudocode and/or task list.
Bug report
When data is converted from CSV to Parquet via pandas and then imported, the code fails at this line:
https://github.com/astronomy-commons/hipscat-import/blob/62a30a0768e5035c02df19d26037c369d007618f/src/hipscat_import/catalog/map_reduce.py#L152C1-L153C1
This seems to be because the pandas index is duplicated: each file has the same default index values that pandas assigned to the data, so the code fails when joining data from different files. The workaround was to write the Parquet files with index=False. I have to go back and give more information about how exactly this was done.

However, the code should not fail when the data has duplicate indexes that were created by pandas.