Replies: 4 comments 6 replies
-
It'll be interesting to optimize the performance. For now, all the data is sent to the browser and processed in memory, which is limited by the browser process.
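A minimal sketch of the obvious workaround until that optimization lands: downsample with pandas before handing the frame to pygwalker, so the browser only receives a manageable payload. The file name and the 50,000-row cap are arbitrary illustrative choices, not pygwalker defaults.

```python
import pandas as pd
import pygwalker as pyg

df = pd.read_csv("data.csv")  # hypothetical input file

# Cap the rows shipped to the browser; 50_000 is an arbitrary threshold,
# not a pygwalker default -- tune it to your machine and browser.
MAX_ROWS = 50_000
sample = df.sample(n=MAX_ROWS, random_state=42) if len(df) > MAX_ROWS else df

walker = pyg.walk(sample)
```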
-
I tried using it on a subset of my data that was 21k rows and 420 columns... this is what I got. All on my 32 GB RAM laptop...
-
@Asm-Def I tried your patch, but I still get the same issues. I'm running JupyterLab (Edge browser) on a Mac with 16 GB RAM. The dataframe is too large for .ipynb files; only 20129 sample items are printed to the file.
-
Hi, in the next version we will support computation with DuckDB. In our proof-of-concept test on 10 GB+ of data (10 columns, 2M+ rows), results came back within 1 second.
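A minimal sketch of the kind of DuckDB-backed computation described above, shown independently of pygwalker since the eventual integration API isn't specified here; the file name and query are placeholders. DuckDB streams the Parquet scan and aggregates without materializing every row in Python, which is what makes sub-second results on 10 GB+ data plausible.

```python
import duckdb

con = duckdb.connect()  # in-memory DuckDB database

# Aggregate directly over a Parquet file; DuckDB scans and aggregates
# out-of-core instead of loading all rows into Python memory.
result = con.execute(
    """
    SELECT category, COUNT(*) AS n, AVG(value) AS avg_value
    FROM read_parquet('big_dataset.parquet')   -- hypothetical file
    GROUP BY category
    ORDER BY n DESC
    """
).fetchdf()

print(result)
```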
-
What is the maximum number of rows/columns that pygwalker can handle? I have a pretty big machine so that shouldn't be a bottleneck.