Wondering about TabPy's ability to handle concurrency #524
-
We have a single server on which we deployed TabPy and registered several Python functions. Hundreds of people using Tableau might call these functions on the server within a short time, so is there a strategy for TabPy to handle this situation, such as a request queue or something else? It's hard for us to test this ourselves, since we don't have that many PCs with Tableau to call the functions simultaneously, and I wonder if there is a way to test it. (Our company runs a Tableau course for our analysts every few months; during the class we will introduce TabPy and let them try it, which could be a risk if the server breaks when a lot of people call it at the same time.)
-
Hi @UTimeStrange, in our internal testing it's usually Tableau that becomes the bottleneck before TabPy slows down. You could script calls to TabPy's API as a way to test this (see the sketch below). If TabPy is running on a decently sized server, it can generally handle the load. You could also look at running our TabPy container (or building your own) on something like AWS ECS or another elastic container service. We have also seen customers put a load balancer in front of several TabPy servers to handle larger loads.
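
Since the reply above suggests scripting calls to TabPy's API, here is a minimal load-test sketch in Python. It assumes a default TabPy install listening on port 9004 with no authentication enabled, and it exercises the generic `/evaluate` endpoint with a throwaway script; the URL, user counts, and payload are placeholders to adapt to your own deployed functions.

```python
# Hypothetical load-test sketch: hammer TabPy's /evaluate endpoint with
# concurrent requests to approximate many Tableau users calling at once.
# URL, user counts, and payload are assumptions -- adjust for your setup.
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests

TABPY_URL = "http://localhost:9004/evaluate"  # 9004 is TabPy's default port
CONCURRENT_USERS = 100   # simulated simultaneous callers
REQUESTS_PER_USER = 5

PAYLOAD = {
    # TabPy evaluates "script", exposing the values under "data"
    # as _arg1, _arg2, ... (the same convention Tableau uses).
    "script": "return [x * 2 for x in _arg1]",
    "data": {"_arg1": [1, 2, 3, 4, 5]},
}


def one_user(user_id: int) -> list[float]:
    """Fire REQUESTS_PER_USER sequential calls; return each latency in seconds."""
    latencies = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        resp = requests.post(TABPY_URL, json=PAYLOAD, timeout=30)
        resp.raise_for_status()  # surface 5xx errors instead of hiding them
        latencies.append(time.perf_counter() - start)
    return latencies


if __name__ == "__main__":
    all_latencies = []
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        futures = [pool.submit(one_user, i) for i in range(CONCURRENT_USERS)]
        for fut in as_completed(futures):
            all_latencies.extend(fut.result())

    all_latencies.sort()
    n = len(all_latencies)
    print(f"{n} requests, median {all_latencies[n // 2]:.3f}s, "
          f"p95 {all_latencies[int(n * 0.95)]:.3f}s, max {all_latencies[-1]:.3f}s")
```

Raising CONCURRENT_USERS until latencies degrade gives a rough idea of where a given server's limit is, without needing a room full of Tableau clients.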