I've opened this issue to discuss the limit we currently implement and what we should do in the future. Since this decision affects all projects using data-api, I'd like your input on it.
The current implementation has a (quite small) default limit and is built so that build-report gets sample data by default, relieving the server of unnecessarily large calls. This has the extra advantage of avoiding calls that accidentally produce a big response. Having default response limits is an industry best practice, but it does not solve every issue.
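To make the discussion concrete, the default-limit behaviour can be sketched as a clamp: absent a client-supplied limit, return a sample-sized response; otherwise cap the request at a hard maximum. The constant names and values below are hypothetical, not the actual data-api configuration:

```python
DEFAULT_LIMIT = 100   # hypothetical default; the real value lives in data-api's config
MAX_LIMIT = 10_000    # hypothetical hard cap on any single response

def effective_limit(requested=None):
    """Clamp a client-supplied limit between 1 and the hard cap.

    With no limit given, return the small default so build-report
    gets sample data without stressing the server.
    """
    if requested is None:
        return DEFAULT_LIMIT
    return max(1, min(requested, MAX_LIMIT))
```

One design question this raises is whether requests above `MAX_LIMIT` should be silently clamped, as here, or rejected with an error so the client knows the result is truncated.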
Nevertheless, there are some things that need to be taken into account:
Databases often implement limits on row count, timeouts, and result size to avoid being overwhelmed by clients' queries. Changing these rarely solves every problem, and the required DB configuration varies depending on the application.
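As an illustration of enforcing both a row cap and an execution budget at the database boundary, here is a sketch using Python's `sqlite3` progress handler. This is only a stand-in: production databases expose native knobs instead (e.g. PostgreSQL's `statement_timeout`), and the function name and budget values are made up for this example:

```python
import sqlite3

def query_with_guardrails(conn, sql, max_rows=1000, max_steps=1_000_000):
    """Run a query with a row cap and a crude execution budget.

    SQLite invokes the progress handler periodically; returning a
    nonzero value aborts the statement, approximating a timeout.
    """
    steps = 0

    def budget():
        nonlocal steps
        steps += 1
        return 1 if steps > max_steps else 0  # nonzero aborts the query

    conn.set_progress_handler(budget, 1000)  # call budget() every 1000 VM ops
    try:
        cur = conn.execute(sql)
        return cur.fetchmany(max_rows)  # row-count limit enforced on fetch
    finally:
        conn.set_progress_handler(None, 0)  # remove the handler afterwards
```

The point of the sketch: the application can layer its own limits on top of whatever the database enforces, but the two must be kept consistent per deployment.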
To download a response as a file, the current implementation requires a limit greater than or equal to the size of the query result. The question here is how we should deal with this, and what the behaviour should look like from the client's point of view.
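One possible direction, sketched below, is to stream the download page by page so no upfront limit covering the whole result is ever needed. `fetch_page` is a hypothetical callable standing in for a data-api query with offset/limit parameters; it is not an existing function:

```python
def stream_result(fetch_page, page_size=1000):
    """Yield every row of a result by paging through it.

    `fetch_page(offset, limit)` is assumed to return a list of rows,
    empty once the result is exhausted. The server only ever holds one
    page in memory, so the file download needs no upfront total limit.
    """
    offset = 0
    while True:
        page = fetch_page(offset, page_size)
        if not page:
            return
        yield from page
        offset += len(page)
```

From the client's side this would look like a single long download, while the server-side limit applies per page rather than to the whole result.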
If the read API is public, we'll have to deal with DoS and DDoS attacks. Having no search limit makes these attacks easier, but even being able to set a limit does not eliminate the threat.
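Response limits are only one layer here; a complementary server-side measure is per-client rate limiting. The token-bucket sketch below is a generic illustration (the class and its parameters are invented for this example), and it would not by itself stop a distributed attack, which needs network-level protection:

```python
import time

class TokenBucket:
    """Per-client token bucket rate limiter.

    Each request consumes one token; tokens refill at `rate` per second
    up to `capacity`. A request with no token available is rejected.
    """

    def __init__(self, rate=5.0, capacity=10):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, never above capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Keeping one bucket per API key or IP bounds how fast any single client can issue expensive queries, regardless of what limit each query requests.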
So the questions I would like to discuss are:
What kinds of behaviour can be observed from the client side, especially for the file download use case? What measures can we take to limit DoS attack vectors?
I'd like to hear your opinions and discuss this enough to make an informed decision on what we'll do here.
The relevant code is located here
@rufuspollock @anuveyatsu @EvgeniiaVak @shubham-mahajan @sagargg