We have a lot of data in our database and currently no mechanism to limit the maximum number of points a single request can pull from it, which makes it pretty easy to overload our service.
This does not have to happen on purpose. It could simply happen by accident.
Limiting the number of geo.json points is pretty straightforward, but limiting the number of data points involved in the calculations behind average.json and statistics.json is more complicated.
My approach would be the following:
We can calculate the average density `d` of the points we have per day. Then for every request we get a polygon `p` and a timespan `t` from the user.
So we can estimate the number of involved points with `d * p.area * t`.
With this estimation we can directly stop the query with a proper error message.
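A minimal sketch of such a pre-query guard (all names and the limit are hypothetical, not part of our codebase; the polygon area is computed with the shoelace formula for simplicity):

```python
# Hypothetical guard that rejects a query before it hits the database.
MAX_POINTS = 100_000  # assumed service-wide limit, tune to actual capacity


def polygon_area(vertices):
    """Shoelace formula for a simple polygon given as (x, y) pairs."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0


def estimated_points(density_per_day, vertices, timespan_days):
    """Estimate the number of involved points: d * p.area * t."""
    return density_per_day * polygon_area(vertices) * timespan_days


def check_query(density_per_day, vertices, timespan_days):
    """Raise with a proper error message if the estimate exceeds the limit."""
    estimate = estimated_points(density_per_day, vertices, timespan_days)
    if estimate > MAX_POINTS:
        raise ValueError(
            f"Query would touch ~{estimate:.0f} points "
            f"(limit: {MAX_POINTS}); reduce the polygon or the timespan."
        )
    return estimate
```

Since `d` only needs to be roughly right for this check, it could be recomputed periodically (e.g. nightly) instead of on every request.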