[device list page] why get 200 entries?

I see the request below is made when navigating to the device list page. Please help me understand why 200 device entries are fetched from the inventory when only 20 devices are shown on the list.

Request URL:

https://demo.hosted.mender.io/api/management/v2/devauth/devices?status=accepted&per_page=200&page=1

Regards,
Pei

@mzedel I think this one is for you :slight_smile:

Hi @pei.cei,
The call for 200 device entries is made to get pagination and grouping information for the device list.
However, in the coming weeks a change to how the GUI interacts with the API will be released (and will show up in the demo environment as well), which should reduce these calls noticeably.

Thank you @mzedel for clarifying.
Also, the device page first fetches 20 devices from deviceauth and then fetches the inventory for each device up front. That feels inefficient with scalability in mind: if other microservices are added with more details, that means 20 more requests to fetch those details. This happens because the heartbeat column is displayed from inventory data while the other columns come from deviceauth. I think this should be refactored to fetch from a single DB or service. Let me know what you think.
Sorry for continuing in this thread, but it felt related.

Regards,
Pei


Hi Pei,
yes, this is an area we are currently working on and one of the areas where our current microservice architecture poses a different set of challenges. Since the data sits in separate stores, each with different load patterns, a single DB to read from is not directly possible if we don’t want to sacrifice the freshness of the data at some point.
As the UI currently loads the data from the different sources that hold what the view wants to show, the burden lies on the user’s browser to handle these calls and process the results. However, since all the endpoints are paginated, the extra load will always be “just” that of the current page.
Bundling the calls to the inventory would be one approach; serving the relevant data from a single service (kind of what a GraphQL endpoint does) would be another. We hope to get there in the near future; as usual, you are more than welcome to give us a hand and submit a PR.
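To make the bundling idea concrete, here is a minimal sketch in Python. It is purely illustrative: the repeated-`id` filter parameter and the exact URL shape are assumptions, not part of the current inventory API. The point is only that chunking ids into batches turns one-request-per-device into one-request-per-batch.

```python
BATCH_SIZE = 200  # matches the per_page size the GUI already uses


def chunked(ids, size):
    """Split a list of device ids into batches of at most `size`."""
    for start in range(0, len(ids), size):
        yield ids[start:start + size]


def batched_inventory_urls(device_ids):
    """Build one (hypothetical) bulk request per batch of ids,
    instead of one request per device id."""
    base = "https://demo.hosted.mender.io/api/management/v1/inventory/devices"
    return [
        base + "?id=" + "&id=".join(batch)  # hypothetical repeated-id filter
        for batch in chunked(device_ids, BATCH_SIZE)
    ]


ids = [f"device-{n}" for n in range(5000)]
urls = batched_inventory_urls(ids)
print(len(urls))  # 25 bulk requests instead of 5000 individual ones
```

With 5000 devices and batches of 200, this comes out to 25 bulk calls instead of 5000 single-device lookups.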


Hi @mzedel,
Yes, I will try to explore the approaches you highlighted.

I found one more bottleneck in the dashboard while loading the devices section. I provisioned the platform with 5000 devices from the test client, and the dashboard took too long to load. Digging into the code, I saw that deviceauth is queried for all the devices (5000 in my case, 25 requests) and the inventory is then queried for each device id (5000 more requests), 5025 requests in total.
The reason is that the inventory data model can still hold data even if a device is rejected or dismissed, so the GUI iterates over the 5000 accepted device ids and fetches them one by one from the inventory. If the inventory database kept only accepted devices, a single API (paginated) fetch of all devices would be much more efficient.
Let me know your thoughts on this.
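For reference, the request count above follows from a quick back-of-the-envelope calculation, assuming the `per_page=200` page size seen in the devauth request at the top of the thread:

```python
import math

DEVICES = 5000
PER_PAGE = 200  # page size seen in the devauth request above

# Paginated deviceauth listing: one request per page of 200 devices.
deviceauth_requests = math.ceil(DEVICES / PER_PAGE)

# Current behaviour: one inventory request per device id.
inventory_requests = DEVICES

total = deviceauth_requests + inventory_requests
print(deviceauth_requests, inventory_requests, total)  # 25 5000 5025
```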

Thanks
Pei.

Hi @pei.cei,
again valid points, and unfortunately the answer for now is the same: since the inventory API does not support bulk retrieval of a given set of ids, this is done in the frontend. To avoid outright crashing the browser when there are too many devices to handle, this is capped at 5000 devices, and the implementation of the device retrieval was recently changed (it will be part of the 2.3 release) to bulk retrieve the devices from the inventory and match the ids locally - that cuts the 5025 requests down to 50.
But even with that approach it is obviously not future-proof and, as mentioned, getting the data for the relevant ids in bulk or a combined endpoint would be the way to go. I can’t give a definitive timeframe, but can only recommend you “stay tuned”, as we should get there.
These should however also be the 2 worst acrobatics the GUI has to do to display the data and provide the functionality that the backend has not yet caught up to.
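A rough sketch of the bulk-retrieve-and-match idea, for anyone curious how it gets from 5025 requests down to ~50. This is not the actual GUI implementation; the function names and the fake page fetcher are stand-ins. The key point is paging through the inventory in bulk and filtering against the accepted ids in a local set:

```python
# Sketch: page through the inventory in bulk and match the accepted
# device ids locally, instead of one inventory request per device.
PER_PAGE = 200


def fetch_inventory_page(page):
    """Stand-in for one paginated inventory call; returns fake device records."""
    start = (page - 1) * PER_PAGE
    return [{"id": f"device-{n}", "attributes": {}}
            for n in range(start, start + PER_PAGE)]


def retrieve_devices(accepted_ids):
    """Bulk-retrieve inventory pages and keep only accepted devices."""
    wanted = set(accepted_ids)
    matched = {}
    page = 1
    requests_made = 0
    while len(matched) < len(wanted):
        records = fetch_inventory_page(page)
        requests_made += 1
        if not records:
            break  # no more inventory pages
        for record in records:
            if record["id"] in wanted:
                matched[record["id"]] = record
        page += 1
    return matched, requests_made


accepted = [f"device-{n}" for n in range(5000)]
devices, calls = retrieve_devices(accepted)
print(len(devices), calls)  # 5000 devices matched in 25 bulk calls
```

Together with the 25 paginated deviceauth calls, that gives the ~50 requests mentioned above.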


Hi @mzedel,
Thank you for the explanation. I will be following this closely.