I am using FastAPI as the backend and React for the frontend.
Currently, I'm rendering data that arrives as a stream over WebSockets, in chunks of 500 records each. On the frontend I use virtualization to render only the number of records that fit in the viewport height, displayed as a table.
When I scroll down, the next rows are rendered; when I scroll up, the previous rows are rendered. Only the rows that fit in the viewport height are ever mounted in the table.
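For context, the windowing logic is roughly like this (a simplified sketch, not my exact code; `rowHeight`, `overscan`, etc. are placeholder names):

```typescript
// Given the scroll offset, compute which slice of the full record
// array should actually be mounted in the DOM. Everything outside
// [start, end) stays unrendered.
function visibleRange(
  scrollTop: number,
  viewportHeight: number,
  rowHeight: number,
  totalRows: number,
  overscan = 5, // extra rows above/below the viewport for smooth scrolling
): { start: number; end: number } {
  const start = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const end = Math.min(
    totalRows,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan,
  );
  return { start, end };
}
```

So the DOM itself stays small; the memory problem comes from the backing data array, not from the rendered rows.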
The problem: after some time (around 1 GB of data rendered), memory usage in the page gets very high, and I need to handle up to 8 GB of data. How can I reduce memory usage while maintaining this functionality?
Apart from IndexedDB and gzip, are there any other recommended approaches?
Actual use case: I'm receiving live position data (latitude and longitude) from sensors mounted on jets. The data arrives continuously, and I have to display the incoming data on the frontend as a table. Pagination would be the natural approach, but the requirement is a table with infinite scroll (I use virtualization to display only a certain number of records at a time). So what I'm doing now is storing the data in MongoDB and pushing it to the frontend over WebSockets, and, instead of hitting the backend API multiple times, keeping the data in the browser. But the data set is huge (approximately 8 GB), so the page becomes unresponsive to events (mouse clicks, keystrokes). How should I optimize the memory usage?
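One direction I've considered is capping what the browser holds, e.g. a bounded buffer that evicts the oldest rows as new chunks arrive (a hypothetical sketch; `Position` and `BoundedBuffer` are made-up names, and I'm not sure it fits the infinite-scroll requirement since scrolled-past data would have to be re-fetched from the backend):

```typescript
// One sensor reading as sent over the WebSocket.
interface Position {
  lat: number;
  lon: number;
  ts: number; // timestamp
}

// Keeps at most `capacity` recent records in memory; older rows are
// dropped on the frontend and would need to be paged back in from
// the backend (e.g. MongoDB) when the user scrolls far enough up.
class BoundedBuffer {
  private rows: Position[] = [];

  constructor(private capacity: number) {}

  // Append an incoming WebSocket chunk, evicting the oldest rows
  // once the capacity is exceeded.
  push(chunk: Position[]): void {
    for (const row of chunk) {
      this.rows.push(row);
    }
    const excess = this.rows.length - this.capacity;
    if (excess > 0) {
      this.rows.splice(0, excess);
    }
  }

  get length(): number {
    return this.rows.length;
  }

  // Hand the virtualizer only the slice it needs to render.
  slice(start: number, end: number): Position[] {
    return this.rows.slice(start, end);
  }
}
```

With something like this the memory ceiling is fixed by `capacity` regardless of how long the stream runs, instead of growing toward 8 GB.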