SQL table with thousands of columns
02:04 06 Sep 2024

I need to store float readings from 20,000 sensors once per second. I originally wanted to create a table like the following:

time                 sensor 1  sensor 2  ...  sensor 20000
2024-09-06 13:00:00  1.2       5.3      ...   2.0

But then I found that a table cannot have more than 1600 columns in PostgreSQL. What is the best practice for storing this kind of data? Should I split the columns across multiple tables, or switch to a different type of database?

All 20000 sensor values are read and inserted together.

I need to query up to 100 of them per second to plot trend charts.
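For reference, one alternative I've seen suggested is a narrow (entity-attribute-value style) layout, one row per sensor per timestamp. A rough sketch of what I have in mind (table and column names are just placeholders I made up):

```sql
-- One row per (timestamp, sensor) instead of one column per sensor
CREATE TABLE sensor_reading (
    ts        timestamptz NOT NULL,
    sensor_id integer     NOT NULL,
    value     real        NOT NULL,
    PRIMARY KEY (ts, sensor_id)
);

-- Trend-chart query for a single sensor over the last hour
SELECT ts, value
FROM sensor_reading
WHERE sensor_id = 42
  AND ts >= now() - interval '1 hour'
ORDER BY ts;
```

My concern with this layout is write volume: it turns each one-second batch into 20,000 rows instead of one, so I'm not sure it's the right trade-off here.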

sql postgresql jsonb entity-attribute-value denormalization