We agree that wasting compute resources is bad; they're probably better spent mining BTC. This is on our roadmap for both real-time (websockets) and API calls. It will likely use protobufs.
I use IQFeed (comma-separated values). Parsing is 99% of my CPU usage; the analysis is only 1%. JSON is not the best format for parsing speed or readability, worse than CSV, fixed-width formats, etc. That said, it should still be more than fast enough. Last time I checked, I was running only about 3% CPU utilization watching around 1,000 of the most liquid stocks, about 5 minutes after the open; the volume drops off precipitously after that. That was several years ago on an i7 2600. Today I have an i7 6700 and a Ryzen 7.

Binary may be a poor choice in this use case. Keep in mind that the amount of data sent over the wire is tiny compared to what your computer can handle. If not using CSV, a faster choice would be a fixed-width format: the number of characters in each field is fixed, so you can just move a pointer to the start of the field you want within a line and read that fixed number of characters. It is also directly human-readable.

I store my data as it comes over the wire. The human-readable format is very important to me. I use it quite often to see why trades did or did not happen, and it's very useful for development as well. All just my thoughts/opinion.
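To illustrate the fixed-width idea, here is a minimal sketch in Python. The field names and widths are invented for illustration (they are not IQFeed's actual layout); the point is that each field lives at a known offset, so reading one is a single slice with no delimiter scanning:

```python
# Hypothetical fixed-width layout: (offset, length) per field.
# These names and widths are made up for illustration only.
LAYOUT = {
    "symbol": (0, 6),
    "price":  (6, 10),
    "size":   (16, 8),
}

def read_field(line: str, name: str) -> str:
    """Jump straight to a field's offset and read its fixed width."""
    off, length = LAYOUT[name]
    return line[off:off + length].strip()

# A 24-character line: 6-char symbol, 10-char price, 8-char size.
line = "AAPL    189.4400     300"
price = read_field(line, "price")   # "189.4400"
```

Because the offsets are constants, you can also skip fields you don't care about entirely, which is where the speed advantage over delimiter-based formats comes from, and the raw line remains readable in a text editor.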
I think a VPS would be an optimal solution for you here, and I don't think you'd need a dedicated computer for that.