Recently, I came across an interesting article published in 2004 that compares the growth of bandwidth and latency, explains why the imbalance between them keeps widening, and discusses how to cope with it. The excerpts below are from:
Latency Lags Bandwidth: Recognizing the chronic imbalance between bandwidth and latency, and how to cope with it. By David A. Patterson, Communications of the ACM, October 2004, Vol. 47, No. 10.
In the time that bandwidth doubles, latency improves by no more than a factor of 1.2 to 1.4.
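To get a feel for how quickly that gap compounds, here is a small back-of-envelope Python sketch. Only the 2x and 1.2x-1.4x per-step factors come from the article; the six "generations" are a hypothetical choice of mine.

# Rough sketch of how the bandwidth/latency gap compounds.
# Per-step factors (2x bandwidth, 1.2x-1.4x latency) are from the article;
# the six generations are a hypothetical illustration.

def improvement(per_step_factor, steps):
    return per_step_factor ** steps

steps = 6
bandwidth_gain = improvement(2.0, steps)      # ~64x
latency_gain_low = improvement(1.2, steps)    # ~3x
latency_gain_high = improvement(1.4, steps)   # ~7.5x

print(f"bandwidth: ~{bandwidth_gain:.0f}x, "
      f"latency: ~{latency_gain_low:.1f}x to ~{latency_gain_high:.1f}x")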
Reasons for Bountiful Bandwidth
“There is an old network saying: Bandwidth problems can be cured with money. Latency problems are harder because the speed of light is fixed – you can’t bribe God.” – Anonymous
Moore’s Law helps bandwidth more than latency.
Distance limits latency.
Bandwidth is generally easier to sell.
Latency helps bandwidth.
Bandwidth hurts latency.
Operating system overhead hurts latency.
Coping with Lagging Latency
Caching: Leveraging capacity to help latency.
Replication: Leveraging capacity to again help latency.
Prediction: Leveraging bandwidth to again help latency.
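As a toy illustration of the caching idea (my own sketch, not code from the article), here is a minimal read-through cache in Python; slow_fetch and its 10 ms delay are hypothetical stand-ins for any high-latency source such as a disk or the network. Replication and prediction trade capacity and bandwidth for latency in the same spirit: keep extra copies closer to the consumer, or fetch data before it is actually requested.

import time

cache = {}

def slow_fetch(key):
    # Hypothetical high-latency access (disk, network): ~10 ms per request.
    time.sleep(0.010)
    return f"value-for-{key}"

def cached_fetch(key):
    if key not in cache:
        cache[key] = slow_fetch(key)   # miss: pay the full latency once
    return cache[key]                  # hit: served from memory afterwards

cached_fetch("block-42")   # first access: ~10 ms
cached_fetch("block-42")   # repeat access: microseconds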
Marketing Latency Innovations
The difficulty of marketing latency innovations is one of the reasons latency has received less attention thus far.
Perhaps we can draw inspiration from the more mature automotive industry, which advertises the time to accelerate from 0 to 60 miles per hour in addition to peak horsepower and top speed.
Comments
Another way to improve latency is to use a compression engine. The overall response time of the storage system is faster because it has to move less data once the data is compressed.
Not sure how compression can help in reducing latency. Intuitively, I would assume the additional time required for the compression/decompression process would only add to latency instead of decreasing it.
In the pure context of reducing disk latency, pre-compressed data would reduce disk latency – less data is written to disk.
This would just put the strain elsewhere in the network path. That assumes compression happens inline, on the fly; otherwise the data would need to be written to disk uncompressed, then read back and compressed before being written again in compressed form.
Anyway, disk latency has more to do with access time than with data transfer rate.
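To make that point concrete, here is a small hypothetical calculation (none of these numbers come from the article or the comments): it models disk latency as a fixed access time plus a transfer time, so compression only shrinks the transfer term.

# All numbers below are assumed for illustration only.
access_time_ms = 8.0          # assumed seek + rotational delay
bandwidth_mb_per_s = 100.0    # assumed sustained transfer rate
request_mb = 1.0              # size of one request
compression_ratio = 2.0       # assumed 2:1 compression

def disk_latency_ms(size_mb):
    transfer_ms = size_mb / bandwidth_mb_per_s * 1000.0
    return access_time_ms + transfer_ms

uncompressed = disk_latency_ms(request_mb)                    # 8 + 10 = 18 ms
compressed = disk_latency_ms(request_mb / compression_ratio)  # 8 + 5  = 13 ms

print(f"uncompressed: {uncompressed:.1f} ms, compressed: {compressed:.1f} ms")
# The fixed access time dominates small requests, so compression helps
# transfer-bound workloads far more than access-time-bound ones.

Whether the saved transfer time outweighs the added compression/decompression CPU time is exactly the trade-off this thread is debating.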