
Low Latency

Latency is the amount of time it takes for data to travel from its source to its destination. In simple terms, it is any delay between a request and its response. Low latency means this response time is kept to a minimum.
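As a rough illustration of this request/response view of latency, the sketch below times a round trip over a loopback TCP echo connection in Python. The echo server, port handling and message are illustrative assumptions, not part of any particular trading system.

import socket
import threading
import time

def echo_once(server_sock):
    # Accept a single connection and echo back whatever arrives.
    conn, _ = server_sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))

def main():
    # Server side: listen on an OS-assigned port on localhost.
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    port = server.getsockname()[1]
    threading.Thread(target=echo_once, args=(server,), daemon=True).start()

    # Client side: send a request, wait for the response, time the round trip.
    with socket.create_connection(("127.0.0.1", port)) as client:
        start = time.perf_counter()
        client.sendall(b"quote?")   # the "request"
        client.recv(1024)           # the "response"
        latency = time.perf_counter() - start

    print(f"request -> response latency: {latency * 1e6:.1f} microseconds")

if __name__ == "__main__":
    main()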

 

Latency can be represented in an equation:

L = P + N + S + I + A

 

Here P is the propagation time (sending the bits along the wire), N is the network packet processing time (routing, switching and protection), S is the serialization time (pulling the bits on/off the wire), I is the interrupt handling time (receiving the packet on the server), and A is the application processing time.
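Plugging hypothetical numbers into the equation makes the breakdown concrete. The component times below (in microseconds) are assumptions chosen only to show how the pieces add up to L.

components_us = {
    "propagation (P)":            200.0,  # sending the bits along the wire
    "network processing (N)":      40.0,  # routing, switching and protection
    "serialization (S)":           10.0,  # pulling the bits on/off the wire
    "interrupt handling (I)":       5.0,  # receiving the packet on the server
    "application processing (A)":  25.0,  # work done by the application itself
}

total_us = sum(components_us.values())
for name, value in components_us.items():
    print(f"{name:>28}: {value:6.1f} us")
print(f"{'total latency (L)':>28}: {total_us:6.1f} us")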

Latency plays an important role in algorithmic trading, where speed is a key factor in executing a trade, and low latency leads to more competitive prices for trade execution. To reduce latency, new technologies are being employed. Microwave and laser links are being used to transmit data and can offer a speed advantage over fiber optics. Wireless transmission can also follow a straighter, more direct path than cabling routes, which shortens the transmission distance and thus reduces latency.
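To see where the speed advantage comes from, a back-of-the-envelope comparison of one-way propagation delay is sketched below. The 1,000 km fiber route, the 950 km line-of-sight path and the refractive index of roughly 1.47 are illustrative assumptions, not figures from this article.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def propagation_ms(distance_km, speed_m_per_s):
    # One-way propagation delay in milliseconds.
    return distance_km * 1_000.0 / speed_m_per_s * 1_000.0

fiber_path_km = 1_000.0      # cable route, typically longer than line of sight
microwave_path_km = 950.0    # straighter line-of-sight path (assumed)

speed_in_fiber = C / 1.47    # light travels more slowly inside glass
speed_in_air = C             # microwaves travel at close to vacuum speed

print(f"fiber:     {propagation_ms(fiber_path_km, speed_in_fiber):.2f} ms")
print(f"microwave: {propagation_ms(microwave_path_km, speed_in_air):.2f} ms")

Under these assumptions, the fiber path takes about 4.9 ms one way and the microwave path about 3.2 ms, so the straighter, faster-medium route saves well over a millisecond each way before any switching or serialization is considered.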