
Latency

The time delay between sending a request and receiving a response, typically measured in milliseconds; a critical factor for real-time applications.

What is Latency?

Latency is the delay in data transmission: the time it takes a packet to travel from source to destination, typically measured in milliseconds (ms). Low latency is crucial for real-time applications such as video calls, online gaming, and financial trading.
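
As a rough illustration of what a single latency figure means in practice, the sketch below times a TCP handshake from application code to estimate round-trip delay. The host example.com and port 443 are placeholders, and a TCP connect includes some handshake processing, so this only approximates the raw network RTT.

    # Minimal sketch: estimate round-trip latency by timing a TCP handshake.
    # example.com and port 443 are placeholders; a TCP connect includes
    # handshake processing, so this only approximates raw network RTT.
    import socket
    import time

    def tcp_rtt_ms(host, port=443, timeout=2.0):
        """Return the time in milliseconds taken to complete a TCP handshake."""
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=timeout):
            pass  # connection established; close immediately
        return (time.perf_counter() - start) * 1000

    print(f"RTT: {tcp_rtt_ms('example.com'):.1f} ms")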

Types of Latency

  • Network Latency: Total time for data to traverse the network
  • Processing Latency: Time for routers and end devices to process each packet
  • Propagation Latency: Time for a signal to travel across the physical medium (see the worked example after this list)
  • Queuing Latency: Time packets spend waiting in device buffers
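
To make the propagation component concrete: signals in optical fiber travel at roughly 200,000 km/s (about two-thirds of the speed of light in a vacuum), so physical distance alone puts a hard floor under latency. A small illustrative calculation under that assumption:

    # Illustrative propagation-delay estimate; assumes signals travel at
    # roughly 200,000 km/s in optical fiber (about two-thirds of c).
    FIBER_SPEED_KM_PER_S = 200_000

    def propagation_delay_ms(distance_km):
        """One-way propagation delay in milliseconds over fiber."""
        return distance_km / FIBER_SPEED_KM_PER_S * 1000

    # New York to London is roughly 5,600 km, so the one-way floor is ~28 ms
    # and a round trip is at least ~56 ms, before any processing or queuing.
    print(f"{propagation_delay_ms(5600):.0f} ms one-way")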

Measuring Latency

  • Ping: Round-trip time (RTT) measurement (see the sketch after this list)
  • Traceroute: Per-hop latency along the network path
  • Application monitoring: End-to-end latency as experienced by users
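
ICMP ping normally needs raw sockets (and therefore elevated privileges), so scripts often shell out to the system ping command instead. A rough sketch, assuming a Unix-like ping whose summary line reports min/avg/max values:

    # Rough sketch: invoke the system ping and extract the average RTT from
    # its summary line. Assumes a Unix-like ping ("min/avg/max" summary).
    import re
    import subprocess
    from typing import Optional

    def avg_ping_ms(host: str, count: int = 4) -> Optional[float]:
        result = subprocess.run(
            ["ping", "-c", str(count), host],
            capture_output=True, text=True, check=False,
        )
        match = re.search(r"= [\d.]+/([\d.]+)/", result.stdout)
        return float(match.group(1)) if match else None

    print(avg_ping_ms("example.com"))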

Acceptable Latency Ranges

  • Gaming: < 50ms preferred, < 100ms playable
  • Video Calls: < 150ms for smooth conversation
  • VoIP: < 150ms one-way
  • Web Browsing: < 100ms feels responsive
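
In monitoring code, the ranges above can be turned into a simple check. The thresholds below just mirror that list, and the function is purely illustrative:

    # Illustrative mapping of a measured RTT onto the thresholds listed above;
    # the dictionary and function are examples, not a standard.
    THRESHOLDS_MS = {
        "gaming (preferred)": 50,
        "web browsing": 100,
        "video calls / VoIP (one-way)": 150,
    }

    def suitable_uses(latency_ms):
        """Return the uses whose threshold the measured latency stays under."""
        return [use for use, limit in THRESHOLDS_MS.items() if latency_ms < limit]

    print(suitable_uses(80))  # ['web browsing', 'video calls / VoIP (one-way)']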

Reducing Latency

  • Use nearby servers (CDNs, edge computing)
  • Optimize network paths and routing
  • Upgrade to lower-latency links (e.g., fiber rather than satellite)
  • Reduce the number of hops and per-hop processing
  • Use caching effectively (see the sketch after this list)
  • Choose protocols suited to the workload (e.g., UDP or QUIC where handshake and retransmission overhead matter)
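
Caching is often the cheapest win because a cache hit skips the network round trip entirely. A minimal sketch of a time-based (TTL) in-memory cache in front of a slow fetch; fetch_remote() and the 30-second TTL are placeholder assumptions:

    # Minimal TTL cache sketch: fresh entries are served locally, so repeated
    # requests skip the network round trip entirely. fetch_remote() and the
    # 30-second TTL are placeholder assumptions.
    import time

    CACHE = {}          # key -> (expiry timestamp, value)
    TTL_SECONDS = 30

    def fetch_remote(key):
        time.sleep(0.2)  # stand-in for a ~200 ms network round trip
        return f"value-for-{key}"

    def get(key):
        now = time.time()
        entry = CACHE.get(key)
        if entry and entry[0] > now:            # fresh hit: no network delay
            return entry[1]
        value = fetch_remote(key)               # miss: pay the round trip once
        CACHE[key] = (now + TTL_SECONDS, value)
        return value

    get("profile")  # ~200 ms (cache miss)
    get("profile")  # ~0 ms (cache hit)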