
What does latency mean?

Latency (delay) on the internet is the time a data packet needs to travel from its source to its destination across a network. It is a decisive factor in the speed and efficiency of data transmission.

How is latency measured?

Latency is usually measured in milliseconds (ms). The time indicates how long it takes for a data packet to reach its destination and for a confirmation to return to the sender. This measurement is performed with network diagnostic tools such as ping or traceroute. The resulting metric is called the Round-Trip Time (RTT): the combined time for sending a message and receiving the reply.
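As an illustrative sketch of the measurement idea (not one of the tools named above): timing a TCP handshake approximates one round trip, without the raw-socket privileges that an ICMP ping requires. The host and port below are placeholders.

```python
import socket
import time

def tcp_rtt(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Estimate round-trip time by timing a TCP handshake.

    Establishing a TCP connection takes one round trip
    (SYN out, SYN/ACK back), so the elapsed time is a
    reasonable approximation of the RTT in milliseconds.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        elapsed = time.perf_counter() - start
    return elapsed * 1000.0  # convert seconds to milliseconds
```

Calling `tcp_rtt("example.com")` would return the handshake time in milliseconds; dedicated tools such as ping use ICMP echo requests instead.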

Latency in different network environments

Latency varies greatly depending on the network type. Here are some examples:

Local Area Network (LAN):
Here, the delay is often at its lowest, typically under 1 ms. This is ideal for fast data transfers within a building or campus.

Wide Area Network (WAN):
WANs, which connect distant locations, can exhibit latencies from 20 ms to 100 ms or more. The values depend on the distance and the quality of the connection.

Mobile networks:
Latency varies depending on the technology. With 4G it is around 50 ms, while 5G can achieve latencies as low as 1 ms. This significant reduction is important for real-time applications.

Satellite internet:
The delay is particularly high due to the long distance the signals must travel, typically between 600 ms and 1,200 ms for geostationary satellite links.

Influence of data traffic and network congestion

Data traffic in a network directly influences latency. High data traffic can lead to congestion, which increases latency. This is comparable to traffic on a highway during peak hours: the more cars on the road, the slower the traffic becomes.
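The highway analogy can be made concrete with a simple store-and-forward model: every packet queued ahead of yours adds one full transmission time to your delay. The figures below (a 1,500-byte packet on a 10 Mbit/s link) are illustrative assumptions, not values from this glossary.

```python
def queueing_delay_ms(packet_bits: int, link_bps: int, queue_depth: int) -> float:
    """Delay for a packet that finds `queue_depth` packets
    ahead of it on the link (store-and-forward model)."""
    per_packet_ms = packet_bits / link_bps * 1000.0
    # The packet waits for everything queued ahead of it,
    # then needs one transmission time of its own.
    return (queue_depth + 1) * per_packet_ms

# 1,500-byte packet (12,000 bits) on a 10 Mbit/s link:
empty_link = queueing_delay_ms(12_000, 10_000_000, queue_depth=0)  # ~1.2 ms
congested = queueing_delay_ms(12_000, 10_000_000, queue_depth=9)   # ~12 ms
```

Ten packets in front of yours already multiply the delay tenfold, which is exactly the rush-hour effect described above.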

Measures to reduce internet latency

Various measures can be taken to reduce latency:

Quality of Service (QoS):
This configuration on routers prioritizes data traffic and grants critical applications preferred access.

Traffic Shaping:
This technique controls the speed of certain data streams to optimize network performance.
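Traffic shaping is commonly implemented with a token bucket: traffic may only be sent while tokens are available, and tokens refill at the configured rate. A minimal sketch (class name and parameters are invented for illustration):

```python
import time

class TokenBucket:
    """Minimal token-bucket shaper sketch, not a production implementation."""

    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s      # sustained sending rate
        self.capacity = burst_bytes       # maximum burst size
        self.tokens = burst_bytes         # bucket starts full
        self.last = time.monotonic()

    def allow(self, nbytes: int) -> bool:
        """Return True if `nbytes` may be sent now; False if the
        caller should delay (shape) this traffic."""
        now = time.monotonic()
        # Refill tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False
```

On Linux, the same idea underlies qdiscs such as the token bucket filter (tbf) configured via the tc utility.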

Importance and impact of latency on digital services

Latency influences the response speed of digital services. For streaming services, online games, and VoIP communication, high latency can lead to delays and a poor user experience. Services like Netflix and YouTube optimize their networks to minimize latency and avoid interruptions.

In VoIP and video conferencing systems, low delay is essential for clear and effective communication. Providers continuously work on improving latency through technologies such as WAN optimization and improved routing.

Optimizing and reducing internet latency

The following methods decrease latency:

Route optimization:
Network paths with the lowest delay should be selected.

Use of Content Delivery Networks (CDN):
These networks store content at various locations to reduce the distance to end users.

Minimizing protocol overhead:
Use more efficient protocols and reduce additional data traffic.

Compression:
Compression reduces the amount of data transmitted.
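The distance argument behind CDNs can be quantified: light in optical fibre covers roughly 200 km per millisecond (about two-thirds of the speed of light in vacuum), so propagation delay alone puts a floor under the achievable RTT. The distances below are assumed examples:

```python
SPEED_IN_FIBER_KM_PER_MS = 200.0  # rule of thumb: ~2/3 of c

def propagation_rtt_ms(distance_km: float) -> float:
    """Lower bound on round-trip time from propagation delay alone;
    queueing, processing, and transmission delays come on top."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# Transatlantic path, ~6,000 km: at least 60 ms RTT.
far = propagation_rtt_ms(6000)   # -> 60.0
# CDN edge node ~100 km away: the floor drops to 1 ms.
near = propagation_rtt_ms(100)   # -> 1.0
```

No protocol optimization can beat this physical limit, which is why moving content closer to users is so effective.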
