1 Bandwidth and Latency
Network performance is measured in two fundamental ways: bandwidth (also called throughput) and latency (also called delay). The bandwidth of a network is given by the number of bits that can be transmitted over the network in a certain period of time. Latency, in contrast, is measured strictly in terms of time: it corresponds to how long it takes a message to travel from one end of the network to the other.
The relative importance of bandwidth and latency depends on the application. For a large file transfer, bandwidth is critical, whereas for an exchange of short messages latency dominates.
The delay × bandwidth product, introduced in the next section, corresponds to how many bits the sender must transmit before the first bit arrives at the receiver, assuming the sender keeps the pipe full. It then takes another one-way latency before a response from the receiver arrives. If the sender does not fill the pipe, that is, if it does not send a whole delay × bandwidth product's worth of data before stopping to wait for a signal, it will not fully utilize the network.
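To make the contrast concrete, here is a minimal sketch in Python. The 50-ms latency and 45-Mbps bandwidth are borrowed from the example in the next section, and the 25-MB image from the digital library example later on; the 1-KB request size and the combination of these numbers are assumptions for illustration only.

```python
# Minimal sketch: how latency and bandwidth each contribute to the time
# for the last bit of a message to arrive at the receiver.

def transfer_time(size_bits: float, bandwidth_bps: float, latency_s: float) -> float:
    """One-way latency plus the time to push all bits onto the link."""
    return latency_s + size_bits / bandwidth_bps

LATENCY_S = 0.05        # 50 ms one-way latency (from the next section's example)
BANDWIDTH_BPS = 45e6    # 45 Mbps channel (from the next section's example)

large_image_bits = 25 * 8e6    # 25-MB image
short_request_bits = 1 * 8e3   # 1-KB request (assumed size)

print(transfer_time(large_image_bits, BANDWIDTH_BPS, LATENCY_S))    # ~4.49 s, bandwidth-dominated
print(transfer_time(short_request_bits, BANDWIDTH_BPS, LATENCY_S))  # ~0.0502 s, latency-dominated
```

For the large transfer nearly all of the time is spent transmitting bits; for the short message nearly all of it is the one-way latency.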
2 Delay × Bandwidth Product
The product of these two metrics is often called the delay × bandwidth product. If we think of a channel between a pair of processes as a hollow pipe, where the latency corresponds to the length of the pipe and the bandwidth gives its diameter, then the delay × bandwidth product gives the volume of the pipe: the maximum number of bits that could be in transit through the pipe at any given instant.
For example, a transcontinental channel with a one-way latency of 50 ms and a bandwidth of 45 Mbps is able to hold
50 × 10⁻³ sec × 45 × 10⁶ bits/sec = 2.25 × 10⁶ bits
or approximately 280 KB of data. In other words, this example channel (pipe) holds as many bytes as the memory of a personal computer from the early 1980s could hold.
The delay × bandwidth product is important to know when constructing high-performance networks because it corresponds to how many bits the sender must transmit before the first bit arrives at the receiver.
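The arithmetic is simple enough to check directly; the following Python sketch just restates the 50-ms / 45-Mbps example from this section.

```python
# Delay x bandwidth product for the example channel in this section.

def delay_bandwidth_product_bits(latency_s: float, bandwidth_bps: float) -> float:
    """Maximum number of bits that can be 'in the pipe' at one instant."""
    return latency_s * bandwidth_bps

bits_in_flight = delay_bandwidth_product_bits(50e-3, 45e6)
print(bits_in_flight)            # 2.25e6 bits
print(bits_in_flight / 8 / 1e3)  # 281.25 KB, roughly the 280 KB quoted above
```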
3 High-Speed Networks
The bandwidths available on today’s networks are increasing at a dramatic rate, and there is eternal optimism that network bandwidth will continue to improve. This causes network designers to start thinking about what happens in the limit or, stated another way, what the impact on network design is of having infinite bandwidth available. Although high-speed networks bring a dramatic change in the bandwidth available to applications, in many respects their impact on how we think about networking comes in what does not change as bandwidth increases: the speed of light.
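One way to see this is to compare transmit time with latency as bandwidth grows. The sketch below uses an assumed 1-MB file and an assumed 100-ms round-trip time (neither figure comes from this section); however large the bandwidth becomes, the total time never drops below the round-trip time, which is fixed by distance and the speed of light.

```python
# As bandwidth grows, total transfer time is bounded below by latency.
# The 1-MB file size and 100-ms RTT are assumed values for illustration.

RTT_S = 0.1              # assumed cross-country round-trip time
FILE_BITS = 1 * 8e6      # assumed 1-MB file

for bandwidth_bps in (1.5e6, 45e6, 1e9, 10e9):
    transmit_s = FILE_BITS / bandwidth_bps
    total_s = RTT_S + transmit_s      # protocol overheads ignored for simplicity
    print(f"{bandwidth_bps / 1e6:>8.0f} Mbps: transmit {transmit_s:.4f} s, total {total_s:.4f} s")
```

At 1.5 Mbps the transmit time dominates; at 10 Gbps the total is essentially just the 100-ms round trip.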
4 Application Performance Needs
So far we have taken a network-centric view of performance; that is, we have talked in terms of what a given link or channel will support. The unstated assumption has been that application programs have simple needs: they want as much bandwidth as the network can provide. This is certainly true of a digital library program retrieving a 25-MB image; the more bandwidth that is available, the faster the program will be able to return the image to the user.
Some applications, however, can state an upper limit on how much bandwidth they need. A video application that must deliver fixed-size frames at a rate of 30 frames per second, for example, might request a throughput rate of 75 Mbps. The ability of the network to provide more bandwidth is of no interest to such an application, because it has only so much data to transmit in a given period of time.
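The 30 frames per second and 75 Mbps figures pin down the per-frame budget; the short sketch below simply works that arithmetic in both directions (the 312.5-KB frame size is what those two numbers imply, not a value stated in the text).

```python
# Relating frame rate, frame size, and requested throughput.
# 30 fps and 75 Mbps are from the text; the frame size is what they imply.

FRAME_RATE_FPS = 30
REQUESTED_BPS = 75e6

bits_per_frame = REQUESTED_BPS / FRAME_RATE_FPS
print(bits_per_frame)            # 2.5e6 bits per frame
print(bits_per_frame / 8 / 1e3)  # 312.5 KB per frame

def required_throughput_bps(frame_bytes: float, fps: float) -> float:
    """Throughput an application would request for a fixed frame size and rate."""
    return frame_bytes * 8 * fps

print(required_throughput_bps(312.5e3, 30) / 1e6)   # 75.0 Mbps
```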