Over the last 20 years the number one recommendation to resolve network experience problems has been 'get more bandwidth'. However, despite bandwidth increasing by several orders of magnitude, a poor user experience still remains. Today it is therefore not surprising that our global fixation with bandwidth fuels an almost immediate willingness to invest in more bandwidth when encouraged to do so. Literally several million bandwidth tests, commonly referred to as speed tests, are run every day in the USA alone. More importantly, our desire for so-called 'speed' drives an unnatural passion to persistently measure our bandwidth as if our lives depended on it! It is clear that the world has bought into the marketing message that more bandwidth is faster and that faster is clearly better than slower.

To be more specific, it is a common belief that 100Mbps is faster than 50Mbps and will clearly deliver a better user experience. Is this true? Consider a new PC that is ordered from Amazon: most will agree that 'next day' delivery is faster than 'second day' delivery. For the purchaser, faster is therefore defined by the time it takes the PC to arrive; it is hard to debate that arriving sooner is faster than arriving later. To measure the purchaser experience, does it help to know that Amazon's maximum shipping rate is 20,000pph (packages per hour)? The answer is 'No', as a package rate limit has no tangible correlation to delivery time. What about distance: is distance to the customer material to the experience? For example, will a package sent to Denver incur a different delivery experience than a package sent to Sydney from the same shipping point? Most probably 'Yes'.

Like packages per hour, bandwidth is also defined as a rate, namely bits per second (bps). Should it therefore be expected that bps and pph share similar traits? After all, is sending a package to a customer not unlike sending a network packet to an online user? In particular, just as with packages, would you expect a greater distance to the online user to consume more time than a shorter distance? If the answer is yes, then it should also be expected that a network connection can incur variations in delivery time based on network-related factors such as wireless versus wired, or the performance of a cheap $60 router versus one costing $10,000. Interestingly, network time, known as latency, is well defined, but unlike bandwidth it is seldom considered or referenced. Latency is a measure of the time taken for a packet to travel to the destination and back (round-trip time, RTT) and is expressed in milliseconds (ms).

Latency defines one of the most important aspects of packet delivery. Consider a 'cloud' application service provider and two customers: the connection latency to customer 'a' is 10ms and the connection latency to customer 'b' is 20ms. If a packet is sent at the same time to both 'a' and 'b', which customer should get the packet first? Obviously the 10ms connection should deliver it first, because 10ms is shorter in time than 20ms. That being accepted, would you now expect a different result if the 10ms connection's bandwidth was 50Mbps and the 20ms connection's was 100Mbps? Most probably not, and yet our passion for bandwidth says otherwise. In short, time is directly material to the user experience, whereas bits per second is open to question.

To clarify the time/rate disconnect further, consider the following. A car comes to a fork in a road, and both the left and right directions are signed 'To Airport'. If the left fork is a 70mph 8-lane freeway and the right fork is a 30mph single-lane shared road, should the car go left or right? If 'speed' is the criterion for selection, then the left fork to the freeway is clearly over two times faster. However, if distance is added into the equation this choice may change. For example, if the left sign shows 'to airport, 35 miles' and the right shows 'to airport, 10 miles', the time assessment favours the right fork: the airport is a 30min journey via the freeway versus a 20min journey via the single-lane road. The 'time to destination' therefore defines a material parameter for the driver's choice. Does the 'cars per hour' rate of the road directly relate to the experience? No, it does not! As with the PC, arriving sooner is faster than arriving later, or put in 'experience terms': arriving later can lead to a missed flight, which is not a good experience.

Why measure bandwidth at all? Bandwidth essentially defines capacity. The bandwidth rate is one of two parameters that defines a finite limit of a pipe.
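The round-trip definition of latency used in this article is what utilities such as ping report. A minimal sketch of the measurement idea, using an in-process TCP echo over loopback (a TCP analogue of ping; the payload and the throwaway echo server are illustrative assumptions, not from the article):

```python
import socket
import threading
import time

def run_echo_server(sock: socket.socket) -> None:
    """Accept one connection and echo whatever it receives."""
    conn, _ = sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# Loopback echo server on an OS-assigned port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=run_echo_server, args=(server,), daemon=True).start()

# Round-trip time: send a packet, time how long it takes to come back.
client = socket.create_connection(server.getsockname())
start = time.perf_counter()
client.sendall(b"ping")
client.recv(1024)
rtt_ms = (time.perf_counter() - start) * 1000
client.close()
server.close()
print(f"RTT: {rtt_ms:.3f} ms")
```

Over loopback the RTT is typically well under a millisecond; across a real network it is dominated by distance and the hops in between, which is exactly the point of the article.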
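The cloud-provider example can be made quantitative. Delivery time for a single packet is roughly propagation latency plus serialization time (packet size divided by bandwidth); a minimal sketch, assuming a 1,500-byte packet and a simple one-way model (both are illustrative assumptions, not from the article):

```python
def delivery_time_ms(latency_ms: float, bandwidth_mbps: float,
                     packet_bytes: int = 1500) -> float:
    """One-way delivery time for a single packet, in milliseconds."""
    serialization_ms = (packet_bytes * 8) / (bandwidth_mbps * 1_000_000) * 1000
    return latency_ms + serialization_ms

# Customer 'a': 10 ms latency on a 50 Mbps link.
# Customer 'b': 20 ms latency on a 100 Mbps link.
a = delivery_time_ms(10, 50)    # 10 ms + 0.24 ms serialization = 10.24 ms
b = delivery_time_ms(20, 100)   # 20 ms + 0.12 ms serialization = 20.12 ms
print(f"a: {a:.2f} ms, b: {b:.2f} ms")
```

Doubling the bandwidth shaves 0.12 ms off customer 'b', while the 10 ms latency difference dominates: for small packets, latency, not bandwidth, decides who gets the packet first.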
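The fork-in-the-road arithmetic can be checked directly with time = distance / speed; a short sketch:

```python
def minutes_to_destination(miles: float, mph: float) -> float:
    """Travel time in minutes for a given distance at a given speed."""
    return miles / mph * 60

left = minutes_to_destination(35, 70)   # 70 mph freeway: 30.0 minutes
right = minutes_to_destination(10, 30)  # 30 mph single-lane road: 20.0 minutes
# The road with less than half the 'rate' wins on the metric that
# matters to the driver: time to destination.
print(left, right)  # 30.0 20.0
```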