02-08-2018 07:28 AM
Good Day, I just upgraded to AT&T Fiber at 1000 Mbps. I'm only getting ~200 Mbps (+/- 100 Mbps) about 5 feet from the WiFi node.
1. Anything I can do to improve that?
2. Why such a huge drop from modem to Velop WiFi system?
3. Is it possible to do anything to get closer to the full speed, rather than this fraction of it?
02-08-2018 07:39 AM - edited 02-08-2018 07:39 AM
This information may explain why you will not see a 1-gigabit wireless connection with current wireless technology.
Below is a quote from this Intel article:
The underlying assumption is that both your Wi-Fi AP (i.e. wireless router) and your client (i.e. device, like a laptop) are using the same Wi-Fi standard and configuration. For example, the maximum data rate for 802.11ac 2×2 is 867 Mbps.
The "max data rate ÷ 2" part of the equation broadly estimates actual throughput by accounting for network overhead and environmental factors explained earlier; that number is then divided by the number of clients sharing the bandwidth to arrive at the maximum throughput per client. Keep in mind that the formula doesn't account for the fact that data rate and throughput also decrease as clients move away from the AP.
The following shows how the throughput for an 802.11ac 2×2 Wi-Fi device decreases as you add clients:
(867 Mbps ÷ 2) ÷ 1 client = ~433 Mbps per client
(867 Mbps ÷ 2) ÷ 2 clients = ~216 Mbps per client
(867 Mbps ÷ 2) ÷ 3 clients = ~144 Mbps per client
Now compare 802.11ac throughput with 802.11bgn 1×1 Wi-Fi commonly found in devices today:
(150 Mbps ÷ 2) ÷ 1 client = ~75 Mbps per client
(150 Mbps ÷ 2) ÷ 2 clients = ~38 Mbps per client
(150 Mbps ÷ 2) ÷ 3 clients = ~25 Mbps per client
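The estimates above can be reproduced with a short sketch (the formula and the 867/150 Mbps data rates come from the quoted Intel article; the function name is mine, and the rounding in the article's figures is approximate):

```python
def per_client_throughput(max_data_rate_mbps, clients):
    """Rough per-client Wi-Fi throughput: (max data rate / 2) / clients.

    Halving the data rate broadly accounts for protocol overhead and
    environmental factors; dividing by the client count models shared
    airtime. Distance from the AP is not modeled.
    """
    return (max_data_rate_mbps / 2) / clients

# 802.11ac 2x2 (867 Mbps max data rate) vs. 802.11n 1x1 (150 Mbps)
for rate in (867, 150):
    for n in (1, 2, 3):
        print(f"{rate} Mbps, {n} client(s): ~{per_client_throughput(rate, n):.1f} Mbps each")
```

Even the best case here (~433 Mbps for a single 802.11ac 2x2 client) is well under half of a gigabit fiber connection, which is why a wired Ethernet test is the only fair way to verify the full 1000 Mbps service.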
What seemed like more speed than you might need quickly shrinks to speeds that you do need, especially in a modern home where multiple family members are streaming video, playing online games, or downloading files across multiple devices. In essence: the higher the maximum data rate, the more throughput is left for each client.