There are few things more annoying than wireless dropouts. The IT admins amongst you will almost certainly hear "The Wi-Fi is down" echo through the corridors outside your office. That's fine when the Wi-Fi actually is down, but often it's not. Frustrating stuff, right?
The wireless infrastructure is often the fall guy for anyone not able to connect to the content they want. But there are loads of other common reasons why a device loses connectivity.
So, are there any preventative measures that can be taken by IT techs to mean less moaning? We set out our top 4 issues and some proposed routes to avoid them.
Mismatch between wireless infrastructure and device configuration
The 802.11ac Wi-Fi standard has become pretty popular over the past 2 years. So it only follows that when education and public sector institutions are looking to put a new wireless network in, they have a thirst for the latest and greatest wireless connectivity. This can sometimes result in what might be called (by someone like me who likes making new phrases up) connectivity dissonance – that is, a mismatch between the connectivity standards of a device and the connectivity configuration of a wireless network.
If a network has been designed and optimised for 802.11ac but there are legacy devices with b/g/n-only wireless cards, problems can arise: while the ac standard is designed to be backwards compatible, it doesn't always work so well in practice. The result is a large investment in a wireless network, and a set of devices unable to communicate any quicker than they did previously.
What you can do about it
Getting the right fit between the wireless network and the devices that will operate on it is key. Ideally, organisations should seek to invest in devices with 802.11ac-compliant wireless cards to improve the quality and speed of connections from access points (APs) to the device. That said, if the budget can't stretch, why not just get new wireless cards and retrofit them into the older devices? Alternatively, if push comes to shove, there are different ways (depending on the wireless vendor) of segmenting traffic and creating a better experience for legacy devices on the network. This is something your wireless networking partner will be able to help with.
Also, the radio frequency (RF) properties of most wireless tech can be tweaked to provide better support for legacy devices. Most quality devices purchased within the last 2 years will have 802.11ac cards in them – so speak to your end user computing partner for advice.
Buggy drivers on devices and firmware on controllers
Buggy or unreliable wireless card drivers are a common cause of wireless dropouts. Each wireless card manufacturer will provide drivers for their products to work with Windows. For Windows 8 through 10, device drivers are configured by default to download and install automatically. Sounds good in theory, but in practice there can often be problems: just because a driver is newer doesn't mean it's the best driver for that device.
The same notion applies to the 'brain' of the wireless network itself. As standards change and technology progresses, updates are issued to wireless solutions that must be applied to ensure continuing high performance. Some newer, cloud-based solutions do this automatically, but legacy controller-based platforms require manual intervention to keep up with the times.
What you can do about it
For your devices, you should prevent drivers from being downloaded automatically. Instead, IT admins should use network deployment tools like Windows Deployment Services, WSUS, or preferably System Center Configuration Manager, to manage Microsoft updates centrally. This gives an important chance for drivers to be tested on devices before any rollout. We also recommend utilising network card manufacturers' own drivers rather than Microsoft default drivers where possible.
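On Windows 10, for example, automatic driver delivery can be switched off with the "Do not include drivers with Windows Updates" Group Policy (under Computer Configuration > Administrative Templates > Windows Components > Windows Update), or its registry equivalent. A sketch of the registry route is below – check the policy's availability against your exact Windows version before deploying, as older releases use different settings:

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
; 1 = exclude driver updates from Windows Update
"ExcludeWUDriversInQualityUpdate"=dword:00000001
```

With this in place, drivers can then be staged and tested through your deployment tooling instead of arriving unannounced.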
Lack of Wi-Fi coverage
A really common reason for disconnects is simply a lack of Wi-Fi coverage. It's not uncommon for the wireless infrastructure to be refreshed only once every 4 or 5 years. Collaboration hubs and location-independent working spaces mean there's a constant creep of additional locations where coverage is required – often in volumes not previously considered. This can mean users experience wireless dropouts because they've roamed out of range, or have moved to a location the coverage was never really designed to reach.
What you can do about it
It should go without saying that a full site survey should be carried out when your wireless network is installed, whether you do it yourself or through a third party. In any case, heat maps are a great visual tool for identifying areas where wireless dropouts are likely to occur.
I think it's also best practice to periodically rescan – perhaps once a year or so – to accommodate the fact that both working and learning spaces are now designed to be far more flexible in nature. This will reduce the number of disconnects users report and is just part of good wireless infrastructure housekeeping.
The more common cause these days is access point (AP) density.
Most sites have near site-wide coverage. But if the wireless infrastructure was designed to support 500 devices and that grows to 1,000, the APs can't cope with the number of concurrent connections, and users get the impression they've dropped off the network – even in areas with full signal strength. This is because an AP can only communicate with one client at a time. The solution is to add more APs to balance the load, but this requires careful planning to ensure the APs don't interfere with one another. As 802.11ac Wave 2 becomes more common on devices, upgrading APs to Wave 2 will have a dramatic effect on high-density support, as a Wave 2 AP can communicate with multiple clients concurrently.
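The back-of-the-envelope maths behind AP density planning can be sketched in a few lines. The per-AP client ceiling used here is an assumption – real limits depend on vendor, radio count and traffic profile, so check your vendor's datasheet:

```python
import math

def aps_needed(device_count: int, clients_per_ap: int = 30) -> int:
    """Rough minimum AP count for a given concurrent device load.

    clients_per_ap is an assumed practical ceiling per access point
    (figures of 25-50 concurrent clients per AP are commonly quoted).
    """
    return math.ceil(device_count / clients_per_ap)

# A network sized for 500 devices vs the same estate at 1,000 devices:
print(aps_needed(500))   # 17 APs at ~30 clients each
print(aps_needed(1000))  # 34 APs: double the devices, double the APs
```

It's deliberately crude, but running the numbers like this before a device rollout makes the "we doubled our devices, why is the Wi-Fi broken?" conversation much easier to pre-empt.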
Too many devices competing for the airwaves
The average number of wireless-enabled devices per person – and per student – has risen over the last 5 years from 1 to over 2.5. As the world moves towards the realisation of the 'internet of things' (buzzword alert), the sheer volume of devices operating in the same radio spectrum as your wireless network keeps growing rapidly.
As the standard moves from the legacy 2.4GHz radio frequency to 5GHz, more non-overlapping channels become available for wireless devices to use. However, volume again compounds any issues: you'll often find that mobile phones in pockets, constantly sweeping for wireless networks, exhaust the available network capacity in a confined area very quickly.
This makes communal gathering spots like canteens and libraries particularly pesky when it comes to performance.
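To make the channel point concrete, here's a simplified sketch of why only three 2.4GHz channels (1, 6 and 11) are commonly treated as non-overlapping. It models channels as clean 20MHz blocks, which understates real-world spectral bleed, but the arithmetic captures the core constraint:

```python
def overlaps(ch_a: int, ch_b: int, width_mhz: int = 20) -> bool:
    """True if two 2.4GHz Wi-Fi channels' signals overlap.

    2.4GHz channel centres sit 5MHz apart, but each transmission is
    ~20MHz wide - so channels fewer than width/5 numbers apart collide.
    Simplified model: real signals bleed beyond their nominal width.
    """
    return abs(ch_a - ch_b) * 5 < width_mhz

print(overlaps(1, 6))   # False - 25MHz apart, clear of each other
print(overlaps(1, 3))   # True  - only 10MHz apart, they interfere
print(overlaps(6, 11))  # False - the classic 1/6/11 plan works
```

The 5GHz band sidesteps this because its channels are spaced a full channel-width apart, which is exactly why pushing capable devices onto 5GHz helps in dense areas.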
What you can do about it
Segmenting your traffic and taking advantage of the dual-band technology that most wireless infrastructures now have at their disposal will make sure that increasing numbers of devices don't affect your staff's ability to do their jobs.
By putting all priority device traffic on the 5GHz band and other systems like PA systems and heating controls on the 2.4GHz band, you can protect the productivity of your people.
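In practice this segmentation boils down to a simple mapping from device class to band, usually expressed through SSIDs and band steering in your vendor's controller. A toy sketch of the planning logic – the device categories and band choices here are illustrative assumptions, not a recommendation for your estate:

```python
# Hypothetical device-class to band plan for traffic segmentation.
BAND_PLAN = {
    "staff_laptop":    "5GHz",    # priority users get the faster, cleaner band
    "student_tablet":  "5GHz",
    "pa_system":       "2.4GHz",  # low-bandwidth building systems
    "heating_control": "2.4GHz",
}

def band_for(device_class: str) -> str:
    """Return the planned band for a device class.

    Unknown kit defaults to 2.4GHz so it can't crowd the priority band.
    """
    return BAND_PLAN.get(device_class, "2.4GHz")

print(band_for("staff_laptop"))     # 5GHz
print(band_for("vending_machine"))  # 2.4GHz - unlisted kit stays off 5GHz
```

Writing the plan down like this, even informally, forces the useful question: for every new class of device, which band does it belong on, and what does it displace?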
You mustn't underestimate the importance of a planned approach, either. Historically we'd have recommended putting a 5-year plan in place. However, with the rate of technological innovation getting quicker by the year, it's getting more and more difficult to look that far into the future and form an opinion. So go with a 3, 4 or 5-year plan – whatever fits with your wider tech strategy and your comfort in making predictions about the future.