Access point comparison: performance test

Question · Updated 1 year ago
Are there any tests available in which you can compare access points from various vendors with regard to throughput, etc.?

I found this test, apparently conducted by an academic and research network—CARNet—but, yeah...

http://a030f85c1e25003d7609-b98377aee968aad08453374eb1df3398.r40.cf2.rackcdn.com/other/carnet-wifi-t...
oc · Posted 2 years ago
tobiaslinder
It would be quite a shock if the Aerohive APs really performed in such a mediocre way. Can anyone chime in in Aerohive's defense?
Crowdie, Champ
I have always believed that access point "bake-offs" are used by wireless vendors who don't have the features to compete. I can't remember the last time in a discovery session with an enterprise customer that they listed "single access point speed" as a requirement.

If you want a real test, deploy 500 access points in a four-level public hospital with several thousand staff laptops, tablets and smartphones, 2.4 GHz Vocera badges used by the medical staff, and a thousand or so guests with everything from high-end laptops to cheap, nasty Android smartphones - the ones that telcos give away free with prepay accounts. Configure the wireless network to support twenty or so different use cases and wait a week. Once all the "feedback" starts rolling in, how easy is it to implement the requested changes? That's a real test.
Ben Littleton
Crowdie, are you using Vocera in an Aerohive network? If so, how is the performance?
Crowdie, Champ
The biggest issue we experienced was contention on the 2.4 GHz wireless network. The Vocera badge, like any VoWiFi device, must have instant access to the medium or call quality is affected. As the 2.4 GHz spectrum only has three non-overlapping channels, avoiding 2.4 GHz co-channel contention in a dense wireless network is extremely difficult. The other issue we had was at shift change, as all the nurses would power on their Vocera badges by the nurses' station, putting excessive load on the three radios covering this area.
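The "three non-overlapping channels" figure can be sanity-checked in a few lines of Python. This is a minimal sketch using the standard 2.4 GHz channel-center formula and a 20 MHz OFDM channel width (802.11b channels are 22 MHz wide, which gives the same answer):

```python
# Approximate 2.4 GHz channel model: channel n (1-13) has its center
# at 2407 + 5*n MHz; an 802.11 OFDM channel is 20 MHz wide.
def center_mhz(ch):
    return 2407 + 5 * ch

def channels_overlap(ch_a, ch_b, width_mhz=20):
    # Two equal-width channels overlap if their centers are closer
    # together than one channel width.
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < width_mhz

# Channels 1, 6 and 11 are the classic mutually non-overlapping set:
print(any(channels_overlap(a, b) for a, b in [(1, 6), (6, 11), (1, 11)]))  # False

# Every other channel in the US plan overlaps at least one of them:
others = [ch for ch in range(1, 12) if ch not in (1, 6, 11)]
print(all(any(channels_overlap(ch, base) for base in (1, 6, 11)) for ch in others))  # True
```

This is why a dense 2.4 GHz deployment inevitably reuses the same three channels, and why co-channel contention is so hard to avoid.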
tobiaslinder
Crowdie, I agree with you that throughput is not everything, but I have schools where, at the end of the lesson, pupils want to upload their files to the server, and when 25 users do this at the same time the throughput is really not satisfying.
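The arithmetic behind that complaint is simple to illustrate. As a back-of-the-envelope sketch (the 100 Mbit/s aggregate figure is an assumption for illustration, not a measurement from this thread):

```python
# Hypothetical: an AP delivering ~100 Mbit/s of real-world aggregate
# throughput, shared evenly by 25 simultaneously uploading clients.
aggregate_mbps = 100.0   # assumed effective cell throughput
clients = 25
print(aggregate_mbps / clients)  # 4.0 Mbit/s per client, before contention overhead
```

In practice airtime contention among 25 simultaneous uploaders pushes the per-client figure even lower, which is why this scenario feels so slow.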
BJ, Champ
Personally, I think Aerohive requires no defense, and I'm not sure how much stock should be put in the opinion of the Croatian Academic and Research Network. This particular report is obviously Ruckus propaganda, given their logo slapped on every page and their top rankings in almost every test.

"Each of the following vendors chose to send an engineer to the tests with equipment in hand: Aerohive, Cisco, HP, and Ruckus." However, Aruba, Ubiquiti, Meraki, and Xirrus devices were also tested. Who configured these devices? Given the 2015 date, the Cisco engineer may have been responsible for the Meraki device, but I believe it is safe to say the HP engineer was probably not yet fully familiar with Aruba, if he/she was even responsible for that test.

Obviously consistent configurations were not used; optimization methods like channel bonding and beamforming were enabled in some tests and not others. No one should consider this a responsible report. Furthermore, the disparate systems represented tell me there was no intention of testing with parity and equity.

Crowdie nailed it. Throughput is only a portion of the endgame.   
Prasanna Chamala
Hi 

We at Alethea conducted a comparison and benchmarking test of four 802.11ac Wave 2 access points: Ubiquiti, Meraki, Aruba, and Ruckus. All four access points have similar features and comparable support. But in practice, when a stress test is done with client loads ranging up to 100, which access point performs better? Though Aerohive is not in the compared list, you can have a look at the link below for more information.

https://goo.gl/8Q35ib

Ubiquiti performed better in most of the test cases, followed by Aruba in the video streaming test and Meraki in the throughput test.
Tobias Linder
Hi
That Ubiquiti performed best is only surprising until you read the "tests were sponsored by Ubiquiti" remark at the end of the document :-)
Prasanna Chamala
Hi Tobias,

Thanks for going through the results!

Though Ubiquiti sponsored the tests, they were conducted independently by Alethea; Ubiquiti was involved in neither the test setup nor the execution of the tests and analysis of the results.
Crowdie, Champ
So a "cost-effective" Ubiquiti access point outperformed a Ruckus access point with dynamic beamforming? Something doesn't add up here.

Why were these settings used during the test?

  • Bandwidth 40 MHz for 2.4 GHz band (page 12)
  • Power set to Maximum (page 12)

I am not aware of any enterprise wireless vendor who would recommend these settings, but I know several who would strongly advise against them.
Crowdie, Champ
If you configure 40 MHz-wide 2.4 GHz channels, then you only have one non-overlapping channel across the site. This is just not realistic. Why would you configure a performance test with a setting nobody would use?
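The arithmetic behind the one-channel claim is straightforward. A sketch assuming the FCC channel plan (channels 1-11, so the usable band runs from the bottom edge of channel 1 to the top edge of channel 11):

```python
# US (FCC) 2.4 GHz band edges, using 10 MHz half-widths around the
# channel 1 (2412 MHz) and channel 11 (2462 MHz) center frequencies:
band_low_mhz = 2412 - 10    # bottom edge of channel 1
band_high_mhz = 2462 + 10   # top edge of channel 11
usable_mhz = band_high_mhz - band_low_mhz   # 70 MHz

# How many non-overlapping channels fit at each channel width?
for width in (20, 40):
    print(width, usable_mhz // width)
# 20 MHz -> 3 channels; 40 MHz -> only 1
```

With a single non-overlapping 40 MHz channel, every AP on the site is co-channel with every other, which is exactly the contention problem described above.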

You want the 2.4 GHz transmit power to be around 15 dBm to match the average transmit power of smartphones and tablets. If an access point's 2.4 GHz transmit power is 20 dBm, that is more than three times as much. The two largest wireless vendors, Aruba (ClientMatch) and Cisco (Dynamic Transmit Power Control), created systems to counter this power imbalance.
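The ratio between those two power levels can be checked with a quick dBm-to-milliwatt conversion (a minimal sketch; the 15 dBm and 20 dBm figures are the ones quoted above):

```python
# Convert dBm (decibels relative to 1 mW) to milliwatts.
def dbm_to_mw(dbm):
    return 10 ** (dbm / 10)

ap_mw = dbm_to_mw(20)       # 100.0 mW
client_mw = dbm_to_mw(15)   # ~31.6 mW
print(round(ap_mw / client_mw, 2))  # 3.16, i.e. the AP transmits ~3x the client's power
```

A 5 dB difference is always a factor of about 3.16, which is why an AP at maximum power can be heard by clients that cannot reply back to it.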
Prasanna Chamala
Hi Crowdie,

As mentioned earlier, the test was conducted in a controlled environment (in a classroom on a large college campus with no other APs nearby), and hence we put 40 MHz in the test specification for 2.4 GHz.

The same may not hold for other places where there are too many APs (a large office complex, public places).

The goal of the test is to measure performance in a relatively controlled environment. For evaluating performance with interference, we would instead turn Auto Channel Select and Auto Channel Bandwidth on.

- Prasanna
Crowdie, Champ
I think the point I am making is that you took products, like Meraki, designed for enterprise deployments and tested them like SME products. That is a bit like racing a tractor against a car and then saying the tractor's performance was poor. Repeat the same test in a wet field and see how far the car goes.
Desmond
Crowdie, I have absolutely got to jack this analogy from you. It describes the situation 100%.

Prasanna Chamala
Crowdie, thanks for your prompt replies every time!

I don't agree with your analogy. All the APs under test support similar features, and the only contrast is the price. Please note that the settings we configured did not give an edge to one AP over another. We clearly state the test conditions, and our claims apply only within those conditions.

For the past seven years, we have worked with leading wireless OEMs, chip manufacturers, and service providers. We understand the generic test requirements of each segment of the wireless industry quite well. The products and solutions that we offer address challenges in the test and measurement space of the industry.

However, we understand that testing requirements may differ depending on numerous factors. If you have different test requirements for your product benchmarking, or want to test your product in the lab or field, please contact our sales team at info@alethea.in.