Wi-Fi Alliance Brings Wi-Fi 6 into 6 GHz (wi-fi.org)
225 points by dmmalam on Jan 6, 2020 | 202 comments


A lot of marketing speak and not much real content. I found this article by Cisco, which seems a lot better at explaining the importance of 6 GHz:

https://blogs.cisco.com/enterprise/wi-fis-new-6ghz-spectrum-...


So to summarize, 6GHz offers a wider frequency range, allowing for more and wider channels, and is not used by anything else. As long as WiFi 6 is the only thing using the specific frequency band, the AP can assign devices timeslots for both sending and receiving, improving both performance (fewer collisions) and battery life (no need to keep the radio on outside assigned timeslots).
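A toy illustration of the battery-life side (the numbers are made up; real Wi-Fi 6 Target Wake Time schedules are negotiated per client):

```python
# Sketch: why scheduled timeslots (as in Wi-Fi 6 OFDMA / Target Wake Time)
# save battery. A client that knows its slot can sleep the rest of the time;
# a legacy client must keep its radio listening continuously.
# All numbers here are illustrative, not measured.

beacon_interval_ms = 100.0   # AP schedule period
slot_ms = 2.0                # client's assigned transmit/receive window

always_on_duty = 1.0                     # legacy: radio on 100% of the time
twt_duty = slot_ms / beacon_interval_ms  # scheduled: on only during its slot

print(f"legacy radio duty cycle: {always_on_duty:.0%}")     # 100%
print(f"scheduled radio duty cycle: {twt_duty:.0%}")        # 2%
```

The real savings are smaller (wake-up overhead, retransmissions), but the order of magnitude is the point.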


Let's note the "not used by anything else" is probably the most important thing, because of https://ieeexplore.ieee.org/document/1208921

Basically, because WiFi gives all devices a (more or less) equal opportunity to transmit, old devices or devices far away slow down the network for fast devices: their lower transmission rates mean each of their frames occupies a disproportionate amount of airtime.
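A quick back-of-the-envelope sketch of that effect (illustrative PHY rates, ignoring protocol overhead):

```python
# Sketch of the 802.11 "performance anomaly": with per-frame fairness,
# every station's throughput collapses toward the slowest station's rate,
# because the slow station's frames dominate the shared airtime.

def shared_throughput(rates_mbps, frame_bits=12_000):
    """Per-station throughput (Mbit/s) when each station sends one
    equal-sized frame per round on a shared channel."""
    round_airtime = sum(frame_bits / (r * 1e6) for r in rates_mbps)
    return frame_bits / round_airtime / 1e6

fast_only = shared_throughput([54, 54, 54])
mixed = shared_throughput([54, 54, 1])  # one legacy 1 Mbit/s client joins

print(f"three fast clients: {fast_only:.1f} Mbit/s each")   # ~18 Mbit/s
print(f"with one slow client: {mixed:.2f} Mbit/s each")     # ~0.96 Mbit/s
```

One slow client drags everyone down to roughly its own rate, which is why a clean 6 GHz band with no legacy devices matters so much.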

Greenfield mode (in 802.11n) was already an attempt at "ignoring" old devices, but it never really took off.


It's also much more line-of-sight, and more easily absorbed by concrete and denser construction materials (think rooms away, through layers of wood and drywall).

2.4GHz = 5 in

6GHz = 2 in

The inches are the wavelength height from base to top... Meaning that's pretty much what it can get around without impeded.


Wavelength is not a height, but the distance between points of the same phase in the propagating electromagnetic field.

Propagating EM fields are absorbed and reflected through multiple effects. Light is an electromagnetic field with a wavelength of ~100s of nm, yet it can pass through glass but not a thin sheet of aluminum foil.


> The inches are the wavelength height from base to top... Meaning that's pretty much what it can get around without impeded.

Yeah, that's not what it means.

  λ = c / f
"c" being the speed of light in a vacuum (299,792,458 m/s); wavelength in meters, frequency in hertz -- if memory serves!

(Not sure how I remember that one 25 years after my first ham radio exam but some things just stick in my mind, I guess).
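Plugging the bands under discussion into that formula (a quick Python sanity check; values rounded):

```python
# lambda = c / f for the Wi-Fi bands in this thread.
C = 299_792_458  # speed of light in a vacuum, m/s

def wavelength_cm(freq_ghz):
    """Wavelength in centimetres for a frequency given in GHz."""
    return C / (freq_ghz * 1e9) * 100

for f in (2.4, 5.0, 6.0):
    print(f"{f} GHz -> {wavelength_cm(f):.1f} cm")
# 2.4 GHz -> 12.5 cm (~4.9 in)
# 5.0 GHz ->  6.0 cm (~2.4 in)
# 6.0 GHz ->  5.0 cm (~2.0 in)
```

Interestingly, the grandparent's inch figures do roughly match these wavelengths; it's the physical interpretation ("what it can get around") that's off.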


Units don't matter for formulas of quantities.

For electrical formulas the unit system matters, but not the units.

lambda = c / f

Is true for any combination of units.


Very well, lambda in feet and frequency in nanohertz!


Deeply uninspiring.

5GHz gets blocked by even thin brick walls. We all live in castles in Europe, so 5GHz practically only works line-of-sight. That makes it pretty expensive and cable-intensive to deploy [well] in a large building.

It's also disheartening when a new standard comes out and device manufacturers flock to replace all their old single-radio, single-antenna devices with shiny new... single-radio, single-antenna devices. Wave2/MU-MIMO should be a base requirement.

Is there any progress on improving longer-band wifi? I.e. sub-2.4GHz, or 2.4GHz coordination between masts so devices can actually roam well?


Single antenna devices make sense for a lot of small/low power devices. Not to say manufacturers don't just cheap out sometimes though :).

802.11ah (2017) would be your most likely bet for low frequency in the short term. I have a 5-year bet going with another guy in the office that he won't be able to get a client and an AP with it by 2022 (i.e. he thought manufacturers would jump on it; I thought it was going to fizzle and 6/24/60 GHz would get the hardware action). So far it's looking like I'm going to win that lunch. There is also some talk about refreshing the 3.7 GHz space, but I'm not sure that'll happen, and if it did it'd probably be a ways out from real-world impact (if ever).

As far as improving 2.4 goes, all of the roaming standards improvements work in it, and Wi-Fi 6 works in it. It's really no different than 5 GHz/6 GHz in that regard; it just has a bunch of shitty legacy clients to compete with, which is amplified by the fact that it penetrates farther.


> 802.11ah (2017) would be your most likely bet for low frequency

900MHz is pretty nice for the 1-10mi range (this is what I do for my day job), but it never really made sense to me as part of the WiFi umbrella for that reason. It's also got a bit of a chicken-and-egg problem: without the correct antennas and chipsets in devices, it's useless in an AP, and vice versa. 2.4/5GHz have been around since 802.11a/b days, so continuing to iterate on those bands makes more sense.


Curious what you work on in the 900 MHz space :). I come from enterprise healthcare, so it's mostly about connecting very new consumer-type devices and very old pieces of shit.

I think the 802.11 group wanted to offer low-to-mid-speed Ethernet/IP plus traditional wireless security, managed the same as the rest of the devices, which makes some sense. I just didn't (and still don't) think the types of devices using 900 MHz care about fitting that mold enough to adopt it.

Pretty much everything 900 MHz has the chicken-and-egg scenario on hardware/protocol right now, so I didn't/don't count that in my reasoning. There's not really a single protocol with a large number of products deployed across a large number of verticals where you can safely say "we'll just attach to the existing 900 MHz radio at the customer site." Some protocols in some verticals, yes, but not enough that the choice is locked in by the chicken/egg problem blocking changes.

Overall, though, while 802.11 is going to continue iterating on 2.4/5, they are heavily interested in offering Wi-Fi in basically any unrestricted frequency: moving extremely high-bandwidth things to 60 GHz, low-bandwidth things to 900 MHz, and new clients to 6 GHz. They've come to realize they can come up with the coolest protocol in the world, but there is only so much you can do with the 60 MHz of space in 2.4 when devices with wireless-N chips are still being sold. There are even talks with hardware vendors gauging interest in the 50 MHz of space in the 3.7 range for things that should be on 2.4 but find 2.4 too crowded.


Edit: I just realized that out of all of that babbling I never said why I didn't think 900 MHz Wi-Fi would take off. They made some protocol changes to make it reasonable to use at low bandwidth and long range, but not enough. That, combined with long-range Wi-Fi itself already being a niche, almost guaranteed hardware manufacturers weren't going to be very interested. I think there was a company in France that actually started sampling some AH hardware, but it's only partial-spec and very low bandwidth.


I worked briefly in the radio group at Trimble Nav Ltd. around the DotCom times.

900 MHz and VHF CB bands didn't have great channel bandwidth but sure worked well at a distance. 2.4 GHz was marginal but useful, just as it is for WiFi. 5+ GHz is/was basically useless... and 802.11a proved that out. GPS and other satnavs use the L-band (1.2-1.6 GHz), which is a very sweet spot for many applications. Cell phones use/used 450 and 800 MHz through the L-bands, which makes sense.

5G is being deployed in bands (24-71 GHz) that implicitly try to sell as much equipment as possible, making the deployment as expensive as possible, and/or will waste an enormous amount of energy overcompensating for their horrible propagation. I predict 5G will die, not because of unfounded conspiracy theories, but because carriers will realize it's too expensive and users will balk at $400 USD mobile bills to fund this pointless adventure. I suspect they'll be forced to redo or abandon the emperor's new clothes when they're discovered not to be a substitute for a winter coat.


I'm a radio technician, not a hardware or software engineer. Most of our customers are things like water/wastewater districts, utility companies, and assorted state and federal government projects (people who can afford $1000 radios and need them to be rock solid for 10+ years). It's not all super exciting work, but we do get the chance to work on some pretty interesting stuff on occasion, and designing radio networks can be pretty rewarding.

Agree with you on the lack of protocol compatibility - I think it's seen as a competitive moat of sorts, but it can make it difficult to get into existing deployments due to the need to essentially rip and replace what's there. Unfortunately, vendors are also terrible at being compatible with their own legacy products due to the use of fixed-function chipsets, although that is changing with the move to SDR.

My email is in my profile if you want to reach out.


Yes, that's how it works in professional installations: you want a lot of APs that don't overlap, and using 5GHz makes this simpler than using 2.4GHz.

5GHz also has a lot more non-overlapping channels.

Just use powerline networking for your DS (distribution system).


What's the "that" in "that's how it works"? 5GHz APs in every room?

I'm not saying that 5GHz doesn't have benefits, just that many of them are hard to exploit when you have real walls.


Using higher GHz is all about selling more hardware.

Let’s use 500 GHz so you have to have one in every corner of every room for it to work. And don’t breathe or take a shower, because that will block the signal.


They do indeed have 5GHz APs in almost every room in most commercial installations in the US.


It's actually useful in high-density environments to have wireless signals absorbed. Say you're in an office: there may be 3-10 wireless devices per desk. Having those signals reach an access point and then die is a good thing. Another scenario is a stadium: it's best that transmissions don't propagate or reflect, or nobody will get a usable link.


I understand that not being backwards compatible means new wifi 6 devices can take better advantage of improvements, but it will also probably be a bit of a mess for years while millions of legacy devices muddle on, and years more before manufacturers make wifi 6-capable hardware the default.

Not only that: there's a reason many (most?) smart home devices still only support 2.4ghz (range, particularly through solid objects, I think), and I don't think that is going to change.

Or are we going to be stuck with wifi routers that have to implement every wifi protocol?


For a pretty long time I have disabled 802.11a/b/g on my AP: all devices on my home network must support 802.11n or better. It isn't really an issue, because 802.11n is actually really old. IIRC the presence of older 802.11a/b/g devices causes throughput to decrease for newer devices as well.

Late last year I also stopped providing 2.4GHz for my home network. The only inconvenience so far is just a wireless printer not supporting 5GHz, so I have to print using a USB cable.

The level of inconvenience would surely be different for everyone, but in my case it's worth it.


In some countries there are no 5GHz bands that don’t legally require DFS (which can kill your network for 30 minutes or so when interference is detected), and where I live 5GHz usually doesn’t make it across half the house (land is cheap as crap and everyone builds crazy large houses on it).

I have to wonder what you’re even doing that needs 5ghz, I guess you’re in the city and trying to escape interference?


Yes, I live in a city (though no way as dense as NYC) and try to escape interference.


I currently have several devices that are 2.4 only, so I'm stuck. The wireless printer I can probably work around if I move it out of the closet, but the wife won't let me and I love my wife more than I love wifi. There's also two game consoles, an iPhone 4, an egg minder, an e-book reader, and an iLamp on 2.4. Probably some other stuff I don't remember off the top of my head.

For me, it would be nice if all APs had the same power level. Currently the neighbor across the street and two houses up is putting out so much power that it drowns out my AP two rooms away.


There are mandated power levels which devices use. Some firmware allows you to ignore them.


Manually switch to a different channel - you can even use channel 13 if you are outside the US...


> Currently the neighbor across the street and two houses up is putting out so much power that it drowns out my AP two rooms away.

Give your neighbour a dose of their own medicine: get a directional high-gain antenna, set a spare WiFi AP to the same channel, and point the antenna at your neighbour's house. Then set up a looping script that copies data to constantly saturate that AP.

Not that I've ever done this, or advocate anyone actually doing it... ;D


Unless actual malice is involved, the neighbor will probably just wonder why the wlan is shit and buy a "better" (stronger) one. Sounds much like a good old coordination problem to me.


Yeah, it's a basic tragedy of the commons, where the commons is radio spectrum that's noisy to begin with.


Unfortunately, 2.4ghz is still pretty much the only option for something like a smart lightbulb or smart plug. For the printer, though, it seems like either the printer software or my router is smart enough to talk to it even though my PC is on the 5ghz SSID. The same can't be said of the bulbs and plugs: I had to return a bunch and swap them for Amazon Echo smart plugs, which are also only 2.4ghz, but the Echo itself is smart enough to work with them even though it, too, is on the 5ghz SSID.


Wouldn't something like Zigbee or Z-Wave be better for a lightbulb or plug?


Possibly, from a "Betamax" features point of view, but it’s not as pervasive as WiFi ("VHS").

My front door lock is Zigbee only because a WiFi camera acts as its WiFi-to-Zigbee bridge. Everything else is WiFi.

Maybe in the future smart home stuff will convert over, but that means one company or another will have to sell a bridge and be the gatekeeper. Perhaps home/ISP-provided access points should include Z-Wave/Zigbee radios that are open for FAANGs and smart home products to provision and configure?


Interesting.

802.11ac's ("WiFi 5") core benefit was (with enough antennas) running channels on both 2.4GHz and 5GHz and supporting devices duplexing across both channels for maximum speed/latency shaping. It made running 2.4GHz cool again, and as far as I'm aware it finally solved some of the compatibility issues between 802.11a/b/g and 802.11n, so there are fewer issues with older devices.

My impression from school ages ago is that 802.11a botched the original plan and took years to fix while 802.11b captured the market; 802.11n mostly got it right; and even though 2.4 GHz is one of the noisiest bands in the modern home (microwaves, old-school walkie-talkies/baby monitors, parts of Bluetooth, all sorts of other random home appliances), 2.4 GHz is still the technically superior band in a lot of cases, and WiFi frequency hopping and noise avoidance have gotten really good since 802.11a.

It's interesting to me that someone would decide to ban that entire band from their home WiFi options.


There are new Android phones sold in 2020 without 5GHz Wi-Fi support, especially some Samsung midrange ones.


Huh?

We turned off 2.4 GHz in the office three years ago without notice. This was due to a huge hotel going up next door that seems to have an AP going full blast in every room making 2.4 GHz unusable.

Nobody noticed or complained. And I know we have lots of really old Androids in use by staff.


Galaxy A10 is one.


I do something similar. I don’t run 2.4GHz.

Pain points:

* Some IoT devices, such as Yeelights.

* Some SBCs, such as Orange Pi Zeros.

* Some televisions (Sony Bravia).


If you don't mind me asking, how big is your house? What are the interior walls made from?

I live in an apartment and can't get a 5GHz signal in my bathroom, which is 20 feet from the router.


Quite a few things are built around devices like the ESP32 and ESP8266, which are very much 2.4GHz-only.

I think we'll begin moving away from that to things having their own LTE-M1 modems and a token data service, though. We're already seeing that to some extent: there are hot tubs (literally!) with their own internet connections that send "telemetries" back to the operator for advertising purposes. From the consumer's perspective it's "zero configuration"; for the manufacturer it means that "just don't connect it to the internet" simply isn't an option anymore unless you go and tear out the LTE chip.


Routers in most homes and offices have the power budget and physical space to support multiple frequencies and protocols, so there's not much reason to give up backward compatibility unless you want a router that is either very small or very cheap. Of course the 2.4GHz band is crowded, but it won't be so bad anymore when most bandwidth-heavy devices have moved on to higher frequencies.


1. Force devices to use 5 GHz, and 2. Prevent use of inefficient older protocols which reduce overall bandwidth


> Force devices to use 5 GHz

You can force devices to use 5 GHz by having different ESSIDs for the 2.4 GHz network and the 5 GHz network. A device configured to use the home-5g network won't try to connect to the home-2g4 network, and vice versa.
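For example, with hostapd this is just two configurations, one per radio, each with its own SSID (interface names, SSIDs, and passphrase below are placeholders):

```ini
# hostapd config sketch: 2.4 GHz radio, its own SSID
interface=wlan0
driver=nl80211
ssid=home-2g4
hw_mode=g          # 2.4 GHz band
channel=6
wpa=2
wpa_key_mgmt=WPA-PSK
rsn_pairwise=CCMP
wpa_passphrase=change-me

# second hostapd instance: 5 GHz radio, distinct SSID
# interface=wlan1
# ssid=home-5g
# hw_mode=a        # 5 GHz band
# channel=36
```

A device saved to home-5g will never fall back to the 2.4 GHz network, which is exactly the behavior described above.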


I meant people, force people to use the 5 GHz band in an easy hassle-free way.


Tell that to my fridge, which only supports 2.4GHz and which I have no plan to replace in the next 10 years.

Fortunately, the fact that some devices continue to use 2.4GHz does not reduce the bandwidth I have on my other network that is 5GHz. So I have no incentive to disable 2.4GHz anytime in the next 10 years, either.


What are you using the internet access on your fridge for?


According to the manufacturer, it sends diagnostic information and receives firmware updates. I'm not entirely sure that it's not snitching on my eating habits at the same time, though.


> According to the manufacturer, [...] receives firmware updates.

Why would a fridge, which should do nothing more than turning the compressor on and off according to its inside temperature, need firmware updates?


I wonder why nobody takes this "feature" as a negative. My first thought was: "My fridge doesn't need firmware updates and it does what it's supposed to do. What's wrong with this one?"


It's going to require new hardware to support the new frequency, so it only makes sense that if you support the frequency, you should be required to do so with, at minimum, the protocol version that was current when the frequency was introduced.

And yes, tri-band routers will be popular, as well as switchable dual-band routers (i.e. you can flip between 2.4/5 or 5/6, but not both at the same time).


Yeah, I've got a handful of random devices that are still 2.4-only, my thermostat being the most front-and-center.


I appreciate that they lasted a year before muddying the already foggy "wifi 6" branding by introducing "wifi 6e".

    802.11
    802.11a  
    802.11b 
    802.11g
    802.11-2007
    802.11n
    802.11-2012
    802.11ac
    802.11af
This is too confusing for consumers, let's drop the letter scheme.

    Wifi 6. 
    Wifi 6E
Hold up.


This is why we should let Apple name everything.

  Wifi Mesquite
  Wifi Mistletoe
  Wifi Kangaroo Rat
  Wifi Roadrunner
  Wifi Rattlesnake
  Wifi Great Horned Owl
That way it's easy to keep track because we just follow the food chain. And with Apple in charge, it will be instinctive to know that Mac OS X Leopard is incompatible with Wifi Minnesota Big Game Hunter Dentist.


But macOS versions all have numbers that go along with the names. It's the same system used by the Linux kernel (e.g. 5.2 / Bobtail Squid), Ubuntu (18.04 / Bionic Beaver), and many other pieces of software. Most of the time the numbers are more useful, but there are occasions where having a non-numeric designation is helpful.

Apple's real naming sin is that they don't enumerate all of their hardware as they do with the iPhone.

But as long as we are talking about silly naming, this is always my favorite criticism: http://www.secretgeek.net/ex_ms


Eh, for their Macs Apple follows just a more verbose version of Ubuntu's scheme, versioning with dates: MacBook Pro (16-inch, 2019).


That’s not true - if you ever open About This Mac, you’ll see the real name and model code, like “MacBook 10,9 A1358”.


> That’s not true

It is absolutely true. And it is both.

On the machine I'm using right now, opening About This Mac shows "iMac (21.5-inch, Late 2013)."

The identifier you're talking about can only then be reached by clicking System Report... and looking under Model Identifier, where it is listed as "iMac14,1."


That link is hilarious!


I think they should turn it over to the people at the Debian project, who name every release after a Toy Story character. Old list I've pasted here; Buster is now a stable release.

1.1 Buzz 1996-06-17 Buzz Lightyear

1.2 Rex 1996-12-12 Rex (the T-Rex)

1.3 Bo 1997-06-05 Bo Peep

2.0 Hamm 1998-07-24 Hamm (the pig)

2.1 Slink 1999-03-09 Slinky Dog

2.2 Potato 2000-08-15 Mr Potato Head

3.0 Woody 2002-07-19 Woody the cowboy

3.1 Sarge 2005-06-06 Sarge from the Bucket O’ Soldiers

4.0 Etch 2007-04-08 Etch, the Etch-A-Sketch

5.0 Lenny 2009-02-14 Lenny, the binoculars

6.0 Squeeze 2011-02-06 Squeeze toy aliens

7 Wheezy 2013-05-04 Wheezy the penguin

8 Jessie 2015-04-26 Jessie the cowgirl

9 Stretch 2017-06-17 Rubber octopus from Toy Story 3

10 Buster not yet released Andy’s pet dog

11 Bullseye Not yet released Woody’s horse

Sid "unstable" The next door neighbour


> Sid "unstable" The next door neighbour

Still In Development


It is possible for something to intentionally have a double meaning.

Given the overall naming convention, I'd actually hazard a guess that Sid the Toy Story character was the initial reason for the name, and "Still In Development" is a backronym.


I think it was a backronym, but it was already referred to that way when I first used Debian in 2000, so it's hardly new.


Sid, aka the kid who broke toys


... and, someday, "bookworm"!




Buster was released in July 2019


They haven’t all been great, though. How one says ‘X’ in the Apple ecosystem has always been trouble, and they seem to repeatedly fall into that trap. There are also the weird pluralisation issues they have (iPhones 7? iPhone 7s? How do you pluralise the iPhone 7S?). Some of the releases shared a name too, like Mountain Lion and High Sierra (which were great, so are forgiven).


Huh? It’s always “ten”, and “7Ss” (seven esses); just like the hardware S models, those mean minor releases based on a previous major, just with software instead of hardware.


If you ask people what various Apple things are called, you won’t hear the X pronounced as “ten”. Xcode comes to mind too.


As a developer for only the last 4-ish years, I never even put together that the X in Xcode had anything to do with the X in OS X.


Or!

WiFi

WiFi 2

WiFi 3

New WiFi

WiFi

WiFi

WiFi


  Wifi
  Wifi 3G
  Wifi 3GS
  Wifi 4
  Wifi 4S
  Wifi 5
  Wifi 5C
  Wifi 5S
  Wifi 5SE
Smells like Apple is a member of the Wifi Alliance.


The Wi-Fi

2 Wi 2 Fi

The Wi-Fi: Tokyo Drift

WiFi

WiFive

Wifi 6

Fi 7

The Fate of the Fi

WiFi Presents: Facebook & Google


But don’t forget that the WiFi: Tokyo Drift version actually fits in between two later versions, and it’s up to you to figure out where by carefully examining which characters are present in the documentation of each version.


That's what 802.11a was about.


Wi-Fi

Wi-Fi 2

Wi-Fi 3

Wi-Fi 4: Modern Internet

Wi-Fi: World Connected

Wi-Fi: Modern Internet 2

Wi-Fi: Dev Ops

Wi-Fi: Modern Internet 3

Wi-Fi: Dev Ops II

Wi-Fi: Packet-loss

Wi-Fi: Advanced Internet

Wi-Fi: Dev Ops III

Wi-Fi: Infinite Internet

Wi-Fi: WAN

Wi-Fi: Dev Ops 4

Wi-Fi: Modern Internet


You forgot some, though.

WiFi 6: Second Edition

WiFi Classic Edition

2 Wi 2 Fi: 2 (who knows, maybe they even make this a full trilogy)


Some Microsoft-style edgelord naming:

WiFi

WiFi 360

WiFi One

WiFi One X

WiFi Series X

WiFi X Is The Most Futuristic Letter Known To Man

WiFi XXX

WiFi Look at all those X’s

Or maybe:

WiFi

WiFi XP

WiFi XP Ultimate

WiFi Extreme X


WiFi 2: Electric Boogaloo


I laughed way too hard at this.


I laughed way too furiously at this.


Joke was a bit too fast for my taste :D


Please leave the low-effort puns at reddit, they already trashed the comments section over there, don't bring it here.


Naming can be hard when a new product is not a direct replacement/upgrade of the previous one, but more of a fork, which might include premium/cheap versions.


Firefox should join in and we should go straight to

Wifi 48


WiFi Pro


Sticking to an incrementing integer major revision may not be as much fun, but it makes ordering much easier. Case in point, a roadrunner hunting a rattlesnake: https://www.youtube.com/watch?v=YwLspcdm0Bs

I own some Apple hardware, but I couldn't tell you the order of OS X/macOS releases off the top of my head. It's something only brand enthusiasts memorize, while every schoolchild learns the ordering of integers.


To be fair, the IEEE standards are very easy to order: the letters sort the same way Excel columns do (a-z, followed by aa-az, then ba-bz, etc.). Sure, they skip most of the letters, and I'm not sure the ordering is entirely intentional, but so far it works.
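That ordering is easy to reproduce in code: sort by length first, then alphabetically (amendment list abbreviated for illustration):

```python
# IEEE 802.11 amendment letters sort like spreadsheet columns:
# all one-letter suffixes first, then two-letter ones ("z" before "aa").
# Plain alphabetical sort would wrongly put "ac" before "b".
amendments = ["ac", "b", "ax", "g", "n", "a"]

ordered = sorted(amendments, key=lambda s: (len(s), s))
print(ordered)  # ['a', 'b', 'g', 'n', 'ac', 'ax']
```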


> Sure, they skip most of the letters

Actually, they skip just a couple of letters, the ones which can easily be confused with other letters. See https://en.m.wikipedia.org/wiki/802.11#Standards_and_amendme... for the full list. And yes, the letters are allocated in order.


It's just that they also use letters for clarifications, incremental updates, and in a couple of cases tangential committee work, so 802.11 skips so many letters not because there weren't documents, but because those documents didn't spawn a "new standard" outright. Some of the "in between" letters are used for interesting things, and/or are useful stepping stones in seeing the evolution of the standards that made it to consumer adoption.

(Hence why it was maybe smart of the Wi-Fi Alliance to disambiguate IEEE procedure from marketing names for consumers, but also why they don't seem to be having much more luck with it than say the marketing names of cellular networks, because standards don't evolve "linearly".)


The problem is then you get to this:

  WiFi Terminator Model 101
And all hell breaks loose because of the circular dependencies. Yes, they could skip it and just go with

  Wifi John Connor
But that won’t satisfy anyone.

——

Great post, btw !


WiFi 6

Wifi 6 Pro

Wifi 6 Pro Max


I miss Wifi Snow Roadrunner


It gets better, though:

""" Please note:

In this story, we discuss 5 GHz and 6 GHz, which are frequency bands, as well as the wireless standards 5G and Wi-Fi 6.

The frequency ranges may sound like they are related to the wireless standards, but the terminology similarity is a coincidence. """


The post also talks about "WiFi 4" and "WiFi 5", which were new terms to me. It turns out "WiFi 4" is retroactive naming for 802.11n, and "WiFi 5" is 802.11ac.

So you can have WiFi 4 2.4GHz, WiFi 4 5GHz, WiFi 5 5GHz, WiFi 6 5GHz, and WiFi 6 6GHz? Something like that.


I'd rather they just use the years the standards were ratified... at least that would be both monotonic and obvious while not risking confusion with the frequency.


Monotonic version numbering is the way to go, and years are the most consistent and best understood mechanism for something with a release cadence such as WiFi or a language spec.

We should adopt this as an industry.
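One nice property of year-based names: they sort correctly even as plain strings, which the letter suffixes don't (ratification years here from memory: b 1999, g 2003, n 2009, ac 2013):

```python
# Letter-suffixed standards don't sort chronologically as strings,
# but year-based names do.
ieee = ["802.11ac", "802.11b", "802.11n", "802.11g"]
years = ["Wi-Fi 2009", "Wi-Fi 1999", "Wi-Fi 2013", "Wi-Fi 2003"]

print(sorted(ieee))   # 'ac' sorts before 'b': wrong release order
print(sorted(years))  # lexicographic order == release order
```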


It also gives an indication of whether two things were produced around the same time or not. (E.g. you'd know that Windows 2015 probably wouldn't support WiFi 2019 without support being added.)


Now that's just brilliant. It hasn't seen wide enough adoption for this to happen yet, but imagine all the USB standards and processors and whatnot did the same.

That's just so simple it's genius.

"That 2020 motherboard won't run USB 2025"

We really need this.

Please Intel, Broadcom, Nvidia, etc. folks reading this. Please.


It's also really nice for convincing people that things are out of date. It is a lot easier to defend targeting "ANSI C" than it is to defend targeting "C89", even though it shouldn't be.


Those firms don't have much control of the rate technology is adopted.

Look at how slowly USB-C has come on. Look at how 802.11ax has gone nowhere. Or how Intel has been dragging its feet on PCIe 4.


Absolutely this. “Hey, this 2025 motherboard runs USB 2020 right?” “Nope!”


> Look at how 802.11ax has gone nowhere

802.11ax is WiFi 6. We can assume it will get somewhere once it gets a fully fledged launch.


Except that your 2025 motherboard will run "USB 2025", even though it's the same as USB 2020. The marketing people just didn't want to write USB 2020 on there.

But since USB won't actually change until 2030, it doesn't matter, right?


ECMAScript does this (ECMAScript 2015, 2017, 2018, etc.) and I also really like it.


Fortran has been doing it longer: Fortran 66, 77, 90, 95, 2003, 2008, 2018. It really is a very helpful approach.


> Monotonic version numbering is the way to go, and years are the most consistent and best understood mechanism [...]. We should adopt this as an industry.

Absolutely agreed.

In the interest of efficiency, we should probably also truncate it to fewer digits.

Perhaps 2.

Since we're in the first quarter of a new century, this won't cause an issue for a long time.


Someone put up a website advocating something like this to replace semver; forgot the URL though.


> someone put up a website advocating something like this to replace semver

It's probably CalVer: https://calver.org/

As other commenters have said, though, manufacturers probably don't want consumers to have a good idea of just how obsolete their equipment is. The current a/b/g/n/ac alphabet soup is working out just fine for selling lots of equipment.

Also, there are likely to be actual manufacturing and engineering reasons why Wifi 2021 will only become widely available during 2024, and those are hard to get across to consumers too.


Probably unpalatable to industry. Do you imagine ASUS would want their newest 2020 router to only be Wifi 2018?


It can be done in the same way that specifications for consumer electronics currently work. This router supports WiFi Hyper-Speed 5 Super Tough Technology with Ultra Frequency Crystals (R) for enhanced frequency modulation GAMING EDITION ULTIMATE FATALITY RGB.

The rise of this sort of branding is very intentionally to make comparison impossible: you don't really know anything about the Ultra Frequency Crystals or why their competitor doesn't have them, so just buy whichever one has flames and a dragon on the box and it'll work itself out.


The auto industry deals with this by assigning cars a "model year", which is often the year after the calendar year in which they go on sale; thus you can buy 2021 cars in 2020.


Consumers that need to know or care can be taught.

The other half or more of the market already doesn't understand the existing versioning scheme. If you print the version in small print on the back of the box, they'll be no worse off. Or simply elide the first two digits and they won't know it's a year-based scheme.

And who's to say they can't do a yearly release cadence? WiFi 20, WiFi 21, ...


I propose using a 3-digit date; no one will even realize it's a year if they aren't a tech person.

Our 2020 router supports wifi 018


I have fond memories from the mid-nineties of a compiled xBase language that did this: their dominant version was Clipper Summer '87, their last year-numbered version. Their next version was 5.0 and was so bad they lost most of their market advantage. In the mid-nineties, talking about a product called Summer '87 sounded really bad to me. Humorously, this was when Microsoft moved from a numeric scheme to years, which itself only lasted from 95 to 2000 (longer for server releases, of course).


> Microsoft moved from a numeric scheme to years, which itself only lasted from 95-2000 (longer for server releases of course)

Their commitment to it even for servers is only half-hearted.

    Windows Server 2003 (April 2003)
    Windows Server 2003 R2 (December 2005)
    Windows Server 2008 (February 2008)
    Windows Server 2008 R2 (October 2009)
    Windows Server 2012 (September 2012)
    Windows Server 2012 R2 (October 2013)
    Windows Server 2016 (September 2016)
    Windows Server 2019 (October 2018)


That works for the version but tells you very little about what frequencies are supported. And it's the complication of frequencies that's hard here, not figuring out that wifi 6 comes between 5 and 7.


I'd rather they just use the years the standards were ratified

Worked for Microsoft... for a while.

And then later, lazy programmers forced it to skip Windows 9.

People ruin everything.


> People ruin everything.

'Marketeers' ruin everything.


It's my understanding that the reason "Windows 9" was skipped was technical, not marketing.

I read (I think on HN) that lazy programmers would test whether the environment was Windows 95 through 98 by only checking for "Win9" or something similar, rather than a list of strings. Thus, if Microsoft released an actual "Win9," many programs would think they were running on Windows 95.
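The kind of check being described would look something like this (an illustrative Python sketch of the pattern; the actual offending code was reportedly in Java, and the function name here is made up):

```python
def looks_like_win9x(os_name: str) -> bool:
    # Buggy heuristic: prefix-matching "Windows 9" catches
    # "Windows 95" and "Windows 98", but would also catch a
    # hypothetical "Windows 9".
    return os_name.startswith("Windows 9")

print(looks_like_win9x("Windows 95"))  # True
print(looks_like_win9x("Windows 98"))  # True
print(looks_like_win9x("Windows 9"))   # True -- the false positive
print(looks_like_win9x("Windows 10"))  # False
```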


To name and shame at least one example, such code was included in some versions of the Java Development Kit itself and could be found in easy Google code searches at least at the time Microsoft skipped Windows 9.

Microsoft and Apple and others also prefer to skip 9 for the international marketing reason that 9 is a very evil number by superstition in most of Asia, like how some of the west believes 13 to be unlucky and things like elevators tend to skip 13 in the US/Europe.


WiFi 6 works on 2.4 as well.


Which is one of the major benefits of it! It brings all the goodies back to 2.4ghz.


Hah...you can't even get rid of the possibility of confusion by switching to wavelength instead of frequency to talk about the bands.

5 GHz is 6 cm, and 6 GHz is 5 cm, and so using wavelength would probably actually be more confusing.
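For reference, the wavelengths work out like this (a quick sketch; `wavelength_cm` is just a helper name):

```python
C = 299_792_458  # speed of light, m/s

def wavelength_cm(freq_ghz: float) -> float:
    # lambda = c / f, converted to centimetres
    return C / (freq_ghz * 1e9) * 100

print(round(wavelength_cm(2.4), 1))  # ~12.5 cm
print(round(wavelength_cm(5.0), 1))  # ~6.0 cm
print(round(wavelength_cm(6.0), 1))  # ~5.0 cm
```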


> 802.11af

You have to admire the poetry that 802 doesn't just go to 11, it's 11 AF!


Genius.


Telecommunications engineers produce the worst naming schemes and acronyms of anyone, anywhere.

Any cellular document is a mishmash of alphabet soup: UTRAN, UMTS, PS-CN, SGSN, RNC, RNS, eNodeB, EPC, MME, S-GW, X2-AP, S1, GTP-U, HSDPA, HSUPA, RRC, PDCP, RLC, OFDM, MU-MIMO, and on and on ad infinitum.

You'd think the marketing people would at least get the public-facing stuff right, but they've likely been tainted by association with the engineers.


They have though. They added 2G/3G/4G/5G (though some American ISPs like to call stuff the next G even if it doesn't qualify).

There's also WiFi and now WiFi 1-6 (the F is just dumb) and Bluetooth 1-5.

OFDM and MU-MIMO aren't user facing technologies, they can be applied in any wireless standard really. Just because router manufacturers put stickers on their boxes doesn't make the technology customer facing.

Same with the protocols underlying the cellular industry. For all the engineers care, the people just call it by their Gs and be done with it. Cell phone manufacturers tried to advertise with specific technology names because "3G but slightly faster" doesn't sell well. Then the 4G debacle happened where LTE didn't even qualify to be called 4G at first except due to a technicality and now that LTE Advanced is available, which finally is fast enough to be called 4G according to the specification, people are already starting to call it 5G.

I blame the marketeers for ruining any consumer-facing scheme with their drive to have the biggest number first, no matter what. As long as people like that end up naming things, we'll never have concise naming schemes for consumers, as every simplification effort gets ruined by someone trying to grab a quick buck.


What debacle? As I recall, LTE was the actual 4G tech because, well, it was a new generation after all the 3G ones. The only debacle I remember was that some carriers decided to call HSPA+ 4G for marketing reasons.


Well, according to the original design specification for the 4G name, the standard had to support a certain throughput while stationary (I believe it was 1 Gbps) and a certain throughput while moving at speed (something like 100 Mbps at 100 km/h). LTE failed to reach the 1 Gbps threshold but managed the high speed while moving, so it was basically declared "4G enough" and people just accepted LTE as 4G. This is why the ITU now has definitions for 4G (LTE at the moment it started being marketed as 4G) and "True 4G" (technologies that actually pass the 4G requirements, such as LTE Advanced, which AT&T now markets as "5G E" because screw consumers, I suppose)


I recently worked in Telco, a long time before that I was in Research and Dev for the Military.

Telco has nothing on the military.

We had a database of TLA's[1] that was over 11,000 strong. Also everyone knew your TLA wasn't worth a damn unless it changed every three months or so.

[1] That's a 'Three Letter Acronym' - yes, there is a TLA for TLA.


> We had a database of TLA's[1] that was over 11,000 strong

26^3 = 17,576

So even back then you were already more than halfway through the available space. Assuming they were all unique, which they wouldn't be. :)
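The arithmetic, as a quick check:

```python
total = 26 ** 3  # three-letter combinations, A-Z only
used = 11_000

print(total)                      # 17576 possible TLAs
print(round(used / total * 100))  # ~63% of the space consumed
```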


Telecoms is a bit more rigorous when it comes to standards :-)

Back when I worked in that area, our standards collection for X.400/X.500 took up about a third of a full-height cupboard


As hilarious as this sounds, it's actually not _that_ bad. It doesn't seem like 6E is a completely new standard so much as it is an existing standard being used on a new frequency band. In that respect, it wouldn't really make sense to call it "Wi-Fi 7", and the "add an arbitrary letter onto the end of the version number as an indicator of a different variant" naming scheme is already pretty well understood thanks to smartphones doing the same thing for years. (And Ethernet cables, come to think of it.)

I suppose they could have incremented the version number anyway just to avoid the need for that sort of awkward naming scheme, but I'm not sure if that'd be any less confusing. It'd be simpler sure, but also somewhat misleading. In any case, it's still way better than the previous scheme of using non-sequential letters of the alphabet as version numbers.


Have they considered names like Low speed, Full speed, High speed, and Super speed? They are already familiar to consumers who will understand them as well for wifi as they do for USB.


USB has to keep inventing new, slightly more impressive words for "fast".


ITU really got it bad here, having designated "Very High Frequency" at what we now consider pretty low frequency (<300MHz). Subsequently stacking on Ultra- Super- and Extremely-High Frequency bands before giving up at Terahertz.


That still leaves ridiculous, ludicrous, and plaid.


True, but considering how long ago the designation "high frequency" was made, it's understandable. AM was extremely low frequency (~1 MHz), but that was what they were comparing against when they started using higher frequency bands.


AM = amplitude modulation (as opposed to frequency modulation). It has nothing to do with a specific frequency range.

In the United States, for example, AM is commonly used from ~150 kHz all the way up into the low hundreds of megahertz. As an example, all aircraft transmissions in the US are AM and are in the ~120-130 MHz range.


Thanks for explaining that to me, a licensed amateur radio operator for more than 20 years. :p

I meant the AM radio broadcast band as an example of a popular use case of radio in the early days when the "high frequency" designation was made.

They are extremely low frequency compared to where most radio communication occurs today, which is orders of magnitude higher. Not literally ELF, which isn't used apart from extreme niches.


> a licensed amateur radio operator for more than 20 years

Every ham I know would ridicule me if I used such wording, which I feel I didn't do; maybe your experience is different.

My reply was less about explaining it to you and more about being informative to others who might not have had the experience and could otherwise take away misleading information from your post.

(I’m also a licensed operator, but not for as long as you)


I’m pretty sure they meant AM radio which is around 1.5 MHz and below.


AM radio stations in the US are from 530 kHz to 1700 kHz, so possible. If so, then calling them "extremely low frequency" is a bit of a misnomer, as that band is called "medium frequency" or MF for short. The actual ELF band is 3-30 Hz, 4+ orders of magnitude lower in frequency, and is mostly only used for submarine communication as most higher-frequency signals are filtered by water.
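The band ladder is regular enough to compute: each named band spans a decade starting at 3 x 10^n Hz. A quick sketch (band list per the standard ITU designations; `itu_band` is my name for it):

```python
import math

# ITU radio band names, one per decade of frequency, starting with ELF at 3-30 Hz.
BANDS = ["ELF", "SLF", "ULF", "VLF", "LF", "MF", "HF", "VHF", "UHF", "SHF", "EHF", "THF"]

def itu_band(freq_hz: float) -> str:
    # Band n covers 3*10^n .. 3*10^(n+1) Hz, so the index is floor(log10(f/3)).
    return BANDS[math.floor(math.log10(freq_hz / 3))]

print(itu_band(1_000_000))      # AM broadcast (~1 MHz) -> MF
print(itu_band(10))             # submarine comms -> ELF
print(itu_band(2_400_000_000))  # 2.4 GHz Wi-Fi -> UHF
print(itu_band(5_900_000_000))  # the new 6 GHz Wi-Fi band -> SHF
```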


I thought that was obvious, yes.

My point is that the basis for the designations were made in a period where AM radio was the primary use case for radio and so frequencies higher than the AM radio band are called "high frequency," whereas today those frequencies are actually much, much lower frequency than nearly all forms of radio communication.


The names MF/HF/VHF etc. were defined by ITU V.431, originally in 1953. Amateur radio operators in the 1950s were commonly using "shortwave" frequencies (now commonly referred to as HF). As it's been told to me by multiple operators from that era, the ITU naming scheme for each group is based on the amount of bandwidth available in each group, not on any presumption of AM radio broadcasts being used as a baseline.


Same issue happened with the naming of silicon integration scales. Thankfully, the naming largely stopped at "Very Large Scale Integration" (VLSI), which itself surpassed "large scale integration" (LSI); the proposed next-generation name, "ultra large scale integration" (ULSI), never caught on.

https://en.wikipedia.org/wiki/Very_Large_Scale_Integration


I don't even know why they do it... I was looking at dishwasher liquid the other day, wondering why it needs all that advertising on it. "New formula!" "Plus Ultra Action!" "Now 20% more effective!" "5% Free"... I'm just buying it for my plates and stuff, gee :D


Reminds me of TVs which I was just shopping for recently. You have your standard old high definition, ultra high definition, high dynamic range, quad high definition, quad resolution, quad full high resolution, etc...


> Super speed

That doesn't take into account that today's super speed will be tomorrow's 28.8 kbaud modem.


See the naming conventions of monitors/televisions

HD Ready (1280x720)

Full HD (1920x1080)

Quad HD (2560x1440)

4K Ultra HD (3840x2160)

Thank god we hit diminishing returns
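The "quad" names at least are honest arithmetic, in pixel-count terms:

```python
hd_ready = 1280 * 720
full_hd = 1920 * 1080
quad_hd = 2560 * 1440
uhd_4k = 3840 * 2160

print(quad_hd // hd_ready)  # 4 -> Quad HD really is 4x HD Ready
print(uhd_4k // full_hd)    # 4 -> 4K UHD really is 4x Full HD
```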


Reminds me of how "fast ethernet" (100Mbps) is the slowest type of ethernet still commonly used.


Please tell me your username is a combination between Real Life Quine and also the author of Goosebumps R.L. Stine?

I been wondering about this for the last 11 days...


I think it is less confusing than IEEE designations were to a consumer but it could have been a lot cleaner than what they replaced it with.


Why is this? Is it standards body politics where new standards don’t want to step on the toes of whoever made the previous standard? Like WiFi 7 might imply that WiFi 6 isn’t good enough any more?


The nice thing is that we are well past the point where figuring out whether 2 devices are compatible is something most people will bother doing.


Surely the logical successor to 802.11af is 802.11wtaf?


This is an improvement over 802.11xx, however.


Serious question - is there any reason why naming things with consecutive simple numbers isn't the best approach?

WiFi 1, WiFi 2

...

MacBook Air 1, MacBook Air 2

...

MacOS 10, MacOS 11

I know some of these are already named this way, they're just examples.

Using the year would also be a good (or even better) approach, as it gives you two pieces of information.

I understand that it may not be the best marketing approach, but it would make life so much easier for consumers - if I need a new anything I want to know exactly which is the latest, if I look at second hand items I want to know how far behind it is quickly, without requiring extensive research.

When buying an iPad for instance it took me ages to figure out how far behind the models I was looking at in eBay were...


That is what the WiFi Alliance is trying with WiFi 6. They just have the baggage of the old names (802.11a/b/g/n/ac) to deal with. (It is a lot of baggage. Everyone can agree that n/ac are WiFi 4 and WiFi 5 respectively from a consumer standpoint, but between a/b/g there are a lot of interesting debates on how you could possibly number any of them 1-3, in any order, for both technical and consumer reasons.)

This story is interesting because even in trying to keep to consecutive single numbers, the WiFi Alliance hit a detour in the consumer expectations. "6E" doesn't make sense as a "6.1" or a "7", but still needs to be differentiated from "6 only" to sell it to consumers.


> if I need a new anything I want to know exactly which is the latest

If so, then month/year would be better than a number. (There are flaws to both approaches, though.) 2021-01 Wi-Fi.


Is it a universal rule that higher frequency = lower range?

The range of 5Ghz is already pretty low and has trouble going through solid materials like concrete. Would 6 be significantly worse?


In general, higher frequency means greater attenuation, which means greater power loss when passing through objects. It's a lot more complicated than that depending on the frequency and medium, so don't take that as an absolute rule.


No, it is not universal at all. For example, some materials are transparent to certain frequencies, the way glass is to light, another well-known EM wave.

You have to look up a material's frequency response: on Earth, for example, the atmosphere does not have a monotonic response (look at http://www.gb.nrao.edu/GBTopsdocs/primer/spect002.gif for example). Same with water: https://en.wikipedia.org/wiki/Electromagnetic_absorption_by_...

Really, WiFi 5GHz is just well absorbed by walls and buildings, that's all.


Will 6GHz perform worse in buildings than 5GHz?


Indeed: radio transmissions follow the Friis equation [1] -- received power decreases with the square of the frequency (and likewise with the square of the distance). So going from 2.4 to 5 GHz cuts the received power at a given distance by a factor of about 4, or equivalently the range by a factor of about 2. That assumes free space, of course: obstacles will make matters worse, depending on frequency, thickness and materials.

Meaning that going from 5 GHz to 6 GHz will decrease the range by a factor of around 1.2

[1] https://en.wikipedia.org/wiki/Friis_transmission_equation
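For a back-of-the-envelope check, the free-space relationship is easy to compute (unit-gain antennas assumed; `friis_ratio` is my own helper name):

```python
import math

C = 299_792_458  # speed of light, m/s

def friis_ratio(freq_hz: float, dist_m: float) -> float:
    # Friis free-space model with unit-gain antennas: Pr/Pt = (lambda / (4*pi*d))^2
    lam = C / freq_hz
    return (lam / (4 * math.pi * dist_m)) ** 2

# Power penalty going 2.4 GHz -> 5 GHz at the same distance: (5/2.4)^2 ~ 4.3x
print(round(friis_ratio(2.4e9, 10) / friis_ratio(5e9, 10), 2))

# Since received power goes as 1/f^2, the distance at which a given power
# is received goes as 1/f: 5 GHz -> 6 GHz costs a factor of 6/5 = 1.2 in range.
print(round(6 / 5, 2))
```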


Take note that the Friis transmission equation is extremely misleading, since it makes certain assumptions about the choice of antenna which are not necessarily true for all systems.

To ask the pointed question: why would propagation in free space be affected by the frequency of the field?

A good explanation is here: https://www.dsprelated.com/showarticle/62.php


As you note, it's only true when you assume free space. In practice you have air, you have obstacles, you have Earth curvature, you have […], which makes it wrong, because the atmosphere has an opacity that varies with frequency (or wavelength): https://en.wikipedia.org/wiki/File:Atmospheric_electromagnet...

So, no, it's not a universal rule.

Please also note that shorter range is not necessarily bad: it means less interference from neighboring devices that use the same frequency band ( https://www.embedded.com/why-60ghz-mmwave-is-moving-into-the... ).


Sat down, took a deep breath, sighed, wrote a rant, deleted it, and rethinking it now: this is actually pretty good marketing.

All dates are when products are expected on the market:

WiFi 6 - 2019

WiFi 6E - 2021

WiFi 7 - 2023

Basically, every two years you have a new marketing name to sell, which is a lot easier for consumers and sales people. Since using the new spectrum requires going through FCC certification, I assume the likelihood of your existing WiFi 6 client and router being upgradeable is extremely low.

Then there is 802.11ax / WiFi 6 Wave 2, similar to 802.11ac / WiFi 5 Wave 2 if anyone remembers that, with additional features that were missing from Wave 1. Off the top of my head, those were mainly features related to UL MU-MIMO. I assume they might come with WiFi 6E too.

And none of the current WiFi 6 smartphones support 160 MHz channels, so WiFi 6E is another possible candidate for bringing that to smartphones. Another selling point to push the upgrade. (I am not saying consumers will upgrade because of that, but it's one more reason to think what they have now is old and needs upgrading. It is important to note that "consumer" here means normal people who are not tech-savvy.)

My beloved NetBSD-running WiFi 5 / 802.11ac AirPort Extreme came out nearly 7 years ago and is still working close to perfection. I had always been planning to upgrade once a WiFi 6 version came out, but unfortunately Apple discontinued the line. But the stubborn Apple changed their mind on keyboards; maybe they will do the same for the AirPort.


They should have called it "Route 66", since it's Wifi 6 plus 6 GHz functionality. ;)


On the topic, what are your favorite budget wifi 6 routers? And is now a good time to buy?


It's a great time to buy devices supporting Wi-Fi 5, because Wi-Fi 6 devices are arriving in the US market.

But I say that with some sarcasm, because it's actually a bad time to buy Wi-Fi 6 devices, as Wi-Fi 6E was just announced.

Wi-Fi 6E will require new hardware to support the new frequency - as stated by a previous comment.

BTW, the Wi-Fi Alliance's naming schemes are confusing and poorly timed (cellular carriers are pushing 5G... and they launch 6E? Will they bring out 6F and 6G next year?)


Any recommendations on the best budget router of wifi 5?


If your internet's speed is less than 500 Mbps, netgear r7800 should be able to handle it well with openwrt.


Let's increase the operating frequency so these conference room demonstrations will be even more >>>awesome<<<


So for someone who isn’t following all this. If I replace my seven year old router am I going to get a huge speed boost?


You might get a modest performance improvement. Wifi 6 makes use of multiple antennas to better spatially separate ACKs and other upstream traffic from multiple clients even when the packets temporally overlap. The extra bands may also come in handy but some of them may be attenuated by walls and other building materials.


Assuming you have client side WiFi 6 equipment, which is still fairly rare on the market.


Just like anything with radio, the answer is "it depends".


Yes. Assuming your upstream connectivity is faster than 100MBits or so.


802.11ac is already capable of 400Mb/s or so. So you'd need a link faster than that, plus devices which have Wifi 6 AND can push/pull data that fast and do something with it. For most folks that's a no, so far.


True, but I was speaking in practical terms. To get that 400Mbit you need to be within a few feet of the AP typically. Plus, if there are several concurrently streaming devices on the network you won't achieve the theoretical aggregate maximum throughput. A good rule of thumb in WiFi marketing is to divide whatever throughput the retail box claims by 4 to get the practically achievable throughput. Meanwhile wired networking (the OP's upstream connection) actually delivers the marketed throughput.


802.11ac is actually capable of over 3Gbps, though 4 stream devices are pretty uncommon. 2 streams @ 80MHz is pretty common though, and that provides 867Mbps (max link rate, actual throughput is of course lower).
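The 867 figure falls out of the VHT PHY arithmetic; a rough sketch (constants taken from the 802.11ac MCS tables, variable names my own):

```python
# Max VHT link rate per spatial stream at 80 MHz, MCS 9, short guard interval:
# data subcarriers * bits per symbol * coding rate / OFDM symbol duration.
data_subcarriers = 234   # 80 MHz VHT channel
bits_per_symbol = 8      # 256-QAM
coding_rate = 5 / 6      # MCS 9
symbol_time = 3.6e-6     # 3.2 us symbol + 0.4 us short guard interval

per_stream = data_subcarriers * bits_per_symbol * coding_rate / symbol_time
print(round(per_stream / 1e6, 1))   # ~433.3 Mb/s per stream
print(round(2 * per_stream / 1e6))  # ~867 Mb/s for the common 2-stream case
```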


It can finally replace Gigabit Ethernet... if your devices also support the higher speed.

And I mean replace GbE for normal usage (NAS in one room, TV in another, etc), not 3 feet from an overpriced arachnid looking thing on your desk :)


Fun fact: Wi-Fi 6 official standard is not even out yet :)


Where can this be used? As far as licensed spectrum goes.


Apparently nowhere yet, better coverage at: https://www.msn.com/en-us/news/technology/wi-fi-6e-prepares-...


Tri band routers are coming. 2.4/5/6ghz


so... I need a new laptop and a new router. ossum


If you are limited by your current wifi.

Which most people really aren't.


Is there an NDA?


[flagged]


Dfd?


I like to imagine comments like these are from zombie botnet nodes communicating with their C&C backend server.

The more sophisticated systems just post steganographically coded comments about blockchains, how working for a FAANG is awesome/sucks, etc.


Considering how hit or miss 5ghz wifi penetration is in my small house, I have no interest in making that problem worse.


If you own a house, you definitely have money for a wifi setup with multiple APs - like Ubiquiti with three APs and PoE switch and dedicated gateway - I'd love to play with that, but I'm not going to buy a house to do that.

And for people living in apartments, where every apartment has their own wifi network, less penetration is really a good thing, because there's less interference from other apartments.


This is assuming that intra-apt walls magically allow wifi penetration whereas inter-apt walls magically block it.

Neither of these things are the case, tragically.


If you have nice, solid walls, the signal will mostly bounce off them and pass through doors.

It will usually pass a single wall, so you will hear some first-degree neighbors, in some places, but it's significantly better than interference from 2nd degree neighbors.


Agreed


Am I the only one concerned about the impact on health? A higher frequency is generally worse for living organisms, and personally I don't need a higher speed than 4Mb/s


In that case, you should also avoid any light source. Not to mention all those infrared emitting organisms around you.


> Am I the only one concerned about the impact on health?

Am I the only one concerned about people spreading unfounded FUD online without spending five minutes to do their research?


>A higher frequency is generally worse for living organisms

Do you have any sources for this claim?



