I have a Netgear R6400 with separate 2.4GHz and 5GHz WiFi networks, and for some reason devices on the 5GHz network can't see devices on the 2.4GHz one… is that normal? They're all on the same router/subnet (I think), so that shouldn't be an issue. 1/2


I think my only option is a single combined network, but in my experience most devices end up latching onto the much slower 2.4GHz signal. Any ideas? 2/2

@dshafik Check that they are bridged. Sometimes they are not. I prefer devices that can run DD-WRT.

@dshafik I've seen this with some WiFi routers: the wireless interface doesn't forward ARP packets to other wireless clients. There isn't a good solution, because it's done on purpose to limit transmit "noise". I've "fixed" this in the past by giving my known clients static DHCP assignments and replicating the ARP tables manually on the devices where I can.
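The workaround in that last sentence can be sketched as a small script: given a list of known clients (static DHCP assignments), print the `ip neigh` commands you'd run, as root, on each Linux device that needs to reach clients on the other band. The MACs, IPs, and the interface name `wlan0` below are made-up examples, not details from this thread; on macOS the equivalent command would be `arp -s <ip> <mac>`.

```shell
# Sketch only: generate static ARP entries for known clients so they stay
# reachable even when the AP won't relay ARP between wireless clients.
# Reads "MAC IP" pairs on stdin and prints one `ip neigh add` command each.
gen_arp_cmds() {
    while read -r mac ip; do
        # nud permanent = never expire; wlan0 is an assumed interface name
        printf 'ip neigh add %s lladdr %s dev wlan0 nud permanent\n' "$ip" "$mac"
    done
}

# Example client list (hypothetical addresses)
gen_arp_cmds <<'EOF'
aa:bb:cc:dd:ee:01 192.168.1.10
aa:bb:cc:dd:ee:02 192.168.1.11
EOF
```

Printing the commands rather than executing them keeps the sketch safe to run anywhere; pipe the output through `sudo sh` on the actual client once the addresses are right.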
