That's not how token ring worked. The token controls which node is allowed to transmit over a shared medium. Every node saw every packet and made its own determination of relevance.
That's what I thought too unless the pic (left) literally is how cables are arranged??
My understanding was a shared medium (say, all computers in parallel on a single UTP), where they pass a virtual token "packet" that assigns the right to transmit while anyone receives if addressed, like a ball between kindergarteners sitting in a circle.
The pictured ring topology (left) makes it seem like everyone can only talk to a computer one over, which seems awful for efficiency and resilience, while the pictured star topology (right) introduces an authority figure (MAU is like a kindergarten teacher that decides who walks around and gives the ball to whichever child they think should speak next). Both seem inherently worse than Ethernet - left can be completely broken by disabling one or two nodes while the right one is just a switched network with less throughput.
That is what 'automation' often is. You take a working process, then let machines do as many steps in that process as you can: harvesting crops, sending memos, robots spray-painting car parts, self-driving cars (we still have a lot to do there).
Building on that it gets even more interesting as we try to find better, or even completely new processes.
I think token ring is a data link layer technology that controls transmission access over the physical connection. Like early non-switched Ethernet, computers are connected in parallel to the same wires, but instead of collision detection and random delays, which caused congestion and serious overhead on busy networks, a "token" is passed around and determines the right to "speak". Everyone listens at the same time and starts receiving packets when addressed. If the computers were literally wired in series like a looping daisy chain, the failure of one would destroy message propagation. Instead, if the token-bearing computer crashes or disconnects from a token ring network, the token is presumed expired after a short while and a new token-bearer is chosen. It's like a kindergarten activity where you sit around in a circle and need to hold the ball to speak, passing it around. It doesn't matter who you're addressing, you can even broadcast, but that's handled by a higher-level protocol.
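That kindergarten-circle model can be sketched as a toy simulation (everything here is made up for illustration: the class names, the round-based timing, and the simplistic "next alive station takes over" election stand in for the real 802.5 token-claiming procedure):

```python
# Toy model of token passing on a shared medium (illustrative only).
class Station:
    def __init__(self, name):
        self.name = name
        self.alive = True
        self.outbox = []  # (destination, payload) frames waiting to send
        self.inbox = []   # frames this station kept

def simulate(stations, rounds):
    holder = 0  # index of the current token-bearer
    for _ in range(rounds):
        # If the token-bearer has dropped off, the token is presumed
        # expired and the next alive station takes over.
        while not stations[holder].alive:
            holder = (holder + 1) % len(stations)
        s = stations[holder]
        if s.outbox:
            dest, payload = s.outbox.pop(0)
            # Shared medium: every station sees the frame, but each
            # keeps it only if addressed ("*" meaning broadcast here).
            for other in stations:
                if other.alive and other is not s and dest in (other.name, "*"):
                    other.inbox.append((s.name, payload))
        holder = (holder + 1) % len(stations)  # pass the token along

a, b, c = Station("A"), Station("B"), Station("C")
a.outbox.append(("C", "hi C"))
b.alive = False                          # B drops off; the token skips it
c.outbox.append(("*", "hello everyone"))
simulate([a, b, c], rounds=6)
print(c.inbox)  # [('A', 'hi C')]
print(a.inbox)  # [('C', 'hello everyone')]
```

The point of the sketch is that transmission order comes from holding the token, not from collision detection, and that losing a station only costs you that station, not the whole network.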
As for memos, I have never used them and they seem extremely inefficient.
Edit: looks like Token Ring is actually more physical than I thought, with special cables connecting computers in series, so you may be right. That sounds really stupid as a thing to build a network on; it's easy to cut it in half by disabling just two computers, which is antithetical to the internet's resiliency principle.
Edit edit: my original understanding was right, the literal cable ring is obsolete for good reason. I still don't get the role of a MAU in the star topology unless it's just needed for old NICs to understand virtual tokens.
My memory of token ring is vague, but I think it was originally a ring in series as you said - however, token ring switches (that isn't what they were called) also existed, which was the "modern" way of wiring up a token ring network.
😜 I love Cunningham's law. Yeah, token ring sucked hard. We used it with BNC coax cables and vampire taps at school.