this post was submitted on 27 Mar 2024
1015 points (99.0% liked)

Memes

45731 readers
998 users here now

Rules:

  1. Be civil and nice.
  2. Try not to excessively repost; as a rule of thumb, wait at least 2 months if you have to.

founded 5 years ago
[–] [email protected] 6 points 7 months ago (3 children)

It's a well-known scientific fact that higher-frequency waves carry more energy per photon. For larger mammals, such as humans, these differences are trivial for the most part. Unless you're standing in a location exposed to high-amplitude, high-frequency EM waves, the danger is generally nil. By high frequency, I'm talking about pretty much anything over ~10 MHz, and by high amplitude, I'm talking about power levels at or above 100 W. Putting 100+ W of power through an antenna is extremely rare, and due to things like attenuation, free-space path loss, reflections, refractions, etc., unless you're basically standing directly next to an antenna, in its transmission path, you're fine.

Bluntly, this is why cellular towers are set up the way they are. Usually an antenna mast will have a relatively small support pillar of some sort, either a cylindrical "pipe" shape or a set of support beams in an overlapping "x" shape that narrows as it goes up. At the top it usually flares out where the antennas are mounted, so if you climb up the mast, you end up behind the "business end" of the antennas; aka, they're pointed away from you. This means that the vast majority of the energy being produced is directed away from where you are. For everyone else, on the ground or even in a nearby building, you're too far away to be exposed to significant signal amplitude. We call it EIRP in the industry, or "effective isotropically radiated power". The power density drops off quickly in the first few meters past the antenna as the signal expands outwards towards the service area, so even being within 15 m is generally safe.
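To put rough numbers on that drop-off, here's a sketch using the simple free-space inverse-square model. This is an assumption on my part: it treats the antenna as radiating equally in all directions, while a real panel antenna is directional, so the density behind or below it is far lower than this estimate.

```python
import math

def power_density_w_per_m2(eirp_watts: float, distance_m: float) -> float:
    """Far-field power density of an isotropic radiator: S = EIRP / (4*pi*d^2)."""
    return eirp_watts / (4 * math.pi * distance_m ** 2)

# 100 W EIRP, the "high amplitude" figure above, at a few distances
for d in (1, 5, 15):
    print(f"{d:>2} m: {power_density_w_per_m2(100, d):.3f} W/m^2")
```

Even this worst-case isotropic estimate falls from about 8 W/m^2 at 1 m to about 0.035 W/m^2 at 15 m, a factor of 225 over 14 meters.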

EM waves can be dangerous, specifically in the extremely high bands; IMO, this is what scares people. Extreme high-band EM is dangerous at most power levels. These extreme high bands are capable of causing damage at the cellular level, possibly breaking down your DNA. These are referred to as "ionising". The ionising bands people most commonly know are upper UV and X-rays: the high end of the UV range (UV-C and beyond) qualifies, and X-rays certainly do. They're all EM waves, and they are extremely dangerous. These are all emitted by our sun and mostly blocked by the ozone layer, though some small levels of UV-B still get through (hello, skin cancer). What I want to point out is that these are all at, or above, hundreds of terahertz in frequency. The UV bands start around 800 THz, which is 80,000 times higher than 10 GHz, and it goes up from there.
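The gap those frequencies imply can be computed directly from Planck's relation E = h·f. The band values below are my own illustrative round numbers, not exact band edges:

```python
h = 6.62607015e-34    # Planck constant, J*s
eV = 1.602176634e-19  # joules per electron-volt

def photon_energy_ev(freq_hz: float) -> float:
    """Energy of a single photon at the given frequency, in eV."""
    return h * freq_hz / eV

bands = {
    "WiFi, 2.4 GHz":          2.4e9,
    "5G mmWave, 40 GHz":      40e9,
    "Visible light, 560 THz": 560e12,
    "UV-C, 1.5 PHz":          1.5e15,
    "X-ray, 3 EHz":           3e18,
}
for name, f in bands.items():
    print(f"{name:<24} {photon_energy_ev(f):.2e} eV")
# Ionising a typical molecule takes very roughly 4-25 eV per photon;
# only the UV and X-ray rows approach or exceed that.
```

A WiFi photon carries about a hundred-thousandth of an eV, roughly a million times too little energy to ionise anything, no matter how many of them you emit.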

Light, which is also an EM wave, sits between roughly 400 and 800 THz, and it's widely considered harmless. Yet common folks tend to freak out about EM above ~6 GHz because of a lack of understanding; 8 GHz is more than 100,000 times lower in frequency than the low band of UV (which is itself still non-ionising). Any EM wave with sufficiently high transmission power can cause damage, but for frequencies below 400 THz (i.e., below visible light), that power would need to be significantly higher than what we normally use. For context, transmission power at the antenna for broadcast radio (e.g. FM radio stations) is usually around 100 kW maximum per antenna system. These transmitters can be legally and safely placed in urban areas provided there's adequate separation between the antenna and the public, usually 30-40 meters. To contrast this, the broadcast power of WiFi at 2.4 GHz is usually set at or around 100 mW (0.1 W), with a legal maximum of around 1 W. To further the example, microwave ovens use 2.45 GHz EM to heat your food. This is usually combined with a very well insulated cage to prevent that energy from escaping, which both protects you and your home from being cooked, and also directs the energy towards the item being heated, improving efficiency. Most modern microwaves can emit around 1000 W (1 kW) of power. 2.45 GHz is often described as a special resonant frequency of water, but that's actually a misconception: water absorbs microwaves across a broad range of frequencies (dielectric heating), and 2.45 GHz was chosen mostly because it's an unlicensed ISM band with a good balance of absorption and penetration depth. A microwave running at 3 GHz would still heat food, just with somewhat shallower penetration. The point is that your 2.4 GHz WiFi is 1,000 to 10,000 times less powerful than your microwave.

I mention this to point out that the amount of power needed to affect something, even in favorable conditions, is generally at or above ~800 W of transmission power, in the equivalent of an EM "mirror" box. Consumer goods will generally never transmit above 1 W. Even at 0.1 W you can usually saturate your house, your yard, and your neighbor's yard, at least enough to "see" the signal.

"5G" and "6G" mobile/cellular technologies operate in the gigantic band between 900MHz and 400THz (often on the lower side of that very broad range), well below the level of ionizing EM, and at power levels well below what would be dangerous. The largest 5G arrays run with power levels around 120W. Which is less than 1/8th the power of your microwave, and at a maximum of 40Ghz, well below ionising.

Scientifically speaking, 5G mobile carrier antennas are less dangerous than walking under a 1000 W flood light, which people do without hesitation, or even a thought given to any possible danger from exposure to the ~500-600 THz EM the floodlight emits.
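As a back-of-envelope comparison of those two sources, assuming simple isotropic spread (the distances here are my own illustrative picks, and both lamps and antenna panels are actually directional):

```python
import math

def density(p_watts: float, d_m: float) -> float:
    # crude isotropic spread: S = P / (4*pi*d^2)
    return p_watts / (4 * math.pi * d_m ** 2)

flood = density(1000, 3)  # standing 3 m under a 1 kW flood light
cell = density(120, 50)   # 50 m from a 120 W 5G panel
print(f"flood light: {flood:.2f} W/m^2")
print(f"5G antenna:  {cell:.4f} W/m^2")
print(f"ratio: {flood / cell:.0f}x")
```

Under these assumptions the floodlight delivers over two thousand times the power density of the 5G panel, and at a far higher (though still non-ionising) frequency.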

Bluntly speaking, it's a stupid argument to be afraid of 5G for the transmissions themselves. You will not be harmed by them.

[–] [email protected] 2 points 7 months ago

Thank you for taking the time to write this; I wouldn't have the patience to do it.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago) (2 children)

Nice write-up. There's a guy who tried out how dangerous the microwaves are (tl;dr: not very): https://m.youtube.com/watch?v=3hBRxwQXmCQ&pp=ygUgc3RpY2tpbmcgbXkgaGFuZCBpbiBhIG1pY3Jvd2F2ZSA%3D

Short-term effects seem to be none so far. I wouldn't stick any part of myself into one, but his video is still quite hilarious.

[–] [email protected] 2 points 7 months ago

That.... Is definitely not recommended.

I only have a fairly basic grasp of physics and biology, but from what I know, the microwaves will heat up the water in your tissues, specifically the 2.45 GHz emitted by the magnetron in a microwave "oven". It penetrates a centimetre or two into your flesh and heats you from the inside out, at the tissue level. The water in your cells can heat up very quickly, fast enough to destroy the cells, especially when exposed to ~1000 W of 2.45 GHz EM energy. It's non-ionising, so the effects are limited by the amount of exposure, and limited exposure won't cause much damage; your body will heal, since you lose cells all the time. Prolonged exposure will kill you.
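The heating itself is just Q = m·c·ΔT. A back-of-envelope sketch (the 800 W absorbed figure is my assumption: ovens are rated around 1 kW output but not all of it lands in the load, and this ignores evaporation and uneven absorption):

```python
C_WATER = 4186.0  # specific heat of water, J/(kg*K)

def seconds_to_heat(mass_kg: float, delta_t_k: float, absorbed_watts: float) -> float:
    """Time for a given absorbed power to raise a mass of water by delta_t_k kelvin."""
    return mass_kg * C_WATER * delta_t_k / absorbed_watts

# 250 g of water from 20 C to 100 C, assuming 800 W actually absorbed
print(f"{seconds_to_heat(0.25, 80.0, 800.0):.0f} s")
```

That's under two minutes to take a cup of water to boiling, which is exactly why concentrated, sustained exposure at these power levels is tissue-destroying even though the radiation is non-ionising.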

Also, activating a nearly 1 kW magnetron in an open environment will have devastating effects on anything operating in the same frequency band, and likely anything on harmonic frequencies, which will get you in trouble with the FCC (or your local regulatory body). That can include aircraft, military operations, emergency services... it's a long list. If they use radios as part of their normal operation, a powerful, unregulated transmission like this can basically jam their systems, making any legitimate broadcasts unintelligible. This can obviously put lives at risk.

With that said: do not do this.

I'm licensed to operate radio equipment in amateur radio bands up to 190 W EIRP (if I recall correctly), but I don't think I've ever used anything more powerful than 50 W. I don't own anything more powerful than 25 W, and nothing I own with the antenna attached directly to the radio (like a handheld) exceeds 10 W; most are 5 W. For me, if I were using anything over 50 W, I'd want a band-pass filter on my antenna feed line to suppress spurious emissions on harmonic frequencies, just to be extra careful.

That all being said, since this guy's exposure has already happened, I wouldn't expect any further issues for him. I'm sure his biological systems have fully recovered, and I don't expect it to resurface. If it had been higher-frequency (ionising) radiation, he would probably already be dead, and even if he weren't, he'd be in for a lifetime of hurt. But yeah, 2.45 GHz is relatively safe by comparison.

Just to note: above X-rays sit gamma rays, the EM portion of the emissions from radioactive materials like nuclear fuel (radioactive decay also throws off alpha and beta particles, which are matter rather than EM waves). Gamma rays sit well into the exahertz range and beyond, while ionising radiation starts around the low petahertz range, at the upper end of UV.

1 EHz = 1,000 PHz = 1,000,000 THz = 1,000,000,000 GHz (for clarity)

[–] [email protected] 1 points 7 months ago

Here is an alternative Piped link(s):

https://m.piped.video/watch?v=3hBRxwQXmCQ&pp=ygUgc3RpY2tpbmcgbXkgaGFuZCBpbiBhIG1pY3Jvd2F2ZSA%3D

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago)

Truly, thank you so much for responding. I love learning from experts.

This is usually combined with a very well insulated cage to prevent that energy from escaping

Faraday cage. Please. I'm a fool, not an imbecile. And to be clear, I'm well aware of the ionizing radiation bands.

However, my concerns lie in extended exposure. I'll relate this to an analogue: sound. Regardless of frequency, sound as quiet as 70 decibels can cause hearing loss after extended exposure, in the territory of 24 hours and longer, mind you. That's as quiet as, say, a hearty conversation or a washing machine.

And this is a dual inverted sliding scale. Hearing-loss zones:

  • | Level | Duration before hearing loss |
  • | 70 dB | 24 h | As quiet as a clothes washer can cause hearing loss. Really.
  • | 75 dB | 8 h |
  • | 80 dB | 2 h |
  • | 90 dB | 1 h |
  • | 95 dB | 50 min |
  • | 100 dB | 15 min |
  • | 105 dB | <5 min |
  • | 110 dB | <2 min |
  • | 120+ dB | Instantaneous |
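For what it's worth, occupational noise standards formalize exactly this trade-off, though with different numbers than my table. A sketch of the NIOSH model (8 h reference at 85 dB(A), permissible time halving per +3 dB; note it won't reproduce my table, which is stricter at the quiet end):

```python
def niosh_permissible_hours(level_dba: float) -> float:
    """NIOSH REL model: 8 h at 85 dB(A), permissible time halves per +3 dB."""
    return 8.0 * 2 ** ((85.0 - level_dba) / 3.0)

for db in (85, 88, 91, 100):
    print(f"{db} dB(A) -> {niosh_permissible_hours(db):.2f} h")
```

The exchange-rate structure (a fixed dB increase halving the allowed duration) is the kind of amplitude-vs-time scale I'm asking about for EM below.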

I'd like to know where the equivalent scales for EM radiation amplitude are. I've read a few studies, but most of them focus on bursts or separated exposures; very few observe sustained continuous exposure.

Also, I'm aware sound and radiation are not apples to apples, but my point about relating energy input and exposure duration stands. If you ask anyone whether 70 dB is safe, everyone will tell you, "Yes. Of course." Which is not correct. Even 60 dB can do further harm if your ears haven't healed from damage sustained immediately prior.

Some small levels of UV 3 might get through (hello skin cancer).

Now you're getting into much more familiar territory. UV-A, the lowest band of UV, is entirely capable of causing sunburns; it just depends on exposure time and pigmentation. Any sunburn has the potential to cause cancer: the more intense the burn and the larger the affected area, the more cells mutate, the higher the chance one of those mutations produces an unstable cancer cell, the higher the chance one of those cells becomes stable, and the higher the chance of metastasis... from non-ionising, low-flux UV-A. Possible, but unlikely, though increasingly likely as duration increases.

These transmitters can be legally and safely placed in urban areas provided adequate separation between the antenna and the public, usually 30-40 meters.

The EIRP drops off quickly in the first few meters after the antenna, as the signal expands outwards towards the service area; so even being within 15m is generally safe.

Inverse square, I'm familiar. And now we're cooking with fire. Let's set frequency aside for a moment, pretending it's irrelevant in the same way mechanical waves' frequencies are.

Let's assume 30-40 meters for a 100 kW antenna, and 15 meters for a ~20 W macro cell, is the threshold for instantaneous minor damage. With each meter you distance yourself, the power density decreases. What do you suppose the limit of energy density is for immediate damage on direct contact? What do you suppose the wattage limit is for sustained direct exposure on the scale of 24 hours, i.e. the equivalent of 70 dB for intensity? What about sustained exposure over several years?
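The inverse-square spread can be inverted to ask the first of those questions directly: at what distance does a given EIRP dilute to a chosen power-density threshold? A sketch, using 10 W/m^2 as an illustrative threshold (roughly the order of ICNIRP general-public reference levels; my pick, not an established damage limit):

```python
import math

def distance_for_density(eirp_w: float, target_w_per_m2: float) -> float:
    """Distance at which isotropic spread dilutes EIRP to a target power density."""
    return math.sqrt(eirp_w / (4 * math.pi * target_w_per_m2))

# 100 kW broadcast transmitter vs a ~20 W macro cell
for p in (100_000, 20):
    print(f"{p} W -> {distance_for_density(p, 10.0):.1f} m")
```

Under these assumptions the 100 kW case comes out around 28 m, in the same ballpark as the 30-40 m separation quoted above, while the 20 W cell only reaches that density within about half a meter of the antenna.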

I don't fear the effects it will have. After all, death will come to us all at some point. However, that doesn't mean I'll be reckless with the time I have left.

Also, I just started studying for my ham license and haven't quite wrapped my brain around a lot of the implications, so your input is very much appreciated. Thanks.