Sunday, September 12, 2010

Wireless Controversy Rebuttal

A former student tried to comment on a previous post, but the comment was blocked apparently because it was too long. I'll put it here instead:

Hi Marie, Devin here with the comment I promised. This will get pretty sciencey, and I'm glossing over some parts in a hand-wavey manner simply because there's too much material to cover. If you have any questions please don't hesitate to ask; this is something that would be easier to explain in person, and it's no trouble for me.

To understand why I don't think cellphones and WiFi are harmful you need to understand electromagnetic waves. But first you need to understand a bit about waves in general.

All waves have two basic properties (that we're concerned with): wavelength and frequency. There's a nice image that explains what wavelength is here. Wavelength is the distance it takes for a given wave to complete one full cycle, and its standard unit is meters.

Frequency is the number of cycles the wave can complete per second. This is measured in Hertz (which is the same unit as 1/seconds).

The velocity of a wave is given by the equation v = λf, where v is the velocity, λ (the lowercase Greek letter lambda) is the wavelength, and f is the frequency.

Looking at the equation this should be fairly intuitive. Think of the classic v = d/t (velocity is distance per unit time). Our wave equation above is identical: we have wavelength (distance) times frequency (1/time).

Now the neat thing about electromagnetic waves is that they always travel at the speed of light, c. So the equation above becomes c=λf.

From this equation you can see that if you increase the wavelength, the frequency must decrease (since otherwise their product would no longer equal c). The opposite is also true.
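To make the relationship concrete, here is a small Python sketch. The frequencies are round, representative numbers I'm assuming for illustration, not exact values for any particular device; it simply applies c = λf to find the wavelength of a few familiar signals:

# Wavelength from frequency: c = wavelength * frequency, so wavelength = c / f
C = 3.0e8  # speed of light in m/s (rounded)

signals = {
    "FM radio (100 MHz)": 100e6,       # frequencies in hertz
    "WiFi (2.4 GHz)": 2.4e9,
    "Green light (~540 THz)": 5.4e14,
}

for name, freq in signals.items():
    print(f"{name}: wavelength = {C / freq:.3g} m")

Running this gives roughly 3 m for FM radio, about 12.5 cm for WiFi, and a bit over half a micrometre for green light: the higher the frequency, the shorter the wavelength, exactly as the equation demands.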

Now we can talk about the electromagnetic spectrum found here. As you can see, on the left there is decreasing frequency as you progress down the chart, and an increasing wavelength on the right.

Note the positions of radio waves and microwaves relative to visible light and ultraviolet (sunlight). Radio waves have a very low frequency compared to visible light (seven orders of magnitude lower).

The energy carried by any electromagnetic wave is given by the equation E=hf, where E is the energy, h is Planck's constant and f is the frequency. Planck's constant is just a number (a very small number); for the sake of this example we can pretend it's one, so that it's clear the energy is directly proportional to the frequency.

Clearly then, if a radio wave at around 10^8 Hz carries 10^8 of these pretend energy units, visible light at around 10^15 Hz carries 10^15! Radios obviously operate in the radio wave range; cellphones and WiFi operate in the microwave range. That's still about five orders of magnitude lower than visible light.
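To put real numbers on this instead of pretending h is one, here is a short Python sketch (the frequencies are round values I'm assuming for illustration):

# Photon energy E = h * f, using the actual Planck constant
H = 6.626e-34  # Planck's constant in joule-seconds

frequencies = {
    "Radio (100 MHz)": 1.0e8,            # hertz
    "WiFi / microwave (2.4 GHz)": 2.4e9,
    "Visible light (~500 THz)": 5.0e14,
}

for name, f in frequencies.items():
    print(f"{name}: {H * f:.2e} joules per photon")

ratio = frequencies["Visible light (~500 THz)"] / frequencies["WiFi / microwave (2.4 GHz)"]
print(f"A visible-light photon carries ~{ratio:.0e} times the energy of a WiFi photon")

The absolute energies are tiny (around 10^-19 joules for visible light), but the ratio is what matters: roughly 2 x 10^5, i.e. about five orders of magnitude.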

If you take nothing else away from this, understand this part: the light coming out of your lightbulb is bombarding you with photons that each carry far more energy than anything your cellphone or WiFi router could ever push out.

Of course, once you pass a certain energy threshold the radiation becomes what is called "ionizing radiation", and that's why you wear sunscreen for ultraviolet light or a lead vest for X-rays.
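For a sense of where that threshold sits, here is a rough Python check. The 10 eV figure is an assumption on my part (a commonly quoted ballpark for ionizing typical molecules; the real cutoff is fuzzy and material-dependent, and near ultraviolet already does damage through other mechanisms):

# Compare per-photon energy, in electron-volts, against a rough ionization threshold
H_EV = 4.136e-15       # Planck's constant in eV * s
IONIZATION_EV = 10.0   # assumed ballpark energy needed to ionize a typical molecule

photons = {
    "WiFi (2.4 GHz)": 2.4e9,             # hertz
    "Visible light (~500 THz)": 5.0e14,
    "Far ultraviolet (~3 PHz)": 3.0e15,
    "X-ray (~10^18 Hz)": 1.0e18,
}

for name, f in photons.items():
    energy_ev = H_EV * f
    verdict = "ionizing" if energy_ev >= IONIZATION_EV else "non-ionizing"
    print(f"{name}: {energy_ev:.2e} eV -> {verdict}")

A WiFi photon comes out around 10^-5 eV, a million times short of the threshold, while X-ray photons sail past it by a factor of hundreds.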

Again, this is a lot to understand and I'm breezing through several hundred years worth of scientific discoveries in a few paragraphs, so if you have any questions please let me know and I'll do my best to clarify.

Devin

Okay - here are some questions:

1. How do we explain incidents where headaches, blurred vision, etc. increased in a group of people once a new transmitter was turned on, and their symptoms disappeared as soon as it was turned off - even though they had no knowledge of the transmitter? Coincidence? Or the fact that tumours of the auditory nerve are three times more frequent in people who have used cell phones for more than a decade, and always on the side they favour, as reported by Dr. Devra Davis in The Secret History of the War on Cancer? I know they're only showing a correlation and not necessarily causation, but it still gives one pause.

2. If we held lightbulbs up to our heads for many hours a day, would we likely increase our chances of getting tumours? That is, does our very close contact with these devices increase the chance of the microwaves affecting us, even if the radiation is extremely low?

3. And, this might be a stupid question, but if microwaves are not at all harmful to us, how is it that they can be used to cook meat? I mean, I've made little cakes with a lightbulb in an Easy Bake Oven, but I don't think the lightbulb could cook a piece of steak as efficiently as a microwave oven. I suspect the microwaves are used differently in that application than in phones and such; is that the case? As far as I understand it, microwaves cook food by making molecules vibrate. Why isn't it possible for phones to have the same effect?

Comments:

Devin said...

I'll split up my answers into multiple comments so hopefully they'll fit this time.

I'm glad you asked about microwaves, and it isn't a stupid question because most people have no idea how they work. You are correct that microwave ovens work completely differently from what I was describing above. The energy equation I gave (E=hf) is the energy carried per photon (the force-carrying elementary particle of electromagnetism). When a photon hits something it transfers energy to it, similarly to the way a billiard ball transfers energy when it hits another ball.

Microwave ovens work on a very different principle. Water molecules (and many others) are polar, basically meaning they have a positive and negative end. You can see an image of a water molecule here. The two white dots are the hydrogen atoms with a positive charge, and the red dot is the oxygen atom with a negative charge.

Electromagnetic waves are (as you would expect) varying electric and magnetic fields. When the electric field is near the water molecules, they'll rotate to align themselves with it. For example, when the field is positive, the (negative) oxygen atom will turn towards it (opposites attract). Temperature is simply the average kinetic energy of a bunch of molecules. Rotation is kinetic energy, so spinning the molecules increases the energy they have, which raises their temperature.

How much heating you get depends on the frequency. The electric field needs to switch between positive and negative at roughly the right rate or the water molecules won't gain much energy. If it switches too slowly, they don't spin enough and there's little gain. You can think of this like rubbing your hands together slowly: if you do it fast, your hands warm up; if you do it slowly, the heat dissipates as fast as you produce it, so there's no net gain.

Conversely, if it switches too fast the molecules don't have enough time to fully rotate. This is sort of analogous to someone shouting at you to face east, then face west alternating. If they're talking as fast as they can, you can't possibly turn fast enough to face the opposite direction before they're already telling you to face where you are. So you'll just stay put.

It's true that WiFi uses electromagnetic waves in the same range as microwave ovens (the 2.4 GHz WiFi band actually overlaps the 2.45 GHz an oven uses), but the power levels are worlds apart: an oven pumps several hundred watts into a small sealed metal box, while a router radiates a fraction of a watt into the open air around you. If a router ran at oven power, you'd feel your skin burn every time you walked by one.
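To put rough numbers on that power difference, here is a back-of-the-envelope Python sketch. The figures (an 800 W magnetron, a 0.1 W router, standing one metre away) are order-of-magnitude assumptions, not specs for any particular device:

import math

OVEN_POWER_W = 800.0     # assumed magnetron output, confined inside the oven cavity
FOOD_AREA_M2 = 0.05      # rough cross-section the food presents to that power

ROUTER_POWER_W = 0.1     # assumed WiFi transmit power
DISTANCE_M = 1.0         # how far you're standing from the router

oven_density = OVEN_POWER_W / FOOD_AREA_M2
router_density = ROUTER_POWER_W / (4 * math.pi * DISTANCE_M ** 2)  # spread over a sphere

print(f"Inside the oven:     ~{oven_density:.0f} W/m^2")
print(f"Near the router:     ~{router_density:.4f} W/m^2")
print(f"Oven / router ratio: ~{oven_density / router_density:.1e}")

With these assumptions the oven delivers on the order of a million times more power per square metre than the router does, which is why one cooks dinner and the other doesn't.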

Devin said...

With respect to your second question, yes and no. The intensity of an electromagnetic wave does decrease (by an inverse square law, like lots of stuff in physics) as you move farther away from it. However, the energy that each photon carries remains the same.

When I say intensity, what I mean is that the power carried by the wave decreases; you're receiving less total energy over time. You can think of each photon as a pebble: when you're far from the source you may only be hit by two or three of them per second, while up close you're getting hit by nine or ten. This is why the Earth is relatively warm compared to Pluto, which is much farther away from the sun (atmospheric effects ignored).
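Here's a small Python sketch of that pebble picture, assuming a hypothetical 0.1 W WiFi-frequency source radiating equally in all directions and counting photons landing on a one-square-centimetre patch:

import math

H = 6.626e-34        # Planck's constant, J * s
FREQ = 2.4e9         # hertz
POWER_W = 0.1        # assumed transmit power
PATCH_M2 = 1e-4      # a 1 cm x 1 cm patch

photon_energy = H * FREQ   # joules per photon -- the same at every distance

for r in (0.5, 1.0, 2.0, 4.0):
    intensity = POWER_W / (4 * math.pi * r ** 2)        # W/m^2, inverse square law
    photons_per_second = intensity * PATCH_M2 / photon_energy
    print(f"at {r} m: {photons_per_second:.1e} photons per second on the patch")

Doubling the distance cuts the arrival rate to a quarter, but photon_energy is a single constant: each photon carries exactly hf no matter how far it has travelled.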

However, there is an extreme difference between being hit by many low-energy photons and being hit by a few high-energy photons. The total energy might be the same, but the effects are quite different. Ionizing radiation (typically radiation from the ultraviolet range and up) is the kind that gives you cancer, and the kind emitted by nuclear bombs. It is radiation that removes electrons from an atom (creating an ion) by force.

When a photon hits an atom, it either carries enough energy to rip off an electron or it doesn't. This is more or less binary. It doesn't matter if you're hit by a billion low energy photons, if one of them doesn't have enough energy to ionize an atom, then all of them together can't either.
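A quick numerical version of that point, using the same rough 10 eV ionization ballpark as before (an assumed figure; the real cutoff varies by material):

H_EV = 4.136e-15        # Planck's constant in eV * s
IONIZATION_EV = 10.0    # assumed ballpark ionization energy per molecule

wifi_photon_ev = H_EV * 2.4e9        # one 2.4 GHz photon, ~1e-5 eV
total_ev = 1e9 * wifi_photon_ev      # a billion of them, ~1e4 eV combined

print(f"One WiFi photon:   {wifi_photon_ev:.1e} eV (threshold {IONIZATION_EV} eV)")
print(f"A billion of them: {total_ev:.1e} eV in total")
print("Can any single one ionize an atom?", wifi_photon_ev >= IONIZATION_EV)

The combined total looks impressive, but it can't be pooled onto one electron; the per-photon check is the only one that matters, and it fails by a factor of about a million.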

Think of stabbing a piece of tough leather with a toothpick. You can stab it with one toothpick or a hundred, and you're still not going to pierce it. On the other hand if you stab it with a sharp enough knife, you'll get through.

As a side note, the light bulb is just a common example. Almost anything that creates heat or light also gives off electromagnetic radiation. Your computer monitor, your stove, even you! Humans give off infrared radiation all the time, which carries more energy per photon than microwaves or radio waves. Getting close to light bulbs isn't something that happens often, but snuggling in bed with someone will irradiate you at very close range.
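To give that some scale, here is a rough Python estimate using Wien's displacement law, taking body temperature as about 310 K (the numbers are approximate):

WIEN_B = 2.898e-3    # Wien's displacement constant, m * K
C = 3.0e8            # speed of light, m/s
BODY_TEMP_K = 310.0  # roughly body temperature

peak_wavelength = WIEN_B / BODY_TEMP_K   # metres; where body heat radiates most strongly
peak_frequency = C / peak_wavelength     # hertz

print(f"Peak of body radiation: {peak_wavelength * 1e6:.1f} micrometres (~{peak_frequency:.1e} Hz)")
print(f"That's ~{peak_frequency / 2.4e9:.0f} times the frequency (and photon energy) of 2.4 GHz WiFi")

So the person next to you is radiating infrared photons that each carry more than ten thousand times the energy of a WiFi photon, and nobody worries about that.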

Devin said...

Finally, I have no explanation for what those people felt. You're right when you say there isn't solid evidence (negative or positive) linking long-term exposure and cancer. My defense is based on the fact that nothing we know gives any indication that it should cause cancer. There is still the one-in-several-billion chance that the exact frequencies and arrangements we use somehow injure us through some unknown mechanism.

Personally, I don't worry about it. If these devices worked like microwave ovens, you'd feel the effect immediately (and notice the burns on your skin). I live and work in an environment where I'm constantly bombarded with radio waves and microwaves twenty-four hours a day, and I don't notice any immediate adverse effects (such as the headaches and blurred vision reported). What I know about how they work tells me there's no way they should cause cancer. Maybe they still do, but I have enough known threats to deal with without spending time worrying about what may be.

Those are just my own anecdotal experiences and opinions, but the physics is all fact. Truth with a capital T. And don't take my word for it either: go read up about this stuff on Wikipedia, or if you're a student, take a physics course.

Devin said...

Your comment system is deleting my comments and driving me insane. I've uploaded the full text at http://dl.dropbox.com/u/180770/comment.html again and you can format it how you like. Fingers crossed that this one actually stays put.

Marie said...

Thanks for all these explanations, Devin. For some reason, your comments are being received as spam. I'll see if I can fix that.