The effectiveness standard most of us associate with sunscreens — the SPF, or sun protection factor — measures only how well they block UVB rays. That’s because scientists used to believe UVB rays acted alone in causing skin cancer.
UVB rays cause sunburn. So to measure how well a sunscreen blocks UVB rays, testers observe a group of people wearing the sunscreen and a group not wearing it, and time how long each group can stay in the sun before starting to turn red. The SPF rating is the ratio of the first group’s average time to the second group’s. For example, if you use a sunscreen with SPF 15, it should theoretically take about 15 times longer for you to start to burn than it would without the sunscreen. Of course, the SPF assumes that you apply enough sunscreen and that it doesn’t rub or rinse off.
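The ratio works as simple multiplication, which a back-of-the-envelope sketch makes concrete (the function name here is illustrative, not a standard formula from any testing body):

```python
def estimated_protected_minutes(unprotected_minutes, spf):
    """Rough theoretical upper bound on time before burning.

    Assumes a full, even application of sunscreen that does not
    rub or rinse off; real-world protection is usually lower.
    """
    return unprotected_minutes * spf

# If unprotected skin starts to redden after 10 minutes,
# SPF 15 theoretically extends that to about 150 minutes.
print(estimated_protected_minutes(10, 15))  # → 150
```

In practice dermatologists warn that people typically apply far less sunscreen than testers do, so the real multiplier is smaller than the label suggests.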
But scientists now know that UVB rays are not the only cancer causers. UVA rays can do dirty work of their own. So a rating that measures only how well a sunscreen blocks UVB rays gives an incomplete picture of its protection.