Tom in Green Island asks:
What is the difference between the terms dew point and relative humidity? And
why do meteorologists use dew point rather than relative humidity to describe how uncomfortable or oppressive an air mass feels?
Dew point is the temperature at which the air becomes “saturated” – in other words, it is holding all the water vapor it possibly can. If the temperature and the dew point equal each other, the water in the air will begin to condense and dew can form, hence the name.
Relative humidity is a related term. It is defined as the ratio of how much water vapor is in the air versus how much the air could potentially hold, expressed as a percentage. A higher percentage means that the air is holding a lot of water vapor and is more humid.
A relative humidity of 100% means that the air is saturated and water begins to condense, the same as when the temperature equals the dew point.
Let's run through an example that shows how these two terms, while closely related, can reflect very different things over the course of a single day.
Early in the morning, let’s say that the dew point is 50 degrees. For an overnight low, our temperature drops to 50 degrees. The relative humidity is 100% and dew forms.
Later in the day, the temperature rises to 70 degrees. No moisture is being added to or taken away from the air, so the dew point stays at 50. It would feel just about as humid to you if you stepped outside. Even so, the warmer air can now hold more water, and the relative humidity drops to 49%.
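That 49% figure can be checked with a little arithmetic. Here is a quick sketch in Python using one common approximation for saturation vapor pressure, the Magnus formula; the function names are my own, and other approximations would give slightly different numbers.

```python
import math

def saturation_vapor_pressure(temp_c):
    """Approximate saturation vapor pressure (hPa) via the Magnus formula."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def relative_humidity(temp_f, dew_point_f):
    """Relative humidity (%) from temperature and dew point in Fahrenheit."""
    temp_c = (temp_f - 32) * 5 / 9
    dew_c = (dew_point_f - 32) * 5 / 9
    # RH = (actual vapor pressure) / (saturation vapor pressure) x 100.
    # The actual vapor pressure is the saturation pressure at the dew point.
    return 100 * saturation_vapor_pressure(dew_c) / saturation_vapor_pressure(temp_c)

print(round(relative_humidity(50, 50)))  # 100 at the overnight low
print(round(relative_humidity(70, 50)))  # 49 later in the day
```

Note that the dew point input never changed between the two calls; only the temperature did, which is exactly why the relative humidity fell.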
I have seen meteorologists use either term on air, and both are useful in their own way. I think it comes down to a matter of personal preference. Relative humidity varies based on both the moisture in the air and the temperature. I typically use dew point, because it reflects just the moisture, which is what we want to know about anyway.
Thanks to Tom for the question! If you’ve got a weather question you want answered in Weather 101, shoot us an email! You can reach me at email@example.com