How to keep your skin safe during the winter

NBC - Doctors are reminding you of the importance of caring for your skin during the cold winter months.

Your skin tends to become dehydrated in the winter because of low humidity, which can lead to damage on unprotected skin.

And your hands are vulnerable to cracks and cuts that may put you at risk of infection.

Doctors urge you to wear proper clothing, such as gloves, and to use moisturizer.