naturism

noun
  • The worship of the powers of nature. 

  • The belief in or practice of going nude in social settings, often in mixed-gender groups, specifically either in cultures where this is not the norm or for health reasons.