There's no proof that being naked is wrong. The only reason you don't run around Walmart with your ass out is that society doesn't like it. I read that there was a study on whether nudism increased confidence, and the findings say it does. There is nothing wrong with it, unless of course you are deeply religious and get offended by the human body.
Naturism is our natural state, and one of the few times that nature doesn't try to kill us. It could promote mental health, strengthen bonds with loved ones, and be a lot more convenient than wearing clothes. Also, we get to see boobs, so when we act like perverts we get more satisfaction.
Everybody has private parts, breasts or a penis, and showing them isn't good. You don't see people walking around with no clothes on whatsoever. This shouldn't be accepted. Why do you think it should? Are you okay with someone looking at you when you are naked? Well, if you want that, then I won't be either of those two people; I would rather stand on the sidelines. Why else would girls wear bras other than to keep their breasts from falling?!