I’m trying to find a documentary from the late 80s or early 90s, I think. I saw it on TV in my mid teens. It’s about society’s perception of nudity and nudists, going from the US to Cap d'Agde.
Unfortunately no. I know that film, it’s good. But this was in a real documentary style. It talked about how nudity is perceived in the States, showing a lady being photographed in her home through the glass door. Then it talked about camps/resorts, showing in-resort footage: people standing around talking, using the pool, that sort of thing.