What we prohibit, and why we prohibit it, is, or should be, grounded in considerations of sensitivity that make us safer, stronger or more resilient. Ironically, the opposite is sometimes the case: in seeking to be more secure we actually become more vulnerable.
I remember assessing the vulnerability of a data centre for a large multinational client based in a faraway land that shall remain nameless. The cooling systems for the computers had been removed from the roof and placed only a short grenade throw from the fence and the adjacent roadway, all to accommodate a "Barbeque Vista" for executives at the end of the week. My report, like the child's ubiquitous homework, was "eaten by the dog"; my camera's memory card was confiscated; and evidence of my consulting work was erased, even to the extent of my not being paid. Just "no photos / no paperwork" in another of its various forms.
I ought to have known better, courtesy of an early indicator of this culture which I will call the "lost luggage fiasco". You may have already picked where this is going. I didn't mind my luggage not being on the carousel. I didn't mind it taking more than three days to eventually turn up. I did mind my local helper, who went to collect the luggage and the associated paperwork needed for insurance, being bullied: they refused to hand over the luggage until he surrendered all of my paperwork! No paperwork, no foul.
Our own domestic culture has different but similar weaknesses in the way we build a (false) sense of security. The illustration that springs readily to mind is a gig I did for a critical infrastructure T.I.S.N. (Trusted Information Sharing Network) group, which commissioned me to facilitate a desktop exercise. I could not access their risk assessments because I was not cleared. So, using only public domain sources to identify key vulnerabilities, I built a credible scenario that would bring down interdependent infrastructure and grind a capital city to a halt. The expert risk assessments had yet to identify this particular weakness, and the gap in the risk register was too embarrassing for me to use as the opening of the discussion exercise. So we had a "credible, respected stakeholder" announce that a "realistic and feasible scenario has been identified which would have the following impacts ... prepare to explore what this means". For me it raised some serious questions about relying on a closed set of secret experts. It was, in a different way from the examples above, another way of saying "no photos".
These "defensive cultures" are not always a bad thing however it is always worth keeping an eye on - and checking that when we apply this approach, it itself is supported by risk based considerations beyond the narrow lens of technocrats or people caught in an overly sensitive bubble.