From left to right: a random image (scoring between 0.03 and 0.1), a generated NSFW image (scoring 1) and an SFW image (scoring 0)
open_nsfw is an open-source neural network that scores images on a scale of 0 to 1 according to whether they contain 'objectionable' content, i.e. nudity. Images that score high are marked as 'NSFW' (not safe for work). Using generative visualization techniques similar to Google's Deep Dream, Gabriel Goh used the algorithm to generate maximally NSFW images, then minimally NSFW ones, and went on to combine both scores, 'exciting both neurons in the same place' as he explains it.
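The technique behind these images is activation maximization: instead of training the network, one fixes the network's weights and runs gradient ascent on the pixels of the input itself, nudging the image until the classifier's score approaches 1 (or, with gradient descent, 0). The sketch below illustrates the idea on a hypothetical toy classifier — a sigmoid over a linear score — not the actual open_nsfw model:

```python
import numpy as np

# Toy stand-in for a classifier: sigmoid(w . x) maps an "image" x to a
# score in [0, 1]. Goh's experiments use Yahoo's open_nsfw network; this
# linear model is a hypothetical simplification of the same setup.
rng = np.random.default_rng(0)
w = rng.normal(size=64)          # fixed "network" weights

def score(x):
    return 1.0 / (1.0 + np.exp(-w @ x))

def grad_score(x):
    # derivative of sigmoid(w . x) with respect to the input x
    s = score(x)
    return s * (1.0 - s) * w

# Activation maximization: gradient ascent on the input image itself,
# the same idea behind Deep Dream-style visualization.
x = rng.normal(scale=0.01, size=64)
for _ in range(200):
    x += 0.5 * grad_score(x)

# score(x) now approaches 1: a "maximally NSFW" input for this toy model.
# Gradient *descent* on the same score would produce the minimally NSFW
# counterpart; Goh combines both objectives to excite the same neurons.
print(score(x))
```

The real experiments add regularizers (e.g. smoothness priors) so the optimized image stays natural-looking rather than dissolving into adversarial noise, but the core loop is this one.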
His exploration starts with a tongue-in-cheek warning: "This post contains abstract depictions of nudity and may be unsuitable for the workplace."
... while machines learn that the opposite of the zoomed-in pornographic image is a zoomed-out pastoral landscape, we urgently wonder about the moralistic modes of representing, naming, ordering and generating that also shape computer visions (remembering the extent to which those reflect so-called human visions).