If Hitler had won World War 2, we would probably believe that Germany had been doing the right thing. The Holocaust would probably be dismissed as some obscure conspiracy theory you'd be laughed at for believing in, and we would probably all have been led to believe that the rest of the world was up to terrible things during the war. It makes you wonder what we've been led to believe, because there are plenty of "conspiracies" about shadowy goings-on and dubious plans, but who's to say there isn't something commonly accepted as the truth that isn't actually true? Some food for thought; I hope it inspires some interesting debate in the comments.