Anyone wonder why the world has shifted from a somewhat liberal era of optimism and peace into an era of right-wing conservatism that favors strict controls on your mind and your body? What has caused this shift to the right? It can't be blamed entirely on astrology, or on that fat ass who was President and lost re-election. What do YOU think is the reason for this change in philosophy? Chime in here and let us all know what you think.