Since the beginning of 2009, there has been a significant shift in the focus of the "war on terror". Right after 9/11, the American people were told that the terrorists were Islamic radicals and that we needed to go fight them in the Middle East so that they would not come attack us here at home. But now many anti-terrorism reports put out by the government barely mention Islam at all. Instead, many of them are primarily focused on "domestic right-wing extremists".

You may not think of yourself as a "domestic right-wing extremist", and I certainly do not, but the truth is that patriotic Americans and conservative Christians have been repeatedly labeled as "potential terrorists" since Barack Obama became president. If this had happened just one time, it would be easy to dismiss. Sadly, there has been a steady pattern of this happening over the past several years. Large groups of people that are the heart and soul of this country have been systematically demonized over the past four years. When you consider what history has taught us, it is absolutely chilling to think about what this could eventually lead to.