Heretic
Guest
By now we all know that Islamic countries teach their schoolchildren that the West is evil, America is evil, Christians are evil, and Jews and Israel are worse than evil. Even countries that are supposedly allies, such as Saudi Arabia, teach their children this. Why do they do it? A friend of mine suggested it is to keep their people from demanding freedoms like those we Americans have. Is it to maintain strict religious control of their countries? Or are there more sinister reasons behind it?
Recently I have seen a lot of rhetoric from people saying that Islam preaches peace, tolerance, etc. Is this just politically correct banter, or is it actually the truth? Why do so many Muslims hate a country that gives them many personal freedoms and treats them better than their homeland?
Having gone to graduate school, I have seen many Muslim students live it up in America while still believing that our society is basically evil.
Any Muslims out there care to elaborate?