MyNameisHobby
The short answer: Yes. Yes, they are.
To clarify, I'm not saying we shouldn't have governments. Governments, whether democratic or even communist, are essential to the very existence and stability of our world. Today, however, they also prove to be a massive source of conflict, both between nations and between citizens of the same country.
American politics is usually divided into two camps: the left (Liberals, Democrats, etc.) and the right (Conservatives, Republicans, etc.). Seems simple enough. But if you dig deeper, relations between the two sides have gotten very tense recently.
First, let's take a look at the media. The news doesn't even report news anymore. All it is is blatant propaganda, most notably on CNN and Fox News. CNN is more of a left-leaning network, while Fox News leans right. Both of these networks are EXTREMELY BIASED in my eyes, and both serve the purpose of propagandizing people into trying to get someone out of office or keep someone in office.
Not only that, but leftists and rightists buy into anything and everything the media says. Anything CNN tells the left, they believe. Anything Fox News tells the right, they believe. They don't do any further research, and they dismiss any opposite-leaning network as "fake news" (seriously, 'fake news'? That's the best we can come up with?).
On top of that, politics is FILLED with social justice warriors. Any time you say something negative about their political beliefs, even while respecting their views, they will attack you by any means necessary, verbally and even physically. Just today I saw conservative SJWs on social media telling someone to leave America just for supporting Macron in the French election. And it's all in the name of blatant propaganda.
My view? Put the propaganda aside before people take it too far. Actually tell the damn truth.