The latest media talking point: "America is a center-right country." Really?
I don't think so. This is just more desperate spin from fearful right-wing pundits, the same as the "socialist" and "Marxist" and "terrorist pal" crapola. People have been brainwashed by right-wing media clowns like Rush and the Faux News Channel, who have demonized the word "liberal." Why can't they just accept that most Americans DO want progressive solutions: universal healthcare, a higher minimum wage, better public education, and environmental protection?
It's not that we expect government to solve all of our problems; it's that the playing field has been tilted SO far against us (because of a failed conservative ideology) that we now want some government action. Tax cuts for the wealthy and deregulation do not work. The majority of American voters have rejected the right-wing ideology. As someone once said, "ideas have consequences."
Thanks go to the "small-d democrats": an engaged grassroots base of community organizers, millions of volunteers, and over 2 million small donors. These are the people who helped elect Barack Obama, overcoming the forces of money and power that have stood in the way of real change. If we were such a center-right country, why did the Republicans need dirty tricks to try to get elected, while the Dems won on the issues?