i’m always so amazed when i hear americans say shit like “healthcare is a privilege, not a right”. like how do you reach a point in your life where you think people deserve to die because they can’t afford healthcare. what in the actual fuck is wrong with you.
We are taught at an early age that money is everything and that if you don't have money, your life isn't worth living.