People in America seem to get hung up on always being good and doing what's right, but your parents won't tell you that the wrong thing can be perfectly legal. I've lived my whole life in America, born and raised. People seem unwilling to do what's best for themselves, even when the law allows it. Why aren't Americans willing to be a little evil if they won't get in any real trouble for it? It doesn't matter what your teachers, your priest, or Sunday school taught you. If there's no law against something, and you have every right to do it, why wouldn't a person do what's best for themselves, or for their friends and family?
By Joel D Youngquist