People in America seem to get hung up on always being good and doing what's right, but your parents won't tell you that the wrong thing can be legal. I've lived my whole life in America, born and raised. People seem unwilling to do what is best for themselves, even when the law allows it. Why are Americans not willing to be evil if they won't get in real trouble for it? It doesn't matter what your teachers, your priest, or Sunday school taught you. If there is no law against something, and you have the right to do it under the law, why wouldn't a person do what's best for themselves, or for their friends and family?
By Joel D Youngquist