Feeling like you're a part of something when, in actuality, no one accepts you has me pondering: what is a relationship, and why does it matter to everyone? You were born into a family, a family put on this earth to guide you and protect you, so when did outsiders come to play such a huge role in our lives? Shouldn't family be enough? What happens when your own flesh and blood doesn't accept you? Is that when we turn to strangers to help us feel better? Is turning to strangers a way of feeling the slightest bit accepted or appreciated? Are we fools to think anyone but ourselves has our best interests at heart?
Being "accepted" comes with years of experience. No one really accepts or respects you if you can't give them anything of value.