i can't be the only one... who's never kissed
anyone, held hands with someone or had a relationship that
actually means something...
...omg, i am, aren't i?
I've never
had a boyfriend, but I think it feels like you're safe.
Like no one's going to hurt you.
It feels like someone actually cares about you. Like he
doesn't care what you look like, or what your body looks
like. He loves you for who you are. He makes you feel like
you deserve to be happy. At least, that's what I think.
I've seen that happen to friends.