
I've never liked hospitals.
I mean, I know that they're supposed to be places of healing and caring for people, and I know that doctors and nurses are always trying their best to treat patients and make them comfortable, but somehow... I don't know. There's just this uneasy feeling I get whenever I step into a hospital.
I'd keep thinking of death and sickness. I don't know why, but I just automatically...