I’m casting a wide net out there asking, because here in Washington I’m treated horribly no matter where I go. Even when I go to the medical clinic for an intake, I get these passive-aggressive, cruel remarks made about me. For example, I was opening up to a physician’s assistant. She pretended to be caring, then put the most cruel sentence on my intake form, which devastated me when I read it later. I won’t say what it was, but it really hurt me. Most of the White medical women are pretty insensitive.
And with the racial slurs I’ve been on the receiving end of, I had this on my mind as I was drinking my morning coffee in my room (I live in transitional housing; I’m currently homeless as of this date): What countries treat Black women well? And I don’t care what race, either: White, Indian, Middle Eastern. I want to be treated like the wonderful woman that I am but truly never, ever have been. Basically, where am I at mentally in my life right now? I don’t care if most people don’t like me; I’m quite selective. I have particular tastes in what I watch and what I read, and I love certain kinds of music, especially certain rock music. I remember when Mom & I first arrived in Washington, fleeing her abusive 2nd husband.
BUT WAIT! I am grateful for this. Despite all the pain, at least I’m not dead. Has anyone tried to leave an abusive man? You go out one of two ways: in a body bag, or you run. Sorry, I digress. Everything else in my life has been like a nightmare.
But I want to find out which places treat Black women well & then go there. I don’t care where it is…well, some places in Africa I couldn’t go to because of the tremendous HEAT! Canada, UK, Sweden, Norway, Australia.
Germany I would love to go to; HOWEVER, now in the 21st century it doesn’t seem that Black women would be welcome. My aunt was a former resident of Munich; she passed away in 1980. I just can’t stand living here. It’s really way too abusive. I’m just saying.