Namibia - Dictionary definition and meaning
Namibia
Definition (noun) a republic in southwestern Africa on the south Atlantic coast (formerly called South West Africa); achieved independence from South Africa in 1990; the greater part of Namibia lies on the high plateau of southern Africa