• word of the day


Florida - Dictionary definition and meaning for the word Florida

    (noun) a state in southeastern United States between the Atlantic and the Gulf of Mexico; one of the Confederate states during the American Civil War

Example sentence: She's going to Florida for spring vacation.