
Florida


    Definition
(noun) a state in the southeastern United States between the Atlantic and the Gulf of Mexico; one of the Confederate states during the American Civil War

Example: She's going to Florida for spring vacation.