Established in 1972, the UF College of Dentistry is the only publicly funded dental school in the State of Florida and a national leader in dental education, research, and community service.