"American art" redirects here. For other uses, see American art (disambiguation).
The art of the United States refers to all forms of visual art made in or associated with the United States of America since the formation of the country in 1776. The term is sometimes used more broadly to encompass other art forms, such as cuisine, literature, music, dance, and cinema.