American

People who live in the United States. Note that many people in other parts of the Americas, particularly Latin Americans, also see themselves as Americans and consider America a region, not a country. Some view the use of American as a national identity label as imperialist. However, American remains the most widely recognized and commonly used term for people who live in the United States of America.
