Guns in America examines why so many Americans believe that guns are an essential part of American life.