Lately, I have been thinking about how commercial American culture has become. In some cases, I feel as though television shows are only made to fill the space between commercials. And take a look inside any typical television show and see how much product placement is used. Even on commercial-free networks (like HBO) and in movies, there is a ridiculous amount of advertising. So do you believe that American film and television are nothing more than advertisements?