With ever-expanding computing power, new methods of data analysis have become possible. This article discusses how Lev Manovich, a professor at UCSD, would like to use supercomputers to study current cultural trends. Such a study would draw on three kinds of data: first, media sources such as blogs, web pages, photos, music, and video; second, the traces left by users who comment on, share, review, publish, remix, and edit these media; and third, websites that publish statistics on cultural preferences, popularity, and consumption, along with "meta channels," blogs that track cultural developments.
Cultural data from the past is fixed: it is already stored in archives, museums, and libraries. Given the growth of social networks, user-generated content, and media-sharing websites, however, a supercomputer would be needed to capture, sift, and analyze the data being produced now. Roughly 14 million photos are uploaded to Facebook and 65,000 videos to YouTube every day. The conversation built around this media, in the form of comments and reviews posted on Flickr, YouTube, and similar sites, offers insight into the opinions and ideas of large groups of people, and a tool for understanding their reception of culture.
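To make the "capture, sift, and analyze" idea concrete, here is a minimal toy sketch, not Manovich's actual system: given a batch of user comments about some media item, it sifts them for a few tracked keywords and tallies mentions as a crude proxy for gauging reception. The keyword set and sample comments are invented for illustration.

```python
from collections import Counter

# Hypothetical keywords one might track as a rough signal of reception.
KEYWORDS = {"remix", "share", "love", "boring"}

def tally_keywords(comments):
    """Count how often each tracked keyword appears across comments."""
    counts = Counter()
    for comment in comments:
        for word in comment.lower().split():
            token = word.strip(".,!?")  # drop trailing punctuation
            if token in KEYWORDS:
                counts[token] += 1
    return counts

# Toy sample comments, standing in for a real media-sharing site's feed.
comments = [
    "Love this remix!",
    "Great remix, had to share it.",
    "Honestly a bit boring.",
]
print(tally_keywords(comments))
```

At the scale the article describes, the same counting logic would run over a continuous stream rather than a small list, which is where the supercomputing resources come in.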
Manovich envisions real-time visual displays on large screens: some depicting cultural flows, others showing economic, social, and cultural data side by side. This would broaden situational awareness. One can only imagine how many people would be interested in such data.