
Cover Averages: The Process

Accessing the Data

We used the Wayback Machine as our source for archived records of carleton.edu. For each year, we opened the final snapshot from that year and took a screenshot. To keep the screenshots as close to identical in size as possible, we took them in full-screen mode and zoomed out as far as we could so that the entire page fit on the screen. All of the screenshots were saved into a single desktop folder.
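We collected the snapshots by hand, but the snapshot URLs could also be located programmatically through the Wayback Machine's public CDX API. The sketch below is only an illustration of that idea, not part of our workflow; the helper name, the status-code filter, and the field list are illustrative choices rather than anything we actually ran.

```python
"""Find the final Wayback Machine snapshot of carleton.edu for each year.

A minimal sketch using the public CDX API; the actual screenshots were
gathered by hand, so this only shows how the snapshot URLs could be found.
"""
import requests

CDX_API = "http://web.archive.org/cdx/search/cdx"

def last_snapshot_per_year(url="carleton.edu", start=1996, end=2020):
    snapshots = {}
    for year in range(start, end + 1):
        params = {
            "url": url,
            "from": f"{year}0101",
            "to": f"{year}1231",
            "output": "json",
            "filter": "statuscode:200",   # keep only successful captures
            "fl": "timestamp,original",
        }
        rows = requests.get(CDX_API, params=params, timeout=30).json()
        if len(rows) > 1:                 # first row is the header
            timestamp, original = rows[-1]  # last capture of the year
            snapshots[year] = f"https://web.archive.org/web/{timestamp}/{original}"
    return snapshots

if __name__ == "__main__":
    for year, link in sorted(last_snapshot_per_year().items()):
        print(year, link)
```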

Processing the Data

We used Adobe Photoshop to construct the average covers. The screenshots were layered on top of one another, with the earlier screenshots at higher opacity and the more recent ones at lower opacity so that the layers at the bottom remained visible. Since our main goal with this average website design was to see how the site has changed, we manually aligned and layered the screenshots so that repeated design elements lined up as closely as possible. This ensured that the recurring elements would form the strongest pattern and stand out in the overall averaged design.
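For readers who prefer code to Photoshop, the opacity stack can be approximated as in the sketch below. It is a rough Python/Pillow equivalent, not our actual workflow: it assumes same-sized screenshots named by year (e.g. 1996.png), and it reproduces only the opacity blending, not the manual alignment of repeated elements. Blending layer k at opacity 1/k keeps the earlier, lower layers visible and works out to an equal-weight average of every layer.

```python
"""Approximate the Photoshop opacity stack in code.

A rough sketch under assumed file names (1996.png ... 2020.png in one
folder); the manual alignment step from the Photoshop workflow is skipped.
"""
from pathlib import Path
from PIL import Image

def average_covers(folder, start, end, size=(1440, 900)):
    """Stack one screenshot per year, newest on top at opacity 1/k."""
    average = None
    for k, year in enumerate(range(start, end + 1), start=1):
        layer = Image.open(Path(folder) / f"{year}.png").convert("RGB").resize(size)
        if average is None:
            average = layer  # bottom (earliest) layer at full opacity
        else:
            # blending layer k at opacity 1/k yields an equal-weight average
            average = Image.blend(average, layer, 1.0 / k)
    return average
```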

We decided to break the 1996-2020 cover average into three separate period averages, starting a new one each time there was a drastic change to the design. We noticed that, from year to year, the design was either identical to the previous year or completely new, so breaking the data into any smaller timeframes would have produced averages that look the same. To make the cover average across 1996-2020, we layered every screenshot we took. To make the others, we layered only the screenshots within the given timeframe. For example, when making the cover average across 1996-2007, only the screenshots from those years were used.
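In terms of the hypothetical average_covers sketch above, each of these averages is just a different year range. Only the 1996-2007 break is spelled out in this post, so the other two period ranges are left out of the example.

```python
# Usage of the average_covers sketch above (same assumed file layout).
average_covers("screenshots", 1996, 2020).save("average_1996_2020.png")
average_covers("screenshots", 1996, 2007).save("average_1996_2007.png")
```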

We want to point out that the website captures on the Wayback Machine may have been resized so that they fit a current-day laptop screen better. Our entire data acquisition and analysis process was done on an Apple MacBook Air, which is very different from the devices that were used during the 1990s and 2000s. The screenshots we took, and therefore these cover averages, might have looked very different if we had captured them on an older computer where the website would not have been modified or distorted. This opens up an idea for a future study, where the process is identical to what we did here but the data acquisition is done on an older computer, one similar to those that would have been in use at the time of a given carleton.edu design.
