The smoothie franchise Jamba Juice was mentioned several times during the “Black Light Attack!” episode of 30 Rock. As Jenna and James Franco work out the details of making their faux relationship believable, Franco mentions having a product placement deal with Jamba Juice. It’s unclear whether Jamba Juice actually paid to be mentioned on 30 Rock, but the show is notorious for blatantly calling out its sponsors in clever, tongue-in-cheek ways. Regardless, the Jamba Juice mentions would probably register with many viewers and be monitored by services that track product placements. Auditory brand integrations are designed to resonate with viewers just like their visual counterparts.
Typically, brand mentions have been coded and tracked only in terms of mention counts and who uttered the brand name. There’s nothing inherently visual about tracking brand mentions, unlike on-screen integrations, which can be timed for their visual duration and image-captured to analyze how much space a brand name or logo occupies within a scene. One of the goals of my product placement Flickr set is to annotate where branding is featured during product placement occurrences. In my set, the size of each Flickr annotation directly corresponds to the size of the branding displayed.
Ever since Hulu introduced Captions Search, I’ve been trying to figure out how to use the feature to express brand mentions visually. I’m fond of the “Heat Map” display, which can outline how far into a program a brand is mentioned. A quick glance at the above Heat Map for Jamba Juice shows that the company was mentioned within the first five minutes of the episode and then again during the second segment. If Jamba Juice sponsored the show, it would probably want its brand name scattered throughout all four segments to ensure audience recall and engagement. The Heat Map could easily assist a media planner or product placement broker in ensuring that a brand mention is reinforced by subsequent mentions.
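To play with the idea, here’s a minimal Python sketch of how caption-search hits could be binned into a Heat Map–style strip across an episode. The timestamps are invented for illustration; I don’t have access to Hulu’s underlying data, so treat this as a mock-up of the concept rather than anything official:

```python
def mention_heatmap(timestamps, episode_minutes, bins=22):
    """Bin brand-mention timestamps (in minutes) into equal segments
    and render a text strip: '#' = at least one mention, '.' = none."""
    counts = [0] * bins
    for t in timestamps:
        i = min(int(t / episode_minutes * bins), bins - 1)
        counts[i] += 1
    return "".join("#" if c else "." for c in counts)

# Hypothetical Jamba Juice mentions: a cluster early on, one more in act two.
strip = mention_heatmap([2.1, 3.5, 4.2, 9.8], episode_minutes=22)
print(strip)  # → ..###....#............
```

A strip like this makes the reinforcement question obvious at a glance: the mentions cluster in the first segment and never return in the back half.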
Captions Search results are displayed as textual dialogue, which makes creating a visualization or infographic difficult. As Captions Search evolves from its beta form into a more dynamic feature, perhaps Hulu will be able to offer results that read like a script and detail character lines in a clearer manner. According to my prior research on product placements, which characters interact with or mention a brand affects how audiences respond to integrations. I noted above which characters mentioned Jamba Juice and even noted that the brand was mentioned during a joke.
If Captions Search were capable of conducting some sort of semantic analysis, highlighting areas of irony or humor would be very beneficial, since people are more likely to remember a brand mentioned during a line of dialogue that made them laugh. Perhaps tones in dialogue could be color-coded, with humorous lines highlighted in yellow and serious, dramatic exchanges displayed in blue. I’m an incredibly visual person and a lover of words, so I’m really eager to begin organizing and analyzing Captions Search’s data on a graphic level. I’m open to any tips on how to go about it, or to program/application suggestions that might be helpful.
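Until Captions Search can do something like this itself, the color-coding idea is easy to mock up. This toy Python snippet tags each caption line with a tone color using simple keyword cues; the cue lists, colors, and caption lines are all my own invented assumptions, and real semantic analysis would obviously require actual NLP rather than string matching:

```python
# Toy stand-in for semantic analysis: the cue words and colors below are
# illustrative assumptions, not a real humor/drama classifier.
HUMOR_CUES = ("[laughter]", "[laughs]", "joke")
DRAMA_CUES = ("[sighs]", "never", "sorry")

def color_code(line):
    """Assign a tone color to one caption line via keyword cues."""
    text = line.lower()
    if any(cue in text for cue in HUMOR_CUES):
        return "yellow"  # humorous beat
    if any(cue in text for cue in DRAMA_CUES):
        return "blue"    # serious, dramatic beat
    return "gray"        # neutral dialogue

# Invented caption lines, just to show the tagging in action.
captions = [
    "I have a product placement deal with Jamba Juice. [laughter]",
    "I'm sorry, I can't do this anymore.",
    "Want a smoothie?",
]
for line in captions:
    print(color_code(line), "-", line)
```

Even this crude version would let you scan a transcript and see where a brand mention lands on a laugh line versus a dramatic one.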