Human Signals, Product Insight: How AI Interprets Emotional Reactions in Video
Video content that captures users interacting with products, sharing reviews, or comparing alternatives provides a rich, multimodal dataset for understanding user experience. Using video emotion recognition across multimodal signals such as facial expressions, vocal tone, and body language, AI emotion recognition platforms like Imentiv AI decode subtle cues of satisfaction, hesitation, or frustration. This level of emotion recognition AI enhances how product teams interpret user emotional reactions, identify usability gaps, and refine features based on real human feedback, especially during product testing.
To demonstrate the range and relevance of emotion analysis, we examined three distinct videos where user emotional reactions play a critical role in shaping product perception.
The first features a tech creator discussing Apple’s rumored foldable phone in comparison to existing foldables, an exchange rich with speculative excitement, brand contrast, and layered emotional cues.
Case 1: Anticipating Innovation, Apple’s Rumored Foldable Phone
The first video covers the rumored launch of Apple’s first foldable phone, drawing comparisons with existing foldables, particularly around screen design and pricing. The creator speculates on features, highlights market skepticism (especially around the price issue), and reacts to the projected $2,300 price tag with a dose of humor. This mix of brand expectation, speculation, and critique presents a compelling emotional profile that reflects how consumers engage with anticipated tech products, even before they exist.
Multimodal Emotion Analysis
- Video Emotion Data shows a strong presence of happiness (43.07%) and neutral (33.57%), with notable surprise (9.75%) and mild sadness (6.76%), indicating a tone that blends excitement with subtle concern, captured through precise facial expression recognition.
- Audio Emotion Analysis (Speech Emotion Recognition) reveals a striking mix of surprise (31.1%), happiness (27.12%), and strong negative tones like disgust (20.73%) and anger (16.46%), suggesting that despite the humor, the speaker communicates frustration vocally, especially about the rumored pricing.
- Transcript Emotion Data (Text Emotion AI) detects high annoyance (53.63%) and disapproval (16.26%), with lighter expressions of approval and admiration, capturing the speaker's playful yet conflicted emotional reaction to Apple’s rumored direction and pricing strategy.
Dominant Personality Trait: Agreeableness. Despite the sharp remarks, the speaker maintains a light, conversational tone, offering both critique and appreciation. The tone is cooperative, socially engaging, and designed to connect with the audience, even when delivering criticism.
Such personality-driven delivery can shape early impressions in user interface testing, since emotional tone often colors how potential users perceive usability and brand intent.
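For teams that want to work with numbers like these outside the dashboard, here is a minimal sketch of how such per-modality scores could be held and roughly combined. The dictionary layout, emotion labels, and equal weighting are illustrative assumptions for this post, not Imentiv's actual output schema or fusion method:

```python
# Illustrative only: hold per-modality emotion scores (like the ones above)
# and blend them into a rough overall profile. Equal weighting is an assumption
# for demonstration, not Imentiv's scoring method.
from collections import defaultdict

case1_scores = {
    "video":      {"happiness": 43.07, "neutral": 33.57, "surprise": 9.75, "sadness": 6.76},
    "audio":      {"surprise": 31.10, "happiness": 27.12, "disgust": 20.73, "anger": 16.46},
    "transcript": {"annoyance": 53.63, "disapproval": 16.26},
}

def blend(modality_scores, weights=None):
    """Average emotion scores across modalities (equal weights by default)."""
    weights = weights or {m: 1 / len(modality_scores) for m in modality_scores}
    blended = defaultdict(float)
    for modality, scores in modality_scores.items():
        for emotion, value in scores.items():
            blended[emotion] += weights[modality] * value
    return dict(sorted(blended.items(), key=lambda kv: kv[1], reverse=True))

print(blend(case1_scores))  # emotions ranked by their averaged score across modalities
```

Even a rough blend like this makes the tension visible: positive facial signals sit alongside vocal and textual frustration, which is exactly the kind of conflict the next section interprets.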
Psychological Interpretation
This case exemplifies how Appraisal Theory functions in a speculative context. The user evaluates Apple’s rumored innovation against known standards (e.g., Samsung foldables) and personal values (affordability, utility). The resulting blend of hope, skepticism, and frustration stems from conflicting appraisals: the product seems promising, but the price undermines that appeal.
We also see Affective Forecasting in action. The speaker imagines owning the product (“iPhone and iPad in one”), projecting future satisfaction, yet humorously undercuts it with exaggerated pricing anxiety (“I might need to sell a kidney”). These anticipatory emotions, decoded through our video emotional intelligence, offer early signals that support product testing before a launch, even when the product exists only in rumors and renderings.
Insight for Product & Marketing Teams
This type of emotion-driven product anticipation is rich with insight. Emotion AI helps decode the emotional narrative consumers build before a product launch. For product teams, this is a critical layer of pre-launch feedback, highlighting emotional triggers (e.g., price sensitivity, innovation hype, brand trust) that shape early perception. Marketing teams can also use this data to tailor messaging around innovation vs. value and address skepticism directly in product narratives.
Dive into the Apple Foldable Phone video and explore its emotional dynamics in the Imentiv dashboard — analyze the emotions yourself.
The second presents a user’s negative reaction to an Apple product launch, revealing how even trusted brands can face emotional pushback when usability expectations aren’t met.
Case 2: When Brand Trust Meets Product Frustration, Apple Magic Mouse Reaction
Despite Apple’s strong brand equity, this video reveals how users can experience emotional friction when a product doesn’t meet expectations. The speaker openly criticizes the design flaws, especially the charging port placement, ergonomics, and pricing. The overall tone balances practical acknowledgment with visible disappointment, a compelling mix of rational and emotional feedback. Insights like these are critical for usability testing, where emotion recognition AI helps decode how trusted products can still miss the mark for users.
Multimodal Emotion Analysis
- Video Emotion Data shows the dominance of neutral (53.64%), but notable traces of sadness (13.14%), disgust (4.18%), and even contempt (2.88%), indicating restrained frustration layered beneath a calm delivery. This layered expression is precisely what facial expression recognition tools are designed to detect.
- Audio Analysis intensifies the emotional tone, with a high disgust score (21.86%), suggesting strong disapproval in the voice, even if the face remains composed.
- Transcript Emotion Analysis captures the speaker’s annoyance (47.31%), paired with realization and disapproval, reflecting his internal conflict between acknowledging the mouse’s usefulness and rejecting its flawed design.
Dominant Personality Trait: Openness. This matches the speaker’s willingness to express criticism honestly while still acknowledging value (e.g., the touch-sensitive scrolling feature). Emotion AI platforms like Imentiv provide these multidimensional insights, transforming user feedback into actionable product-level improvements.
Psychological Interpretation
This reaction is a clear example of Cognitive Dissonance. The speaker trusts the Apple brand and appreciates the touch interface, yet feels discomfort and irritation due to poor usability and design. His emotional state reflects conflicted appraisals, a key idea in Appraisal Theory, where the product partially aligns with his goals (video editing convenience) but violates expectations (price, ergonomics).
The emotional mix also connects with Affective Forecasting. Users may expect Apple products to deliver seamless, premium experiences. When those expectations are unmet, emotions like disgust, disapproval, and annoyance emerge, not just toward the product, but toward the gap between expectation and reality.
Insight for Product Teams
Emotion AI reveals not just dissatisfaction, but why it happens. Through tools like video emotion recognition and its multimodal emotion analysis approach, teams gain access to emotional signals that highlight where and how user expectations break down. This kind of insight enables more precise user experience testing, helping teams rethink design trade-offs, especially for flagship products where emotional investment runs high.
View the Apple Magic Mouse reaction video in Imentiv and interact with the emotion graph, audio, and transcript layers — explore the dashboard.
"Your most unhappy customers are your greatest source of learning." - Bill Gates.
The third focuses on public responses to the launch of Luckin Coffee in New York, where spontaneous street interviews capture authentic emotional responses, from curiosity to excitement.
Case 3: First Impressions of a New Brand, Public Reaction to Luckin Coffee’s NYC Launch
Unlike formal reviews or critical rants, spontaneous street interviews capture a different emotional layer: authentic, unfiltered product perception. In this video, multiple people share their initial reactions to Luckin Coffee’s arrival in New York, expressing emotions that range from nostalgia to excitement, curiosity, and joy. The informal setup and social context make this analysis highly relevant for understanding brand reception and emotional momentum during a launch, particularly in early-stage product testing scenarios where real-world user feedback can offer high-impact insights.
Multimodal Emotion Analysis
- Video Emotion Data shows a strong presence of happiness (45.85%) and neutral (34.21%), with low but noticeable levels of disgust (7.23%) and surprise (3.41%), likely reflecting differing expectations or sensory reactions to taste and experience.
- Audio Emotion Data echoes this positivity: happy (38.5%) dominates, but sad (10.34%) and surprise (7.52%) suggest subtle emotional variations among participants, possibly tied to comparisons with the original experience in China or the surprise of finding something familiar in NYC.
- Transcript Emotion Data reveals deep emotional engagement: approval (22.04%), love (18.34%), excitement (17.26%), and admiration (15.81%) dominate the textual layer, strongly suggesting brand affinity and delight.
These nuances are picked up using video emotional intelligence capabilities powered by Imentiv’s advanced multimodal features.
Psychological Interpretation
This case strongly aligns with Appraisal Theory. People assess Luckin Coffee’s arrival through the lens of personal relevance and emotional memory, especially Speaker 1, who links the experience with a trip to China. The positive appraisal of taste, nostalgia, and location creates uplifting emotional responses.
We also see Affective Forecasting at play. The speakers had preconceived emotional expectations (e.g., hoping it would taste like it did in China), and their joy indicates that those expectations were met or exceeded.
Interestingly, a few traces of disgust, sadness, or boredom in the video and audio suggest emotional variance across individuals. These slight dips, viewed through the lens of Cognitive Dissonance, might come from sensory mismatches (e.g., sweetness level, strength of flavor) or the contrast between social hype (the line, the buzz) and individual experience.
Insight for Product & Marketing Teams
Emotion AI in this context captures first-hand, multisensory brand perception in a public setting. Teams can identify what drives emotional resonance (nostalgia, taste, brand identity) and where small mismatches exist, even in a largely positive reception. It’s a valuable lens for refining brand messaging, packaging, and sensory experiences during market expansion.
What does delight look like in public?
Visit the Luckin Coffee video dashboard and see how spontaneous joy, curiosity, and subtle disapproval show up across voices and faces, and experience the emotional patterns.
Together, these cases illustrate how emotional signals in video can surface hidden insights about consumer expectations, brand trust, and product reception.
Academic research has long emphasized the impact of emotions on consumer behavior. A study on ResearchGate, based on mall visitor responses, found that emotional factors like staff friendliness and store atmosphere strongly shape customer satisfaction and purchasing intent.
Understanding how a customer emotionally connects with a product is essential for building meaningful engagement. Emotion AI makes this possible by decoding subtle, often overlooked emotional reactions that influence perception and decision-making. When teams integrate this into product testing, they gain clarity on functional performance and emotional alignment. Product companies can create more engaging touchpoints, respond more effectively to feedback, and shape stronger brand-customer relationships grounded in emotional intelligence.
Why an API Matters in Emotion Analysis for Product Testing
In product testing, teams often collect dozens of user videos, voice recordings, or screen captures across different test stages. While the web app works well for analyzing single sessions, it doesn’t fit seamlessly into a larger testing pipeline. An Emotion API bridges that gap: it allows emotional reactions (facial expressions, vocal tone, and text emotions) to be processed automatically as part of your existing system. That means no manual uploads, easier comparisons across users, and the ability to scale insights as your testing grows. Emotion analysis becomes part of the workflow, not a separate step.
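As a rough illustration of what that workflow can look like, here is a minimal sketch of batch-submitting recorded test sessions to an emotion analysis endpoint. The URL, authentication scheme, field names, and response shape are placeholders, not Imentiv’s documented API; refer to the actual API reference for real names and parameters:

```python
# Hypothetical sketch of folding an emotion API into a testing pipeline.
# Endpoint, request fields, and response shape are assumptions for illustration.
import os
import requests

API_URL = "https://api.example.com/v1/emotion/videos"   # placeholder endpoint
API_KEY = os.environ["EMOTION_API_KEY"]                  # hypothetical auth token

def analyze_session(video_path: str, session_id: str) -> dict:
    """Upload one recorded test session and return its emotion summary."""
    with open(video_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"file": f},
            data={"session_id": session_id},
            timeout=300,
        )
    response.raise_for_status()
    return response.json()  # assumed to contain per-modality emotion scores

# Batch every session from a test stage instead of uploading files by hand.
results = {
    path: analyze_session(path, session_id=path)
    for path in ["sessions/user01.mp4", "sessions/user02.mp4"]
}
```

A real integration would typically also handle asynchronous processing, since video analysis jobs are usually queued rather than returned instantly, but the principle is the same: every recorded session flows through one automated step, and results land where the rest of your testing data already lives.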
Let’s discuss how our Emotion API can support your product testing setup — contact our team.
Explore how Imentiv AI turns real reactions into emotional insights — see how video emotion recognition works.