AT&T Targeted Ad System Tracks Emotions, Posture

Marketers may be able to deliver more effective ads using sensors in mobile devices that gauge a viewer's emotion and posture, AT&T states in a patent application published on Thursday.

“If repetitive or cyclical human movement is observed, the user may be pacing back and forth, perhaps indicating a worried or concerned mood. However, if the overall posture indicates the user is upside down, perhaps the user is happily swinging from a tree limb,” AT&T states in the patent application, titled “Contextual inference of non-verbal expressions.”

Middletown, N.J.-based AT&T technical architect Rocky Rios is named as lead inventor on the patent application.

Abstract: Contextual awareness is used to reveal non-verbal expressions. A current context of a device is evaluated to infer a user’s body posture and emotional state. The user’s body posture and emotional state may then be used for improved services and targeted advertising.

Patent Application

Related articles:
Disney Building Own Dynamic Ad Insertion System
How Verizon Will Use Targeted Ads and ‘Non-Subscription Access’ to Power OTT Product
Comcast Monitoring System Battles Viewer Distractions

Claims: 

1. A system, comprising: a processor; and a memory storing instructions that when executed cause the processor to perform operations, the operations comprising: determining a context associated with a time and a location of an identifier of a device; storing a body language database that maps different contexts to different human body postures; querying the body language database for the context; and retrieving one of the different human body postures associated with the context.

2. The system of claim 1, wherein the operations further comprise retrieving a characterization of the one of the different human body postures.

3. The system of claim 2, wherein the operations further comprise retrieving an emotional state related to the one of the different human body postures.

4. The system of claim 1, wherein the operations further comprise retrieving a hand orientation associated with the one of the different human body postures.

5. The system of claim 1, wherein the operations further comprise retrieving a forearm orientation associated with the one of the different human body postures.

6. The system of claim 1, wherein the operations further comprise retrieving an advertisement associated with the one of the different human body postures.

7. The system of claim 1, wherein the operations further comprise authenticating the device based on the one of the different human body postures.

8. A method, comprising: determining, by a processor, a context associated with a time and a location of an identifier of a device; storing, in memory, a body language database that maps different contexts to different human body postures; and retrieving, by the processor, one of the different human body postures from the body language database that is associated with the context.

9. The method of claim 8, further comprising retrieving a characterization of the one of the different human body postures.

10. The method of claim 9, further comprising retrieving an emotional state related to the one of the different human body postures.

11. The method of claim 8, further comprising retrieving a hand orientation associated with the one of the different human body postures.

12. The method of claim 8, further comprising retrieving a forearm orientation associated with the one of the different human body postures.

13. The method of claim 8, further comprising retrieving an advertisement associated with the one of the different human body postures.

14. The method of claim 8, further comprising authenticating the device based on the one of the different human body postures.

15. A memory storing instructions that when executed cause a processor to perform operations, the operations comprising: determining a context associated with a time and a location of an identifier of a device; storing a body language database that maps different contexts to different human body postures; querying the body language database for the context; and retrieving one of the different human body postures associated with the context.

16. The memory of claim 15, wherein the operations further comprise retrieving a characterization of the one of the different human body postures.

17. The memory of claim 16, wherein the operations further comprise retrieving an emotional state related to the one of the different human body postures.

18. The memory of claim 15, wherein the operations further comprise retrieving a hand orientation associated with the one of the different human body postures.

19. The memory of claim 15, wherein the operations further comprise retrieving a forearm orientation associated with the one of the different human body postures.

20. The memory of claim 15, wherein the operations further comprise retrieving an advertisement associated with the one of the different human body postures.
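The core pipeline of claim 1 (and its method and memory counterparts in claims 8 and 15) can be sketched in a few lines: derive a context from a device's time and location, query a "body language database" that maps contexts to postures, and retrieve the mapped posture along with the emotional state and advertisement of the dependent claims. The sketch below is an illustration only; every context, posture, emotion, and ad is an invented placeholder, not content from the patent.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Posture:
    name: str
    emotion: str          # claim 3: emotional state related to the posture
    advertisement: str    # claim 6: advertisement associated with the posture

# Claim 1's "body language database": maps a (location, time-of-day) context
# to a human body posture. All entries are hypothetical examples.
BODY_LANGUAGE_DB = {
    ("office", "morning"): Posture("seated upright", "focused", "coffee ad"),
    ("home", "evening"):   Posture("reclined", "relaxed", "streaming-service ad"),
    ("park", "afternoon"): Posture("walking", "energetic", "sportswear ad"),
}

def determine_context(location: str, when: datetime) -> tuple:
    """Determine a context from the time and location of a device (claim 1)."""
    hour = when.hour
    period = "morning" if hour < 12 else "afternoon" if hour < 18 else "evening"
    return (location, period)

def infer_posture(location: str, when: datetime):
    """Query the body language database for the context and retrieve
    the associated posture, or None if the context is unmapped."""
    return BODY_LANGUAGE_DB.get(determine_context(location, when))

posture = infer_posture("office", datetime(2024, 1, 8, 9, 30))
if posture is not None:
    print(posture.name, posture.emotion, posture.advertisement)
```

The claims leave the context-to-posture mapping itself unspecified; a lookup table is the simplest reading of "a body language database that maps different contexts to different human body postures."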