Hidden Markov models (HMMs) – a statistical Markov model in which the states and state transitions are not directly observable. Instead, only a series of outputs dependent on the states is visible. In the case of affect recognition, the outputs represent the sequence of speech feature vectors, from which the sequence of states the model progressed through can be deduced. The states can consist of various intermediate steps in the expression of an emotion, and each of them has a probability distribution over the possible output vectors. The deduced state sequence allows prediction of the affective state being classified, and this is one of the most commonly used techniques within the area of speech affect detection.
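As a sketch of how such a model recovers a hidden state sequence from observed speech features, here is a minimal hand-rolled Viterbi decoder over a toy two-state affect HMM; all probabilities, state names, and feature labels are invented for illustration.

```python
# Toy affect HMM: hidden affective states emit quantized speech features.
# All parameters below are invented for the sketch, not from a trained model.
states = ["neutral", "aroused"]
start_p = {"neutral": 0.7, "aroused": 0.3}
trans_p = {"neutral": {"neutral": 0.8, "aroused": 0.2},
           "aroused": {"neutral": 0.4, "aroused": 0.6}}
# Emissions over quantized speech feature vectors:
emit_p = {"neutral": {"low_pitch": 0.9, "high_pitch": 0.1},
          "aroused": {"low_pitch": 0.2, "high_pitch": 0.8}}

def viterbi(observations):
    """Return the most probable hidden-state sequence for the observations."""
    # V[t][s] = (probability of best path ending in s at time t, back-pointer)
    V = [{s: (start_p[s] * emit_p[s][observations[0]], None) for s in states}]
    for obs in observations[1:]:
        prev_col = V[-1]
        V.append({
            s: max(((prev_col[p][0] * trans_p[p][s] * emit_p[s][obs], p)
                    for p in states), key=lambda t: t[0])
            for s in states
        })
    # Backtrack from the best final state.
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(V) - 1, 0, -1):
        last = V[t][last][1]
        path.append(last)
    return path[::-1]

print(viterbi(["low_pitch", "high_pitch", "high_pitch"]))
# → ['neutral', 'aroused', 'aroused']
```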
A subject's blood volume pulse can be measured by photoplethysmography, which produces a graph indicating blood flow through the extremities. The peaks of the waves indicate a cardiac cycle in which the heart has pumped blood to the extremities. If the subject experiences fear or is startled, their heart usually 'jumps' and beats quickly for some time, causing the amplitude of the cardiac cycle to increase. This can clearly be seen on a photoplethysmograph as an increase in the distance between the trough and the peak of the wave. As the subject calms down, and as the body's inner core expands, allowing more blood to flow back to the extremities, the cycle will return to normal.
databases, the participants are asked to display different basic emotional expressions, while in spontaneous expression databases, the expressions are natural. Spontaneous emotion elicitation requires significant effort in the selection of proper stimuli, which can lead to a rich display of the intended emotions. Second, the process involves manual tagging of emotions by trained individuals, which makes the databases highly reliable. Since the perception of expressions and their intensity is subjective in nature, annotation by experts is essential for the purpose of validation.
ability, and then formulate reasonable teaching plans. At the same time, they can pay attention to students' inner feelings, which supports students' psychological health. This is especially relevant in distance education, where the separation in time and space leaves no channel for two-way emotional communication between teachers and students. Without the atmosphere of traditional classroom learning, students easily become bored, which harms learning outcomes. Applying affective computing in distance education systems can effectively improve this situation.
The goal of most of these techniques is to produce labels that would match the labels a human perceiver would give in the same situation: for example, if a person makes a facial expression furrowing their brow, then the computer vision system might be taught to label their face as appearing "confused" or as "concentrating" or "slightly negative" (as opposed to positive, which it might say if they were smiling in a happy-appearing way). These labels may or may not correspond to what the person is actually feeling.
the answer to a question, or they could be complex and meaningful as when communicating with sign language. Without making use of any object or surrounding environment, we can wave our hands, clap or beckon. On the other hand, when using objects, we can point at them, move, touch or handle these. A computer should be able to recognize these, analyze the context and respond in a meaningful way, in order to be efficiently used for Human–Computer
Interaction.
as increasing the performance, which is particularly significant to real-time detection. The range of possible choices is vast, with some studies mentioning the use of over 200 distinct features. It is crucial to identify those that are redundant and undesirable in order to optimize the system and increase the success rate of correct emotion detection. The most common speech characteristics are categorized into the following groups.
attempt to produce such a database was the FAU Aibo Emotion Corpus for CEICES (Combining Efforts for Improving Automatic Classification of Emotional User States), which was developed in the realistic context of children (aged 10–13) playing with Sony's Aibo robot pet. Likewise, producing one standard database for all emotion research would provide a method of evaluating and comparing different affect recognition systems.
usually studied to detect emotion: The corrugator supercilii muscle, also known as the 'frowning' muscle, draws the brow down into a frown, and therefore is the best test for negative, unpleasant emotional response. The zygomaticus major muscle is responsible for pulling the corners of the mouth back when you smile, and therefore is the muscle used to test for a positive emotional response.
Social robots, as well as a growing number of robots used in health care, benefit from emotional awareness because they can better judge users' and patients' emotional states and alter their actions/programming appropriately. This is especially important in those countries with growing aging populations and/or a lack of younger workers to address their needs.
recognition, affect recognition), the accuracy of modeling and tracking has been an issue. As hardware evolves, as more data are collected and as new discoveries are made and new practices introduced, this lack of accuracy fades, leaving behind noise issues. However, methods for noise removal exist including neighborhood averaging,
game where there is usually not much exciting game play, a high level of resistance is recorded, which suggests a low level of conductivity and therefore less arousal. This is in clear contrast with the sudden trough recorded when the player's character is killed, as one is usually very stressed and tense at that moment in the game.
classifier taken separately. It is compared with two other sets of classifiers: one-against-all (OAA) multiclass SVM with Hybrid kernels and the set of classifiers which consists of the following two basic classifiers: C5.0 and Neural
Network. The proposed variant achieves better performance than the other two sets of classifiers.
converting the pixel colors of the standard RGB color space to a color space such as oRGB, or to LMS channels, performs better when dealing with faces. So, the above vector is mapped onto the better color space and decomposed into red-green and yellow-blue channels. Deep learning methods are then used to find the equivalent emotions.
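As an illustration of the kind of opponent-color decomposition described here, the sketch below applies only the linear stage of the oRGB transform (luma plus red-green and yellow-blue channels); the full oRGB space additionally applies a hue-dependent rotation, which is omitted, so treat the coefficients as illustrative.

```python
def rgb_to_opponent(r, g, b):
    """Map an RGB pixel to luma plus opponent red-green and yellow-blue
    channels (the linear stage of the oRGB transform; the subsequent
    non-linear rotation of the full transform is omitted)."""
    luma = 0.299 * r + 0.587 * g + 0.114 * b  # classic luma weights
    yellow_blue = 0.5 * (r + g) - b           # yellow vs. blue opponency
    red_green = 0.5 * (r - g)                 # red vs. green opponency
    return luma, red_green, yellow_blue

# Pure yellow (r=g=1, b=0) maximizes the yellow-blue channel:
print(rgb_to_opponent(1.0, 1.0, 0.0))
```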
sympathetic branch of the autonomic nervous system. As the sweat glands are activated, even before the skin feels sweaty, the level of the EDA can be captured (usually using conductance) and used to discern small changes in autonomic arousal. The more aroused a subject is, the greater the skin conductance tends to be.
improve computer-mediated interpersonal communication. It does not necessarily seek to map emotion into an objective mathematical model for machine interpretation, but rather let humans make sense of each other's emotional expressions in open-ended ways that might be ambiguous, subjective, and sensitive to context.
), which assumes the existence of six basic emotions (anger, fear, disgust, surprise, joy, sadness), all others simply being a mix of the former. Nevertheless, these still offer high audio quality and balanced (although often too few) classes, which contribute to high success rates in recognizing emotions.
One could also use affective state recognition in order to judge the impact of a TV advertisement through a real-time video recording of that person and through the subsequent study of his or her facial expression. Averaging the results obtained on a large group of subjects, one can tell whether that
The surface of the human face is innervated with a large network of blood vessels. Blood flow variations in these vessels yield visible color changes on the face. Whether or not facial emotions activate facial muscles, variations in blood flow, blood pressure, glucose levels, and other changes occur.
Facial electromyography is a technique used to measure the electrical activity of the facial muscles by amplifying the tiny electrical impulses that are generated by muscle fibers when they contract. The face expresses a great deal of emotion; however, there are two main facial muscle groups that are
The complexity of the affect recognition process increases with the number of classes (affects) and speech descriptors used within the classifier. It is, therefore, crucial to select only the most relevant features in order to assure the ability of the model to successfully identify emotions, as well
It has been shown that, with enough acoustic evidence available, the emotional state of a person can be classified by a set of majority-voting classifiers. The proposed set of classifiers is based on three main classifiers: kNN, C4.5, and SVM with an RBF kernel. This set achieves better performance than each basic
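A minimal sketch of such a majority-voting scheme follows; the three "classifiers" are stand-in functions here, not actual trained kNN/C4.5/SVM models.

```python
from collections import Counter

def majority_vote(classifiers, sample):
    """Each classifier maps a feature vector to an emotion label;
    the most common label across the set wins."""
    votes = [clf(sample) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Hypothetical stand-ins for trained kNN, C4.5 and SVM-RBF models:
clfs = [lambda x: "anger", lambda x: "anger", lambda x: "joy"]
print(majority_vote(clfs, [0.3, 0.7]))  # → anger
```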
processing or active appearance models. More than one modality can be combined or fused (multimodal recognition, e.g. facial expressions and speech prosody, facial expressions and hand gestures, or facial expressions with speech and text for multimodal data and metadata analysis) to provide a more
Despite the numerous advantages which naturalistic data has over acted data, it is difficult to obtain and usually has low emotional intensity. Moreover, data obtained in a natural context has lower signal quality, due to ambient noise and the distance of the subjects from the microphone. The first
As of 2010, the most frequently used classifiers were linear discriminant classifiers (LDC), k-nearest neighbor (k-NN), Gaussian mixture model (GMM), support vector machines (SVM), artificial neural networks (ANN), decision tree algorithms and hidden Markov models (HMMs). Various studies showed that
Picard's critics describe her concept of emotion as "objective, internal, private, and mechanistic". They say it reduces emotion to a discrete psychological signal occurring inside the body that can be measured and which is an input to cognition, undercutting the complexity of emotional experience.
Here we can see a plot of skin resistance against time, measured using GSR, while the subject played a video game. Several peaks are clear in the graph, which suggests that GSR is a good method of differentiating between an aroused and a non-aroused state. For example, at the start of the
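A toy version of locating such troughs in a resistance trace might look like the following; the signal values and the drop threshold are invented for illustration.

```python
def find_troughs(signal, drop=5.0):
    """Return indices where resistance dips below both neighbours by at
    least `drop` units — a crude proxy for a conductance/arousal spike."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i - 1] - signal[i] >= drop
            and signal[i + 1] - signal[i] >= drop]

# Invented skin-resistance trace (arbitrary units); the dips at indices
# 3 and 6 stand in for moments of heightened arousal:
resistance = [100, 101, 99, 80, 98, 100, 70, 97, 100]
print(find_troughs(resistance))  # → [3, 6]
```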
Various changes in the autonomic nervous system can indirectly alter a person's speech, and affective technologies can leverage this information to recognize emotion. For example, speech produced in a state of fear, anger, or joy becomes fast, loud, and precisely enunciated, with a higher and wider
It can be cumbersome to ensure that the sensor shining an infra-red light and monitoring the reflected light is always pointing at the same extremity, especially since subjects often stretch and readjust their position while using a computer. There are other factors that can affect one's blood
This could be used to detect a user's affective state by monitoring and analyzing their physiological signs. These signs range from changes in heart rate and skin conductance to minute contractions of the facial muscles and changes in facial blood flow. This area is gaining momentum and we are now
There are many proposed methods to detect body gestures. Some literature differentiates two approaches in gesture recognition: a 3D-model-based one and an appearance-based one. The former makes use of 3D information about key elements of the body parts in order to obtain several important
Gestures could be efficiently used as a means of detecting a particular emotional state of the user, especially when used in conjunction with speech and face recognition. Depending on the specific action, gestures could be simple reflexive responses, like lifting your shoulders when you don't know
However, for real-life applications, naturalistic data is preferred. A naturalistic database can be produced by observation and analysis of subjects in their natural context. Ultimately, such a database should allow the system to recognize emotions based on their context as well as work out the goals
The categorical approach tends to use discrete classes such as happy, sad, angry, fearful, surprise, disgust. Different kinds of machine learning regression and classification models can be used for having machines produce continuous or discrete labels. Sometimes models are also built that allow
operations such as steering and maneuvering are used in various fields such as aviation, transportation and medicine. Integrating affective computing capabilities in this type of training systems, in accordance with the adaptive automation approach, has been found to be effective in improving the
Galvanic skin response (GSR) is an outdated term for a more general phenomenon known as electrodermal activity (EDA). EDA is a general phenomenon whereby the skin's electrical properties change. The skin is innervated by the sympathetic nervous system, so measuring its resistance or conductance provides a way to quantify small changes in the
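Since conductance is the reciprocal of resistance, a measured EDA sample can be converted with Ohm's law; the applied voltage and measured current below are purely illustrative values.

```python
def skin_conductance_uS(applied_volts, measured_amps):
    """Conductance G = I / V (Ohm's law), scaled to microsiemens (µS),
    the unit EDA is conventionally reported in."""
    return (measured_amps / applied_volts) * 1e6

# e.g. a small 0.5 V excitation across the electrodes driving 2.5 µA:
print(skin_conductance_uS(0.5, 2.5e-6))  # → 5.0 (µS)
```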
Another area within affective computing is the design of computational devices proposed to exhibit either innate emotional capabilities or that are capable of convincingly simulating emotions. A more practical approach, based on current technological capabilities, is the simulation of emotions in
Approaches are based on facial color changes. Delaunay triangulation is used to create triangular local areas. Some of these triangles, which define the interior of the mouth and eyes (sclera and iris), are removed. The pixels of the remaining triangular areas are used to create feature vectors. It shows that
Speech analysis is an effective method of identifying affective state, having an average reported accuracy of 70 to 80% in research from 2003 and 2006. These systems tend to outperform average human accuracy (approximately 60%) but are less accurate than systems which employ other modalities for
One idea put forth by the
Romanian researcher Dr. Nicu Sebe in an interview is the analysis of a person's face while they are using a certain product (he mentioned ice cream as an example). Companies would then be able to use such analysis to infer whether their product will or will not be well
The applications of affective computing may contribute to improving road safety. For example, a car can monitor the emotions of all occupants and engage in additional safety measures, such as alerting other vehicles if it detects the driver to be angry. In addition, affective computing systems for
Aesthetics, in the world of art and photography, refers to the principles of the nature and appreciation of beauty. Judging beauty and other aesthetic qualities is a highly subjective task. Computer scientists at Penn State treat the challenge of automatically inferring the aesthetic quality of
Picard's focus is human–computer interaction, and her goal for affective computing is to "give computers the ability to recognize, express, and in some cases, 'have' emotions". In contrast, the interactional approach seeks to help "people to understand and experience their own emotions" and to
The FACS combinations do not correspond in a 1:1 way with the emotions that the psychologists originally proposed (note that this lack of a 1:1 mapping also occurs in speech recognition with homophones and homonyms and many other sources of ambiguity, and may be mitigated by bringing in other
As with every computational practice, in affect detection by facial processing, some obstacles need to be surpassed, in order to fully unlock the hidden potential of the overall algorithm or method employed. In the early days of almost every kind of AI-based detection (speech recognition, face
The vast majority of present systems are data-dependent. This creates one of the biggest challenges in detecting emotions based on speech, as it implies choosing an appropriate database to train the classifier. Most of the currently available data was obtained from actors and is thus a
that capture data about the user's physical state or behavior without interpreting the input. The data gathered is analogous to the cues humans use to perceive emotions in others. For example, a video camera might capture facial expressions, body posture, and gestures, while a microphone might
Affection influences learners' learning state. Using affective computing technology, computers can judge the learners' affection and learning state by recognizing their facial expressions. In education, the teacher can use the analysis result to understand the student's learning and accepting
Researchers work with three types of databases: databases of peak expression images only, databases of image sequences portraying an emotion from neutral to its peak, and video clips with emotional annotations. Many facial expression databases have been created and made public for
Creation of an emotion database is a difficult and time-consuming task. However, database creation is an essential step in the creation of a system that will recognize human emotions. Most of the publicly available emotion databases include posed facial expressions only. In posed expression
The interactional approach asserts that though emotion has biophysical aspects, it is "culturally grounded, dynamically experienced, and to some degree constructed in action and interaction". Put another way, it considers "emotion as a social and cultural product experienced through our
pictures using their visual content as a machine learning problem, with a peer-rated on-line photo sharing website as a data source. They extract certain visual features based on the intuition that they can discriminate between aesthetically pleasing and displeasing images.
includes an attempt to give these programs, which simulate humans, an emotional dimension as well, including reactions in accordance with how a real person would react in a certain emotionally stimulating situation, as well as facial expressions and gestures.
proposed the idea that facial expressions of emotion are not culturally determined, but universal. Thus, he suggested that they are biological in origin and can, therefore, be safely and correctly categorized. In 1972, he officially put forth six basic emotions:
In psychology, cognitive science, and in neuroscience, there have been two main approaches for describing how humans perceive and classify emotion: continuous or categorical. The continuous approach tends to use dimensions such as negative vs. positive, calm vs. aroused.
Infra-red light is shone on the skin by special sensor hardware, and the amount of light reflected is measured. The amount of reflected and transmitted light correlates with the BVP, as light is absorbed by hemoglobin, which is found richly in the bloodstream.
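Once the peaks of the BVP trace have been located, heart rate follows from the inter-beat intervals; a minimal sketch with invented peak times:

```python
def heart_rate_bpm(peak_times):
    """Mean beats-per-minute from successive cardiac-cycle peak times
    (in seconds). The peak times here would come from a peak detector
    run over the reflected-light signal first."""
    intervals = [b - a for a, b in zip(peak_times, peak_times[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Invented peak times, 0.8 s apart (i.e. a calm subject):
print(heart_rate_bpm([0.0, 0.8, 1.6, 2.4]))  # → 75.0
```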
volume pulse. As it is a measure of blood flow through the extremities, if the subject feels hot, or particularly cold, then their body may allow more, or less, blood to flow to the extremities, all of this regardless of the subject's emotional state.
Gaussian mixture model (GMM) – is a probabilistic model used for representing the existence of subpopulations within the overall population. Each sub-population is described using the mixture distribution, which allows for classification of observations into the sub-populations.
Affective computing is also being applied to the development of communicative technologies for use by people with autism. The affective component of a text is also increasingly gaining attention, particularly its role in the so-called emotional or
electrodes placed somewhere on the skin and applying a small voltage between them. To maximize comfort and reduce irritation the electrodes can be placed on the wrist, legs, or feet, which leaves the hands fully free for daily activity.
are action units (AU). They are, basically, a contraction or a relaxation of one or more muscles. Psychologists have proposed the following classification of six basic emotions, according to their action units ("+" here means "and"):
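As an illustration, commonly cited AU combinations for the six basic emotions can be encoded as a lookup table; published sources differ on the exact sets, so treat these as indicative rather than definitive.

```python
# Illustrative FACS action-unit combinations ("+" = "and") for the six
# basic emotions, following commonly cited Ekman/Friesen combinations;
# the exact AU sets vary across published sources.
EMOTION_AUS = {
    "happiness": {6, 12},             # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},          # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},       # brow raisers + upper lid raiser + jaw drop
    "fear":      {1, 2, 4, 5, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 17},         # nose wrinkler + lip corner depressor + chin raiser
}

def match_emotion(active_aus):
    """Return emotions whose full AU combination is present."""
    return [e for e, aus in EMOTION_AUS.items() if aus <= set(active_aus)]

print(match_emotion([6, 12, 25]))  # → ['happiness']
```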
range in pitch, whereas emotions such as tiredness, boredom, or sadness tend to generate slow, low-pitched, and slurred speech. Some emotions have been found to be more easily computationally identified, such as anger or approval.
emotion detection, such as physiological states or facial expressions. However, since many speech characteristics are independent of semantics or culture, this technique is considered to be a promising route for further research.
monitoring the driver's stress may allow various interventions such as driver assistance systems adjusted according to the stress level and minimal and direct interventions to change the emotional state of the driver.
parameters, like palm position or joint angles. On the other hand, appearance-based systems use images or videos for direct interpretation. Hand gestures have been a common focus of body gesture detection methods.
, such as affective mirrors allowing the user to see how he or she performs; emotion monitoring agents sending a warning before one sends an angry email; or even music players selecting tracks based on mood.
Decision tree algorithms – work based on following a decision tree in which leaves represent the classification outcome, and branches represent the conjunction of subsequent features that lead to the classification.
k-nearest neighbor (k-NN) – Classification happens by locating the object in the feature space, and comparing it with the k nearest neighbors (training examples). The majority vote decides on the classification.
Facial expressions do not always correspond to an underlying emotion that matches them (e.g. they can be posed or faked, or a person can feel emotions but maintain a "poker face").
and outcomes of the interaction. The nature of this type of data allows for authentic real-life implementation, because it describes states naturally occurring during the
The fact that posed expressions, as used by most subjects of the various studies, are not natural, and therefore algorithms trained on these may not apply to natural expressions.
Recognizing emotional information requires the extraction of meaningful patterns from the gathered data. This is done using machine learning techniques that process different
, broad enough to fit every need for its application, as well as the selection of a successful classifier which will allow for quick and accurate emotion identification.
Lee, C.M.; Narayanan, S.; Pieraccini, R., "Recognition of Negative Emotion in the Human Speech Signals", Workshop on Automatic Speech Recognition and Understanding, Dec 2001.
FACS did not include dynamics, while dynamics can help disambiguate (e.g. smiles of genuine happiness tend to have different dynamics than "try to look happy" smiles.)
The lack of rotational movement freedom. Affect detection works very well with frontal use, but upon rotating the head more than 20 degrees, "there've been problems".
Linear discriminant classifiers (LDC) – Classification happens based on the value obtained from the linear combination of the feature values, which are usually provided in the form of vector features.
choosing the appropriate classifier can significantly enhance the overall performance of the system. The list below gives a brief description of each algorithm:
1334:"The Effect of Multimodal Emotional Expression on Responses to a Digital Human during a Self-Disclosure Conversation: a Computational Analysis of User Language"
The corrugator supercilii muscle and the zygomaticus major muscle are the two main muscles used for measuring electrical activity in facial electromyography.
, Lecture Notes in Computer Science, vol. 3953, Proceedings of the European Conference on Computer Vision, Part III, pp. 288–301, Graz, Austria, May 2006.
A system has been conceived by psychologists in order to formally categorize the physical expression of emotions on faces. The central concept of the
Singh, Premjeet; Saha, Goutam; Sahidullah, Md (2021). "Non-linear frequency warping using constant-Q transformation for speech emotion recognition".
Emotional speech processing technologies recognize the user's emotional state using computational analysis of speech features. Vocal parameters and
Khandaker, M (2009). "Designing affective video games to support the social-emotional development of teenagers with autism spectrum disorders".
Accuracy of recognition is improved by adding context; however, adding context and other modalities increases computational cost and complexity.
The machine should interpret the emotional state of humans and adapt its behavior to them, giving an appropriate response to those emotions.
Support vector machine (SVM) – is a type of (usually binary) linear classifier which decides in which of the two (or more) possible classes each input may fall.
Dellaert, F.; Polzin, T.; Waibel, A., "Recognizing Emotion in Speech", in Proc. of ICSLP 1996, Philadelphia, PA, pp. 1970–1973, 1996.
Artificial neural networks (ANN) – is a mathematical model, inspired by biological neural networks, that can better grasp possible non-linearities of the feature space.
that emotion is "not especially different from the processes that we call 'thinking.'" The innovative approach "digital humans" or
Yacoub, Sherif; Simske, Steve; Lin, Xiaofan; Burns, John (2003). "Recognition of
Emotions in Interactive Voice Response Systems".
Emotion in machines often refers to emotion in computational, often AI-based, systems. As a result, the terms 'emotional AI' and '
or "information model" concept of emotion has been criticized by and contrasted with the "post-cognitivist" or "interactional"
that measure the pressure with which a button is pressed: this has been shown to correlate strongly with the players' level of
2312:"Associating Vehicles Automation With Drivers Functional State Assessment Systems: A Challenge for Road Safety in the Future"
Bratkova, Margarita; Boulos, Solomon; Shirley, Peter (2009). "oRGB: A Practical
Opponent Color Space for Computer Graphics".
representation of archetypal emotions. Those so-called acted databases are usually based on the Basic Emotions theory (by Paul Ekman
The introduction of emotion to computer science was done by
Pickard (sic) who created the field of affective computing.
Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii
J. K. Aggarwal, Q. Cai, Human Motion Analysis: A Review, Computer Vision and Image Understanding, Vol. 73, No. 3, 1999
seeing real products that implement the techniques. The four main physiological signs that are usually analyzed are
Heise, David (2004). "Enculturating agents with expressive role behavior". In Sabine Payr; Trappl, Robert (eds.).
Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human
Contour slope – describes the tendency of the frequency change over time; it can be rising, falling, or level.
1440:"A model of the perception of facial expressions of emotion by humans: Research overview and perspectives"
features such as pitch variables and speech rate can be analyzed through pattern recognition techniques.
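The pitch variables mentioned in this section (average pitch, pitch range, contour slope) and loudness can be computed directly from a frame-level F0 contour and waveform samples; the input values in the sketch below are hypothetical.

```python
def pitch_features(f0):
    """Average pitch, pitch range, and overall contour slope (Hz per
    frame) from a (hypothetical) frame-level F0 contour in Hz."""
    average = sum(f0) / len(f0)
    pitch_range = max(f0) - min(f0)
    slope = (f0[-1] - f0[0]) / (len(f0) - 1)  # rising > 0, falling < 0
    return average, pitch_range, slope

def loudness(samples):
    """RMS energy of a list of waveform samples, a common loudness proxy."""
    return (sum(s * s for s in samples) / len(samples)) ** 0.5

avg, rng, slope = pitch_features([200, 210, 220, 230])
print(avg, rng, slope)  # → 215.0 30 10.0
```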
Caridakis, G.; Malatesta, L.; Kessous, L.; Amir, N.; Raouzaiou, A.; Karpouzis, K. (November 2–4, 2006).
commercial (or movie) has the desired effect and what the elements which interest the watcher most are.
While some core ideas in the field may be traced as far back as early philosophical inquiries into
2667:"Stress-Adaptive Training: An Adaptive Psychomotor Training According to Stress Measured by Grip Force"
452:
By doing cross-cultural research in Papua, New Guinea, on the Fore Tribesmen, at the end of the 1960s,
. Nebraska Symposium on Motivation. Lincoln, Nebraska: University of Nebraska Press. pp. 207–283.
However, in the 1990s Ekman expanded his list of basic emotions, including a range of positive and
Hudlicka, Eva (2003). "To feel or not to feel: The role of affect in human–computer interaction".
Loudness – measures the amplitude of the speech waveform, which translates to the energy of an utterance.
conversational agents in order to enrich and facilitate interactivity between human and machine.
1389:"An analytical framework for studying attitude towards emotional AI: The three-pronged approach"
Balomenos, T.; Raouzaiou, A.; Ioannou, S.; Drosopoulos, A.; Karpouzis, K.; Kollias, S. (2004).
The detection and processing of facial expression are achieved through various methods such as
Affective games have been used in medical research to support the emotional development of
Average pitch – describes how high or low the speaker speaks relative to normal speech.
combinations across the categories, e.g. a happy-surprised face or a fearful-surprised face.
Also, the facial color signal is independent from that provided by facial muscle movements.
linear Gaussian smoothing, median filtering, or newer methods such as the Bacterial Foraging Optimization Algorithm.
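Of the methods mentioned, median filtering is the simplest to illustrate. The sketch below is a minimal 1-D median filter over a made-up signal (not any particular library's implementation):

```python
import numpy as np

def median_filter(signal, width=3):
    """Replace each sample with the median of a sliding window.

    Suppresses impulsive (spike) noise while preserving edges better
    than linear smoothing does.
    """
    pad = width // 2
    padded = np.pad(signal, pad, mode="edge")  # repeat edge values
    return np.array([np.median(padded[i:i + width])
                     for i in range(len(signal))])

noisy = np.array([1.0, 1.0, 9.0, 1.0, 1.0])  # a single spike of noise
print(median_filter(noisy))  # the spike is replaced by its neighbours
```

For image-based facial detection the same idea is applied in two dimensions over pixel neighbourhoods.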
, or FACS, as created by Paul Ekman and Wallace V. Friesen in 1978 based on earlier work by Carl-Herman Hjortsjö
Pitch range – measures the spread between the maximum and minimum frequency of an utterance.
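Frequency features such as average pitch and pitch range can be computed directly from a fundamental-frequency (F0) contour. A minimal sketch, assuming a hypothetical hard-coded contour where a pitch tracker would normally be used:

```python
import numpy as np

# Hypothetical F0 contour in Hz, one value per 10 ms frame;
# zeros mark unvoiced frames.
f0 = np.array([0, 110, 115, 120, 0, 130, 140, 125, 0, 0], dtype=float)

voiced = f0[f0 > 0]                        # keep only voiced frames
average_pitch = voiced.mean()              # how high/low the speaker speaks
pitch_range = voiced.max() - voiced.min()  # spread between max and min F0

print(average_pitch, pitch_range)
```

The same contour also yields slope-based features such as accent shape and final lowering by looking at how F0 changes over time.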
Electronic devices such as robots are increasingly able to recognise and mimic human emotion
, triggered an explosion of interest in the emotional side of computers and their users.
approach taken by Kirsten Boehner and others which views emotion as inherently social.
Area of research in computer science aiming to understand the emotional state of users
The following sections consider many of the kinds of input data used for the task of emotion recognition.
expression recognition purpose. Two of the widely used databases are CK+ and JAFFE.
Final lowering – the amount by which the frequency falls at the end of an utterance.
Rosalind Picard, a genial MIT professor, is the field's godmother; her 1997 book, Affective Computing,
Speech rate – describes the rate of words or syllables uttered over a unit of time
not all of which are encoded in facial muscles. The newly included emotions are:
Stress frequency – measures the rate of occurrence of pitch-accented utterances.
The process of speech/text affect detection requires the creation of a reliable database, knowledge base, or vector space model
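In a vector space model, labelled texts and new utterances are represented as bag-of-words vectors and compared by similarity. A minimal sketch with a made-up two-class lexicon (the prototypes and query are illustrative, not from any real corpus):

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Toy affect "database": one bag-of-words prototype per class.
prototypes = {
    "joy":   Counter("great wonderful happy day".split()),
    "anger": Counter("terrible awful hate this".split()),
}

query = Counter("what a wonderful happy surprise".split())
label = max(prototypes, key=lambda k: cosine(query, prototypes[k]))
print(label)
```

Real systems use far larger labelled databases and weighted term vectors, but the comparison step works the same way.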
Pitch Discontinuity – describes the transitions of the fundamental frequency.
Brilliance – describes the dominance of high or low frequencies in the speech.
, relates emotions to the broader issues of machine intelligence, stating in The Emotion Machine that emotion is "not especially different from the processes that we call 'thinking.'"
Accent shape – affected by the rate of change of the fundamental frequency.
capture speech. Other sensors detect emotional cues by directly measuring
. One of the motivations for the research is the ability to give machines emotional intelligence, including to simulate empathy.
Affective video games can access their players' emotional states through
A subject's blood volume pulse (BVP) can be measured by a process called photoplethysmography
Pause Discontinuity – describes the transitions between sound and silence
devices. A particularly simple form of biofeedback is available through
quality of training and shortening the required training duration.
, the more modern branch of computer science originated with Rosalind Picard
Detecting emotional information usually begins with passive sensors that capture data about the user's physical state or behavior without interpreting the input.
robust estimation of the subject's emotional state.
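Combining modalities is often done by late fusion of per-modality class posteriors. A minimal sketch, assuming three hypothetical classifiers have already produced probability vectors over the same affective states:

```python
import numpy as np

labels = ["neutral", "happy", "angry"]

# Hypothetical posteriors from independent face, voice and physiology models.
face   = np.array([0.20, 0.70, 0.10])
voice  = np.array([0.30, 0.50, 0.20])
physio = np.array([0.25, 0.60, 0.15])

# Naive-Bayes-style late fusion: multiply the posteriors (treating the
# modalities as conditionally independent) and renormalise.
fused = face * voice * physio
fused /= fused.sum()

print(labels[int(np.argmax(fused))])
```

Weighted averaging or a trained fusion classifier are common alternatives when the independence assumption is too strong.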
Breathiness – measures the aspiration noise in speech
Skin conductance is often measured using two small silver-silver chloride electrodes
Affective computing has potential applications in human–computer interaction
's 1995 paper on affective computing and her book Affective Computing
Voice quality parameters and energy descriptors:
, one of the pioneering computer scientists in
Detecting and recognizing emotional information
. It is an interdisciplinary field spanning computer science, psychology, and cognitive science.
Cognitivist vs. interactional approaches
; at the other end of the scale are
received by the respective market.
data, such as skin temperature and galvanic resistance.
Challenges in facial detection
, and facial color patterns.
Facial Action Coding System
Facial expression databases
Frequency characteristics:
330:human–computer interaction
6287:
5590:Applied behavior analysis
5562:
5370:
5320:
4760:
3996:
3933:Computational engineering
3908:Computational mathematics
3054:
2751:10.1007/s11257-011-9107-7
1961:October 19, 2013, at the
1784:"FAU Aibo Emotion Corpus"
1618:Proceedings of Eurospeech
1407:10.1016/j.mex.2023.102149
1151:Picard, Rosalind (1997).
1118:10.1093/mind/os-IX.34.188
907:brain–computer interfaces
channels of information).
Other challenges include
Physiological monitoring
298:Decision tree algorithms
6451:Alternate reality games
5866:Behavioral neuroscience
5430:Behavioral neuroscience
3918:Computational chemistry
3852:Photograph manipulation
3743:Artificial intelligence
3559:Decision support system
2572:Entertainment Computing
2414:Yonck, Richard (2017).
2383:10.1145/3290607.3312824
2152:10.1073/pnas.1716084115
1485:10.1023/a:1013215010749
1479:(1). Springer: 83–104.
1264:10.17011/ht/urn.2006159
1100:James, William (1884).
1018:Emotion Markup Language
Facial electromyography
Facial affect detection
Time-related features:
Potential applications
Galvanic skin response
Emotion classification
Within the field of
Training methods of
Psychomotor training
Pride in achievement
Emotion in machines
Affective computing
Blood volume pulse
Speech descriptors
Sensory pleasure
Emotional speech
115:simulate empathy
83:computer science
56:
53:
47:
35:
34:
27:
21:
6578:
6577:
6573:
6572:
6571:
6569:
6568:
6567:
6548:
6547:
6546:
6541:
6505:
6429:
6388:Performing arts
6366:
6364:Pervasive games
6361:
6331:
6326:
6283:
6259:Psychotherapies
6220:
6177:Martin Seligman
6142:Daniel Kahneman
6082:Richard Lazarus
6032:Raymond Cattell
5936:
5927:
5926:
5925:
5837:
5749:
5576:
5569:
5560:
5521:Neuropsychology
5401:
5394:
5366:
5361:
5331:
5326:
5316:
5257:Jealousy in art
5000:in conversation
4922:Amygdala hijack
4835:
4773:
4767:
4758:
4747:sense of wonder
4075:
4065:
4035:
4030:
4021:
3992:
3973:Word processing
3881:
3867:Virtual reality
3828:
3790:
3761:Computer vision
3737:
3733:Multiprocessing
3699:
3661:
3627:Security hacker
3603:
3579:Digital library
3520:
3471:Mathematics of
3466:
3428:
3404:Automata theory
3399:Formal language
3373:
3339:Software design
3310:
3243:
3229:Virtual machine
3207:
3203:Network service
3164:
3155:Embedded system
3128:
3061:
3050:
3045:
3010:
2995:
2980:10.1.1.180.6429
2964:
2961:
2956:
2944:
2939:
2938:
2934:
2923:Sengers, Phoebe
2916:
2915:
2911:
2880:Sengers, Phoebe
2873:
2872:
2857:
2848:
2846:
2842:
2819:10.1.1.294.9178
2801:
2796:
2795:
2791:
2774:
2773:
2769:
2731:
2730:
2726:
2664:
2663:
2659:
2637:
2636:
2632:
2625:
2604:
2603:
2599:
2567:
2562:
2561:
2557:
2550:
2519:
2518:
2514:
2505:
2503:
2499:
2488:
2481:
2480:
2476:
2456:
2452:
2445:
2441:
2426:
2413:
2412:
2408:
2393:
2368:
2367:
2363:
2309:
2308:
2304:
2291:
2290:
2286:
2260:
2259:
2255:
2242:
2238:
2225:
2221:
2183:
2182:
2178:
2124:
2123:
2116:
2107:
2103:
2098:
2091:
2057:
2052:
2051:
2044:
2039:
2035:
2029:Wayback Machine
2019:
2015:
2007:
2003:
1997:Wayback Machine
1987:
1983:
1974:
1973:
1969:
1963:Wayback Machine
1954:
1950:
1941:
1934:
1924:
1923:
1919:
1906:
1905:
1901:
1894:Springer-Verlag
1879:
1878:
1874:
1862:
1861:
1857:
1842:
1811:
1810:
1806:
1798:
1791:
1781:
1780:
1773:
1764:
1757:
1739:
1738:
1734:
1726:
1722:
1714:
1707:
1700:
1683:
1682:
1678:
1670:
1666:
1658:
1651:
1627:10.1.1.420.8158
1615:
1614:
1610:
1582:
1577:
1576:
1572:
1567:
1563:
1548:
1527:
1526:
1522:
1517:
1508:
1468:
1463:
1462:
1458:
1453:(1): 1589–1608.
1442:
1437:
1436:
1432:
1386:
1385:
1381:
1331:
1330:
1326:
1317:
1315:
1306:
1305:
1301:
1291:
1290:
1286:
1277:
1275:
1271:
1244:
1239:
1238:
1234:
1217:
1215:
1201:
1200:
1196:
1183:
1181:
1180:on May 28, 2008
1177:
1170:
1165:
1164:
1160:
1150:
1149:
1142:
1130:
1126:
1112:(34): 188–205.
1099:
1098:
1094:
1072:
1071:
1067:
1063:
1058:
1053:
988:
982:interactions".
960:Rosalind Picard
952:
932:
919:
891:
872:
863:
854:
849:
840:
831:
822:
817:
800:
794:
776:
770:
753:
744:
732:
727:
706:
693:
687:
650:
565:
559:
450:
444:
431:
425:
404:
342:
317:
245:
225:
205:
168:
128:
123:
113:, including to
99:Rosalind Picard
64:
57:
51:
48:
45:
36:
32:
23:
22:
15:
12:
11:
5:
6576:
6574:
6566:
6565:
6560:
6550:
6549:
6543:
6542:
6540:
6539:
6534:
6529:
6527:Jane McGonigal
6524:
6522:Eric Zimmerman
6519:
6513:
6511:
6507:
6506:
6504:
6503:
6498:
6493:
6491:Treasure hunts
6488:
6483:
6478:
6473:
6468:
6463:
6458:
6453:
6448:
6443:
6437:
6435:
6431:
6430:
6428:
6427:
6422:
6417:
6412:
6407:
6402:
6401:
6400:
6390:
6385:
6380:
6374:
6372:
6368:
6367:
6362:
6360:
6359:
6352:
6345:
6337:
6328:
6327:
6325:
6324:
6319:
6314:
6309:
6304:
6299:
6294:
6288:
6285:
6284:
6282:
6281:
6276:
6271:
6266:
6261:
6256:
6251:
6246:
6241:
6236:
6230:
6228:
6222:
6221:
6219:
6217:Roy Baumeister
6214:
6209:
6204:
6199:
6194:
6189:
6184:
6179:
6174:
6169:
6164:
6159:
6154:
6152:Michael Posner
6149:
6144:
6139:
6137:Elliot Aronson
6134:
6132:Walter Mischel
6129:
6124:
6119:
6114:
6109:
6104:
6099:
6097:Albert Bandura
6094:
6089:
6084:
6079:
6074:
6072:Leon Festinger
6069:
6064:
6059:
6054:
6049:
6044:
6042:Neal E. Miller
6039:
6037:Abraham Maslow
6034:
6029:
6024:
6022:Ernest Hilgard
6019:
6017:Donald O. Hebb
6014:
6009:
6004:
5999:
5997:J. P. Guilford
5994:
5992:Gordon Allport
5989:
5984:
5979:
5974:
5972:John B. Watson
5969:
5964:
5959:
5954:
5949:
5944:
5939:
5937:
5932:
5929:
5928:
5924:
5923:
5918:
5913:
5908:
5903:
5898:
5893:
5888:
5883:
5878:
5873:
5868:
5863:
5858:
5853:
5847:
5846:
5845:
5843:
5839:
5838:
5836:
5835:
5830:
5825:
5820:
5815:
5810:
5805:
5800:
5795:
5790:
5785:
5780:
5775:
5770:
5765:
5763:Animal testing
5759:
5757:
5751:
5750:
5748:
5747:
5742:
5737:
5732:
5727:
5722:
5717:
5712:
5707:
5702:
5697:
5692:
5687:
5682:
5677:
5672:
5667:
5662:
5657:
5652:
5647:
5642:
5637:
5632:
5627:
5622:
5617:
5612:
5607:
5602:
5597:
5592:
5587:
5581:
5579:
5571:
5570:
5563:
5561:
5559:
5558:
5553:
5548:
5543:
5538:
5533:
5528:
5523:
5518:
5513:
5508:
5503:
5498:
5493:
5488:
5483:
5478:
5473:
5468:
5466:Cross-cultural
5463:
5458:
5457:
5456:
5446:
5437:
5432:
5427:
5422:
5417:
5412:
5406:
5404:
5396:
5395:
5393:
5392:
5387:
5382:
5377:
5371:
5368:
5367:
5362:
5360:
5359:
5352:
5345:
5337:
5328:
5327:
5321:
5318:
5317:
5315:
5314:
5313:
5312:
5310:somatic marker
5307:
5302:
5297:
5292:
5284:
5282:Stoic passions
5279:
5274:
5269:
5264:
5259:
5254:
5249:
5244:
5239:
5238:
5237:
5232:
5230:social sharing
5227:
5222:
5220:self-conscious
5217:
5212:
5207:
5202:
5197:
5192:
5184:
5183:
5182:
5172:
5171:
5170:
5165:
5163:thought method
5160:
5155:
5150:
5145:
5140:
5135:
5130:
5128:lateralization
5125:
5120:
5115:
5110:
5105:
5104:
5103:
5098:
5088:
5087:
5086:
5076:
5071:
5066:
5061:
5056:
5051:
5046:
5041:
5036:
5031:
5023:
5022:
5021:
5016:
5015:
5014:
5004:
5003:
5002:
4992:
4987:
4982:
4977:
4972:
4967:
4962:
4957:
4955:classification
4952:
4947:
4942:
4937:
4932:
4924:
4919:
4914:
4913:
4912:
4907:
4899:
4898:
4897:
4892:
4887:
4882:
4877:
4869:
4868:
4867:
4862:
4857:
4852:
4843:
4841:
4837:
4836:
4834:
4833:
4824:
4819:
4814:
4809:
4804:
4799:
4794:
4789: