
Enhanced bodily states of fear facilitate biased perception of fearful faces

Abstract

We investigated whether enhanced interoceptive bodily states of fear would facilitate recognition of fearful faces. Participants performed an emotional judgment task after a bodily imagery task inside a functional magnetic resonance imaging scanner. In the bodily imagery task, participants were instructed to imagine feeling the bodily sensations of two specific somatotopic patterns: a fear-associated bodily sensation (FBS) or a disgust-associated bodily sensation (DBS). They were then shown faces expressing various levels of fearfulness and disgust and instructed to classify each facial expression as fear or disgust. We found a stronger bias favoring the “fearful face” under the congruent FBS condition than under the incongruent DBS condition. The brain response to fearful versus intermediate faces increased in the fronto-insular-temporal network under the FBS condition, but not the DBS condition. The fearful face elicited activity in the anterior cingulate cortex and extrastriate body area under the FBS condition relative to the DBS condition. Furthermore, functional connectivity between the anterior cingulate cortex/extrastriate body area and the fronto-insular-temporal network was modulated according to the specific bodily sensation. Our findings suggest that somatotopic patterns of bodily sensation provide informative access to the collective visceral state in fear processing via the fronto-insular-temporal network.

Introduction

Physiological feedback plays an important role in the perception of emotion [1], which is thought to be either the subjective experience of a physiological reaction to emotional stimuli or the physiological reaction itself [2]. Debate concerning this concept has focused on whether a distinct physiological state accompanies each specific emotion [3,4,5]. Recent studies propose that perceived emotion is a product not only of ascending emotional stimuli but also of the reciprocal interaction between descending inference and internal states [6,7,8,9]. Accurate discrimination of facial expressions is important for social functioning. The cognitive processes underlying emotional face recognition are known to be strongly associated with affective disorders, such as major depressive disorder (MDD) and anxiety disorders [10]. Patients with MDD show inaccurate recognition of facial expressions of the six basic emotions [11] and of neutral faces [12]. Surcinelli and colleagues found more accurate recognition of fearful faces in participants with high trait anxiety than in those with low trait anxiety [13]. Furthermore, socially anxious individuals show a bias toward recognizing facial expressions as angry [14].

The core function of the brain is homeostatic regulation of the physiological state to promote survival. In contrast to the standard regulatory model, in which errors are corrected via feedback, a newer model, “allostasis,” proposes predictive regulation in which changes in the visceral state are anticipated and countered before they arise [15]. The concept of “predictive coding” can be explained as hierarchical Bayesian inference about the hidden causes of our sensations [16,17,18]. Recently, the predictive coding framework has been applied to interoception. “Interoceptive inference” envisions a subjective emotional feeling as arising from predictive models of the causes of interoceptive afferents [19]. Nummenmaa et al. recently showed that different emotional states are associated with distinct bodily sensation maps, suggesting that different bodily sensation patterns originate from the different physiological conditions underlying emotion [20, 21]. Our previous study showed that somatotopic bodily sensation patterns provide a channel for inferring bodily state [22]. In this context, the somatotopic pattern of bodily sensation may provide efficient access to the collective interoceptive information. However, few studies have investigated whether manipulations of emotion-specific bodily sensation patterns can affect emotion perception.

Emerging evidence suggests that the brain functions as a generative model of the world, using past experience to construct the present [23]. The interoceptive predictive coding hypothesis suggests that the conscious sense of presence depends on an interoceptive comparator that integrates ascending visceral signals with descending predictions [19]. Interoceptive experiences are formed from probabilistic inference about the causes of viscerosensory inputs [19]. The Embodied Predictive Interoception Coding (EPIC) model proposes that bodily predictions act as a binding pacemaker signal to create a core neuronal network workspace [24, 25]. A recent study showed that unexpected, unconscious surges of interoceptive arousal regulate the influence of sensory noise on perceptual awareness [26]. The insular cortex is assumed to be the principal cortical region integrating low-level sensory prediction errors with interoceptive and attentional expectations to regulate affective salience and emotion [19, 27]. The anterior insular cortex, a center of awareness of subjective feeling, constitutes a site for multimodal integration of interoceptive and exteroceptive signals through interoceptive predictions [27]. It not only integrates bottom-up interoceptive prediction errors with top-down predictions from higher-order cortical areas, but also sends descending predictions to the visceral system that provide a point of reference for autonomic reflexes and for generating future awareness [28]. Bodily sensations are in part a reflection of what the brain predicts, and the idea of interoceptive inference is explained within the context of bodily homeostasis [27]. From the perspective of the active inference framework, the emotional representation includes not only the external environment but also interoceptive sensations from the body [27].

The present study investigated whether synchronization with the bodily signature of fear enhances biased perception of fearful faces. We used a bodily imagery task in which participants were instructed to imagine the bodily sensations depicted on bodily sensation maps of fear or disgust. Immediately after the bodily imagery task, participants were asked to judge emotional facial expressions as fear or disgust. We hypothesized that an enhanced interoceptive bodily state of fear would facilitate recognition of fearful faces and increase activity in the fronto-insular-temporal network.

Methods

Participants

In total, 17 healthy student volunteers (22.8 ± 2.2 years; eight females) were recruited from Kyung Hee and Korea Universities by advertisement. No participant had a history of neurological, psychiatric, or other major medical problems, and no participant was taking medication at the time of the study. Participants were instructed not to drink alcohol or caffeine or take any medication the day before study participation. All participants provided written informed consent before the experiments. The Institutional Review Board of Korea University approved all study protocols (KU-IRB-15-108-A-1). We performed a power analysis using the NeuroPower tool, which provides sample size calculations for fMRI experiments [29]. A total sample size of 16 participants was required to achieve power above 0.8 with the following analysis options: cluster-forming threshold p < 0.001; alpha level < 0.05; random-field-theory correction.

Experimental stimuli and tasks

Participants performed an emotional judgment task after a bodily imagery task inside a functional magnetic resonance imaging (fMRI) scanner. For the bodily imagery task, we generated somatotopic images, which were averaged maps of emotion-specific bodily sensations obtained in our previous study [22]. The participants in the present study were volunteers from that previous study who agreed to participate in an additional experiment. In the previous study, we recorded bodily sensations related to six basic emotions (happiness, sadness, fear, disgust, surprise, and anger) reported by 31 subjects on a somatotopic map. In the present study, we used fear-associated bodily sensation (FBS) and disgust-associated bodily sensation (DBS) somatotopic maps in the bodily imagery task.

During the bodily imagery task, participants were asked to imagine the bodily sensations depicted in the FBS or DBS somatotopic maps. They were told to focus attention on the different somatotopic locations and to imagine feeling the bodily sensations in their own body. Importantly, subjects received no information about which emotion was associated with each somatotopic map; the maps were labeled somatotopic pattern 1 and somatotopic pattern 2 throughout the procedure. Participants were later asked to guess from which of five emotions (fear, disgust, happiness, sadness, and anger) each bodily sensation pattern was derived. The emotional judgment task immediately followed the bodily imagery task. Participants were asked to look at an image of a facial expression, one of five morphed facial expressions ranging from fear to disgust, and then to classify the expression as fearful or disgusted (Fig. 1).

Fig. 1

Experimental procedure during fMRI scanning. Inside the fMRI scanner, the bodily imagery task required participants to view a somatotopic map that flickered twice during an 8-s period with a 4-s cycle. Participants were instructed to imagine the bodily sensation depicted by the somatotopic pattern. The somatotopic maps were representative sensation maps for fear or disgust generated by averaging the sensation patterns recorded in subjects after watching video clips containing fearful or disgusting stimuli in our previous study. After the bodily imagery task, a face with a morphed facial expression between fear and disgust appeared for 2 s. In this two-alternative forced-choice task, participants were given 4 s to classify the expression as fearful or disgusted

Generation of bodily sensation maps for fear and disgust

In our previous study, we recorded the somatotopic patterns associated with bodily sensations shortly after inducing a specific emotion [22]. After viewing an emotional video clip, participants were asked to mark the location of their bodily sensations on a somatotopic map presented on an iPad (Apple Inc., Cupertino, CA, USA) using a bodily sensation map-emotion (BSM-E) application with a template of the human body as two-dimensional frontal images. We generated a representative sensation map for each emotion (fear and disgust) by averaging the extracted FBS and DBS somatotopic patterns.

The fear-associated bodily sensations were distributed primarily over the heart and both eyes, and the disgust-associated bodily sensations were distributed mainly along a straight line connecting the mouth, neck, and chest. These patterns are in agreement with those reported by Nummenmaa et al. [20, 21]. The averaged map was visualized using an “afmhot” color map with a black-red-yellow-white gradient, where white represented the location with the most intense sensation. We selected bodily states associated with fear and disgust because they are primitive emotions with opposite physiological properties [30]. Furthermore, a previous study found that a perceptual choice task successfully discriminated between fearful and disgusted faces [31] (Fig. 1).

Bodily imagery task

Participants viewed a somatotopic map (FBS or DBS) that flickered twice during an 8-s period with a 4-s cycle. They were instructed to breathe in time with the flickering of the image, with the peak of inhalation matched to the brightest moment of the flicker. While breathing in this way, they were asked to imagine the strong feelings illustrated by the somatotopic pattern.

Morphed emotional facial expressions

Facial expressions that morphed between fear and disgust were generated for the emotional judgment task. The original stimuli were 16 pictures of emotional facial expressions (eight identities: four each of male fearful and disgusted faces and four each of female fearful and disgusted faces selected from the Karolinska Directed Emotional Faces (KDEF) database (https://www.emotionlab.se/resources/kdef)).

For each identity, five levels of facial expression, from 100% fearful to 100% disgusted, were generated with the Python morphing library Facemorpher (https://pypi.python.org/pypi/facemorpher/). The five facial expressions included the two original faces expressing 100% fear and 100% disgust; the remaining three were 75% fearful/25% disgusted, 50% fearful/50% disgusted, and 25% fearful/75% disgusted.
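As a rough illustration of how the five morph levels relate the two source expressions, the sketch below blends two images as weighted averages. This is a simplified cross-dissolve for intuition only: Facemorpher additionally aligns facial landmarks before blending, and the small arrays here are toy stand-ins for face images.

```python
import numpy as np

def morph_levels(fear_img, disgust_img, levels=(1.0, 0.75, 0.5, 0.25, 0.0)):
    """Blend two images at the given fear weights (1.0 = 100% fearful).

    Simplified pixel-wise cross-dissolve; a landmark-based morpher such as
    Facemorpher also warps facial geometry before blending.
    """
    return [w * fear_img + (1.0 - w) * disgust_img for w in levels]

# toy 2x2 "images": uniform intensity stands in for a face photograph
fear = np.full((2, 2), 100.0)
disgust = np.full((2, 2), 0.0)
morphs = morph_levels(fear, disgust)
```

The middle entry of `morphs` corresponds to the 50% fearful/50% disgusted intermediate face used in the judgment task.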

Emotional judgment task

The emotional judgment task was a two-alternative forced-choice task classifying the emotional faces into fear or disgust. After the bodily imagery task, a fixation cross was displayed during an interstimulus interval; this was followed by the 2-s presentation of a face pseudo-randomly selected from the five emotional facial expressions. The duration of the interstimulus interval was pseudo-randomly selected from a range between 800 and 3200 ms, with an average of 2000 ms. After a second interstimulus interval showing the fixation cross, participants were asked to classify the presented face as fearful or disgusted. The participants were allowed 4 s to respond.
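One common way to realize a jitter "pseudo-randomly selected from a range between 800 and 3200 ms, with an average of 2000 ms" is to shuffle a fixed set of evenly spaced intervals, which guarantees the target mean exactly. This is a sketch of that standard design trick, not necessarily the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)  # seed chosen for reproducibility of the sketch

def jittered_isis(n_trials, lo=800.0, hi=3200.0):
    """Return n_trials ISIs (ms) evenly spanning [lo, hi] in shuffled order,
    so the sample mean is exactly (lo + hi) / 2 = 2000 ms."""
    isis = np.linspace(lo, hi, n_trials)
    rng.shuffle(isis)
    return isis

isis = jittered_isis(12)  # twelve trials per session, as in the design
```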

Participants were instructed to hold an fMRI-compatible four-button box in their right hand and to press the first or second button to select fear or disgust, respectively. The assignment of buttons was fixed within a session, but it was pseudo-randomly determined over the sessions. Participants were instructed to use their index and middle fingers to press the corresponding buttons. Participants were told which button was assigned to each emotion at every decision point.

Experimental procedure

All experimental procedures were performed inside the fMRI scanner. Participants first underwent a training session to familiarize them with the bodily imagery task, in which they performed the task while viewing a continuously flickering bodily sensation map for 3 min per emotion (fear and disgust). The order of the fear and disgust stimuli was randomly determined.

After the training session, participants underwent the main experimental session. Inside the fMRI scanner, participants performed emotional judgment tasks after a short (8-s) bodily imagery task. The experiment comprised eight sessions. Each session had twelve trials presenting four fearful faces (two 100% and two 75%), four intermediate faces, and four disgusted faces (two 100% and two 75%), and only one type of somatotopic image (FBS or DBS) was presented within a single session. The eight sessions included four FBS and four DBS sessions. The order of the eight sessions was pseudo-randomly determined with the constraint that two consecutive sessions could not use the same somatotopic information. A structural MRI scan was inserted between the two four-session blocks.

After the eight sessions were completed, participants were removed from the fMRI scanner and asked to evaluate the bodily sensations evoked by the bodily imagery task in terms of intensity and spatial distribution. Intensity was evaluated on a scale of 0 (no bodily sensation) to 5 (most intense bodily sensation imaginable). To determine the spatial pattern of bodily sensations under each condition, participants were asked to mark the location of their bodily sensations on a somatotopic map presented on an iPad (Apple Inc.) using a BSM-E application with a template of the human body as two-dimensional frontal images [22, 32, 33].

After being told that the bodily sensation patterns used in the bodily imagery task were derived from a previous experiment in which they had participated, the participants were asked to guess which bodily sensation patterns were derived from each emotion (fear, disgust, happiness, sadness, and anger).

Analysis of behavioral data

The self-reported intensity levels of bodily sensations during the bodily imagery task under the FBS and DBS conditions were compared using paired t-tests. The spatial patterns of bodily sensation were assessed using a pixel-wise univariate t-test for each condition (3dttest++, Analysis of Functional NeuroImages (AFNI), https://afni.nimh.nih.gov/afni) within a mask of the body template. In all statistical parametric map analyses, the false discovery rate (FDR) correction was used to handle statistical inflation from multiple comparisons [22, 32, 33]. The group-level accuracy of participants’ guesses regarding the emotion used to derive the bodily sensation patterns was calculated using the F-beta score. The statistical significance of the accuracy was evaluated against a null distribution generated by 10,000 iterated random guesses among the five emotions.
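The random-guess significance test for the F-beta score can be sketched as follows. The paper's exact scoring may differ (e.g., in how scores are aggregated across emotions); here a per-emotion F-beta and a one-sided permutation p-value are shown, with `true_labels` and the reduced iteration count as illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
EMOTIONS = ["fear", "disgust", "happiness", "sadness", "anger"]

def fbeta(true, pred, positive, beta=1.0):
    """F-beta score treating one emotion as the positive class."""
    true, pred = np.asarray(true), np.asarray(pred)
    tp = np.sum((pred == positive) & (true == positive))
    fp = np.sum((pred == positive) & (true != positive))
    fn = np.sum((pred != positive) & (true == positive))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

def permutation_p(observed, true_labels, positive, n_iter=2_000):
    """Fraction of random-guess F-beta scores at least as large as the
    observed score (one-sided permutation p-value)."""
    null = np.array([
        fbeta(true_labels, rng.choice(EMOTIONS, size=len(true_labels)), positive)
        for _ in range(n_iter)
    ])
    return float(np.mean(null >= observed))

# illustrative: 17 participants each guessing the emotion behind the FBS map
true_labels = ["fear"] * 17
p_fbs = permutation_p(0.146, true_labels, "fear")
```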

Group-level emotional judgment task responses under the FBS and DBS conditions were analyzed using a 2 × 5 repeated-measures analysis of variance (ANOVA) and Tukey’s post hoc test. The statistical tests for the behavioral data were conducted using R 3.4.0. The psychometric curves were fitted using the “quickpsy” package, which uses the maximum-likelihood method [34]. The group-level psychometric function was determined by plotting a cumulative Gaussian model with the group means of its defining parameters (i.e., threshold, slope, guessing rate, and lapsing rate) calculated from the individual fittings of the responses of the 17 participants [35].
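The fitting was done in R with quickpsy; a minimal Python analogue of the same maximum-likelihood idea is sketched below, assuming a cumulative-Gaussian model with guess and lapse parameters. The response counts are illustrative, not the study's data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def psychometric(x, mu, sigma, guess, lapse):
    """Cumulative-Gaussian psychometric function with guess/lapse rates."""
    return guess + (1.0 - guess - lapse) * norm.cdf(x, loc=mu, scale=sigma)

def fit_psychometric(x, n_fear, n_total):
    """Maximum-likelihood fit of (mu, sigma, guess, lapse); mu is the
    mid-point (threshold) compared between conditions in the text."""
    def nll(params):
        mu, sigma, guess, lapse = params
        p = np.clip(psychometric(x, mu, sigma, guess, lapse), 1e-9, 1 - 1e-9)
        # negative binomial log-likelihood of the "fear" response counts
        return -np.sum(n_fear * np.log(p) + (n_total - n_fear) * np.log(1.0 - p))
    res = minimize(nll, x0=[0.5, 0.2, 0.05, 0.05],
                   bounds=[(0.0, 1.0), (1e-3, 1.0), (0.0, 0.2), (0.0, 0.2)])
    return res.x

# morph level coded as proportion fearful; "fear" response counts out of 17
# are illustrative stand-ins for one condition's group data
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
n_fear = np.array([2, 5, 10, 14, 17])
mu, sigma, guess, lapse = fit_psychometric(x, n_fear, np.full(5, 17))
```

A leftward shift of `mu` under the FBS condition corresponds to the reported bias toward classifying intermediate faces as fearful.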

Physiological arousal level measurement

Throughout fMRI scanning, heart rate was monitored using the scanner’s built-in fingertip pulse oximeter. We compared heart rate between the FBS and DBS conditions by examining a 10-s window covering the bodily imagery task and the subsequent interstimulus interval. Heart rate variability (HRV) derived from the pulse oximetry signals was also investigated. Because a 10-s window is too short for reliable HRV measurement, the inter-beat interval (IBI) was extracted from the entire procedure (4 min 30 s) of each session. IBI data were then resampled to 4 Hz using cubic interpolation. Amplitudes of the high-frequency (HF: 0.15–0.4 Hz) and low-frequency (LF: 0.05–0.15 Hz) components were extracted in the time-frequency domain using the Python hrv library (https://github.com/rhenanbartels/hrv).
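The study used the hrv library for these steps; the self-contained sketch below approximates the same pipeline (IBI series, 4-Hz cubic resampling, LF/HF band power) with scipy instead, on a synthetic recording. The beat series, Welch parameters, and band definitions are assumptions for illustration.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import welch

def hrv_band_ratios(beat_times, fs=4.0, lf=(0.05, 0.15), hf=(0.15, 0.4)):
    """Normalized LF and HF power from a series of beat times (seconds).

    The IBI series is resampled to an even 4-Hz grid by cubic interpolation,
    then band power is read off a Welch spectrum.
    """
    ibi = np.diff(beat_times)        # inter-beat intervals (s)
    t = beat_times[1:]               # time stamp of each interval
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    ibi_even = CubicSpline(t, ibi)(grid)
    f, pxx = welch(ibi_even, fs=fs, nperseg=min(256, len(grid)))

    def band_power(lo, hi):
        mask = (f >= lo) & (f < hi)
        return pxx[mask].sum()

    p_lf, p_hf = band_power(*lf), band_power(*hf)
    total = p_lf + p_hf
    return p_lf / total, p_hf / total

# synthetic ~4.5-min recording: ~65 bpm with a 0.1-Hz (LF-band) oscillation
ibis = 0.92 + 0.05 * np.sin(2 * np.pi * 0.1 * 0.92 * np.arange(300))
beat_times = np.cumsum(ibis)
lf_norm, hf_norm = hrv_band_ratios(beat_times)
```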

fMRI acquisition

Structural and functional imaging was performed on a 3 T Siemens Tim Trio magnetic resonance scanner with a head coil attached. As an anatomical reference, a three-dimensional T1-weighted magnetization-prepared rapid gradient echo image dataset was obtained (TR = 2000 ms, TE = 2.37 ms, flip angle = 9°, field of view = 240 × 240 mm2, voxel size = 0.9 × 0.9 × 1.0 mm3, and 192 slices). Blood-oxygen-level-dependent (BOLD) fMRI of the whole brain was conducted using an echo planar imaging (EPI) sequence (TR = 2000 ms, TE = 30 ms, flip angle = 90°, field of view = 240 × 240 mm2, voxel size = 3.8 × 3.8 × 4.0 mm3, and 37 slices).

fMRI analysis

Preprocessing was performed using the AFNI software package [36]. The EPI time-series data were corrected for slice timing and motion, then concatenated and transformed to a common Talairach space [37], registered to the volume with the minimum outlier fraction, spatially blurred using a 6-mm full-width-at-half-maximum (FWHM) Gaussian filter, resampled to a 3-mm isotropic resolution, and scaled to yield a mean of 100 for each voxel. Head movement during the scanning session was assessed prior to any movement correction to the fMRI data.

The nine regressors of interest represented time periods in the experimental procedure. Two regressors represented the timing of the 8-s bodily sensation imagery under the FBS or DBS condition. Six regressors (3 stimuli × 2 conditions) represented the presentation timing of each level of emotional face, i.e., the fearful (100% and 75% fearful), intermediate (50% fearful and 50% disgusted), and disgusted (100% and 75% disgusted) faces, under the two bodily sensation imagination conditions. The final regressor represented the timing of the emotional judgment task. These regressors of interest were fitted to the scan time course using the AFNI program 3dDeconvolve [36]. The six motion-correction parameters from the realignment procedure were entered as covariates of no interest. Regressors were convolved with a gamma-variate hemodynamic response function.
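The construction of one such regressor can be sketched as follows: a boxcar marking the stimulus epochs is convolved with a gamma-variate hemodynamic response function (HRF). The actual design matrices were built by AFNI's 3dDeconvolve; the HRF parameters here are AFNI GAM-style defaults assumed for illustration.

```python
import numpy as np

def gamma_hrf(tr=2.0, p=8.6, q=0.547, length=30.0):
    """Gamma-variate HRF h(t) = (t/(p*q))**p * exp(p - t/q), which peaks
    at t = p*q (about 4.7 s with these assumed AFNI GAM-style defaults)."""
    t = np.arange(0.0, length, tr)
    return (t / (p * q)) ** p * np.exp(p - t / q)

def make_regressor(onsets_s, duration_s, n_vols, tr=2.0):
    """Boxcar marking the stimulus epochs, convolved with the HRF and
    truncated to the scan length (one value per TR)."""
    box = np.zeros(n_vols)
    for onset in onsets_s:
        i0 = int(onset // tr)
        i1 = int((onset + duration_s) // tr)
        box[i0:i1] = 1.0
    return np.convolve(box, gamma_hrf(tr))[:n_vols]

# e.g., an 8-s FBS imagery epoch starting 10 s into a 60-volume run
reg = make_regressor([10.0], 8.0, n_vols=60)
h = gamma_hrf(tr=0.1)  # finely sampled HRF, to inspect its peak
```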

An omnibus two-way within-subject ANOVA was performed to study differences in neural activation arising from two factors (bodily imagination and emotional face stimuli) using AFNI’s 3dANOVA3 program with option type = 4, which specifies a two-way repeated-measures ANOVA. The emotional face stimuli factor had three levels (fearful, intermediate, and disgusted face) and the bodily imagination factor had two levels (FBS and DBS). However, the two-way ANOVA revealed no statistically significant activation for either main effect or the interaction at family-wise error (FWE)-corrected p < 0.05 (regions significant at uncorrected p < 0.001 are listed in Additional file 1: Table S1).

In the emotional judgment task, the proportion of intermediate faces judged as fearful differed significantly between the FBS and DBS conditions (63.2 ± 3.7% under FBS; 51.5 ± 4.7% under DBS). We therefore investigated correlations between individual behavioral bias (the between-condition difference in the proportion of intermediate faces judged as fearful) and the contrast of brain activation to intermediate faces between the FBS and DBS conditions. Brain regions showing correlations between behavioral outcomes and brain activation across individuals were tested using the analysis of covariance (ANCOVA) function of the 3dttest++ program (AFNI, https://afni.nimh.nih.gov/afni). However, no activation was statistically significant at FWE < 0.05 (regions significant at uncorrected p < 0.005 are listed in Additional file 1: Table S2).

A secondary analysis was performed to evaluate differences in the neural responses to emotional (fearful and disgusted) versus intermediate faces under the FBS and DBS conditions individually: univariate t-tests were performed on two contrast images, fearful versus intermediate face and disgusted versus intermediate face. Cluster threshold criteria were determined using Monte Carlo simulations, which resulted in an FWE-corrected significance threshold of p < 0.05 [38, 39]. The spatial smoothness of the data was estimated with a non-Gaussian spatial autocorrelation function, which greatly reduces false positive rates (FPRs). A modified version of the AFNI 3dFWHMx software with an autocorrelation function was used to extract the actual smoothness for each participant. The mean FWHM value across participants was then used in a permutation test to determine the cluster-wise threshold. The permutation test generated a null distribution by randomizing the signs of the residuals among subjects. The t-statistic calculations were iterated 10,000 times, and the accumulated distribution of the 10,000 t-statistic maps was used to determine the cluster-size threshold, for various voxel-wise p-values (e.g., 0.01, 0.005, 0.001), that achieves an FPR < 0.05 [40].
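The sign-flipping principle behind this permutation test can be sketched as follows. The actual pipeline thresholds cluster sizes after smoothness estimation; this simplified sketch instead uses a maximum-|t| statistic over voxels to show how sign randomization builds an FWE-controlling null distribution. The data, seed, and permutation count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def signflip_null(contrasts, n_perm=1_000):
    """Null distribution of the maximum |t| across voxels, built by randomly
    flipping the sign of each subject's contrast map (valid when errors are
    symmetric about zero under the null hypothesis)."""
    n_sub, _ = contrasts.shape
    null_max = np.empty(n_perm)
    for i in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n_sub, 1))
        x = contrasts * signs
        t = x.mean(axis=0) / (x.std(axis=0, ddof=1) / np.sqrt(n_sub))
        null_max[i] = np.abs(t).max()
    return null_max

# toy data: 17 subjects x 50 voxels of pure noise
contrasts = rng.standard_normal((17, 50))
null = signflip_null(contrasts, n_perm=500)
crit = float(np.quantile(null, 0.95))  # FWE-corrected |t| threshold
```

Any voxel exceeding `crit` would be significant at FWE < 0.05; the cluster-based version applies the same logic to cluster sizes rather than peak statistics.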

Furthermore, region-of-interest (ROI) analyses were performed by extracting beta estimates for each subject from a priori ROIs: the insula and the amygdala. Increased activation of the amygdala and insula in response to social threat, including facial expressions of fear and disgust, is known to be robust and consistent [41,42,43,44]. ROI masks were extracted using FreeSurfer based on the Desikan-Killiany atlas [45]. Beta estimates were extracted from the brain responses to the fearful face and the disgusted face for each bodily imagery task condition (FBS and DBS) separately using the 3dmaskave function in AFNI. The extracted beta estimates were compared between bodily imagery task conditions using paired t-tests.

The a priori ROI analysis revealed a statistically significant difference between the two bodily imagery task conditions only for the fearful face, not for the disgusted face. The data are visualized with raincloud plots [46]. To identify brain regions that contributed to emotional perception under each bodily sensation condition, a contrast image of the response to the fearful face under the FBS (congruent emotion) and DBS (incongruent emotion) conditions was obtained. We used a paired t-test to assess the difference between the FBS and DBS conditions. The same statistical thresholding method used for the neural evaluation of emotional perception was applied at the same statistical level.

A generalized form of the context-dependent psychophysiological interaction (gPPI) analysis was applied [47, 48] to the whole brain using the anterior cingulate cortex (ACC) and extrastriate body area (EBA) as seed regions. Compared with conventional PPI analysis, gPPI exhibits improved sensitivity and specificity [47, 48]. First, we subtracted the global trend from the original time-series data over the entire experiment. The average time series was then extracted for each subject from the ACC and right EBA clusters, in which group-level activity in response to the fearful face was significantly greater under the FBS condition than under the DBS condition. The extracted average time series of the seed regions were deconvolved using the gamma-variate hemodynamic response function as the impulse response function. The PPI interaction regressors were generated by multiplying the deconvolved seed time series by the onset timing vectors, separately for the FBS and DBS conditions. The multiplied vectors were convolved again with the gamma-variate hemodynamic response function and finally entered into the general linear model analysis. The beta estimates associated with the PPI regressors for FBS and DBS represented the extent to which activity in each voxel correlated with activity in the ACC or EBA under each condition. The group-level analysis was applied to the beta estimates of each regressor using paired t-tests (FBS > DBS). We then identified voxels with significant connectivity differences between the FBS and DBS conditions using the same statistical thresholding method at the same significance level (p < 0.05, FWE correction). All data and code used in this study, together with the conditions for their sharing or re-use, are available upon direct request.
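The interaction-regressor construction described above can be sketched as follows, assuming the seed series has already been deconvolved to the neural level. The HRF parameters, run length, and condition timings are hypothetical stand-ins for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def gamma_hrf(tr=2.0, p=8.6, q=0.547, length=30.0):
    """Gamma-variate HRF (assumed AFNI GAM-style default parameters)."""
    t = np.arange(0.0, length, tr)
    return (t / (p * q)) ** p * np.exp(p - t / q)

def ppi_regressors(seed_neural, cond_boxcars, tr=2.0):
    """One interaction regressor per condition, as in a generalized PPI:
    the (deconvolved) seed series is multiplied by each condition's onset
    boxcar and then re-convolved with the HRF."""
    hrf = gamma_hrf(tr)
    n = len(seed_neural)
    return {name: np.convolve(seed_neural * box, hrf)[:n]
            for name, box in cond_boxcars.items()}

n_vols = 120
seed = rng.standard_normal(n_vols)        # stand-in for a deconvolved ACC/EBA series
fbs = np.zeros(n_vols); fbs[10:14] = 1.0  # hypothetical FBS imagery epoch
dbs = np.zeros(n_vols); dbs[60:64] = 1.0  # hypothetical DBS imagery epoch
ppi = ppi_regressors(seed, {"FBS": fbs, "DBS": dbs})
```

Each resulting regressor is then entered into the voxel-wise general linear model, and the FBS and DBS beta maps are compared with paired t-tests.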

Results

Bodily sensations induced by the bodily imagery task

The intensity of the bodily sensation, assessed using the 0–5 numerical rating scale, was 3.4 ± 0.2 (mean ± standard error of the mean) in response to the FBS and 3.1 ± 0.3 in response to the DBS. A paired t-test revealed that the intensity did not significantly differ between conditions (t = 1.16, p = 0.264; Fig. 2a). The spatial patterns of the bodily sensations induced by the bodily imagery task were reported after the task was completed, and the statistical parametric maps of these sensations under each condition were visualized on a body template. The patterns of the self-reported bodily sensations were well matched with those presented in the bodily imagery task (Fig. 2b). However, participants did not recognize which emotion each bodily sensation pattern corresponded to. The group-level F-beta score for matching the emotion associated with each bodily sensation pattern was 0.146 for the FBS and 0.135 for the DBS; neither differed significantly from the null distribution generated by random simulation (FBS: p = 0.113; DBS: p = 0.198).

Fig. 2

Self-reported bodily sensations according to intensity and spatial pattern, and percentage of decisions favoring the fearful face under the congruent bodily sensation pattern. a Fear/disgust-associated bodily sensation (intensity). The intensity of the bodily sensations did not significantly differ between the conditions (t = 1.16, p = 0.264). b Fear/disgust-associated bodily sensation (spatial patterns). The statistical parametric maps of bodily sensations induced in the bodily imagery task under each condition were visualized on a body template (FBS on the left, DBS on the right). The location of each self-reported bodily sensation was well matched with the patterns presented in the bodily imagery task. c Emotional judgment task. The group-level classification ratio for each morphed face under the FBS (red) and DBS (blue) conditions. The psychometric curves fitted to the classification results are shown in the corresponding colors. The two-way repeated-measures ANOVA revealed a main effect of somatotopic pattern on the classification of the emotional faces (F [1, 16] = 5.191; p = 0.024). Tukey’s HSD post hoc analyses indicated that the emotional recognition bias favoring the “fearful face” was more pronounced under the FBS condition than under the DBS condition

Emotional judgment task

The group-level findings for the emotional judgment task are shown in Fig. 2c according to condition. Under the FBS condition, 14.0 ± 4.0% of prototypical disgusted faces, 87.5 ± 3.2% of prototypical fearful faces, and 63.2 ± 3.7% of intermediate faces (50% fearful and 50% disgusted) were classified as fearful. Under the DBS condition, the corresponding values were 11.0 ± 2.1%, 84.6 ± 3.8%, and 51.5 ± 4.7%, respectively. From the individual psychometric curve fits of the 17 participants’ responses, the mid-points under FBS (0.59 ± 0.03) and under DBS (0.51 ± 0.04) were extracted; a paired t-test revealed a significant difference between them (t = 2.386, p = 0.029). The slopes under FBS (6.94 ± 2.28) and under DBS (12.08 ± 6.64) did not differ significantly (t = -0.767, p = 0.453).

We found a significant difference (2 × 5 repeated-measures ANOVA) in emotional judgment across somatotopic patterns (F [1, 16] = 5.191; p = 0.024) and across emotional facial levels (F [4, 64] = 194.7; p < 0.0001), but no interaction was observed (F [4, 64] = 0.901; p = 0.465). The post hoc test revealed that bias toward the “fearful face” in the emotional judgment task was greater under the FBS condition than under the DBS condition (z-score = 2.281; p = 0.022).

Physiological arousal level measurement

The heart rate during the bodily imagery task was 65.3 ± 1.8 beats per minute in the FBS condition and 65.7 ± 1.8 in the DBS condition; a paired t-test revealed no significant difference between the two conditions (t = 1.30, p = 0.212). The HF and LF components of HRV also showed no significant differences between conditions. The normalized HF value during the bodily imagery task was 0.42 ± 0.22 in the FBS condition and 0.39 ± 0.21 in the DBS condition; the normalized LF value was 0.58 ± 0.22 and 0.61 ± 0.21, respectively. Paired t-tests revealed that the normalized HRV values did not significantly differ between the two conditions (t = 0.538, p = 0.597).

Brain responses to fearful and disgusted faces according to somatotopic condition

Brain activity in response to the fearful face (fearful face > intermediate face) under the FBS (congruent) condition was found in the bilateral anterior insula, dorsolateral prefrontal (dlPFC), and inferior frontal cortices; right posterior insula; secondary somatosensory cortex; middle temporal gyrus (MTG); and middle occipital gyrus (p < 0.05; cluster-wise corrected; Table 1 and Fig. 3a). In contrast, the fearful face did not elicit significant brain activity under the DBS (incongruent) condition. Moreover, the disgusted face did not elicit significant brain activity under the FBS or DBS condition (disgusted face > intermediate face).

Table 1 Response to the fearful face versus the intermediate face under the fear-associated bodily sensation (congruent) condition
Fig. 3

Brain responses to emotional faces according to somatotopic information (FBS and DBS). BOLD response to the emotional face after the bodily imagery task. Brain activity in response to the fearful face (fearful face > intermediate face) under the FBS condition was found in the bilateral regions of the anterior insula and dorsolateral prefrontal and inferior frontal cortices and in the right posterior insula, secondary somatosensory cortex, middle temporal gyrus (extrastriate body area), and middle occipital gyrus (p < 0.05; cluster-wise corrected). In contrast, the fearful face did not evoke significant brain activity under the DBS condition. Moreover, the disgusted face did not evoke a significant brain response (disgusted face > intermediate face) under the FBS or DBS condition. a Beta estimates for insula and amygdala (ROIs). b The ROI analysis of the bilateral amygdala and insula revealed that the FBS condition enhanced the brain response to fearful faces

A subsequent a priori ROI analysis of the bilateral amygdala and insula revealed that the brain response to the fearful face was enhanced under the FBS but not the DBS condition (Fig. 3b). Significant differences were found in three of the four predefined ROIs: the right insula (t = 3.55, p < 0.05; Bonferroni corrected), left insula (t = 3.14, p < 0.05; Bonferroni corrected), and right amygdala (t = 3.45, p < 0.05; Bonferroni corrected). In contrast, the brain response to the disgusted face did not differ between the two imagination conditions. It should be noted that this activation was driven purely by the descending modulation of the prior somatotopic information derived from interoception, because the visually displayed facial images themselves were equivalent across conditions.
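The statistics behind these ROI comparisons are simple to state: a paired t statistic over matched condition pairs, with each p-value multiplied by the number of comparisons (Bonferroni). A minimal sketch; the p-values below are fabricated, not the study's, and the t-statistic helper omits the p-value lookup (which would come from the t distribution, e.g. `scipy.stats.t.sf`) to stay dependency-free.

```python
import math

def paired_t(x, y):
    """Paired t statistic for two matched samples (df = n - 1)."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)
    return mean / math.sqrt(var / n)

def bonferroni(pvals):
    """Adjust p-values for m comparisons: p_adj = min(1, m * p)."""
    m = len(pvals)
    return [min(1.0, m * p) for p in pvals]

# Toy paired data: t = mean(diff) / (sd(diff) / sqrt(n))
t_demo = paired_t([2, 3, 4, 5], [1, 1, 2, 3])

# Fabricated uncorrected p-values for four ROIs
# (right insula, left insula, right amygdala, left amygdala)
adjusted = bonferroni([0.003, 0.007, 0.004, 0.2])
significant = [p < 0.05 for p in adjusted]
```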

Brain activity change to the fearful face between FBS and DBS

There was more pronounced brain activity in response to the same fearful face under the FBS (congruent) condition than under the DBS (incongruent) condition (FBS > DBS). Enhanced activity was observed in the bilateral regions of the ACC (Brodmann area 32) and the right area of the MTG (Brodmann area 37: EBA; p < 0.05; cluster-wise corrected; Fig. 4a and Table 2).

Fig. 4

a Brain responses to the fearful face according to somatotopic condition. Comparison of the brain response to the fearful face under the FBS and DBS conditions revealed enhanced activity in the bilateral regions of the anterior cingulate cortex (ACC, Brodmann area 32) and the right region of middle temporal gyrus (Brodmann area 37) under the FBS condition (p < 0.05; cluster-wise corrected). b Functional connectivity of the anterior cingulate cortex (ACC) and extrastriate body area (EBA) according to somatotopic condition. The ACC and EBA, which encode different bodily sensations, were used as seed regions in the functional connectivity analysis. The dorsolateral prefrontal cortex (dlPFC), insula, operculum, fusiform gyrus, cerebellum, and extrastriate cortex (V4) showed somatotopic pattern-dependent connectivity modulation with the ACC (top). The amygdala (basolateral amygdala; BLA), insula, dlPFC, supramarginal gyrus, and left MTG showed somatotopic pattern-dependent connectivity modulation with the right EBA (p < 0.05; cluster-wise corrected, bottom)

Table 2 Comparison of responses to the fearful face under the fear-associated and disgust-associated bodily sensation conditions

Functional connectivity of the ACC and EBA according to somatotopic condition

Having shown that interoceptive modulation of fearful face recognition is associated with increased activity in the ACC and right EBA, we sought to determine whether dynamic changes in functional connectivity occurred in the ACC and right EBA.

The whole-brain gPPI analysis using the ACC as the seed region revealed significant bodily sensation-dependent connectivity differences between the FBS and DBS conditions in the dlPFC, insula, operculum, fusiform gyrus, cerebellum, and extrastriate cortex (V4). Compared with the DBS (incongruent) condition, the FBS (congruent) condition increased the influence of the ACC on the dlPFC, insula, and operculum and decreased its influence on the fusiform gyrus, cerebellum, and V4 (Fig. 4b and Table 3).
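The core of a PPI-style analysis is an interaction regressor formed from the seed timeseries and a task-condition indicator. A highly simplified sketch with fabricated toy data; real gPPI (ref. 47) builds one such term per condition and operates on the deconvolved neural signal before re-convolving with the hemodynamic response, which this sketch skips.

```python
def ppi_regressor(seed, condition):
    """Interaction (PPI) regressor: elementwise product of the
    mean-centered seed timeseries and a 0/1 condition indicator."""
    mean = sum(seed) / len(seed)
    centered = [s - mean for s in seed]
    return [c * on for c, on in zip(centered, condition)]

# Fabricated seed timeseries and FBS-block indicator (1 = FBS trial)
seed = [0.2, 0.8, 1.0, 0.1, 0.9, 0.3]
fbs_blocks = [0, 1, 1, 0, 1, 0]

# Nonzero only during FBS trials; in a GLM, this regressor's weight
# indexes how much seed-target coupling changes under FBS.
ppi_fbs = ppi_regressor(seed, fbs_blocks)
```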

Table 3 Functional connectivity of the anterior cingulate cortex according to somatotopic condition

The gPPI analysis using the right EBA as the seed region revealed a significant difference in interoceptive modulation-dependent connectivity changes in the amygdala (basolateral amygdala; BLA), insula, dlPFC, supramarginal gyrus, and left MTG. Compared with the DBS (incongruent) condition, the FBS (congruent) condition decreased the influence of the right EBA on the amygdala (BLA), insula, dlPFC, supramarginal gyrus, and left MTG (Fig. 4b and Table 4).

Table 4 Functional connectivity of the extrastriate body area according to somatotopic condition

Discussion

We found that imagining the bodily sensation patterns associated with a fearful state facilitated the classification of morphed emotional faces as fearful. The neuroimaging data revealed a significant increase in the neural response to fearful faces under the FBS condition in brain regions comprising the fronto-insular-temporal network. Subsequent analysis of the fear-associated ROIs, the amygdala and insula, revealed significantly greater activation under the FBS than under the DBS condition. Furthermore, the same fearful face elicited a more pronounced response in the ACC and EBA after the congruent bodily sensation pattern (FBS) was imagined than after the incongruent pattern (DBS) was. The gPPI analysis revealed that the bodily sensation pattern modulated the connectivity between the ACC and the dlPFC/mid-insula/fusiform area and between the EBA and the dlPFC/anterior insula/amygdala, regions that are involved in emotional processing and are components of the fronto-insular-temporal network. The ACC and EBA may thus modulate the processing of bodily sensation patterns related to emotion perception.

In the present study, fearful face recognition was facilitated by enhanced interoceptive bodily states of fear. Recently, a new view of interoceptive inference has been proposed by extending the concept of predictive coding, the notion that the human brain predicts afferent signals based on a generative model, to interoception [19]. In daily life, bodily sensation patterns constantly accompany emotion; thus, within the framework of interoceptive inference, we learn that somatotopic patterns are predictors of the emotional state. Imagining the bodily sensation of a specific emotion in the bodily imagery task increased the probability of predicting the emotional state congruent with that sensation. This finding provides a concrete example of how the inference of interoceptive information can influence emotional face perception. Previous findings support the influence of interoception of the visceral state on the emotional state. Animal studies of fear conditioning have shown that the interoceptive state (hunger versus satiety) can act as a contextual factor signaling the delivery of an electric shock [49, 50]. In humans, the interoceptive state of cardiovascular arousal enhances feelings of fear and anxiety [3, 7], and cardiac signals have been shown to influence body ownership and self-identification [51,52,53]. Our results extend previous findings on the effect of visceral states themselves on emotion by showing that simply imagining a somatotopic sensation pattern can affect emotional processing.

Together with the behavioral data, our findings provide neural evidence that the brain activation patterns arising in response to fearful faces change depending on bodily sensation. Moreover, a selective neural response to the fearful face in the amygdala and insula was observed only under the congruent bodily sensation pattern (FBS) condition. Fearful facial expressions have been shown to evoke consistent neural activity in the amygdala [54,55,56], and the amygdala and insula play an important role in processing the emotional content of facial expressions [54, 57, 58]. Extending these findings, our study suggests that bodily sensation patterns modulate emotional face perception and the associated neural activation in response to a congruent emotional state. One might be concerned that the effect of interoceptive imagination on face recognition reflects more general mechanisms, such as different arousal levels for fear and disgust. However, when we compared the heart rate and the LF and HF components of HRV during the bodily imagery task, we found no significant difference in arousal level between the FBS and DBS conditions.

We compared the neural activity elicited by fearful faces under the FBS and DBS conditions and found that activation in the ACC and EBA was increased for the congruent bodily sensation pattern, suggesting a possible role for these brain regions as modulators of emotional processing according to bodily sensation patterns. In particular, activation of the ACC was confined to Brodmann area 32. This region corresponds to the agranular cortex, which has cortical columns with less laminar differentiation [59,60,61]. Although little empirical evidence for interoceptive inference has been reported, it may be that the agranular cortex in the ACC and anterior insula are involved in interoceptive prediction based on neuroanatomical characteristics [27, 60]. This notion is in agreement with evidence suggesting that the ACC plays a key role when the interoceptive state acts as a contextual factor or an unconditioned stimulus in the conditioning process [62, 63]. Furthermore, a previous study found that neuronal activation was observed in the ACC when an interoceptive threat (hyperventilation task) was anticipated by the presentation of a conditioned cue [64]. Taken together, these findings suggest that the ACC may code the prediction of the emotional state according to bodily sensation patterns.

We found that the EBA showed greater activation to fearful faces under interoceptive inference of the congruent bodily sensation pattern than under the incongruent pattern. The EBA is responsible for the integration of multisensory input associated with the body, including visual and tactile afferent signals [65, 66]. In addition to multisensory integration, the EBA is involved in self-body representation [67,68,69]; specifically, it has been shown to respond more selectively to individual body parts than to the whole body [70]. Furthermore, the EBA is involved in emotional processing and has been shown to play a key role in extracting emotional content from body expressions [71,72,73,74]. Previous studies have shown that emotional body expressions stimulated EBA activity, which was positively correlated with amygdala activity in response to emotional content [71]. The EBA is also involved in processing interoceptive signals as well as visual information related to the body: a recent electroencephalogram study found that synchronous cardiac signals enhanced the visual processing of the body, an effect characterized by enhanced activity in the EBA and the inferior frontal and right basal ganglia–insula regions [75]. Our finding supports previous evidence suggesting that the EBA is a major channel through which interoceptive information is transmitted to other brain areas. The EBA is thought to facilitate the processing of emotion associated with the somatotopic pattern of interoceptive information by modulating the fronto-insular-temporal network, which is involved in high-level cognitive functions [76, 77]. Consistent with this, our gPPI analysis revealed that the connectivity between the EBA and the vlPFC, insula, hippocampus, and amygdala, which constitute the fronto-insular-temporal network, was modulated by imagining bodily sensation patterns.
We observed significantly less connectivity between the EBA and the fronto-insular-temporal network under the congruent bodily sensation pattern than under the incongruent pattern. We regressed out the global mean of every voxel to remove correlations of no interest across the brain; however, this procedure has been shown to shift the correlation distribution to a mean near zero and force the emergence of negative correlations [78,79,80]. Thus, we focused on the difference in connectivity between conditions rather than on the directionality of the difference. Our findings suggest that, during the interoceptive prediction of emotional perception, information about the specific body part associated with sensation is delivered via the EBA, as part of the fronto-insular-temporal network, thereby modulating the processing of emotional stimuli.
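The distribution shift induced by global-mean regression is easy to reproduce on toy data: two series that correlate strongly through a shared component become anti-correlated once that component is regressed out. The data below are fabricated purely to illustrate the effect described in refs. 78-80.

```python
def regress_out(y, g):
    """Residual of y after least-squares regression on g (with intercept)."""
    n = len(y)
    my, mg = sum(y) / n, sum(g) / n
    beta = (sum((a - my) * (b - mg) for a, b in zip(y, g))
            / sum((b - mg) ** 2 for b in g))
    return [a - my - beta * (b - mg) for a, b in zip(y, g)]

def corr(x, y):
    """Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

# Two toy "region" timeseries sharing a common (global) trend
common = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
r1 = [c + d for c, d in zip(common, [0.3, -0.2, 0.1, -0.1, 0.2, -0.3])]
r2 = [c + d for c, d in zip(common, [-0.1, 0.2, -0.3, 0.3, -0.2, 0.1])]
gmean = [(a + b) / 2 for a, b in zip(r1, r2)]

before = corr(r1, r2)
after = corr(regress_out(r1, gmean), regress_out(r2, gmean))
# The strong positive correlation collapses and turns negative
# once the shared global component is removed.
```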

Our study had several limitations. First, we used a 50% fearful/50% disgusted morphed face as the control in evaluating the neural response to the fearful face; although the intermediate face contained less fearful content, it cannot be considered a neutral face. Second, the interoceptive body maps of fear and disgust viewed by participants differed mainly in the involvement of face regions, i.e., the eye and mouth regions. Because the bodily imagery task required participants to attend to these body parts, their perception of the subsequent faces may have been biased toward deeper processing of these facial regions, leading to differential brain responses to fearful expressions. Third, our investigation was restricted to the emotions of fear and disgust; therefore, our findings are not sufficient to confirm the underlying neural pathway by which bodily sensation affects emotional processing across emotions. Neutral somatotopic maps may be one possible control, and future studies are needed to investigate the more general neural nature of this mechanism. Lastly, the sample size of the current study was small. Many researchers have recently raised concerns that the relatively low power of fMRI studies contributes to potentially inflated false-positive rates [81], and the use of heuristic sample-size guidelines may increase the risk of false or exaggerated results. Larger datasets are needed to ensure the reproducibility of neuroimaging research.

In summary, our behavioral and neuroimaging findings support the theory that the top-down inference of bodily sensation can facilitate the corresponding emotional perception. Somatotopic patterns of bodily sensation provide informative access to the collective visceral state. The ACC and EBA were involved in the selective modulation of bodily sensation-dependent connectivity with the fronto-insular-temporal network. Our findings suggest that perceived emotion is the product of ascending emotional stimuli and the reciprocal interaction of the descending inference about internal states.

Availability of data and materials

Please contact author for data requests.

Abbreviations

ACC: Anterior cingulate cortex

AFNI: Analysis of functional neuroimages

ANOVA: Analysis of variance

BSM-E: Bodily sensation map-emotion

DBS: Disgust-associated bodily sensation

dlPFC: Dorsolateral prefrontal cortex

EBA: Extrastriate body area

EPIC: Embodied predictive interoception coding

FBS: Fear-associated bodily sensation

FDR: False discovery rate

fMRI: Functional magnetic resonance imaging

FPRs: False positive rates

FWE: Family-wise error

FWHM: Full-width-at-half-maximum

HRV: Heart rate variability

IBI: Inter-beat interval

KDEF: Karolinska directed emotional faces

MDD: Major depressive disorder

MTG: Middle temporal gyrus

PPI: Psychophysiological interactions

ROI: Region of interest

References

  1. James W: What is an emotion? Mind 1884, os-IX(34):188–205.

  2. Damasio A, Carvalho GB. The nature of feelings: evolutionary and neurobiological origins. Nat Rev Neurosci. 2013;14(2):143–52.

    Article  CAS  PubMed  Google Scholar 

  3. Garfinkel SN, Critchley HD. Threat and the Body: How the Heart Supports Fear Processing. Trends in cognitive sciences. 2016;20(1):34–46.

    Article  PubMed  Google Scholar 

  4. Levenson RW. Blood, sweat, and fears: the autonomic architecture of emotion. Ann N Y Acad Sci. 2003;1000:348–66.

    Article  PubMed  Google Scholar 

  5. Kreibig SD. Autonomic nervous system activity in emotion: a review. Biol Psychol. 2010;84(3):394–421.

    Article  PubMed  Google Scholar 

  6. Craig AD. Interoception: the sense of the physiological condition of the body. Curr Opin Neurobiol. 2003;13(4):500–5.

    Article  CAS  PubMed  Google Scholar 

  7. Garfinkel SN, Minati L, Gray MA, Seth AK, Dolan RJ, Critchley HD. Fear from the heart: sensitivity to fear stimuli depends on individual heartbeats. J Neurosci. 2014;34(19):6573–82.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  8. Tsakiris M, Tajadura-Jimenez A, Costantini M. Just a heartbeat away from one’s body: interoceptive sensitivity predicts malleability of body-representations. Proc Biol Sci. 2011;278(1717):2470–6.

    PubMed  PubMed Central  Google Scholar 

  9. Dunn BD, Galton HC, Morgan R, Evans D, Oliver C, Meyer M, Cusack R, Lawrence AD, Dalgleish T. Listening to your heart. How interoception shapes emotion experience and intuitive decision making. Psychol Sci. 2010;21(12):1835–44.

    Article  PubMed  Google Scholar 

  10. Demenescu LR, Kortekaas R, den Boer JA, Aleman A. Impaired attribution of emotion to facial expressions in anxiety and major depression. PLoS ONE. 2010;5(12):e15058.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  11. Feinberg TE, Rifkin A, Schaffer C, Walker E. Facial discrimination and emotional recognition in schizophrenia and affective disorders. Arch Gen Psychiatry. 1986;43(3):276–9.

    Article  CAS  PubMed  Google Scholar 

  12. Leppanen JM, Milders M, Bell JS, Terriere E, Hietanen JK. Depression biases the recognition of emotionally neutral faces. Psychiatry Res. 2004;128(2):123–33.

    Article  PubMed  Google Scholar 

  13. Surcinelli P, Codispoti M, Montebarocci O, Rossi N, Baldaro B. Facial emotion recognition in trait anxiety. J Anxiety Disord. 2006;20(1):110–7.

    Article  PubMed  Google Scholar 

  14. Bell C, Bourke C, Colhoun H, Carter F, Frampton C, Porter R. The misclassification of facial expressions in generalised social phobia. J Anxiety Disord. 2011;25(2):278–83.

    Article  CAS  PubMed  Google Scholar 

  15. Sterling P. Allostasis: a model of predictive regulation. Physiol Behav. 2012;106(1):5–15.

    Article  CAS  PubMed  Google Scholar 

  16. Knill DC, Pouget A. The Bayesian brain: the role of uncertainty in neural coding and computation. Trends Neurosci. 2004;27(12):712–9.

    Article  CAS  PubMed  Google Scholar 

  17. Friston K. The free-energy principle: a unified brain theory? Nat Rev Neurosci. 2010;11(2):127–38.

    Article  CAS  PubMed  Google Scholar 

  18. O’Reilly JX, Jbabdi S, Behrens TE. How can a Bayesian approach inform neuroscience? Eur J Neurosci. 2012;35(7):1169–79.

    Article  PubMed  Google Scholar 

  19. Seth AK. Interoceptive inference, emotion, and the embodied self. Trends Cogn Sci. 2013;17(11):565–73.n

    Article  PubMed  Google Scholar 

  20. Nummenmaa L, Glerean E, Hari R, Hietanen JK. Bodily maps of emotions. Proc Natl Acad Sci USA. 2014;111(2):646–51.n

    Article  CAS  PubMed  Google Scholar 

  21. Hietanen JK, Glerean E, Hari R, Nummenmaa L. Bodily maps of emotions across child development. Dev Sci. 2016;19(6):1111–8.

    Article  PubMed  Google Scholar 

  22. Jung WM, Ryu Y, Lee YS, Wallraven C, Chae Y. Role of interoceptive accuracy in topographical changes in emotion-induced bodily sensations. PLoS ONE. 2017;12(9):e0183211.

    Article  PubMed  PubMed Central  CAS  Google Scholar 

  23. Chanes L, Barrett LF. Redefining the role of limbic areas in cortical processing. Trends Cogn Sci. 2016;20(2):96–106.

    Article  PubMed  Google Scholar 

  24. Allen M, Friston KJ. From cognitivism to autopoiesis: towards a computational framework for the embodied mind. Synthese. 2018;195(6):2459–82.

    Article  PubMed  Google Scholar 

  25. Owens AP, Allen M, Ondobaka S, Friston KJ. Interoceptive inference: from computational neuroscience to clinic. Neurosci Biobehav Rev. 2018;90:174–83.

    Article  PubMed  Google Scholar 

  26. Allen M, Frank D, Schwarzkopf DS, Fardo F, Winston JS, Hauser TU, Rees G. Unexpected arousal modulates the influence of sensory noise on confidence. Elife. 2016;5:1.

    Google Scholar 

  27. Barrett LF, Simmons WK. Interoceptive predictions in the brain. Nat Rev Neurosci. 2015;16(7):419–29.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  28. Gu X, Hof PR, Friston KJ, Fan J. Anterior insular cortex and emotional awareness. J Comp Neurol. 2013;521(15):3371–88.

    Article  PubMed  PubMed Central  Google Scholar 

  29. Durnez J, Degryse J, Moerkerke B, Seurinck R, Sochat V, Poldrack RA, Nichols TE: Power and sample size calculations for fMRI studies based on the prevalence of active peaks. BioRxiv 2016.

  30. Susskind JM, Lee DH, Cusi A, Feiman R, Grabski W, Anderson AK. Expressing fear enhances sensory acquisition. Nat Neurosci. 2008;11(7):843–50.

    Article  CAS  PubMed  Google Scholar 

  31. Thielscher A, Pessoa L. Neural correlates of perceptual choice and decision making during fear-disgust discrimination. J Neurosci. 2007;27(11):2908–17.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  32. Jung WM, Shim W, Lee T, Park HJ, Ryu Y, Beissner F, Chae Y. More than DeQi: spatial patterns of acupuncture-induced bodily sensations. Front Neurosci. 2016;10:462.

    Article  PubMed  PubMed Central  Google Scholar 

  33. Jung WM, Lee SH, Lee YS, Chae Y. Exploring spatial patterns of acupoint indications from clinical data: A STROBE-compliant article. Medicine. 2017;96(17):e6768.

    Article  PubMed  PubMed Central  Google Scholar 

  34. Wichmann FA, Hill NJ. The psychometric function: I Fitting, sampling, and goodness of fit. Percept Psychophys. 2001;63(8):1293–313.n

    Article  CAS  PubMed  Google Scholar 

  35. Traschutz A, Zinke W, Wegener D. Speed change detection in foveal and peripheral vision. Vision Res. 2012;72:1–13.

    Article  PubMed  Google Scholar 

  36. Cox RW. AFNI: software for analysis and visualization of functional magnetic resonance neuroimages. Comput Biomed Res. 1996;29(3):162–73.

    Article  CAS  PubMed  Google Scholar 

  37. Paus T, Otaky N, Caramanos Z, MacDonald D, Zijdenbos A, D’Avirro D, Gutmans D, Holmes C, Tomaiuolo F, Evans AC. In vivo morphometry of the intrasulcal gray matter in the human cingulate, paracingulate, and superior-rostral sulci: hemispheric asymmetries, gender differences and probability maps. J Comp Neurol. 1996;376(4):664–73.

    Article  CAS  PubMed  Google Scholar 

  38. Forman SD, Cohen JD, Fitzgerald M, Eddy WF, Mintun MA, Noll DC. Improved assessment of significant activation in functional magnetic resonance imaging (fMRI): use of a cluster-size threshold. Magn Reson Med. 1995;33(5):636–47.

    Article  CAS  PubMed  Google Scholar 

  39. de Andrade TG, Peterson KR, Cunha AF, Moreira LS, Fattori A, Saad ST, Costa FF. Identification of novel candidate genes for globin regulation in erythroid cells containing large deletions of the human beta-globin gene cluster. Blood Cells Mol Dis. 2006;37(2):82–90.

    Article  PubMed  CAS  Google Scholar 

  40. Cox RW, Chen G, Glen DR, Reynolds RC, Taylor PA. fMRI clustering and false-positive rates. Proc Natl Acad Sci USA. 2017;114(17):E3370-e3371.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  41. Paulus MP, Stein MB. An insular view of anxiety. Biol Psychiatry. 2006;60(4):383–7.

    Article  PubMed  Google Scholar 

  42. Phillips ML, Williams LM, Heining M, Herba CM, Russell T, Andrew C, Bullmore ET, Brammer MJ, Williams SC, Morgan M, et al. Differential neural responses to overt and covert presentations of facial expressions of fear and disgust. Neuroimage. 2004;21(4):1484–96.

    Article  PubMed  Google Scholar 

  43. Shah SG, Klumpp H, Angstadt M, Nathan PJ, Phan KL. Amygdala and insula response to emotional images in patients with generalized social anxiety disorder. J Psychiatry Neurosci. 2009;34(4):296–302.

    PubMed  PubMed Central  Google Scholar 

  44. Zald DH. The human amygdala and the emotional evaluation of sensory stimuli. Brain Res Brain Res Rev. 2003;41(1):88–123.

    Article  PubMed  Google Scholar 

  45. Desikan RS, Segonne F, Fischl B, Quinn BT, Dickerson BC, Blacker D, Buckner RL, Dale AM, Maguire RP, Hyman BT, et al. An automated labeling system for subdividing the human cerebral cortex on MRI scans into gyral based regions of interest. Neuroimage. 2006;31(3):968–80.

    Article  PubMed  Google Scholar 

  46. Allen M, Poggiali D, Whitaker K, Marshall TR, Kievit RA. Raincloud plots: a multi-platform tool for robust data visualization. Wellcome Open Res. 2019;4:63.

    Article  PubMed  PubMed Central  Google Scholar 

  47. McLaren DG, Ries ML, Xu G, Johnson SC. A generalized form of context-dependent psychophysiological interactions (gPPI): a comparison to standard approaches. Neuroimage. 2012;61(4):1277–86.

    Article  PubMed  Google Scholar 

  48. Cisler JM, Bush K, Steele JS. A comparison of statistical methods for detecting context-modulated functional connectivity in fMRI. Neuroimage. 2014;84:1042–52.

    Article  PubMed  Google Scholar 

  49. Davidson TL, Jarrard LE. A role for hippocampus in the utilization of hunger signals. Behavioral and neural biology. 1993;59(2):167–71.

    Article  CAS  PubMed  Google Scholar 

  50. Davidson TL, Kanoski SE, Chan K, Clegg DJ, Benoit SC, Jarrard LE. Hippocampal lesions impair retention of discriminative responding based on energy state cues. Behav Neurosci. 2010;124(1):97–105.

    Article  PubMed  PubMed Central  Google Scholar 

  51. Suzuki K, Garfinkel SN, Critchley HD, Seth AK. Multisensory integration across exteroceptive and interoceptive domains modulates self-experience in the rubber-hand illusion. Neuropsychologia. 2013;51(13):2909–17.

    Article  PubMed  Google Scholar 

  52. Sel A, Azevedo RT, Tsakiris M: Heartfelt Self: Cardio-Visual Integration Affects Self-Face Recognition and Interoceptive Cortical Processing. Cerebral cortex (New York, NY : 1991) 2016:1–12.

  53. Aspell JE, Heydrich L, Marillier G, Lavanchy T, Herbelin B, Blanke O. Turning body and self inside out: visualized heartbeats alter bodily self-consciousness and tactile perception. Psychol Sci. 2013;24(12):2445–53.

    Article  PubMed  Google Scholar 

  54. Phillips ML, Young AW, Scott SK, Calder AJ, Andrew C, Giampietro V, Williams SC, Bullmore ET, Brammer M, Gray JA. Neural responses to facial and vocal expressions of fear and disgust. Proc Biol Sci. 1998;265(1408):1809–17.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  55. Breiter HC, Etcoff NL, Whalen PJ, Kennedy WA, Rauch SL, Buckner RL, Strauss MM, Hyman SE, Rosen BR. Response and habituation of the human amygdala during visual processing of facial expression. Neuron. 1996;17(5):875–87.

    Article  CAS  PubMed  Google Scholar 

  56. Morris JS, Frith CD, Perrett DI, Rowland D, Young AW, Calder AJ, Dolan RJ. A differential neural response in the human amygdala to fearful and happy facial expressions. Nature. 1996;383(6603):812–5.

    Article  CAS  PubMed  Google Scholar 

  57. Phillips ML, Young AW, Senior C, Brammer M, Andrew C, Calder AJ, Bullmore ET, Perrett DI, Rowland D, Williams SC, et al. A specific neural substrate for perceiving facial expressions of disgust. Nature. 1997;389(6650):495–8.

    Article  CAS  PubMed  Google Scholar 

  58. Haxby JV, Hoffman EA, Gobbini MI. The distributed human neural system for face perception. Trends in cognitive sciences. 2000;4(6):223–33.

    Article  CAS  PubMed  Google Scholar 

  59. Barbas H, Rempel-Clower N: Cortical structure predicts the pattern of corticocortical connections. Cerebral cortex (New York, NY : 1991) 1997, 7(7):635–646.

  60. Shipp S, Adams RA, Friston KJ. Reflections on agranular architecture: predictive coding in the motor cortex. Trends Neurosci. 2013;36(12):706–16.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  61. van den Heuvel MP, Sporns O. An anatomical substrate for integration among functional networks in human cortex. J Neurosci. 2013;33(36):14489–500.

    Article  PubMed  PubMed Central  CAS  Google Scholar 

  62. Critchley HD, Mathias CJ, Dolan RJ. Neuroanatomical basis for first- and second-order representations of bodily states. Nat Neurosci. 2001;4(2):207–12.

    Article  CAS  PubMed  Google Scholar 

  63. Critchley HD, Melmed RN, Featherstone E, Mathias CJ, Dolan RJ. Brain activity during biofeedback relaxation: a functional neuroimaging investigation. Brain. 2001;124(Pt 5):1003–12.

    Article  CAS  PubMed  Google Scholar 

  64. Holtz K, Pane-Farre CA, Wendt J, Lotze M, Hamm AO. Brain activation during anticipation of interoceptive threat. Neuroimage. 2012;61(4):857–65.

    Article  PubMed  Google Scholar 

  65. Costantini M, Haggard P. The rubber hand illusion: sensitivity and reference frame for body ownership. Conscious Cogn. 2007;16(2):229–40.

    Article  PubMed  Google Scholar 

  66. Limanowski J, Lutti A, Blankenburg F. The extrastriate body area is involved in illusory limb ownership. Neuroimage. 2014;86:514–24.

    Article  PubMed  Google Scholar 

  67. Wold A, Limanowski J, Walter H, Blankenburg F. Proprioceptive drift in the rubber hand illusion is intensified following 1 Hz TMS of the left EBA. Front Hum Neurosci. 2014;8:390.

    Article  PubMed  PubMed Central  Google Scholar 

  68. Tsakiris M. My body in the brain: a neurocognitive model of body-ownership. Neuropsychologia. 2010;48(3):703–12.

    Article  PubMed  Google Scholar 

  69. Tsakiris M, Hesse MD, Boy C, Haggard P, Fink GR. Neural signatures of body ownership: a sensory network for bodily self-consciousness. Cereb Cortex. 2007;17(10):2235–44.

    Article  PubMed  Google Scholar 

  70. Taylor JC, Wiggett AJ, Downing PE. Functional MRI analysis of body and body part representations in the extrastriate and fusiform body areas. J Neurophysiol. 2007;98(3):1626–33.

    Article  PubMed  Google Scholar 

  71. Peelen MV, Atkinson AP, Andersson F, Vuilleumier P. Emotional modulation of body-selective visual areas. Social cognitive and affective neuroscience. 2007;2(4):274–83.

    Article  PubMed  PubMed Central  Google Scholar 

  72. Atkinson AP, Vuong QC, Smithson HE. Modulation of the face- and body-selective visual regions by the motion and emotion of point-light face and body stimuli. Neuroimage. 2012;59(2):1700–12.

    Article  PubMed  Google Scholar 

  73. Vuilleumier P, Richardson MP, Armony JL, Driver J, Dolan RJ. Distant influences of amygdala lesion on visual cortical activation during emotional face processing. Nat Neurosci. 2004;7(11):1271–8.

    Article  CAS  PubMed  Google Scholar 

  74. van de Riet WA, Grezes J, de Gelder B. Specific and common brain regions involved in the perception of faces and bodies and the representation of their emotional expressions. Soc Neurosci. 2009;4(2):101–20.

    Article  PubMed  Google Scholar 

  75. Ronchi R, Bernasconi F, Pfeiffer C, Bello-Ruiz J, Kaliuzhna M, Blanke O. Interoceptive signals impact visual processing: Cardiac modulation of visual body perception. Neuroimage. 2017;158:176–85.

    Article  PubMed  Google Scholar 

  76. Downing PE, Peelen MV. The role of occipitotemporal body-selective regions in person perception. Cognitive neuroscience. 2011;2(3–4):186–203.

  77. Amoruso L, Couto B, Ibanez A. Beyond Extrastriate Body Area (EBA) and Fusiform Body Area (FBA): Context Integration in the Meaning of Actions. Front Hum Neurosci. 2011;5:124.

  78. Van Dijk KR, Hedden T, Venkataraman A, Evans KC, Lazar SW, Buckner RL. Intrinsic functional connectivity as a tool for human connectomics: theory, properties, and optimization. J Neurophysiol. 2010;103(1):297–321.

  79. Murphy K, Birn RM, Handwerker DA, Jones TB, Bandettini PA. The impact of global signal regression on resting state correlations: are anti-correlated networks introduced? Neuroimage. 2009;44(3):893–905.

  80. Fox MD, Zhang D, Snyder AZ, Raichle ME. The global signal and observed anticorrelated resting state brain networks. J Neurophysiol. 2009;101(6):3270–83.

  81. Poldrack RA, Baker CI, Durnez J, Gorgolewski KJ, Matthews PM, Munafo MR, Nichols TE, Poline JB, Vul E, Yarkoni T. Scanning the horizon: towards transparent and reproducible neuroimaging research. Nat Rev Neurosci. 2017;18(2):115–26.

Acknowledgements

Not applicable.

Funding

This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (No. 2018R1D1A1B07042313) and the Korea Institute of Oriental Medicine (no. K18181).

Author information

Authors and Affiliations

Authors

Contributions

WMJ, CW, and YC designed the experiments. WMJ, CW, YR, and YSL performed experiments and analyzed data; WMJ, YSL and ISL drafted the manuscript and finished the final version of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Younbyoung Chae.

Ethics declarations

Ethics approval and consent to participate

All participants provided written informed consent before the experiments. The Institutional Review Board of Korea University approved all study protocols (KU-IRB-15-108-A-1).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1: Table S1.

Additional tables.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Jung, WM., Lee, YS., Lee, IS. et al. Enhanced bodily states of fear facilitates bias perception of fearful faces. Mol Brain 13, 157 (2020). https://doi.org/10.1186/s13041-020-00674-6

Keywords