Intuitive thinking

Authored by: Tilmann Betsch, Pablina Roth

The Routledge International Handbook of Thinking and Reasoning

Print publication date:  November  2017
Online publication date:  November  2017

Print ISBN: 9781138849303
eBook ISBN: 9781315725697

10.4324/9781315725697-3

 


Intuitive thinking

Approaching the concept of intuition is a risky endeavor. Traps lurk everywhere and one is prone to be sniped at for disseminating misconceptions about intuitive thinking. Psychologists widely agree that intuition is a phenomenon of paramount importance characterized by distinct properties. Unfortunately, they also disagree about what those properties are. For example, the Take-The-Best heuristic (Gigerenzer & Goldstein, 1999; see also Hoffrage, Hafenbrädl, & Marewski, this volume) is a lexicographic rule of decision making that simply compares arguments or outcomes on the most important dimension while ignoring all the others. The option with the best value on this single dimension is chosen. Some researchers view this heuristic as an example of intuitive thinking (Gigerenzer, 2007). Others conceive such rules as shortcuts to deliberation (T. Betsch, 2008; Frederick, 2002) and show that children have difficulty learning these simple rules (Mata, von Helversen, & Rieskamp, 2011).

We do not see much merit in debating definitions. Instead, in this chapter we wish to illustrate the diversity of theoretical approaches and accompanying proposed processes subsumed under the concept of intuitive thought. This diversity emanates to a considerable extent from the selection of research paradigms and tasks. Specific task environments provide habitats for certain processes to reveal their power and, at the same time, obstruct the flourishing of others. Research habits may occasionally yield voodoo-correlations (Fiedler, 2011) but, at the same time, have important virtues. Tailoring tasks for processes allows us to gain insight into their functioning and power. It is our motivation to convince the reader that intuitive thinking is driven by a multitude of processes with strikingly diverse properties. We begin our illustration of diversity with an attribute list reflecting the well-known two-faculty view of the human mind that underlies most assumptions about intuitive thinking. In the second section, we show that sorting processes into faculties does not remedy the problem that intuitive processes sometimes differ strikingly from each other. We discuss two processes in more detail in the third section and argue that the diversity of processes should be acknowledged on a theoretical level. In the fourth section, we consider some topics much discussed in the literature, such as neural correlates of intuition, the accuracy of intuitive thought, and the role of learning and emotions. We close with a remark on the nature of intuition.

Attribute lists and a two-faculty view

Psychologists often come up with attribute lists to describe the concept of intuition (Winerman, 2005). Typically, these lists are presented as semantic differentials so that intuition is delimited from its alleged opposite, which is given different labels such as reflection, deliberation, or analysis. Attribute lists describing intuition and its counterpart vary strikingly with respect to the number of attributes, content, and precision. Although some attributes are jointly agreed upon by different scholars, the various lists hosted in the literature make no reference to one another and contain contradictory features (e.g., fallible vs. non-fallible; cf. Abernathy & Hamm, 1995 for an early review). To illustrate this problem, Table 3.1 presents an attribute list compiled from three contributions (Evans & Stanovich, 2013; Plessner, Betsch, & Betsch, 2008; special issue on intuition in Psychological Inquiry, Vol. 21, No. 4, 2010). The list is far from comprehensive and could have easily been extended if we considered a larger sample of papers. The content of Table 3.1 strongly reflects a phenomenological orientation. The sheer number of features results in a challenge to construct validity. Attribute lists seem to “replace mystery with mystery” (Hammond, 2010, p. 327) and make us “wonder whether the term [intuition] has any meaning at all” (Epstein, 2008, p. 23). Notwithstanding the diversity, attribute lists share the conceptual idea of contrasting intuition with a reflective opponent. This contrast between intuition and its opponent is familiar to all of us. Sometimes insights, judgments, and decisions suddenly pop into our awareness, and we have no idea how it happened. Other times, we engage in effortful deliberation until we eventually solve a problem and make estimations or decisions under seemingly full conscious control.

Defining intuition in contrast to other types of thinking implies at least three specific assumptions about the faculties in our mind: (1) There is a determinable number of faculties, namely two (and not three, four, or an infinite number); (2) these faculties are discrete entities; and (3) the faculties are alternative ways of thinking. The two-faculty view is not necessarily justified by pure induction. It aligns with a preconception that is deeply rooted in cultural traditions. The Kantian distinction between a priori knowledge and rational justification, the Freudian battle between the conscious and the unconscious, the layperson’s affection for the close and cozy gut reaction that is so often believed to outperform the calculating mind – all of these conceptions still resonate in the psychological two-faculty approach to intuitive and non-intuitive thinking. However, it was not until the late 1970s that the two-faculty view began to proliferate in the literature (e.g., Schneider & Shiffrin, 1977; Shiffrin & Schneider, 1977). In subsequent decades, it became prominent and spread widely across various subfields of our discipline, from social psychology (Chaiken & Trope, 1999) to developmental psychology (Stanovich, West, & Toplak, 2011) to judgment and decision making (Evans, 2008; Gilovich, Griffin, & Kahneman, 2002; Kahneman, 2011) to personality psychology (C. Betsch, 2008; Epstein, Pacini, Denes-Raj, & Heier, 1996) and various areas of practical application (Furley, Schweizer, & Bertrams, 2015; Myers, 2010; Sadler-Smith, 2008; Sinclair, 2010).

Similarly to the diversity of intuitive processes, dual-process theories present a heterogeneous picture. There are various levels on which dual-process theories can be arranged according to specific criteria (Evans, 2008; Evans & Stanovich, 2013) – for example, in regard to terminology (e.g., minds, characters, brains, systems, modes), the degree of interplay between both faculties (e.g., independence, reciprocity, or default-interventionism), and classes of attributes associated with both faculties (e.g., role of consciousness, interpersonal difference, age of evolution). As an answer to the wide array of assumptions, Evans and Stanovich (2013, p. 225) provided a conceptual clean sweep. They separated defining features from correlates. Assuming that there are two qualitatively distinct processes, they define the (intuitive) Type 1 process as autonomous – that is, it can function automatically without cognitive control. Moreover, it does not require working memory to operate. The (reflective) Type 2 process, in contrast, requires working memory. It is capable of performing decoupling operations, such as hypothetical thinking and actively separating representations. The authors claim that all the other labels and attributes carry with them “some semantic baggage” (p. 227) and are not necessary to define the two types of processes (see also Evans, this volume).

Table 3.1   Attributes aligned with intuition in contrast to reasoning

Intuition | Reasoning
fast (1, 2, 3, 4, 7, 9, 10) | slow (1, 3, 4, 7, 10)
parallel (1, 4, 10) | serial (1, 4, 10)
automatic (1, 3, 4, 5, 7, 9, 10) | controlled, deliberative (1, 3, 4, 5, 7, 10)
effortless (1, 4, 6, 10) | effortful (1, 4, 6, 10)
associative (1, 3, 4, 7, 10) | rule-governed (1, 3, 4, 5, 7, 10)
slow-learning (1, 3, 4) | flexible (1, 3, 4)
emotional/affective (1, 2, 3, 4) | neutral (1, 3, 4)
non-analytic (2) | analytic (3, 5, 10)
pattern recognition (2) | arises from symbolic rules (2)
non-symbolic (2) | functional reasoning (2)
lazy thinking (2) |
unavoidable (2) |
infallible (2) |
fallible (2) |
feeling of certainty, confidence (2, 5) |
based on experience (2, 3, 7) | logical learning (3)
preconscious, non-conscious (3, 5, 6, 9) | conscious (3, 5, 6)
concrete (3) | abstract (3, 7)
holistic (3, 10) |
hedonic principle (3) | reality principle (3)
outcome oriented (3) | process oriented (3)
crudely differentiated (3) | highly differentiated (3)
crudely integrated (3) | highly integrated (3)
experienced passively (3) | experienced actively (3)
self-evidently valid (3) | requires justification (3)
tacit (5) |
reactive (5) | proactive (5)
implicit (5) | explicit (5)
approximate (5) | precise (5)
no need for working memory (7, 10) | need for working memory (7, 10)
autonomous (7) | cognitive decoupling (7)
 | consequential decision making (7)
based on small samples (8, 9) |
efficient (9) |

(Numbers index the sources from which each attribute was compiled.)

The diversity of intuitive processes

In this section we show that Type 1 processes have unique properties of their own, even though they are all assigned to the same type (see also Stanovich, 2004). It is far beyond the scope of this chapter to come up with a detailed description of all the processes, rules, and heuristics that are declared intuitive. In contrast to theoretically grounded systematizations of process characteristics of intuition (e.g., Glöckner & Witteman, 2009, 2010), we simply consider two aspects that relate to informational input. A broad classification suffices to achieve three goals. First, we seek to illustrate the diversity of processes that are subsumed by the term intuition. Second, we aim to show that this diversity evokes incoherent conceptions of intuition reflecting markedly different assumptions regarding the capabilities of the human mind. Third, we wish to allude to the design of experimental paradigms and the accompanying task selection as one cause for this diversity. Pre-conceptions of intuition drive the selection of task environments and research paradigms. For instance, the assumption that intuition tends to rest on one reason may direct the researcher’s attention to situations in which samples of information are small due to, for instance, high costs of information access. In contrast, if one assumes that the power of intuition stems from its holistic character, which allows the integration of many pieces of information, one may focus on situations in which the given sample of information is large and costs to access information are low. For example, several heuristics described in the literature are assumed to exploit peripheral cues, proxies, or even irrelevant information. Not surprisingly, evidence of such heuristics is often sought in tasks in which relevant information is difficult to access. As a means to cluster the diverse intuitive processes, we consider two aspects of the stimulus environment in which the operation of a particular process is studied – information quantity and quality.

Quality refers to the state of information. We distinguish between surrogate and primary information. Primary information is commonly described in the literature as normatively relevant, central, on-target, and analytic. Ideally, validity is known and weights in the decision or judgment process can be determined. Primary pieces of information in a persuasion task, for example, are the given arguments that differ in terms of the strength to which they affect attitude change (Petty & Cacioppo, 1986). In decision tasks, described or experienced outcomes of the options differ in terms of value, while their weight is a function of the probability of their occurrence (e.g., Edwards, 1954) or goal-importance (e.g., von Winterfeldt & Edwards, 1986). In probabilistic inference tasks, the cues (e.g., advice givers) differ with regard to their stated or learned validity, i.e., the probability with which they correctly predict an outcome (e.g., Bröder, 2003). In judgments of frequency of occurrence, primary information stems from on-target observations of the number of times that a particular stimulus (e.g., a name) is encountered during an encoding episode (e.g., Hasher & Zacks, 1984).
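
To make the roles of values and weights concrete, the following minimal sketch scores two options by weighting hypothetical outcome values with their probabilities, in the spirit of the classical expected-value framework cited above (Edwards, 1954). The options and numbers are invented for illustration only.

```python
# Minimal sketch: evaluating options from primary information by weighting
# outcome values with their probabilities (invented numbers).

def weighted_value(outcomes):
    """outcomes: list of (probability, value) pairs for one option."""
    return sum(p * v for p, v in outcomes)

option_a = [(0.8, 50), (0.2, -10)]   # invented outcome structure
option_b = [(0.5, 90), (0.5, -20)]

for name, outcomes in (("A", option_a), ("B", option_b)):
    print(name, weighted_value(outcomes))   # A: 38.0, B: 35.0
```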

Surrogates are pieces of information that are usually described as non-analytic, off-target, remote, or peripheral. Their validity is volatile, unknown, or opaque to the actor at the time of judgment or decision making. In certain situations, surrogates are normatively invalid and lead to systematic biases in judgment and decisions. For example, the physical attractiveness of a communicator is an often considered surrogate in persuasion research (e.g., Chaiken, 1979). In risk assessment, individuals may neglect stated probabilities and instead use their affective reactions towards a threatening outcome to judge risk (Loewenstein, Weber, Hsee, & Welch, 2001; Slovic, Finucane, Peters, & MacGregor, 2002). Advice takers may neglect validities of the advice and rely on their personal relation to the advice giver (Betsch, Lang, Lehmann, & Axmann, 2014). When judging the frequency of a category, individuals can base their estimates on availability – the ease with which instances can be brought to mind (Tversky & Kahneman, 1973).

Note that quality of the stimulus input does not refer to performance. Surrogates can yield a high level of decision accuracy in some situations but not in others. Relying on recognition in emergency situations, for example, can yield decisions of high quality if the learned behavioral routines apply to the current situation (e.g., Klein, 1999). If the world has changed, however, so that learned behavioral rules have become obsolete, recognition-based decisions can lead to severely mal-adaptive behavior (Betsch et al., 2001). Relying on primary information can also decrease decision accuracy, for example, if the individual attempts to apply a complex analytic rule under situational constraints (e.g., Payne, Bettman, & Johnson, 1988). “Thinking too much” can decrease decision accuracy (Wilson & Schooler, 1991), for instance if extensive consideration of information causes dilution effects in integration (e.g., Anderson, 1971).

Quantity, or the size of the sample of information, can be a function of the environment, the process, or both. Some processes are capable of handling multiple pieces of information but can also process small samples. In decision research, evidence accumulation models (e.g., Busemeyer & Townsend, 1993; Lee & Cummins, 2004) and connectionist models (e.g., Glöckner & Betsch, 2008b; Simon & Holyoak, 2002) assume that a single process (or all-purpose rule) integrates information and selects a choice candidate. The process is normally assumed to operate automatically or intuitively (e.g., Betsch & Glöckner, 2010; Busemeyer & Townsend, 1993). Information input can vary in size dependent on the situation. In contrast, multiple strategy models (e.g., Beach & Mitchell, 1978; Gigerenzer, Todd, & the ABC Research Group, 1999; Payne, Bettman, & Johnson, 1993) propose that individuals have a toolbox of strategies, some utilizing large and others small samples of information (see Söllner & Bröder, 2016, for a critical empirical test between these two types of models). Especially under the umbrella of multiple strategy models, researchers impose constraints on the access to large samples in order to provide existence proofs of small-sample strategies.

Figure 3.1 shows four clusters of processes associated with intuition resulting from a cross tabulation of quality and quantity. The following sections describe exemplars of each cluster in more detail. All examples satisfy the defining criteria put forward by Evans and Stanovich (2013). They are described, for instance, as essentially effortless in terms of working memory load. They can function autonomously without cognitive control, although some strategies require repetition before they become automatic (cf. Kruglanski & Gigerenzer, 2011). Note again that it was not our decision whether a particular process was considered intuitive. We simply followed the authors and their suggestions regarding what represents an intuitive process.

Small samples of primary information

It is the backbone tenet of the bounded rationality approach (Simon, 1955) that individuals can cope with task complexity by applying low-effort strategies that minimize the size of the sample of information (Beach & Mitchell, 1978; Gigerenzer et al., 1999; Kahneman et al., 1982; Payne et al., 1993; Shah & Oppenheimer, 2008). In the most extreme case, low-effort strategies rely on just one reason. We already mentioned the lexicographic strategies (e.g., Fishburn, 1974) such as the Take-The-Best heuristic (Gigerenzer & Goldstein, 1999). A user of a lexicographic strategy inspects attributes or cues in the order of their importance (validity) until one is identified that differentiates between options. If the most important attribute differentiates, information search is stopped and a choice is made. For example, a consumer who is interested only in minimizing monetary costs could make a decision very quickly among dozens of detergents by choosing the least expensive product. In a similar vein, Fiedler and Kareev (2008, p. 150) state that “judgments and decisions are intuitive to the extent that they rest on small samples”. Ambady’s notion of thin slice judgments also aligns with this view (Ambady, 2010). Individuals are able to form (quite accurate) impressions about other people from brief observations (“thin slices”) of their behavior. Kahneman and colleagues (1993) have put forward a small-sample strategy for forming summary evaluations. According to the peak-and-end heuristic, evaluative judgments of past experiences reflect the average of the peak and the end of a sequence of values. Strategies relying on small samples may have to be learned (Mata et al., 2011) and solidified into a routine before they can be performed without conscious control (Gigerenzer, 2007; Kruglanski & Gigerenzer, 2011).
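
The two small-sample strategies just described can be captured in a few lines of code. The sketch below, with invented attribute orderings and experience values, illustrates a lexicographic rule in the spirit of Take-The-Best (choose on the first attribute that discriminates) and a peak-and-end evaluation (average the maximum and the final value of an experienced sequence); it is an illustration under these assumptions, not the authors' implementation.

```python
# Minimal sketches of two small-sample strategies (invented data).

def lexicographic_choice(options, attributes):
    """options: name -> {attribute: value}; attributes ordered by importance.
    Inspect attributes in order and choose as soon as one discriminates."""
    candidates = list(options)
    for attr in attributes:
        best = max(options[name][attr] for name in candidates)
        winners = [name for name in candidates if options[name][attr] == best]
        if len(winners) == 1:
            return winners[0]          # one reason was enough
        candidates = winners           # tie: consult the next attribute
    return candidates[0]               # still tied: pick arbitrarily

def peak_and_end(values):
    """Summary evaluation as the average of the peak and the final value."""
    return (max(values) + values[-1]) / 2

detergents = {                         # higher numbers = better (invented)
    "A": {"cheapness": 3, "quality": 2},
    "B": {"cheapness": 3, "quality": 5},
    "C": {"cheapness": 1, "quality": 4},
}
print(lexicographic_choice(detergents, ["cheapness", "quality"]))  # -> "B"
print(peak_and_end([2, 7, 5, 3]))                                  # -> 5.0
```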

Figure 3.1   Examples of intuitive processes clustered by quantity and quality of information input

These examples of heuristics, rules, and strategies are usually studied in environments that provide primary information. Information can be presented to the participant by description such as monetary pay-offs of options in an information board (e.g., “Mouselab”, Payne et al., 1988). In other paradigms, individuals make their decisions or judgments (e.g., assessment of discomfort during a medical exam) after having gained pertinent experience themselves (e.g., pain during a colonoscopy, Redelmeier & Kahneman, 1996). The use of small samples can be fostered by several means – for example, by requiring memory-based rather than on-line judgment tasks (Hastie & Park, 1986). In memory-based tasks, decreases in sample size are simply caused by a decrease in accessibility because, at the time of judgment, not all of the prior experiences can be explicitly remembered. As for a medical exam, Redelmeier and Kahneman (1996) argue that probably only the worst and the final, painful moments are remembered. These peak and end experiences provide the input for the eponymous heuristic. In decision research, the information board is a widely applied tool to track information search and choice. For this purpose, the information contained in an option-by-attribute matrix is hidden. The individual must sequentially open cells in the matrix by using the computer mouse. An opened cell may close when another is inspected. Thus, sampling information is costly because it consumes time and memory resources. Manipulations, such as time pressure or monetary payments for information acquisition, impose additional constraints. Not surprisingly, adult decision makers adapt to such constraints by reducing the size of sampled information (see Glöckner & Betsch, 2008a,b, for discussions).

Small samples of surrogate information

Individuals can also apply small-sample strategies that rely on surrogates. They must do so if they lack access to primary information. Most notably, research from the heuristics-and-biases program identified intuition as an omnivore when it comes to information intake. Representativeness (Tversky & Kahneman, 1982), feeling of rightness (Thompson et al., 2011), attractiveness of an information source (Petty & Cacioppo, 1986), affective reactions evoked by the issue under consideration (Finucane et al., 2000) – individuals exploit a plethora of non-analytic, off-target, remote, peripheral pieces of information when thinking intuitively. A famous example of a surrogate strategy is the availability heuristic. It relies on a process feeling – that is, on how easy or difficult it is to retrieve certain instances from memory or to identify objects in a perceptual task. Tversky and Kahneman (1973) defined the availability heuristic as an intuitive device to estimate frequency or probability based on the ease with which instances come to mind. In a similar vein, the fluency heuristic (Schooler & Hertwig, 2005) uses the speed with which the event category itself is recognized or retrieved from memory (cf. also Jacoby & Dallas, 1981). Although proponents of the two approaches stress their differences (Hertwig, Herzog, Schooler, & Reimer, 2008), both heuristics spotlight a recursive capability of the human mind to use a by-product of thought as informational input.

Another important process is recognition, which is widely considered to be a key process of intuitive thought. With reference to chess experts, Herbert Simon stated: “the situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition” (Simon, 1992). In a similar vein, Klein (1999) proposes that most expert decisions are recognition-based. Goldstein and Gigerenzer (2002) put forward the recognition heuristic for probabilistic inference (see Hoffrage et al., this volume). Assume you ask a child in America: which city has the larger population, Detroit or Düsseldorf? One can consider good methods with which to infer city size, for example, whether the city is a state capital, has a professional soccer team in the premier league, and so on. American children, however, might know something about Detroit but may have never heard of Düsseldorf. This ignorance can turn out to be an advantage. They could use recognition as a proxy for inferring city size and decide that Detroit is larger than Düsseldorf, which is actually true. For German children, it is likely the other way round. If they recognize Düsseldorf but not Detroit and judge size by recognition, their judgment would be incorrect. Thus, depending on the environment, recognition can be a valid or invalid cue for judgment criteria. Obviously, the prevalence of recognition-based intuitions depends on the task. If the individual recognizes all or none of the eligible candidates, intuition cannot rely on the process of recognition. Hence, existence proofs for recognition-based judgments and decisions often stem from studies in which candidates differ only with regard to recognition (e.g., one city is recognized, whereas the other is not).
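
A minimal sketch of the recognition heuristic's decision rule, using the Detroit/Düsseldorf example from the text: if exactly one of two candidates is recognized, infer that it scores higher on the criterion; if both or neither are recognized, the heuristic does not apply and another strategy is needed. The recognition set is hypothetical.

```python
# Minimal sketch of the recognition heuristic (hypothetical knowledge state).

recognized = {"Detroit"}   # an American child who has never heard of Düsseldorf

def recognition_heuristic(option_a, option_b, recognized):
    """Return the option inferred to be larger, or None if recognition
    does not discriminate between the two candidates."""
    a_known = option_a in recognized
    b_known = option_b in recognized
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    return None   # both or neither recognized: the heuristic cannot be applied

print(recognition_heuristic("Detroit", "Düsseldorf", recognized))  # -> Detroit
```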

Large samples of primary information

Whereas simple heuristics are prominent in the judgment-and-decision-making literature, they are comparatively rare in other areas of cognition. Perception, categorization, speech comprehension – individuals regularly perform these and other fundamental cognitive tasks very quickly and without noticeable effort. The underlying operations can handle a vast amount of primary information without cognitive control and, hence, are widely considered to be autonomous or automatic.

Cheng and colleagues (2007) reviewed a large body of literature on the integration of spatial cues. The evidence on this topic is uncontroversial. Spatial perception/categorization is regularly based on multiple cues of different modalities (e.g., visual, audio, haptic). Individuals are capable of performing weighted integration procedures “in a near optimal fashion” (Cheng et al., 2007, p. 625) and without effortful and deliberative calculation (cf. also Hillis, Ernst, Banks, & Landy, 2002).
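
A standard way to formalize such "near-optimal" weighted integration is maximum-likelihood cue combination, in which each cue's estimate is weighted by its reliability (the inverse of its variance). The sketch below illustrates that general scheme with invented numbers; it is not a reimplementation of the cited studies.

```python
# Minimal sketch of reliability-weighted cue integration (invented numbers).
# Each cue delivers an estimate of the same spatial quantity plus its variance;
# weights are proportional to 1/variance (maximum-likelihood combination).

def integrate(cues):
    """cues: list of (estimate, variance). Returns (combined estimate, combined variance)."""
    weights = [1.0 / variance for _, variance in cues]
    total = sum(weights)
    estimate = sum(w * est for w, (est, _) in zip(weights, cues)) / total
    return estimate, 1.0 / total

visual = (10.0, 1.0)    # relatively precise cue
haptic = (14.0, 4.0)    # noisier cue
print(integrate([visual, haptic]))   # -> (10.8, 0.8): dominated by the more reliable cue
```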

A fascinating example stems from the categorization of biological motion. Troje (2002) presented participants with point-light displays of walking figures. A variety of structural and dynamic cues were found to systematically direct categorization, among them arm swing amplitude and velocity, walking speed, lateral sway of upper body, and elbow-body distance. Viewers were intuitively capable of accurately identifying a walking pattern as male or female. Most notably, accuracy in gender classification drops to chance level when cues are eliminated – that is, when the information sample becomes smaller (e.g., Kozlowski & Cutting, 1977). Troje concludes that judgments are not a matter of a single feature but rather a “complex process with a holistic character” (Troje, 2002, p. 373) that corresponds to a weighted linear integration rule.

The literature on speech comprehension converges with the notion that individuals routinely integrate multiple pieces of information. For instance, adult recipients simultaneously utilize prosodic cues, facial expressions, gestures, and prior knowledge about the communicator to understand irony (see Gibbs & Colston, 2007, for overviews). This can happen in less than 600 to 800 milliseconds (e.g., Schwoebel, Dews, Winner, & Srinivas, 2000). Interestingly, increasing the amount of information (e.g., by providing contextual cues) can result in a faster detection of irony (Amenta & Balconi, 2008).

Human and non-human animals are remarkably good at registering the relative frequency of events (Sedlmeier & Betsch, 2002). Frequency encoding and storage is an implicit and automated process that neither requires motivation nor consumes cognitive resources in a noticeable fashion. Similar skills were observed in attitude formation. Individuals are capable of implicitly aggregating the values associated with successive encounters with an attitude object (Betsch, Hoffmann, Hoffrage, & Plessner, 2003; Betsch et al., 2001).

Evidence for intuitions based on large samples of primary information can also be found in the domain of judgment and decision making. Glöckner and Betsch (2008a) presented adult participants with two versions of an information board in a probabilistic choice task. The hidden presentation format was similar to the standard Mouselab (Payne et al., 1988), in which participants must uncover information (values, probabilities) by clicking on cells in the matrix with the computer mouse. In the open format, all information was displayed simultaneously and could be inspected by the individuals at once. This manipulation had a tremendous effect. With the classic version, individuals tended to employ simple strategies that reduced the effort of pre-decisional information search. Their decisions were informed by small samples of information. With the open board, however, the overwhelming majority of participants used all available information and employed weighted-additive-like integration procedures in an astoundingly narrow time frame. The authors propose a component approach for explaining decision making (Betsch & Glöckner, 2010; Glöckner & Betsch, 2008b). Accordingly, intuitive and deliberative processes interplay. The intuitive processes consider encoded information in a parallel fashion and function automatically and rapidly. They are formally described by a parallel-constraint satisfaction (PCS) rule that identifies a promising option by changing activations of nodes in a connectionist network that represents the decision problem at stake. Deliberate processes are necessary to control the formation of the network by active information search, determining and changing relations between information, suppressing irrelevant information, and subjecting the identified option to intentional processes of implementation (for example, performing motor operations to select an option via mouse-click on a computer screen). According to such a component approach, intuitive processes are always involved in decision making as a default to integrate information and detect a solution (cf. the default-interventionist view by Evans, 2008; Evans, this volume). As such, even decisions that are commonly considered to be “deliberate” (because the individual actively seeks information and becomes consciously aware of a final preference) involve intuitive processes.
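
The PCS idea can be illustrated with a toy network: cue nodes (here clamped "on") send activation to option nodes over links whose signs and strengths encode the evidence, the option nodes inhibit each other, and all activations are updated in parallel until the pattern settles on one option. The sketch below is a deliberately simplified illustration of this class of models with invented weights, not Glöckner and Betsch's implementation or parameterization.

```python
# Toy parallel-constraint-satisfaction (PCS) network (illustrative only).

weights = {                                    # invented cue -> option link strengths
    ("cue1", "A"): 0.8, ("cue1", "B"): -0.8,   # more valid cue favors option A
    ("cue2", "A"): -0.3, ("cue2", "B"): 0.3,   # weaker cue favors option B
}
inhibition = -0.2                              # mutual inhibition between option nodes
decay = 0.1

def step(activation):
    """One parallel update: decay old activation, add net input, clip to [-1, 1].
    Cue nodes are clamped at 1, so their contribution equals the link weight."""
    new = {}
    for option in ("A", "B"):
        other = "B" if option == "A" else "A"
        net = sum(w for (cue, o), w in weights.items() if o == option)
        net += inhibition * activation[other]
        new[option] = max(-1.0, min(1.0, activation[option] * (1 - decay) + net))
    return new

activation = {"A": 0.0, "B": 0.0}
for _ in range(50):                            # iterate until the network settles
    activation = step(activation)
print(activation)                              # option A ends up with the higher activation
```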

In research on large sample processing, the paradigms and experimental settings vary strongly. Nevertheless, most of them share a common feature: information acquisition is largely unconstrained and encoding is easy. For example, Glöckner and Betsch (2008a) presented their participants with an open information board, relieving them of the need to sequentially open cells in the matrix and remember the contents. In multi-modality research on spatial categorization and speech comprehension, participants can perceive and encode simultaneously via different perceptual channels.

Large samples of surrogate information

In problem solving, the pieces of primary information provided are often not sufficient to come up with a solution. The sample space must be enlarged. To this end, one must “go beyond the information given” (Bruner, 1957). Perceived stimuli activate prior knowledge (e.g., schemata), which in turn enables decision makers to associate meaning with the stimuli. Creative thinking and problem solving exploit the potentials of spreading activation in an associative network – a process genuinely non-intentional, autonomous, and capable of accessing remote knowledge (Gilhooly, Ball, & Macchi, 2015; see also Gilhooly, this volume; Runco, this volume). Additionally, the absence of cognitive control and conscious focus of attention appear to promote the access of new information as well as the combination of disparate, initially surrogate information in a new, original way (Sinclair, 2010). During this process, large samples of information are accessed to pave the way for an unexpected solution to appear. Several mechanisms, such as chunk decomposition and constraint relaxation (Knoblich et al., 1999), can cause such “aha” experiences (Öllinger et al., 2013). These mechanisms yield a change in representation, assign new meaning to elements of the problem, and open paths to new solutions. Intuitive problem solving is promoted by a good mood – a factor known to reduce the bottom-up scrutinization of stimuli (Bless & Fiedler, 1995; Bolte, Goschke, & Kuhl, 2003).

Some authors differentiate intuition from insight (“aha” experiences). For instance, Sadler-Smith (2010) stresses that insight is objective, clear, and easy to articulate, whereas intuitions are subjective, fuzzy, and difficult to articulate (for related views see Hogarth, 2010; Topolinski & Reber, 2010; see also Topolinski, this volume). Despite these phenomenological differences on the output level, the underlying processes of spreading activation, creating meaning, and figuring out a solution under constraints of other pieces of information still fit into the category of Type 1 processes. In the section above, we described the PCS rule. PCS processes are assumed to function autonomously and without cognitive control – even in situations in which the individual deliberatively searches information and explicitly intends to make a decision. According to a component view, conscious decisions (as insights) involve intuitive processes capable of handling large samples of information.

Intrusion effects in decision making provide an additional example evidencing this capability. Unfortunately, in this case, the power of intuitive processing can lead to biases. In intrusion paradigms, the individual is provided with primary information together with surrogate information that is normatively irrelevant. The consideration of the surrogates manifests itself in dilution effects or even decreased decision accuracy. Betsch and colleagues (2014) presented children and adults with choice tasks in which several probabilistic cues (advice givers) made outcome predictions. Prior to the choice tasks, one advice giver was announced as a personal friend (the “lure” information). In one condition, the information board was closed and predictions had to be sequentially opened, thus increasing the time needed to make a decision. In this condition, the lure had no effect on choices, neither in children nor in adults. In an open-board condition, in which decisions could be made very quickly (so that the intrusion of irrelevant information was more difficult to control), participants of all age groups were influenced by the lure. Specifically, if the lure was associated with the low-validity cue, the relative impact of the other cues (with higher validities) on decisions decreased. Söllner and colleagues (2014) trained participants to use a simple take-the-best (TTB) strategy when making probabilistic inference decisions in an information board paradigm. Although TTB led to optimal results in this decision environment, individuals were not able to ignore TTB-irrelevant information. If predictions of low validity cues opened “for free” on the computer screen, individuals altered their choices and showed varying confidence judgments contingent on the quality of the “irrelevant” information.

Research on insight and intrusion effects jointly indicates that intuitive thinking can make full use of the richness of memory and the stimulus environment. It can be profitable for the individual to extend information search and breach the boundaries of initial samples of primary information. Surrogates may provide trajectories to new combinations of information resulting in creative solutions. If invalid surrogates mingle with primary information, however, implicit aggregation can yield biases in judgment and decision making. The individual may control for such biases by attempting to inhibit the influence of surrogate information. However, control requires Type 2 processing, which consumes cognitive resources.

Acknowledging differences between intuitive processes

In Figure 3.1 we illustrated the diversity of processes considered to be intuitive by various theoretical approaches in psychology. Although they all conform to the definition of Type 1 processes, they appear to have different properties evidenced by the quantity and quality of information they use. This position, however, is controversial. Kruglanski (2013; see also Kruglanski & Gigerenzer, 2011) proposed a uni-model rather than a dual-model of thought. Accordingly, he does not consider Type 1 and Type 2 to be qualitatively distinct processes; consequently, his model denies qualitative differences between intuitive and rational processes. The uni-model conceptualizes all processes as rule-based. Within this framework, processes differ only with regard to the amount of deliberate activity involved in their operation, which can be projected onto a continuum of cognitive effort and speed. Thus, according to the uni-model, intuitive thought reflects rules that have become automatic and, as a result, can be performed quickly and without conscious awareness.

We have doubts regarding the appropriateness of this approach because the model’s assumptions neglect important facets of intuition. Consider, for example, the recognition heuristic. Goldstein and Gigerenzer (1999, 2002) view the recognition heuristic as an example par excellence of simple heuristics that rely on just one piece of information. Although recognition is an automatic process that is virtually unconstrained by working memory, it exploits the entire richness of our knowledge. To illustrate this point, consider how the process of recognition is modelled in memory research. Hintzman (1988) put forward MINERVA 2, a multiple trace model that accounts for a wide range of memory judgments including recognition. The model assumes that experience of each event creates a new trace in memory that can be formally described as a vector of features. If a judgment requires retrieval (such as recognition), a probe vector representing the stimulus is simultaneously compared with a huge sample of trace vectors in memory. The judgment criterion is a function of the degree of overlap between probe and aggregated trace vectors. According to the multiple trace approach, recognition reflects large samples of information accessed in parallel in long-term memory.
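
The multiple-trace logic can be sketched as follows: a probe vector is compared with every stored trace in parallel, each similarity is cubed so that close matches dominate the signal, and the summed "echo intensity" serves as the basis for the recognition judgment. The code below is a stripped-down illustration in the spirit of Hintzman's model, with invented feature vectors rather than a full reimplementation.

```python
# Minimal sketch of multiple-trace recognition in the spirit of MINERVA 2.
# Features take values -1, 0, or +1; all vectors are invented.

def similarity(probe, trace):
    """Dot product normalized by the number of features that are nonzero
    in the probe or the trace."""
    relevant = sum(1 for p, t in zip(probe, trace) if p != 0 or t != 0)
    if relevant == 0:
        return 0.0
    return sum(p * t for p, t in zip(probe, trace)) / relevant

def echo_intensity(probe, memory):
    """All traces are activated in parallel; cubing lets close matches dominate."""
    return sum(similarity(probe, trace) ** 3 for trace in memory)

memory = [
    [1, -1, 0, 1, 1],      # previously experienced episodes
    [1, -1, 1, 1, 0],
    [-1, 0, 1, -1, 1],
]
old_probe = [1, -1, 0, 1, 1]
new_probe = [0, 1, -1, -1, -1]
print(echo_intensity(old_probe, memory))   # high echo -> "recognized"
print(echo_intensity(new_probe, memory))   # low echo  -> "not recognized"
```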

On the surface, the recognition heuristic can be described in rule-like terms such as, for instance, the Take-The-Best heuristic (cf. Gigerenzer, 2007, pp. 8 and 149). Nevertheless, their underlying processes are strikingly different (see also Thompson, 2013). Recognition is an innate process capability (see below). In contrast, lexicographic strategies comprise rules of search. Search, in turn, rests on executive functioning (e.g., focusing attention, inhibition) and planning – processes that require maturation and learning. These process capabilities are not fully developed until the age of 10 (Betsch, Lehmann, Lindow, Lang, & Schoemann, 2016; Mata et al., 2011). Recognition occurs autonomously and involuntarily. We cannot prevent ourselves from recognizing a stimulus. Lexicographic strategies require focus of attention and controlled inhibition of information that should not be considered by the rule. If decisions are made quickly or additional information becomes salient, automatic processes of integration cause an intrusion of irrelevant information into the decision process and so undermine the functioning of the strategy (Betsch et al., 2014; Söllner et al., 2014). Recognition rests on parallel processing of a huge amount of information. Otherwise, it could not be performed so quickly. Lexicographic strategies involve serial search processes, which are slow and consume memory resources. They capitalize on information neglect to speed up the process. The multiplicity of processes must be mirrored on the theoretical level. Treating all such processes alike (e.g., as rule-based) veils their emergent properties and diminishes precision in explanation and prediction.

Debates

Are there two brains?

The two-faculty approach may suggest that intuitive and non-intuitive processes are located in different regions of the brain. Stanovich (2004), for example, described the (intuitive) System 1 as the “old mind” that evolved early. In contrast, the (reflective) System 2, the “new mind”, evolved late in phylogenesis and continues to do so in human ontogenesis. Accordingly, one might suspect that older regions of the brain, such as the limbic system, would operate by intuitive processes, whereas phylogenetically younger areas, such as the prefrontal cortex, would be responsible for reflective processing.

Empirical evidence from neuroimaging studies, however, clearly shows that intuitive judgment and decision making involves neural activation that is widespread across different areas of the human brain (e.g., Volz & von Cramon, 2008). Results do not indicate a common neural network for intuitive processing. Therefore, Evans and Stanovich (2013) decided to replace the term “system” with “type” in order to avoid misleading connotations. Kahneman put it nicely: “and of course […] the two systems do not really exist in the brain or anywhere else” (Kahneman, 2011, p. 415). Nevertheless, intuitive processes may be accompanied by different activation patterns compared to those associated with reflective processing. Again, consider the case of understanding irony, which often happens very quickly. Wang and colleagues (2006) compared accuracy and speed in the identification of irony in non-clinical participants and clinical patients. In the latter, parallel activation of brain regions was impaired. Whereas the non-clinical participants showed simultaneous activation of different brain regions and were quick to understand irony, the clinical patients had to engage in time-consuming deliberative thought to grasp the ironic intention behind the message.

The role of learning

“There is almost universal agreement that […] intuition is shaped by learning” (Hogarth, 2010, p. 343). What do we have to learn before we can form intuitions? Not every kind of knowledge is the product of the individual’s own learning history. Seven-month-old infants are capable of inferring linguistic rules from speech (Marcus, Vijayan, Rao, & Vishton, 1999). Conditioning not only reflects contiguity and reinforcement schedules but is also determined by stimulus categories. For example, nausea is easily conditioned to the taste of food but not to audiovisual stimuli (preparedness effect, Seligman, 1970). Accordingly, some knowledge structures appear to be innate, providing a phylogenetically acquired database for further learning and intuition.

Similarly, several processes underlying intuition do not require learning, such as recognition. Fortunately, individuals are endowed with this process capability from birth. For instance, three-day-old infants are able to recognize their mother’s voice (DeCasper & Fifer, 1980). We cannot and need not further train the recognition process because of its automaticity and independence from deliberate surveillance. Nevertheless, recognition capitalizes on learned knowledge. In routine behavior, we rely on a consolidated and rich source of knowledge that reflects intensive experience.

Gary Klein compiled numerous ethograms on what he called recognition-primed decisions in experts. One episode describes behavioral choices of a rescue team leader:

The first decision facing Lieutenant M. is to diagnose the problem. As he ran to the man, even before listening to his wife, he made his diagnosis. He can see from the amount of blood that the man has cut open an artery, and from the dishcloths held against the man’s arm he can tell which artery. Next comes the decision of how to treat the wound. In fact, there is nothing to deliberate over. As quickly as possible, Lieutenant M. applies firm pressure.

(Klein, 1999, p. 3)

Obviously, Lieutenant M.’s decision capitalized on knowledge he had acquired during extensive training.

However, there are also processes that require learning and experience in order to function without effort and cognitive control. Betsch and colleagues (Betsch & Lang, 2013; Betsch et al., 2014, 2016) studied probabilistic inference decisions in children with an information board in which predictions of outcomes have to be actively inspected by opening doors in a cue-by-option matrix. The children’s task was to find treasures in houses (options). They chose a house based on predictions of three animals that differed with regard to the probability that they correctly predicted whether or not a house contained a treasure. The probability structure was non-compensatory; that is, it invited the application of simple lexicographic strategies. Specifically, it was sufficient to inspect only the prediction of the animal with the highest predictive validity and follow this prediction to maximize the number of treasures in a series of decision trials. Prior to decision making, children learned the validities of the animals by experience. Results from a series of experiments reliably showed that a substantial proportion of nine-year-olds used probabilities as decision weights in their choices. However, probability weighting did not transfer to information search. Although children effectively limited the number of inspected predictions under varying context factors (increase in probability dispersion, instructions to limit search), they were unable to employ a simple lexicographic strategy (Betsch et al., 2016). None of the children systematically focused on the predictions of the best cue. The likelihood that a certain prediction was considered was completely random. Mata and colleagues (2011) report a similar reluctance to apply simple strategies in even older children (10- to 11-year-olds) – a phenomenon that these authors nicely described as “when easy comes hard”. This difference between children and adults shows that some intuitive processes require maturation and learning until they can be implemented effectively.
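
What "non-compensatory" means can be illustrated numerically: if the decision weight attached to the most valid cue exceeds the combined weights of all remaining cues, the weaker cues can never overturn the best cue's prediction, so inspecting only that cue is sufficient. The validities and the simple weighting scheme below are invented for illustration and are not the parameters used in the cited studies.

```python
# Illustrating a non-compensatory cue structure (invented validities).

validities = [0.90, 0.60, 0.55]              # best cue first
weights = [v - 0.5 for v in validities]      # evidence beyond guessing (one simple scheme)

def non_compensatory(ws):
    """True if every weight exceeds the sum of all smaller weights."""
    ws = sorted(ws, reverse=True)
    return all(ws[i] > sum(ws[i + 1:]) for i in range(len(ws) - 1))

print([round(w, 2) for w in weights])        # -> [0.4, 0.1, 0.05]
print(non_compensatory(weights))             # -> True: following the best cue suffices
```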

There are numerous studies on the use of such small-sample strategies that use primary information (e.g., Beach & Mitchell, 1978; Payne et al., 1988, 1993; Newell & Shanks, 2003). After being acquainted with the information presentation format, adults can apply search strategies in information boards in a quite routinized fashion (Bröder & Schiffer, 2006; Rieskamp & Otto, 2006). Note, however, that information in this paradigm is given and not retrieved from memory. Thus, there are good reasons to assume that the application of intuitive strategies can also deal with new information, even in domains in which the individuals lack expertise.

The role of feelings

Intuition is often regarded as being wedded to feelings (e.g., T. Betsch, 2008; Epstein, 2008; Gilovich et al., 2002; Slovic et al., 2002; see also Topolinski, this volume). Heuristics exploit feelings of risk, preference, liking, knowing, familiarity, fluency, and others as judgmental proxies. Feelings arise involuntarily and break into consciousness (Wundt, 1907; Zajonc, 1968, 1980). They may serve as communication devices within the organism (e.g., Simon, 1967). As an output of implicit processes, they can be used as a basis for a wide range of intuitive judgments and decisions. However, are all of these feelings real? Presumably, evidence from self-report is of dubious validity. Intuitive processes are opaque and cannot be accessed by introspection. Lacking facts to communicate, individuals may be tempted to describe their internal reactions in terms of feelings; surely it is possible to describe all sorts of experiences in this specific way. The distinction between emotional feelings (e.g., risk as negative affect) and non-emotional feelings (e.g., the feeling of knowing, Hart, 1965) renders the construct even fuzzier. Especially if cognition and affect converge in meaning and behavioral inclinations, it is a purely semantic game to distinguish feelings from cognitions.

If feelings play a genuine role in intuition, we must precisely define their emergent properties and support them with empirical evidence. Affective reactions, for example, manifest themselves in physiological changes. Accordingly, intuitions that use affective reactions towards a stimulus as a basis for judgment (e.g., affect as information, Schwarz & Clore, 1983; affect heuristic, Slovic et al., 2002) should be accompanied by changes in peripheral physiological reactions such as heart rate and galvanic skin response. To determine whether it is truly affect on which the individual relies, changes in the physiological pattern should covary with judgments and decisions. In evaluating the risk-as-feelings hypothesis, Loewenstein and colleagues (2001) collected empirical evidence for this notion based on a comprehensive review of psychological, physiological, and neuropsychological studies. For instance, when the connections between the prefrontal cortices and the limbic system are impaired (e.g., due to permanent tissue injury by stroke), patients lack access to the affective input during decision making. Lacking access to such “somatic markers” (Damasio, 1994) increases risk taking compared to non-clinical controls (Bechara, Damasio, Tranel, & Damasio, 1997; but see Loewenstein et al., 2001, p. 273, for a critical discussion). Schwarz and Clore (1983) applied a misattribution paradigm from emotion research and showed that the availability heuristic was not applied when the feeling of ease or difficulty could be attributed to an external source.

These examples show that feelings can play an emergent role in intuition. It is important to note that feelings are not necessarily always involved in intuitive judgments and decisions. For example, heuristics using small samples of central information do not make the assumption that feelings must be involved.

Good and bad intuitions

Some individuals trust intuition more than deliberation. Others mistrust intuition and prefer to think carefully before they make a decision (C. Betsch, 2008; Betsch & Kunz, 2008). Are intuitions good or bad? Given the empirical evidence, the answer is straightforward: it depends! Sometimes intuitions produce serious shortcomings compared to normative standards (Nisbett & Ross, 1980; Kahneman et al., 1982). In other situations, shortcut heuristics can even outperform formal rules (Gigerenzer et al., 1999; see also Hoffrage et al., this volume). Dijksterhuis (2004) postulated that unconscious thought leads to better decisions in complex tasks. In contrast, conscious thinking should increase performance in simple tasks consisting of small samples of information (for critical discussions and counter-evidence, cf. Acker, 2008; Payne, Samper, Bettman, & Luce, 2008).

A more fine-grained approach assumes that specific heuristics are tailored to specific environmental structures (Gigerenzer et al., 1999) or cognitive niches (Marewski & Schooler, 2011). For instance, Goldstein and Gigerenzer (2002) propose that the recognition heuristic has evolved for and is primarily used in situations in which recognition is correlated with the judgment criterion. Consequently, intuition will drop in accuracy or may even lead to mal-adaptive performance if there is a mismatch between the process and situation.

The output quality resulting from any thinking process is also contingent upon the quality of the information input. Fiedler demonstrated that even an ideal calculating device (a computer) can produce biases similar to those observed in humans when the stimulus matrix is noisy (Fiedler, 1996) or the sample itself is biased due to proximity, salience, or attentional focus (Fiedler, 2000; Fiedler, Brinkmann, Betsch, & Wild, 2000). At worst, the information itself is false. For instance, in judging other people, our intuitions can reflect personal schemata regarding the target’s group membership. Such stereotypes sometimes contain a kernel of truth but can also reflect mere prejudice. For example, it may be true that Germans lack a sense of humor but it is definitely wrong to assume that their favorite dish is sauerkraut (in fact, it is pizza).
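
The point that an unbiased aggregation device can still deliver distorted impressions when the input sample is skewed can be illustrated with a toy simulation: the "judge" below simply averages noisy observations, yet its impressions of a rarely observed group scatter far more widely around the truth than its impressions of a frequently observed group. This is a generic illustration of the sampling argument, not a reproduction of Fiedler's simulations.

```python
import random

# Toy illustration: an unbiased averaging device observing two equally positive
# groups through a noisy channel, but with unequal sample sizes.

random.seed(1)
TRUE_RATE = 0.6          # both groups are equally positive in reality
NOISE = 0.25             # each single observation is flipped with this probability

def observed_rate(n):
    hits = 0
    for _ in range(n):
        positive = random.random() < TRUE_RATE
        if random.random() < NOISE:          # noisy transmission of the observation
            positive = not positive
        hits += positive
    return hits / n

large = [observed_rate(100) for _ in range(2000)]   # frequently observed group
small = [observed_rate(10) for _ in range(2000)]    # rarely observed group

def spread(xs):
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5

print(spread(large), spread(small))   # impressions of the rarely observed group vary far more
```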

Changes in the world are responsible for a substantial portion of errors in intuitive decisions. For example, experts’ intuitions exploit consolidated prior experience. If the contingency structure in the environment changes, the knowledge base is no longer representative for the task. Under such conditions, inertia effects are likely to occur, yielding maladaptive outcomes (Betsch & Haberstroh, 2005, for an overview). Interestingly, such routine effects can even occur counter to the intention to quit a maladaptive course of action. This is likely to happen when situational factors such as time pressure foster intuitive thinking in decision making (Betsch, Haberstroh, Molter, & Glöckner, 2004).

The nature of intuition

The nature of intuitive thoughts is subject to persistent dispute among psychologists and researchers in other disciplines. The various views on intuition strongly reflect the beholder’s background and lead to the sense that one is studying a mysterious creature. Expertise researchers emphasize learning and recognition; proponents of the bounded rationality approach equate intuition with shortcut heuristics; and neural networkers claim the very opposite: that intuition builds on holistic processes unbound by capacity constraints. Some scholars see affect as a constitutive feature of intuition; others as an epi-phenomenon. Some of the proposed processes utilize primary information; others use surrogate information. Some handle large samples very quickly; others rely on just one piece of information. All these features associated with intuition may simply reflect different facets of the potentials of the human mind. While illuminating these diverse facets, it should have become evident that it is impossible to come up with a satisfying integrative conceptualization of the nature of intuition. It is not helpful to add another list of attributes to the literature. Eclectic attempts exist often enough. Though pretending to be integrative, they yield conceptual dilution and confusion. Clear-cut definitions, and they do exist in the literature, are possible only if one systematically ignores findings outside one’s own research camp. Neither of the two alternatives is acceptable from an epistemological point of view.

In conclusion, we make the case for taking the plethora of findings on intuitive thought seriously. Skimming results from various domains of cognitive research, such as perception, categorization, speech comprehension, judgment, decision making, and problem solving, we must acknowledge the virtuosity with which our minds can handle a variety of tasks without apparent effort and conscious control.

In this chapter we showed that individuals can apply different processes when thinking intuitively. We suggest that researchers should focus on the sublevel of core processes of intuition rather than ruminating about categorizations on the superordinate level. With core processes, we do not refer to heuristics or strategies. The latter are compounds of the former. Recognition is surely one of these core processes and a mighty one at that. Others, such as implicit aggregation (forming representations of key variables such as frequency and value) and an all-purpose device for integrating stimuli (in spatial perception, speech, decision making), probably add to this list. We believe that the number of core processes is limited because they are not domain specific, do not have to be learned, and apply to diverse environments. Through lifetime learning, they may be combined with strategies that help obtain mastery in specific situations. Learning can eventually result in strategy routines that can be performed automatically just as in the case of the core processes right from birth. Becoming curious about the fascinating process diversity may encourage us to zoom deeper into the cosmos of intuitive thinking. Hopefully, this approach will evoke new questions and a striving for future research designs capable of tracing the interaction of different processes. With a more fine-grained view, we might overcome the futile debate regarding what intuition really is.

References

Abernathy, C. M., & Hamm, R. M. (1995). Surgical intuition: What it is and how to get it. Philadelphia: Hanley & Belfus.
Acker, F. (2008). New findings on unconscious versus conscious thought in decision making: Additional empirical data and meta-analysis. Judgment and Decision Making, 3, 292–303.
Ambady, N. (2010). The perils of pondering: Intuition and thin slice judgments. Psychological Inquiry, 21, 271–278.
Amenta, S., & Balconi, M. (2008). Understanding irony: An ERP analysis on the elaboration of acoustic ironic statements. Neuropsychological Trends, 3, 7–27.
Anderson, N. H. (1971). Integration theory and attitude change. Psychological Review, 78, 171–206.
Beach, L. R., & Mitchell, T. R. (1978). A contingency model for the selection of decision strategies. Academy of Management Review, 3, 439–449.
Bechara, A., Damasio, H., Tranel, D., & Damasio, A. R. (1997). Deciding advantageously before knowing the advantageous strategy. Science, 275, 1293–1295.
Betsch, C. (2008). Chronic preferences for intuition and deliberation in decision making: Lessons learned about intuition from an individual differences approach. In H. Plessner, C. Betsch, & T. Betsch (Eds.), Intuition in judgment and decision making (pp. 231–248). Mahwah, NJ: Lawrence Erlbaum Associates.
Betsch, C., & Kunz, J. J. (2008). Individual strategy preferences and decisional fit. Journal of Behavioral Decision Making, 21, 532–555.
Betsch, T. (2008). The nature of intuition and its neglect in research on judgment and decision making. In H. Plessner, C. Betsch, & T. Betsch (Eds.), Intuition in judgment and decision making (pp. 3–22). New York, NY: Lawrence Erlbaum Associates.
Betsch, T., & Glöckner, A. (2010). Intuition in judgment and decision making: Extensive thinking without effort. Psychological Inquiry, 21, 279–294.
Betsch, T., & Haberstroh, S. (Eds.). (2005). The routines of decision making. Mahwah, NJ: Lawrence Erlbaum Associates.
Betsch, T., Haberstroh, S., Molter, B., & Glöckner, A. (2004). Oops, I did it again – relapse errors in routinized decision making. Organizational Behavior and Human Decision Processes, 93, 62–74.
Betsch, T., Hoffmann, K., Hoffrage, U., & Plessner, H. (2003). Intuition beyond recognition: When less familiar events are liked more. Experimental Psychology, 50, 49–54.
Betsch, T., & Lang, A. (2013). Utilization of probabilistic cues in the presence of irrelevant information: A comparison of risky choice in children and adults. Journal of Experimental Child Psychology, 115, 108–125.
Betsch, T., Lang, A., Lehmann, A., & Axmann, J. M. (2014). Utilizing probabilities as decision weights in closed and open information boards: A comparison of children and adults. Acta Psychologica, 153, 74–86.
Betsch, T., Lehmann, A., Lindow, S., Lang, A., & Schoemann, M. (2016). Lost in search: (Mal-) Adaptation to probabilistic decision environments in children and adults. Developmental Psychology, 52, 311–325.
Betsch, T., Plessner, H., Schwieren, C., & Gütig, R. (2001). I like it but I don’t know why: A value-account approach to implicit attitude formation. Personality and Social Psychology Bulletin, 27, 242–253.
Bless, H., & Fiedler, K. (1995). Affective states and the influence of activated general knowledge. Personality and Social Psychology Bulletin, 21, 766–778.
Bolte, A., Goschke, T., & Kuhl, J. (2003). Emotion and intuition: Effects of positive and negative mood on implicit judgments of semantic coherence. Psychological Science, 14, 416–421.
Bröder, A. (2003). Decision making with the “adaptive toolbox”: Influence of environmental structure, intelligence, and working memory load. Journal of Experimental Psychology: Learning, Memory, and Cognition, 29, 611–625.
Bröder, A., & Schiffer, S. (2006). Adaptive flexibility and maladaptive routines in selecting fast and frugal decision strategies. Journal of Experimental Psychology: Learning, Memory, and Cognition, 32, 904–918.
Bruner, J. S. (1957). Going beyond the information given. Contemporary Approaches to Cognition, 1, 119–160.
Busemeyer, J. R., & Townsend, J. T. (1993). Decision field theory: A dynamic-cognitive approach to decision making in an uncertain environment. Psychological Review, 100, 432–459.
Chaiken, S. (1979). Communicator physical attractiveness and persuasion. Journal of Personality and Social Psychology, 37, 1387–1397.
Chaiken, S. , & Trope, Y. (1999). Dual-process theories in social psychology. New York: Guilford Press.
Cheng, K. , Shettleworth, S. J. , Huttenlocher, J. , & Rieser, J. J. (2007). Bayesian integration of spatial information. Psychological Bulletin, 133, 625–637.
Damasio, A. R. (1994). Descartes’ Error: Emotion, reason, and the human brain. New York: Putnam Publishing.
DeCasper, A. J. , & Fifer, W. P. (1980). Of human bonding: Newborns prefer their mothers’ voices. Science, 208, 1174–1176.
Dijksterhuis, A. (2004). Think different: The merits of unconscious thought in preference development and decision making. Journal of Personality and Social Psychology, 87, 586–598.
Edwards, W. (1954). The theory of decision making. Psychological Bulletin, 51, 380–417.
Epstein, S. (2008). Intuition from the perspective of cognitive-experiential self-theory. In H. Plessner , C. Betsch , & T. Betsch (Eds.), Intuition in judgment and decision making (pp. 23–37). Mahwah, NJ: Lawrence Erlbaum Associates.
Epstein, S. , Pacini, R. , Denes-Raj, V. , & Heier, H. (1996). Individual differences in intuitive experiential and analytical rational thinking styles. Journal of Personality and Social Psychology, 71, 390–405.
Evans, J. St. B. T. (2008). Dual-processing accounts of reasoning, judgment, and social cognition. Annual Review of Psychology, 59, 255–278.
Evans, J. St. B. T. , & Stanovich, K. E. (2013). Dual-process theories of higher cognition: Advancing the debate. Perspectives on Psychological Science, 8, 223–241.
Fiedler, K. (1996). Explaining and simulating judgment biases as an aggregation phenomenon in probabilistic, multiple-cue environments. Psychological Review, 103, 193–214.
Fiedler, K. (2000). Beware of samples! A cognitive-ecological sampling approach to judgment biases. Psychological Review, 107, 659–676.
Fiedler, K. (2011). Voodoo correlations are everywhere – Not only in neuroscience. Perspectives on Psychological Science, 6, 163–171.
Fiedler, K. , Brinkmann, B. , Betsch, T. , & Wild, B. (2000). A sampling approach to biases in conditional probability judgments: Beyond base rate neglect and statistical format. Journal of Experimental Psychology: General, 129, 399–418.
Fiedler, K. , & Kareev, Y. (2008). Implications and ramifications of a sample-size approach to intuition. In H. Plessner , C. Betsch , & T. Betsch (Eds.), Intuition in judgment and decision making (pp. 149–172). Mahwah, NJ: Lawrence Erlbaum Associates.
Finucane, M. L. , Alhakami, A. , Slovic, P. , & Johnson, S. M. (2000). The affect heuristic in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1–17.
Fishburn, P. C. (1974). Lexicographic orders, utilities, and decision rules: A survey. Management Science, 20, 1442–1472.
Frederick, S. (2002). Automated choice heuristics. In T. Gilovich , D. Griffin , & D. Kahneman (Eds.), Heuristics & biases: The psychology of intuitive judgment (pp. 548–558). New York: Cambridge University Press.
Furley, P. , Schweizer, G. , & Bertrams, A. (2015). The two modes of an athlete: Dual-process theories in the field of sport. International Review of Sport and Exercise Psychology, 8, 1–19.
Gibbs, R. W. , & Colston, H. L. (Eds.). (2007). Irony in language and thought: A cognitive science reader. Hillsdale, NJ: Lawrence Erlbaum Associates.
Gigerenzer, G. (2007). Gut feelings: The intelligence of the unconscious. New York: Viking Press.
Gigerenzer, G. , & Goldstein, D. G. (1999). Betting on one good reason: The take the best heuristic. In G. Gigerenzer , P. M. Todd , & the ABC Research Group (Eds.), Simple heuristics that make us smart (pp. 75–95). Oxford: Oxford University Press.
Gigerenzer, G. , Todd, P. M. , & the ABC Research Group. (1999). Simple heuristics that make us smart. Oxford: Oxford University Press.
Gilhooly, K. J. , Ball, L. J. , & Macchi, L. (2015). Insight and creative thinking processes: Routine and special. Thinking & Reasoning, 21, 1–4.
Gilovich, T. , Griffin, D. , & Kahneman, D. (Eds.). (2002). Heuristics and biases: The psychology of intuitive judgment. Cambridge: Cambridge University Press.
Glöckner, A. , & Betsch, T. (2008a). Multiple-reason decision making based on automatic processing. Journal of Experimental Psychology: Learning, Memory and Cognition, 34, 1055–1075.
Glöckner, A. , & Betsch, T. (2008b). Modeling option and strategy choices with connectionist networks: Towards an integrative model of automatic and deliberate decision making. Judgment and Decision Making, 3, 215–228.
Glöckner, A. , & Witteman, C. L. M. (Eds.). (2009). Foundations of tracing intuition: Challenges and methods. London: Psychology Press.
Glöckner, A. , & Witteman, C. L. M. (2010). Beyond dual-process models: A categorization of processes underlying intuitive judgment and decision making. Thinking & Reasoning, 16, 1–25.
Goldstein, D. G. , & Gigerenzer, G. (1999). The recognition heuristic: How ignorance makes us smart. In G. Gigerenzer , P. M. Todd , & the ABC Research Group (Eds.), Simple heuristics that make us smart (pp. 37–58). New York: Oxford University Press.
Goldstein, D. G. , & Gigerenzer, G. (2002). Models of ecological rationality: The recognition heuristic. Psychological Review, 109, 75–90.
Hammond, K. R. (2010). Intuition, no! … Quasirationality, yes! Psychological Inquiry, 21, 327–337.
Hart, J. T. (1965). Memory and the feeling-of-knowing experience. Journal of Educational Psychology, 56, 208–216.
Hasher, L. , & Zacks, R. T. (1979). Automatic and effortful processes in memory. Journal of Experimental Psychology: General, 108, 356–388.
Hasher, L. , & Zacks, R. T. (1984). Automatic processing of fundamental information: The case of frequency of occurrence. American Psychologist, 39, 1372–1388.
Hastie, R. , & Park, B. (1986). The relationship between memory and judgment depends on whether the judgment task is memory-based or on-line. Psychological Review, 93, 258–268.
Hertwig, R. , Herzog, S. M. , Schooler, L. J. , & Reimer, T. (2008). Fluency heuristic: A model of how the mind exploits a by-product of information retrieval. Journal of Experimental Psychology: Learning, Memory, and Cognition, 34, 1191–1206.
Hillis, J. M. , Ernst, M. O. , Banks, M. S. , & Landy, M. S. (2002). Combining sensory information: Mandatory fusion within, but not between, senses. Science, 298, 1627–1630.
Hintzman, D. L. (1988). Judgments of frequency and recognition memory in a multiple-trace memory model. Psychological Review, 95, 528–551.
Hogarth, R. M. (2001). Educating intuition. Chicago: University of Chicago Press.
Hogarth, R. M. (2010). Intuition: A challenge for psychological research on decision making. Psychological Inquiry, 21, 338–353.
Jacoby, L. L. , & Dallas, M. (1981). On the relationship between autobiographical memory and perceptual learning. Journal of Experimental Psychology: General, 110, 306–340.
Kahneman, D. (2002). Maps of bounded rationality: A perspective on intuitive judgment and choice. Princeton: manuscript of the Nobel Prize lecture.
Kahneman, D. (2011). Thinking, fast and slow. New York, NY: Farrar, Straus and Giroux.
Kahneman, D. , Fredrickson, B. L. , Schreiber, C. A. , & Redelmeier, D. A. (1993). When more pain is preferred to less: Adding a better end. Psychological Science, 4, 401–405.
Kahneman, D. , Slovic, P. , & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases. Cambridge: Cambridge University Press.
Klein, G. (1999). Sources of power. How people make decisions. Cambridge, MA: MIT Press.
Knoblich, G. , Ohlsson, S. , Haider, H. , & Rhenius, D. (1999). Constraint relaxation and chunk decomposition in insight problem solving. Journal of Experimental Psychology: Learning, Memory, and Cognition, 25, 1534–1555.
Kozlowski, L. T. , & Cutting, J. E. (1977). Recognizing the sex of a walker from a dynamic point-light display. Perception & Psychophysics, 21, 575–580.
Kruglanski, A. W. (2013). Only one? The default interventionist perspective as a unimodel – Commentary on Evans & Stanovich. Perspectives on Psychological Science, 8, 242–247.
Kruglanski, A. W. , & Gigerenzer, G. (2011). Intuitive and deliberate judgments are based on common principles. Psychological Review, 118, 97–109.
Lee, M. D. , & Cummins, T. R. (2004). Evidence accumulation in decision making: Unifying the ‘take the best’ and the ‘rational’ models. Psychonomic Bulletin & Review, 11, 343–352.
Loewenstein, G. F. , Weber, E. U. , Hsee, C. K. , & Welch, N. (2001). Risk as feelings. Psychological Bulletin, 127, 267–286.
Marcus, G. F. , Vijayan, S. , Rao, S. B. , & Vishton, P. M. (1999). Rule learning by seven-month-old infants. Science, 283, 77–80.
Marewski, J. N. , & Schooler, L. J. (2011). Cognitive niches: An ecological model of strategy selection. Psychological Review, 118, 393–437.
Mata, R. , von Helversen, B. , & Rieskamp, J. (2011). When easy comes hard: The development of adaptive strategy selection. Child Development, 82, 687–700.
Myers, D. G. (2010). Intuition’s powers and perils. Psychological Inquiry, 21, 371–377.
Newell, B. R. , & Shanks, D. R. (2003). Take the best or look at the rest? Factors influencing “one-reason” decision making. Journal of Experimental Psychology: Learning, Memory and Cognition, 29, 53–65.
Nisbett, R. E. , & Ross, L. (1980). Human inference: Strategies and shortcomings of social judgment. Englewood Cliffs, NJ: Prentice-Hall.
Öllinger, M. , Jones, G. , Faber, A. H. , & Knoblich, G. (2013). Cognitive mechanisms of insight: The role of heuristics and representational change in solving the eight-coin problem. Journal of Experimental Psychology: Learning, Memory, and Cognition, 39, 931–939.
Payne, J. W. , Bettman, J. R. , & Johnson, E. J. (1988). Adaptive strategy selection in decision making. Journal of Experimental Psychology: Learning, Memory and Cognition, 14, 534–552.
Payne, J. W. , Bettman, J. R. , & Johnson, E. J. (1993). The adaptive decision maker. Cambridge: Cambridge University Press.
Payne, J. W. , Samper, A. , Bettman, J. R. , & Luce, M. F. (2008). Boundary conditions on unconscious thought in complex decision making. Psychological Science, 19, 1118–1123.
Petty, R. , & Cacioppo, J. (1986). Communication and persuasion: Central and peripheral routes to attitude change. New York: Springer.
Plessner, H. , Betsch, C. , & Betsch, T. (Eds.). (2008). Intuition in judgment and decision making. New York: Lawrence Erlbaum Associates.
Redelmeier, D. A. , & Kahneman, D. (1996). Patients’ memories of painful medical treatments: Real-time and retrospective evaluations of two minimally invasive procedures. Pain, 66, 3–8.
Rieskamp, J. , & Otto, P. E. (2006). SSL: A theory of how people learn to select strategies. Journal of Experimental Psychology: General, 135, 207–236.
Sadler-Smith, E. (2008). Inside intuition. New York: Routledge/Taylor & Francis Group.
Sadler-Smith, E. (2010). The intuitive mind: Profiting from the power of your sixth sense. Chichester, UK: Wiley.
Schneider, W. , & Shiffrin, R. M. (1977). Controlled and automatic human information processing: I. Detection, search, and attention. Psychological Review, 84, 1–65.
Schooler, L. J. , & Hertwig, R. (2005). How forgetting aids heuristic inference. Psychological Review, 112, 610–628.
Schwarz, N. , & Clore, G. L. (1983). Mood, misattribution, and judgments of well-being: Informative and directive functions of affective states. Journal of Personality and Social Psychology, 45, 513–523.
Schwoebel, J. , Dews, S. , Winner, E. , & Srinivas, K. (2000). Obligatory processing of the literal meaning of ironic utterances: Further evidence. Metaphor and Symbol, 15, 47–61.
Sedlmeier, P. , & Betsch, T. (2002). Etc.: Frequency processing and cognition. Oxford: Oxford University Press.
Seligman, M. E. (1970). On the generality of the laws of learning. Psychological Review, 77, 406–418.
Shah, A. K. , & Oppenheimer, D. M. (2008). Heuristics made easy: An effort-reduction framework. Psychological Bulletin, 134, 207–222.
Shiffrin, R. M. , & Schneider, W. (1977). Controlled and automatic human information processing: II. Perceptual learning, automatic attending, and general theory. Psychological Review, 84, 127–190.
Simon, D. , & Holyoak, K. J. (2002). Structural dynamics of cognition: From consistency theories to constraint satisfaction. Personality and Social Psychology Review, 6, 283–294.
Simon, H. A. (1955). A behavioral model of rational choice. Quarterly Journal of Economics, 69, 99–118.
Simon, H. A. (1967). Motivational and emotional controls of cognition. Psychological Review, 74, 29–39.
Simon, H. A. (1992). What is an “explanation” of behavior? Psychological Science, 3, 150–161.
Sinclair, M. (2010). Misconceptions about intuition. Psychological Inquiry, 21, 378–386.
Slovic, P. , Finucane, M. , Peters, E. , & McGregor, D. G. (2002). The affect heuristic. In T. Gilovich , D. Griffin , & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 397–420). Cambridge, UK: Cambridge University Press.
Söllner, A. , & Bröder, A. (2016). Toolbox or adjustable spanner? A critical comparison of two metaphors for adaptive decision making. Journal of Experimental Psychology: Learning, Memory, and Cognition, 42, 215–237.
Söllner, A. , Bröder, A. , Glöckner, A. , & Betsch, T. (2014). Single-process versus multiple-strategy models of decision making: Evidence from an information intrusion paradigm. Acta Psychologica, 146, 84–96. (Open Access: www.sciencedirect.com/science/article/pii/S0001691813002692).
Stanovich, K. E. (2004). The robot’s rebellion: Finding meaning in the age of Darwin. Chicago: University of Chicago Press.
Stanovich, K. E. , West, R. F. , & Toplak, M. E. (2011). The complexity of developmental predictions from dual process models. Developmental Review, 31, 103–118.
Thompson, V. A. (2013). Why it matters: The implications of autonomous processes for dual process theories. Perspectives on Psychological Science, 8, 253–256.
Thompson, V. A. , Prowse Turner, J. A. , & Pennycook, G. (2011). Intuition, reason, and metacognition. Cognitive Psychology, 63, 107–140.
Topolinski, S. , & Reber, R. (2010). Gaining insight into the “Aha”- experience. Current Directions in Psychological Science, 19, 402–405.
Troje, N. F. (2002). Decomposing biological motion: A framework for analysis and synthesis of human gait patterns. Journal of Vision, 2, 371–387.
Tversky, A. , & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5, 207–232.
Tversky, A. , & Kahneman, D. (1982). Judgment of and by representativeness. In D. Kahneman , P. Slovic , & A. Tversky (Eds), Judgment under uncertainty: Heuristics and biases (pp. 84–98). Cambridge: Cambridge University Press.
Volz, K. G. , & von Cramon, D. Y. (2008). Can neuroscience tell a story about intuition? In H. Plessner , C. Betsch , & T. Betsch (Eds.), Intuition in judgment and decision making (pp. 71–87). Mahwah, NJ: Lawrence Erlbaum Associates.
von Winterfeldt, D. , & Edwards, W. (1986). Decision analysis and behavioral research. Cambridge: Cambridge University Press.
Wang, A. T. , Lee, S. S. , Sigman, M. , & Dapretto, M. (2006). Neural basis of irony comprehension in children with autism: The role of prosody and context. Brain, 129, 932–943.
Wilson, T. D. , & Schooler, J. W. (1991). Thinking too much: Introspection can reduce the quality of preferences and decisions. Journal of Personality and Social Psychology, 60, 181–192.
Winerman, L. (2005). Intuition (special issue). APA Monitor on Psychology, 36(3), 50–64.
Wundt, W. (1907). Outlines of psychology. Leipzig: Wilhelm Engelmann.
Zajonc, R. B. (1968). Attitudinal effects of mere exposure. Journal of Personality and Social Psychology, 9, 1–27.
Zajonc, R. B. (1980). Feeling and thinking: Preferences need no inferences. American Psychologist, 35, 151–175.