Definition of a Problem in Cognitive Psychology
A problem is defined as any situation in which a person has a goal but does not immediately know the best way to reach it. It involves a gap between the current state and the desired goal state, with no obvious path to bridge the gap.
Key Components of a Problem
Initial State: The current, unsatisfactory situation.
Goal State: The desired outcome or solution.
Obstacles: The limitations or constraints that prevent easy movement from the initial state to the goal state.
Operators: The available actions or tools that can be used to move toward the goal.
Criteria That Make Something a “Problem”
Lack of an immediate solution: If the answer is obvious or automatic, it is not a problem in the cognitive sense.
Requires cognitive effort: The individual must think, strategize, or analyze.
Goal-directedness: There must be an intended outcome or objective.
Involves decision-making and uncertainty: Solutions are not always clear-cut.
Algorithms in Cognitive Psychology
Definition of Algorithm
An algorithm is a systematic, rule-based procedure for solving a problem. It involves a sequence of operations that, when followed correctly, guarantees a correct solution if one exists. In cognitive psychology, algorithms are studied as one of the structured methods humans may use, if only rarely, to solve well-defined problems.
Key Properties of Algorithms
Step-by-step: Each action follows logically from the previous one.
Exhaustive: All possible pathways are considered.
Rule-governed: Operates under fixed, predetermined rules.
Solution-guaranteed: If a solution exists, the algorithm will find it.
Often computationally expensive: May require considerable time and mental effort.
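To make these properties concrete, here is a minimal illustrative sketch (Python, with an assumed three-digit code scenario rather than an example from the original notes) of an exhaustive, rule-governed procedure that is guaranteed to find the answer if one exists, at the cost of checking every possibility.

```python
# Illustrative sketch (assumed scenario): finding a 3-digit code by exhaustive search.
# Step-by-step, rule-governed, exhaustive, and solution-guaranteed, but costly.
from itertools import product

def crack_code(is_correct):
    """Try every 3-digit combination until the check succeeds."""
    for guess in product(range(10), repeat=3):   # exhaustive: all 1,000 candidates
        if is_correct(guess):                    # fixed rule applied at each step
            return guess                         # guaranteed to be reached if a code exists
    return None

secret = (4, 0, 7)
print(crack_code(lambda guess: guess == secret))  # (4, 0, 7), after up to 1,000 checks
```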
Role of Algorithms in Human Cognition
While algorithms are infallible in theory, humans do not always use them because of cognitive limitations such as limited attention, working memory load, and time pressure. Nonetheless, algorithms are important in modeling cognitive processes such as logical reasoning, mathematical problem solving, and scientific thinking.
Related Concepts:
Cognitive load: Mental effort needed to solve a problem.
Insight: Sudden realization of a solution.
Problem representation: The way a problem is mentally structured can affect ease of solving.
Algorithms vs Heuristics in Cognitive Psychology
Heuristics are mental shortcuts. Unlike algorithms, they do not guarantee a correct solution but are faster and often used by humans in real-world decisions. Cognitive psychologists often compare algorithms (ideal problem-solving) with heuristics (actual strategies people use).
Theoretical Frameworks and Research
Newell and Simon’s Information-Processing Approach (1972): Their book Human Problem Solving framed problem solving as search through a problem space. It drew on their earlier General Problem Solver, a computer program simulating algorithmic reasoning, and illustrated how humans could hypothetically approach problems algorithmically, though in practice they often rely on heuristics.
Herbert Simon’s Bounded Rationality: Highlighted that while algorithms represent ideal rationality, humans operate within cognitive limits. This makes purely algorithmic thinking rare in day-to-day decision-making.
Cognitive Load Theory (Sweller): Emphasizes that high working memory demand reduces the likelihood of using algorithmic methods unless the individual is highly practiced.
Gestalt Psychology (Early Foundations): While not focused on algorithms, the Gestaltists emphasized insight in problem solving—an alternative to stepwise logic. This contrast laid early groundwork for comparing algorithms with non-linear problem-solving.
Summary (Algorithms)
Algorithms represent the ideal of rational, structured problem solving. They are essential to understanding how problem solving could work in optimal cognitive systems. However, due to human limitations, algorithms are often replaced by quicker, intuitive heuristics in real-world situations. Nonetheless, they remain central to modeling cognitive processes and developing AI systems.
Insight Learning
Definition
Insight learning refers to the sudden realization of a problem’s solution without the use of trial-and-error. It involves a cognitive reorganization of information leading to an “Aha!” or “Eureka!” moment.
Theorist: Wolfgang Köhler. Köhler was a Gestalt psychologist who emphasized that perception and understanding are holistic. His work with chimpanzees laid the foundation for insight as a distinct form of learning.
Key Characteristics
Suddenness: The solution appears abruptly rather than through gradual attempts.
Perceptual Reorganization: The problem is viewed in a new way, revealing the solution.
No Overt Trial-and-Error: Unlike Thorndike’s animals, subjects do not randomly try different methods.
Transfer of Learning: Once insight is achieved, it can be applied to similar problems.
Köhler’s Experiments
Conducted on chimpanzees in the Canary Islands. In one study, a banana was placed out of reach, and chimpanzees used sticks or stacked boxes to retrieve it. The animals did not solve the problem by repeated random attempts; instead, they paused and then acted with purpose, suggesting cognitive restructuring.
Cognitive Explanation
Involves accessing previously unconnected elements in memory and restructuring them. The solution often comes after a period of incubation — a temporary break from conscious problem-solving. Insight is associated with higher-order cognitive functions such as abstraction, pattern recognition, and divergent thinking.
Relevance to Cognitive Psychology
Supports the idea that learning is not always linear or behaviorally observable. Provides evidence against purely behaviorist models of learning. Related to creative thinking, complex problem solving, and real-life innovation.
Neuroimaging studies show right-hemisphere involvement (especially the anterior superior temporal gyrus) during insight. Insight is now studied alongside intuitive decision-making and creativity research.
Applications
Educational strategies that promote deep understanding over memorization. Problem solving in design thinking, innovation, therapy, and scientific discovery. Used to explain sudden clarity in problem-based learning environments.
Contrast with Other Learning Models
Vs. Trial-and-Error Learning: Insight does not involve repeated failure before success. Vs. Operant Conditioning: Insight is not reinforced incrementally; it emerges through internal processing.
Types of Heuristics
Core Types
Availability Heuristic
Judging the likelihood of an event based on how easily examples come to mind. Example: Overestimating plane crashes after seeing news coverage.
Representativeness Heuristic
Evaluating probabilities by comparing how similar an instance is to a prototype. Example: Assuming someone is a librarian because they are quiet and introverted.
Anchoring and Adjustment Heuristic
Making estimates by starting from an initial value (anchor) and adjusting, often insufficiently. Example: Being influenced by the first price offered in a negotiation.
Other Common Heuristics
Recognition Heuristic
Preferring options that are recognized over those that are not, especially when knowledge is limited.
Simulation Heuristic
Judging the likelihood of an event based on how easily one can imagine it happening.
Affect Heuristic
Making decisions based on emotional responses rather than detailed analysis.
Fluency Heuristic
Assuming that information processed more fluently (e.g., read more easily) is more accurate or important.
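Heuristics like these can be written down as simple decision rules. Below is a minimal illustrative sketch (an assumed Python example loosely in the spirit of Goldstein and Gigerenzer’s recognition heuristic, not code from any published study) for choosing between two options when only one of them is recognized.

```python
# Minimal sketch (assumed example) of the recognition heuristic as a decision rule:
# if exactly one of two options is recognized, pick the recognized one;
# otherwise the heuristic gives no guidance and we simply guess.
import random

def recognition_choice(option_a, option_b, recognized):
    a_known = option_a in recognized
    b_known = option_b in recognized
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    return random.choice([option_a, option_b])

# Hypothetical question: "Which city has the larger population?"
recognized_cities = {"Munich", "Berlin", "Hamburg"}              # assumed knowledge state
print(recognition_choice("Munich", "Herne", recognized_cities))  # -> "Munich"
```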
Means-Ends Analysis
Definition: Means-Ends Analysis (MEA) is a problem-solving strategy used to reduce the difference between a current situation and a desired goal by breaking the problem into smaller subgoals.
Origin:
Developed by Newell and Simon in the 1950s. Based on the idea that people solve problems by identifying differences between the present state and the goal state.
Core Idea:
Compare the current state with the goal state. Identify the biggest difference. Choose an action (means) to reduce that difference. If the action can’t be applied directly, set a new subgoal to achieve conditions that allow the action. Repeat the process until the goal is reached.
Steps in Means-Ends Analysis
Identify the current state.
Identify the goal state.
Determine the difference(s) between the two.
Select the most significant difference.
Find an operator (action) to reduce that difference.
If the operator can't be applied, create a subgoal to make it applicable.
Apply the operator and update the current state.
Repeat the steps until the goal is achieved.
Example
Problem: You want to bake a cake, but you have no eggs.
Current state: No eggs.
Goal state: Have a cake.
Difference: Missing eggs.
Operator: Go to the store and buy eggs.
Subgoal: Get money, go to the store.
Apply the operator and return with eggs. Now you can bake the cake.
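The difference-reduction loop behind this example can be sketched in a few lines. The following is a minimal illustrative sketch (an assumed Python example with simplified operators, not a standard implementation) of means-ends analysis on the cake scenario: it compares the current state with the goal, picks an operator that reduces the difference, and sets a subgoal when that operator cannot yet be applied.

```python
# Minimal sketch (assumed example) of means-ends analysis on the cake scenario.
# Each operator lists the conditions it needs and the facts it adds.
operators = {
    "bake cake":   {"needs": {"have eggs"}, "adds": {"have cake"}},
    "buy eggs":    {"needs": {"at store"},  "adds": {"have eggs"}},
    "go to store": {"needs": set(),         "adds": {"at store"}},
}

def means_ends(state, goal):
    while not goal <= state:                        # compare current and goal states
        difference = goal - state                   # identify what is still missing
        for name, op in operators.items():
            if op["adds"] & difference:             # this operator reduces the difference
                if op["needs"] <= state:            # applicable: apply it and update the state
                    state |= op["adds"]
                    print("apply:", name)
                else:                               # not applicable: pursue its preconditions as a subgoal
                    state = means_ends(state, state | op["needs"])
                break
    return state

means_ends(set(), {"have cake"})
# prints: apply: go to store, apply: buy eggs, apply: bake cake
```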
Advantages
Helps structure problem-solving. Breaks down complex problems into manageable parts.
Limitations
Can be inefficient if the subgoals are not well chosen. Assumes the problem solver can correctly identify and apply operators.
Means-ends analysis falls under the information-processing approach to problem solving.
General Problem Solver (GPS)
Definition:
The General Problem Solver (GPS) is a computer program developed in the 1950s to simulate human problem-solving. It uses logical steps and rules to solve well-defined problems by mimicking human cognitive strategies.
Developed By:
Allen Newell, Herbert A. Simon, and J. C. Shaw (1957)
Purpose:
To model how humans solve problems. To serve as a universal problem-solving engine for AI and psychology research.
How GPS Works:
Define the problem (initial state, goal state, and rules).
Analyze the difference between the current and goal states.
Select an operator to reduce the difference.
If the operator can't be used, set a subgoal to make it usable.
Apply the operator and update the current state.
Repeat until the goal is reached.
Strengths:
First program to separate problem-solving method from problem content. Helped lay the foundation for symbolic AI. Modeled human-like reasoning.
Limitations
Could only solve well-structured problems (with clear rules and goals). Not effective for real-world or ill-structured problems. Required a lot of predefined information.
Analogical Problem Solving
Definition:
Analogical problem solving is a strategy where a person solves a new problem (target problem) by referring to a previously solved problem (source problem) that is structurally similar. It involves mapping relationships from the known to the unknown.
Key Steps in Analogical Problem Solving
Noticing a Relational Similarity: Recognizing that the current problem is similar to one you’ve seen before.
Retrieving a Source Problem: Recalling a past situation that resembles the current one.
Mapping Corresponding Elements: Aligning the structure of the old problem with the new one and identifying which elements play similar roles.
Applying the Mapping: Using the solution from the old problem to address the new problem.
Classic Experiment: Gick & Holyoak (1980, 1983) – The Radiation Problem
Participants were given Duncker’s radiation problem (how to destroy a tumor with rays without harming the surrounding healthy tissue). If previously told a structurally similar story (attacking a fortress with small forces converging from different sides), they were more likely to solve it. Key finding: Analogical transfer improves when people are explicitly told to compare or apply the stories.
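To make the mapping step concrete, here is a minimal illustrative sketch (an assumed Python example, not taken from the original studies) that pairs elements of the fortress story (source) with elements of the radiation problem (target) and carries the source solution across the mapping.

```python
# Illustrative sketch (assumed example) of analogical mapping from the
# fortress story (source) to Duncker's radiation problem (target).
mapping = {
    "general":   "doctor",
    "fortress":  "tumor",
    "army":      "radiation",
    "small groups of soldiers": "weak rays",
    "converging roads": "converging directions",
    "capture":   "destroy",
}

source_solution = ("The general divides the army into small groups of soldiers, "
                   "sends them along converging roads, and they capture the fortress together.")

# Applying the element-by-element mapping yields a candidate solution to the target.
target_solution = source_solution
for source_term, target_term in mapping.items():
    target_solution = target_solution.replace(source_term, target_term)

print(target_solution)
# The doctor divides the radiation into weak rays, sends them along converging
# directions, and they destroy the tumor together.
```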
Types of Analogies
Surface analogy: Similar in details but not in structure. Structural analogy: Similar in underlying relationship — more useful for problem solving.
Why It’s Important
Promotes creative problem solving. Helps transfer knowledge across domains. Essential in learning, reasoning, and intelligence.
Strengths
Encourages flexible thinking. Aids in solving novel or unfamiliar problems. Builds on past experience and knowledge.
Limitations
People often focus on surface features, not deeper structure. May fail if the analogy is inappropriate or misleading. Requires prior experience with relevant problems.
Types of Problems in Cognitive Psychology:
Well-Defined Problems: Clear initial state, goal, and rules (e.g., solving a math equation). Ill-Defined Problems: Ambiguous or unclear goals and solutions (e.g., designing a career plan).
Stages of Problem Solving
Problem Identification
Recognizing that a problem exists. Distinguishing between the current situation and the desired goal. Requires attention, perception, and sometimes intuition. Example: Realizing you can't submit an assignment because your file is corrupted.
Problem Representation (or Understanding the Problem)
Mentally organizing the elements of the problem. Involves creating a “problem space” with possible states and transitions. Good representation often simplifies the problem. Example: Drawing a diagram or making a flowchart to visualize relationships.
Strategy Formulation
Deciding how to approach the problem. Choosing between strategies like trial and error, heuristics, or algorithms. Involves planning, goal-setting, and sometimes setting subgoals.
Organization of Information
Sorting relevant and irrelevant data. Grouping information based on patterns, categories, or importance. Helps reduce cognitive load and improve focus.
Resource Allocation
Assessing time, energy, attention, and tools required. Determining how much effort or what external help might be needed. Example: Deciding whether to solve the problem now or postpone it for later.
Monitoring (or Progress Tracking)
Continuously checking if the strategy is working. Adjusting methods or correcting errors along the way. Metacognition (thinking about one’s own thinking) plays a big role here.
Evaluation (or Reviewing the Outcome)
Reflecting on the solution: did it work? Assessing the outcome against the original goal. Learning from mistakes and successes to improve future problem solving.
Cognitive Processing Involved:
Representation: Mental model or schema of the problem. Planning and strategizing: Selecting and organizing steps. Monitoring: Keeping track of progress. Evaluation: Judging if the goal is met or if another approach is needed.
Examples of Problems in Cognitive Contexts:
Solving a jigsaw puzzle (well-defined)
Choosing a college major (ill-defined)
Figuring out how to fix a broken device without instructions
Why It Matters in Cognitive Psychology:
Problem solving is a core cognitive function that reveals how we learn, reason, and adapt.
Understanding what constitutes a problem helps in designing cognitive tests and therapeutic interventions.
Key Theorists:
Allen Newell & Herbert Simon – Information-processing approach to problem solving. Karl Duncker – Insight and functional fixedness. Gestalt Psychologists – Emphasis on perception and restructuring in problem solving.
Use of Algorithms in Problem Solving
Algorithms are most useful for:
Well-defined problems: These have a clear goal, starting point, and rules (e.g., solving a quadratic equation).
Tasks with limited variables: Such as number-based puzzles or rule-based logic problems.
They are less effective for:
Time-sensitive situations: Where fast approximations are needed over perfect solutions.
Ill-defined problems: Where goals or paths are vague (e.g., resolving interpersonal conflict).
Types of Algorithms
Brute-force search: Tries every possible option until the right one is found. Effective but inefficient.
Means-end analysis: Compares the current state with the goal state and takes steps to reduce the difference. Common in both algorithmic and heuristic frameworks.
Recursive algorithms: Solve a problem by breaking it down into smaller instances of the same problem. Conceptually aligned with problem decomposition (see the sketch after this list).
Search algorithms in memory: Used in modeling retrieval (e.g., serial exhaustive search).
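As an illustration of the recursive style, here is a minimal sketch (an assumed Python example) that solves the Tower of Hanoi, a classic well-defined laboratory problem, by reducing the n-disk case to two smaller instances of the same problem plus one direct move.

```python
# Illustrative sketch: a recursive algorithm for the Tower of Hanoi.
# Moving n disks reduces to two smaller instances of the same problem
# plus one direct move: problem decomposition in its purest form.
def hanoi(n, source, target, spare):
    if n == 0:
        return
    hanoi(n - 1, source, spare, target)          # move the n-1 smaller disks out of the way
    print(f"move disk {n} from {source} to {target}")
    hanoi(n - 1, spare, target, source)          # put the smaller disks back on top

hanoi(3, "A", "C", "B")   # prints the guaranteed 7-move solution, step by step
```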
Relevance to Cognitive Science and AI
In cognitive science and artificial intelligence, algorithms are crucial for simulating problem-solving processes. Cognitive architectures like ACT-R and SOAR are built around rule-based processing models that mimic algorithmic thinking. These models provide insight into how humans could solve problems if they followed strict computational logic.
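To show what rule-based processing means at the code level, here is a minimal sketch of a production-rule cycle (a toy example written for this note; it is not actual ACT-R or Soar code). Rules whose conditions match the contents of working memory fire and add new facts, and the cycle repeats until nothing more can fire.

```python
# Minimal sketch (illustrative only) of a production-rule cycle:
# match rules against working memory, fire any that apply, repeat until quiescence.
working_memory = {"goal: add 7 + 5", "digits: 7, 5"}

productions = [
    # (rule name, condition that must be in memory, fact the rule adds)
    ("retrieve-sum",  "digits: 7, 5", "sum: 12"),
    ("report-answer", "sum: 12",      "answer: 12"),
]

fired = True
while fired:
    fired = False
    for name, condition, action in productions:
        if condition in working_memory and action not in working_memory:
            working_memory.add(action)       # execute the rule's action
            print("fired:", name)
            fired = True

print(working_memory)   # now includes "sum: 12" and "answer: 12"
```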
Cognitive Conditions for Algorithm Use
Formal education and training: Increases familiarity with algorithmic methods.
Task structure: Problems with clearly defined variables and rules favor algorithmic approaches.
Motivation for accuracy: People are more likely to use algorithms when stakes are high.
Supportive environment: Tools like pen-and-paper, calculators, or structured formats facilitate algorithm use.
Limitations in Human Use of Algorithms
Working memory constraints
Processing speed limitations
Susceptibility to fatigue or distraction
Tendency toward cognitive economy (favoring fast over correct answers)
Definition of Heuristics
Heuristics are mental shortcuts or informal rules of thumb that people use to make judgments, solve problems, and make decisions quickly and efficiently. Unlike algorithms, heuristics do not guarantee correct solutions, but they are cognitively economical and often sufficient in everyday contexts.
Origins and Development
The concept of heuristics became central to cognitive psychology in the 1970s through the work of Amos Tversky and Daniel Kahneman, who explored how people systematically deviate from rational judgment. They identified heuristics as the cognitive tools that lead to biases in judgment and decision making.
Why We Use Heuristics
Cognitive economy: Heuristics reduce mental effort and processing time.
Limited information: People often make decisions with incomplete data.
Time pressure: Heuristics allow for quick decisions in urgent situations.
Uncertainty: Heuristics help navigate ambiguous or novel circumstances.
Adaptive value: In many situations, heuristics lead to reasonably accurate outcomes.
Heuristics and Cognitive Biases
While heuristics are generally adaptive, they often lead to systematic errors or biases. Examples of biases arising from heuristics include:
Confirmation bias (favoring information that confirms prior beliefs)
Gambler’s fallacy (expecting outcomes to "balance out")
Base rate neglect (ignoring statistical base rates in favor of vivid or specific details)
Relevance of Heuristics in Problem Solving
Heuristics often guide initial hypothesis formation and strategy selection in ill-defined problems. In insight-based or real-world problems, people frequently rely on intuitive rules rather than structured, algorithmic approaches.
Limitations of Heuristics
Can lead to biases and errors when used inappropriately
Overreliance may prevent deeper analysis or re-evaluation
Often context-dependent — what works well in one domain may fail in another
Difficult to detect or correct due to their unconscious, automatic nature
Theoretical Frameworks of Heuristics
Bounded Rationality (Herbert Simon): Humans are "satisficers" rather than optimizers — they seek satisfactory solutions rather than perfect ones, especially when using heuristics.
Fast and Frugal Heuristics (Gerd Gigerenzer): Contrasts with Tversky and Kahneman’s error-focused view. Emphasizes that heuristics are often ecologically rational and well-adapted to specific environments. Argues that under certain conditions, heuristics outperform complex strategies.
Dual Process Theories: Heuristics are typically associated with System 1 thinking — fast, automatic, and intuitive — in contrast to System 2, which is slower and more analytical. The System 1/System 2 distinction, introduced by Stanovich and West and popularized by Kahneman, is central to understanding how heuristics are deployed in real-time decision making.
Functional Fixedness and Mental Set
Functional Fixedness
Definition: The cognitive bias that limits a person to using an object only in the way it is traditionally used. It prevents people from seeing alternative uses or functions for familiar tools and materials.
Theorist: Karl Duncker, who first identified it in the 1930s.
Classical Experiment: Duncker’s Candle Problem. Participants are given a candle, a box of tacks, and matches. The task: attach the candle to the wall so that it does not drip on the table. Many fail to see the box as a platform rather than just a container, illustrating functional fixedness.
Cognitive Explanation: Arises from strong object-function associations stored in semantic memory. Inhibits divergent thinking and insight-based solutions. Reflects how schema and experience can constrain perception of problem elements.
Overcoming Functional Fixedness: Reframing or recontextualizing the problem. Engaging in conceptual expansion (seeing familiar things in unfamiliar ways). Encouraging creativity and flexible thinking.
Mental Set
Definition: The tendency to approach problems using a strategy that has worked in the past, even when a newer, more efficient method is available.
Theorist: Abraham Luchins (1942)
Classical Experiment: Luchins’ Water Jar Problem. Participants are taught to use a complex formula (e.g., B - A - 2C) to solve several water jar volume problems. Later, when a simpler method is possible, many still use the earlier complex strategy, showing a rigid mental set.
Cognitive Explanation: Mental set reflects positive transfer that becomes maladaptive. Reliance on familiar schemas blocks more efficient or creative solutions. Involves automatization of procedures at the cost of flexibility.
Reducing Mental Set Effects: Training in flexible thinking and metacognition. Awareness of cognitive biases. Varied practice that discourages rigid rule-following.
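The Einstellung effect in the water jar task can be shown with a few lines of arithmetic. Here is a minimal sketch (an assumed Python example; the jar capacities are illustrative, not Luchins’ exact test items) in which the practiced B - A - 2C formula still works even though a much simpler A - C solution is available.

```python
# Minimal sketch (assumed example) of the water jar Einstellung effect.
def trained_rule(a, b, c):
    """The practiced formula participants learn: B - A - 2C."""
    return b - a - 2 * c

def simpler_rule(a, b, c):
    """A shorter route available on some later problems: A - C."""
    return a - c

# Illustrative capacities and goal (values assumed for the example).
a, b, c, goal = 23, 49, 3, 20
print(trained_rule(a, b, c) == goal)   # True: the familiar formula still reaches the goal
print(simpler_rule(a, b, c) == goal)   # True: a simpler solution was available all along
```
People operating under a rigid mental set keep applying the first rule and often fail to notice the second.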
Relation to Problem Space Theory (Newell & Simon): Mental sets limit the exploration of alternative paths in the problem space. They can cause a person to prematurely settle into a fixed path or strategy.
Dual Process Theory of Thinking
Definition:
Dual Process Theory suggests that human thinking operates through two distinct systems. System 1 is fast, automatic, intuitive, and emotional; System 2 is slow, deliberate, analytical, and logical. This theory explains why we sometimes rely on gut instincts and at other times on careful reasoning.
Key Features of the Two Systems
System 1: Operates automatically and quickly. Requires little or no effort. Based on heuristics (mental shortcuts). Emotionally charged and context-dependent. Examples: detecting hostility in a voice, driving a familiar route, solving 2 + 2.
System 2: Allocates attention to effortful mental activities. Involves reasoning, logic, and planning. Slower but more reliable. Used in unfamiliar or complex situations. Examples: solving a math problem, evaluating an argument, planning a trip.
Theorists: Daniel Kahneman and Amos Tversky popularized this model in the context of judgment and decision making. Kahneman's book “Thinking, Fast and Slow” is a foundational text.
Applications of Dual Process Theory: Explains biases and errors in decision-making (System 1 can be misleading). Used in cognitive psychology, behavioral economics, and education. Helps design better problem-solving strategies and interventions.
Strengths: Explains both quick decisions and complex reasoning. Supported by research in neuroscience and cognitive science. Helps account for cognitive biases and heuristics.
Limitations: May oversimplify human thought into just two systems. Real thinking often involves interaction between the two. Boundaries between the systems can blur.