Problem solving

Problem solving consists of using generic or ad hoc methods, in an orderly manner, for finding solutions to problems. Some of the problem-solving techniques developed and used in artificial intelligence, computer science, engineering, mathematics, or medicine are related to mental problem-solving techniques studied in psychology.

Definition

The term problem-solving is used in many disciplines, sometimes with different perspectives and often with different terminologies. For instance, it is a mental process in psychology and a computerized process in computer science. Problems can also be classified into two different types (ill-defined and well-defined), from which appropriate solutions are to be made. Ill-defined problems are those that lack clear goals, solution paths, or expected solutions. Well-defined problems have specific goals, clearly defined solution paths, and clear expected solutions, and they allow for more initial planning than ill-defined problems.[1] Being able to solve problems sometimes involves dealing with pragmatics (logic) and semantics (interpretation of the problem). The ability to understand what the goal of the problem is and what rules could be applied represents the key to solving the problem. Sometimes the problem requires abstract thinking and a creative solution.

Psychology

In psychology, problem solving refers to a state of desire for reaching a definite 'goal' from a present condition that either is not directly moving toward the goal, is far from it, or needs more complex logic for finding a missing description of conditions or steps toward the goal.[2] In psychology, problem solving is the concluding part of a larger process that also includes problem finding and problem shaping.

Considered the most complex of all intellectual functions, problem solving has been defined as a higher-order cognitive process that requires the modulation and control of more routine or fundamental skills.[3] Problem solving has two major domains: mathematical problem solving and personal problem solving, where, in the second, some difficulty or barrier is encountered.[4] Further, problem solving occurs whenever a living organism or an artificial intelligence system needs to move from a given state to a desired goal state.

While problem solving has accompanied humans since the very beginning of human evolution, and especially since the beginnings of mathematics,[4] the nature of human problem-solving processes and methods has been studied by psychologists over the past hundred years. Methods of studying problem solving include introspection, behaviorism, simulation, computer modeling, and experiment. Social psychologists have recently distinguished between independent and interdependent problem-solving.[5]

Clinical psychology

Simple laboratory-based tasks can be useful for studying problem solving; however, they usually omit the complexity and emotional valence of "real-world" problems. In clinical psychology, researchers have focused on the role of emotions in problem solving (D'Zurilla & Goldfried, 1971; D'Zurilla & Nezu, 1982), demonstrating that poor emotional control can disrupt focus on the target task and impede problem resolution (Rath, Langenbahn, Simon, Sherr, & Diller, 2004). In this conceptualization, human problem solving consists of two related processes: problem orientation (the motivational/attitudinal/affective approach to problematic situations) and problem-solving skills. Working with individuals with frontal lobe injuries, neuropsychologists have discovered that deficits in emotional control and reasoning can be remediated, improving the capacity of injured persons to resolve everyday problems successfully (Rath, Simon, Langenbahn, Sherr, & Diller, 2003).

Cognitive sciences

The early experimental work of the Gestaltists in Germany marked the beginning of the study of problem solving (e.g., Karl Duncker in 1935 with his book The psychology of productive thinking[6]). This experimental work continued through the 1960s and early 1970s with research conducted on relatively simple (but novel for participants) laboratory problem-solving tasks.[7][8] Simple novel tasks were chosen because they had clearly defined optimal solutions and could be solved quickly, which made it possible for researchers to trace participants' steps in the problem-solving process. The researchers' underlying assumption was that simple tasks such as the Tower of Hanoi share the main properties of "real world" problems, and that the cognitive processes participants use in attempting to solve simple problems are therefore the same as those used for "real world" problems; simple problems were used for reasons of convenience and with the expectation that generalizations to more complex problems would become possible. Perhaps the best-known and most impressive example of this line of research is the work by Allen Newell and Herbert A. Simon.[9] Other experts have shown that the principle of decomposition improves the problem solver's ability to make good judgments.[10]
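The appeal of the Tower of Hanoi as a laboratory task lies in its small, fully specified state space with a provably optimal solution. A minimal recursive sketch (illustrative only; not part of the original studies) shows why the optimum is well defined:

```python
def hanoi(n, source, target, spare, moves=None):
    """Recursively solve Tower of Hanoi, returning the optimal move list."""
    if moves is None:
        moves = []
    if n == 1:
        moves.append((source, target))
    else:
        hanoi(n - 1, source, spare, target, moves)  # move n-1 disks out of the way
        moves.append((source, target))              # move the largest disk
        hanoi(n - 1, spare, target, source, moves)  # stack the n-1 disks back on top
    return moves

moves = hanoi(3, "A", "C", "B")
print(len(moves))  # 7, i.e. 2**3 - 1, the provable minimum
```

Because every n-disk solution provably needs 2**n - 1 moves, a researcher can score each participant's step against a known optimal path.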

Computer science and algorithmics

In computer science and in the part of artificial intelligence that deals with algorithms ("algorithmics"), problem solving encompasses a number of techniques known as algorithms, heuristics, root cause analysis, etc. In these disciplines, problem solving is part of a larger process that encompasses problem determination, de-duplication, analysis, diagnosis, repair, etc.

Engineering

Problem solving is used when products or processes fail, so corrective action can be taken to prevent further failures. It can also be applied to a product or process before an actual failure occurs, i.e., when a potential problem can be predicted and analyzed and mitigation applied so the problem never actually occurs. Techniques such as Failure Mode Effects Analysis can be used to proactively reduce the likelihood of problems occurring.

Military science

In military science, problem solving is linked to the concept of "end-states", the desired condition or situation that strategists wish to generate.[11]:xiii, E-2 The ability to solve problems is important at any military rank, but is especially critical at the command and control level, where it is closely tied to a deep understanding of qualitative and quantitative scenarios. Effectiveness of problem solving is "a criterion used to assess changes in system behavior, capability, or operational environment that is tied to measuring the attainment of an end state, achievement of an objective, or creation of an effect".[11]:IV-24 Planning for problem-solving is a "process that determines and describes how to employ 'means' in specific 'ways' to achieve 'ends' (the problem's solution)."[11]:IV-1

Other

Forensic engineering is an important technique of failure analysis that involves tracing product defects and flaws. Corrective action can then be taken to prevent further failures.

Reverse engineering attempts to discover the original problem-solving logic used in developing a product by taking it apart.

Other problem solving tools are linear and nonlinear programming, queuing systems, and simulation.[12]

Cognitive sciences: two schools

In cognitive sciences, researchers' realization that problem-solving processes differ across knowledge domains and across levels of expertise (e.g. Sternberg, 1995) and that, consequently, findings obtained in the laboratory cannot necessarily generalize to problem-solving situations outside the laboratory, has led to an emphasis on real-world problem solving since the 1990s. This emphasis has been expressed quite differently in North America and Europe, however. Whereas North American research has typically concentrated on studying problem solving in separate, natural knowledge domains, much of the European research has focused on novel, complex problems, and has been performed with computerized scenarios (see Funke, 1991, for an overview).

Europe

In Europe, two main approaches have surfaced, one initiated by Donald Broadbent (1977; see Berry & Broadbent, 1995) in the United Kingdom and the other one by Dietrich Dörner (1975, 1985; see Dörner & Wearing, 1995) in Germany. The two approaches share an emphasis on relatively complex, semantically rich, computerized laboratory tasks, constructed to resemble real-life problems. The approaches differ somewhat in their theoretical goals and methodology, however. The tradition initiated by Broadbent emphasizes the distinction between cognitive problem-solving processes that operate under awareness versus outside of awareness, and typically employs mathematically well-defined computerized systems. The tradition initiated by Dörner, on the other hand, has an interest in the interplay of the cognitive, motivational, and social components of problem solving, and utilizes very complex computerized scenarios that contain up to 2,000 highly interconnected variables (e.g., Dörner, Kreuzig, Reither & Stäudel's 1983 LOHHAUSEN project; Ringelband, Misiak & Kluwe, 1990). Buchner (1995) describes the two traditions in detail.

North America

In North America, initiated by the work of Herbert A. Simon on "learning by doing" in semantically rich domains (e.g. Anzai & Simon, 1979; Bhaskar & Simon, 1977), researchers began to investigate problem solving separately in different natural knowledge domains – such as physics, writing, or chess playing – thus relinquishing their attempts to extract a global theory of problem solving (e.g. Sternberg & Frensch, 1991). Instead, these researchers have frequently focused on the development of problem solving within a certain domain, that is on the development of expertise (e.g. Anderson, Boyle & Reiser, 1985; Chase & Simon, 1973; Chi, Feltovich & Glaser, 1981).

Areas that have attracted rather intensive attention in North America include:

Characteristics of complex problems

As elucidated by Dietrich Dörner and later expanded upon by Joachim Funke, complex problems have some typical characteristics that can be summarized as follows:

Problem-solving strategies

Problem-solving strategies are the steps one uses to find the problems standing in the way of one's goal. Some refer to this as the 'problem-solving cycle' (Bransford & Stein, 1993). In this cycle one recognizes the problem, defines the problem, develops a strategy to fix it, organizes one's knowledge of the problem, figures out the resources at one's disposal, monitors progress, and evaluates the solution for accuracy. Although called a cycle, one does not have to follow each step in order to fix the problem; in fact, those who do not are often better at problem solving. The reason it is called a cycle is that once one problem is resolved, another usually arises.

Blanchard-Fields (2007) looks at problem solving from one of two facets. The first concerns problems that have only one solution (such as mathematical problems or fact-based questions), which are grounded in psychometric intelligence. The second concerns problems that are socioemotional in nature and unpredictable, with answers that constantly change (such as what your favorite color is or what you should get someone for Christmas).

The following techniques are usually called problem-solving strategies:[13]

Problem-solving methods

Common barriers to problem solving

Common barriers to problem solving are mental constructs that impede our ability to correctly solve problems. These barriers prevent people from solving problems in the most efficient manner possible. Five of the most common processes and factors that researchers have identified as barriers to problem solving are confirmation bias, mental set, functional fixedness, unnecessary constraints, and irrelevant information.

Confirmation bias

Main article: Confirmation bias

Within the field of science there exists a set of fundamental standards, the scientific method, which outlines the process of discovering facts or truths about the world through unbiased consideration of all pertinent information and through impartial observation of and/or experimentation with that information. According to this theory, one is able to most accurately find a solution to a perceived problem by performing the aforementioned steps. The scientific method does not prescribe a process that is limited to scientists, but rather one that all people can practice in their respective fields of work as well as in their personal lives. Confirmation bias can be described as one's unconscious or unintentional corruption of the scientific method. Thus when one demonstrates confirmation bias, one is formally or informally collecting data and then subsequently observing and experimenting with that data in such a way that favors a preconceived notion that may or may not have motivation.[14] Research has found that professionals within scientific fields of study also experience confirmation bias. Andreas Hergovich, Reinhard Schott, and Christoph Burger's experiment conducted online, for instance, suggested that professionals within the field of psychological research are likely to view scientific studies that are congruent with their preconceived understandings more favorably than studies that are incongruent with their established beliefs.[15]

Motivation refers to one's desire to defend or find substantiation for beliefs (e.g., religious beliefs) that are important to one.[16] According to Raymond Nickerson, one can see the consequences of confirmation bias in real-life situations, which range in severity from inefficient government policies to genocide. With respect to the latter and most severe ramification of this cognitive barrier, Nickerson argued that those involved in committing genocide against persons accused of witchcraft, an atrocity that occurred from the 15th to 17th centuries, demonstrated confirmation bias with motivation. Researcher Michael Allen found evidence of confirmation bias with motivation in schoolchildren who worked to manipulate their science experiments so as to produce their hoped-for results.[17] However, confirmation bias does not necessarily require motivation. In 1960, Peter Cathcart Wason conducted an experiment in which participants first viewed three numbers and then created a hypothesis proposing a rule that could have been used to create that triplet of numbers. When testing their hypotheses, participants tended to create only additional triplets of numbers that would confirm their hypotheses, and tended not to create triplets that would negate or disprove them. Thus research also shows that people can and do work to confirm theories or ideas that do not support or engage personally significant beliefs.[18]
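The logic of Wason's task can be made concrete in a short sketch. The hidden rule and the participant's guess below are illustrative assumptions, not taken from the experiment's materials:

```python
# A sketch of Wason's 2-4-6 task: positive tests cannot falsify a
# hypothesis that is narrower than the true rule.
def true_rule(t):
    """The experimenter's hidden rule: any strictly ascending triple."""
    return t[0] < t[1] < t[2]

def hypothesis(t):
    """A typical participant's overly narrow guess: 'increases by 2 each time'."""
    return t[1] - t[0] == 2 and t[2] - t[1] == 2

# Positive-test strategy: only propose triples the hypothesis already predicts.
confirming_tests = [(1, 3, 5), (10, 12, 14), (100, 102, 104)]
print(all(true_rule(t) for t in confirming_tests))  # True: every test "confirms" the wrong guess

# A potentially disconfirming test exposes the mismatch at once.
print(hypothesis((1, 2, 3)), true_rule((1, 2, 3)))  # False True: the rule is broader than the guess
```

Every confirming triple passes, so the participant never learns that the guess is wrong; only a triple the hypothesis rejects can reveal the broader rule.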

Mental set

Main article: Mental set

Mental set was first articulated by Abraham Luchins in the 1940s and demonstrated in his well-known water jug experiments.[19] In these experiments, participants were asked to fill one jug with a specific amount of water using only other jugs (typically three) with different maximum capacities as tools. After Luchins gave his participants a set of water jug problems that could all be solved by employing a single technique, he would then give them a problem that could either be solved using that same technique or a novel and simpler method. Luchins discovered that his participants tended to use the same technique that they had become accustomed to despite the possibility of using a simpler alternative.[20] Thus mental set describes one's inclination to attempt to solve problems in such a way that has proved successful in previous experiences. However, as Luchins' work revealed, such methods for finding a solution that have worked in the past may not be adequate or optimal for certain new but similar problems. Therefore, it is often necessary for people to move beyond their mental sets in order to find solutions. This was again demonstrated in Norman Maier's 1931 experiment, which challenged participants to solve a problem by using a household object (pliers) in an unconventional manner. Maier observed that participants were often unable to view the object in a way that strayed from its typical use, a phenomenon regarded as a particular form of mental set (more specifically known as functional fixedness, which is the topic of the following section). 
When people cling rigidly to their mental sets, they are said to be experiencing fixation, a seeming obsession or preoccupation with attempted strategies that are repeatedly unsuccessful.[21] In the late 1990s, researcher Jennifer Wiley worked to reveal that expertise can work to create a mental set in persons considered to be experts in certain fields, and she furthermore gained evidence that the mental set created by expertise could lead to the development of fixation.[22]
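The state space behind Luchins' jug problems is small enough to search exhaustively. A breadth-first-search sketch (an illustration of the underlying task, not of Luchins' procedure; the capacities 21, 127, 3 and target 100 are one of his classic set-inducing problems) finds the shortest fill/empty/pour sequence:

```python
from collections import deque

def measure(capacities, target):
    """Breadth-first search for the shortest sequence of fill/empty/pour
    moves that leaves `target` units in some jug."""
    start = tuple(0 for _ in capacities)
    seen = {start}
    queue = deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        if target in state:
            return path
        successors = []
        for i, cap in enumerate(capacities):
            successors.append((f"fill {i}", state[:i] + (cap,) + state[i + 1:]))
            successors.append((f"empty {i}", state[:i] + (0,) + state[i + 1:]))
            for j, cap_j in enumerate(capacities):
                if i != j:
                    amount = min(state[i], cap_j - state[j])  # pour until empty or full
                    s = list(state)
                    s[i] -= amount
                    s[j] += amount
                    successors.append((f"pour {i}->{j}", tuple(s)))
        for move, s in successors:
            if s not in seen:
                seen.add(s)
                queue.append((s, path + [move]))
    return None  # target unreachable with these capacities

# Luchins' set-inducing recipe is B - A - 2C; the search simply returns
# whatever sequence is shortest.
print(measure((21, 127, 3), 100))
```

Because BFS explores states in order of distance, the participant's habitual recipe can be compared directly against the shortest sequence the state space actually permits.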

Functional fixedness

Main article: Functional fixedness

Functional fixedness is a specific form of mental set and fixation, alluded to earlier in the Maier experiment, and it is another way in which cognitive bias can be seen throughout daily life. Tim German and Clark Barrett describe this barrier as the fixed design of an object hindering the individual's ability to see it serving other functions. In more technical terms, these researchers explained that "[s]ubjects become "fixed" on the design function of the objects, and problem solving suffers relative to control conditions in which the object's function is not demonstrated."[23] In other words, knowledge of an object's primary function hinders one's ability to see it serving another purpose. In research on why young children are immune to functional fixedness, it was stated that "functional fixedness...[is when] subjects are hindered in reaching the solution to a problem by their knowledge of an object's conventional function."[24] Functional fixedness is also easily expressed in commonplace situations. For instance, imagine a man who sees a bug on the floor that he wants to kill, but the only thing in his hand is a can of air freshener. If the man starts looking around the house for something to kill the bug with instead of realizing that the can could be used for more than freshening the air, he is experiencing functional fixedness: his knowledge of the can as purely an air freshener hindered his ability to realize that it could serve another purpose, in this instance as an instrument to kill the bug. Functional fixedness can arise on many occasions and can produce certain cognitive biases.
If we see an object as serving only one primary function, we fail to realize that it can be used in ways beyond its intended purpose, which in turn causes many problem-solving difficulties. Common sense might seem a plausible answer to functional fixedness, since it appears simple to consider alternative uses for an object. In the example above, it would seem to make perfect sense to use the can of air freshener to kill the bug rather than to search for something else to serve that function; but, as research shows, this is often not the case.

Functional fixedness limits people's ability to solve problems accurately by inducing a very narrow way of thinking. It can be seen in other types of learning behavior as well. For instance, research has found functional fixedness in many educational settings. Researchers Furio, Calatayud, Baracenas, and Padilla stated that "... functional fixedness may be found in learning concepts as well as in solving chemistry problems,"[25] emphasizing that the phenomenon appears in chemistry and in other subjects.

There are several hypotheses about how functional fixedness relates to problem solving.[26] There are also many ways a person can run into problems when thinking of an object only in terms of its usual function. If a person habitually thinks of something in one way rather than several, this constrains how the person thinks of that object. This can be seen as narrow-minded thinking: being unable to see or accept certain ideas in a particular context. Functional fixedness, as previously mentioned, is closely related to this. It can occur intentionally or unintentionally, but for the most part it seems to arise unintentionally.

Functional fixedness can affect problem solvers in at least two particular ways. The first is with regards to time, as functional fixedness causes people to use more time than necessary to solve any given problem. Secondly, functional fixedness often causes solvers to make more attempts to solve a problem than they would have made if they were not experiencing this cognitive barrier. In the worst case, functional fixedness can completely prevent a person from realizing a solution to a problem. Functional fixedness is a commonplace occurrence, which affects the lives of many people.

Unnecessary constraints

Unnecessary constraints are another very common barrier people face while attempting to solve problems. This phenomenon occurs when the subject, trying to solve the problem, subconsciously places boundaries on the task at hand, which in turn forces him or her to strain to be more innovative in their thinking. The solver hits a barrier when they become fixated on only one way to solve the problem, and it becomes increasingly difficult to see anything but the method they have chosen. Typically, the solver experiences this when attempting to use a method they have already had success with, and they cannot help but try to make it work in the present circumstances as well, even when they see that it is counterproductive.[27]

Groupthink, or taking on the mindset of the rest of the group's members, can also act as an unnecessary constraint while trying to solve problems.[28] With everybody thinking the same thing and stopping at the same conclusions, group members inhibit themselves from thinking beyond them. This is very common, but the best-known example of this barrier is the famous nine-dot problem. In this example, nine dots lie in a square: three dots across and three dots down. The solver is asked to draw no more than four lines, without lifting pen or pencil from the paper, so that the lines connect all of the dots. What typically happens is that the subject assumes the dots must be connected without letting the pen or pencil go outside the square of dots. Standardized procedures like this can often bring such mentally invented constraints,[29] and researchers have found a 0% correct solution rate in the time allotted for the task.[30] The imposed constraint inhibits the solver from thinking beyond the bounds of the dots. It is from this phenomenon that the expression "think outside the box" is derived.[31]

This problem can be quickly solved with a dawning of realization, or insight. A few minutes of struggling over a problem can bring these sudden insights, in which the solver quickly sees the solution clearly. Problems such as this are most typically solved via insight and can be very difficult for the subject, depending on how they have structured the problem in their minds, how they draw on their past experiences, and how much they juggle this information in their working memories.[31] In the case of the nine-dot example, the problem has already been structured incorrectly in the solver's mind because of the constraint placed upon the solution. In addition, people struggle when they try to compare the problem to their prior knowledge and think they must keep their lines within the dots and not go beyond, because trying to envision the dots connected outside of the basic square puts a strain on their working memory.[31]

Luckily, the solution to the problem becomes obvious as insight occurs, following incremental movements made toward the solution. These tiny movements happen without the solver's awareness. Then, when the insight is fully realized, the "aha" moment happens for the subject.[32] These moments of insight may take a long while to manifest or may arrive quickly, but the way the solution is arrived at after toiling over these barriers remains the same.
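The nine-dot solution can also be checked mechanically. The sketch below encodes the dots on an integer grid and verifies that one classic four-line path, two lines of which leave the square, covers every dot (the coordinates are one conventional solution, chosen here for illustration):

```python
def on_segment(p, a, b):
    """True if point p lies on the straight segment from a to b (integer grid)."""
    cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    in_box = (min(a[0], b[0]) <= p[0] <= max(a[0], b[0])
              and min(a[1], b[1]) <= p[1] <= max(a[1], b[1]))
    return cross == 0 and in_box

dots = [(x, y) for x in range(3) for y in range(3)]  # the 3x3 grid of dots
# One classic solution: an unbroken path of four lines, two of which
# extend beyond the square of dots.
path = [(0, 0), (3, 0), (0, 3), (0, 0), (2, 2)]
segments = list(zip(path, path[1:]))
covered = all(any(on_segment(d, a, b) for a, b in segments) for d in dots)
print(len(segments), covered)  # 4 True
```

The points (3, 0) and (0, 3) lie outside the 3x3 square; the mentally invented constraint amounts to forbidding exactly those vertices, which is why no four-line solution is found within the box.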

Irrelevant information

Irrelevant information is information presented within a problem that is unrelated or unimportant to the specific problem.[27] Within the specific context of the problem, irrelevant information would serve no purpose in helping solve that particular problem. Often irrelevant information is detrimental to the problem solving process. It is a common barrier that many people have trouble getting through, especially if they are not aware of it. Irrelevant information makes solving otherwise relatively simple problems much harder.[33]

For example: "Fifteen percent of the people in Topeka have unlisted telephone numbers. You select 200 names at random from the Topeka phone book. How many of these people have unlisted phone numbers?"[34]

The people who are not listed in the phone book would not be among the 200 names you selected. Individuals looking at this task naturally want to use the 15% given in the problem: they see that information is present and immediately think it needs to be used. This, of course, is not true. Questions like these are often used to test students taking aptitude tests or cognitive evaluations.[35] They are not meant to be difficult, but they are meant to require thinking that is not necessarily common. Irrelevant information commonly appears in math problems, word problems in particular, where numerical information is included for the purpose of challenging the individual.
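The arithmetic of the trap is worth spelling out: the 15% figure describes people who are not in the phone book, so it cannot apply to a sample drawn from the book.

```python
# The 15% figure describes people *not* in the phone book, so it cannot
# apply to names sampled *from* the book.
sample_size = 200
tempting_but_wrong = 0.15 * sample_size   # the distractor answer: 30
unlisted_in_sample = 0                    # unlisted numbers never appear in the book
print(tempting_but_wrong, unlisted_in_sample)  # 30.0 0
```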

One reason irrelevant information is so effective at keeping a person off topic and away from the relevant information is how it is represented.[35] The way information is represented can make a vast difference in how difficult a problem is to overcome. Whether a problem is represented visually, verbally, spatially, or mathematically, irrelevant information can have a profound effect on how long a problem takes to solve, or on whether it can be solved at all. The Buddhist monk problem is a classic example of irrelevant information and how it can be represented in different ways:

A Buddhist monk begins at dawn one day walking up a mountain, reaches the top at sunset, meditates at the top for several days until one dawn when he begins to walk back to the foot of the mountain, which he reaches at sunset. Making no assumptions about his starting or stopping or about his pace during the trips, prove that there is a place on the path which he occupies at the same hour of the day on the two separate journeys.

This problem is nearly impossible to solve because of how the information is represented. Because it is written out verbally, it causes us to try to create a mental image of the paragraph, which is very difficult to do with all the irrelevant information in the question. The problem becomes much easier to understand when it is represented visually: if the same problem were accompanied by a corresponding graph, it would be far easier to answer, and the irrelevant information would no longer serve as a roadblock. With a visual representation, there are no difficult words to understand or scenarios to imagine; the visual representation removes the difficulty of solving it.

These types of representations are often used to make difficult problems easier.[36] They can be used on tests as a strategy to remove irrelevant information, which is one of the most common forms of barrier when discussing the issues of problem solving.[27] It is essential to identify the crucial information presented in a problem and then to correctly judge its usefulness. Being aware of irrelevant information is the first step in overcoming this common barrier.
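The graph representation of the monk problem also supports a formal argument. Superimposing the two journeys on the same clock interval yields a short intermediate value theorem proof (a sketch; the symbols $u$, $d$, $h$, and $H$ are introduced here for illustration):

```latex
\text{Let } u(t) \text{ and } d(t) \text{ be the monk's heights on the ascent and the descent,}\\
\text{both plotted over the same clock interval } [t_{\mathrm{dawn}}, t_{\mathrm{sunset}}],
\text{ with summit height } H.\\[4pt]
h(t) = u(t) - d(t), \qquad
h(t_{\mathrm{dawn}}) = 0 - H < 0, \qquad
h(t_{\mathrm{sunset}}) = H - 0 > 0.\\[4pt]
\text{Since } h \text{ is continuous, the intermediate value theorem gives some } t^*
\text{ with } h(t^*) = 0,\\
\text{i.e. } u(t^*) = d(t^*): \text{the monk occupies the same spot at the same hour on both journeys.}
```

Notice that the monk's pace, rest stops, and the several days of meditation never enter the proof; they are exactly the irrelevant information the verbal form forces the solver to carry.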

Notes

  1. Schacter, D.L. et al. (2009). Psychology, Second Edition. New York: Worth Publishers. pp. 376
  2. "In each case "where you want to be" is an imagined (or written) state in which you would like to be. We might use the term 'Problem Identification' or analysis in order to figure out exactly what the problem is. After we have found a problem we need to define what the problem is. In other words, a distinguished feature of a problem is that there is a goal to be reached and how you get there is not immediately obvious.", What is a problem? in S. Ian Robertson, Problem solving, Psychology Press, 2001, p. 2.
  3. Goldstein F. C., & Levin H. S. (1987). Disorders of reasoning and problem-solving ability. In M. Meier, A. Benton, & L. Diller (Eds.), Neuropsychological rehabilitation. London: Taylor & Francis Group.
  4. 1 2 Bernd Zimmermann, On mathematical problem solving processes and history of mathematics, University of Jena.
  5. Rubin, M.; Watt, S. E.; Ramelli, M. (2012). "Immigrants' social integration as a function of approach-avoidance orientation and problem-solving style". International Journal of Intercultural Relations. 36: 498–505. doi:10.1016/j.ijintrel.2011.12.009.
  6. Duncker, K. (1935). Zur Psychologie des produktiven Denkens [The psychology of productive thinking]. Berlin: Julius Springer.
  7. For example Duncker's "X-ray" problem; Ewert & Lambert's "disk" problem in 1932, later known as Tower of Hanoi.
  8. Mayer, R. E. (1992). Thinking, problem solving, cognition. Second edition. New York: W. H. Freeman and Company.
    • Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
  9. J. Scott Armstrong, William B. Denniston, Jr. and Matt M. Gordon (1975). "The Use of the Decomposition Principle in Making Judgments" (PDF). Organizational Behavior and Human Performance. 14: 257–263. doi:10.1016/0030-5073(75)90028-8.
  10. 1 2 3 "Commander's Handbook for Strategic Communication and Communication Strategy" (PDF). United States Joint Forces Command, Joint Warfighting Center, Suffolk, VA. 24 June 2010. Retrieved 10 October 2016.
  11. Malakooti, Behnam (2013). Operations and Production Systems with Multiple Objectives. John Wiley & Sons. ISBN 978-1-118-58537-5.
  12. Wang, Y., & Chiew, V. (2010). On the cognitive process of human problem solving. Cognitive Systems Research, 11(1), 81-92.
  13. Nickerson, R. S. (1998). "Confirmation bias: A ubiquitous phenomenon in many guises". Review of General Psychology. 2 (2): 176. doi:10.1037/1089-2680.2.2.175.
  14. Hergovich, Schott; Burger (2010). "Biased evaluation of abstracts depending on topic and conclusion: Further evidence of a confirmation bias within scientific psychology". Current Psychology. 29 (3): 188–209. doi:10.1007/s12144-010-9087-5.
  15. Nickerson, R. S. (1998). "Confirmation bias: A ubiquitous phenomenon in many guises". Review of General Psychology. 2 (2): 175–220. doi:10.1037/1089-2680.2.2.175.
  16. Allen (2011). "Theory-led confirmation bias and experimental persona". Research in Science & Technological Education. 29 (1): 107–127. doi:10.1080/02635143.2010.539973.
  17. Wason, P. C. (1960). "On the failure to eliminate hypotheses in a conceptual task". Quarterly Journal of Experimental Psychology. 12: 129–140. doi:10.1080/17470216008416717.
  18. Luchins, A. S. (1942). Mechanization in problem solving: The effect of Einstellung. Psychological Monographs, 54 (Whole No. 248).
  19. Öllinger, Jones, & Knoblich (2008). Investigating the effect of mental set on insight problem solving. Experimental Psychology, 55(4), 269–270.
  20. Wiley, J (1998). "Expertise as mental set: The effects of domain knowledge in creative problem solving". Memory & Cognition. 24 (4): 716–730. doi:10.3758/bf03211392.
  21. Wiley, J (1998). "Expertise as mental set: The effects of domain knowledge in creative problem solving". Memory & Cognition. 24 (4): 716–730. doi:10.3758/bf03211392.
  22. German, T. P., & Barrett, H. C. (2005). Functional fixedness in a technologically sparse culture. Psychological Science, 16(1).
  23. German, T. P., & Defeyter, M. A. (2000). Immunity to functional fixedness in young children. Psychonomic Bulletin & Review, 7(4).
  24. Furio, C., Calatayud, M. L., Baracenas, S., & Padilla, O. (2000). "Functional fixedness and functional reduction as common sense reasonings in chemical equilibrium and in geometry and polarity of molecules". Science Education, 84(5).
  25. Adamson, R. E. (1952). "Functional fixedness as related to problem solving: A repetition of three experiments". Journal of Experimental Psychology, 44(4). doi:10.1037/h0062487.
  26. Kellogg, R. T. (2003). Cognitive psychology (2nd ed.). California: Sage Publications, Inc.
  27. Cottam, Martha L., Dietz-Uhler, Beth, Mastors, Elena, & Preston, Thomas. (2010). Introduction to Political Psychology (2nd ed.). New York: Psychology Press.
  28. Meloy, J. R. (1998). The Psychology of Stalking, Clinical and Forensic Perspectives (2nd ed.). London, England: Academic Press.
  29. MacGregor, J.N.; Ormerod, T.C.; Chronicle, E.P. (2001). "Information-processing and insight: A process model of performance on the nine-dot and related problems". Journal of Experimental Psychology: Learning, Memory, and Cognition. 27 (1): 176–201. doi:10.1037/0278-7393.27.1.176.
  30. Weiten, Wayne. (2011). Psychology: themes and variations (8th ed.). California: Wadsworth.
  31. Novick, L. R., & Bassok, M. (2005). Problem solving. In K. J. Holyoak & R. G. Morrison (Eds.), Cambridge handbook of thinking and reasoning (Ch. 14, pp. 321-349). New York, NY: Cambridge University Press.
  32. Walinga, Jennifer (2010). "From walls to windows: Using barriers as pathways to insightful solutions". The Journal of Creative Behavior. 44: 143–167. doi:10.1002/j.2162-6057.2010.tb01331.x.
  33. Weiten, Wayne. (2011). Psychology: themes and variations (8th ed.) California: Wadsworth.
  34. Walinga, Jennifer, Cunningham, J. Barton, & MacGregor, James N. (2011). Training insight problem solving through focus on barriers and assumptions. The Journal of Creative Behavior.
  35. Vlamings, Petra H. J. M., Hare, Brian, & Call, Joseph. Reaching around barriers: The performance of great apes and 3-5-year-old children. Animal Cognition, 13, 273-285. doi:10.1007/s10071-009-0265-5

References

  • Altshuller, Genrich (1994). And Suddenly the Inventor Appeared. translated by Lev Shulyak. Worcester, MA: Technical Innovation Center. ISBN 0-9640740-1-X. 
  • Amsel, E., Langer, R., & Loutzenhiser, L. (1991). Do lawyers reason differently from psychologists? A comparative design for studying expertise. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 223-250). Hillsdale, NJ: Lawrence Erlbaum Associates. ISBN 978-0-8058-1783-6
  • Anderson, J. R., Boyle, C. B., & Reiser, B. J. (1985). "Intelligent tutoring systems". Science. 228 (4698): 456–462. doi:10.1126/science.228.4698.456. PMID 17746875. 
  • Anzai, Y., & Simon, H. A. (1979). "The theory of learning by doing". Psychological Review. 86 (2): 124–140. doi:10.1037/0033-295X.86.2.124. PMID 493441. 
  • Beckmann, J. F., & Guthke, J. (1995). Complex problem solving, intelligence, and learning ability. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 177-200). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Berry, D. C., & Broadbent, D. E. (1995). Implicit learning in the control of complex systems: A reconsideration of some of the earlier claims. In P.A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 131-150). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Bhaskar, R., & Simon, H. A. (1977). Problem solving in semantically rich domains: An example from engineering thermodynamics. Cognitive Science, 1, 193-215.
  • Blanchard-Fields, F. (2007). "Everyday problem solving and emotion: An adult developmental perspective". Current Directions in Psychological Science. 16 (1): 26–31. doi:10.1111/j.1467-8721.2007.00469.x. 
  • Bransford, J. D., & Stein, B. S (1993). The ideal problem solver: A guide for improving thinking, learning, and creativity (2nd ed.). New York: W.H. Freeman. 
  • Brehmer, B. (1995). Feedback delays in dynamic decision making. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 103-130). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Brehmer, B., & Dörner, D. (1993). Experiments with computer-simulated microworlds: Escaping both the narrow straits of the laboratory and the deep blue sea of the field study. Computers in Human Behavior, 9, 171-184.
  • Broadbent, D. E. (1977). Levels, hierarchies, and the locus of control. Quarterly Journal of Experimental Psychology, 29, 181-201.
  • Bryson, M., Bereiter, C., Scardamalia, M., & Joram, E. (1991). Going beyond the problem as given: Problem solving in expert and novice writers. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 61-84). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Buchner, A. (1995). Theories of complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 27-63). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55-81.
  • Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). "Categorization and representation of physics problems by experts and novices". Cognitive Science. 5 (2): 121–152. doi:10.1207/s15516709cog0502_2. 
  • Dörner, D. (1975). Wie Menschen eine Welt verbessern wollten [How people wanted to improve the world]. Bild der Wissenschaft, 12, 48-53.
  • Dörner, D. (1985). Verhalten, Denken und Emotionen [Behavior, thinking, and emotions]. In L. H. Eckensberger & E. D. Lantermann (Eds.), Emotion und Reflexivität (pp. 157-181). München, Germany: Urban & Schwarzenberg.
  • Dörner, D. (1992). Über die Philosophie der Verwendung von Mikrowelten oder "Computerszenarios" in der psychologischen Forschung [On the proper use of microworlds or "computer scenarios" in psychological research]. In H. Gundlach (Ed.), Psychologische Forschung und Methode: Das Versprechen des Experiments. Festschrift für Werner Traxel (pp. 53-87). Passau, Germany: Passavia-Universitäts-Verlag.
  • Dörner, D., Kreuzig, H. W., Reither, F., & Stäudel, T. (Eds.). (1983). Lohhausen. Vom Umgang mit Unbestimmtheit und Komplexität [Lohhausen. On dealing with uncertainty and complexity]. Bern, Switzerland: Hans Huber.
  • Dörner, D., & Wearing, A. (1995). Complex problem solving: Toward a (computer-simulated) theory. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 65-99). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Duncker, K. (1935). Zur Psychologie des produktiven Denkens [The psychology of productive thinking]. Berlin: Julius Springer.
  • Ewert, P. H., & Lambert, J. F. (1932). Part II: The effect of verbal instructions upon the formation of a concept. Journal of General Psychology, 6, 400-411.
  • D'Zurilla, T. J.; Goldfried, M. R. (1971). "Problem solving and behavior modification". Journal of Abnormal Psychology. 78: 107–126. doi:10.1037/h0031360. 
  • D'Zurilla, T. J., & Nezu, A. M. (1982). Social problem solving in adults. In P. C. Kendall (Ed.), Advances in cognitive-behavioral research and therapy (Vol. 1, pp. 201–274). New York: Academic Press.
  • Eyferth, K., Schömann, M., & Widowski, D. (1986). Der Umgang von Psychologen mit Komplexität [On how psychologists deal with complexity]. Sprache & Kognition, 5, 11-26.
  • Frensch, P. A., & Funke, J. (Eds.). (1995). Complex problem solving: The European Perspective. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Frensch, P. A., & Sternberg, R. J. (1991). Skill-related differences in game playing. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 343-381). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Funke, J. (1991). Solving complex problems: Human identification and control of complex systems. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 185-222). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Funke, J. (1993). Microworlds based on linear equation systems: A new approach to complex problem solving and experimental results. In G. Strube & K.-F. Wender (Eds.), The cognitive psychology of knowledge (pp. 313-330). Amsterdam: Elsevier Science Publishers.
  • Funke, J. (1995). Experimental research on complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 243-268). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Funke, U. (1995). Complex problem solving in personnel selection and training. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 219-240). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Goldstein F. C., & Levin H. S. (1987). Disorders of reasoning and problem-solving ability. In M. Meier, A. Benton, & L. Diller (Eds.), Neuropsychological rehabilitation. London: Taylor & Francis Group.
  • Groner, M., Groner, R., & Bischof, W. F. (1983). Approaches to heuristics: A historical review. In R. Groner, M. Groner, & W. F. Bischof (Eds.), Methods of heuristics (pp. 1-18). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Hayes, J. (1980). The complete problem solver. Philadelphia: The Franklin Institute Press.
  • Hegarty, M. (1991). Knowledge and processes in mechanical problem solving. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 253-285). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Heppner, P. P., & Krauskopf, C. J. (1987). An information-processing approach to personal problem solving. The Counseling Psychologist, 15, 371-447.
  • Huber, O. (1995). Complex problem solving as multi stage decision making. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 151-173). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Hübner, R. (1989). Methoden zur Analyse und Konstruktion von Aufgaben zur kognitiven Steuerung dynamischer Systeme [Methods for the analysis and construction of dynamic system control tasks]. Zeitschrift für Experimentelle und Angewandte Psychologie, 36, 221-238.
  • Hunt, E. (1991). Some comments on the study of complexity. In R. J. Sternberg, & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 383-395). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Hussy, W. (1985). Komplexes Problemlösen - Eine Sackgasse? [Complex problem solving - a dead end?]. Zeitschrift für Experimentelle und Angewandte Psychologie, 32, 55-77.
  • Kay, D. S. (1991). Computer interaction: Debugging the problems. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 317-340). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Kluwe, R. H. (1993). Knowledge and performance in complex problem solving. In G. Strube & K.-F. Wender (Eds.), The cognitive psychology of knowledge (pp. 401-423). Amsterdam: Elsevier Science Publishers.
  • Kluwe, R. H. (1995). Single case studies and models of complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 269-291). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Kolb, S., Petzing, F., & Stumpf, S. (1992). Komplexes Problemlösen: Bestimmung der Problemlösegüte von Probanden mittels Verfahren des Operations Research – ein interdisziplinärer Ansatz [Complex problem solving: determining the quality of human problem solving by operations research tools - an interdisciplinary approach]. Sprache & Kognition, 11, 115-128.
  • Krems, J. F. (1995). Cognitive flexibility and complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 201-218). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Lesgold, A., & Lajoie, S. (1991). Complex problem solving in electronics. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 287-316). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Mayer, R. E. (1992). Thinking, problem solving, cognition. Second edition. New York: W. H. Freeman and Company.
  • Müller, H. (1993). Komplexes Problemlösen: Reliabilität und Wissen [Complex problem solving: Reliability and knowledge]. Bonn, Germany: Holos.
  • Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
  • Paradies, M.W., & Unger, L. W. (2000). TapRooT - The System for Root Cause Analysis, Problem Investigation, and Proactive Improvement. Knoxville, TN: System Improvements.
  • Putz-Osterloh, W. (1993). Strategies for knowledge acquisition and transfer of knowledge in dynamic tasks. In G. Strube & K.-F. Wender (Eds.), The cognitive psychology of knowledge (pp. 331-350). Amsterdam: Elsevier Science Publishers.
  • Rath, J. F.; Langenbahn, D. M.; Simon, D.; Sherr, R. L.; Fletcher, J.; Diller, L. (2004). The construct of problem solving in higher level neuropsychological assessment and rehabilitation. Archives of Clinical Neuropsychology, 19, 613-635.
  • Rath, J. F.; Simon, D.; Langenbahn, D. M.; Sherr, R. L.; Diller, L. (2003). Group treatment of problem-solving deficits in outpatients with traumatic brain injury: A randomised outcome study. Neuropsychological Rehabilitation, 13, 461-488.
  • Riefer, D.M., & Batchelder, W.H. (1988). Multinomial modeling and the measurement of cognitive processes. Psychological Review, 95, 318-339.
  • Ringelband, O. J., Misiak, C., & Kluwe, R. H. (1990). Mental models and strategies in the control of a complex system. In D. Ackermann, & M. J. Tauber (Eds.), Mental models and human-computer interaction (Vol. 1, pp. 151-164). Amsterdam: Elsevier Science Publishers.
  • Schaub, H. (1993). Modellierung der Handlungsorganisation. Bern, Switzerland: Hans Huber.
  • Schoenfeld, A. H. (1985). Mathematical Problem Solving. Orlando, FL: Academic Press.
  • Sokol, S. M., & McCloskey, M. (1991). Cognitive mechanisms in calculation. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 85-116). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Stanovich, K. E., & Cunningham, A. E. (1991). Reading as constrained reasoning. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 3-60). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Sternberg, R. J. (1995). Conceptions of expertise in complex problem solving: A comparison of alternative conceptions. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European Perspective (pp. 295-321). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Sternberg, R. J., & Frensch, P. A. (Eds.). (1991). Complex problem solving: Principles and mechanisms. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Strauß, B. (1993). Konfundierungen beim Komplexen Problemlösen. Zum Einfluß des Anteils der richtigen Lösungen (ArL) auf das Problemlöseverhalten in komplexen Situationen [Confoundations in complex problem solving. On the influence of the degree of correct solutions on problem solving in complex situations]. Bonn, Germany: Holos.
  • Strohschneider, S. (1991). Kein System von Systemen! Kommentar zu dem Aufsatz "Systemmerkmale als Determinanten des Umgangs mit dynamischen Systemen" von Joachim Funke [No system of systems! Reply to the paper "System features as determinants of behavior in dynamic task environments" by Joachim Funke]. Sprache & Kognition, 10, 109-113.
  • Van Lehn, K. (1989). Problem solving and cognitive skill acquisition. In M. I. Posner (Ed.), Foundations of cognitive science (pp. 527-579). Cambridge, MA: MIT Press.
  • Voss, J. F., Wolfe, C. R., Lawrence, J. A., & Engle, R. A. (1991). From representation to decision: An analysis of problem solving in international relations. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 119-158). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Wagner, R. K. (1991). Managerial problem solving. In R. J. Sternberg & P. A. Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 159-183). Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Wisconsin Educational Media Association. (1993). "Information literacy: A position paper on information problem-solving." Madison, WI: WEMA Publications. (ED 376 817). (Portions adapted from Michigan State Board of Education's Position Paper on Information Processing Skills, 1992).
This article is issued from Wikipedia (version of 12/4/2016). The text is available under the Creative Commons Attribution-ShareAlike license; additional terms may apply for media files.