Skinner's main scientific works. Operant behavior

A History of Modern Psychology (Duane Schultz)

B. F. Skinner (1904–1990)

For several decades B. F. Skinner was the most influential figure in psychology. One historian of psychology called him "without doubt the most famous American psychologist in the world" (Gilgen, 1982, p. 97). A survey of historians of psychology and department chairs found Skinner to be one of the most eminent psychologists of our time (Cote, Davis, & Davis, 1991). When Skinner died in 1990, the editor of American Psychologist described him as "one of the giants of our field" who "has left an indelible mark on psychology" (Fowler, 1990, p. 1203). And the obituary in the Journal of the History of the Behavioral Sciences called him "a leading figure in the behaviorism of this century" (Keller, 1991, p. 3).

Beginning in the 1950s and for many years afterward, Skinner was the leading behaviorist in the United States, attracting a large number of loyal and enthusiastic followers. He developed a program for the behavioral control of society, invented an automated baby tender, and became one of the chief inspirers and creators of behavior modification techniques and teaching machines. He wrote the novel Walden Two, which remained popular fifty years after its publication. In 1971 his book Beyond Freedom and Dignity became a national bestseller, and Skinner himself became "the most popular character on various national and urban talk shows" (Bjork, 1993, p. 192). He was a celebrity, well known to the general public as well as to his colleagues.

Pages of life

Skinner was born in Susquehanna, Pennsylvania, where he lived until he left for college. By his own account, his childhood passed in an atmosphere of love and tranquility. He attended the same school his parents had attended; there were only seven students in his graduating class. He loved his school and always arrived early in the morning. In childhood and adolescence he was absorbed in building things: rafts, wagons, merry-go-rounds, slings and slingshots, model airplanes, and even a steam cannon that shot potatoes and carrots over the roof of a neighbor's house. He spent several years trying to invent a perpetual motion machine. He also read a great deal about animal behavior and kept a home menagerie of turtles, snakes, lizards, toads, and chipmunks. Once, at a fair, he saw performing pigeons; many years later he would teach pigeons tricks of his own.

Skinner's psychological system reflects his experiences in childhood and adolescence. In his view, a human life is the product of past reinforcements. He claimed that his own life had been just as predetermined, orderly, and lawful as his system held every human life to be, and he believed that all aspects of a person's life can be traced back to their origins.

Skinner went to Hamilton College in New York, but he did not like it there. He wrote:

I never fitted into student life. I joined a fraternity without knowing what it was. I was no good at sports and suffered acutely when I was hit in the shin at hockey or when a more skillful basketball player bounced the ball off my skull... In an essay written at the end of my first year I complained that the college was burdening me with unnecessary requirements (one of them was going to chapel every day) and that most of the students had no intellectual interests. By my senior year I was an open rebel. (Skinner, 1967, p. 392.)

Skinner's rebelliousness took the form of pranks, hoaxes that shocked the student body, and open criticism of the faculty and administration. His disobedience ended only on graduation day, when, just before the ceremony, the college president warned Skinner and his friends that they would not receive their diplomas unless they calmed down.

Skinner nevertheless graduated from college with a degree in English, membership in Phi Beta Kappa, and the ambition to become a writer. At a summer writing workshop the poet Robert Frost had praised his poems and stories. For two years after college Skinner pursued a literary career, and then decided that he had "nothing to say." His failure as a writer left him so discouraged that he thought about consulting a psychiatrist. He considered himself a failure, and his sense of self-worth was badly shaken.

He was also disappointed in love. He was rejected by at least half a dozen young women, which, by his own account, caused him real physical pain. On one occasion he was so distraught that he burned one young woman's initials into his arm; the mark remained for years. His biographer notes that Skinner's romances were always tinged with disappointment and disillusionment, although he soon acquired a reputation as a flirt (Bjork, 1993, p. 116).

After reading about Watson's and Pavlov's conditioning experiments, Skinner turned sharply from the literary aspects of human behavior to the scientific ones. In 1928 he entered graduate school in psychology at Harvard University, even though he had never taken a psychology course. By his own account, he went to graduate school "not because he suddenly felt an irresistible pull toward psychology, but only to escape an intolerable alternative" (Skinner, 1979, p. 37). Irresistible pull or not, three years later he received his Ph.D. After completing his doctorate and several years of research work, he taught at the University of Minnesota (1936–1945) and Indiana University (1945–1948) before returning to Harvard.

The topic of his dissertation foreshadowed a position Skinner held steadily throughout his career: that a reflex is a correlation between a stimulus and a response, and nothing more. His 1938 book The Behavior of Organisms sets out the main provisions of his system. Interestingly, the book sold only 500 copies in its first eight years and received mostly negative reviews, yet fifty years later it was called "one of the few books that changed the face of modern psychology" (Thompson, 1988, p. 397).

The quality that changed the system's reception from complete failure to stunning success was its obvious applied value for a wide variety of areas of psychology. "The sixties saw the rise of Skinner's star, partly due to the acceptance of his ideas in the field of education, partly due to the growing influence of Skinner's ideas in the field of clinical behavior modification" (Benjamin, 1993, p. 177). The wide applicability of Skinner's ideas suited his aspirations, for he had a deep interest in the problems of real life. His later book Science and Human Behavior (1953) became the basic textbook of behavioral psychology.

Skinner continued to work productively until his death at the age of 86, and with the same enthusiasm he had shown sixty years earlier. In the basement of his home he built himself a personal "Skinner box," a controlled environment supplying positive reinforcement. He slept there in a large yellow plastic tank just big enough for a mattress, a few shelves of books, and a small television. Every night he went to bed at ten, slept three hours, worked for an hour, slept another three hours, and got up at five in the morning to work three more hours. Later in the morning he went to his office at the university to work again, and in the afternoon he gave himself positive reinforcement by listening to music. Writing, too, was powerfully reinforcing for him: "I really enjoy writing, and it would be a shame if I ever had to give it up" (Skinner, 1985, quoted in Fallen, 1992, p. 1439).

At the age of 78 Skinner wrote an article entitled "Intellectual Self-Management in Old Age," drawing on his own experience (Skinner, 1983a). The article describes how useful it is in old age to exercise the mind for several hours a day, with breaks between bursts of activity, in order to shore up a weakening memory and stave off a decline in intellectual abilities.

In 1989 Skinner was diagnosed with leukemia and given no more than two months to live. In a radio interview he spoke about his feelings:

I am not a religious person, and therefore I am not worried about what will happen to me after death. When they told me that I had this disease and that in a few months I would die, I felt no emotion. No panic, no fear, no anxiety. Nothing at all. The only thing that touched me and made my eyes moist was the thought of how I would tell my wife and daughters. You see, when you die, you cannot help hurting those who love you, and nothing can be done about it... I have lived a good life. It would be quite stupid of me to complain about it in any way. So I will enjoy the months that remain to me, just as I have always enjoyed life. (Quoted in Catania, 1992, p. 1527.)

Eight days before his death, severely weakened, Skinner presented a paper at a meeting of the American Psychological Association in Boston. It dealt with observable and unobservable stimuli and, correspondingly, with respondent and operant behavior.

Skinner's behaviorism

Operant behavior occurs without the influence of any external, observable stimulus. The organism's response appears spontaneous in the sense that it is not outwardly tied to any observable stimulus. This does not mean that no stimulus elicits the response; it means that no stimulus is observed when the response occurs. From the experimenter's point of view, if a stimulus is absent it has not been applied and therefore cannot be observed.

Another difference between respondent and operant behavior is that operant behavior acts on the organism's environment, while respondent behavior does not. The experimental dog in Pavlov's laboratory, strapped into its harness, can do nothing but respond (salivate, for example) when the experimenter presents a stimulus. The dog can do nothing on its own to obtain the stimulus (food).

The operant behavior of a rat in a Skinner box, by contrast, is instrumental in the sense that the rat obtains its stimulus (food). When the rat presses the lever it receives food; when it does not press the lever, it gets none. In this way the rat operates on its environment. (Skinner in fact disliked the term "Skinner box," first introduced by Hull in 1933. He called the equipment an operant conditioning apparatus. The term "Skinner box," however, became so popular that it entered the reference books and is now generally accepted in psychology.)

Skinner believed that operant behavior is characteristic of everyday learning. Since behavior is typically operant in nature, the most effective approach to behavioral science is to study the conditioning and extinction of operant behavior.

The classic experimental demonstration involved lever pressing in a Skinner box. A rat deprived of food was placed in the box and left free to explore it. In the course of exploring, it inevitably touched the lever, which activated a mechanism that delivered food into a tray. After receiving a few portions of food, which served as reinforcement, the rat quickly formed a conditioned response. Note that the rat's behavior (pressing the lever) operates on the environment and is instrumental in obtaining food. The dependent variable in this experiment is simple and direct: the rate of response.

On the basis of this experiment Skinner formulated his law of acquisition, which states that the strength of an operant behavior increases when the behavior is followed by a reinforcing stimulus. Although practice is needed for a rapid lever-pressing response to develop, reinforcement is the key variable. Practice by itself achieves nothing; it merely provides the opportunity for additional reinforcement to occur.
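As a rough illustration (not Skinner's own formalism), the contingency described above can be sketched in a few lines of Python; the probabilities and increments are invented for the example:

```python
import random

def simulate_acquisition(trials=200, reinforce=True, seed=0):
    """Toy sketch of the law of acquisition: a lever press followed by food
    slightly raises the future probability of pressing; unreinforced presses
    gain no strength. All numbers are illustrative assumptions."""
    rng = random.Random(seed)
    p_press = 0.05                    # initial "exploratory" press probability
    presses = 0
    for _ in range(trials):
        if rng.random() >= p_press:
            continue                  # no press this trial; nothing changes
        presses += 1
        if reinforce:                 # a food pellet follows the response
            p_press = min(1.0, p_press + 0.10)
        else:                         # no pellet: the response is not strengthened
            p_press = max(0.01, p_press - 0.01)
    return presses

print("presses with reinforcement:   ", simulate_acquisition(reinforce=True))
print("presses without reinforcement:", simulate_acquisition(reinforce=False))
```

Run with reinforcement, the press count climbs steeply; without it, pressing stays at the exploratory baseline, which is the point of the law of acquisition.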

Skinner's law of acquisition differs from the positions of Thorndike and Hull on learning. Skinner made no appeal to the consequences of reinforcement in terms of pleasure and pain or satisfaction and dissatisfaction, as Thorndike did, nor did he try to interpret reinforcement in terms of drive reduction, as Clark Hull did. The systems of Thorndike and Hull were explanatory; Skinner's system is strictly descriptive.

Skinner and his followers carried out an enormous amount of research on learning: the role of punishment in the acquisition of skills, the effects of different schedules of reinforcement, the rate of extinction of operant responses, secondary reinforcement, and so on.

In addition to rats they worked with other experimental animals and with people, using the Skinner-box arrangement as the basic approach. When pigeons served as the experimental animals, they had to peck at a particular spot or key, and the reinforcement was food. Operant behavior in humans included activities such as solving problems reinforced by praise or by the knowledge of having given the correct answer.

Skinner reported that he used back rubs as a reinforcer for his three-year-old daughter. The experiment, however, took an unexpected turn. One evening he was putting the girl to bed, stroking her back, and suddenly decided to test how effective a reinforcer this was. "I waited," Skinner wrote, "for her to lift her leg, and then I stroked it. Almost immediately she raised her leg again, and I stroked it again. She laughed. 'What are you laughing at?' I asked, and she answered: 'As soon as I lift my leg, you start stroking me!'" (Skinner, 1987, p. 179).

Schedules of reinforcement

The earliest lever-pressing studies in the Skinner box demonstrated the importance of reinforcement for operant behavior. In those studies the rat's behavior was reinforced after every lever press: each time it performed the correct action, it received food. Skinner noted that although in real life reinforcement is not always consistent or continuous, learning still occurs and behavior is maintained even when reinforcement is random or rare.

We do not always find good ice or snow when we go skating or skiing... We do not always get good food in a restaurant, because cooks are unpredictable. When we call friends on the phone we do not always get an answer, because the friends may be out. ...The reinforcing characteristics of activity and learning are almost always intermittent, because it simply would not pay to control every response with reinforcement. (Skinner, 1953, p. 99.)

Even if you do research constantly, you do not get the expected result every time you run an experiment. You are not praised at work every day, nor given a raise. How does such intermittent reinforcement affect behavior? Are some schedules of reinforcement better than others in their effects on behavior? Skinner and his colleagues spent years investigating these questions (Ferster & Skinner, 1957; Skinner, 1969).

The impetus for these studies came not from pure scientific curiosity but from practical expediency - which, incidentally, illustrates how far science often is from the idealized model presented in some textbooks. One Saturday evening Skinner discovered that he was almost out of food pellets. At that time (the 1930s) laboratory animal food could not yet be bought from commercial suppliers; the experimenter had to make the pellets by hand, a long and laborious process.

Instead of spending the weekend making food pellets, Skinner asked himself what would happen if he reinforced his rats only once a minute, regardless of the number of responses. On that plan he would need far fewer pellets, and the supply should last the weekend. He went on to conduct a long series of experiments testing different schedules of reinforcement.

In one such study Skinner compared the response rates of animals reinforced after every response with those of animals reinforced only after a certain interval of time. The latter condition is called a fixed-interval schedule of reinforcement. Reinforcement might be given, for example, once a minute or once every four minutes. The important point is that the animal is reinforced only after the set period of time has elapsed. (A job in which wages are paid once a week or once a month operates on a fixed-interval schedule: workers are paid not for the amount of output produced - that is, not for the number of conditioned responses - but for the number of days or weeks that have passed.) Skinner's research showed that the shorter the interval between reinforcements, the more often the animal makes the conditioned response; as the interval lengthens, the rate of responding falls.

The frequency of reinforcement also affects the extinction of a conditioned response. A response is extinguished more quickly when continuous reinforcement is abruptly stopped than when intermittent reinforcement is stopped. Some pigeons produced up to ten thousand responses without reinforcement when they had originally been conditioned on a periodic, intermittent schedule.

Skinner also investigated fixed-ratio schedules of reinforcement. Here reinforcement is given not after a certain period of time but after a certain number of conditioned responses. The animal's own behavior thus determines how often reinforcement is delivered: it may take, say, ten or twenty responses to obtain the next reinforcer. Animals on a fixed-ratio schedule respond much more rapidly than animals on a fixed-interval schedule. Obviously, a high response rate on a fixed-interval schedule brings no additional reinforcement: the animal may press the lever five times or fifty, but the reinforcer appears only when the set period has elapsed.

The high response rates produced by fixed-ratio schedules have been observed in rats, pigeons, and humans. Piecework pay is an example: an employee's earnings depend on the number of items produced, and commissions depend on the number of sales. Such a schedule works well only when the required level of responding is not too high (daily production quotas must be realistic) and when the expected reinforcement is worth the effort.
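The contrast between the two schedules can be shown with some deliberately simplified arithmetic; the response rates and schedule parameters below are invented for the illustration and ignore the moment-to-moment timing effects studied in the actual experiments:

```python
def reinforcers_per_hour(responses_per_minute, schedule, param):
    """Reinforcers earned in one hour of steady responding (toy model).

    "FI": fixed interval - at most one reinforcer per `param` minutes,
          however many responses are made in between.
    "FR": fixed ratio    - one reinforcer for every `param` responses.
    """
    total_responses = responses_per_minute * 60
    if schedule == "FI":
        return min(total_responses, 60 // param)  # capped by elapsed time
    if schedule == "FR":
        return total_responses // param           # grows with responding
    raise ValueError("unknown schedule")

# Responding faster pays off only on the ratio schedule:
for rate in (2, 10, 50):
    print(f"{rate:>2} responses/min -> FI-1min: {reinforcers_per_hour(rate, 'FI', 1):>3}"
          f" | FR-10: {reinforcers_per_hour(rate, 'FR', 10):>3}")
```

On the interval schedule the hourly earnings level off no matter how fast the animal responds, while on the ratio schedule they grow with the response rate - the logic behind the higher response rates described above.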

Verbal behavior

The sounds the human body produces in speech, Skinner argued, are also a form of behavior - verbal behavior. They are responses that can be reinforced by other speech sounds or by gestures, just as a rat's lever pressing is reinforced by food.

Verbal behavior requires two interacting people, a speaker and a listener. The speaker responds in a certain way - that is, utters a sound. The listener can control the speaker's subsequent behavior by reinforcing it, withholding reinforcement, or punishing it, depending on what has been said.

For example, if the listener smiles every time the speaker uses a particular word, the listener increases the likelihood that the speaker will use that word again. If the listener reacts to a word by frowning or making sarcastic remarks, the speaker becomes more likely to avoid that word in the future.

Examples of this process can be seen in the behavior of parents as their children learn to speak. Inappropriate words, incorrect usage, and poor pronunciation meet a reaction fundamentally different from the one that greets polite phrases, correct usage, and clear pronunciation. In this way the child learns to speak correctly - at least as correctly as the parents or caregivers speak.

Since speech is behavior, it is subject to reinforcement, prediction, and control like any other behavior. Skinner summarized his research on the subject in the book Verbal Behavior (Skinner, 1957).

The air crib and teaching machines

The use of the Skinner box in psychological laboratories made Skinner famous among psychologists, but the air crib - an apparatus for automating the care of infants - made him famous throughout the country.

He described the invention of the air crib in an article in a magazine for housewives. When he and his wife decided to have a second child, she told him that caring for a baby during its first two years takes too much attention and tedious work, so Skinner invented an automatic device intended to free parents from routine chores. Air cribs were produced commercially but, frankly, were not very successful.

The air crib was "a large, soundproof, air-conditioned, temperature-controlled, germ-proof compartment in which a baby could sleep or stay awake without clothes or bedding, wearing only a diaper. This provides complete freedom of movement and relative safety from chills or overheating" (Rice, 1968, p. 98). Skinner's daughter suffered no harmful effects from being raised in the air crib.

Skinner also helped popularize the teaching machine, invented back in the 1920s by the psychologist Sidney Pressey. Unfortunately for Pressey, his invention was far ahead of its time and aroused no interest.

The teaching machine, which attracted no attention at first, set off a genuine explosion of enthusiasm thirty years later (Benjamin, 1988b). In the twenties, when Pressey had just built his machine, he argued that it would make it possible to teach schoolchildren more efficiently with fewer teachers. At that time, however, there was a surplus of teachers, and public opinion felt no urgency about improving education. In the fifties, when Skinner introduced a similar device, teachers were in short supply, classrooms were overcrowded, and the public was anxiously demanding improvements in education so that the country could compete with the Russians in space exploration. Skinner said he had known nothing of Pressey's invention when he developed his own teaching machine, but he always gave credit to his predecessor.

Skinner began developing his teaching machine after visiting his daughter's fourth-grade class and deciding that something had to be done to improve the learning process. He summarized his experience in this field in The Technology of Teaching (1968). Teaching machines were widely used in the fifties and early sixties, until they were replaced by computer-based instruction.

Walden Two: a behavioral society

Skinner put forward a program for the behavioral control of society - a technology of behavior - in which he attempted to apply his laboratory findings to society as a whole. Whereas John B. Watson had spoken only in general terms about using conditioned reflexes to build a healthier life, Skinner described in detail how a society built on this idea would function.

In 1948 he published the novel Walden Two, describing the life of a rural community of a thousand people in which every aspect of life is controlled by positive reinforcement. The book grew out of a midlife crisis Skinner went through at the age of 41; he overcame his depression by returning to his youthful dream of becoming a writer. Caught up in personal and professional conflicts, he poured his despair into the book through the fate of its protagonist, T. E. Frazier. "A lot of Walden Two comes from my own life," Skinner admitted. "I let T. E. Frazier say what I myself did not dare to say" (Skinner, 1979, pp. 297–298).

The book received both praise and hostile reviews in the press. Only a few thousand copies were sold up to the early sixties, but by 1990, the year of Skinner's death, about two and a half million copies had been sold (Bjork, 1993).

The society depicted in Skinner's novel, and Skinner's underlying assumption that human beings are essentially machine-like, mark the culmination of a long line of thought running from Galileo and Newton through the British empiricists to Watson. "If we are to use the methods of science in the field of human affairs, we must assume that behavior is lawful and determined, ... that what a man does is the result of specifiable conditions and that once these conditions have been discovered, we can anticipate and to some extent determine his actions" (Skinner, 1953, p. 6).

The mechanistic, analytical, and deterministic approach of the natural sciences, backed by Skinner's conditioning experiments, convinced behavioral psychologists that human behavior could be controlled, directed, modified, and shaped by the proper use of positive reinforcement.

Behavior modification

Skinner's program for society based on positive reinforcement existed only in theory, but the control or modification of the behavior of individuals and small groups became widespread in practice. Behavior modification through positive reinforcement is one of the most popular techniques in psychiatric clinics, factories, schools, and correctional institutions, where it is used to change abnormal or undesirable behavior into more acceptable or desirable behavior. Behavior modification works with people just as operant conditioning works with rats or pigeons: desired behavior is reinforced and undesired behavior is not.

Imagine a child who throws tantrums to get food or attention. If the parents give in to the child's demands, they thereby reinforce the unwanted behavior. In behavior modification, actions such as stamping the feet or yelling are simply not reinforced; reinforcement is given only for desirable, acceptable behavior. After a while the child's behavior changes, because the tantrums no longer produce the desired result.

Operant conditioning and reinforcement are also used in the workplace, where behavior modification is widely applied to reduce absenteeism and the abuse of sick leave and to improve job performance and safety. Behavior modification techniques are also used to teach job skills.

Behavior modification programs have proved effective in changing the behavior of psychiatric patients. For good behavior, patients received rewards in the form of tokens that could be exchanged for various privileges or goods; disruptive or negative behavior went unrewarded. Gradually, positive changes in behavior appeared. In contrast to traditional clinical techniques, no more account was taken of what was going on in the patient's mind than of what goes on in the mind of a rat in a Skinner box. The focus was solely on overt behavior and positive reinforcement.
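A token economy of the kind just described can be pictured as a simple bookkeeping loop; the behaviors, token values, and privilege costs below are invented for the sake of the sketch:

```python
# Hypothetical token values and privilege costs, for illustration only.
TOKEN_VALUES = {"made bed": 1, "attended group session": 2, "helped another patient": 3}
PRIVILEGE_COSTS = {"choice of dessert": 2, "extra recreation time": 4}

def award_tokens(observed_behaviors, balance=0):
    """Reinforce only the listed desirable behaviors; anything else simply
    earns nothing - no punishment is applied."""
    for behavior in observed_behaviors:
        balance += TOKEN_VALUES.get(behavior, 0)
    return balance

def exchange(balance, privilege):
    """Exchange tokens for a privilege if the balance is sufficient."""
    cost = PRIVILEGE_COSTS[privilege]
    return (balance - cost, True) if balance >= cost else (balance, False)

tokens = award_tokens(["made bed", "shouted at staff", "attended group session"])
tokens, granted = exchange(tokens, "choice of dessert")
print(tokens, granted)  # 1 True: three tokens earned, two spent on dessert
```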

No punishment was used. People were not punished for failing to behave as required; they received reinforcement or reward only when their behavior changed in a positive direction. Skinner believed that positive reinforcement is more effective than punishment in modifying behavior, and he supported this view with a substantial body of experimental research on both animals and humans. (Skinner wrote that as a child he was never physically punished by his father, and only once by his mother, who washed his mouth out with soap for using obscene language (Skinner, 1976). He did not mention whether the punishment had any effect on his behavior.)

Criticism of Skinner's behaviorism

The feature of Skinner's behaviorism that drew the strongest objections was his extreme positivism and his rejection of all theory. Skinner's opponents argue that it is impossible to eliminate theory entirely: since the details of an experiment must be planned in advance, the planning itself is evidence of at least a rudimentary theory. It has also been noted that Skinner's acceptance of the basic principles of conditioning as the foundation of his work is itself a form of theorizing.

His settled system of beliefs gave Skinner confidence in pronouncing on economic, social, political, and religious questions. In 1986 he wrote an article with the telling title "What Is Wrong with Daily Life in the Western World?", in which he argued that "the behavior of the people of the West has deteriorated, but it can be improved through the application of principles derived from the experimental analysis of behavior" (Skinner, 1986, p. 568). Critics charged that Skinner's willingness to extrapolate from empirical data in this way was inconsistent with his anti-theoretical stance and showed him going well beyond strictly observable data in pursuit of his own project for reconstructing society.

The narrow range of behavior studied in Skinner's laboratories (pressing a lever, pecking a key) also drew criticism. Opponents argued that this approach simply ignores many aspects of behavior. Skinner's claim that all behavior is learned was challenged by a former student of his who trained more than six thousand animals of 38 species to perform in television shows, amusement parks, and fairs (Breland & Breland, 1961). Pigs, chickens, hamsters, dolphins, whales, cows, and other animals showed a tendency to drift toward instinctive behavior: they substituted instinctive behaviors for the behaviors that had been reinforced, even when the instinctive behavior kept them from getting food. Reinforcement, then, was not as all-powerful as Skinner claimed.

Skinner's position on verbal behavior - in particular, his explanation of how children learn to speak - was challenged on the grounds that part of what is needed must be inherited. Critics argued that the infant does not learn language word by word through reinforcement of each correctly pronounced word; rather, the child masters the grammatical rules needed to construct sentences. And the potential for forming such rules, Skinner's opponents argue, is inherited, not learned (Chomsky, 1959, 1972).

The significance of Skinner's behaviorism

Despite these criticisms, Skinner remained the undisputed leader and hero of behavioral psychology: for at least three decades American psychology was shaped more by his work than by that of any other psychologist.

In 1958 the American Psychological Association presented Skinner with its Distinguished Scientific Contribution Award, noting that "few American psychologists have had so profound an influence on the development of psychology and on promising young scientists." In 1968 Skinner received the National Medal of Science, the highest award the United States government bestows for contributions to science. In 1971 the American Psychological Foundation awarded him its Gold Medal, and his photograph appeared on the cover of Time magazine. And in 1990 the American Psychological Association honored him with a citation for his outstanding lifetime contribution to psychology.

It is important to understand that Skinner's main goal was to improve the lives of individuals and of society as a whole. Despite the mechanistic character of his system, he was essentially a humanist. This showed in his efforts to modify human behavior in real-world settings - in families, schools, businesses, and hospitals. He hoped that his behavioral technology would relieve human suffering, and he grew increasingly disappointed as he realized that, for all its popularity and influence, it was not being widely applied.

In his old age Skinner became more pessimistic about the hope that science could bring about a timely transformation of society. His despair about the future of the world grew. (Bjork, 1993, p. 226.)

There is no doubt that Skinner's radical behaviorism won, and still holds, a strong position in psychology. The Journal of the Experimental Analysis of Behavior and the Journal of Applied Behavior Analysis continue to thrive, as does the American Psychological Association's division for the experimental analysis of behavior. The application of Skinner's principles - especially behavior modification - remains popular, and the results of this work support the validity of his approach. By any measure of professional and public recognition, Skinner's behaviorism has eclipsed every other form of behavioral psychology.

Lecture 6. Sociogenetic theories of development

The origins of the sociogenetic approach lie in the tabula rasa theory, which arose as early as the Middle Ages and was formulated by John Locke (1632–1704). According to this theory, the human psyche at the moment of birth is a "blank slate," and under the influence of external conditions and upbringing all the mental qualities characteristic of a person gradually arise. Locke put forward a number of ideas about organizing children's education on the principles of association, repetition, approval, and punishment.

A representative of this trend was the eighteenth-century French philosopher Claude Adrien Helvétius (1715–1771), who believed that all people are born identical in their natural abilities and that inequality in mental abilities and moral qualities arises only from unequal external conditions and different educational influences.

These sociologizing ideas were consonant with the ideology that dominated the USSR until the mid-1980s: with the help of purposeful training and upbringing, it was held, any qualities and behavioral properties can be formed in a child; to study a child, one must study the structure of his environment.

The sociogenetic approach is associated with the behaviorist direction in psychology, according to which a person is what his environment makes of him. The main idea of behaviorism is the identification of development with learning, with the child's acquisition of new experience. American researchers took up I. P. Pavlov's idea that adaptive activity is characteristic of all living things. The phenomenon of the conditioned reflex was treated as an elementary behavioral phenomenon. The idea of combining stimulus and response, conditioned and unconditioned stimuli, came to the fore, and the temporal parameter of this connection was emphasized. The main theories of behaviorism include:

1. The theory of classical and instrumental conditioning of I. P. Pavlov.

2. The associationist concept of learning of J. Watson and E. Guthrie.

3. The theory of operant conditioning of E. Thorndike.

4. The theory of B. Skinner: with the help of reinforcement any type of behavior can be shaped.

The very idea of a rigorous scientific experiment, created by I. P. Pavlov for the study of the digestive system, entered American psychology. Pavlov first described such an experiment in 1897; J. Watson's first publication appeared in 1913. Already in Pavlov's first experiments with the externalized salivary gland, the idea of relating dependent and independent variables was realized, and it runs through all American studies of behavior and its genesis, in animals and humans alike. Such an experiment has all the advantages of genuine natural-scientific research, which is still so highly valued in American psychology: objectivity, precision (control of all conditions), and accessibility to measurement. It is known that Pavlov persistently rejected any attempt to explain the results of conditioned-reflex experiments by reference to the subjective state of the animal.

American scientists saw in the conditioned reflex an elementary phenomenon accessible to analysis - something like a building block out of which a complex system of behavior could be constructed. Pavlov's genius, in the view of his American colleagues, lay in showing how simple elements can be isolated, analyzed, and controlled under laboratory conditions. The working out of Pavlov's ideas in American psychology took several decades, and each time researchers confronted another facet of this simple yet still far from exhausted phenomenon - the conditioned reflex.

In the earliest studies of learning, the idea of combining stimulus and response, conditioned and unconditioned stimuli, came to the fore, and the temporal parameter of this connection was emphasized. This is how the associationist concept of learning arose (J. Watson, E. Guthrie). Watson began "his" scientific revolution with the slogan: "Stop studying what man thinks; let us study what man does!"

1. Behaviorism

John Broadus Watson

(1878–1958). American psychologist, founder of behaviorism (from the English word behavior), one of the most widespread theories in twentieth-century Western psychology.

In 1913 his article "Psychology as the Behaviorist Views It" was published and came to be regarded as the manifesto of a new direction. It was followed by his books Behavior: An Introduction to Comparative Psychology (1914) and Behaviorism (1925), in which, for the first time in the history of psychology, the postulate that the subject matter of this science is consciousness (its content, processes, functions, and so on) was decisively rejected.

Influenced by the philosophy of positivism, Watson argued that only what can be directly observed is real. Behavior, he held, should be explained in terms of the relations between the directly observable effects of physical stimuli on the organism and the organism's equally observable responses (reactions). Hence Watson's main formula, adopted by behaviorism: stimulus–response (S–R). It followed that psychology must exclude from its hypotheses and explanations all processes occurring between stimulus and response, whether physiological (nervous) or mental.

The methodologists of behaviorism assumed that the basic mental processes are formed during the lifetime. Lipsitt and Kaye (1964) conducted experiments on the formation of conditioned reflexes in twenty three-day-old infants. Ten infants were assigned to the experimental group, for whom the pairing of an unconditioned stimulus (a pacifier) and a conditioned stimulus (a pure tone) was repeated twenty times; the researchers wanted to elicit, in response to the tone, the sucking response that the pacifier naturally produces. After twenty pairings the infants in the experimental group began making sucking movements in response to the sound, whereas infants in the control group, who were not exposed to the pairings, showed no such response. This research shows that learning occurs from the earliest days of life. It also suggests that a behaviorist approach can shed light on development and that, through conditioning, researchers can study infants' capacity to process sensory information long before they acquire language.

J. Watson demonstrated the ideas of classical conditioning in his experiments on the formation of emotions. He showed experimentally that a fear response can be formed to a neutral stimulus. In his experiments a child was shown a rabbit, which he picked up and wanted to stroke, but at that moment he received an electric shock. Frightened, the child threw the rabbit down and began to cry. The next time he again approached the animal and again received a shock. By the third or fourth time, the mere appearance of the rabbit, even at a distance, aroused fear in most of the children. Once this negative emotion had been consolidated, Watson tried to change the children's emotional attitude again, this time forming interest in and affection for the rabbit. The rabbit was now shown to the child while he was eating something tasty. The presence of this important primary stimulus was an indispensable condition for forming the new reaction. At first the child stopped eating and began to cry, but because the rabbit did not approach, remaining at the far end of the room, while the tasty food (chocolate or ice cream, for example) was close at hand, the child quickly calmed down and went on eating. After the child stopped crying at the sight of the rabbit at the far end of the room, the experimenter moved the rabbit gradually closer and closer, at the same time adding tasty things to the child's plate. Gradually the child stopped paying attention to the rabbit and in the end reacted calmly even when it sat beside his plate, picked the rabbit up, and tried to feed it something tasty. Thus, Watson argued, our emotions are the result of our habits and can change dramatically with circumstances.

Watson's observations showed that if the conditioned fear of the rabbit was not converted into a positive reaction, a similar fear later arose in the children at the sight of other furry objects. On this basis he sought to show that stable affective complexes can be formed in people by conditioning according to a given program. More than that, he believed that the facts he had discovered demonstrated the possibility of forming a particular, strictly defined model of behavior in all people. He wrote: "Give me a hundred children of the same age, and after a certain time I will form them into absolutely identical people, with the same tastes and behavior."

The principle of controlling behavior gained wide popularity in American psychology after Watson's work. His merit also lies in having expanded the sphere of psychology to include the bodily actions of animals and humans. But he achieved this innovation at a high price, rejecting as a subject of science the vast riches of the psyche that cannot be reduced to externally observable behavior.

Edwin Ray Guthrie

(1886–1959). He was a professor of psychology at the University of Washington from 1914 until his retirement in 1956. His major work, The Psychology of Learning, was published in 1935 and reissued in a new edition in 1952.

He proposed a single law of learning, the law of contiguity, which he formulated as follows: "A combination of stimuli which has accompanied a movement will on its recurrence tend to be followed by that movement. Note that nothing is said here about 'confirmatory waves,' or reinforcement, or states of satisfaction." Another way of stating the law of contiguity: if you did something in a given situation, then the next time you find yourself in that situation you will tend to repeat the action.

Guthrie explained why, even granting the truth of the law of contiguity, the prediction of behavior will always be probabilistic. Although the principle as just stated is short and simple, it cannot be understood without some explanation. The word "tend" is used because behavior at any moment depends on a great number of different conditions. Conflicting or incompatible "tendencies" are always present, and the outcome of any stimulus or stimulus pattern cannot be predicted with absolute accuracy because other stimulus patterns are present as well. We can express this by saying that the behavior displayed is caused by the entire situation. But in saying so we cannot flatter ourselves that we have done more than explain why behavior cannot be predicted exactly: no one has yet described, and no one ever will describe, the entire stimulus situation, or observe any total situation, so that to speak of it as the "cause," or even the occasion, of even a small piece of behavior is misleading.

In a later publication Guthrie revised his law of contiguity to read: "What is being noticed becomes a signal for what is being done." This was an acknowledgment of the enormous number of stimuli an organism encounters at any moment and of the fact that it apparently cannot form associations with all of them; rather, the organism responds selectively to only a small fraction of the stimuli it encounters, and it is this fraction that becomes associated with whatever response those stimuli evoke. One may note the similarity between Guthrie's reasoning and Thorndike's concept of the "prepotency of elements": Thorndike, too, believed that organisms respond selectively to different aspects of the environment.

Edward Lee Thorndike

(1874–1949). American psychologist and educator. President of the American Psychological Association in 1912.

He conducted research on animal behavior aimed at studying how animals escape from a "problem box." By this term Thorndike meant an experimental apparatus into which the experimental animals were placed; if they got out of the box, they received reinforcement. The results were plotted on graphs he called "learning curves." The purpose of his research was thus to study the motor reactions of animals. These experiments led Thorndike to conclude that animals act by "trial and error and accidental success," and this work brought him to the theory of connectionism.

Thorndike concluded that the behavior of any living creature is determined by three components:

1) the situation, which includes both the external and the internal processes acting on the individual;

2) the response, or the internal processes occurring as a result of this influence;

3) a subtle connection between situation and response, that is, an association. In his experiments Thorndike showed that intelligence and its activity can be studied without appealing to reason. He shifted the emphasis from establishing internal connections to establishing connections between the external situation and movements, which introduced new currents into associationist psychology. In his theory Thorndike combined mechanical determinism with biological and then with biopsychic determinism, considerably expanding the field of psychology, which had previously been confined to the limits of consciousness.

Based on his research, Thorndike derived several laws of learning:

1. The law of exercise: the strength of the connection between a situation and the response to it is proportional to the frequency with which they are repeated.

2. The law of readiness: the state of the subject (the hunger or thirst he experiences) is not indifferent to the development of new responses. Exercise alters the body's readiness to conduct nerve impulses.

3. The law of associative shift: when an individual responds to one particular stimulus among several acting at the same time, the other stimuli present in the situation later come to evoke the same response. In other words, a neutral stimulus, linked by association with a significant one, also begins to evoke the desired behavior. Thorndike also identified additional conditions for successful learning: the ease of distinguishing between stimulus and response, and awareness of the connection between them.

4. The law of effect. This last law provoked much controversy, since it introduced a motivational (that is, purely psychological) factor. The law of effect states that any action producing satisfaction in a given situation becomes associated with that situation, so that the likelihood of its being repeated in a similar situation increases, whereas displeasure (or discomfort) accompanying an action in a given situation decreases the likelihood of the act being repeated in a similar situation. This implies that learning also rests on certain polar states within the organism. If actions taken in a certain situation lead to successful results, they may be called satisfying; otherwise they are annoying. Thorndike also described a successful result at the level of neurons: when an action is successful, the system of neurons that has been brought to readiness actually functions rather than standing idle.

E. Thorndike and B. Skinner identified development with learning.

Burrhus Frederic Skinner

(1904 – 1990). American psychologist, inventor and writer. He made a huge contribution to the development and promotion of behaviorism.

Skinner is best known for his theory of operant conditioning and, to a lesser extent, for his fiction and journalism, in which he promoted the widespread use of behavior modification techniques (such as programmed learning) as a form of social engineering, to improve society and make people happier. Continuing the experiments of J. Watson and E. Thorndike, Skinner designed the so-called "Skinner box," which made it possible to measure behavior precisely and to deliver reinforcement automatically. The Skinner box, resembling a rat or pigeon cage, has a metal lever (or pedal); when the animal presses it, a portion of food drops into the feeder. With this very simple device Skinner was able to make systematic observations of animal behavior under different conditions of reinforcement. It turned out that the behavior of rats, pigeons, and sometimes people is quite predictable, since it follows certain laws of behavior, at least in this setting. In Skinner's experiments, as in Thorndike's, food usually served as the reinforcer.

The typical Skinnerian model includes the following components: a discriminative stimulus, an individual response, and reinforcement. The discriminative stimulus usually signals to the individual that learning has begun; in Skinner's experiments, light and sound signals as well as words served as discriminative stimuli. The response is the occurrence of operant behavior. Skinner called this type of conditioning operant conditioning because the individual's response operates the mechanism of reinforcement. Finally, a reinforcing stimulus is delivered for an adequate response, and reinforcement therefore increases the likelihood of subsequent operant behavior. Operant behavior can also be taught through avoidance conditioning, in which reinforcement consists of ending exposure to an aversive stimulus: a bright light can be turned off, a loud noise muted, an angry parent calmed. Here the individual learns a response whose reinforcement consists of stopping an unpleasant stimulus.

Skinner developed the method of shaping behavior through successive approximations, which forms the basis of operant conditioning. In this method the entire path from the initial behavior (even before training begins) to the final response the researcher wants to develop in the animal is divided into several stages. Each stage is then reinforced consistently and systematically, and the animal is thereby led to the desired form of behavior. In this kind of learning the animal is rewarded for every action that brings it closer to the final goal, and it gradually develops the desired behavior.
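The logic of successive approximations can be sketched as a small simulation; the stages, probability, and trial limit are assumptions made up for the illustration:

```python
import random

def shape(stages, p_emit=0.3, max_trials=500, seed=1):
    """Toy sketch of shaping: only the current approximation (the criterion)
    is reinforced; once it has been emitted and reinforced, the criterion
    moves one step closer to the target behavior."""
    rng = random.Random(seed)
    criterion = 0                 # index of the approximation currently reinforced
    log = []
    for trial in range(max_trials):
        if rng.random() < p_emit:                    # the animal emits the current approximation
            log.append((trial, stages[criterion]))   # deliver reinforcement
            if criterion == len(stages) - 1:
                break                                # target behavior reached
            criterion += 1                           # raise the criterion
    return log

steps = ["faces the lever", "approaches the lever", "touches the lever", "presses the lever"]
for trial, behavior in shape(steps):
    print(f"trial {trial:>3}: reinforced '{behavior}'")
```

Each reinforced step raises the criterion, so the animal is led stepwise from the crude initial behavior to the target response.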

According to Skinner and other behaviorists, this is how most human behavior is developed. From Skinner's point of view, it is possible to explain the very rapid learning of a child's first words (without, however, extending this concept to language acquisition as a whole). At first, when the child is just beginning to utter some articulate sounds, the babbling “me-me-me” already causes delight among those around him, and especially the happy mother, who already thinks that the child is calling her. However, soon the parents' enthusiasm for such sounds cools down until the baby, to everyone's joy, utters “mo ... mo.” Then these sounds cease to be reinforced for the newborn until a relatively articulate “mo-mo” appears. In turn, this word, for the same reasons, will soon be replaced by the combination “moma”, and, finally, the child will clearly pronounce his first word - “mom”. All other sounds will be perceived by others only as “baby talk” in the literal sense of the word, and they will gradually disappear from the “lexicon” of the newborn. Thus, as a result of selective reinforcement from family members, the infant discards those incorrect responses for which he does not receive social reinforcement, and retains only those that are closest to the expected result.

Operant reactions in Skinner's sense should be distinguished from automatic, purely reflex reactions associated with unconditioned and conditioned reflexes. An operant response is an action that is voluntary and purposeful. However, Skinner defines goal-directedness in terms of feedback (that is, the effect on behavior of its consequences), rather than in terms of goals, intentions, or other internal states - mental or physiological. In his opinion, the use of these "internal variables" in psychology involves the introduction of dubious assumptions that add nothing to the empirical laws that relate observed behavior to observable environmental influences. It is these laws that are the real means of predicting and controlling the behavior of humans and animals. Skinner emphasized that “the objection to internal states is not that they do not exist, but that they are irrelevant for functional analysis.” In this analysis, the probability of an operator response appears as a function of external influences - both past and present.

In the field of education, Skinner put forward the concept of programmed learning. In his view, such instruction can free the student and teacher from the tedious process of simple knowledge transfer: the student advances through a given topic at his own pace and in small steps, each of which is reinforced; these steps constitute a process of successive approximation (Skinner, 1969). It was soon discovered, however, that such instruction quickly reaches its "ceiling," precisely because only minimal effort is required of the student, so reinforcement soon becomes ineffective and the student quickly grows bored. In addition, personal contact with the teacher appears necessary for maintaining motivation and for the orderly transfer of knowledge. All of this can perhaps be explained by the principles underlying social learning, and in particular observational learning.

Burrhus Frederic Skinner was an American psychologist, one of the most influential and outstanding psychologists of the twentieth century according to the American Psychological Association. He is also known as a writer of published fiction and gained considerable fame as an inventor.

Skinner was born on March 20, 1904, in Pennsylvania, in the small town of Susquehanna. He was brought up in a friendly family that nevertheless maintained discipline and order. In his youth the future psychologist was fascinated by all kinds of mechanical devices and even built himself a contraption for hanging up his pajamas. As a schoolboy he played the saxophone in the school band and was keenly interested in literature. By the age of fourteen he was already showing an extraordinary mind.

In his youth, Skinner spent most of his time in the laboratory and had an extraordinary capacity for work. He graduated from Hamilton College in New York and at first intended to study literature.

Psychology in college was taught as an elective, so Skinner did not attend these classes; his interest in psychology appeared later.

At the end of the 1920s, after a long creative search, he reoriented himself toward science and entered the psychology department of Harvard University in 1928. Realizing that he had lost a great deal of time, he set himself a Spartan regime and practically gave up leisure, and this dedication paid off: by 1931 Skinner had received his doctorate and published his first scientific research on the psychology of behavior.

For five years Skinner carried out research at Harvard University, studying animal behavior. In 1936 he moved to Minneapolis, where he taught at the University of Minnesota for nine years. He then headed the psychology department at Indiana University for more than two years.

In 1948, Skinner returned and became a professor at Harvard University, working there for more than 25 years until his retirement.

Over half a century Skinner wrote 19 major monographs and many articles. His earliest paper is considered to be "The Concept of the Reflex in the Description of Behavior," in which the conditioned reflex was interpreted as a derivative of the experimenter's actions. Skinner experimented, as a rule, on rats and pigeons.

In 1938, Skinner published his major work, The Behavior of Organisms, in which he outlined the basic principles of operant conditioning. They are easiest to understand through a typical experiment. A rat is placed in a "Skinner box," a cage in which the rat's actions can be observed. The box has an opening through which food is delivered, and a lever. The rat must press the lever a number of times to receive a portion of food. Such pressing, whether with the nose, a paw, or the tail, is called an operant response, because it produces a single consequence: the appearance of food. By dispensing food after a certain number of presses, or after presses spaced at certain intervals, one can establish stable patterns of responding.
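
As a rough illustration (not Skinner's apparatus, just a toy Python loop with invented numbers), the simplest such contingency, food after every fixed number of presses, can be written as a short rule:

    # Toy fixed-ratio contingency: food is delivered after every 5th lever press.
    # The ratio and the number of presses are arbitrary illustrative values.

    RATIO = 5
    presses = 0
    pellets = 0

    for _ in range(23):           # the rat presses the lever 23 times
        presses += 1
        if presses % RATIO == 0:  # every RATIO-th press is reinforced
            pellets += 1

    print("presses:", presses, "food pellets delivered:", pellets)
    # prints: presses: 23 food pellets delivered: 4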

Operant responses, as goal-directed actions, differ from reflex reactions. At the same time, Skinner held that observed behavior should be related to observable environmental influences.

Skinner and his co-workers built special teaching machines: the machine presented questions and immediately evaluated the student's answers.

According to Skinner, operant conditioning is used not only to control the behavior of other people, but also to control one's own behavior. Self-control can only be achieved by creating conditions so that desired behavior is reinforced.

Skinner was wary of generalizations, believing that recording the responses of an individual organism would solve the main task of psychology: predicting and controlling behavior. In 1957 Skinner's work with Ferster, Schedules of Reinforcement, was published, bringing together data on 250 million responses of experimental pigeons collected over 70 thousand hours.

When studying the mechanisms of behavior, most behaviorists of that time believed that turning to physiology made no sense. Skinner's concept of "operant conditioning", influenced by Pavlov's teachings, destroyed this idea.

The famous Skinner box was inseparable from its creator throughout his working life. Skinner proposed distinguishing two types of conditioned reflexes: type S, studied by the Pavlovian school, in which a response is elicited by a stimulus, and type R, the operant behavior observed in the "Skinner box," in which the animal first emits a response (R) and only then is that response reinforced.

Thus there was a transition from a linear picture of behavior to one built on feedback from the consequences of responses.

The technique of “operant conditioning” is widely used in practice in the United States. The principles of operant behaviorism have been applied to solving various types of problems. Operant technique began to be used in the treatment of neurotics and mentally ill people, as well as in the education of mentally retarded children. Behavior modification here is achieved through gradual reinforcement. For example, the patient is rewarded for each action leading to the goal, according to the treatment regimen.

Skinner's ideas have found wide application in pedagogy. He himself explained this as an accident (since he regarded everything that happened in life as a consequence of circumstances). In November 1953 he visited the school where his daughter studied and was dismayed by an arithmetic lesson: the teacher was violating the laws of the learning process without being aware of any fault. Inspired by this visit, Skinner began to think about what could be done to improve teaching and designed a number of teaching machines. This is how programmed learning arose, in keeping with the era of the scientific and technological revolution.

The idea of optimizing learning is not in itself tied to any particular psychological theory, but it became the basis of research work on programmed learning.

Two of Skinner's works caused heated controversy: Verbal Behavior and the social utopia Walden Two. In the first book, speech acquisition is explained by the laws of the formation of operant conditioned responses. The American linguist Noam Chomsky criticized this account, and it must be said that most language specialists support Chomsky's position. In the second book, Skinner tried to depict how operant conditioning could be used to build a new, just society organized along the lines of a commune.

There were indeed many grounds for criticizing Skinner's work, but this did not prevent him from becoming one of the most cited authors in the history of psychology. During his lifetime he headed the honorary list of psychologists "distinguished" for outstanding contributions to psychology (incidentally, S. Freud was second).

B.F. Skinner died on August 18, 1990 from leukemia. Unfortunately, none of his works have yet been translated into Russian.

Burrhus Frederic Skinner was one of the most famous psychologists of his time. He stood at the origins of the movement that in science today is called behaviorism. Even today his theory of learning plays an important role in psychology, pedagogy, and management.

Scientist's experiments

Skinner's theory is described in detail in one of his main works, The Behavior of Organisms, in which the scientist sets out the principles of so-called operant conditioning. The easiest way to understand these principles is to consider one of his most typical experiments. A rat's weight is first reduced to 80-90 percent of normal, and it is then placed in a special device called a Skinner box, which allows it to perform only those actions that the observing experimenter can see and control.

The box has an opening through which food is supplied to the animal. To get food, the rat must press a lever. In Skinner's theory this pressing is called an operant response. How the rat manages to press the lever, with a paw, its nose, or perhaps its tail, does not matter: the operant response remains the same, since it produces only one consequence, the delivery of food. By rewarding the animal with food after a certain number of presses, the researcher establishes stable patterns of responding in the animal.

Formation of behavior according to Skinner

An operant response in Skinner's theory is a voluntary and purposeful action. But Skinner defines this goal-directedness in terms of feedback; in other words, behavior is shaped by its own consequences.

Skinner agreed with Watson and Thorndike on the dual nature of mental development: the formation of the psyche is influenced by two kinds of factors, social and genetic. In operant learning, specific operations performed by the subject are reinforced; genetic endowment thus acts as the basis on which socially conditioned behavior is built. Development, Skinner believed, is learning conditioned by environmental stimuli.

Skinner also believed that operant conditioning could be used not only to control the behavior of others, but also to control one's own behavior. Self-control can be achieved by creating special conditions in which desired behavior will be reinforced.

Positive reinforcement

Operant learning in Skinner's theory of reinforcement is based on the subject's own active actions ("operants") performed in a given environment. If some spontaneous action proves useful for satisfying a need or reaching a goal, it is strengthened by the positive result it produces. A pigeon, for example, can learn a complex action such as playing ping-pong, but only if the game becomes a means of obtaining food. In Skinner's theory the reward is called reinforcement because it strengthens the most desirable behavior.

Sequential and proportional reinforcement

But a pigeon cannot learn to play ping-pong unless the experimenter shapes this behavior through discrimination learning, that is, by reinforcing the pigeon's individual actions consistently and selectively. In B. F. Skinner's theory, reinforcement can be distributed randomly, delivered at fixed time intervals, or delivered in fixed proportion to the number of responses. Reward distributed randomly, such as periodic cash winnings, promotes the development of gambling addiction; reinforcement delivered at fixed intervals, such as a salary, helps keep a person in a particular job.
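
These schedules can be paraphrased as simple rules that decide, for each response, whether reinforcement is delivered. The sketch below is only an illustrative Python rendering of that idea, with arbitrary parameter values; it is not code from Skinner's laboratory.

    import random

    random.seed(3)

    def fixed_ratio(n):
        """Reinforce every n-th response (piece-rate, 'proportional' reinforcement)."""
        count = 0
        def rule(_now):
            nonlocal count
            count += 1
            return count % n == 0
        return rule

    def variable_ratio(mean_n):
        """Reinforce after an unpredictable number of responses (like a slot machine)."""
        def rule(_now):
            return random.random() < 1.0 / mean_n
        return rule

    def fixed_interval(seconds):
        """Reinforce the first response after a fixed time has elapsed (like a salary)."""
        last = 0.0
        def rule(now):
            nonlocal last
            if now - last >= seconds:
                last = now
                return True
            return False
        return rule

    # One response per simulated second for 60 "seconds".
    for name, rule in [("fixed ratio 10", fixed_ratio(10)),
                       ("variable ratio ~10", variable_ratio(10)),
                       ("fixed interval 15 s", fixed_interval(15))]:
        rewards = sum(rule(t) for t in range(1, 61))
        print(f"{name:20s} -> {rewards} reinforcements in 60 responses")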

Proportional (ratio) reinforcement in Skinner's theory is so powerful that the animals in his experiments practically worked themselves to death trying to earn more tasty food. Punishment, in contrast to reinforcement, weakens behavior rather than strengthening it (and should not be confused with negative reinforcement, which strengthens a response by removing an aversive stimulus). Punishment cannot teach a new behavioral model; it only forces the subject to keep avoiding the operations that are followed by punishment.

Punishment

The use of punishment tends to have negative side effects. Skinner's learning theory identifies the following consequences of punishment: high levels of anxiety, hostility and aggressiveness, and withdrawal. Sometimes punishment forces an individual to stop behaving in a certain way. But its disadvantage is that it does not promote positive behavior.

Punishment often forces the subject not to abandon an undesirable model of behavior, but only to transform it into a hidden form that is not punished (for example, this could be drinking alcohol at work). Of course, there are many cases when punishment seems to be the only method of suppressing socially dangerous behavior that threatens the life or health of other people. But in ordinary situations, punishment is an ineffective means of influence and should be avoided whenever possible.

Pros and cons of Skinner's operant learning theory

Let's consider the main advantages and disadvantages of Skinner's concept. Its advantages are as follows:

  • Rigorous testing of hypotheses, control of additional factors influencing the experiment.
  • Recognition of the importance of situational factors and environmental parameters.
  • A pragmatic approach that has led to the creation of effective psychotherapeutic procedures for behavior transformation.

Disadvantages of Skinner's theory:

  • Reductionism: the analysis of human behavior is reduced entirely to the behavior exhibited by animals.
  • Low ecological validity: the results of laboratory experiments are difficult to transfer to natural conditions.
  • No attention is paid to cognitive processes in the formation of behavior.
  • Skinner's theory does not give stable, reproducible results in practice.

Motivation concept

Skinner also created a theory of motivation. Its main idea is that the desire to repeat an action is determined by the consequences of this action in the past. The presence of certain stimuli causes certain actions. If the consequences of a particular behavior are positive, then the subject will behave in a similar situation in the future in a similar way.

His behavior will be repeated. But if the consequences of a particular strategy are negative, then in the future the subject will either stop responding to those stimuli or change the strategy. Skinner's theory of motivation thus comes down to the idea that repeated encounters with the same results form a specific behavioral disposition in the subject.
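
Read this way, the motivational idea resembles a simple value update: strategies whose past consequences were positive become more likely to be chosen again. Below is a minimal, hypothetical Python sketch of that reading; the update rule, the strategy names, and all the probabilities are our own illustrative choices, not Skinner's.

    import random

    random.seed(4)

    # Each strategy accumulates a value based on the consequences it produced
    # in the past; the subject tends to repeat the strategy with the higher value.

    values = {"ask politely": 0.0, "demand loudly": 0.0}
    LEARNING_RATE = 0.3

    def consequence(action):
        # Hypothetical environment: polite requests usually succeed, demands usually fail.
        if action == "ask politely":
            return 1.0 if random.random() < 0.8 else -1.0
        return 1.0 if random.random() < 0.2 else -1.0

    def choose():
        # Mostly repeat whichever strategy has paid off before, occasionally explore.
        if random.random() < 0.1:
            return random.choice(list(values))
        return max(values, key=values.get)

    for _ in range(100):
        action = choose()
        outcome = consequence(action)
        values[action] += LEARNING_RATE * (outcome - values[action])

    print(values)  # with these settings, "ask politely" typically ends up with the higher value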

Personality and the concept of learning

From Skinner's point of view, personality is the experience that an individual acquires over a lifetime. Unlike Freud, for example, proponents of the learning approach do not consider it necessary to speculate about the mental processes hidden in the human mind. Personality in Skinner's theory is a product shaped largely by external factors: it is the social environment, not the phenomena of inner mental life, that determines personal characteristics. Skinner regarded the human psyche as a "black box" whose emotions, motives, and instincts cannot be studied in detail and must therefore be excluded from the experimenter's observations.

Skinner's theory of operant conditioning, which the scientist worked on for many years, was supposed to summarize his extensive research: everything a person does and what he is in principle is determined by the history of rewards and punishments received by him.

Burrhus Frederic Skinner (1904-1990) was born in Susquehanna, Pennsylvania, where he lived until attending college. His childhood passed in an atmosphere of love and tranquility. He loved his school and always arrived early in the morning. In childhood and adolescence, he was interested in creating a variety of objects. He also read a lot about animal behavior and kept an entire zoo at home.

Skinner went to Hamilton College in New York, but he did not like it there. Despite his rebelliousness, he graduated successfully with a degree in English, membership in Phi Beta Kappa, and the ambition to become a writer. For two years after college he devoted himself to literary work.

After reading about the conditioning experiments of Watson and Pavlov, Skinner turned sharply from the literary side of human behavior to the scientific one. In 1928 he entered graduate school in psychology at Harvard University, despite never having taken a psychology course before. Three years later he received his Ph.D. After defending his dissertation and several further years of research work, he taught at the University of Minnesota (1936-1945) and Indiana University (1945-1948), after which he returned to Harvard.

At the age of 78, Skinner wrote an article entitled "How to Maintain Intelligence as You Age," in which he referred to his own experiences.

In 1989, Skinner was diagnosed with leukemia. He died in August 1990 at the age of 86.

Scientific analysis of behavior. Behavior, like any other phenomenon, can be studied using natural scientific methods. It has its own patterns, and therefore is predictable and controllable.

Personality is the sum of patterns (reactions) of behavior. Every behavioral response is based on previous experience and genetic code.

Conditioning and reinforcement. Respondent (reactive) conditioning is reflexive behavior: the organism responds to a stimulus automatically.

Skinner was more interested in the process in which consequences follow the response: operant conditioning. This is more than a reaction; it is one of the mechanisms of behavior, and it lies at the core of learning. By rewarding or punishing, one can form a particular pattern of behavior, and not only in animals (training) but also in people.

Reinforcement is any stimulus that increases the probability of a particular (predetermined) response and thereby shapes and regulates behavior; it can be positive or negative. In humans, the word is also a powerful reinforcer, so that power and fame are added to the basic reinforcers on one side, and fear, humiliation, and the like on the other.

Explanatory fictions. When the true causes of behavior are not understood, they are explained by false (fictitious) mechanisms. The most common fictions are: “autonomous person”, “freedom”, “dignity”, “creativity”. Fictions mask the true mechanisms of behavior.

Behavior management. To predict behavior means to study its mechanisms. Behavior management is based on studying and changing the environment. Skinner viewed the human body as a black box. The input (stimulus) and output (behavior) are known. What happens inside the box is largely a mystery.

In his study of operant conditioning, Skinner came to the following conclusions:
- Conditioning most often occurs outside the realm of consciousness. Our individual perception depends on past perceptions (culture, traditions) as well as on experience; these layers build on one another and create a basis for behavior of which we are often unaware.
- Conditioning is maintained outside of consciousness. Many decisions and the behavioral responses that follow from them involve unconscious perception.
- Conditioning is most effective (and reaches a new level) when elements of the unconscious are combined with the conscious, that is, when the unconscious is brought into awareness.

Social relations. There is nothing in social behavior that distinguishes it from any other behavior; it is characterized only by the fact that two or more people interact. The behavior of an individual depends on the behavior of the people around him. Skinner paid a great deal of attention to "verbal communication," since it contributes most to feedback.

Skinner's works laid the psychological and methodological foundations of modern programmed learning (a small illustrative sketch follows the list below):
- each student works at his own pace (choleric - quickly, phlegmatic - slowly);
- the student moves on to more complex material only when he has mastered simpler material;
- thanks to confirmation of the correct answer ("the student is always right"), he does not develop a feeling of inferiority (there is no humiliating "sit down, that's a fail");
- the student is constantly active and receives immediate confirmation of his success;
- the question is always formulated competently and in a form whose essence the student can understand;
- the machine's responses are graded by accuracy, give the student a choice, and are themselves instructive.
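
Purely as an illustration (a hypothetical sketch of ours, not Skinner's teaching machine), these principles can be combined into a small loop: the material is split into small steps, the learner moves at their own pace, each correct answer is confirmed immediately, and the next step is offered only after the current one is mastered. The questions and the mastery rule below are invented.

    # Hypothetical programmed-learning loop in the spirit of the principles above.

    STEPS = [
        ("2 + 2 = ?", "4"),
        ("2 + 2 + 2 = ?", "6"),
        ("3 * 2 = ?", "6"),
    ]

    def run_program(answer_source=input):
        for prompt, correct in STEPS:               # small steps, ordered by difficulty
            while True:                             # the learner sets the pace
                answer = answer_source(prompt + " ").strip()
                if answer == correct:
                    print("Correct!")               # immediate confirmation of success
                    break                           # advance only after mastery
                print("Not yet, try again.")        # no grade, no humiliation, just another attempt

    if __name__ == "__main__":
        run_program()

Passing a different answer_source (for example, a function that reads scripted answers) makes the loop easy to test without interactive input.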