The article uses psychological references to explain what it is about slot machines that is so appealing, and why the machines can be addictive for some people. The article explains that the appeal of a slot game can be traced back to research by psychologist B.F. Skinner and his experiments with pigeons and food pellets.
B. F. Skinner (Burrhus Frederic Skinner, 1904 - 1990) has been recognized by his colleagues as one of the most influential people of the twentieth century, although his theories are also among the most controversial.
Basically, Skinner said that Psychology is the Science of Behavior, and true science is based on nothing but facts. He criticized many prominent theories of the time, claiming that they focused too much on intangible things such as 'emotions' and 'feelings'.
B. F. Skinner believed that:
- 'psychology must restrict itself to what it can see and what can be manipulated and measured in the laboratory' (Schultz, 1986, p. 350). He also said, 'You can't get results by sitting around and theorizing about the inner world.... I want to say to those people: get down to the facts' (Hall, 1967, p. 70).
Free Will Is an Illusion
- All Actions Are Consequences of Prior Actions
Skinner saw the cause of behavior not as originating within the person as a result of free will, but as a direct consequence of the person's interaction with the external environment.
Therefore, he thought, in order to change or control behavior, we must study only what a person does, not the way he thinks or feels.
Of course, Skinner was a behaviorist. In fact, he admired the work of John B. Watson and strove to follow in his footsteps.
So, it is no surprise that he viewed people as empty shells and believed that physiological, psychological, or mental processes have absolutely no bearing on behavior. In fact, he said,
- 'As far as I am concerned, the inside of the organism is irrelevant either as the site of physiological processes or as the locus of mentalistic activities' (Evans, 1968, p. 22).
Skinner Goes Against the Humanistic Theories:
- In Science Thoughts and Emotions Are a Complete Waste of Time
In Skinner's opinion, what happens inside a person is irrelevant and only those things which can be observed are important. All behavior is the result of an environmental factor, so if you want to change behavior you must first change the environment.
This was completely opposite to many of the popular humanistic theories that were rising up during that time. Even those theorists who preceded Skinner looked to the 'inside' of the person for at least some explanation of behavior and personality.
Maslow talked about the basic needs of safety and security. Freud outlined the inner workings of the Id, Ego, and Superego. Erikson used terms such as 'trust', 'doubt', and 'shame'. Even Carl Rogers, who describes the importance of a child's experiential world, included the concepts of 'worth' or 'positive self-regard'.
But, Skinner claimed what he called the 'inner man' approach was a complete waste of time. He discounted thoughts and emotions, and stated that the only things worth studying were those which could be observed and measured.
All Behavior Results from Conditioning
- Consequences of Actions Determine Behavior
According to Skinner, a child does not behave a certain way out of fear, obligation, respect, or even a sense of right and wrong. All behavior is a direct result of conditioning.
Simply, people behave the way they do because they have been trained to do so. It sounds like house-breaking the family dog, doesn't it? In truth - it is.
In fact, one of the main differences between Skinner's work and that of other theorists was the lack of human subjects. Skinner used animals because he said they were easier to work with and really weren't much different from humans anyway.
He claimed that the same principles used to train dogs or circus animals are applied to human behavior.
For example, if a dog receives a treat every time he obeys a command, eventually he will learn to do whatever you ask of him. Why? Because he wants the treat.
Similarly, if a child receives a cookie every time he cleans up his toys, eventually he will pick up his toys without argument. Why? Because he wants the cookie.
The consequences of the action will determine the behavior.
However, Skinner also found that the opposite was true. When the reward was removed, the behavior became extinct. This could explain why a child's behavior may be different when he is away from home. Why should he hang his coat up at school if there is no reward for doing so? OR if there is no punishment for not doing so?
Individuality Is the Result of Different External Stimuli
So, according to Skinner's theory, we are all a result of reinforced behavior and what he calls 'operant conditioning.' Individuality doesn't really exist.
The only reason there are individual differences is because the experiences (stimuli) which led to our responses may be different.
A fishing example
For example, you may enjoy fishing because every time you go out in the boat you catch a fish. (Maybe not every time you cast your line, but every fishing trip results in some measure of success).
So, when someone suggests going fishing, your reaction - or behavior - would be very positive.
I, on the other hand, may detest fishing because I live near a stream where the fish are very scarce and I rarely catch anything, even after hours of patient waiting. To me, the sport may become boring and useless, so when someone suggests going fishing, my response - or behavior - will be negative.
Now, Skinner would say that neither one of us was born with an innate desire or skill for fishing. According to his theory, our individual differences are a direct result of the consequences of our actions.
When you cast a line, the consequence is a fish. However, when I cast a line, the consequence is no fish. But, if we had grown up in opposite environments where I caught lots of fish and you caught nothing, then our behaviors would also be opposite. Or, if you were to suddenly stop having success at fishing, your enjoyment and attitude toward the sport would decline.
The Principles of Reward are Modern Remnants of Skinner's Theory
While many people objected to being told that their 'inner man' is irrelevant - or even non-existent - Skinner's ideas are still used in many common parenting and educational approaches.
How many parents do you know who reward a child for 'good' behavior? And really, isn't a paycheck often a consequence of working hard all week?
Most people wouldn't go to work if there wasn't any money to be made.
While Skinner's blatant Behaviorist approach is obviously one-sided, he was undoubtedly one of the most influential people in the field of psychology and his principles can still be seen in many contemporary parenting theories.
Biography of B. F. Skinner
Burrhus Frederic (B.F.) Skinner was born in Pennsylvania on March 20, 1904. He was the older of two sons, and his younger brother died at the age of 16 from a brain hemorrhage.
As a child, Skinner enjoyed inventing things, which later proved helpful in his own studies and experiments.
He is known for his creation of the 'air crib' (a temperature and humidity-controlled sleeping chamber), as well as the invention of the Skinner Box, the cumulative recorder (to keep track of behavior), and a teaching machine which administered pre-programmed questions.
In the old video below, you can see Skinner presenting his teaching machine.
Skinner attended Hamilton College in New York and received a BA in Literature in 1926. His initial dream was to become a writer and after graduation, he spent some time trying to write fiction.
Eventually, he discovered the book Behaviorism by John B. Watson and also began reading some of the writings of Ivan Pavlov. He was so inspired by these men that he decided to pursue an education in the field of Psychology.
Skinner received a PhD from Harvard University in 1931 and remained at the school as a researcher until 1936.
He married his wife, Yvonne, and together they had two daughters.
Skinner accepted a position at the University of Minnesota and later became Chair of the Psychology Department at Indiana University from 1946-1947.
In 1948, he returned to Harvard and joined the Psychology department where he remained for the rest of his career.
Throughout his career, Skinner developed his own brand of psychology which he called Radical Behaviorism and introduced the idea of operant conditioning, which resulted in him becoming one of the leading voices in the field of behaviorism and behavior modification.
B.F. Skinner received many awards and recognitions including the National Medal of Science from President Lyndon B. Johnson in 1968, the Humanist of the Year Award in 1972, and a citation for outstanding lifetime contribution to Psychology in 1990.
Skinner died of leukemia on August 18, 1990, but his ideas and theories can still be seen in parenting philosophies, educational approaches, and clinical therapies.
B. F. Skinner's Theories and Contributions
Skinner is considered by some to be one of the most influential psychologists who ever lived, and some of his most notable contributions include:
Radical Behaviorism:
Complete Elimination of the 'Inner Man'
Although Skinner classified himself as a Behaviorist, his theories differed somewhat from the more 'traditional' forms of this school of thought. He called his own approach 'Radical Behaviorism'.
Like other Behaviorists, Skinner believed that Psychology is the science of behavior and should be regarded as a natural science focusing primarily on the facts.
He also supported the idea that animal behavior is highly comparable to human behavior and therefore, the study of animals will tell you everything you need to know about humans.
Thirdly, he agreed with his colleagues in the proposal that environmental factors are a very strong indicator of behavior. However, it is on this point that the divergence begins. Skinner did not accept the suggestion that emotions, feelings, perceptions, or any other unobservable intangibles should be considered relevant.
Humans Are All Equipped with the Same Software and May Easily Be Reprogrammed
While all behaviorists adhere to the 'blank slate' theory that every child is born with a clean slate and that behavior is shaped or created through rewards and punishments, most do not completely eliminate the 'inner man'.
Even Watson admitted that some behavior may be elicited by respect, fear, or a desire to avoid punishment. Behavior is created, but this behavior eventually becomes part of a person's personality.
But, Skinner said that all behavior is caused by consequences and that actions are completely determined by external stimuli. Free will is just a myth because everyone is basically programmed in the same way a computer is programmed.
This means that any individual can also be 'deprogrammed' or 'reprogrammed'. Personality - at least the way psychologists define it - does not exist but is just a small part of behavior that is determined by different responses to various stimuli.
Skinner: We All React Blindly to Threats and Rewards and Can Be Controlled Like Animals
Watson's theory was very similar, in that he believed rewards and punishments (consequences) could shape behavior; however, the behavior developed in childhood carried over into adulthood.
This explained why many people, once grown, would run their homes and raise their own children in the same way in which they were raised.
But, Skinner said this idea was not correct. Since all action is a result of the consequence that follows, what happens in childhood doesn't necessarily influence adulthood.
For example, if a teenager loses the right to use the family car if he doesn't keep his room clean, then he is likely to have a spotless room because he enjoys the privilege of driving.
In this way, he is following the rules of the home and appears to be very obedient. However, once he becomes an adult and owns his own car and lives in his own home, these boundaries are no longer in effect. Now, he still gets to drive, even if his room is messy.
Some behaviorists would say that the idea of a clean room is so ingrained in a person that he will continue this practice even when he is away from home because he believes that it is the right thing to do. It has become a value to him. However, Skinner argues that this is not necessarily true.
Since the idea of 'values' or 'right and wrong' have nothing to do with behavior once the consequence has been removed, the action will cease. Therefore, the threat of losing driving privileges (or the reward of earning them) is no longer an effective consequence.
Now he has no reason to keep his room clean and isn't likely to maintain the practice unless there is another incentive for doing so. Skinner called his approach 'radical' because it focused on the facts - and nothing but the facts!
Operant Conditioning:
- Controlling Behavior by Its Consequences
As previously stated, Skinner believed that ALL behavior can be controlled by the consequences which follow it. Therefore, he stated that a person can be 'trained' or programmed to behave in a particular way simply by changing the nature of the consequence that follows.
Perhaps the most concerning part of this theory is the idea that whoever controls the reinforcement inevitably controls the person receiving the reinforcements.
So, no one is responsible for their own actions because they are all simply victims of the reinforcements in their environment. This is quite a scary thought when you think about the implications involved!
Now remember, Skinner was quite impressed with the work of Ivan Pavlov, and to explain some discrepancies between Pavlov's findings and his own, he made a distinction between Respondent Behavior and Operant Behavior.
Respondent Behavior
Respondent behavior is a response made to (or caused by) a specific stimulus in a person's environment. Skinner said that some responsive behaviors are automatic (such as jumping at the sound of a loud noise) while others are learned.
This learned behavior is called 'conditioning'. The idea of conditioning really began with Ivan Pavlov who is famous for his experiments with dogs.
At first, Pavlov noticed that his dogs would begin to salivate every time a bowl of food was placed in front of them. After a while, he started ringing a bell a few minutes before feeding time.
Initially, the bell made no difference and the dogs would only salivate at the sight of food. But, after several days, the dogs began to learn that food followed the sound of the bell. Now, they would begin to salivate as soon as the bell rang, before they could even see the food. The dogs had been 'conditioned' to respond to the bell.
However, Pavlov also learned that the dogs were only responding to the bell because they were being rewarded with food. Therefore, behavior can only be conditioned if it is reinforced. A behavior cannot be created without reinforcement, AND if the reinforcement is removed, the behavior will cease.
Skinner was quite fascinated with Pavlov's findings; however, he felt that they were very limited. He did agree that people are conditioned to respond to many stimuli in their environments (an alarm clock ringing will cause us to get out of bed - hopefully!), but he also believed that not all behavior could be explained this way.
He said that some behavior is determined by what he called Operant Conditioning.
Operant Conditioning
Operant conditioning is behavior that is determined by a consequence or reinforcement that follows an action (rather than precedes it like in respondent conditioning). Giving a child a sticker every time he uses the training potty or praising him for cleaning up his toys are examples of operant conditioning.
Now, there is a very important difference between these two types of behavior:
- Respondent behavior has absolutely no effect on the environment. The dogs responded to the bell but their response did not change their environment - the bell rang, the food came, and the fact that they were salivating did nothing to change those events. A person may jump at the sound of a loud noise, but the jumping does not change either the noise or anything that follows the noise.
- Operant behavior, on the other hand, 'operates' on the environment and can bring about change. When a baby cries he is given food. Therefore, he quickly learns that a particular behavior is followed by a desirable reward, and this increases the likelihood of the behavior being repeated in the future. A specific behavior brought about a change in the child's environment.
Skinner also stated that if a particular action is not followed by a desirable reinforcement then the likelihood of the behavior being repeated is decreased.
So, it can be assumed that if a child does not receive food when he cries, then he is not likely to cry when he is hungry.
According to Skinner, the consequences (or reinforcements) of a behavior have the power to modify the behavior and determine whether or not the action is repeated in the future. Behavior may remain consistent across different situations because the consequences are also the same.
For example, a child in most cultures learns very early that the word 'please' can lead to many wonderful consequences - 'Please may I have a cookie?', 'Please will you pick me up?', 'Please play with me.'
Skinner would say that the child is not necessarily being polite because it is the right thing to do, but because he is rewarded when he uses certain words.
If the word 'please' never elicited a desirable consequence then he would stop using the word, despite being told that it is always right to be polite. When the child goes to school, joins a sports team, or visits a friend's home, he also receives positive reinforcement for using 'please', so therefore the behavior established in the home is carried over to other situations.
A central aspect of Operant Conditioning is the idea of positive and negative reinforcement.
Positive and Negative Reinforcement Are More Effective at Controlling Behavior Than Punishment
Positive Reinforcement is a way of strengthening behavior by applying or giving a consequence.
Giving a child a sticker for cleaning their room, or dessert when they eat all their vegetables are ways of applying positive reinforcement.
Contrary to popular belief, Negative Reinforcement is not punishment, Skinner would say, but is actually a way of strengthening a behavior by removing or avoiding an undesirable consequence.
A child cries when his diaper is wet and his behavior results in his mother changing the diaper. The crying caused the cessation of discomfort (because wet diapers are not comfortable) so this is an example of negative reinforcement.
So, reinforcement involves something being either applied or removed, but both of these are very different from punishment.
While Skinner did discuss punishment as a means of conditioning, he argued that it is not an effective way of controlling behavior. He stated that punishment decreases the likelihood of a particular behavior reoccurring by:
- Applying an aversive stimulus (spanking a child for refusing to clean up his toys).
- Removing a desired stimulus (telling your child that he cannot have a cookie because he didn't eat his vegetables).
- Or failing to give a reward for an action (if you stop praising a child for making his bed, then it is likely that he will stop making his bed. The removal of a desired reinforcement will cause the behavior to become 'extinct').
Skinner believed that punishment, though it may sometimes seem effective, actually weakens behavior, and the goal of operant conditioning is to strengthen behavior.
Therefore, rather than making a behavior 'extinct' through punishment, people (parents, teachers, employers) should focus on strengthening desired behavior through positive or negative reinforcement.
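For readers who want to see the mechanics spelled out, here is a minimal sketch in Python. The function name, the step size, and the 0-to-1 'response strength' scale are all invented for illustration; it is only a toy way to contrast the operations described above, not a formula taken from Skinner's work.

```python
# Toy model: a single "response strength" between 0 and 1.
# The update rule and all numbers are invented for illustration only;
# they are not Skinner's formulas, just a way to contrast the operations.

def update(strength, consequence, step=0.1):
    """Nudge the response strength up or down depending on the consequence."""
    if consequence in ("positive_reinforcement", "negative_reinforcement"):
        strength += step      # both kinds of reinforcement strengthen behavior
    elif consequence in ("punishment", "extinction"):
        strength -= step      # punishment and withheld rewards weaken it
    return min(max(strength, 0.0), 1.0)   # keep the value within [0, 1]

strength = 0.5
for consequence in ["positive_reinforcement", "positive_reinforcement",
                    "extinction", "extinction", "extinction"]:
    strength = update(strength, consequence)
    print(f"{consequence:24s} -> strength {strength:.2f}")
```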
Skinner's Box: The Rat in the Box
Skinner designed an Operant Conditioning Box - also known as Skinner's Box - to prove his theory.
He placed a rat that had been deprived of food (so he was hungry) into an enclosed box.

At first, the rat would run around randomly, sniffing and exploring the new environment. He observed that, at this point, the rat was not responding to any particular stimuli, nor was he making any effective change on his environment.
The box was equipped with a lever that, when pressed, would dispense a food pellet into a feeding bowl or trough.
Eventually, the rat would discover this lever. Skinner argued that now the rat's behavior had in fact changed or 'operated' on his environment. At first the box contained nothing for the hungry rat to eat, but now it had a food pellet.
The food acted as a positive reinforcement for the behavior (pressing the lever) so the rat would continue to press the lever in order to get more food. NOW, the behavior had lost some of its randomness and had become 'conditioned'.
From these observations, Skinner concluded that the rat's behavior could be accurately predicted. He knew that when it was placed in the box, the animal would push the lever because he had been conditioned to understand that his action would result in a desirable consequence.
But, Skinner took this one step further. He learned that behavior could not only be controlled, it could also be changed. What would happen if the rat pushed the lever and no food was dispensed?
After several attempts with no reward, the rat eventually stopped pushing the lever. The behavior was 'extinguished'.
From these findings, Skinner theorized that human behavior isn't much different than that of rats. (Remember, behaviorists believe that you can learn anything you need to know about people by watching animals).
When a baby is born, his actions are quite random. He is simply exploring and discovering his environment.
However, from the moment of birth, parents begin reinforcing certain behaviors. When he cries, he is fed. When he is happy, someone will play with him. When he throws a toy, mommy takes it away for a little while. Whether through reinforcement (positive or negative), or punishment, certain behaviors are encouraged to reoccur while others are more likely to disappear because they have not been reinforced.
- 'Operant conditioning shapes behavior as a sculptor shapes a lump of clay' (Skinner, 1953, p. 91).
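The acquisition-and-extinction pattern described above can also be illustrated with a small simulation. The sketch below is purely hypothetical: the learning rule, the probabilities and the trial counts are invented for illustration and are not Skinner's actual procedure or data. A simulated rat presses the lever with a probability that rises while presses produce food and falls once the food stops.

```python
import random

# Toy Skinner-box simulation. The learning rule and all numbers are invented
# for illustration; they are not Skinner's actual procedure or results.

def run_trials(n_trials, food_available, press_prob, rate=0.1):
    """On each trial the 'rat' may press the lever; a reinforced press raises
    the press probability, an unrewarded press lowers it (extinction)."""
    for _ in range(n_trials):
        if random.random() < press_prob:             # the rat presses the lever
            if food_available:                       # the press is reinforced
                press_prob = min(1.0, press_prob + rate)
            else:                                    # the press goes unrewarded
                press_prob = max(0.0, press_prob - rate)
    return press_prob

random.seed(1)
p = 0.2                                              # initial, mostly random pressing
p = run_trials(50, food_available=True, press_prob=p)
print(f"after reinforcement: press probability ~ {p:.2f}")   # conditioned
p = run_trials(50, food_available=False, press_prob=p)
print(f"after extinction:    press probability ~ {p:.2f}")   # extinguished
```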
Schedules of Reinforcement
Although it was easy to achieve reliable data when the rats were confined to a box, Skinner realized that real life doesn't always work that way. Not every behavior is reinforced every time it occurs.
You go to work every day, but you only get a paycheck once a week. A child may not be rewarded every single time he behaves well. Skinner also noticed that the rats did not stop pressing the lever the moment it failed; in other words, their behavior didn't immediately change after the first time the lever didn't produce the desired results.
So, he began to investigate various schedules of reinforcement to see if behavior could be controlled even when it wasn't always followed by a desired consequence.
In the short video below, you can see how Skinner explains schedules of reinforcement in his experiments with pigeons:
Skinner studied four different types of schedules:
1) Fixed-interval schedule
Reinforcement was only given after a set period of time (one minute, five minutes etc.). It didn't matter how many times the rat pushed the lever, the food was only dropped after the fixed duration had lapsed.
This is very similar to someone who is paid a salary once a week, or a child who is promoted to the next level at the end of every school year.
Skinner found that shorter intervals were a greater predictor of recurring behavior. The longer the intervals, the lower the response rate.
In the beginning, a child is rewarded every time he uses the potty. Eventually, the interval could be increased to receiving a reward for going a whole day without an 'accident'.
But, the reinforcement schedule will become much less effective if the reward is only given for going an entire month without wetting his pants.
2) Fixed-Ratio Schedule
Reinforcement was only given after a set number of responses (for example, the rat must push the lever 10 times, or 25 times before food is dispensed). It didn't matter how long it took for the required number of responses to occur, the rat did not get food until it had met the requirement.
People who work at piece-rate jobs or on a commission basis will understand this type of schedule. The more the rat pushed the lever, the more food he would receive.
If the ratio was set to dispense a pellet every 10 responses, he would quickly learn that pushing the lever 30 times would get him more food than pushing it only 10 times. Many parents will give a child a sticker for each day that he behaves at school.
Once he receives 10 stickers (or 7, or 20, or whatever number is pre-set) he gets to go to the store and choose a special prize. It doesn't take long for most children to realize that the faster they get the stickers, the quicker they will get their prize, so they are less likely to have bad days and more likely to behave. They learn that having a 'bad' day prolongs the time between rewards.
3) Variable-Interval Schedule
The interval of time between reinforcement varied. The rat may have received a food pellet after 5 minutes one time but after 15 minutes the next.
Of course, there was an average interval of time so that the rat remained confident that the reward was coming, even though he didn't know when. This schedule may apply to someone who is self-employed or operates a home business.
In these cases, a pay schedule or work load could be variable depending on the demand at any given time.
4) Variable-Ratio Schedule
The number of responses required for reinforcement varied each time. The rat may have received a food pellet after 10 responses one time but not until after 20 responses the next.
Again, there was always an average ratio so that the rat knew that a reward would eventually arrive. Skinner compared this type of schedule to the features of a slot machine or roulette wheel.
Eventually the slot machine will dispense a reward, but the number of spins varies each time.
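In programming terms, the four schedules boil down to simple rules about when a response earns a reward. The sketch below is only an illustrative outline in Python; the class names, the numbers, and the use of a random exponential draw for the variable interval are assumptions made for this example, not a description of Skinner's apparatus. Each schedule decides, response by response, whether a reinforcer is delivered.

```python
import random

# Illustrative reinforcement schedules. The class names and numbers are
# invented for this sketch; they are not Skinner's experimental protocols.

class FixedRatio:
    def __init__(self, n):
        self.n, self.count = n, 0
    def respond(self):                      # reinforce every n-th response
        self.count += 1
        if self.count >= self.n:
            self.count = 0
            return True
        return False

class VariableRatio:
    def __init__(self, mean_n):
        self.mean_n = mean_n
    def respond(self):                      # reinforce roughly 1 in mean_n responses
        return random.random() < 1.0 / self.mean_n

class FixedInterval:
    def __init__(self, interval):
        self.interval, self.elapsed = interval, 0
    def respond(self, dt=1):                # reinforce the first response after the interval
        self.elapsed += dt
        if self.elapsed >= self.interval:
            self.elapsed = 0
            return True
        return False

class VariableInterval:
    def __init__(self, mean_interval):
        self.mean = mean_interval
        self._reset()
    def _reset(self):
        self.wait, self.elapsed = random.expovariate(1.0 / self.mean), 0
    def respond(self, dt=1):                # the required interval varies around a mean
        self.elapsed += dt
        if self.elapsed >= self.wait:
            self._reset()
            return True
        return False

# The variable-ratio schedule is the one Skinner compared to a slot machine:
# every response might pay off, but only an unpredictable fraction do.
random.seed(0)
vr = VariableRatio(mean_n=10)
payouts = sum(vr.respond() for _ in range(1000))
print(f"1000 responses earned about {payouts} reinforcements")
```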
Behavior Modification
- Increasing or Decreasing Behaviors
Skinner's theories provided the foundation for a therapeutic technique known as behavior modification. This treatment approach is applied to children with ADHD, people with Obsessive Compulsive disorders, delinquent teens, addicts, and just about every other situation where behavior needs to be changed.
Skinner believed that behavior could be modified by reinforcing (either positive or negative) desirable actions and ignoring (or refusing to reward) undesirable actions.
Again, punishment should be avoided, although the line between negative reinforcement and punishment can sometimes be blurred.
For example, if a child acts out because he wants his parents' attention, then some experts say that putting him in the 'time-out' chair is a form of negative reinforcement because it removes the reward he was looking for - parental attention.
Others believe that the time-out chair is a form of punishment and that the child's tantrum should simply be ignored (as long as no one is in danger) or the parent should walk away.
When the child calms down, he can be given the attention he desires. In this way, the child learns which behaviors earn him the consequence he is looking for without being made to feel that he was 'bad' (which is what the time out chair may imply).
Although behavior modification has taken on many forms throughout the years, it has been used effectively in homes, schools, health/medical institutions, and workplaces.
Scientific Criticism of Skinner
1) Humans Are Much More Complex Than Animals
From a scientific point of view, Skinner is often criticized for limiting his research to rats and pigeons.
Experts argue that human beings are much more complex and possess many characteristics not seen in laboratory animals.
Others take offence to the suggestion that human behavior can be reduced to such simplistic terms that it can be explained through the observation of rats and pigeons. For example, you can't explain why a groundhog burrows in the dirt by observing a kangaroo.
Similarly, you can't determine human behavior by studying rats.
2) You Can't Just Eliminate the 'Inner Man'
Some experts also criticize Skinner for his disregard of internal factors such as feelings, emotions, and cognitive processes.
To claim that what goes on inside a person is irrelevant actually fails to recognize the very things that make us unique individuals.
Some of his writings indicate that Skinner eventually did come to believe that internal events (physiological, mentalistic, psychological) do exist; however, he continued to ignore them because they could not be controlled, manipulated, or observed objectively.
He felt that they had no bearing on predicting and controlling behavior and would remain irrelevant until such a day when science discovered a means of manipulating and observing these processes in concrete terms.
To some, this seems a bit contradictory - yes, feelings, emotions, physiological and cognitive processes do exist, but because we cannot see them, manipulate them, or measure them objectively then they are useless?
3) We Are not Robots - Humans Have a Free Will
Perhaps the greatest criticism comes from those who believe that people are much more than empty slates, robots, or machines. They argue that free will definitely does exist, whether Skinner acknowledges it or not. In fact, some critics say his idea that everyone can be controlled or manipulated is rather scary.
- 'If Skinner's view of humanity were to become totally accepted, it would ease the way for a government to institute a society in which every aspect of behavior, from infancy on, would be controlled and directed' (Schultz, 1986, p. 372).
Books and Publications by B. F. Skinner
Throughout his career, Skinner published about 200 articles and 21 books. Although Behaviorism is no longer the prominent school of thought - having been replaced by Humanism - many of Skinner's ideas are still prevalent in the world today.
Walden Two (1948)
Walden Two was Skinner's only work of fiction and was the center of much controversy. It incorporated many of the same ideas included in the original Walden written by Henry David Thoreau.
Basically, the novel describes a utopian world where people lived without war, competition, or strife.
Skinner points out everything that is wrong with the present society and allows the reader to see what the world would be like if these flaws were corrected. He suggests that by changing the environment and eliminating political or economic factors, it would be possible to create a nearly perfect world.
Many claim that the book lacks a strong plot and is really quite bland; however, it is rather thought provoking and gives the reader a glimpse of alternate solutions to current socioeconomic problems.
Beyond Freedom and Dignity (1971)
Beyond Freedom and Dignity is a general discussion of Skinner's theories.
He explains why he rejects traditional psychology and claims that the focus should not be on changing people but rather on changing the environment in which the people function.
He definitely supports a Behaviorist approach to solving many of the problems that exist in society and believes that the idea of free will actually prevents science from being able to improve the world.
Basically, he states that only through the use of science, technology, and behaviorist principles can true freedom and dignity be achieved.
About Behaviorism
About Behaviorism is the best source of information if you are looking to understand the principles of Skinner's theories.
About Behaviorism outlines the fundamentals of the Behaviorist belief and describes the research and studies that led Skinner to make his conclusions regarding the conditioning of behavior.
- 'Self management is often represented as the direct manipulation of feelings and states of mind. A person is to change his mind, use his will power, stop feeling anxious, and love his enemies. What he actually does is change the world in which he lives' (from About Behaviorism, page 195).
Your Positive Parenting Ally,
Birgitte
An operant conditioning chamber (also known as the Skinner box) is a laboratory apparatus used to study animal behavior. The operant conditioning chamber was created by B. F. Skinner while he was a graduate student at Harvard University. It may have been inspired by Jerzy Konorski's studies. It is used to study both operant conditioning and classical conditioning.[1][2]
Skinner created the operant chamber as a variation of the puzzle box originally created by Edward Thorndike.[3]
History
In 1905, American psychologist Edward Thorndike proposed a 'law of effect', which formed the basis of operant conditioning. Thorndike conducted experiments to discover how cats learn new behaviors. His most famous work involved monitoring cats as they attempted to escape from puzzle boxes which trapped the animals until they moved a lever or performed some other action that triggered their release. He ran several trials with each animal and carefully recorded the time it took for them to perform the necessary actions to escape. Thorndike discovered that his cats seemed to learn from a trial-and-error process rather than insightful inspections of their environment. Learning happened when actions led to an effect, and this effect influenced whether the behavior would be repeated. Thorndike's 'law of effect' contained the core elements of what would become known as operant conditioning. About fifty years after Thorndike first described the law of effect, B. F. Skinner expanded upon his work. Skinner theorized that if a behavior is followed by a reward, that behavior is more likely to be repeated, but added that if it is followed by some sort of punishment, it is less likely to be repeated.
Purpose
An operant conditioning chamber permits experimenters to study behavior conditioning (training) by teaching a subject animal to perform certain actions (like pressing a lever) in response to specific stimuli, such as a light or sound signal. When the subject correctly performs the behavior, the chamber mechanism delivers food or other reward. In some cases, the mechanism delivers a punishment for incorrect or missing responses. For instance, to test how operant conditioning works for certain invertebrates, such as fruit flies, psychologists use a device known as a 'heat box'. Essentially this takes up the same form as the Skinner box, but the box is composed of two sides: one side that can undergo temperature change and the other that does not. As soon as the invertebrate crosses over to the side that can undergo a temperature change, the area is heated up. Eventually, the invertebrate will be conditioned to stay on the side that does not undergo a temperature change. This goes to the extent that even when the temperature is turned to its lowest point, the fruit fly will still refrain from approaching that area of the heat box.[4] These types of apparatuses allow experimenters to perform studies in conditioning and training through reward/punishment mechanisms.
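As a rough picture of that heat-box procedure, here is a hypothetical sketch in Python; the probabilities, the learning increment and the trial counts are invented for illustration and are not taken from the study cited above. A simulated fly chooses between the two sides, and being heated on the 'punished' side gradually biases its later choices toward the safe side.

```python
import random

# Toy place-avoidance ("heat box") sketch. All probabilities and the learning
# increment are invented for illustration; they are not taken from the
# invertebrate studies cited in this section.

def simulate(trials, avoid_bias=0.0, rate=0.05):
    """The fly repeatedly picks a side; entering the heated side is punished,
    which gradually biases future choices toward the safe side."""
    hot_entries = 0
    for _ in range(trials):
        p_safe = min(1.0, 0.5 + avoid_bias)      # chance of choosing the safe side
        if random.random() >= p_safe:            # the fly enters the heated side
            hot_entries += 1
            avoid_bias += rate                   # punishment strengthens avoidance
    return hot_entries, avoid_bias

random.seed(0)
early, bias = simulate(50)                       # naive fly
late, _ = simulate(50, avoid_bias=bias)          # same fly after conditioning
print(f"hot-side entries: first 50 trials = {early}, next 50 trials = {late}")
```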
Structure
The structure forming the shell of a chamber is a box large enough to easily accommodate the animal being used as a subject. (Commonly used model animals include rodents—usually lab rats—pigeons, and primates). It is often sound-proof and light-proof to avoid distracting stimuli.
Operant chambers have at least one operandum (or 'manipulandum'), and often two or more, that can automatically detect the occurrence of a behavioral response or action. Typical operanda for primates and rats are response levers; if the subject presses the lever, the opposite end moves and closes a switch that is monitored by a computer or other programmed device. Typical operanda for pigeons and other birds are response keys with a switch that closes if the bird pecks at the key with sufficient force. The other minimal requirement of a conditioning chamber is that it has a means of delivering a primary reinforcer (a reward, such as food, etc.) or unconditioned stimulus like food (usually pellets) or water. It can also register the delivery of a conditioned reinforcer, such as an LED signal (see Jackson & Hackenberg 1996 in the Journal of the Experimental Analysis of Behavior for example) as a 'token'.
Despite such a simple configuration (one operandum and one feeder) it is nevertheless possible to investigate a variety of psychological phenomena. Modern operant conditioning chambers typically have multiple operanda, such as several response levers, two or more feeders, and a variety of devices capable of generating different stimuli including lights, sounds, music, figures, and drawings. Some configurations use an LCD panel for the computer generation of a variety of visual stimuli.
Some operant chambers can also have electrified nets or floors so that shocks can be given to the animals, or lights of different colors that give information about when the food is available. Although the use of shock is not unheard of, approval may be needed in countries that regulate experimentation on animals.
Research impact
Operant conditioning chambers have become common in a variety of research disciplines, especially in animal learning. There are many applications for operant conditioning. For instance, the shaping of a child's behavior is influenced by the compliments, comments, approval, and disapproval of others. An important strength of operant conditioning is its ability to explain learning in real-life situations. From an early age, parents nurture their children's behavior by using rewards and by showing praise following an achievement (crawling or taking a first step), which reinforces such behavior. When a child misbehaves, punishments in the form of verbal discouragement or the removal of privileges are used to discourage them from repeating their actions. Another example of this behavior shaping can be seen in military training, where students are exposed to strict punishments and a continuous routine that shapes them into disciplined individuals. Skinner's theory of operant conditioning played a key role in helping psychologists to understand how behavior is learned. It explains why reinforcements can be used so effectively in the learning process, and how schedules of reinforcement can affect the outcome of conditioning.
Commercial applications
Slot machines and online games are sometimes cited[5] as examples of human devices that use sophisticated operant schedules of reinforcement to reward repetitive actions.[6]
Social networking services such as Google, Facebook and Twitter have been identified as using the techniques.[citation needed] Critics use terms such as Skinnerian marketing[7] for the way the companies use the ideas to keep users engaged and using the service.
Gamification, the technique of using game design elements in non-game contexts, has also been described as using operant conditioning and other behaviorist techniques to encourage desired user behaviors.[8]
Skinner box
Skinner is noted to have said that he did not want to be an eponym.[9] Further, he believed that Clark Hull and his Yale students coined the expression: Skinner stated he did not use the term himself, and went so far as to ask Howard Hunt to use 'lever box' instead of 'Skinner box' in a published document.[10]
References
- ^ Carlson, Neil R. (2009). Psychology: The Science of Behavior (4th ed.). Pearson Education Canada. p. 207. ISBN 978-0-205-64524-4.
- ^ Krebs, John R. (1983). 'Animal behaviour: From Skinner box to the field'. Nature. 304 (5922): 117. Bibcode:1983Natur.304..117K. doi:10.1038/304117a0. PMID 6866102. S2CID 5360836.
- ^ Schacter, Daniel L.; Gilbert, Daniel T.; Wegner, Daniel M.; Nock, Matthew K. (2014). 'B. F. Skinner: The Role of Reinforcement and Punishment'. Psychology (3rd ed.). Macmillan. pp. 278–80. ISBN 978-1-4641-5528-4.
- ^ Brembs, Björn (2003). 'Operant conditioning in invertebrates' (PDF). Current Opinion in Neurobiology. 13 (6): 710–717. doi:10.1016/j.conb.2003.10.002. PMID 14662373. S2CID 2385291.
- ^ Hopson, J. (April 2001). 'Behavioral game design'. Gamasutra. Retrieved 27 April 2019.
- ^ Coon, Dennis (2005). Psychology: A Modular Approach to Mind and Behavior. Thomson Wadsworth. pp. 278–279. ISBN 0-534-60593-1.
- ^ Davidow, Bill. 'Skinner Marketing: We're the Rats, and Facebook Likes Are the Reward'. The Atlantic. Retrieved 1 May 2015.
- ^ Thompson, Andrew (6 May 2015). 'Slot machines perfected addictive gaming. Now, tech wants their tricks'. The Verge.
- ^ Skinner, B. F. (1959). Cumulative Record (1999 definitive ed.). Cambridge, MA: B.F. Skinner Foundation. p. 620.
- ^ Skinner, B. F. (1983). A Matter of Consequences. New York, NY: Alfred A. Knopf, Inc. pp. 116, 164.
External links
- Wikimedia Commons has media related to Skinner boxes.
- From Pavlov to Skinner Box - background and experiment