Friday, February 26, 2016

Shocking News About B.F. Skinner

I always smile a little when I hear clicker trainers talk about the "science" of dog training.

The transparent message is that people without science degrees of any kind are now armed with the cloak of science because they went to PetCo and got a clicker, read some sensible training tips on a list-serv, and have a few cubes of cheese in hand.

Why do I smile? Simple. You see, to the extent there is a "science" to dog training (and I will let others debate the semantic edges there!), it was sparked by B.F. Skinner, who used "Skinner Boxes" to teach animals to press levers, guide bombs, play tic-tac-toe, and dance in circles.

Left out of the story, however, is the fact that Skinner boxes had electric floors and could administer mild electric shocks to rats, monkeys, and other animals inside. Please notice the power cord and the electric floor grid in the "Skinner Box" diagram at top.

Of course the fact that B.F. Skinner jolted animals with electricity is hardly surprising. After all, the three core parts of operant conditioning (which were well understood by circus trainers long before B.F. Skinner named them) are rewards to encourage behavior, doing absolutely nothing to extinguish behavior, and engaging in "punishment" to discourage behavior.

So, to put a point on it, if you insist on calling yourself a "scientific" dog trainer, be sure to show me where you plug in the electric grid, or how you administer your mild aversives.

Science is not philosophy -- it is the opposite of that.

Science, like Mother Nature, is not particularly soft. In fact, it is more likely to be red in tooth and claw than warm and fuzzy. Every dog comes with teeth to instruct. Not a one carries a clicker.


PipedreamFarm said...

So did Skinner use shock in the same way that gun dog trainers use shock collars? Meaning clicker and shock collar training both came from Skinner.

PBurns said...

Yes. A Skinner box (the device the Brelands used to train so many animals) was rigged for food and electric shock. You could get almost all behaviors expressed through food alone (question: is extreme hunger a cruel aversive?), but the combination of shock and food was faster and, in most cases, increased consistency, especially if food was no longer a strong trigger.

The great thing about food rewards, as opposed to shock, is that before shock collars came along you could give food rewards anywhere. In addition, when the first e-collars came along, they were not very sophisticated and could only give pretty strong shocks, with no tones or vibrations.

The main positive claim (a true one!) for food-based training is that it's harder for a really bad trainer -- one with no experience and no desire to read or gain any -- to screw it up too badly. An e-collar is like a shotgun or a pistol -- a very good tool in the right hands. A clicker is like a ham sandwich -- hard to hurt anyone with.

But, to come back to it, YES electric shock for training is what Skinner REALLY brought to the table. Food training has been around since Moses. See >>

Stoutheartedhounds said...

"You could get almost all behaviors expressed through food alone (Question: is extreme hunger a cruel aversive?)"

So, in reference to my previous question in the other thread, are you saying that R+ and P- CAN do everything without P+? Is there anything that R+ and P- cannot do? Is it really a simple issue of efficiency?

I am not a professional dog trainer, but I do use P+ and aversives to train my dogs. In some instances I will admit that I choose to use aversives because they work efficiently, but I do believe there are certain things that can only be taught with aversives (which I already mentioned in a previous thread).

SecondThoughtsOptional said...

Eating is existential. Animals will do just about anything to meet the existential need. However, once that need is met...

One of the not-so-great things about dolphins is that an obedient dolphin is one that's ten pounds of fish short. A dolphin plus ten pounds of fish is one that's going to blow you off, at least until it's hungry again. How do you keep one working in the field? With a muzzle, so it can't go off and catch its own fish. There's no question that when people interact with the dolphins, they do so nicely. But look at the big picture and you see a far less rosy situation.

Since we actually have to live with our dogs and can't keep them continually hungry (at least, I hope not) to ensure their cooperation, we need a bit more give and take.

Miss Margo said...

Skinner's research was truly top-notch. "'Superstition' in the Pigeon" is one of my favorite articles.

@SecondThoughtsOptional: It's the same with parrots as you say that it is with dolphins. To harness their full attention and motivate them, you have to delay feeding them half a day, and then give them treats for performance in the training session (of course, the birds are fed full meals afterward).

This man demonstrates (well-executed) clicker training with a hungry, socialized parrot in this video:

It's awesome; his bird can do about two dozen tricks now.

My Senegal HATES to have her nails clipped, but if I do it when she's hungry and give her a nut every other toe and immediately afterwards, she endures it and doesn't even hate me after it's over.

BTW Terrierman, I always appreciate the way you show compassion for captive parrots on your blog.

paulbarron said...

Skinner -- or, more precisely, a colleague of his, Nate Azrin -- studied punishment to establish its scientific efficacy. The major conclusion for Skinner was that punishment should be avoided where possible. He spent his whole career warning us all about the negative consequences of punishment. Yes, Skinner didn't discover behavior, just as Darwin didn't discover evolution; he set it out on a scientific basis. By the way, reward is not a Skinnerian concept -- reinforcement is -- and, with C. Ferster, the publication of Schedules of Reinforcement (1957) mapped out the rules of reinforcement. The work with Ferster represents possibly his most original lab work. No one prior to this had identified the effectiveness of particular schedules, or even identified that such concepts existed!!

With regard to your association of Skinner with punishment, you are very much misinformed!!!

PBurns said...

Nonsense, Paul Barron. Every Skinner box comes with an electric grid on the floor. Surely you know this? Do you deny it? Do you think I made that drawing myself?

As for Skinner and Azrin, they published papers on the difficulty of shocking pigeons due to their feathers and the construction of their feet, but they sure did figure out a way. See >>

The point of this post is not to defend shocking as a teaching technique -- it cannot teach very well, as I have noted before. A very mild electrical tap CAN, however, result in a higher compliance rate after a dog is taught, and that has been clearly demonstrated again and again. In addition, a very stout shock CAN be an absolute NO signal to an animal if it is very well timed and severe enough, and this too has been shown again and again.

Apparently you think Skinner's theory is just about rewards? It is not. There IS punishment in the mix. Surely you know this? Rewards alone are not the only driver in the real world, and though Skinner was wrong that animals are black boxes controlled by inputs alone, he was right that they ARE shaped by positive AND negative consequences, as anyone who has strung an electric fence around a dairy cow -- or called them in with a dinner bell -- can tell you.

I would agree that Skinner did not discover much. He did not discover reinforcement -- gamblers did that millennia before. He did not discover animal training -- that too was discovered millennia earlier. He did not discover shaping -- his assistants, the Brelands, did that. Skinner mostly devised a mechanical box that could operationalize reward and punishment so hundreds or even thousands of animals could be taught at the same time. That box -- the Skinner Box -- has an electrified floor. That is the point of this post. Shockingly (pun intended), you missed it. Go stand in a corner.