I always smile a little when I hear clicker trainers talk about the "science" of dog training.
The transparent message is that people without science degrees of any kind are now armed with the cloak of science because they went to PetCo and got a clicker, read some semi-sensible training tips on a listserv, and have a few cubes of cheese in hand.
Why do I smile? Simple. You see, to the extent there is a "science" to dog training (and I will let others debate the semantic edges there!), it was sparked by B.F. Skinner, who used "Skinner Boxes" to teach animals to press levers, guide bombs, play tic-tac-toe, and dance in circles.
Left out of the story, however, is the fact that Skinner boxes had electric floors and could administer mild electric shocks to rats, monkeys, and other animals inside. Please notice the power cord and the electric floor grid in the "Skinner Box" diagram at top.
Of course the fact that B.F. Skinner jolted animals with electricity is hardly surprising. After all, the three core parts of operant conditioning (which were well understood by circus trainers long before B.F. Skinner named them) are rewards to encourage behavior, doing absolutely nothing to extinguish behavior, and engaging in "punishment" to discourage behavior.
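Those three mechanisms can be sketched as a toy model. This is purely my own illustration, not anything from Skinner or the original text: it treats each mechanism as a nudge to the probability that a behavior is repeated, and the update rules and step sizes are arbitrary assumptions chosen only to show the direction and relative strength of each effect.

```python
# Toy model of the three operant mechanisms (illustrative only).
# p is the probability the animal repeats a behavior; all step sizes
# are arbitrary assumptions, not measured values.

def update(p, consequence, step=0.1):
    """Nudge repeat-probability p according to the trial's consequence."""
    if consequence == "reward":       # reinforcement: more likely next time
        return p + step * (1.0 - p)
    if consequence == "nothing":      # extinction: slow decay, nothing pays off
        return p * (1.0 - step / 2)
    if consequence == "punishment":   # aversive: sharply less likely
        return p * (1.0 - 2 * step)
    return p

# Start three identical behaviors at a coin-flip and apply each
# consequence for ten trials:
p_reward = p_extinct = p_punish = 0.5
for _ in range(10):
    p_reward = update(p_reward, "reward")        # climbs toward certainty
    p_extinct = update(p_extinct, "nothing")     # fades slowly
    p_punish = update(p_punish, "punishment")    # collapses quickly

print(round(p_reward, 3), round(p_extinct, 3), round(p_punish, 3))
```

The point of the sketch is only the asymmetry the paragraph describes: reward builds behavior, doing nothing erodes it slowly, and an aversive suppresses it fast.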
So, to put a point on it, if you insist on calling yourself a "scientific" dog trainer, be sure to show me where you plug in the electric grid, or how you administer your mild aversives.
Science is not philosophy -- it is the opposite of that.
Science, like Mother Nature, is not particularly soft. In fact, it is more likely to be red in tooth and claw than warm and fuzzy. Every dog comes with teeth to instruct. Not a one carries a clicker.
As for Skinner and the Skinner Box, has no one else noticed that he never trained predators, and he never worked with animals in an open-field situation?
And why not?
Simple: predators have strong prey drives, and rewards-only training does not work very well to stop prey drive. In addition, rewards-only training is subject to sudden failure in an open-field situation, where stimulation and distraction can come from any and every direction.
And so Skinner tended to focus on pigeons and chickens, and the occasional rat, and he spent almost all of his energy getting them to do tricks for food, rather than getting them to stop naturally self-rewarding behavior using aversives.
One reason Skinner did so little with aversives is that birds are not easily shocked through their feet, and body feathers prevent direct skin contact with metal elsewhere. Unable to easily jolt a bird with an electric current through its toes, and using only animal subjects with very low prey drive, Skinner generalized a theory of learning that works well for training tricks in a sensory deprivation chamber, but which too often fails in the real world.
The simple truth is that getting a prey animal in a closed and captive setting to do an unnatural behavior without distraction is almost the exact opposite of getting a predator, like a dog, to stop doing a self-rewarding behavior in the home, yard, or field. Any wonder, then, that rewards-based training, built on Skinnerian theories, so often fails outside of the trick training arena?
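The "self-rewarding" problem above can be made concrete with a small sketch. This is my own toy illustration, not the author's model: it assumes a behavior like chasing pays off by itself on every trial, so the trainer "doing nothing" never produces extinction, while an added aversive actually pushes the probability back down. All numbers are arbitrary assumptions.

```python
# Toy sketch (illustrative assumptions only): a self-rewarding behavior
# reinforces itself each trial, so withholding trainer rewards changes
# nothing -- the trial is still a net reward.

def reinforce(p, step=0.1):
    """Behavior pays off; it becomes more likely."""
    return p + step * (1.0 - p)

def punish(p, step=0.2):
    """An aversive follows; the behavior becomes less likely."""
    return p * (1.0 - step)

# "Rewards-only" plan: the trainer withholds treats, but the chase
# itself is the reward, so every trial reinforces the behavior.
p_ignore = 0.5
for _ in range(10):
    p_ignore = reinforce(p_ignore)

# Same intrinsic payoff, but an aversive on each trial counteracts it.
p_aversive = 0.5
for _ in range(10):
    p_aversive = punish(reinforce(p_aversive))

print(round(p_ignore, 3), round(p_aversive, 3))
```

Under these made-up numbers the ignored behavior climbs anyway while the corrected one falls, which is the asymmetry the paragraph is pointing at: in the lab the experimenter controls the only reward on offer; in the field the behavior brings its own.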