Wednesday, March 24, 2010

Who Invented Animal Training?


Marian Breland and Keller Breland, March 15, 1955.

Who invented animal training?

The question is silly on its face. Animal training is older than the hills. For certain, it is as old as the dog.

That said, most of what we call animal training today is what the fancy talkers call "operant conditioning," a term coined by psychologist B.F. Skinner in the early 1930s in an effort to dress up an even older concept -- learning from consequences.

B.F. Skinner did not invent operant conditioning any more than Newton invented gravity.

That said, Skinner DID codify the basic principles of operant conditioning, and he did invent mechanical-based operant conditioning, i.e. the "Skinner box," a mechanical device that rewarded animals with food for pulling levers, pecking at spots, and performing other slightly more complicated learned behaviors.

Perhaps just as importantly, B.F. Skinner brought into the world of operant conditioning several people who helped shape the way we train animals today.

In fact, I do not think it is too much to say that two of his students -- Marian Ruth Kruse and Keller Breland -- invented modern animal training.

The story begins in the very early 1930s when B. F. Skinner was a researcher at Harvard working on something he called an "operant conditioning chamber" -- a device which measured the response of animals to stimuli. This later became the first "Skinner box" or animal teaching machine.

In 1936, Skinner left Harvard to teach at the University of Minnesota, where he began expanding on his earlier work.

In 1938, he took on his second student assistant, an 18-year-old by the name of Marian Ruth Kruse.

In 1940, Skinner added Keller Breland to his team of graduate student assistants.

Keller Breland and Marian Ruth Kruse fell in love, and in 1941 they were married.

Marian and Keller Breland learned the basics of operant conditioning from B.F. Skinner, and helped him to train thousands of rats and pigeons used in various experiments and projects.

One particularly important project began in 1941, when Skinner and his assistants were hired by the U.S. Navy to see if pigeons could be trained to guide bombs to their targets.

While pigeon-guided bombs never made it to the battlefield, the operant conditioning techniques learned during this period of stable Navy funding suggested to Marian and Keller Breland a possible business opportunity.

Was there a market for trained animals? They thought there might be.

When, in 1945, B.F. Skinner was lured away from Minnesota to teach at Indiana University, the Brelands decided to see if they could make a go of it on their own as commercial animal trainers and contract researchers and consultants.

It seemed an unlikely way to make a fortune.

Skinner had mostly trained rats and pigeons. If the Brelands were to support themselves as animal trainers, however, they would have to train higher animals than that!

Yes, the basic elements of operant conditioning had been used, off and on, and in a largely chaotic way, to train many species around the world for thousands of years. But most of this real-world experience was now lost to time and was little more than rumor or anecdote.

Could the basics of rat and pigeon training be scaled up and used across a wide variety of species? And if it could, would there be a market for such a thing? Would it be a large enough market to put food on the table, and gas in the car?

No one knew, least of all the Brelands. They took the plunge, nonetheless, buying a small farm in Mound, Minnesota and forming a company they called Animal Behavior Enterprises (ABE).

ABE's goals were three-fold and reflected their slightly tenuous business plan: to produce trained animals in profusion for an unknown commercial market, to engage in contract research if they could find anyone willing to underwrite that, and to consult on operant conditioning if they could find anyone willing to pay for their advice.

To say bravery was involved in this economic venture is an understatement. To those on the outside, including family and friends, it seemed sheer madness.

The Brelands had a secret, however: they were pretty good animal trainers, and they also had a growing body of evidence that suggested operant conditioning was a very robust training methodology.

In 1943, the Brelands, working with B.F. Skinner, had discovered the power of shaping behaviors by using a simple hand-held food-delivery switch. Now, instead of being rewarded only for actually completing the task, an animal could be rewarded for "approximating" the task -- a behavior that could then be "shaped," by degrees, into the actual desired behavior.

A simple hand-held food delivery switch was, in effect, the first massive leap forward beyond the Skinner Box.

By 1945, the Brelands had gone even further. The mechanical construction of Skinner boxes had led the Brelands to a new idea: that small noises, such as those produced by the mechanical apparatus inside a Skinner box, or the noise made by a hand-held switch, might be an important part of the training process itself.

Experimenting with this idea, Keller and Marian Breland discovered that an acoustic secondary reinforcer, such as a click or whistle, could communicate to an animal precisely which action was earning the food reward.

Keller and Marian called this a "bridging stimulus," and found it dramatically sped up animal training by increasing the amount of information going to an animal. Most importantly of all, it seemed to work well with all animals. Important stuff!

In 1946, Animal Behavior Enterprises got its first animal training contract with General Mills. The assignment was to train farm animals to appear in feed advertisements.

This first successful contract led to more contracts, first for in-store promotional animals, and then for animals to be used in movies, circuses, museums, and zoos.

In addition to providing trained animals, the Brelands were also asked to train workers and producers in how to work with those animals when they were sent on location.

From the beginning, "training the trainers" became an adjunct business to providing the trained animals themselves.

While the Brelands had worked almost exclusively with rats, pigeons and chickens when employed by Skinner, they now found themselves training everything: dogs, cats, pigs, cattle, chickens, goats, sheep, raccoons, rabbits, ducks, parrots, ravens, deer, and monkeys.

At one point, the Brelands had more than 1,000 animals under training at a single time. Over the course of a lifetime, scores of thousands of animals, representing more than 140 species, were trained by the Brelands.

Of course, it did not take too long for the Brelands to outgrow their small Minnesota farm, and it took even less time for them to realize that long, cold Minnesota winters were not too conducive to animal training outside of a laboratory setting.

In 1951 the Brelands moved to Hot Springs, Arkansas, a central location well-served by the railroads, where land was cheap and the weather was not too bad.

There they continued to train animals and animal trainers, and they also started a cash concern they called the "IQ Zoo," which featured various animals doing amusing tricks, from basketball-playing raccoons and drum-playing ducks, to a printing press operated by reindeer and a chicken that would take on all comers in games of tic-tac-toe.

Though Keller and Marian Breland were equals at ABE, their division of labor suited their personalities and the flavor of the times.

Keller was the public face who traveled and did most of the show presentations and who promoted and expanded on the theory, while Marian was the engineer who made sure everything ran like a clock and actually operationalized everything at the level of fur, fin and feather.

Throughout the 1950s and '60s, business was booming, with the Brelands signing contracts with Marineland of Florida, Parrot Jungle, and Six Flags.

In 1955, the Brelands produced the first trained dolphin show at Marine Studios in St. Augustine, Florida, and in 1957 they produced the first trained-whale shows at Marine Studios in Florida, and Marineland of the Pacific in Palos Verdes, California.

The work of the Brelands did not go unnoticed. Not only did the Brelands train other animal trainers who went on to places like Busch Gardens, Disney World, and Sea World, but they were also contracted by the U.S. Navy to see if dolphins could be trained to do surveillance and salvage work.

It was during this time that Keller and Marian Breland met Bob Bailey, who was the first Director of Training for the U.S. Navy Marine Mammal Program.

In 1965, Keller Breland died of a heart attack, leaving Marian with three semi-grown children, and Bob Bailey stepped up as General Manager of ABE.

Marian and Bob continued on with ABE, signing a contract with the U.S. Navy to manage their Marine Mammal Facility in Key West, Florida from 1967 to 1969.

Along the way love blossomed between Bob Bailey and Marian Breland, and they married in 1976, adding Bob's six young children (three sets of twins!) to the now rapidly growing family.

In the 1980s, the Baileys began to phase out the commercial subdivisions of Animal Behavior Enterprises in order to simplify their life and devote more time to teaching. After a 1989 fire destroyed a lifetime of research, including thousands of hours of historical film, the Baileys decided to close ABE for good.

Marian Breland Bailey died on September 25, 2001, in Hot Springs, Arkansas, and her ashes were taken to Bush Key, seventy miles west of Key West, Florida, where she had spent so much time training dolphins.

Bob Bailey continues to train teachers in the basics of operant conditioning, and his own contribution to animal and human training will be featured in a later post.

Suffice it to say that if you have heard of clicker training, it's due in no small part to Bob Bailey, whether you know that or not!

And if you have ever trained an animal in the last 40 years, you have stood on the shoulders of Marian and Keller Breland and Bob Bailey, whether you know that or not.

When the history of animal training is written, let it be said that these three remarkable individuals invented or perfected so much of what we take for granted today.

12 comments:

Mongoose said...

Interesting article. I think though, operant conditioning is probably why we now have so many dog whisperers, horse whisperers, baby whisperers, etc. Because operant conditioning ignores altogether the motivations behind the creatures' behaviours, and the relationships between members of a pack / household / herd. People who think that behaviour modification is all there is to getting others to do what you want are missing out on the concepts of motivation and relationship, and that makes them, on average, lousy trainers. In my opinion.

Heather Houlahan said...

But you forget what may be the most important chapter of this story, the Brelands' paper "The Misbehavior of Organisms."

Which could have been subtitled "When Skinnerian ideology gets bitch-slapped by biology."

Easily found online.

Donald McCaig said...

Wonderful History. I'm a little surprised you didn't mention "The Misbehavior of Organisms," the Brelands' paper that destroyed the intellectual foundation of Skinner's theory.

HTTrainer said...

There are so many different words used when training an animal, each used by adherents of different theories. Ed Bailey has an interesting take on terminology in the March issue of "Gun Dog Magazine".
We use many words to define our camp or niche in the training world; these words label us and the methods we choose. We can't fix problems if we're babbling.

PBurns said...

Man you people take all the fun out of life. YES, I am going to talk about "The Misbehavior of Organisms" -- the next post. The one that the pure click and treat folks don't want to talk about because it's inconvenient.

The good news, as far as most dog trainers are concerned, is that most pet dogs are not too big on instinct. If you look at "problem dogs," however, you find they tend to be game-bred or field-bred -- terriers digging, Corgis and Shelties barking and herding, etc.

P

Unknown said...

Well, I'm not a troll, and I did read the article, so I guess it is OK if I say something. Patrick summarizes pretty well, but, as might be expected, there are a few inaccuracies.
What is VERY correct, in my opinion, is that animal training has been around for thousands of years, and that some animal trainers of old did pretty well.

About the Brelands and clickers (they called them crickets), they were using a clicker long before the D-Day invasion. Keller made them himself, and you can see the wood-metal homemade devices in the video PATIENT LIKE THE CHIPMUNKS (training birds in 1943-44). Next, it is true that Skinner did not "invent" operant conditioning, any more than Newton invented celestial mechanics. But Skinner did discover operant conditioning much like Newton (co)discovered calculus and described the motion of our solar system - all incredibly important events in human history.

We never worked for Sea World; Sea World management was too smart for that. They simply hired ABE's Training Director (Kent Burgess), tripling his salary. The Brelands wished him Godspeed! The rest is history. When you see a Sea World show you are viewing a direct descendant of ABE technology.

Pointed out correctly that Marian was left with 3 children, but they were mostly grown. I, on the other hand, had 6 kids (3 sets of twins), all very young. Now, Marian was the smartest woman I've ever met, but, on occasion, I have questioned her sanity for buying into my household.

But, why quibble about minor points. Skinner started it all by subjecting behavioral responses to objective data, and drawing conclusions, not all of which were correct. However, the Brelands' paper MISBEHAVIOR OF ORGANISMS did not so much detract from operant conditioning as add to its potential.

Thank you for the very kind words about my very, very close colleagues who taught me so much. It was they who started so much, and I so little.

Respectfully,

Bob Bailey

PBurns said...

Thanks, Bob -- I am editing this a bit as I said I would. Nothing wrong with actually getting it right for once, LOL!

P

Unknown said...

Patrick,
Thanks for this wonderful summarized history, and how nice to read a comment from my new hero, Bob Bailey!
Looking forward to your next one...
Melissa

Heather Houlahan said...

Skinner did discover operant conditioning much like Newton (co)discovered calculus

And Thorndike was doing what exactly?

PBurns said...

See >> http://terriermandotcom.blogspot.com/2004/10/short-history-of-dog-training.html

As I note, "operant conditioning" is older than the pyramids and is at least as old as dogs.

The idea that "consequences lead to learning" is the core of operant conditioning and much older than Skinner, Thorndike, the Bible, or even the development of fire.

So far as I know, Thorndike's work does not mention a lot of the things Skinner did. For example, I do not think Thorndike talks about extinguishing behavior (one of the three legs of operant conditioning). Nor, do I believe, is there a mention of variable rewards as a motivator. Thorndike's core thesis was that practice makes perfect, and that rewards and aversives (including something Thorndike called "readiness") are the consequences that shape learning.

Thorndike mostly worked with humans (he invented a type of IQ test), but his collected work on animals can be read here >> http://psychclassics.yorku.ca/Thorndike/Animal/

Thorndike is, so far as I know, one of the first people to actually collect and track data and do real (albeit primitive) learning experiments on animals. I suppose one could argue that his "cat box" was a kind of primitive Skinner box, though I think that dramatically understates the difference between Thorndike's very simple and primitive stuff and Skinner's much more sophisticated work.

A final note: Thorndike is almost unreadable, in part because his theories were a bit fuzzy, and in part because his writing was quite furry. Skinner, on the other hand, was so clear on core concepts that he at times over-simplified, and he was a good enough writer that he produced a book (Walden Two) that became a best seller. In short, Skinner was a motorcycle compared to Thorndike the tricycle. Keller and Marian Breland took Skinner's motorcycle and added a sidecar, and then enclosed the sidecar and bike, inventing the first four-wheel drive cross-country vehicle that could take you almost anywhere while keeping you dry the whole time.

Chase it around the room, and I think the first REAL animal trainers were Keller and Marian Breland. They did not work with a few animals for a little while in a lab, but with scores of thousands of animals, and at least 140 species, over more than 50 years in the real world. They are the Henry Fords of animal training, not only mass producing it, but also creating (or co-creating) such core elements as shaping and bridging. No, they did not invent the wheel, but they sure as heck invented the modern car.

P.

Mongoose said...

I guess it depends what you call "training". Teaching tricks is one thing, socializing a dog to live with humans is another.

Seahorse said...

"Practice makes perfect" always makes me cringe a little. The fuller truth is "Practice makes permanent". One had better be crystal clear about the result they desire and exactly how to get there because practice will indeed make whatever is happening pretty solid.

Seahorse