IT Automation: A Cognitive Perspective
Recently I stumbled upon the book Things That Make Us Smart by one of my favorite authors, Donald Norman. In the book, he shares many insights on complex human-machine interactions, “arguing for the development of machines that fit our minds, rather than minds that must conform to the machine.” By the way, I highly recommend another of his books, The Design of Everyday Things.
Because I had just blogged about IT automation, I still had the topic on my mind. So as I read the book, I did quite a lot of reflective thinking around IT automation. In general, I find I get more from a book when I read it with a purpose than otherwise. This time was no exception.
Here are several quotations from the book that helped me, and that may help you as well:
Efficient, routine operations are fine for machines, not for humans. The body wears out – “repetitive stress syndrome,” we call it today. Just as the body can wear out, so too can the mind – the syndrome called “burnout” – wearing out the ability to create, to innovate, or even simply to care about the work being produced. A worn-out mind leads to a demoralized worker, to someone who no longer cares about the job and who is apt to leave.
This is an excellent description of the negative effect of having humans do work that should really be automated.
In aviation, airplanes are flown more and more by automated controls. Did the designers do a careful analysis of the task faced by a pilot and decide which were best done by people, which were in need of some machine assistance? Of course not. Instead, as usual, the parts that could be automated were, leftovers were given to the humans.
A great reminder for us to carefully evaluate what should be automated and what should not.
Worse, the automation works best when conditions are normal [Wiener, 1988; Wiener and Curry, 1980]. When conditions become difficult – say, there is a storm and an engine, a radio, an electrical generator fails – then the automation is also likely to fail. In other words, the automation takes over when it is least needed, gives up when it is most needed. When the machine fails, often with no advance warning, people are suddenly thrust into the process, suddenly asked to figure out the current state of the system, what has gone wrong, and what should be done.
The good news is that the workflows best suited for automation are exactly the normal and routine ones.