How Much Automation Is Too Much?
Now that automation has become a primary goal of enterprise IT — in particular, within IT operations — we should ask the question: How much automation is too much?
On June 1, 2009, Air France Flight 447 crashed into the Atlantic, killing all 228 passengers and crew.
After several years of analysis, investigators concluded that one of the contributing causes of the crash was too much automation. Flight controls of the Airbus A330 were so thoroughly automated that the flight crew had lost their edge. When presented with confusing information, they made poor decisions which resulted in the crash.
Flying airplanes is but one of many applications of automation. Now that automation has become a primary goal of enterprise IT — in particular, within IT operations (ops) — we should ask the question: How much automation is too much?
The Problem of Technology Dominance
Technology dominance refers to the outsized influence technology can exert over individuals, leading them to take a subservient position with respect to the technology and defer to it in their decision-making.
The long-recognized problem with technology dominance is deskilling. Deskilling is the phenomenon whereby people lose skills over time when they defer decision-making (and other intellectual processes) to some piece of technology.
Deskilling is ubiquitous across our technology-infused lives. Calculators cause people to lose arithmetic skills. (When was the last time you performed long division by hand, hmm?) Search engines cause people to forget facts those tools are so good at retrieving. Autonomous vehicles cause people to lose the ability to drive.
Deskilling also happens to organizations as they lose skilled people to attrition. Any employee uncomfortable with deferring decision-making to technology is likely to seek more challenging work elsewhere.
The Rise of Automation Bias
For knowledge work, automation is a particularly strong force of technology dominance, a form we call automation bias.
There are two sides to automation bias. The first is deskilling: when people have a skill for accomplishing a task manually, automating that task will cause those people to lose that skill over time.
The second side of automation bias is more pernicious: people entering a field where once-manual tasks are now automated never learn how to complete those tasks in the first place.
The most effective solution to the deskilling problem for automation bias is ongoing simulation training. Today’s airlines ensure that all pilots continue their simulation training throughout their careers so that they don’t lose skills to automation.
The second problem — in essence, the dumbing down of the workforce — is a bigger issue, especially in areas where expertise is required, including ops. How are organizations going to hire junior people and grow them into senior people if automation is depriving people of the opportunity to gain the expertise they need to advance in their careers?
Why Automation Bias Is Such a Problem
Automation bias is such a concern for ops organizations because no matter how good the automation is, there will always be a chance of an issue the automation cannot solve. If no person on the team has the expertise to solve it either, then the organization will be stuck.
A common but shortsighted response to this concern is to conclude that the organization simply needs better automation. Given that today’s automation is AI-driven and AI is still relatively immature, it’s reasonable to assume that automation technologies like AIOps will improve over time.
Perhaps, then, automation bias is simply a type of growing pain. As time goes on, the problem will resolve itself.
The problem with this argument is that automation is software, and software is never perfect. There will always be situations where the automation breaks. If the organization hasn’t already taken action to mitigate automation bias, then no one will be around who understands how the automation works well enough to fix it. Once again, the organization will be stuck with no solution.
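To make the point concrete, here is a minimal, hypothetical sketch in Python of how remediation automation typically behaves: it covers the failure modes its authors anticipated and hands everything else to a person. The alert names and functions below are purely illustrative, not drawn from any real AIOps product.

```python
# Hypothetical sketch: automated remediation only covers the failure modes
# its authors anticipated; anything else must escalate to a person.
# All alert names and functions here are illustrative.

def escalate_to_on_call_engineer(alert_type: str, host: str) -> None:
    # Stand-in for paging a human, which only helps if that human still has
    # the expertise to diagnose what the automation could not.
    print(f"Paging a human: unhandled alert '{alert_type}' on {host}")

KNOWN_REMEDIATIONS = {
    "disk_full": lambda host: print(f"Pruning old logs on {host}"),
    "service_down": lambda host: print(f"Restarting service on {host}"),
}

def remediate(alert_type: str, host: str) -> None:
    handler = KNOWN_REMEDIATIONS.get(alert_type)
    if handler is not None:
        handler(host)  # the happy path the automation was built for
    else:
        escalate_to_on_call_engineer(alert_type, host)

if __name__ == "__main__":
    remediate("disk_full", "web-01")           # handled automatically
    remediate("split_brain_cluster", "db-02")  # no handler: a person must step in
```

The escalation branch is the crux: if automation bias has hollowed out the team, there is no one competent standing behind it.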
The Two Types of Expertise
Simulation-based training may address many deskilling issues, but there is more to ops expertise than such training can deliver.
The reason is that in ops — as in other knowledge work that requires advanced levels of expertise — senior ops pros become experts at analogizing.
Analogizing refers to the ability to solve problems one has never encountered before by recognizing their similarity to problems one has solved in the past and adapting those earlier solutions to the problem at hand.
Analogizing is vitally important for solving the more difficult ops problems, including problems that automation cannot solve.
Given the complexity of modern production environments as well as the differences among them, having experts on staff who are not only up to speed on the specifics of their job but who can also analogize to solve the more challenging issues is essential to the smooth running of any ops organization.
Furthermore, analogizing is not a skill that people can effectively learn via training or simulation. It comes from experience — and the more diverse the experience, the better able the individual is to apply analogies to difficult problems.
How Much Automation Is the Optimal Amount?
For many forms of automation, deskilling isn’t a serious problem. Knowledge workers in general, including ops personnel, may face many routine, repeatable tasks in their day-to-day work that don’t require a level of skill that would cause an issue if that skill were lost. All such routine tasks are subject to automation without concern.
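As a concrete (and hypothetical) illustration of such a routine task, consider a simple log-cleanup job. The sketch below is in Python, and the directory path and retention window are illustrative values, not a recommendation.

```python
# Hypothetical example of a routine, low-stakes ops task that is safe to
# automate: pruning log files older than a retention window.
# The directory and retention period are illustrative values.

import time
from pathlib import Path

LOG_DIR = Path("/var/log/myapp")   # illustrative path
RETENTION_DAYS = 30                # illustrative retention window

def prune_old_logs(log_dir: Path = LOG_DIR, retention_days: int = RETENTION_DAYS) -> int:
    """Delete *.log files older than the retention window; return how many were removed."""
    if not log_dir.is_dir():
        return 0
    cutoff = time.time() - retention_days * 86400
    removed = 0
    for log_file in log_dir.glob("*.log"):
        if log_file.stat().st_mtime < cutoff:
            log_file.unlink()
            removed += 1
    return removed

if __name__ == "__main__":
    print(f"Removed {prune_old_logs()} stale log files")
```

Losing the ability to delete old files by hand costs an organization nothing; automating tasks at this level is pure upside.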
At the other extreme, organizations may aspire to "lights out" production environments, so fully automated that there’s no reason to keep the lights on, because there are no people on duty. Any organization with such a lights-out environment is likely to lose any staff who might be able to fix something when it goes wrong, whether through deskilling or attrition.
As AI-based automation becomes increasingly sophisticated, therefore, organizations will reach some optimal point where the advantages of automation sufficiently balance any disadvantages.
Finding this optimum depends upon the people involved — the skilled workers who must somehow accommodate automation in their day-to-day work.
Be sure to listen to the senior-level people who are adept at analogizing. They can solve problems that automation will never be able to solve. Do whatever it takes to retain them and keep them sufficiently challenged, both to maintain morale and to help them maintain their skills.
Remember as well that the only way to come up with senior people is to start with junior people and upskill them. Knowledge-based training, however, is insufficient because it doesn’t address automation bias.
Simulation-based training is more efficacious, and ironically, AI can drive simulation-based training technologies. But even this type of training isn’t enough.
No amount of training, either knowledge or simulation-based, can adequately teach analogizing. The best approach for building this skill is a combination of mentorship and experience — in other words, traditional apprenticeship.
Apprenticeship-based training, of course, has been around for millennia, long before automation became a reality.
Now that AI is driving a paradigm shift in how organizations find and leverage expertise, it’s essential to remember the lessons of the past — especially in the face of advancing automation.