Much as some engineers would like to ignore politics, it’s irresponsible to do so. I’m not saying you have to be a political junkie, but an awareness of the ultimate purposes and effects of the organisation you work for is part of being a responsible engineer. Writing in the Human Life Review, bioethicist Wesley J. Smith has issued a call to defeat technocracy before it takes over more of our lives than it has already. And every engineer should hear that call.
The meaning of the word “technocracy” has changed over the years. Engineer William Henry Smyth coined it back in 1919 to mean democracy as mediated through scientists and engineers. For a brief time in the 1930s it became a small political movement of its own, favoring the management of society by engineers and other experts rather than by democratic means. That latter sense, rule by experts, is the one usually meant today.
A pure technocracy would stand in radical contrast to a democracy, where ultimate authority resides in the people at large. In a technocracy, all important public (and many private) decisions are made only by experts, qualified by the usual professional credentials of education, licensing, or other signs of expertise. By implication, a technocrat is a sort of technically educated bureaucrat, one of those experts entrusted with power to do the right thing for the uneducated rabble, who aren’t smart enough to know what is good for them.
As you can tell, I’m no fan of technocracy, and neither is Wesley Smith. He sees technocracy as a primary threat in a spectrum of what are called “life issues”: abortion, euthanasia, healthcare rationing, and the destruction of the privilege of conscientious objection for medical workers. The problem stems from the stunted view of human life that technocracy tends to have. Because this stunted view seems to be all too common among engineers, I’ll pause to describe it in some detail.
Engineers are great at getting things done, but not so great at deciding what things need to be done. In a technocracy, certain fundamental assumptions are made without questioning or even thinking about them. For example, one such assumption is that increasing a country’s GDP (gross domestic product) is a good thing. So anything that contributes to the GDP is good, and anything else is bad.
That sounds nice, except what about all those useless old people in rest homes? They can’t contribute to the GDP — they’re too old to work. And what about children? Same deal — they are a drag on the economy, not a benefit. Even the worst technocrat can see, you would think, that if a culture quits having children, pretty soon there won’t be any culture to worry about. But it was technocrats who came up with the brilliant idea of China’s one-child policy, which threw a giant monkey wrench into that country’s demography and is threatening to wreck its future economy even now.
Engineers are great at coming up with more efficient ways to do things. And within the proper context, greater efficiency is indeed a worthwhile goal, if you’re talking about, say, energy efficiency or reducing a waste stream.
But when you try to apply that same attitude outside its proper sphere, you end up like Peter Singer, the Princeton philosopher whom Smith quotes as saying that the respect a person should receive depends on his or her “capacity for physical, social, and mental interaction with other beings.”
A person with such an attitude, which is entirely consistent with the goal of increasing a country’s GDP, will take a dim view of people with disabilities of any kind. Such a person will see nothing wrong with, for example, aborting any baby with Down’s syndrome, or with refusing health care to a useless old guy with only a high-school education who is probably going to die in a few more years anyway, in preference to a young, productive college grad.
Essential to doing engineering ethics right is the cultivation of one’s moral imagination. The easy way out is simply to do what you’re told, acting as a small cog in the large organisations that most engineers are a part of, and not asking about the wider implications or effects of one’s work. But it’s attitudes like that which have allowed China to start constructing their social-credit system, which Smith cites as an egregious example of technocracy gone bad.
For those unfamiliar with it, think of social credit as your credit score, only applied to your whole life instead of a narrow aspect of your financial behaviour. The Chinese government, with input from their technocrats, decides what kind of citizen they want. Then they set about giving points to people who behave the way they want, and taking away points from those who don’t.
Intrusive technologies such as GPS tracking, facial recognition, and, of course, informants are deployed to find out whom you associate with, what you do in your spare time, and what websites you visit. (Even here in the West, our technocratic watchers in the private sector have perfected the website part of the business.)
And for people who do things the government doesn’t like, such as going to church or meeting with other Christians or Muslims, a loss of social-credit points can mean restrictions on travel, loss of employment, or worse, not only for yourself but for your children as well, who had the bad judgment to be born to such ne’er-do-wells.
It can’t happen here, can it? Wesley Smith doesn’t seem to think so, at least not in the full-bore prison-camp variety, because the Constitution would prevent it. Well, I’m not so sure, because lately the Constitution has been found by its nine Supreme Court interpreters to say a lot of things that most people didn’t think it was saying.
But one thing is for sure: engineers are vitally necessary to any sort of technocracy, and if they individually and collectively refuse to destroy democracy and replace it with technocracy, that will be the end of it. Idealistic? Maybe. But the first step is to understand what technocracy is and what its goals are; only then can you decide to oppose it.
Karl D. Stephan received his B.S. in Engineering from the California Institute of Technology in 1976. This article has been republished with permission from the Engineering Ethics blog.