Type Error




I just read "The Limits to Growth". According to their reference model we hit peak humans around the middle of this century (give or take a little due to technological advance), with the death rate going exponential somewhat before that.

Why wasn't this covered in school? It seems like an important thing to be planning for.

Anyway, the book has an interesting example of a type error. Here is a characteristic quote:

"Man possesses, for a small moment in his history, the most powerful combination of knowledge, tools, and resources the world has ever known. He has all that is physically necessary to create a totally new form of human society--one that would be built to last for generations."

This is obvious nonsense in an otherwise good book, facilitated by an archaic use of "man" that has fortunately slipped out of common usage. Man, or mankind, or humanity, is of type class. An individual instance of mankind is of type Man. You cannot reason about instances of class the same way you reason about instances of Man.

$ python
>>> class Man: """ Instances of this class are homo sapiens. """
...
>>> man_instance = Man()
>>> isinstance(man_instance, Man)
True
>>> isinstance(Man, Man)
False

Instances of class do not have behaviours in the same way that instances of mankind do. They do not perceive problems and work to avoid them. Merely pointing out the problem is not sufficient to begin a process of avoiding it. We are going to slide right along the reference model into population overshoot. You should probably take steps to prepare for this. Those steps might by accident ameliorate the problem as a whole to a tiny extent, but they are just as likely by accident to make the problem worse. Certainly you could decide deliberately to attack the whole problem. That would be nice, it sounds like a fun hobby, I might even join you. Instances of mankind do stuff like that. But even then we would not be pursuing this hobby with the same fervour that the meaty extensions of mankind would, were it somehow rendered rational and self-interested.
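Continuing the toy example above (the perceive_problem method is my own illustrative addition, not anything from the book), the type error shows up directly if you ask the class itself to behave like one of its instances:

```python
class Mankind:
    """Instances are homo sapiens; the class is the species as a whole."""

    def perceive_problem(self):
        # an instance of mankind can notice a problem and act on it
        return "working to avoid it"

# an individual instance has the behaviour
one_of_us = Mankind()
print(one_of_us.perceive_problem())

# the class itself does not: calling the method on the class
# without an instance raises, fittingly, a TypeError
try:
    Mankind.perceive_problem()
except TypeError as exc:
    print("TypeError:", exc)
```

Ascribing the behaviour of instances to the class is exactly the mistake the quote makes, only in prose rather than Python.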



