It is but one example of what I call the compulogical fallacy: the logical contradiction that arises when we apply characteristics, behaviors, attributes, skills, or abilities to machines and computers that can only rightly be applied to whole (mostly intact) human beings and some (non-human) animals. The term “machine learning” is one of the most oft-cited (by me) examples of this fallacy. The two words, each by its very definition, when combined in that order produce a term that is a logical contradiction and name something that is logically impossible: a learning machine. A machine cannot learn, for if it did it would no longer be a machine.

Even though it is a nonsense term, and thus absurd, computer “scientists” and technology-hype practitioners have continued its use unabated. Apparently they believe that violating a law of logic recognized as such since virtually the dawn of human consciousness is justified by the fact that everybody else is doing it, and nobody gives a crap about what words mean, or about logic. It doesn’t hurt that it sounds sexy and cool, and it no doubt pads the ol’ pocketbook when it is paycheck-cashing and/or grant-writing time. Slap the word “deep” in front of it and you have yourself a virtual gold mine. Literally: it is virtual, not real, not a real thing, because machines cannot learn. They are not capable of “supervised” learning, or “unsupervised” learning, or “reinforcement” learning. They are not capable of learning numbers, or letters, or words, or concepts, or definitions, or even algorithms.

You know what all the entries in any list of the most common machine learning algorithms have in common? They are all just plain old algorithms, many of which have been around since olden times, back when machines were not learning, which also happens to describe every day since then, including today. Confusing, isn’t it?
Programming a machine/computer with one of them, or even all of them, does not magically give said machine the ability to learn; it gives said machine the ability to calculate/compute using said algorithms. I am not sure whether this is a useful thing to know about machine learning, but I figured you should probably know it.
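To make the point concrete, here is a minimal sketch (my own illustrative example, not drawn from any particular library) of one of those “learning” algorithms stripped to what it actually is: ordinary least squares line fitting, a formula dating back to Legendre and Gauss around 1805. The machine does not learn anything; it evaluates two closed-form arithmetic expressions on the numbers it is given.

```python
# Ordinary least squares for a line y = m*x + b, written out as plain
# arithmetic. There is no "learning" step, only deterministic calculation:
# the same inputs always produce the same slope and intercept.

def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    # Intercept: forces the fitted line through the mean point.
    b = mean_y - m * mean_x
    return m, b

# Points lying exactly on y = 2x + 1; the arithmetic recovers m=2, b=1.
m, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(m, b)  # 2.0 1.0
```

Run it a thousand times and it “learns” nothing new, because there is nothing to learn: it is a calculation, which is exactly the author’s point.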