Cool. At what point does the machine start learning? Before or after it gives the discrete binary outcome obtained by thresholding the sigmoid function’s output? Or is it “learning” how to calculate, even as it outputs results exactly as prescribed by the mathematical equations encoded in the algorithm? As it is a computer, I would assume it already “knows” how to calculate, so it would not need to “learn” this. So I guess the machine only really learns when the other important algorithms are added to the program, and it is through the interactions of these various algorithms that the learning starts to happen. I would assume these other algorithms are also statistical/mathematical techniques/equations. Therefore, it must be the unique interactions of these mathematical techniques/equations that allows our machines to learn. Of course, we also probably need to arrange said algorithms in an “artificial neural network” to make it look like we are doing something only humans and some non-human animals with (mostly) fully functioning nervous systems have ever been shown capable of: learning. Luckily, since we know exactly how the human brain works (we do not) and how it learns (we do not) and the structure/function of neural networks (these have not even been conclusively proven to exist, let alone have we understood how these theoretical networks facilitate learning), it is a straight shot to machines that learn from there. Simple, right? I can’t believe I was a skeptic.
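For what it’s worth, the distinction being poked at here can be made concrete. In a minimal toy sketch (my own illustration, not any particular library’s code), the “calculation” part is fixed arithmetic — a weighted sum squashed by the sigmoid — and the only thing that could charitably be called “learning” is a loop that nudges the weights by gradient descent:

```python
import math

def sigmoid(z):
    # Squashes any real number into (0, 1); the output is a probability,
    # not a discrete label -- the binary outcome comes from thresholding it.
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    # The "calculation" part: a weighted sum passed through the sigmoid.
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def train(data, lr=0.1, epochs=1000):
    # The "learning" part: repeatedly adjust the weights to shrink the gap
    # between predictions and labels (gradient descent on the log-loss).
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            err = predict(w, b, x) - y  # prediction error for this example
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Toy data: classify whether a single feature is positive.
data = [([-2.0], 0), ([-1.0], 0), ([1.0], 1), ([2.0], 1)]
w, b = train(data)
label = 1 if predict(w, b, [3.0]) >= 0.5 else 0  # threshold to get a binary class
```

Whether adjusting a handful of numbers in a loop deserves the word “learning” is, of course, exactly the question.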

Would it really be so difficult to just say “**Logistic regression is one of the most used algorithms in modern computing for binary classification**”? It conveys the exact same information, but in an accurate and non-misleading fashion. I know it is not nearly as sexy or cool or futuristic-sounding or whatever, but is it really so bad? If it makes you feel any better, you could keep using the pseudo-modifier “deep” as a consolation. It can do little harm since it has no real meaning anyway, so I have no problem if you want to call it deep modern computing. That still sounds pretty cool, right?