My favorite quote about networking comes from Jim Kurose: "The Internet works so well because it doesn't have to." IP and the lower layers of the Internet stack make no promises of delivery; complete failure still fulfills the protocol. This allows for simpler and more powerful protocols, free of the extra complexity needed to guarantee success. TCP achieves reliable delivery essentially by retrying the IP communication when it fails, and even TCP can report failure to the layers above.

We can say the same about modern artificial intelligence. Machine learning works so well because it doesn't have to. The softmax function that neural nets use to turn raw scores into output probabilities never completely rules out a possibility: every outcome gets at least some tiny nonzero probability. When a case is too difficult, a neural net spreads nontrivial probability across several possibilities, as I described in my recent post, where a machine learning model would generate a uniform…
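The softmax behavior described above is easy to see directly. Here is a minimal sketch (plain Python, no ML framework assumed) showing that even a wildly unlikely option keeps a strictly positive probability, and that equal scores yield the uniform spread mentioned above:

```python
import math

def softmax(logits):
    """Map raw scores to probabilities; every output is strictly positive."""
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# A hugely negative logit still gets a tiny nonzero probability:
probs = softmax([10.0, 0.0, -10.0])
print(all(p > 0 for p in probs))

# Equal logits produce a uniform distribution over the outputs:
print(softmax([0.0, 0.0, 0.0]))
```

Because `exp(x)` is positive for every finite `x`, no output can ever be exactly zero; the model hedges by design, much like best-effort delivery.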