I recently posted an article arguing that pure logic is the key to understanding all programming languages (Logic is the Key). That led one astute reader to ask a question about the limits of logic and programming:
Q: Might there be any limitations to what can be programmed that we otherwise would rely on our senses for?
The short answer:
All that we sense is interpreted by our brain before we are even aware of it. In some cases we automatically start responding to that input before it reaches our conscious awareness. So, yes, there are limitations in at least that ‘use case’. Having said that, hearing aids can do an amazing job of compensating for hearing loss, early attempts have been made to feed an image from a camera straight to the brain of a blind person, and some prosthetic hands are gaining a sense of touch.
The long answer:
Let’s consider the case where we need a machine to operate in an environment apart from us (whether on a tether or autonomously). Further, let’s consider each of our five main senses: sight, hearing, touch, taste, and smell. Note: this is more of a thought experiment than a review of the current state of the art.
Sight
Of all the senses, sight is probably one of the easiest to mimic. We have cameras under the sea and in space, even on other planets. There are cameras small enough to slip into our blood vessels to look for blockages or to guide a surgeon removing a growth on an organ. Autonomous vehicles use cameras to watch for hazards on the road. Night-vision goggles let us see in the dark, and thermal imagers show warm bodies against a cooler background.
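To make that concrete, here is a minimal sketch of machine ‘sight’: flagging motion in a camera feed by differencing consecutive frames. It assumes the OpenCV library and a camera at index 0; the thresholds are illustrative, not tuned for any real hazard.

```python
# A minimal sketch of machine "sight": flagging motion by differencing
# consecutive camera frames. Assumes OpenCV (pip install opencv-python)
# and a camera at index 0; thresholds are illustrative, not tuned.
import cv2

cap = cv2.VideoCapture(0)                      # open the default camera
ok, prev = cap.read()                          # grab a first reference frame
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

for _ in range(300):                           # watch ~10 seconds at 30 fps
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)             # per-pixel change since last frame
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 5000:          # enough pixels changed?
        print("movement detected")             # a very crude hazard alert
    prev = gray

cap.release()
```

Frame differencing is about the simplest trick there is; real autonomous vehicles layer object detection, depth estimation, and sensor fusion on top of far richer input.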
Hearing
As mentioned earlier, hearing aids can compensate for hearing loss. Siri and Alexa can answer our spoken questions. An alarm system can detect the sound of glass breaking and alert a guard. Our phones can listen to spoken words and play back a translation in a different language. But some ear issues are currently beyond our ability to repair.
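In the same spirit, here is a rough sketch of the glass-break idea: scanning a recording for a sudden loud transient. The file name ‘room.wav’ is hypothetical, and a real detector would also examine frequency content, which this skips.

```python
# A minimal sketch of machine "hearing": scanning a recording for the
# sudden loud transient a breaking pane might produce. Assumes a 16-bit
# mono WAV file named 'room.wav' (hypothetical); real glass-break
# detectors also check frequency content, which this skips.
import wave
import numpy as np

with wave.open("room.wav", "rb") as wav:
    rate = wav.getframerate()
    samples = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)

window = rate // 50                      # 20 ms analysis windows
for start in range(0, len(samples) - window, window):
    chunk = samples[start:start + window].astype(np.float64)
    rms = np.sqrt(np.mean(chunk ** 2))   # loudness of this window
    if rms > 20000:                      # near full scale for 16-bit audio
        print(f"Loud transient at {start / rate:.2f}s -- alert the guard")
```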
Touch
Touch, to the degree of sensitivity and discrimination we enjoy, is very hard to mimic and faithfully pass on to us. We have systems that use resistance to movement as feedback, but that isn’t quite the same thing. There are pressure switches that provide a crude sense of touch, but still nothing like what we have.
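The pressure-switch approach really is as blunt as it sounds. Here is a hedged sketch, assuming a force-sensitive resistor read through an ADC; read_adc below is a hypothetical stand-in for whatever driver your board provides.

```python
# A minimal sketch of machine "touch": polling a force-sensitive resistor
# through an ADC and reporting contact. read_adc() is a hypothetical
# stand-in for whatever ADC driver your board provides (e.g. an MCP3008
# library); here it is simulated with random noise so the sketch runs.
import random
import time

def read_adc(channel: int) -> int:
    """Hypothetical 10-bit ADC read (0-1023), simulated with noise."""
    return random.randint(0, 1023)

CONTACT_THRESHOLD = 600        # illustrative; calibrate per sensor

for _ in range(100):           # poll for ~5 seconds
    reading = read_adc(0)
    if reading > CONTACT_THRESHOLD:
        # One threshold yields a yes/no 'touch' -- nothing like the graded
        # pressure and texture our own skin reports.
        print(f"contact detected (raw={reading})")
    time.sleep(0.05)           # 20 Hz polling
```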
Taste & Smell
Because these two senses are closely related in us, I’m going to lump them together. Mimicking taste or smell with the same range and discrimination we have is, I believe, beyond our current technology. Sensors can detect specific chemical signatures and inform us of their presence, usually long before we could on our own. But that’s not the same as what we experience, and so a sensor cannot transmit to us what we can taste and smell ourselves (at least not yet!).
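‘Detecting a chemical signature’ usually means something like the sketch below: an ‘electronic nose’ matching a vector of sensor-channel readings against known reference patterns. Every channel value and signature here is invented for illustration.

```python
# A minimal sketch of an 'electronic nose': matching a vector of gas-sensor
# channel readings against known chemical signatures by nearest neighbour.
# All channel values and signatures below are invented for illustration.
from typing import Optional
import numpy as np

# Hypothetical reference signatures: normalized per-channel responses of a
# four-channel metal-oxide sensor array to known substances.
SIGNATURES = {
    "ammonia": np.array([0.82, 0.10, 0.05, 0.31]),
    "ethanol": np.array([0.15, 0.77, 0.40, 0.08]),
    "methane": np.array([0.05, 0.12, 0.90, 0.22]),
}

def identify(reading: np.ndarray, tolerance: float = 0.3) -> Optional[str]:
    """Return the closest known signature, or None if nothing is close."""
    best_name, best_dist = None, float("inf")
    for name, signature in SIGNATURES.items():
        dist = float(np.linalg.norm(reading - signature))  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= tolerance else None

sample = np.array([0.80, 0.12, 0.07, 0.28])    # a simulated sniff
print(identify(sample) or "unknown odour")     # -> ammonia
```

Even when the match succeeds, all we get is a label; the sensor cannot hand us the experience of the smell itself, which is exactly the gap described above.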
So what is the bottom-line answer to the question? We can remote-sense some things well (sight and hearing), some in some fashion (touch), and the rest not so well (taste and smell). Can we give a machine access to these senses in a way that would let it function in its environment better than it might otherwise today? I suppose it all depends on the use case and the environment it is functioning in, but that is another line of inquiry (any takers?).
So in short, the answer is yes, no, and it depends.
Feel free to share if you know any technologies that can, indeed, enhance or replace the senses we have or that a machine might need in its environment!