Whatever Happened to Analog in this Digital Age?
Ed Brown
I came across an article from the University of Wisconsin that got me thinking about the analog world vs. the digital world. It described how some researchers discovered that they could fool a widely used digital automatic speaker identification system simply by speaking through a PVC tube. One explanation that caught my attention was: "…because the sound is analog, it bypasses the voice authentication system’s digital attack filters."
It got me thinking that there is a fundamental difference between the analog world and the digital world: the analog world is the world as it actually exists, and the digital world is a manipulated approximation. For example, when you run an analog signal through an analog-to-digital converter (ADC), you take information that is an analog of some real-world condition, such as temperature, and break that continuous signal into digital pulses. The ADC takes the amplitude of the analog signal at a particular moment in time and represents it as a series of binary bits. So, the ADC output is distanced from the real world in three ways. First, there is the question of how well the analog signal represents the physical phenomenon it's measuring; it is never perfect. Second, the ADC can only sample the analog signal at discrete intervals of time. Third, each sample has to be represented by digital bits whose precision is limited by the converter's resolution, the number of bits available.
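That triple distancing can be sketched in a few lines of code. The following is a minimal illustration, not any real ADC's interface: a hypothetical function that samples a continuous signal at discrete instants and quantizes each sample to a fixed number of bits, so both the sampling loss and the quantization loss are visible.

```python
import math

def adc_sample(signal, duration_s, sample_rate_hz, bits, v_min=-1.0, v_max=1.0):
    """Sample a continuous-time signal at discrete intervals and
    quantize each sample to one of 2**bits levels (an idealized ADC)."""
    levels = 2 ** bits
    step = (v_max - v_min) / (levels - 1)      # volts per quantization level
    codes = []
    n_samples = int(duration_s * sample_rate_hz)
    for n in range(n_samples):
        t = n / sample_rate_hz                 # discrete sampling instant
        v = min(max(signal(t), v_min), v_max)  # clip to the input range
        codes.append(round((v - v_min) / step))  # nearest quantization level
    return codes, step

# A 50 Hz sine wave standing in for some real-world analog quantity
sine = lambda t: math.sin(2 * math.pi * 50 * t)

codes, step = adc_sample(sine, duration_s=0.02, sample_rate_hz=1000, bits=8)
# Reconstructing a value from its code leaves an error of up to half a
# step: the quantization error that digital processing cannot remove.
```

Between the samples, and between the quantization levels, the original signal's detail is simply gone; more bits and faster sampling shrink the gap but never close it.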
So, why is it so important to measure the physical world in the first place? As I mentioned in my previous blog, without sensing the physical world there would be no Internet of Things (IoT) — there would be no automation, connected factories, autonomous vehicles, remote security systems, intelligent building systems, or sophisticated medical monitoring.
Signals have to be digitized before they can enter the connected realm of the IoT; there is no question about that. There are, however, ongoing discussions about how and where to do that digitizing. A growing trend is to do the processing at the edge. Edge processing should really be called preprocessing: the idea is to cut down the amount of data sent to the cloud, or to a central processor, by doing some of the analytics at or close to the sensor. That reduces the bandwidth needed to transmit the large volume of data produced by all of the sensors in a system, and it also reduces the computational burden on the cloud.
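As a rough illustration of the idea (the statistics chosen here are my own example, not any particular product's protocol), an edge node might reduce each window of raw sensor samples to a handful of summary values and transmit only those:

```python
import math

def edge_summary(samples):
    """Reduce a window of raw sensor samples to a few statistics,
    so only the summary, not the raw stream, goes to the cloud."""
    n = len(samples)
    mean = sum(samples) / n
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    return {"mean": mean, "peak": peak, "rms": rms, "count": n}

# One second of readings at 1 kHz: 1,000 raw values shrink
# to a four-field message per window.
window = [math.sin(2 * math.pi * 50 * n / 1000) for n in range(1000)]
payload = edge_summary(window)
```

Sending the four-field payload instead of a thousand raw samples is the bandwidth saving in miniature; the trade-off is that the cloud can no longer recover detail the edge chose to discard.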
After decades of work as an EE, SAE Media Group’s Ed Brown is well into his second career: Tech Editor.
“I realized, looking back to my engineering days and watching all of the latest and greatest as an editor, I have a lot of thoughts about what’s happening now in light of my engineering experiences, and I’d like to share some of them now.”
Manufacturers are accomplishing that by embedding AI in the sensor. That can be done by digitizing the analog input and running inference on the chip. The tinyML Foundation is a community of developers committed to this approach.
In a recent Sensor Technology article, Tom Doyle, CEO of Aspinity, described a different approach to the analog/digital division of labor. His company has developed an analog machine learning chip that can do some of the preprocessing at much lower power levels than digital-only AI.
Lately there has been a movement to bring back vinyl records because some listeners think the sound is warmer and more natural. There is no way to prove that; it is a purely subjective judgment. Generally, digital recordings can be more technically accurate, because analog recordings rely on mechanical methods: the sound from a vinyl record is initiated by a stylus moving in response to the grooves, with electronics introduced only to process and amplify the signal. But some listeners might prefer a bit less perfection, and you can't argue with that, since it is entirely a matter of taste. And isn't the purpose of a music recording to impart pleasure to the listener?
There is no disputing the fact that we are living in a digital age. Its effects are felt in every aspect of our lives, from the computer I'm writing on, to the smartphone I'll be using to FaceTime with my out-of-town children, to Google, which is my constant companion as an editor. But I think it is important not to lose sight of the fact that the digital world has its limitations.
Which gets me back to the story I started with. When you’re designing digital systems that interact with the real world it’s important to consider the analog world as well.