All electronic designs have a time dependence that is hard to escape, regardless of the analog or digital nature of the design. Parts get discontinued, techniques and markets change, and costs eventually become too burdensome even where construction remains possible. Each design has an optimal period where it fits perfectly into the balancing act of materials, techniques, costs and customer desires, and then fades from the market as these factors shift out of attractive alignment.
Oddly, digital designs, especially those with a software core, suffer this effect on a highly accelerated time scale and with much higher risk. This is the very unhappy merging of technical and market forces that exert strong negative pressure on designs intended for a long lifespan, especially in critical areas such as avionics, energy infrastructure and public utilities. Here, the desired useful life is rarely under 20 years, and, based on previous generations of technology, such systems carry the implicit expectation of running for 40-50 years. Sadly for everyone, this kind of lifespan is simply no longer possible with current digital technology.
Moore’s Law and How Time Compression Started
Almost everyone recalls Moore’s Law, the observation Gordon Moore (co-founder of Fairchild Semiconductor and Intel) made in his 1965 paper that component/transistor density was doubling regularly and would continue to do so; in 1975 he revised the rate to a doubling every two years. Some use the faster figure of 18 months, because another Intel executive (David House) projected that chip performance would double in that period due to the combined effect of improving both speed and density. Regardless of which yardstick you use, this general observation has proven true, with sometimes serious consequences in the marketplace.
The 4004, Where It All Began
This rush to complexity all started with the groundbreaking Intel 4004 microprocessor in 1971 (running at a whopping 0.74 MHz, often with no outboard RAM). It had 2,300 transistors. For comparison, the Intel Xeon E5 processor has 2.27 billion, and my workstation has dual processors and 16 GB of RAM. It’s clear that Moore’s Law is working pretty much as described, but it has some serious fallout that is not so obvious. For example, the 2.4 GHz 16-core processor array in my Lenovo workstation is already obsolete and discontinued. In essence, I am working on a very advanced but inarguable dinosaur.
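As a rough sanity check on Moore’s Law, the growth from the 4004’s 2,300 transistors to the Xeon E5’s 2.27 billion can be converted into an implied doubling period. The short Python sketch below does that arithmetic; the 2011 introduction year for that Xeon E5 generation is an assumption for illustration, not a figure from the article.

```python
import math

# Transistor counts cited in the article
t_4004 = 2_300          # Intel 4004, introduced 1971
t_xeon = 2_270_000_000  # Intel Xeon E5 (assumed ~2011 introduction)

years = 2011 - 1971     # elapsed time between the two parts (assumption)

# How many doublings does it take to get from 2,300 to 2.27 billion?
doublings = math.log2(t_xeon / t_4004)

# Implied doubling period in years
period = years / doublings

print(f"doublings needed:  {doublings:.1f}")   # about 20 doublings
print(f"doubling period:   {period:.2f} years") # close to Moore's 2-year figure
```

The result lands almost exactly on the two-year doubling period, which is why the 4004-to-Xeon comparison is such a clean illustration of the law in action.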
The Other Shoe Drops…
The rapid density/performance growth curve of digital parts tends to make higher performance progressively less expensive, but it also inevitably leaves behind lesser parts with a very short market life cycle.
Let’s face it, there’s not much demand for 4K DRAMs or 80386 microprocessors any longer, not to mention older RTL, DTL or even TTL simple logic circuits. Even the many 8080-family parts for embedded systems have largely vanished. In addition, the system-level hardware that grew up alongside these solid-state devices, like the CGA or VGA monitors, floppy drives and cartridge tape backup that used to be such serious digital “kung fu,” is also just a memory. In fact, these days it’s hard to find a new computer of any type with a floppy drive. Many interfaces, like serial ports, VGA video ports and parallel ports, have vanished as well. For the most part, this may be viewed as progress, but as we will see, it has a hidden and troublesome problem.
Digital design implemented at the discrete gate level is not so volatile and has no intrinsic hidden code component, but this robust technique has been seriously displaced by programmable logic in many forms, such as the field-programmable gate array (FPGA), where a logic definition is used to configure a more generic multi-purpose device. These highly proprietary devices soon fell prey to the same complexity race in a very competitive marketplace, and many designs and architectures were abandoned in short order, again making for wickedly short product lifespans and a confusing product and programming landscape. Thankfully, one can still buy individual HCMOS gates and other similar parts, but that 74HC00 quad NAND gate can now cost more than a low-end microprocessor and probably exists only as a surface-mount device.
The Software Landscape
Interestingly, there is no software corollary to Moore’s Law, and software development has not kept pace with hardware particularly well. In fact, rather than creating well-established, validated and proven foundation languages and tools, we have instead seen a proliferation of similar but frustratingly incompatible tools with endless (usually code-incompatible) revisions, and many have been quickly discarded along the way, leaving support for systems built on them in serious jeopardy.
In addition to programming-language churn (usually driven by the desire to generate new software sales), we have seen the disappearance of many standard operating systems and the loss of the tool platforms used for software development and device programming. Even the hardware interfaces to device programmers (serial and parallel ports) have disappeared. As a final unfortunate development, we have also seen the erosion of instruction in software programming, due both to the perilously short lifespans of systems and languages and to the continuous demand for programmers. Since all programming rests on a platform of underlying host and target machines, which continue to evolve rapidly, deep and fundamental knowledge of them is very hard to come by today.
The real skill level of this programming workforce is deteriorating not so much because of personal shortcomings, but because the industry is becoming progressively more fractured and incoherent as everything from hardware to operating systems to programming languages becomes transient and continues to time-expire. In addition, a high-quality programmer (trained in a four-year program or longer, with some valuable work experience) will also have learned quality control, human interfacing, logic, task analysis, psychology and a host of fine-arts topics that make for good and thoughtful design of attractive systems.
This well-integrated approach is now largely abandoned in favor of quick code competence in a narrow field, without the essential philosophical structure that leads to high-quality, useful work and a strong sense of responsibility for known good results. Virtually everyone’s software and application knowledge has become shallow and narrow as a result. This is not a criticism, merely an observation of how market forces are creating trends. The concept of quality has taken a serious beating in the process. Everyone figures they will “patch it in the field,” a genuinely toxic plan for progress. And something you really don’t want to see in flight.
I think it is difficult to find anyone who can really claim to understand the workings of Windows from the graphical user interface all the way down to the hardware abstraction layer, and the inner workings of all the system dynamic link libraries and hardware drivers. Or, for that matter, the complete functionality of any programming language or operating system and the differences between every version, compiler and related libraries. Once you stack applications in various languages with their support files on top of the operating system with all their code dependencies, the certainty of known good operation in all conditions soon vanishes, making validation and flight-worthy software very difficult.
These factors make the required support and error correction of fielded systems harder with each passing year, and can make continued production impossible as key items fall out of the supply chain and products rapidly become obsolete.
The ARM Concept
It is only recently, with the advent of ARM, that microprocessor intellectual property (IP) has really been separated from microprocessor hardware. The concept is to provide optimized processor technology that all chip makers can license, with compatible code structures and upward migration paths. ARM provides the tested IP; the vendors provide differentiated silicon with the features they think are best. This helps offset the nightmare of orphaned and incompatible parts from vendors with no second source, and it is finally giving some longevity and “teachability” to the key concepts of digital design using microprocessors. The advent of a huge, common, interoperable technology pool with wide access like ARM’s is one of the most significant advances in digital design, and of huge benefit to all involved.
Those are all issues leading up to code creation and viable hardware. There is another aspect that can be difficult, namely when the code itself becomes volatile and short-lived, and suddenly the process gets worse in a hurry.
In the next issue of Avionics International, we will look at the transition to hardware, and all of its aspects. AVS