Why Applications End Up on Specific Compute Platforms

For those of us on the hardware side of the technology industry, one of the greatest challenges is getting application developers to support new computing platforms. This can range from supporting different CPUs (x86, ARM, etc.), to different operating systems (particularly across different CPUs, but also different Linux distributions or kernel versions, or different versions of Microsoft Windows), to specific hardware configurations. In some cases, these differences are driven by specific hardware needs of the application (graphics capabilities, memory requirements, or a certain number of cores); in other cases, it is simply economics (does the platform have a large enough market to make the effort worthwhile?). The worst case (and this is often true of “reference architectures”) is when the choice seems to be made solely to drive sales of a specific vendor’s hardware platform. But all of this raises the question of why a given application ends up on specific compute platforms in the first place.

In many cases, applications were originally developed on or for a specific platform that was popular at the time. This is certainly true of the x86 processor architecture, which 10-15 years ago was essentially the only processor architecture with widespread market adoption. Most of the third-party enterprise business applications that are popular today were developed on this platform. Similarly, most of these applications were developed on the Linux, Windows, or (in a few cases) Mac operating systems. The only other development platforms that existed at that time were mainframes (a small market) and ARM or MIPS processors for embedded environments (industrial control systems, military hardware, etc.). Even by the late 2000s, many embedded designs had moved to x86 architectures (often running a real-time version of Linux) simply because there were more developers available to code for these platforms.

Things are certainly different today in several respects. If you had to pick the most popular processor today, it would probably be the ARM processor, which has near-exclusive market share in the smartphone business and which has penetrated both the tablet and web server markets. Many (if not the majority) of these devices run either a version of Google Android or Linux. Similarly, the number of developers for these platforms has increased significantly; while not as large as the x86 development community, it is certainly not insignificant. Given this set of circumstances, plus the fact that ARM processors generally consume less power than Intel processors (though this does vary by workload), one wonders why more enterprise business applications do not support ARM processors, or, to look at it another way, why more commercial business applications haven't been ported to the ARM processor. We will explore that topic in our next blog post; the small sketch below hints at one reason porting is rarely free.
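As a minimal illustration, here is how C code commonly guards architecture-specific paths using the macros that GCC, Clang, and MSVC predefine per target (__x86_64__, __aarch64__, _M_X64); the capability notes in the strings are a simplification, not a complete list.

```c
#include <stdio.h>

int main(void) {
    /* The compiler predefines one of these macros per target architecture,
       so each build bakes in a different code path. */
#if defined(__x86_64__) || defined(_M_X64)
    puts("x86-64 build: SSE2 is guaranteed; strong (TSO) memory ordering.");
#elif defined(__aarch64__)
    puts("64-bit ARM build: NEON is guaranteed; weaker memory ordering.");
#else
    puts("Other architecture: generic fallback path.");
#endif
    return 0;
}
```

Multiply that pattern across hand-tuned SIMD, inline assembly, and lock-free code that quietly assumes x86's ordering rules, and "just recompile for ARM" stops being the whole story.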
