Not every user sees workstations the same way.
That is the conclusion of an Intel-commissioned blind survey conducted by a third-party research firm to gain new workstation insights. The survey elicited more than 2,500 responses from end users, IT decision-makers (ITDMs), and managers at major organizations across the top workstation verticals. The research firm performed qualitative interviews with 34 major companies representing key workstation segments: architecture, engineering, and construction (AEC); manufacturing; media and entertainment (M&E); health care; oil and gas; and financial services. Quantitative responses were received from 1,482 ITDMs and 1,046 business decision-makers (BDMs) – 2,528 in total. An impressive sampling, indeed.
Geographically, the responses broke down into 1,222 from North America, 695 from Europe, and 611 from Asia – a well-distributed geographic sampling. Usage of the most frequently used applications across the key verticals and workloads was fairly evenly distributed – about 300 responses in each category – and twice that for manufacturing, resulting in a good distribution representing the industry as a whole.
Below we dig into key insights from the research.
What Is a Workstation?
The term “workstation” is commonly used in the industry, and most of the business decision-makers also use the term “high-performance computers,” which to them means a machine that can run multiple programs on multiple monitors.
The technical decision-makers tend to describe a workstation more in terms of solving their needs, pointing out that a workstation should have hyper-threading, multi-core, and an abundance of memory. Or in terms of specifics: Animators need dual-socket Intel Xeon processors operating at 3.5 GHz or higher, an Nvidia top-end add-in board (AIB), an SSD, and as much memory as you can afford – at least 128 GB.
The Value of Application Certification
One of the distinguishing features of a workstation is the assurance that it will run the most frequently used apps as they are designed, commonly referred to as “application certification.” During the development process, workstation suppliers work closely with independent software vendors (ISVs) to ensure that the application’s special features and functionality are fully supported. ISVs such as Autodesk, Bentley, Siemens, PTC, and others work with the AIB suppliers like AMD, Nvidia, and their partners.
ISV CERTIFICATION IS MORE CRITICAL TO TECHNICAL DECISION-MAKERS, AS THEY CONFIGURE AND REPAIR WORKSTATIONS. HOWEVER, CERTIFICATION IS MUCH LESS IMPORTANT IN CHINA THAN IN NORTH AMERICA.
They also work closely with CPU suppliers like Intel and AMD to fine-tune their software drivers for three operating systems (Windows, Linux, and macOS), to make sure special features and functions in the applications fully exploit all the acceleration capabilities in the hardware, which are (hopefully) exposed through the driver.
In addition to three operating systems, there are three (or four) application programming interface (API) standards that the workstation suppliers have to support: OpenGL being the most important and, in some cases, DirectX. Two new APIs, Vulkan and Metal, are also being added. Not all ISVs offer support for every combination, but nonetheless, that suggests there can be up to 12 combinations of OS and API that must be tested across at least a half-dozen AIBs. Additionally, there are a half-dozen CPUs, which expands the potential number of certifications to 432 possibilities. Obviously, they don’t test for every scenario; however, testing for a dozen or two is not unusual.
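As a back-of-the-envelope sketch (using the rough counts above, not any vendor’s actual test matrix), the combinatorics multiply quickly:

```python
# Rough size of the certification test matrix, using the figures cited
# above (illustrative only - real ISV/vendor matrices differ).
operating_systems = 3   # Windows, Linux, macOS
apis = 4                # OpenGL, DirectX, Vulkan, Metal
aibs = 6                # "at least a half-dozen" add-in boards
cpus = 6                # "a half-dozen" CPUs

os_api = operating_systems * apis   # 12 OS/API combinations
with_aibs = os_api * aibs           # 72 once AIBs are included
total = with_aibs * cpus            # 432 possible certifications
print(os_api, with_aibs, total)     # 12 72 432
```

Each cell in that matrix is a full build-and-regression pass, which is why vendors test only the dozen or two combinations their customers actually deploy.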
With all that effort to ensure the maximum, most reliable performance, the surveyed end users don’t fully appreciate or understand the certification linkage and importance between the processors (CPU and/or AIB), software driver, API, OS, and application. And yet, if any one of those components in that chain fails, the system stops, and it burns time and money in debugging and repairing it.
Avoiding that loss of time and money in production engineering design work is precisely why a workstation is built the way it is – rock-solid and “bulletproof,” as they say in the industry. So, given how important “fail-safe operation” is in a workstation, you’d think the users, especially at the level of those surveyed, would have a deeper understanding of – if not appreciation for – its value. Nevertheless, it doesn’t appear on the managers’ purchasing checklists as an evaluation criterion, and yet it ranks number one in the decision tree chart on page 36. However, the real value is to professional IT managers, who get a guarantee that if they buy hardware that is certified with the application they are using, it will just work!
The value is in the enterprise-level support that the certification process provides, ensuring not only a good release, but also the ability to support it for multiple years. Nvidia, for example, actually tests a majority of the combinations and maintains builds/regression testing across multiple OSs, versions of OSs, and multiple versions of the ISV app. Furthermore, they’ll test and support multiple ISV apps running together on the same system. This cross-testing helps ensure the best supported workflow and not just the best supported app.
The Value of ECC
The highly dense, high-speed random-access memory (RAM) used in today’s modern computers is a miracle of technology, but it’s not foolproof. Those microscopic memory cells can miss a signal, be confused by cosmic rays, be thrown off by temperature and/or voltage surges, and, some say, even misbehave if there’s a “bad moon out tonight.” All of these errors become more common as you increase the amount of memory in a system. Knowing the inherent fragility of RAM, circuit and system designers have developed schemes to catch, and sometimes correct, such failures. Approaches developed to deal with unwanted bit-flips include immunity-aware programming, RAM parity memory, and error-correcting code (ECC) memory.
ECC memory was introduced in the late ’70s and early ’80s. An ECC-capable memory controller can detect and correct errors of a single bit per 64-bit “word” (the unit of bus transfer), and detect – but not correct – errors of 2 bits per 64-bit word.
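The real SECDED (single-error-correct, double-error-detect) codes in ECC memory add 8 check bits to each 64-bit word; a toy Hamming(7,4) code in Python illustrates the same correct-one-bit principle at nibble scale (a sketch of the idea, not a memory controller’s actual logic):

```python
# Toy Hamming(7,4) code: the same single-error-correcting idea that ECC
# memory applies per 64-bit word, shown here on a 4-bit nibble.

def encode(d):                      # d = [d1, d2, d3, d4], each 0 or 1
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4               # parity over codeword positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4               # parity over positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4               # parity over positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]   # codeword positions 1..7

def decode(c):
    # Recompute each parity check; the failing checks spell out
    # (in binary) the position of the flipped bit.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]  # checks positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]  # checks positions 2, 3, 6, 7
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]  # checks positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s4
    c = list(c)
    if syndrome:                    # nonzero syndrome = bad position
        c[syndrome - 1] ^= 1        # flip it back
    return [c[2], c[4], c[5], c[6]] # recover d1..d4

word = [1, 0, 1, 1]
code = encode(word)
code[5] ^= 1                        # simulate a cosmic-ray bit-flip
assert decode(code) == word         # the single-bit error is corrected
```

ECC extends this scheme so that any one flipped bit in a 64-bit transfer is silently repaired, while a two-bit error is flagged rather than passed through as corrupt data.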
Technical decision-makers responding to the survey said they saw minor advantages to ECC overall but felt it was vital to CPU-heavy workloads.
Some respondents felt ECC was critical for AEC users performing rendering and senior designers/engineers in M&E, health care/biotech, and energy/oil and gas. Those who have used systems with and without ECC reported they consider the feature a critical factor for improved reliability and productivity. Others think it depends on the workload of the end user. When users move to an Intel Xeon processor-based workstation, they are assured they have ECC, which gives them the stability they demand for their systems.
ECC memory is one of the distinguishing features of a workstation and one of the elements that contributes to its nonstop reliable functionality. The survey indicated that efforts should be made to enable better understanding of the benefits of ECC and how it helps resolve a lot of the pain points users have identified.
Technical decision-makers and business decision-makers who responded to the survey are confused about which component contributes the most to great and/or poor performance. Keep in mind, they aren’t computer experts any more than they have to be to get their primary job done, so they can be excused if they don’t know where the bottlenecks are in a system – and those bottlenecks shift over time from memory, to software, to CPU, and so forth. This does, however, indicate that the sources of information the technical and business decision-makers are using (magazines, Web pages, newsletters, user groups, and conferences) are not delivering enough for them to make a more informed purchasing decision.
Based on the comments above about ECC and ISV certification, it seems clear that price is getting a major piece of mindshare in the decision process. That is almost a complete reversal from 10 and 20 years ago: In previous studies conducted by Jon Peddie Research (JPR) and others, price was never the primary decision factor; performance, reliability, vendor, and certification were always ahead of it. But with the expansion of the market to the entry level, price has crept up in importance. For the high-end users, though, it is still low on the list.
Running Multiple Workloads
The technical and business decision-makers who focused on the workstation insofar as it would be configured and certified to run their most critical workloads indicated that sufficient memory, followed by a high-end CPU and AIB, are top workstation purchase priorities. The number one item identified in the survey was a high-end, multi-core CPU for multi-tasking. Some respondents reported having six to seven programs running at once, powering multiple workloads to boot.
For a group that wasn’t certain about what was the most critical part of the workstation, the technical decision-makers showed a surprising interest in overclocking and running multiple monitors – especially since overclocking can run counter to reliability, the hallmark of a workstation. The business decision-makers’ priorities were multiple screens, sufficient RAM, and certification for their specific software programs. The managers from AEC firms had a higher-than-average priority for multiple monitors and RAM, while the priorities from manufacturing decision-makers included overclocking and 3D capabilities. Meanwhile, the managers in the US and China placed a higher priority on CPUs and overclocking than the overall average.
DECISION TREE FOR TECHNICAL AND BUSINESS DECISION-MAKERS WHO RECOMMEND, APPROVE, OR ACTUALLY PURCHASE WORKSTATIONS.
At JPR, we think overclocking may be a false positive and doubt if overclocking is really understood. The most that a CPU or GPU can be overclocked is about 3 percent to 5 percent, but the impact on reliability due to heating is probably 25 percent or higher. Bottom line, the ROI on overclocking is terrible.
CPU and AIB: Function of the App
It’s probably not a surprise to learn that the respondents said the application determines how a processor impacts productivity – that the value of the CPU or AIB in a workstation depends greatly on the workloads and industry.
After I/O, memory and AIB rank high with AEC firms due to the large graphics files that are created and constantly updated. The newest generation of high-end AIBs contains up to 24 GB of high-speed local RAM (GDDR5). The main system can house up to 2 TB of ECC RAM (DDR4). As astounding as those numbers sound, they aren’t there for show; high-end users need all the local storage they can get because the 3D models are getting larger every day. The dream of all designers is to have the entire model in RAM, so they can move through it as fast as possible.
The CPU is ranked at or near the top of the components for energy/oil and gas, health care/biotech, financial services, and M&E due to complex computations, rendering, and creating 3D digital files.
Geographically, China ranks the CPU as the most important component. In North America and Europe, the I/O and memory are most critical.
Workstations have a lot of demands on them, and they meet those demands, which is why they are so popular. A workstation has to have a crazy amount of high-speed I/O, inside and out. Outside there are features such as DisplayPort, external SATA, and USB-C. Inside, though, it has to have dozens of PCIe lanes to support graphics AIBs (a high-end workstation can be equipped with up to four AIBs), high-speed SSDs, Intel’s new high-speed Optane memory, specialized communications, and special I/O subsystems such as high-speed cameras. I/O is, and always has been, a moving target. The demand for more and faster I/O is one of the things that motivates a user to buy a new workstation.
The respondents to the survey indicated they were most interested in 4K and augmented reality for current workloads, and were testing virtual reality. M&E, manufacturing, and construction expressed the most interest in VR, followed by energy/oil and gas and architecture firms, which stated they are testing VR (from the qualitative interviews).
Technical decision-makers indicated more interest in VR than average, while business decision-makers expressed more interest in cloud rendering/processing.
Likewise, M&E, manufacturing, and construction firms stated they had more interest in VR than average, while AEC firms were more interested in cloud rendering/processing than average.
China and North America showed more interest in VR than average, while the US and China have a higher interest in 4K.
One of the misunderstandings the general press and others make is when they see the terms “VR” and “workstation” together; they think of a user wearing a head-mounted display (HMD). However, the major role for a workstation in VR is content creation as opposed to content enjoyment.
However, VR can be a partial supplement for a CAVE or can augment one. A CAVE, or Cave Automatic Virtual Environment, is a virtual-reality system that uses projectors to display images on three or four walls and the floor.
By creating a VR walk-through of the proposed facility in the very early stages, manufacturers can engage with equipment suppliers and vendors, which allows them to better plan how operations will be conducted in the facility.
Workloads in the Cloud
According to the survey, there is a strong progression to the cloud. The respondents indicated they are actively moving both storage and computation to the cloud in the near term. Less-regulated firms in the US and UK will see the largest jump to the cloud, while firms already in the cloud have not seen a drop-off in workstation CPU needs.
THE FUTURE INTERESTS OF WORKSTATION USERS.
AEC, M&E, and manufacturing see more movement to the cloud in both areas than average, while finance and health care/biotech are more resistant to moving to the cloud. China expects computation to stay more local, while the US and UK are more open to the cloud.
The survey results correlated well with our findings for JPR’s CAD in the Cloud study (see “CAD in the Cloud,” CGW, July/August 2017).
There really isn’t one “CAD market”; there are several CAD markets. CAD is such a universal tool that it is used in dozens of other markets, and CAD usage in one field can look quite different from CAD usage in another. However, a couple of segments dominate the use of CAD: AEC and manufacturing. Those two segments compose about 70 percent of the market, and for the purposes of analyzing the data, the rest is categorized as “other.”
Similar levels of current and planned implementation are seen across the main industry sectors. There are higher levels of ongoing evaluation in manufacturing and AEC than “other” sectors, of which almost half have not investigated CAD-in-the-cloud solutions at all.
CAD is not the only engineering application to move to the cloud: Finite-element analysis (FEA), computational fluid dynamics (CFD), and subterranean geophysical exploration modeling are some other applications that need the distribution and storage capability of the cloud to allow secure collaboration worldwide. It’s a constant trade-off among three arrangements: local processing and storage; cloud storage with local processing; and cloud storage and processing. Even within a company, on a given project, all three arrangements will be employed. There is no single answer (one size does not fit all), and it’s the flexibility that remote computing and storage offer that has helped propel productivity gains in the face of ever-larger datasets.
Product Introductions and Buy Cycles
Generally speaking, the workstation suppliers are introducing new products every two years on average to keep up with expanding workloads and software upgrades. Survey respondents reported that they try to look at the workload software upgrade specifications a year before they buy and plan their refresh around those requirements.
Large companies are moving toward two-year leases to automatically stock the best workstations. For AEC firms with users not involved in rendering, workstations are replaced every three to four years. Manufacturing, M&E, and energy/oil and gas are refreshing faster than the average.
In the past, large organizations would use a purchasing agent or IT manager to choose which workstation would be given to the company’s engineers. These buyers’ motivations were different from engineering’s: IT was looking for stability and commonality for ease of maintenance and support, while engineering was looking for maximum performance. Typically, the engineers needing maximum performance were the minority and didn’t have a voice in the decision process. Today, that’s totally reversed because large and small organizations have learned that with the demands of time to market, product differentiation, traceability, and quality control, it’s the engineers who need to be driving the selection of which type of workstation they use.
Germany and France’s refresh cycles are longer than the average. The US and UK have shorter refresh cycles than the average.
Refresh cycles also are different for every organization, usually driven by budget cycles, and many are out of sync with the realities of the market. For years, accountants and financial planning departments didn’t factor in the upgrade schedules of ISVs and hardware suppliers, leaving their engineers with outdated workstations and applications. In the past few years, though, planning departments have learned to include a fudge factor in their budgets to allow for surprises – the ISVs don’t always have predictable or reliable update schedules.
A general rule of thumb has been to plan for a refresh of hardware and an update of software every two to three years; any longer than that and you find yourself behind the curve compared to competitors, and over time, it only gets worse. A workstation is a tool, and if you don’t have the right tools, you can’t do your job – it’s that simple (and your job is to stay on time in a project and at least even with, if not ahead of, the competition).
The survey shows that end users are very involved in the workstation purchasing process. The respondents indicated that while IT can make workstation decisions, they almost never make those choices independently. The department manager and end users tell IT what software is being used and how many hours a day it will be under stress. Once IT figures out the workload and looks at the budget, they then buy the workstation.
The technical and business decision-makers more often look to outside consultants/VARs to help with selecting workstations, and less so to their IT department, if they even have one.
North American firms are more inclined to use VARs, while China is significantly less inclined, according to the survey results. The US and UK give more autonomy to the line-of-business buyer, whereas China relies more on IT.
The Whole Picture
The key takeaway from this survey information is that workstations have a strong hold on power users and those who need uncompromised uptime and performance. The new multi-core processors now being placed in workstations – and, in some cases, two of them in a workstation – are mind-bogglingly powerful, and yet users still want more FLOPS, more memory, and more display resolution.
THE WORKLOAD, USER INPUT, AND LINE-OF-BUSINESS DEPARTMENT MANAGERS ARE KEY IN THE PURCHASE PROCESS.
The workstation users who responded to the Intel survey quantified their opinions with regard to several criteria concerning a workstation and its procurement, which is summarized in the table on this page.
Health/biotech/science, energy/oil and gas, and M&E are segments where users expressed the highest CPU needs and faster refreshes, and are current users of the Xeon processor. Manufacturing firms are attracted to fast refreshes (in part due to leases) and the need for ECC and multi-threading, while AEC firms give the CPU the lowest priority, with lesser CPU needs and slower refreshes.
Geographically, China ranks the CPU as the most important component, while in North America and Europe, I/O and memory are most critical.
In my book, “The History of Visual Magic in Computers,” I trace the introduction of the workstation to the IBM 1620, a small scientific computer designed to be used interactively by a single person sitting at the console. Introduced in 1959, it was the first integrated workstation – just not a graphics workstation.
Since then, workstations have become 10,000 times more powerful, 1,000 times smaller, and 1,000 times less expensive. Today, you can get a very powerful laptop workstation weighing less than four pounds for less than $2,000.
And the hunger for workstations continues. The market has grown from 50 units a year to over four million units a year, and even with a declining average selling price due to Moore’s law, the market has shown steady and robust growth in value.
COMPARISON OF APP NEEDS FOR VARIOUS WORKSTATION COMPONENTS.
All the things we enjoy today – air travel, fantastic movies and games, giant skyscrapers, clever consumer products, and even our clothes – are or have been designed on a workstation. To say we couldn’t live without workstations would be an understatement. But workstations are workhorses and not very sexy, so they don’t get headlines, tweets, or much Facebook time. If your car represented life, then workstations would be the pistons: You know they’re there, they do their job, but you don’t think about or talk much about them.
Today’s workstation ranges from devices as small as a couple packs of cigarettes to big boxes, and everything in between, including laptops.
The survey captured some of the ideas users have about workstations, and some of their attitudes with regard to buying one (or a hundred). And if it proved one thing, it is that opinions and needs vary geographically, by applications and industry, and, of course, budget. After all, there isn’t a workstation market, there are dozens of workstation markets.
Jon Peddie (jon@jonpeddie) is president of Jon Peddie Research, a Tiburon, CA-based consultancy specializing in graphics and multimedia that also publishes JPR’s “TechWatch.” In addition to following and reporting on workstations for the past 35-plus years, he is also the author of the recent book “Augmented Reality: Where We Will All Live.”