
Cook-Hauptman Associates, Inc.




  FUTURE TRENDS IN HARDWARE AND OPEN SYSTEMS


A Technical Overview in a Business Context


By: James E. Cook (1985)


 

  Abstract

This paper addresses, for the 1988 time frame, trends in hardware (i.e., workstations, local area networks, and computational servers) and open systems. An engineering scenario and analysis are presented to expose fundamental shifts in business values and orientations and to portray the ends to which the trends are directed. Each topic is discussed in terms of its driving force, contributing technologies, most critical choice, best selection, and trends.

 

  Introduction

Those responsible for computer aided design (CAD) in the late eighties are faced with difficult decisions on many fronts: technical, managerial, and organizational. The biggest challenges to realizing the promises of CAD/CAM are managerial, but even so, the challenges in the technical and organizational arenas are substantial. Furthermore, the classical decision-making approach of dividing problems into separate parts is becoming inappropriate as engineering decisions become more dependent on overall business issues. Therefore, the first section of this paper, Environment, puts the CAD decisions in a larger business context. The remainder of the paper presents CAD hardware and open system technical trends within this business context.

The engineering scenario has the engineer orchestrating workstation, network, and server in symbiotic pursuit of better, quicker designs. More important, though, are the non-technical influences on the conduct of engineering. For example, a major shift in values is said to be occurring. Specifically, the time-honored concerns for productivity, cost reduction, and production are being replaced by concerns for timeliness, margin enhancement, and design. The trends which will be sustained are those which contribute to these new business values and orientations.

The actual hardware trends are divided into workstations, local area networks, and computational servers. The trends in each present us with a fundamental choice. In workstations, we must choose between personal computers and engineering workstations. With local area networks, in 1988 we'll be choosing between fiber optic cable and twisted pair. Ironically, the choice in the case of the most difficult product technologically, computational servers, is the least technical; it is whether to buy from an established company or a start-up.

The open systems trends revolve around standards. The most strategic selection to make is whether to adopt IGES (or SET) as a neutral data base or to postpone such a commitment until a better standard emerges. Another strategic open systems selection is whether or not to insist that all device inputs and outputs (including spooled files and archived data output files) conform to GKS (or Core or PHIGS). The other important open systems issue is whether or not to adopt UNIX as a standard operating system. Language standardization, on the other hand, is not very material to open systems as long as the respective language standards and bindings (to pertinent systems, such as GKS) are adhered to.

 

  Environment

The engineer of the late 1980s begins the day by, say, reading the electronic mail and perusing his or her project's electronic bulletin board. Although the engineer has a powerful graphics workstation, that much computing power isn't needed for these initial tasks; still, they are performed using the same user interface and the same familiar collection of software toolkit resources as heftier tasks.

The electronic mail messages include a notification of a successful Finite Element Analysis (FEA) run from the previous evening, a notification of a footprint mismatch, and an invitation to a social gathering. The project bulletin board advises those who might be affected by yesterday's customer request for a specification change to attend an eleven o'clock meeting to figure out a response.

A close look at the FEA results is reassuring. A few annotations attached to the critical portion of the model's geometry help ensure that a particularly effective design intent remains intact through any subsequent edits. Next, a few incidental edits are made to the model to accommodate the footprint mismatch, followed by an electronic mail message which apprises the cognizant engineer. So much for the final stages of that part.

Now, the engineer begins the design of a new part. First a textual description is entered, from which keywords are extracted and used as the basis for retrieving existing parts which might satisfy the design requirement. The visual presentations of the most likely possibilities show that no existing part will do. One of the parts, the engineer decides, is worth using as a starting point. A few modifications of its dimensions and the introduction of a necessary twist complete the design, which previously would have taken over a week, in under an hour. However, there is a warning that the manufacturability of this part has been rendered unknown and should be determined before investing heavily in this design. A highly interactive "tradeoff session" is required to resolve the manufacturability issues, so the engineer leaves the office to get some guidelines from the project leader before proceeding. The engineer expects to resolve the manufacturability questions that same afternoon and paces himself or herself so that an FEA run will be ready for that evening. And so it goes...

Underlying Non-technical Influences

On the surface, this engineer's day may look like an orderly, straightforward evolution from today's hodgepodge of memos, meetings, scheduling, consultations, and computer hassles. In fact, the apparent ease with which these designs progress results from a large investment in new means and methods for the conduct of engineering. These new means are carefully selected and integrated systems which complement a new set of managerial values and organizational orientations.

The time-honored managerial values of productivity, discipline, specialization, cost reduction, failure minimization, and return on investment are becoming inappropriate to sustaining a competitive advantage, and so they are being replaced by a new set of values. The new values I see emerging are timeliness, creativity, integration, margin enhancement, success maximization, and resource utilization (see Table 1). The old organizational orientations toward centralized control, functional departments, organizational hierarchy, policies, production, and industrial resources I see as yielding to project autonomy, project teams, personal networks, culture, design, and information resources (see Table 2).


MANAGERIAL PARADIGM SHIFT
OLD VALUES NEW VALUES
Productivity -----> Timeliness
Discipline -----> Creativity
Specialization -----> Integration
Cost Reduction -----> Margin Enhancement
Failure Minimization -----> Success Maximization
Return on Investment -----> Resource Utilization

Table 1


ORGANIZATIONAL PARADIGM SHIFT
OLD ORIENTATIONS NEW ORIENTATIONS
Centralized Control -----> Project Autonomy
Functional Departments -----> Project Teams
Organizational Hierarchies -----> Personal Networks
Policies -----> Culture
Production -----> Design
Industrial Resources -----> Information Resources

Table 2


CAD hardware and open systems trends that will be sustained are those which complement these paradigm shifts. As an example, the trend towards workstations is a significant facilitator of timeliness and creativity in the pursuit of high margin designs. As a counterexample, the conspicuous trend away from "islands of automation" can be attributed not only to their inherent lack of integrability, but also to the lack of timeliness caused by manually iterating around the simulate-design-analyze loop. Furthermore, as engineering design activities blend into an engineering design process, the flexibility of components becomes more important than the performance of components.

 

  Workstations

Driving Force

The driving force in workstations is to give individual engineers sufficient computing power and memory capacity to allow them to create, simulate, and analyze their designs interactively for all but their most comprehensive analyses. The computing power challenge is to provide high powered vector processing, image processing, and general purpose processing. The memory challenge is simply to have sufficient memory for 90 to 95% of all tasks at affordable prices. This challenge translates into 1 to 10 million bytes (megabytes) of resident semiconductor memory and 20 to 100 megabytes of rotating magnetic memory for under $10,000.

Contributing Technologies

The technologies contributing to workstation trends are: CMOS semiconductors, gate arrays, surface mounted chips, and magnetics. CMOS semiconductors will reach 1-2 million circuits per chip, resulting in 1 megabit memory chips and 5 million instructions per second (Mips) processors running at 24 million hertz (megahertz). Gate arrays will be the major technology for implementing high speed customized logic. Surface mounted chips will effectively double board capacity and allow all workstations to be desktop consoles. Magnetic memories (not optical nor vertical magnetic recording) will be preferred for rotating memory and will have twice the capacity at today's prices.

Workstation versus Personal Computer

Engineers will have two fairly distinct choices for doing their CAD work: either an engineering workstation or a personal computer. Specific prices and specifications are highly speculative, but are offered in the spirit of trying to be helpful.

The engineering workstation will be a $20,000 compact desktop machine whose specifications might be:

19 Inch (48 centimeter) color CRT display
8 Megabytes of resident semiconductor memory
100 Megabytes of rotating hard disk memory
1-2 Megabytes of anti-aliased pixel color screen memory
24 Megahertz VME bus with a 32 bit data path
5 Mips processor and a floating point accelerator
1 Vector processor and image accelerator
1 Lisp accelerator (for "AI") - industrial grade
1 Standards accelerator board
1 Input/Output board
1 Unused expansion slot (8 total)
PLUS
1 Quality color ink jet printer
1 5 1/4 inch (13 centimeter) floppy disk drive

Alternatively, there will be a $5,000 general purpose personal computer whose specifications might be:

14 Inch (35 centimeter) color CRT display
1-2 Megabytes of resident semiconductor memory
30 Megabytes of rotating hard disk memory
0.75 Megabytes of anti-aliased pixel color screen memory
12 Megahertz Multi-bus with a 32 bit data path
2 Mips processor and a floating point accelerator
1 Lisp accelerator (for "AI") - industrial grade
1 Input/Output board
3 Small, unused expansion slots (8 total)
PLUS
1 Quality black and white ink jet printer
1 5 1/4 inch (13 centimeter) floppy disk drive

Selection

The choice should be determined by the nature of the usage. Engineers whose primary usage is for management, conceptual design, perusing designs, or elementary designing should select a personal computer, especially since it will outperform many of today's workstations, and will cost a tenth as much. However, engineers whose primary usage is in medium to large projects and who regularly do detailed design or analysis should select an engineering workstation.

Businesses which take a superficial view of return on investment and provide only personal computers to engineers who, by the above, qualify for engineering workstations will usually suffer negative consequences much larger than their savings. For example, skimping on engineers' CAD tools results in designs which, in some instances, may not be competitive because the personal computer's slow response adversely affects the engineer's creativity or causes results to be late.

Trends

Consequently, the trends for engineering workstations are:

Larger, faster, cheaper processors and memories
More specialized acceleration boards
More parallelism and concurrency
Increasing sophistication of image processing
Continuation of algorithmic advances
More analysis and simulation done on the workstation
More often packaged as desktop consoles


 

  LANs

Driving Force

The driving force in local area network (LAN) communications is to have responsive, reliable communication of information (almost exclusively data through 1988, then voice and video sometime soon thereafter) to and from engineers' workstations (and personal computers). The primary challenges are speed and reliability, followed by a long list of ancillary considerations. The speed challenge is to keep every user on the network (perhaps somewhere between 100 and 1,000 users) from ever experiencing any noticeable degradation of responsiveness, even during peak usage. The reliability challenge for voice and video is nominal (a fairly large number of errors is tolerable); for data, however, the challenge is to achieve error free transmission and, on the rare occasions of an error, to notify both sender and receiver. Finally, there is a large number of ancillary considerations: distance, interference, security, safety, ground currents, installation, splicing, corrosion, etc. These ancillary considerations relate, generally, to the issue of fiber optics versus coaxial cable or telephone wire.
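The data reliability requirement can be made concrete with a small sketch. In the C fragment below, the sender appends a checksum to each packet and the receiver recomputes it, acknowledging intact packets and flagging corrupted ones so that both parties know a retransmission is needed. The simple additive checksum is an assumption chosen for brevity; practical LAN protocols use stronger cyclic redundancy checks.

    /* A minimal sketch of the data-reliability idea: the sender appends a
     * checksum to each packet, and the receiver recomputes it so that any
     * corrupted packet is reported (NAK) rather than silently accepted.
     * The 16-bit additive checksum is illustrative only; real protocols
     * use cyclic redundancy checks. */
    #include <stdio.h>

    typedef unsigned short u16;

    /* Sum the payload bytes into a 16-bit checksum. */
    u16 checksum(const unsigned char *data, int len)
    {
        unsigned long sum = 0;
        int i;
        for (i = 0; i < len; i++)
            sum += data[i];
        return (u16)(sum & 0xFFFF);
    }

    /* Receiver side: returns 1 (ACK) if the packet arrived intact,
     * 0 (NAK) so both sender and receiver know a retransmit is needed. */
    int receive_packet(const unsigned char *data, int len, u16 claimed)
    {
        if (checksum(data, len) == claimed)
            return 1;                     /* ACK: deliver to application */
        fprintf(stderr, "NAK: checksum mismatch, request retransmit\n");
        return 0;
    }

    int main(void)
    {
        unsigned char pkt[] = { 'C', 'A', 'D', ' ', 'd', 'a', 't', 'a' };
        u16 sum = checksum(pkt, sizeof pkt);

        receive_packet(pkt, sizeof pkt, sum);   /* intact: ACK */
        pkt[0] ^= 0x01;                         /* simulate a bit error */
        receive_packet(pkt, sizeof pkt, sum);   /* corrupted: NAK */
        return 0;
    }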

Contributing Technologies

The major technologies contributing to LAN trends are: standards, signal processing, and GaAs and low cost lasers. In most respects, the rapid, innovative standards progress of the last few years has the same effect as an advancing technology. The International Standards Organization - Open Systems Interconnect (ISO - OSI) communications reference model provides a framework for separate physical media and protocols to cooperate in the communication of data, especially through multiple LANs. Signal processing and line conditioning techniques are achieving 1-2 megabits per second over ordinary telephone lines accustomed to maximum data rates of 64 thousand bits per second. Data compression techniques are achieving threefold compression on data transmission (and tenfold on voice and thirtyfold on video). GaAs (due to its unique properties of light/electricity conversion and ultra high speed) and laser technology (due to the laser's unique ability to emit an exact frequency of light in short bursts of high energy) provide the means by which to send ultra high data rates (1-2 billion bits per second) over fiber optic cable for distances measured in miles or kilometers.
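The compression ratios quoted above come from far more sophisticated, proprietary schemes, but the underlying idea of squeezing redundancy out of the data before transmission can be illustrated with the simplest such scheme, run-length encoding. The C sketch below is a minimal illustration of the principle, not any vendor's method; the raster scan line it compresses is an assumed example.

    /* Data compression removes redundancy before transmission.
     * Run-length encoding (RLE), sketched here, is the simplest scheme:
     * runs of repeated bytes become (count, byte) pairs. */
    #include <stdio.h>

    /* Encode runs of repeated bytes; returns bytes written to out. */
    int rle_encode(const unsigned char *in, int len, unsigned char *out)
    {
        int i = 0, o = 0;
        while (i < len) {
            unsigned char b = in[i];
            int run = 1;
            while (i + run < len && in[i + run] == b && run < 255)
                run++;
            out[o++] = (unsigned char)run;
            out[o++] = b;
            i += run;
        }
        return o;
    }

    int main(void)
    {
        /* Raster scan lines of a drawing are mostly background pixels. */
        unsigned char line[64], coded[128];
        int i, n;
        for (i = 0; i < 64; i++)
            line[i] = (i > 28 && i < 36) ? 1 : 0;  /* short stroke */
        n = rle_encode(line, 64, coded);
        printf("64 bytes compressed to %d (%.1fx)\n", n, 64.0 / n);
        return 0;
    }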

Fiber Optics versus Twisted Pair

Engineering departments will have two dramatically different alternatives for LAN cabling: fiber optics and twisted pairs. The cabling selection, in turn, determines the ultimate capacity of the entire local area network. The parameters of these choices are given in Table 3.

                 FIBER OPTICS          TWISTED PAIRS

Actual cable     $5.00 / meter         part of phone
Installation     $5.00 / meter         part of phone
Cost             $1,500 / terminal     $300 / terminal
Bandwidth        1 Gigahertz           2 Megahertz
Error Rate       10^-9                 10^-6
Distance         1,500 meters          150 meters
Media            Data, Voice & Video   Data & Voice

Table 3


Selection

The selection is not as easy as the three orders of magnitude difference in bandwidth and error rate and the one order difference in distance suggest. So prevailing is the aversion to re-cabling that twisted pairs will probably dominate administrative office local area networks. However, engineering organizations make extensive use of graphical data (rather than textual data), whose usage will accrue over time and whose size will multiply as precision, complexity, and pervasiveness increase. Therefore, a responsive CAD system will require substantial data communications capacity.
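Some rough arithmetic shows what is at stake. The figures in the C fragment below are assumptions for illustration (a 2 megabyte graphical model, 1 megabit per second usable on conditioned twisted pair, 100 megabits per second usable on fiber), not measured rates.

    /* Back-of-the-envelope transfer times for a graphical model.
     * The usable data rates are assumptions for this sketch, not
     * vendor specifications. */
    #include <stdio.h>

    int main(void)
    {
        double model_bytes = 2.0e6;     /* a 2-megabyte graphical model */
        double twisted_bps = 1.0e6;     /* conditioned twisted pair     */
        double fiber_bps   = 100.0e6;   /* assumed usable rate on fiber */
        double bits = model_bytes * 8.0;

        printf("Twisted pair: %.1f seconds\n", bits / twisted_bps); /* 16.0 */
        printf("Fiber optics: %.2f seconds\n", bits / fiber_bps);   /* 0.16 */
        return 0;
    }

A sixteen second wait for each model retrieval, repeated throughout the day, is precisely the kind of degradation the speed challenge forbids.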

Coaxial cable (particularly IBM's 75 ohm and, to a much lesser extent, Ethernet's 50 ohm) has almost as its only attraction a mature (almost commodity, i.e., consumer cable television) technology, which must be weighed against fiber optics' more than tenfold bandwidth capacity and more than 100-fold data integrity superiority. For that reason, coaxial cable was not included as a choice.

Trends

The trends for local area networks are:

Standards complying with the ISO - OSI model
Migration of SNA and TCP/IP to ISO - OSI
Repudiation of proprietary networks
Widespread use of fiber between buildings
Substantial use of existing telephone wire
Customized fast chips to speed communications
Introduction of video and voice compression.


 

  Servers

Driving Force

The driving force is the throughput of applications. An important distinction between servers can be made based on whether they run a standard environment (e.g., Fortran 77, UNIX 4.2) and deliver the bulk of their performance benefits without any manual reprogramming. Those which require no reprogramming are generally mainstream to the interests of general engineering and CAD users. The measurement of throughput is highly controversial because the (sometimes bizarre) architectures of computational servers can get radically distorted results on any single measure or benchmark. Nonetheless, the push for throughput is so strong that very proprietary architectures (and the latest technologies) are used. Performance is generally five to ten times that of an engineering workstation, and both are advancing one order of magnitude every 5 years. That puts computational servers in the 25 to 50 million instructions per second (Mips) class.
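The arithmetic behind these figures is simple, and the C fragment below makes it explicit: starting from the 5 Mips workstation described earlier and a five to tenfold server advantage, one order of magnitude of growth every 5 years is assumed to continue. The 1993 numbers are extrapolations of that assumed rule, not predictions of specific products.

    /* Worked version of the growth rule: one order of magnitude (10x)
     * every 5 years, applied to the 5-Mips workstation of the text and
     * a 5x-10x server advantage.  Link with -lm. */
    #include <stdio.h>
    #include <math.h>

    /* Mips after `years`, growing one decade every 5 years. */
    double project(double mips_now, double years)
    {
        return mips_now * pow(10.0, years / 5.0);
    }

    int main(void)
    {
        double workstation = 5.0;                      /* Mips, 1988 class */
        printf("1988 server: %.0f-%.0f Mips\n",
               5.0 * workstation, 10.0 * workstation);  /* 25-50 Mips   */
        printf("1993 server: %.0f-%.0f Mips\n",
               project(5.0 * workstation, 5.0),
               project(10.0 * workstation, 5.0));       /* 250-500 Mips */
        return 0;
    }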

Contributing Technologies

Most of the contributing technologies are the same as for workstations (except for magnetics), discussed above, namely: CMOS semiconductors, gate array logic, and surface mounted chips. The architecture of the server is the major contributor to achieving higher throughput. The architecture chosen for handling concurrency, parallelism, switching, instruction streams, data flows, memory caching, and bus traffic generally determines the throughput power of the server. The commitment to vectors, arrays, and complex versus reduced instruction sets can also materially affect how a computational server executes particular classes of jobs.

Established Name versus Start-up

Unless you are doing research or have exceptional needs for engineering computation, I believe the issue will boil down to this choice. The Established Name will usually have a large repertoire of running and supported engineering analysis and simulation software. The Start-up will probably have the latest cost/performance benefits, and may have a special purpose niche in which it has spectacular performance.

Selection

Since the purpose is to have a computational advantage, the Start-up that has minimized or obviated manual reprogramming will usually offer the best price/performance. However, the viability of these Start-ups has to be raised as a central issue. Good indicators of business viability are alliances with major companies (not laboratories). A truly advanced architecture offered by an established company may also be worthwhile.

Trends

Trends in Computational Servers:

Dramatic increase in parallelism
Relatively constant cost of $200,000 to $400,000
Increase of 1 order of magnitude in power every 5 years
Further specialization into servers dedicated to images, vectors, arrays, and FEA
Intensifying symbolic and logical manipulation and data access capabilities


 

  Systems

Driving Force

The driving force is leading users (who are usually large CAD system customers!) who insist on being able to repeatedly and consistently exploit their product data throughout their entire product design activities, from simulation and analysis through documentation and release to manufacturing, without being restricted to any one vendor. These leading users have mandated that their discrete product data handling activities (processing, sharing, and dissemination) be fully automated into a continuous product process (information) flow. This product process flow is to become the neural network (in the case of CAD) and the nervous system (in the case of CAM) of the Factory of the Future, and it cannot be built on a broad scale without Open Systems. By Open Systems, we mean systems whose components abide by standards which allow users to mix and match vendors' offerings according to need and preference.

Contributing Technologies

The major technologies contributing to Open Systems are LANs and standards. The advances in LAN technology, including the advances in the standards on which it relies, are discussed above in the section entitled "LANs." Standards have been rapidly emerging, not only for LANs and other communications, but also for virtual device interfaces and data base exchanges. The ISO - OSI communications model and the GKS framework are advances in the specification of standards for CAD/CAM Open Systems. On the other hand, the competition between Europe and America has probably held back Open System standards.

European versus American Standards

In this ever shrinking world, parochial orientations such as "European versus American" are counterproductive. What is needed is cooperation and consensus. In the case of virtual device interface standards, Europe's GKS (Graphical Kernel System) has matured to the point where it is ready for truly international acceptance. Its conceptual framework and inherent flexibility make it superior to its American counterparts, Core and PHIGS (Programmer's Hierarchical Interactive Graphics Standard). The progress toward data base exchange standards is discouraging. Neither the American IGES (Initial Graphics Exchange Specification) nor the European SET (Standard d'Echange et de Transfert) has an adequate conceptual framework. SET can be thought of as analogous to a compiled IGES (i.e., it is more compact and faster), but it derives from narrower (aerospace) interests and a centralized implementation.

Selection

In some areas the choice is easy. For example, Fortran will continue to enjoy wide acceptance and binding to all relevant standards. UNIX seems to be the de facto operating system standard. And, in communications, many standards comfortably coexist due to the interchangeability of standards at each layer of the ISO - OSI model. Lastly, GKS has matured to the point where it will experience wide acceptance.

However, in the most important arena, data base exchange, no promising standard has emerged. Both IGES and SET are flawed by their destruction of global information and their lack of a canonical form (i.e., a way of assuring that if two entities are identical, so are their representations). Furthermore, IGES's macro capability is insufficiently flexible in that it forces data types and lacks conditional arguments.
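The canonical form requirement deserves a concrete illustration. In the C sketch below, the same line segment can be represented with its endpoints in either order; a canonicalization step which orders the endpoints makes comparing representations equivalent to comparing entities. The segment type is hypothetical and is not the IGES or SET entity format; it merely demonstrates the property both standards lack.

    /* Illustration of "canonical form": one entity, one representation.
     * The Segment type here is a hypothetical stand-in, not the IGES or
     * SET format. */
    #include <stdio.h>

    typedef struct { double x, y; } Point;
    typedef struct { Point a, b; } Segment;

    /* Canonicalize: store the lexicographically smaller endpoint first. */
    void canonicalize(Segment *s)
    {
        if (s->b.x < s->a.x || (s->b.x == s->a.x && s->b.y < s->a.y)) {
            Point t = s->a;
            s->a = s->b;
            s->b = t;
        }
    }

    int same_representation(const Segment *p, const Segment *q)
    {
        return p->a.x == q->a.x && p->a.y == q->a.y &&
               p->b.x == q->b.x && p->b.y == q->b.y;
    }

    int main(void)
    {
        /* One entity, two representations: endpoints in either order. */
        Segment s1 = { {0, 0}, {1, 2} };
        Segment s2 = { {1, 2}, {0, 0} };

        printf("before: %s\n",
               same_representation(&s1, &s2) ? "equal" : "differ");
        canonicalize(&s1);
        canonicalize(&s2);
        printf("after:  %s\n",
               same_representation(&s1, &s2) ? "equal" : "differ");
        return 0;
    }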

Progress in specification, implementation, and silicon (clock speed, density, and customization) will prevent speed of execution from causing a worthwhile standard to be rejected. Good implementations of GKS do not experience appreciable delays. IGES's slowness is a secondary issue; it has much more profound flaws.

Trends

The trends in Open Systems are:

Perpetuation of bindings to Fortran
Wider acceptance of UNIX Operating System
Acceptance of GKS over PHIGS and Core
Postponement of neutral data base acceptance
Emergence of an acceptable product data model.


 

PRESENTED AT: CAMP '85 Conference on September 25, 1985 in Berlin, Germany





Copyright © 1985 by James E. Cook
