factory of the future, where making a new part, or replacing an old one, will be as automatic as getting a hard copy on your home computer printer. It may sound like science fiction, but it's being developed right now in the Automated Manufacturing Research Facility at the National Institute of Standards and Technology.

Manufacturing is important to the United States economy. Wherever we have trouble with offshore competition, it turns out that manufacturing is a major part of the problem. This shows up in terms of quality, cost, and speed in going to market. From its initial work in discrete parts manufacturing in the early 80s to its more recent work on composites and metal atomization, the AMRF has concentrated on integrating a variety of manufacturing components: equipment, computers, and people.

The first phase was the development of techniques and software needed to integrate machines on the shop floor. A great deal of effort was devoted to simply getting these machines to work together. We are now in the second phase of development, concerned with determining how the data needed to manufacture a part is generated, stored, and made available when needed, and how to make sure it is in the right form.

To understand the complexities of this task, one must first look at five manufacturing steps in the factory of the future: design, process planning, offline programming, shop floor control, and manufacturing processes. The first step is to create the specifications for a part with the help of computer-aided design programs. Once the part has been designed, using the CAD system and other engineering analysis tools, that data is communicated to a process planning system. This system determines the steps to be taken in order to make the part, the kinds of tools needed, and the sequence in which they will be used. The next step is to use offline programming to generate the data to drive each piece of equipment specified in the process plan.
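The five steps described above form a pipeline, each stage consuming the previous stage's output. A minimal sketch of that flow (all names here are illustrative stubs, not actual NIST software) might chain them as plain functions:

```python
# Hypothetical sketch of the five-step pipeline:
# design -> process planning -> offline programming -> shop floor control -> manufacturing.

def design(spec):
    """CAD stage: turn a requirement into part geometry (stub)."""
    return {"part": spec, "geometry": f"{spec}-model"}

def plan_process(cad):
    """Process planning: choose operations, tools, and their sequence."""
    return [{"op": "mill", "tool": "end-mill"},
            {"op": "deburr", "tool": "brush"},
            {"op": "inspect", "tool": "cmm"}]

def program_offline(plan):
    """Offline programming: generate per-machine instructions from the plan."""
    return [f"RUN {step['op']} WITH {step['tool']}" for step in plan]

def control_shop_floor(programs):
    """Shop floor control: schedule and dispatch programs to workstations."""
    return list(enumerate(programs))

def manufacture(schedule):
    """Manufacturing processes: execute each scheduled program."""
    return [f"done: {prog}" for _, prog in schedule]

result = manufacture(control_shop_floor(program_offline(plan_process(design("bracket")))))
```

The point of the sketch is only the shape of the data flow: each function's output is the next function's input, which is exactly the integration problem the AMRF set out to standardize.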
Offline programming, along with process planning and design, forms the foundation for concurrent engineering. All this data is then made available to the control function, and then the real-time planning and scheduling of the work can be done. The data is then communicated to a system of workstations. The part is machined, deburred, and inspected using the control and machining systems developed by NIST.

The first step in this integrated process is to build a detailed architecture which describes the functionality, the relationships, and the interfaces of those manufacturing components. The second step is to provide the protocols and standards which allow the computers to talk to one another across those defined interfaces. The third step is to provide the standards which allow for storing, retrieving, displaying, and exchanging the information they need to carry out their assigned functions.

NIST's strategy was to develop two separate kinds of interfaces: one to manage the retrieval of data from a variety of data repositories, and another to enable the engineering and control functions to communicate with each other. Let's examine the second interface first.

In manufacturing a complex part, it is often necessary for a wide variety of CAD systems at different companies to be able to exchange information. The first attempt to do this was called IGES, the Initial Graphics Exchange Specification. It provides a neutral format for the exchange of computer-aided design data, but it has some limitations for use by downstream planning and manufacturing processes. IGES drawing notes and annotations were really intended for human consumption. Tolerance and surface finish information is not tied directly to the geometry of the part, and this makes it unusable by automated systems. The most recent attempt to extend the capabilities of IGES is called STEP, the Standard for the Exchange of Product Model Data.
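The IGES limitation described above, tolerances living in free-text drawing notes rather than on the geometry, can be made concrete with a small data-model sketch. This is a hypothetical illustration of the contrast, not the actual IGES or STEP schema:

```python
# Contrast: IGES-style annotation (free text aimed at humans) versus a
# STEP-style model where the tolerance is linked to the face it governs,
# so downstream software can find and use it.

from dataclasses import dataclass, field

# IGES-style: the tolerance lives in a drawing note; nothing ties it to a face.
iges_like = {
    "faces": ["face-1", "face-2"],
    "notes": ["0.01 mm flatness on the top surface"],  # which face? a human must decide
}

@dataclass
class Face:
    name: str
    tolerances: list = field(default_factory=list)  # tolerances attached to this face

# STEP-style: the tolerance is a property of the geometry itself.
top = Face("face-1")
top.tolerances.append({"type": "flatness", "value_mm": 0.01})

def machining_allowance(face):
    """An automated planner can read the tolerance directly off the geometry."""
    return min((t["value_mm"] for t in face.tolerances), default=None)
```

In the first form an automated process planner would have to parse natural-language notes; in the second, the tolerance is machine-queryable, which is the property STEP's product model is designed to provide.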
STEP is intended to support a wide variety of applications which cover the life cycle of the product, from design through manufacture and support. The STEP standard is designed to be extensible and not limited to file exchange technologies. The U.S. effort to develop the international standard STEP is known as PDES, which stands for Product Data Exchange using STEP.

A major focus of the product data sharing effort is the development of the National PDES Testbed, which will provide a testing foundation for the development of product data standards. The testbed supports PDES, Inc., a major industrial consortium. The consortium consists of more than 25 member companies that have a strong interest in product data sharing technology. The testbed provides systems, software, and personnel for these testing and evaluation efforts. What I've observed is that within the PDES, Inc. effort itself, there's an enormous amount of sharing, and that accelerates the whole process. Companies that become involved early on in the PDES initiative will have a great opportunity to develop a competitive edge.

NIST is also continuing to refine the rest of the interfaces which drive the functions that generate all the data needed to plan, schedule, manufacture, and inspect each individual part. STEP provides one of the key interfaces for exchanging data between manufacturing functions. These interfaces will allow the paperless exchange of data between the manufacturing functions which control the flow of production across different factory equipment. Three of these manufacturing functions are process planning, scheduling, and offline programming. Offline programming is also used to create simulations or animations of actual manufacturing activities. Here a programmer manipulates a graphical representation of the robot to perform a task that he'd like the robot to perform.
Once the programmer is satisfied that the program is actually operating the way he'd like, he can then download that information to an actual robot, which will then perform the task.

Determining the information that goes across functional components is one of the main problems of integrating commercial hardware and software packages. One candidate developed at NIST is known as ALPS, A Language for Process Specification. ALPS provides a single representation for process plans which can be used by individual shop floor control subsystems to generate their own plans and schedules.

It's important to test the interaction between the various subsystems on the factory floor. However, a lot of this testing doesn't require any actual manufacturing to take place. What you're looking for is the orderly startup and shutdown of the various systems, and identifying deadlock or unstable situations. Doing that kind of testing with the actual factory floor equipment would be prohibitively expensive. Instead, it's better to do as much of that testing as possible with emulated systems; that way you can identify the weaknesses, and only at the end of the testing phase do you actually swap in the real factory floor equipment.

The answer was to develop a control system in which shop floor hardware is simulated or animated. But this is only a partial test to validate that product data is communicated completely, accurately, and unambiguously. To finish the validation process, the software must also drive the real hardware. At the cleaning and deburring workstation, both types of validation are being demonstrated: the same information used to drive simulations drives the real equipment. Here, when the factory sends information to the workstation, you can no longer tell whether the robot or a simulation of the robot is actually performing the operation. This allows us to debug high-level programs without using the robot.
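The swap-in idea described above, where the controller cannot tell a simulated robot from the real one, amounts to putting both behind the same interface. A minimal sketch (illustrative names only, not the actual AMRF control software) might look like this:

```python
# Both the emulated robot and the real one implement the same interface,
# so the workstation controller cannot tell, and need not care, which is
# executing the plan. All names here are illustrative.

from abc import ABC, abstractmethod

class Robot(ABC):
    @abstractmethod
    def execute(self, step: str) -> str: ...

class SimulatedRobot(Robot):
    """Animates the step instead of moving hardware; used for early testing."""
    def execute(self, step):
        return f"simulated: {step}"

class RealRobot(Robot):
    """Would drive the physical arm; stubbed here."""
    def execute(self, step):
        return f"executed: {step}"

def run_plan(robot: Robot, plan):
    """Controller logic is identical whichever robot is plugged in."""
    return [robot.execute(step) for step in plan]

plan = ["pick part", "deburr edge", "place part"]
log_sim = run_plan(SimulatedRobot(), plan)   # debug without the robot
log_real = run_plan(RealRobot(), plan)       # same data drives real equipment
```

Because `run_plan` sees only the shared interface, the same process data can drive the simulation during debugging and the real equipment during final validation, which is exactly the two-stage validation the workstation demonstrates.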
The second type of standards called for by NIST's strategy are standards for the structuring and retrieval of manufacturing data. The major problem in sharing and integrating manufacturing information is diversity: there is no commonality among the database systems, access methods, or even the information models used by the various engineering systems, production management and scheduling systems, and control systems.

The solution was IMDAS, the Integrated Manufacturing Data Administration System. This is how it works. When a user needs some data, a query is sent to IMDAS, which takes care of translation, distribution, and routing. IMDAS forwards the query to the appropriate data manager. The data may reside in one computer or in many computers at locations all over the country. IMDAS receives the information the user needs, then assembles and delivers it back to the user in the appropriate form. The entire IMDAS operation is transparent to the user.

IMDAS was originally designed to solve a very practical problem: access from the control systems on the AMRF floor to geometry information, scheduling information, control programs, and possibly other manufacturing information, wherever it might reside. As it turns out, the standard interfaces, the standard forms of data exchange, and the integrating information model, all of which were necessary to the distributed system, may have turned out to be the most important parts of the project. These systems have been shown to work with relational systems, with object-oriented systems, and with file systems; in short, with virtually any system in which manufacturing data might reside.

One of the AMRF's goals is to support the development of international standards in all areas of manufacturing, so that it is possible to easily substitute the hardware and software of different vendors in the system.
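The query flow described above can be sketched as a tiny router: one entry point that forwards each request to whichever data manager holds that kind of data and returns the result in one uniform form. The stores and routing rules below are illustrative stand-ins, not the real IMDAS:

```python
# Minimal sketch of the IMDAS idea: a single query interface that routes
# requests to the repository holding the data, so the caller never sees
# where the data actually lives.

# Each "data manager" could be a relational DB, an object-oriented system,
# or a file system; here they are plain dicts keyed by part name.
geometry_store = {"bracket": "bracket geometry model"}
schedule_store = {"bracket": "machine on Tuesday, cell 3"}

ROUTES = {
    "geometry": geometry_store,
    "schedule": schedule_store,
}

def imdas_query(kind, part):
    """Route the query to the right repository and return a uniform record.
    The distribution is transparent: the caller names what it wants, not where."""
    store = ROUTES.get(kind)
    if store is None:
        raise KeyError(f"no data manager for {kind!r}")
    return {"part": part, "kind": kind, "data": store[part]}
```

The design point is the indirection: control systems on the shop floor ask for "the geometry of this part", and the router hides whether that answer came from one computer or many.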
What we're trying to do here at the NIST Automated Manufacturing Research Facility is research to develop better measurement techniques and to help develop standards for the effective application of computer technology to these manufacturing problems. To compete and win in the international arena, United States companies are simply going to have to offer products and services that are world-class.