
Shared Disk Application in Statistical Quality Control

IP.com Disclosure Number: IPCOM000119857D
Original Publication Date: 1991-Mar-01
Included in the Prior Art Database: 2005-Apr-02
Document File: 4 page(s) / 162K

Publishing Venue

IBM

Related People

Manoranjan, K: AUTHOR

Abstract

Data collection and processing are an integral part of manufacturing activity. Data can be gathered while monitoring the process, while inspecting product, or during regular testing. Collecting and processing data to make decisions is a very important aspect of statistical process control.

This text was extracted from an ASCII text file.
This is the abbreviated version, containing approximately 41% of the total text.

Shared Disk Application in Statistical Quality Control

      Data collection and processing are an integral part of
manufacturing activity.  Data can be gathered while monitoring the
process, while inspecting product, or during regular testing.
Collecting and processing data to make decisions is a very important
aspect of statistical process control.

      Manufacturing is a round-the-clock, three-shift operation, so
data collection and analysis take place over a twenty-four-hour
period.  Errors can be made in data recording, calculation, or
interpretation.  Since this activity is performed by almost everyone
on the manufacturing floor, the possibility of making errors
increases.  These errors can be expensive, since good product can be
scrapped incorrectly or defective product may be shipped as good
product.

      To ensure data integrity and accuracy, it is necessary that all
data entries be sent to a common file.  This will make it possible to
check for any errors made and will eliminate the need to combine the
information collected from different locations, individuals, and
shifts.  It will also eliminate calculation errors, since all
calculations will be performed by the system itself and not by
different individuals.
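
      As a rough illustration of the kind of calculation the system
would perform, the REXX fragment below reads the common file and
reports the mean and range of the recorded values, the figures a
control chart would normally use.  The file name SQCDATA DATA, the
filemode B, and the record layout are assumptions made for this
sketch; none of them appear in the original disclosure.

      /* SQCMEAN EXEC - an illustrative sketch only.  It reads the */
      /* common data file and reports the mean and range of the    */
      /* measured values.  The file name, the filemode, and the    */
      /* record layout (date, time, userid, part number, value)    */
      /* are assumed here; they do not appear in the disclosure.   */
      'EXECIO * DISKR SQCDATA DATA B (FINIS'   /* stack every record */
      n = queued()
      if n = 0 then do
        say 'No data has been recorded yet.'
        exit 0
      end
      sum = 0
      do i = 1 to n
        parse pull . . . . value        /* keep only the measured value */
        sum = sum + value
        if i = 1 then do; lo = value; hi = value; end
        else do
          if value < lo then lo = value
          if value > hi then hi = value
        end
      end
      range = hi - lo
      say 'Entries:' n '  Mean:' format(sum / n, , 3) '  Range:' range
      exit 0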

      This centralization of data is the key to making data
collection and analysis a very important part of Statistical Quality
Control (SQC) strategy.  It is similar to a network concept, wherein
the information entered by each operator is shared by all who are a
part of this network.

      The basic aspects of this network are shown in Fig. 1. To
achieve this centralization in a manufacturing process, the following
steps should be taken:
      1.   A decision should be made identifying the person who will
share his or her userid.  All data will be sent automatically to this
userid, which becomes the focal point where all data is collected and
processed.
      2.   Since all personal files reside on the 191 disk, the
individual must obtain a minidisk of a few cylinders (at least three)
at disk address 192.
      3.   The owner should format the 192 disk using the CMS FORMAT
command and give it a label.
      4.   The owner should then provide access to those individuals
who are part of this network and will be involved in SQC.  This is
done using the RACFPERM command in VM.  Once this command is used
correctly, the owner's 192 disk can be accessed by all who are
sharing this disk; a sketch of the link and access sequence the
sharers would use appears after this list.
      5.   Next, the program should be installed on the owner's 192
disk.  Program execution will create the input/output screen for data
entry for the operator and will store all data in a CMS file; a
minimal sketch of such an EXEC appears after this list.  The program
will be an EXEC file type and will usually contain program code
written in Restructured Extended Executor (REXX).  Program execution
for the owner of t...
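
      For the operators who share the disk (steps 2 through 4), the
usual CMS pattern is to link to the owner's 192 minidisk and access
it at a free filemode before running the data-entry program.  The
sketch below is illustrative only: the owner userid SQCOWNR, the link
mode, and the filemode B are assumptions, and the authorization
itself is whatever the owner established with RACFPERM in step 4.

      /* GETSQC EXEC - an illustrative sketch only.  It links to the  */
      /* owner's shared 192 disk and accesses it as filemode B.  The  */
      /* owner userid SQCOWNR, the link mode, and the filemode are    */
      /* assumptions; the authorization itself comes from the         */
      /* RACFPERM setup described in step 4.                          */
      'CP LINK SQCOWNR 192 192 MR'    /* ask for write access when free */
      if rc <> 0 then do
        say 'LINK to the shared 192 disk failed, return code' rc
        exit rc
      end
      'ACCESS 192 B'                  /* make the shared disk filemode B */
      if rc <> 0 then say 'ACCESS 192 B failed, return code' rc
      exit rc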
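
      A minimal sketch of the kind of data-entry EXEC described in
step 5 is shown next.  It prompts the operator, builds one record,
and adds it to a common CMS file on the shared disk with EXECIO.  The
file name SQCDATA DATA, the filemode B, and the record layout are the
same assumptions used in the earlier fragment and are not part of the
original disclosure.

      /* SQCENTRY EXEC - an illustrative sketch of a data-entry        */
      /* program.  The prompts, the record layout, and the file name   */
      /* SQCDATA DATA B are assumptions, not part of the disclosure.   */
      say 'Enter part number:'
      pull partno .                   /* first word only, uppercased */
      say 'Enter measured value:'
      pull value .
      if datatype(value, 'N') = 0 then do
        say 'The value must be numeric; nothing was recorded.'
        exit 4
      end
      /* build one record and add it after the last record of the */
      /* common file on the shared disk                            */
      queue date('U') time() userid() partno value
      'EXECIO 1 DISKW SQCDATA DATA B (FINIS'
      if rc = 0 then say 'Entry recorded in SQCDATA DATA B.'
      else say 'EXECIO DISKW failed with return code' rc
      exit rc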