The use of Statistical Process Control (SPC) extends across numerous domains, and the key metrics vary by industry. For example, Vetter & Morrice (2019) describe how SPC is used in medical applications. According to them, “Undertaking, achieving, and reporting continuous quality improvement in anesthesiology, critical care, perioperative medicine, and acute and chronic pain management all fundamentally rely on applying statistical process control methods and tools.” Each of those applications has its own key metrics.
Goetsch & Davis (2021, p. 305) tell a story about the use of SPC in another industry: semiconductor manufacturing. According to them, a North American semiconductor plant they visited had reduced its number of control charts from 900 down to 100 in a few years. This shows how easy it is to “overmeasure” a process: the plant was simply collecting too much data. Why is that a bad thing? First, it takes time and money: some poor employee at the plant had to update all those control charts. Second, the charts had to be stored somewhere, and storage consumes space even when the charts are kept digitally. Finally, all that “overmeasured” data is just noise that masks the signal, which is the real information.
Determining exactly what to measure is exceedingly difficult when the processes being monitored are complex. Even when another control technique is in use, SPC is still recommended, at least according to Montgomery et al. (2018). For example, SPC is used to augment a system called engineering process control (EPC). EPC is used for industrial processes and is best for situations where the mean value of what is being measured drifts over time. This is the opposite of what SPC assumes: values that vary about a fixed mean.
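To make that fixed-mean assumption concrete, here is a minimal sketch (in Python, with made-up measurements) of the kind of individuals control chart SPC uses. The 3-sigma limits are estimated from the average moving range, the usual I-MR approach, so a single special-cause spike stands out instead of inflating the limits:

```python
def imr_limits(samples):
    """Estimate 3-sigma limits for an individuals (I-MR) control chart.
    Sigma comes from the average moving range divided by the d2 constant
    (1.128 for subgroups of 2), so one outlier does not inflate the limits."""
    mean = sum(samples) / len(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples):
    """Return the indices of points outside the 3-sigma control limits."""
    lcl, ucl = imr_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

# Made-up measurements varying about a fixed mean, with one special-cause spike.
data = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 13.5, 10.0, 9.9, 10.1]
print(out_of_control(data))  # the spike at index 6 is flagged
```

A process whose mean drifts over time, the EPC situation, would trip these limits constantly, which is why the two techniques complement rather than replace each other.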
The data storage problems mentioned by Goetsch & Davis happen even for companies that store their data digitally. For example, website hosting companies log (record) information about which web pages are requested, which images are downloaded, and any errors that occur. The most common error is the dreaded “404 – page not found.”
The length of time these logs are retained depends on the industry and the jurisdiction. For example, in the U.S. healthcare industry, HIPAA requirements mandate that logs be maintained for six years. These requirements have been straining smaller web hosting companies because the logs occupy so much space on hard drives!
As much space as they occupy, log files are extremely valuable! Network engineers and QA specialists pore over these logs, not only looking for errors but also computing statistics about each web page. Specialized tools make finding errors and measuring those statistics easy.
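As a toy illustration of what those tools do under the hood, here is a short Python sketch (the log lines are fabricated, in the Common Log Format that many web servers use) that tallies which pages are returning the dreaded 404:

```python
from collections import Counter

# Fabricated lines in the Common Log Format used by many web servers.
log_lines = [
    '203.0.113.7 - - [10/Oct/2024:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
    '203.0.113.7 - - [10/Oct/2024:13:55:38 +0000] "GET /logo.png HTTP/1.1" 200 512',
    '198.51.100.4 - - [10/Oct/2024:13:56:01 +0000] "GET /old-page.html HTTP/1.1" 404 153',
    '198.51.100.4 - - [10/Oct/2024:13:56:02 +0000] "GET /old-page.html HTTP/1.1" 404 153',
]

def not_found_counts(lines):
    """Tally requested paths that returned the 404 status code."""
    counts = Counter()
    for line in lines:
        # Split around the quoted request: prefix, "METHOD path proto", status/bytes.
        _, request, tail = line.split('"', 2)
        path = request.split()[1]
        status = tail.split()[0]
        if status == "404":
            counts[path] += 1
    return counts

print(not_found_counts(log_lines))  # /old-page.html was missed twice
```

Real log-analysis tools do the same kind of counting at vastly larger scale, across millions of lines per day.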
Security specialists also use these logs to detect potential security threats: they can spot unauthorized access, identify malware infections, and use this information to respond to security breaches.
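One simple version of this kind of detection can be sketched in a few lines of Python. The events and the threshold below are made up; the idea is just to flag any client address that racks up repeated authentication failures (HTTP 401/403), a crude signal of password guessing:

```python
from collections import Counter

# Hypothetical (ip, status) pairs extracted from a web server log.
events = [
    ("198.51.100.9", 401), ("198.51.100.9", 401), ("198.51.100.9", 401),
    ("198.51.100.9", 401), ("203.0.113.5", 200), ("203.0.113.5", 404),
]

def suspicious_ips(events, threshold=3):
    """Flag IPs whose count of auth failures (401/403) meets the threshold."""
    failures = Counter(ip for ip, status in events if status in (401, 403))
    return [ip for ip, count in failures.items() if count >= threshold]

print(suspicious_ips(events))  # flags 198.51.100.9
```

Production intrusion-detection systems add time windows, known-bad-address lists, and many more signals, but the underlying tally is the same.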
One advantage of digital logs over traditional paper-based control charts is that it is possible to create “alerts.” For example, if the log shows that a website went down, an alert in the form of a text message can automatically be sent to a network engineer so that the problem gets corrected. I imagine that other SPC software systems have a similar feature.
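The alert logic itself can be quite simple. Here is a hedged sketch, with an invented rule (alert after three consecutive failed checks) and a stubbed-out `notify` function standing in for whatever SMS or pager integration a real system would use:

```python
def should_alert(recent_statuses, failures_to_alert=3):
    """Return True when the last `failures_to_alert` HTTP status checks all
    failed (5xx), i.e. the site looks down rather than merely flaky."""
    tail = recent_statuses[-failures_to_alert:]
    return len(tail) == failures_to_alert and all(s >= 500 for s in tail)

def notify(message):
    # Stand-in for a real text-message/pager integration.
    print(f"ALERT: {message}")

statuses = [200, 200, 503, 503, 503]
if should_alert(statuses):
    notify("website down: 3 consecutive failed checks")
```

Requiring several consecutive failures before paging anyone is a common design choice: it keeps a single dropped request from waking an engineer at 3 a.m.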
While storing digital logs is expensive, website hosting companies stick to the motto: “store everything, analyze later.” The costs are sometimes worth it.
References
Goetsch, D. L. & Davis, S. B. (2021). Quality management for organizational excellence: Introduction to total quality (9th ed.). Pearson.
Montgomery, D. et al. (2018). Integrating statistical process control and engineering process control. Journal of Quality Technology, 26(2). https://doi.org/10.1080/00224065.1994.11979508
Vetter, T. & Morrice, D. (2019). Statistical process control: No hits, no runs, no errors? Anesthesia & Analgesia, 128(2), 374-382. https://doi.org/10.1213/ANE.0000000000003977