Unit - 4
TQM tools and techniques
Project Managers rely on several tools and techniques for total quality management. They are
- Right First Time: Employees ensure quality while they work. They do things right the first time and aim for zero defects.
- Benchmarking: It is the process of learning from the best practices of other projects that produce superior performance, and adopting those exceptionally high-quality practices.
- Outsourcing: It is subcontracting services and operations to outside firms that can do them more cheaply and better.
- ISO 9000: This is a set of quality standards created by the International Organization for Standardization (ISO). Organizations obtain certification from ISO for product testing, employee training, record keeping, supplier relations, and repair policies and procedures.
- Statistical Quality Control: It includes a set of specific statistical tools that can be used to monitor quality. It is based on sampling.
- Just-in-Time Inventory Management (JIT): Inventories are received just-in-time to be used up by production. They are not stored.
- Speed: Speed is the time needed to accomplish activities. TQM increases speed, and speed becomes part of the project culture.
- Training: Employees are provided continuous training in quality matters. Quality circles also serve as training grounds for TQM.
A control chart is a graph that displays process data in time order. It consists of a centre line, an upper control limit, and a lower control limit. The centre line represents the process average. The control limits, drawn as horizontal lines above and below the centre line, show whether the process is in control or out of control. Control limits are based on process variation.
Types of control chart
There are various types of control chart used for different types of data and for specific purposes. Selecting the right type of chart is the first priority. Let us discuss some of the charts which can be used for the following types of data.
1. Attribute data – When your data is in attribute (count) form, use control charts such as:
- P chart
- U chart
- C chart
Attribute data are the number of defects, defective units, etc.
2. Numerical data – When your data is continuous, use control charts such as the following (a sketch of the limit calculations appears after this list):
- X bar chart
- R chart
- S chart
Examples of numerical data are measurements of length, weight, temperature, etc.
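To make the limit calculations concrete, here is a minimal Python sketch for the X-bar and R charts used with numerical data. The measurements are randomly generated stand-ins, and A2, D3, and D4 are the standard control chart constants for subgroups of size 5:

```python
import numpy as np

# Standard control chart constants for subgroup size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

# Hypothetical data: 20 subgroups of 5 measurements each
rng = np.random.default_rng(1)
subgroups = rng.normal(loc=25.4, scale=0.02, size=(20, 5))

xbar = subgroups.mean(axis=1)                      # subgroup means
r = subgroups.max(axis=1) - subgroups.min(axis=1)  # subgroup ranges
grand_mean, r_bar = xbar.mean(), r.mean()

# X-bar chart: centre line and control limits
ucl_x = grand_mean + A2 * r_bar
lcl_x = grand_mean - A2 * r_bar

# R chart: centre line and control limits
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

print(f"X-bar chart: CL={grand_mean:.4f}, UCL={ucl_x:.4f}, LCL={lcl_x:.4f}")
print(f"R chart:     CL={r_bar:.4f}, UCL={ucl_r:.4f}, LCL={lcl_r:.4f}")

# Points outside the limits signal an out-of-control (assignable-cause) condition
print("Out-of-control subgroups:", np.where((xbar > ucl_x) | (xbar < lcl_x))[0])
```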
When to use control charts?
- To examine whether the process is stable or not.
- To understand the process variation over time.
- To detect variation as it occurs so that it can be fixed immediately.
- To find out whether the process is in statistical control or not (i.e., whether variation is due to chance or to assignable causes).
Benefits
- Gives a visual representation of what is going on in a process.
- Easy to understand and to interpret.
- Helps in decision making for process improvement goals.
- Identifies the type of cause (common or special) behind variation in a process.
Process Capability (Cp) is a statistical measurement of a process’s ability to produce parts within specified limits on a consistent basis. To determine how a process is operating, we can calculate Cp (Process Capability), Cpk (Process Capability Index), or Pp (Preliminary Process Capability, also called Process Performance) and Ppk (the corresponding index), depending on the state of the process and the method of determining the standard deviation or sigma value. The Cp and Cpk calculations use the within-subgroup (short-term) standard deviation estimated from rational subgroups. The Pp and Ppk calculations use the overall standard deviation of all the studied data. The Cp and Cpk indices are used to evaluate existing, established processes in statistical control. The Pp and Ppk indices are used to evaluate a new process or one that is not in statistical control.
Process capability indices Cp and Cpk evaluate the output of a process in comparison to the specification limits determined by the target value and the tolerance range. Cp tells you if your process is capable of making parts within specifications and Cpk tells you if your process is centred between the specification limits. When engineers are designing parts, they must consider the capability of the machine or process selected to produce the part.
To illustrate, let us use a real-world example. Imagine that you are driving your vehicle over a bridge. The width of your vehicle is equivalent to the spread or range of the data. The guardrails on each side of the bridge are your specification limits. You must keep your vehicle on the bridge to reach the other side. The Cp value is equivalent to the distance your vehicle stays away from the guardrails, and Cpk represents how well you are driving down the middle of the bridge. Obviously, the narrower the spread of your data (the smaller your car), the more distance there is between the vehicle and the guardrails and the more likely you are to stay on the bridge.
The Cp index is a fundamental indication of process capability. The Cp value is calculated from the specification limits and the standard deviation of the process: Cp = (USL − LSL) / 6σ. Most companies require a Cp of 1.33 or greater.
The Cpk index goes a step further by examining how close the process centre is to the specification limits, taking the common-cause process variation into account: Cpk = min[(USL − μ), (μ − LSL)] / 3σ, where μ is the process mean. The larger the Cpk value, the closer the mean of the data is to the target value. Cpk is calculated using the specification limits, the standard deviation (sigma), and the mean value. The Cpk value should be between 1 and 3; if it is lower than 1, the process needs improvement.
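In code, both indices reduce to a few lines. The following Python sketch applies the formulas above to illustrative data; the specification limits, mean, and sigma are invented for the example:

```python
import numpy as np

def cp(usl: float, lsl: float, sigma: float) -> float:
    """Process capability: specification width divided by six process sigmas."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl: float, lsl: float, mu: float, sigma: float) -> float:
    """Process capability index: the same comparison, penalized for off-centring."""
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

# Illustrative process: mean near 10.01, sigma near 0.02, specs 10.00 +/- 0.10
data = np.random.default_rng(7).normal(10.01, 0.02, size=50)
mu, sigma = data.mean(), data.std(ddof=1)
print(f"Cp  = {cp(10.10, 9.90, sigma):.2f}")
print(f"Cpk = {cpk(10.10, 9.90, mu, sigma):.2f}")
```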
The Cp and Cpk indices are only as good as the data used. Accurate process capability studies are dependent upon three basic assumptions regarding the data:
- There are no special causes of variation in the process and it is in a state of statistical control. Any special causes must be discovered and resolved.
- The data fits a normal distribution, exhibiting a bell-shaped curve and can be calculated to plus or minus three sigmas. There are cases when the data does not fit a normal distribution.
- The sample data is representative of the population. The data should be randomly collected from a large production run. Many companies require at least 25 to preferably 50 sample measurements be collected.
Why Measure Process Capability
In manufacturing and many other types of business, reducing waste and providing a quality product are imperative for survival in today’s marketplace. Waste exists in many forms in a process. When we look at the bigger picture, process capability is more than just measuring Cp and Cpk values. Process capability is just one tool in the Statistical Process Control (SPC) toolbox. Implementing SPC involves collecting and analysing data to understand the statistical performance of the process and to identify the causes of variation within it. Important knowledge is obtained by focusing on the capability of the process. Monitoring process capability allows manufacturing process performance to be evaluated and adjusted as needed to assure that products meet the design or customer’s requirements. Used effectively, this information can reduce scrap, improve product quality and consistency, and lower both the cost to manufacture and the cost of poor quality.
How to Measure Process Capability
The capability indices can be calculated manually, although there are several software packages available that can complete the calculations and provide graphical data illustrating process capability. For the example in this section, we will utilize a popular statistical software package. For our example, we will utilize data from randomly collected measurements of a key characteristic of a machined part. To better represent the population values, the sample data must be randomly collected, preferably over time from a large production run. A few things to keep in mind:
- Our data is quantitative and variable
- Our data consists of 100 measurements
- The target dimension is 25.4 mm
- USL (Upper Specification Limit) = 25.527 mm
- LSL (Lower Specification Limit) = 25.273 mm
- Tolerance (USL − LSL) = 0.254 mm
First, we will examine our data with a simple histogram to determine whether it could fit a normal distribution. In addition, we can generate a probability plot evaluating how well the data fits a straight line; a good fit indicates, with 95% confidence, that the data follows a normal distribution.
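As an illustration, this check could be scripted as below. The data are randomly generated stand-ins for the 100 measurements (the mean and sigma are assumptions chosen to be consistent with the results reported next), and SciPy’s Anderson–Darling test stands in for the statistical package’s probability plot:

```python
import numpy as np
from scipy import stats

# Randomly generated stand-in for the 100 measurements; the mean and sigma
# below are assumptions consistent with the reported capability results
data = np.random.default_rng(3).normal(25.4015, 0.0252, size=100)

# Histogram counts give a quick check of the bell shape
counts, _ = np.histogram(data, bins=10)
print("Histogram counts:", counts)

# Anderson-Darling normality test: a statistic below the 5% critical value
# means no evidence against normality at the 95% confidence level
result = stats.anderson(data, dist="norm")
idx = list(result.significance_level).index(5.0)
print(f"A-D statistic = {result.statistic:.3f}, "
      f"5% critical value = {result.critical_values[idx]:.3f}")
```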
Now let us examine the process capability report (a short calculation reproducing these values follows the list):
- Cp (Process Capability) = 1.68
- Cpk (Process Capability Index) = 1.66
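These reported values can be reproduced from the Cp and Cpk formulas given earlier. Note that the process mean and standard deviation below are back-solved from the reported indices rather than taken from the original data set:

```python
usl, lsl = 25.527, 25.273        # specification limits from the example
mu, sigma = 25.4015, 0.0252      # assumed mean and sigma (back-solved, in mm)

cp = (usl - lsl) / (6 * sigma)                                 # ~1.68
cpk = min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))  # ~1.66
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```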
Using the graph, we can further evaluate process capability by comparing the spread of the product specifications to the spread of the process data, as measured by six process standard deviation units (6σ).
Through examination of the reports, we can determine that our example process is in a state of statistical control. All the data points fall well within the specification limits with a normal distribution. A process where almost all the measurements fall inside the specification limits is deemed a capable process. Process capability studies are valuable tools when used properly. As previously mentioned, the information gained is generally used to reduce waste and improve product quality. In addition, by knowing your process capabilities, the design team can work with manufacturing to improve product quality, and processes that are “not in control” may be targeted for improvement. During a typical Kaizen event or other quality improvement initiatives, Process Capability is calculated at the start and end of the study to measure the level of improvement achieved. Accurate knowledge of process capability enables management to make decisions regarding where to apply available resources based on data.
Six Sigma is a highly disciplined process that focuses on delivering near-perfect products and services consistently. Its strength is that it is a continuous improvement process with an unwavering focus on change empowerment, seamless training of resources, and continuous top management support. These three are known as the pillars of Six Sigma. If Six Sigma is implemented methodically, it will give sustained results for any process. This raises the question of what that process is, which is explained next.
Process of Six Sigma
Six Sigma follows a process named DMAIC (pronounced "D-MAC"), which stands for Define, Measure, Analyse, Improve, and Control.
- Define: Define the problem statement and plan the improvement initiative. Consider a typical problem in an organization: its customers are not satisfied with the current support process. You can define the problem as "the support process of the organization is at 20% satisfaction." In Six Sigma, projects are always defined objectively. In addition to defining the problem, the Six Sigma project team is also formed in this phase.
- Measure: Collect data from the process and understand the current quality or operational performance levels. Additionally, establish the measurement criteria: how to measure, when to measure, and who will measure.
- Analyse: Study the business process and the data generated in the Measure phase to understand the root causes of the problem.
- Improve: Identify and prioritize possible improvement actions, test them, and finalize the improvement action plan.
- Control: Go for full-scale implementation of the improvement action plan and set up controls to monitor the system in order to sustain the gains.
Features of Six Sigma
Six Sigma's aim is to eliminate waste and inefficiency, thereby increasing customer satisfaction by delivering what the customer is expecting.
- Six Sigma follows a structured methodology, and has defined roles for the participants.
- Six Sigma is a data driven methodology, and requires accurate data collection for the processes being analyzed.
- Six Sigma is about putting results on Financial Statements.
Six Sigma is a business-driven, multi-dimensional structured approach for −
- Improving Processes
- Lowering Defects
- Reducing process variability
- Reducing costs
- Increasing customer satisfaction
- Increasing profits
The word Sigma is a statistical term that measures how far a given process deviates from perfection.
The central idea behind Six Sigma: if you can measure how many "defects" you have in a process, you can systematically figure out how to eliminate them and get as close to "zero defects" as possible. Specifically, Six Sigma means a failure rate of 3.4 parts per million, or 99.9997% perfect.
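The arithmetic behind these figures is easy to check. The sketch below converts defects per million opportunities (DPMO) into a sigma level using the conventional 1.5-sigma long-term shift, under which 3.4 DPMO corresponds to six sigma; the defect counts in the last line are invented:

```python
from scipy.stats import norm

def dpmo(defects: int, units: int, opportunities: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities) * 1e6

def sigma_level(dpmo_value: float) -> float:
    """Short-term sigma level, using the conventional 1.5-sigma long-term shift."""
    return norm.ppf(1 - dpmo_value / 1e6) + 1.5

print(f"{sigma_level(3.4):.2f} sigma")    # ~6.00: the Six Sigma target
print(f"{sigma_level(66807):.2f} sigma")  # ~3.00: a typical three-sigma process
print(f"{dpmo(12, 1000, 4):.0f} DPMO")    # 3000: hypothetical defect counts
```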
Primary Goals of Six Sigma
Any organization that seeks to implement Six Sigma or any other enhancement programme should have a clear understanding of what it expects the action plan to achieve. Instead of abstract aims such as "we want to get better," project leadership needs to set clear and achievable targets for the execution of any development strategy, including Six Sigma.
Six Sigma’s key objectives include the following results anticipated from the program:
1. Reduce Defects
Six Sigma’s target is to achieve a defect rate corresponding to six standard deviations between the process mean and the nearest specification limit, or 3.4 defects per million opportunities, and Six Sigma practitioners (Green Belts, Black Belts) are trained to aim for this level.
This will not, however, be possible for all organizations and products. Motorola invented Six Sigma, and General Electric, which operates in several different industries, popularised it. Consequently, it has been adopted as a basis for multiple development methods. Yet Six Sigma has been questioned on the grounds that the six-standard-deviation rule is not uniformly valid for all sectors and products. Many companies decline to reduce the defect rate to that level, since tolerating more defects is easier and cheaper, and the resulting lower price can itself attract customers.
2. Quality
Quality management is one of the important primary goals of Six Sigma implementation. For practically every Six Sigma program, consistency is a natural, primary target. Without resolving quality problems in any phase, any more enhancement attempts will inevitably fall short. The major quality-related focus areas for standard Six Sigma initiatives include the elimination of error/defects (or the proofing of errors) and the reduction of waste involved in manufacturing a good or service.
The project team will use a specified methodology and statistical methods to attempt to recognize and remove the causes of the defects in a given procedure. In doing so, the goal is to optimize the process’s final result with as few mistakes as possible. Secondary purposes related to consistency targets include improved consumer loyalty and enhanced profitability.
3. Know the Stats
Do you know how many of your products are defective, or how often your services fall short? Do you know your quality level compared with your competitors? Do you know how to boost the efficiency of your service? Do you want to be better than average?
Implementation of Six Sigma will improve your numbers, telling you:
- Where the quality of the product/service is now.
- Where you want it to go.
- How to get there.
- How to stay there.
If nothing else, you will at least know the numbers.
4. Variability
In this case, variability refers to the difference between the planned quality and the real performance of a product or service. Variability often means products or services that do not satisfy the end user/client and are rejected, resulting in excessive rework, lower profits, and increased costs.
Variability may be caused by common causes (steady, random fluctuation) or by special causes (one-off, non-random fluctuation). Six Sigma aims to reduce variability, resulting in stable processes that generate consistent output.
5. Provide Career Advancement
Are you trying to keep people focused and deliver career growth targets that keep them happy? Giving somebody Six Sigma Green Belt or Black Belt certification training is a good way to keep them interested, practicing, and pushing their careers forward. With the Six Sigma belt hierarchy mirroring the enterprise structure, professionals can take up popular quality management certification courses to get ahead in their careers.
6. Improved Productivity
Raising efficiency is an overall goal for many Six Sigma implementations, and even small increases in a process’s productivity can bring significant advantages for the organization. Six Sigma projects usually focus on reducing cycle time (the time it takes to complete one pass through a process), such as the time a bank teller takes to serve a single customer. The shorter the cycle time, the more cycles (customers) can be handled within a single shift.
Other areas of change include eliminating steps that are duplicative or do not add value to a process. Streamlining processes tends to reduce processing time and minimize idle time, thereby enhancing total process efficiency.
7. Improve Corporate Culture
Perhaps only a few individuals will become Six Sigma Yellow, Green, or Black Belts, but this sets a precedent for the entire organization, which will see the quality emphasis throughout. Implementing Six Sigma is an obvious way to say that "quality matters" to everyone. That is a value you cannot easily put a price on.
So, these are the primary goals of Six Sigma Implementation in any organization.
Quality function deployment is a LEAN technique that is a little out of scope for Six Sigma Green Belt practitioners and more useful for Black Belt practitioners. Yet, it is a powerful tool to design processes or products according to customer requirements. Quality function deployment is abbreviated as QFD. It fits into the Define phase of the DMAIC structure as briefly stated in the online free Six Sigma training. It is one of many LEAN techniques in the LEAN toolbox that are discussed in Six Sigma Green Belt training. Let’s talk about quality function deployment!
Once information about customer expectations has been obtained, techniques such as quality function deployment can be used to link the voice of customer directly to internal processes. QFD is not only a quality tool but also an important planning tool. It allows the consideration of the “voice of the customer” along the service development path to market entry.
There is no single definition for quality function deployment, but a general basic concept of this method is as follows:
“Quality Function Deployment is a system with the purpose of translating and planning the VoC into the quality characteristics of products, processes and services for reaching customer satisfaction”
History of QFD
The tool was first used to design an oil tanker at the Kobe shipyards of Japan in 1972 by Yoji Akao and Shigeru Mizuno to design customer satisfaction into a service offering before it is produced. Prior to this, quality control methods were primarily aimed at fixing a problem during or after production. In the mid-1980s, Don Clausing of MIT introduced this design tool to the United States. A classic product design application is in the automotive industry. In fact, Clausing tells of an engineer who initially wanted to place the emergency hand brake of a sports car between the seat and the door. However, the voice of customer testing found that women drivers wearing skirts had difficulty with the new placement of the hand brake. The Quality Function Deployment highlighted potential dissatisfaction with the location of this feature, and the idea was scrapped.
Benefits of QFD
- Quality Function Deployment is a powerful prioritization tool that combines several different types of matrices into one to form a house-like structure.
- Quality Function Deployment is a customer-driven process for planning products and services.
- It starts with the voice of the customer, which becomes the basis for setting requirements.
- Quality Function Deployment provides documentation for the decision-making process.
- QFD helps you to:
- Translate customer requirements into specific offering specifications
- Prioritize possible offering specifications and make trade-off decisions based on weighted customer requirements and ranked competitive assessment
The QFD technique is based on the analysis of the clients’ requirements, which are normally expressed in qualitative terms, such as "easy to use", "safe", "comfortable", or "luxurious". In order to develop a service, it is necessary to "translate" these fuzzy requirements into quantitative service design requirements; QFD makes this translation possible. Quality Function Deployment is also a system for the design of a product or service based on customer demands, a system that moves methodically from customer requirements to specifications for the product or service. QFD involves the entire company in the design and control activity. Finally, QFD provides documentation for the decision-making process.
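As a minimal sketch of this translation, the core of a QFD (house of quality) matrix can be reduced to a weighted matrix product: customer requirements are weighted by importance, linked to technical characteristics through a relationship matrix, and the weighted column sums rank the characteristics. All names, weights, and matrix entries below are invented for illustration:

```python
import numpy as np

# Customer requirements (VoC) and importance weights on a 1-5 scale (invented)
requirements = ["easy to use", "safe", "comfortable"]
weights = np.array([5, 4, 3])

# Technical characteristics the design team can control (invented)
characteristics = ["setup steps", "guard coverage", "grip material"]

# Relationship matrix: rows = requirements, columns = characteristics,
# on the common QFD scale 9 = strong, 3 = moderate, 1 = weak, 0 = none
relationship = np.array([
    [9, 0, 3],
    [1, 9, 0],
    [3, 1, 9],
])

# Technical priority of each characteristic = importance-weighted column sum
priority = weights @ relationship
for name, score in sorted(zip(characteristics, priority), key=lambda p: -p[1]):
    print(f"{name}: {score}")
```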
What Is the Taguchi Method of Quality Control?
The Taguchi method of quality control is an approach to engineering that emphasizes the roles of research and development (R&D) and product design and development in reducing the occurrence of defects and failures in manufactured goods.
This method, developed by Japanese engineer and statistician Genichi Taguchi, considers design to be more important than the manufacturing process in quality control, aiming to eliminate variances in production before they can occur.
Understanding the Taguchi Method of Quality Control
The Taguchi method gauges quality as a calculation of loss to society associated with a product. In particular, loss in a product is defined by variations and deviations in its function as well as detrimental side effects that result from the product.
Loss from variation in function is a comparison of how much each unit of the product differs in the way it operates. The greater that variance, the more significant the loss in function and quality. This could be represented as a monetary figure denoting how usage has been impacted by defects in the product.
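This monetary view of variation is commonly formalized as Taguchi’s quadratic loss function, L(y) = k(y − T)², where T is the target value and k is a cost constant. Here is a minimal sketch, with an invented rework cost used to calibrate k:

```python
def taguchi_loss(y: float, target: float, k: float) -> float:
    """Quadratic loss L(y) = k * (y - target)**2: loss grows with any
    deviation from the target, not only outside the specification limits."""
    return k * (y - target) ** 2

# Illustrative calibration: a unit 0.5 mm off target (at the spec limit)
# costs $20 to rework, so k = 20 / 0.5**2 = 80 dollars per mm^2
k = 20 / 0.5 ** 2

for deviation in (0.0, 0.1, 0.25, 0.5):
    print(f"deviation {deviation} mm -> loss ${taguchi_loss(deviation, 0.0, k):.2f}")
```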
Example of the Taguchi Method of Quality Control
For instance, if the product is a precision drill that must consistently drill holes of an exact size in all materials it is used on, part of its quality is determined by how much the units of the product differ from those standards. With the Taguchi method of quality control, the focus is to use research and design to ensure that every unit of the product will closely match those design specifications and perform exactly as designed.
Special Considerations
Loss from detrimental side effects on society speaks to whether or not the design of the product could inherently lead to an adverse impact. For example, if operating the precision drill could cause injury to the operator because of how it is designed, there is a loss of quality in the product.
Under the Taguchi method, work done during the design stage of creation would aim to minimize the possibility that the drill would be crafted in a way that its use could cause injuries to the operator.
From a higher perspective, the Taguchi method would also strive to reduce the cost to society to use the product, such as designing goods to be more efficient in their operation rather than generate waste. For instance, the drill could be designed to minimize the need for regular maintenance.
History of the Taguchi Method of Quality Control
Genichi Taguchi, a Japanese engineer and statistician, began formulating the Taguchi method while developing a telephone-switching system for Electrical Communication Laboratory, a Japanese company, in the 1950s. Using statistics, he aimed to improve the quality of manufactured goods.
By the 1980s, Taguchi's ideas began gaining prominence in the Western world, leading him to become well-known in the United States, having already enjoyed success in his native Japan. Big-name global companies such as Toyota Motor Corp. (TM), Ford Motor Co. (F), Boeing Co. (BA) and Xerox Holdings Corp. (XRX) have adopted his methods.
Total Productive Maintenance (TPM) is a maintenance program which involves a newly defined concept for maintaining plants and equipment. The goal of the TPM program is to markedly increase production while, at the same time, increasing employee morale and job satisfaction.
TPM brings maintenance into focus as a necessary and vitally important part of the business. It is no longer regarded as a non-profit activity. Down time for maintenance is scheduled as a part of the manufacturing day and, in some cases, as an integral part of the manufacturing process. The goal is to hold emergency and unscheduled maintenance to a minimum.
TPM - History:
TPM is an innovative Japanese concept. The origin of TPM can be traced back to 1951, when preventive maintenance was introduced in Japan; the concept of preventive maintenance itself was taken from the USA. Nippondenso was the first company to introduce plant-wide preventive maintenance, in 1960. Under preventive maintenance, operators produced goods using machines and the maintenance group was dedicated to maintaining those machines. However, with the automation of Nippondenso, maintenance became a problem, as more maintenance personnel were required. So the management decided that routine maintenance of equipment would be carried out by the operators. (This is autonomous maintenance, one of the features of TPM.) The maintenance group took up only essential maintenance work.
Thus Nippondenso, which already followed preventive maintenance, also added autonomous maintenance done by production operators. The maintenance crew carried out equipment modifications to improve reliability. These modifications were incorporated into new equipment, which led to maintenance prevention.
Thus, preventive maintenance along with maintenance prevention and maintainability improvement gave birth to productive maintenance. The aim of productive maintenance was to maximize plant and equipment effectiveness to achieve the optimum life-cycle cost of production equipment.
By then, Nippondenso had established quality circles involving employee participation, so all employees took part in implementing productive maintenance. Based on these developments, Nippondenso was awarded the distinguished plant prize for developing and implementing TPM by the Japanese Institute of Plant Engineers (JIPE). Thus Nippondenso, of the Toyota group, became the first company to obtain TPM certification.
Overall Equipment Effectiveness (OEE) measures how well a plant is performing by accounting for equipment losses and adverse conditions (a simple OEE calculation is sketched after this list). These conditions and losses are:
- Equipment breakdowns
- Scrap and rework caused by poor machine/equipment performance
- Low productivity due to equipment running at reduced speed or receiving unnecessary adjustment
- Idling or stoppages which require the operator’s attention.
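A common way to quantify these losses is OEE = Availability × Performance × Quality, where availability captures breakdowns and stoppages, performance captures reduced speed and idling, and quality captures scrap and rework. A minimal sketch with invented shift figures:

```python
def oee(planned_time: float, downtime: float, ideal_cycle_time: float,
        total_count: int, good_count: int) -> float:
    """OEE = Availability x Performance x Quality."""
    run_time = planned_time - downtime
    availability = run_time / planned_time                   # breakdowns, stoppages
    performance = ideal_cycle_time * total_count / run_time  # reduced speed, idling
    quality = good_count / total_count                       # scrap and rework
    return availability * performance * quality

# Invented shift: 480 min planned, 45 min down, 1.0 min ideal cycle time,
# 400 parts produced, of which 388 were good -> OEE of about 81%
print(f"OEE = {oee(480, 45, 1.0, 400, 388):.1%}")
```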
Forming small, interdisciplinary teams to address fundamental areas such as preventive and autonomous maintenance, training personnel who operate machines, and standardizing work processes are common ways to improve OEE using TPM.
Key Requirements for TPM improvement-
There are three main requirements for effective TPM implementation and improvement within the organization:
- Increasing employee motivation and changing their mindset or attitude.
- Increasing employee competency and skill level.
- Improving the working culture or environment so that it supports the establishment and effective implementation of a TPM program.
All these three requirements need to be fulfilled within the organization in order to follow the further steps for the implementation of any kind of improvement programs/techniques like TPM, TQM, Lean manufacturing, Six Sigma, etc.
Performance measures-
The purpose of Technical Performance Measures (TPMs) is to provide an assessment of key capability values in comparison with those expected over time. (In this section, TPM refers to Technical Performance Measurement rather than Total Productive Maintenance.) TPM is an evolutionary program management and systems engineering tool that builds on three elements: (1) Earned Value Management (EVM), (2) cost and schedule performance indicators, and (3) the status of technical achievement. By combining cost, schedule, and technical progress into one comprehensive management tool, program managers are able to assess the progress of their entire program.
TPMs are typically established on programs complex enough that the status of technical performance is not readily apparent. TPMs can also be valuable for risk tracking: achievement levels below the forecast can indicate the need for an alternate approach.
Technical Performance Measurement (TPM) Evaluations
With a TPM program, it is possible to continuously verify the degree of anticipated and actual achievement of technical parameters and compare it against the planned value. TPM is also used to identify and flag deficiencies that might jeopardize meeting a critical system-level requirement. Measured values that fall outside an established tolerance band alert management to take corrective action.
By tracking the system’s TPMs, the program manager and systems engineer gain visibility into whether the delivered system will actually meet its performance specifications (requirements). Beyond that, tracking TPMs ties together a number of basic systems engineering activities: systems analysis and control, functional analysis and allocation, and verification and validation.
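As a minimal sketch of such tracking (all parameter names and numbers are invented), a technical parameter can be compared at each review against its planned profile, with values outside the tolerance band flagged for corrective action:

```python
# Invented TPM tracking data: a technical parameter (e.g., subsystem weight
# in kg) measured at successive reviews against its planned profile
planned = [120.0, 115.0, 110.0, 105.0]    # expected value at each milestone
measured = [122.0, 118.0, 116.5, 109.0]   # actual achievement to date
tolerance = 5.0                           # allowed deviation from plan

for review, (plan, actual) in enumerate(zip(planned, measured), start=1):
    deviation = actual - plan
    status = "OK" if abs(deviation) <= tolerance else "ALERT: corrective action"
    print(f"Review {review}: plan={plan}, actual={actual}, "
          f"deviation={deviation:+.1f} -> {status}")
```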