
Lean-Six Sigma in the age of Artificial Intelligence.

  • Writer: Yazmin T. Montana
  • Nov 24, 2021
  • 6 min read

Updated: Jun 8, 2022

In November of 2020 I spoke at the ASQ International Forum of Quality in Juarez.

I titled my talk The Link between Lean Six Sigma and Machine Learning.

In this blog entry I review the general topics of my talk, which focused on the Lean Six Sigma metrics that meet the following requirements and are therefore good candidates to consider when planning a transition to Manufacturing 4.0:

  • Directly impact a manufacturer's EBITDA.

  • Can be assessed with conventional Lean-Six Sigma tools.

  • Generate large enough amounts of data that it is sensible to use analytics techniques to measure them.

These metrics map to the top 10 Lean Six Sigma tools, with examples in a problem-solving proposition context, as suggested by Michael L. George (2019):


  1. Management team. Obtain the management team's engagement in process improvement.

  2. AI data mining. Remove outliers using interquartile analysis. Example: Create a Big Data spreadsheet that provides all accounting cost detail on all part numbers produced per year, regardless of number.

  3. Cash flow improvement. Example: Increase to 20% of revenue using pull systems.

  4. Labor efficiency. Example: Measure actual labor cost versus verified and corrected accounting standards; react daily to adverse cost variances. Investigate the outlier data source and correct it.

  5. Scrap cost. Example: Reduce from 10% to less than 3% of revenue.

  6. Total Productive Maintenance. Example: Increase machine "up time" from 88% to 99%.

  7. Setup time reduction. Example: Reduce the waste due to setup time by more than 25%.

  8. Employee morale. Example: Reduce negative perception of the company from 62% to less than 5%.

  9. EBITDA. Example: Increase from -3.6% to +20% of revenue.

  10. Work-in-process inventory turns. Example: Increase by 50%; reduce cycle time by 50% and improve on-time delivery to more than 90%.
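As a rough illustration, several of these metrics reduce to simple ratios that can be tracked automatically. The sketch below computes three of them in Python; all figures are invented for illustration and do not come from the talk.

```python
# Sketch: computing a few of the Lean Six Sigma metrics listed above.
# All input figures are hypothetical illustrations.

def scrap_cost_pct(scrap_cost: float, revenue: float) -> float:
    """Scrap cost as a percentage of revenue (metric 5)."""
    return 100.0 * scrap_cost / revenue

def machine_uptime_pct(run_hours: float, scheduled_hours: float) -> float:
    """Machine "up time" as a percentage of scheduled hours (metric 6)."""
    return 100.0 * run_hours / scheduled_hours

def wip_turns(cost_of_goods_sold: float, avg_wip_value: float) -> float:
    """Work-in-process inventory turns (metric 10)."""
    return cost_of_goods_sold / avg_wip_value

print(scrap_cost_pct(250_000, 10_000_000))   # 2.5 -> inside the <3% target
print(machine_uptime_pct(712, 720))          # ~98.9 -> close to the 99% goal
print(wip_turns(8_000_000, 500_000))         # 16.0 turns
```

Once these ratios are computed continuously from shop-floor data rather than monthly reports, adverse variances can be reacted to daily, as the labor efficiency example suggests.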

Artificial Intelligence data mining (For Lean Six Sigma Metrics)


AI data mining provides a Big Data global view of all the relevant cost and pricing data and the factory margin of every part number produced in the business, together with the usual overhead breakeven analysis.

Similarly, it looks at the cycle time through each workstation, from raw material to finished goods. This is in sharp contrast to the Lean Six Sigma of the early 2000s, which was based on Toyota's repetitive manufacturing model.


All of the unique Toyota tools, such as the four-step rapid setup, are engineering intensive, and they only yield a higher return on investment in a repetitive manufacturing environment.

Western manufacturers generally produce both new production and spare parts, i.e., both repetitive and non-repetitive manufacturing.


By applying a Pareto distribution to Big Data strategies, there is the risk of overlooking, and relegating to "black data", the 20% of the information that the first assessment deems less impactful for the larger production volume.


Lean Six Sigma uses Pareto analysis to find the repetitive 20% of part numbers that deliver 80% of revenue. If we make the production behind 80% of revenue highly efficient, we are making giant strides, and, we wrongly reason, we won't bother with the 80% of part numbers that deliver 20% of the revenue.
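The 80/20 cut described above can be sketched in a few lines of Python. The part numbers and revenue figures below are invented for illustration; the point is how quickly the "trivial many" get pushed out of scope.

```python
# Sketch: a Pareto cut over part-number revenue (all figures hypothetical).
revenue_by_part = {
    "PN-100": 4_000_000, "PN-200": 2_500_000, "PN-300": 1_500_000,
    "PN-400": 600_000, "PN-500": 400_000, "PN-600": 300_000,
    "PN-700": 200_000, "PN-800": 150_000, "PN-900": 100_000, "PN-950": 50_000,
}

total = sum(revenue_by_part.values())
ranked = sorted(revenue_by_part.items(), key=lambda kv: kv[1], reverse=True)

# Walk down the ranking until the cumulative share reaches 80% of revenue.
cumulative, vital_few = 0.0, []
for part, rev in ranked:
    if cumulative >= 0.8 * total:
        break
    vital_few.append(part)
    cumulative += rev

print(vital_few)  # the small group of part numbers carrying ~80% of revenue
print(f"{cumulative / total:.0%} of revenue from {len(vital_few)} of {len(ranked)} parts")
```

Everything below the cut, seven of ten part numbers in this toy case, is exactly the data a Big Data approach would keep in view instead of discarding.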


Lean Six Sigma strategies applied to Big Data without a thorough re-evaluation of dogmatic practices could inadvertently turn most of the data into "hidden data" simply by never looking at it.

Another limitation of conventional Lean Six Sigma is the strategy Black Belts use to choose improvement projects: looking at value stream maps at the department level and having Sigma Belts work on removing local sources of waste.


Instead, in the new wave of cloud computing and Big Data, we can make a global evaluation of corporate performance using tools such as multiprocessor PCs, advanced analytics tools, Machine Learning and even complete Artificial Intelligence deployments to identify the weaker links of the value stream.


An important step in the computation of data generated on the manufacturing floor consists of eliminating outliers, which are generally related to the method of data collection rather than to the system itself.


For example: a machine with an approximate 3-hour setup time suddenly reports 200 hours of setup time. According to observations on the manufacturing floor, the operator forgot to clock out once the setup was complete.


The most effective tool for removing outliers is the interquartile range (IQR), which can be automated by calculating the IQR in Excel, in Minitab, or with specially formulated algorithms for specific types of data.
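As a minimal sketch of that automation, the IQR fences can be computed with Python's standard statistics module. The setup-time values below are invented, with the 200-hour clock-out error from the earlier example included.

```python
# Sketch: flagging setup-time outliers with the interquartile range (IQR).
# The 200-hour entry represents the forgotten clock-out, not a real setup.
import statistics

setup_hours = [2.8, 3.1, 2.9, 3.4, 3.0, 2.7, 3.2, 200.0]

q1, q2, q3 = statistics.quantiles(setup_hours, n=4)  # quartile cut points
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr        # conventional 1.5*IQR fences

clean = [x for x in setup_hours if lower <= x <= upper]
outliers = [x for x in setup_hours if x < lower or x > upper]

print(outliers)  # [200.0] -> send back for investigation, not into the model
print(clean)
```

The flagged value goes back to the floor for a root-cause check (here, the clock-out habit) instead of silently skewing every downstream average.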

Manufacturing is one of the most important pedagogical tools to explain the power of AI because you can physically see the productivity improvements after its implementation.

Manufacturing 4.0, IIoT and automation.

Not every process and not every type of data is a good candidate for automation and high-level computing. Processes with characteristics such as stability, robustness and bottleneck throughput are the most suitable for high-level analysis.


Lean Six Sigma improves process performance by systematically eliminating waste and reducing variation. On the journey towards automation, LSS can help organizations standardize the ecosystem and make processes more suitable for automation.


Data 2.0 within organizations


An important challenge for organizations that have operated under conventional problem-solving managerial structures is to prepare their data for the introduction of automated analytics.

In order to ease the transition from Manufacturing 3.0 to Manufacturing 4.0 at all levels of the organization, leaders should help their teams comply with three fundamental data management rules that will help any automation initiative succeed:

  • Selected content: Select databases that allow internal innovation; ensure they share a common language, the same format and a selection of relevant information.

  • Repeatability control: Avoid diverse protocols for sharing data; information should flow freely.

  • Assigned channels: Ensure databases can be accessed from applications, repositories or "data lakes". Security: encrypt or mask data. "Data-at-rest": allow data to be accessed at any time.
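As one possible reading of the "encrypt or mask" part of the security rule, here is a minimal masking sketch in Python. The salt, field names and values are hypothetical; the idea is that a masked identifier stays stable enough to join records without exposing the original.

```python
# Sketch of the "mask it" option: replace operator IDs with salted SHA-256
# digests before data leaves the shop floor. Salt and fields are hypothetical.
import hashlib

SALT = b"plant-07-rotating-salt"  # in practice, store and rotate this secret

def mask(value: str) -> str:
    """One-way mask: same input -> same token, but not reversible."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:12]

record = {"operator_id": "E-4412", "setup_hours": 3.1}
masked = {**record, "operator_id": mask(record["operator_id"])}

print(masked["operator_id"])  # stable 12-character token, original ID hidden
```

Because the token is deterministic, analytics jobs can still group and join by operator while the raw ID never enters the data lake.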

Strategic data management is the most powerful ally of any organization that seeks to transition into automation.


The new wave of LSS allows all the variables of a process to be simulated and analyzed before they even occur.

Foresight and risk prevention.

By identifying the correlations between the input and output variables, the LSS model tells us how we can control the input variables in order to move the output variable(s) to our target values. Most importantly, LSS also requires the monitored process to be "stable", i.e., minimizing the output variable variance by minimizing the input variable variance, in order to achieve the so-called "breakthrough" state.
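The input-output relationship described here can be sketched with an ordinary least-squares fit, then inverted to find the input setting that hits a target output. The temperature and defect-rate figures below are invented for illustration.

```python
# Sketch: estimating how an input variable (oven temperature) moves an output
# (defect rate) with a least-squares slope; the data points are invented.

xs = [170, 175, 180, 185, 190, 195]   # input: temperature (deg C)
ys = [9.1, 7.8, 6.4, 5.2, 3.9, 2.5]   # output: defect rate (%)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
sxx = sum((x - mx) ** 2 for x in xs)

slope = sxy / sxx                     # change in defect rate per deg C
intercept = my - slope * mx

# To hit a target output, invert the fitted line for the required input.
target_defect = 4.0
required_temp = (target_defect - intercept) / slope
print(round(slope, 3), round(required_temp, 1))
```

The same inversion logic is what a full LSS model does across many inputs at once; here one input keeps the arithmetic transparent.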


Data analytics can be introduced in all parts of the Lean Six Sigma process:

By increasing continuous improvement leaders' ability to prevent and measure risks before they become imminent events, the capability of continuous improvement initiatives increases as well.

A CI leader can form a symbiotic relationship with her risk assessment tools, not only for day-to-day inspections of processes but also to model and predict the system's reaction to new initiatives.


Machine learning tools such as neural networks can be computed from already-existing databases in Excel spreadsheets (I wrote an example of this in my last blog entry), which makes these strategies accessible to any leader interested in implementing high-level analysis without creating an entirely new database, for example by taking all the information stored in 5-year-old Excel spreadsheets and moving it to a SQL database.
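As a minimal sketch of that idea, the snippet below trains a single sigmoid neuron in pure Python on spreadsheet-style rows. All columns and values are invented; the point is that existing tabular data can feed a learning model without any database migration.

```python
# Sketch: a single sigmoid neuron trained by gradient descent on rows as they
# might come out of an exported spreadsheet. Columns and values are made up:
# (setup_hours, scrap_pct) -> 1 if the lot shipped late, else 0.
import math
import random

rows = [(2.5, 1.0, 0), (3.0, 1.5, 0), (2.8, 2.0, 0), (6.5, 4.0, 1),
        (7.0, 5.5, 1), (5.9, 3.8, 1), (3.2, 1.2, 0), (6.8, 4.9, 1)]

random.seed(0)
w1, w2, b = random.random(), random.random(), 0.0

def predict(x1, x2):
    return 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b)))  # sigmoid

lr = 0.1
for _ in range(2000):                  # plain stochastic gradient descent
    for x1, x2, y in rows:
        p = predict(x1, x2)
        err = p - y                    # gradient of log-loss w.r.t. the logit
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b  -= lr * err

print(round(predict(2.9, 1.4)))  # 0 -> predicted on time
print(round(predict(6.6, 4.5)))  # 1 -> predicted late
```

A real deployment would use a proper library and more data, but nothing about the approach requires abandoning the spreadsheets the data already lives in.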


IT decisions such as changing to an entirely new data-storage language can be chaotic and expensive for all parties involved, and since it is a priority to prevent a decrease in productivity in any part of the system, this should be avoided.


It is more expensive for organizations to try small automation projects than to assess 4.0 readiness for the entire productive system.


Leaders who seek to push their organizations into state-of-the-art technologies often support small automation projects that produce results in cell manufacturing processes, focusing on reducing headcount or steps in product assembly.


These initiatives, although productive, lead to higher expenses for the organization in the long term.

Leaders take extra time to evaluate, run and upkeep projects that have the characteristics of experimentation rather than legitimate, sustainable improvement. Resources invested in small projects often cannot be leveraged unless there is a pre-existing long-term strategy to ensure that all of the many small projects leaders take on can be aligned with the eventual macro-automation of the entire system.

That is, data literacy for all members of the organization.


It is preferable for leaders to take a slower pace when planning their projects and ensure there is a robust strategy for the system at large before implementing small and complex automated processes.


High-level analytics and automation initiatives go hand in hand with culture change within organizations, and this is the Lean part of Lean Six Sigma. Associates must grow in the conviction that data will make their jobs easier and provide them with new opportunities to become expert problem solvers.


Slide from my talk The Link between Lean Six Sigma and Machine Learning, 2020


... So, where should a CI leader start?


Just like any change initiative, it is important to ensure associates and team leaders agree on a broader vision of the organization's future. Before automation is effectively "up and running" in your productive system, it is fundamental that every team member becomes vigilant of the potential issues that could arise in the pre-existing system.


The Toyota Production System teaches us the importance of making every team member an active participant in continuous improvement, and this idea must prevail even if our factory shop floor is mostly robots and computers. Automation and AI are aids in perfecting manufacturing processes, which eventually impacts your end customer. With this concept in mind, AI and automation can thrive within the humanistic perspective of your organization.

