Data Science for Manufacturing | 3 PoC Opportunities

Manufacturing offers many opportunities for predictive analytics and data science. With the rise of the Internet of Things (IoT) and data collection technologies becoming more accessible, manufacturing companies have a wealth of data to mine. Companies can apply predictive analysis and optimization algorithms to these data sets to bring data-driven guidance and decision making that improves efficiency and quality and reduces costs.

Manufacturers can run Proof of Concept (PoC) projects to prove value and garner larger investment before spending multiple millions of dollars on an analytics infrastructure. Many companies turn to an expert analytics consulting firm like Mosaic to help them develop a PoC plan and execute on it. Deliverables from a PoC typically include narrative sections explaining best practices and processes, lists of guidelines, checklists of items to consider each time a model is developed, and other instructive material identified through discussions about analyst needs.

The outline below describes how Mosaic typically works through a data science for manufacturing PoC:

  1. Identifying, Selecting and Planning Advanced Analytics Projects
    1. Identify Use Case and Develop Problem Statement – Emphasize the importance of refining the problem statement to ensure the analysis is properly focused and scoped
    2. Identify and Characterize Available Data – Help to structure the brainstorming process for identifying internal and external data sources that may be useful
    3. Prepare Data for Analysis and Modeling – Identify strategies and techniques for assessing data quality, working with missing/dirty data, and transforming raw data into forms appropriate for modeling
      1. Place particular emphasis on feature engineering, including a library of transforms and variable combinations to test when developing features (a brief illustrative sketch follows this outline)
      2. Describe standard visualizations used to characterize data and drive the determination of compatible model types
    4. Estimate Potential Return on Investment, Complexity and Risk – Identify decision factors and processes, and collect input information, to support prioritization of analytics projects
  2. Data Science Model Types
    1. Identify broad classes of model types
    2. Identify criteria for application – These are used to determine if a particular model type is appropriate and well-suited for a specific problem type and data structure
    3. Identify combinations of models – Identify models that are complementary and can be used together, as well as models that are typically tested against one another on the same dataset
  3. Data Science Software Tools
    1. Provide broad overview of current tools
    2. Define preferred suite of tools – This will necessarily be an iterative and ongoing process, but will cover appropriate products for storage, visualization, processing, modeling, etc.
  4. Modeling Process
    1. Develop Models – Process for implementing multiple models quickly, iterating with the evaluation step to identify the suitability of each model
    2. Evaluate Models – Iterative process to determine the quality of each model version, emphasizing a feedback loop that identifies positive and negative attributes of the model in both analytical and business contexts and uses those findings to improve the model
    3. Deploy Models – Limited emphasis is placed on this topic initially due to scope, but data flows and similar considerations are taken into account during model development to ensure compatibility with the deployment requirements and environment
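
To make the feature engineering step in the outline concrete, the sketch below shows the kind of transform library an analyst might iterate through on equipment sensor data. It is illustrative only: the column names (temperature, vibration, load), the window sizes, and the specific transforms are assumptions made for the example, not a prescription from the PoC plan.

```python
import numpy as np
import pandas as pd

def engineer_features(df: pd.DataFrame) -> pd.DataFrame:
    """Apply a small library of candidate transforms to hourly sensor data.
    Assumes hypothetical columns: temperature, vibration, load."""
    out = df.copy()
    # Rolling-window statistics summarize recent operating conditions.
    out["temp_mean_24h"] = df["temperature"].rolling(24, min_periods=1).mean()
    out["vib_std_24h"] = df["vibration"].rolling(24, min_periods=1).std()
    # Rates of change help flag drift or sudden shifts.
    out["temp_delta"] = df["temperature"].diff()
    # Ratios and interactions combine variables into candidate features.
    out["load_per_degree"] = df["load"] / df["temperature"].clip(lower=1e-6)
    # Log transforms tame heavily skewed distributions.
    out["vibration_log"] = np.log1p(df["vibration"])
    return out
```

Each engineered column is only a candidate; the iterative evaluation loop described in the Modeling Process section determines which features actually improve the model.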

Now that Mosaic has shown how we engage as a machine learning consultant, let's focus on three specific use cases manufacturers can tackle with data science.

Data Science for Manufacturing Applications
  1. Predictive Maintenance

Current maintenance paradigms fall into two primary categories:

  • Preventive maintenance, which is typically performed according to a fixed schedule, and
  • Reactive maintenance, which is performed after a failure or drop in performance is observed. This unscheduled maintenance translates to higher maintenance costs, greater downtime and opportunity cost, and increased customer dissatisfaction.

The ideal approach to maintenance is a predictive model, in which equipment sensors and advanced statistical models are used to estimate the likelihood of failure, based on historical conditions and usage, and to plan the optimal time for maintenance to be conducted. Fortunately, as the cost of sensors and data storage has decreased, a wealth of real-time equipment and line data has become available.
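
As a rough illustration of what such a model could look like, the sketch below trains a gradient boosting classifier to estimate failure probability on a holdout set. The synthetic data, the failure horizon, and the choice of classifier are assumptions made for the example, not the only (or necessarily the best) approach for a given plant.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Stand-in data: in practice X would hold engineered sensor/usage features per
# machine-day and y would flag failure within some horizon (e.g., 30 days).
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95], random_state=0)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

model = GradientBoostingClassifier()  # one of several candidate model types to compare
model.fit(X_train, y_train)

# The predicted probability of failure is what drives maintenance scheduling decisions.
failure_prob = model.predict_proba(X_test)[:, 1]
print("Holdout ROC AUC:", round(roc_auc_score(y_test, failure_prob), 3))
```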

As an organization begins to build out a predictive maintenance capability, it can begin to take advantage of the monetary benefits this solution offers. If a company can predict the likelihood of mechanical asset failure, it can optimize its maintenance response with scheduled, as opposed to unscheduled, maintenance. Organizations can ensure increased equipment availability, effectiveness, and run-time predictability by understanding historical, current, and predicted part availability and replenishment requirements. With greater visibility into equipment health, companies can extend preventive maintenance cycles. Users can monitor equipment performance with real-time sensor data to accurately predict run-time failures and take action to prevent them, as well as identify maintenance needs that cannot be completed during planned downtime periods. Data-driven diagnostics shorten downtime by ensuring that the right skills, parts, and tools are part of the maintenance response the first time around. Mosaic's predictive real-time capabilities allow businesses to predict mechanical degradation and failures in time to do something about it – stage inventory, schedule maintenance, and deploy field engineers.

A proof of concept use case would involve the following steps:

  • Identification of available data regarding production lines and equipment, including historical maintenance and failures, as well as a gap analysis to identify desirable data that is currently unavailable.
  • Decision analysis to determine the maintenance decisions that can be changed based on probability of failure and the operational influences of those decisions.
  • Statistical mapping of possible maintenance decisions and outcomes to the available historical data.
  • Model development and testing.
  • Design of the deployment environment for the model (e.g., integration into an overall operations management dashboard).
  2. Multiple Changeovers per Shift

Increased competition and the drive toward more dynamic responsiveness to market demands have led to an increase in line changeovers within a single shift throughout the manufacturing industry. Each changeover brings some level of inefficiency due to the time and coordination required to change over the line. Analysis of historical line changeover data can provide insight into the types and frequencies of line changeovers that generate higher or lower levels of inefficiency. For example, line changes that only involve packaging may require less line downtime for the changeover, whereas line changes that involve a different product size may lead to a higher level of inefficiency. This insight allows planners to sequence the utilization of each line as efficiently as possible.

By combining this information with product demand data, statistical optimization models can be generated and used to recommend the sequence and timing of product assignments to lines and changeover events to maximize overall efficiency and line productiveness.
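
As a simplified illustration, the sketch below sequences products on a line using a greedy heuristic over a pairwise changeover-time matrix. The product names and times are made up, and a real engagement would typically bring in demand data and a proper optimization solver rather than a single greedy pass.

```python
# Hypothetical pairwise changeover times (minutes) estimated from historical line data.
changeover_minutes = {
    ("A", "B"): 15, ("A", "C"): 45, ("B", "A"): 20,
    ("B", "C"): 40, ("C", "A"): 50, ("C", "B"): 35,
}

def greedy_sequence(products, start, cost):
    """Order products so each step picks the cheapest next changeover.
    A simple heuristic; a production system might use a MILP or TSP-style solver."""
    remaining = set(products) - {start}
    sequence, current = [start], start
    while remaining:
        nxt = min(remaining, key=lambda p: cost[(current, p)])
        sequence.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return sequence

print(greedy_sequence(["A", "B", "C"], "A", changeover_minutes))  # -> ['A', 'B', 'C']
```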

A proof of concept use case would involve the following steps:

  • Identification of available data regarding line changeover events, including the products on the line before and after the event and all data available regarding the changeover steps. As in the predictive maintenance use case, a gap analysis will be performed to identify desirable data that is currently unavailable.
  • Constraint and decision analysis to determine the criteria that must be applied to plan line changeover events, and the degrees of control available to line planners.
  • Analysis of the historical data to determine associations between changeover products and type, and line changeover time.
  • Model development and testing.
  • Design of the deployment environment for the model (e.g., integration into an overall operations management dashboard).
  3. Inventory Levels on the Shop Floor

Data-driven parts inventory management uses historical data and statistical models to ensure that input materials and components are available at the required levels on the shop floor. Shortages of parts on the production line are extremely expensive. The smallest oversight can generate significant increases in cost due to line downtime, idle labor, and penalties associated with customer service level agreements (SLAs). Today's production lines generate significant amounts of data, but often that data is used only in a reactive mode, if at all. A proactive approach to data-driven inventory management that leverages the data available in the enterprise resource planning (ERP) system can improve shop floor efficiency and, most importantly, avoid parts shortages and the associated line downtime.
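
A minimal example of the statistical reasoning involved is the classic safety-stock and reorder-point calculation sketched below. The demand figures, lead time, and service level are hypothetical; in practice the demand statistics would be estimated from ERP history and combined with a demand forecast and line schedule.

```python
import math

# Hypothetical inputs for a single part; real values would come from ERP history.
mean_daily_demand = 120   # units per day
std_daily_demand = 30     # standard deviation of daily demand
lead_time_days = 5        # supplier replenishment lead time
z = 1.65                  # service-level factor (~95% in-stock probability)

# Safety stock buffers demand variability over the replenishment lead time.
safety_stock = z * std_daily_demand * math.sqrt(lead_time_days)
# Reorder when on-hand inventory falls to expected lead-time demand plus the buffer.
reorder_point = mean_daily_demand * lead_time_days + safety_stock

print(f"Safety stock: {safety_stock:.0f} units")
print(f"Reorder point: {reorder_point:.0f} units")
```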

A proof of concept use case would involve the following steps:

  • Identification of available data from the ERP system, including inventory levels of input materials, parts, and components. As in the other two use cases, a gap analysis will be performed to identify desirable data that is currently unavailable.
  • Constraint and decision analysis to determine the factors associated with inventory level management, dependencies, and criteria related to inventory levels and SLAs.
  • Analysis of the historical inventory level data to determine associations between demand patterns, line scheduling, and shop floor events.
  • Model development and testing.
  • Design of the deployment environment for the model (e.g., integration into an overall operations management dashboard).

Data Science for Manufacturing Conclusion

In order to stay competitive and offer a superior customer experience, manufacturers need to embrace data science and the shift toward data-driven decision making. With the wealth of data manufacturers now collect, big data consulting firms like Mosaic are positioned to help you build a strategic roadmap and execute on that plan.

