A quotation from Harold Sirkin, Vice President of the Boston Consulting Group: "As the economy changes, as competition becomes more global, it's no longer company versus company but supply chain versus supply chain." This quote resonates with globally diversified enterprises as they step into the global environment trying to emulate the Wal-Marts and the Dells of the world.
One of the top concerns preoccupying companies is assessing the risk in their complex supply chains. A failure in the supply chain can have a devastating effect on the bottom line, and the recovery process can be very time consuming. When Hurricane Katrina hit the Louisiana coast, a significant number of refineries were affected, which resulted in an immediate increase in the price of gas. "Many of the key risk factors have developed from a pressure to enhance productivity, eliminate waste, remove supply chain duplication, and drive for cost improvement," as quoted by William L. Michels, CEO of consulting firm ADR North America, Ann Arbor, Mich. Companies need to cope with such uncertainties in their supply chains and should develop strategies to mitigate supply chain disruption risks. Cost reduction activities can no longer be implemented in isolation, because they may increase the risk in the supply chain; for example, the decision to outsource manufacturing to low-cost countries requires balancing the competing priorities of increased inventory and lead times while depending heavily on logistics efficiency and total cost of ownership.
In the current environment the principal threats that companies are besieged with are supplier disruption, natural disasters, strategic risk, political risk, compliance and logistics failure. Supplier risks range across price controls, supply variability, supplier company stability and lead times, among others. Supplier disruption risks are usually the highest priority across all industries. Natural disaster risks are a top priority for the chemical and the oil and gas industries, where most refineries are based around coastal areas. Pharmaceutical companies also conduct risk evaluation studies for disease control after natural disasters. Strategic risk evaluation is critical for companies in the high-tech space, where the product life cycle is very short; they have to make good strategic investments and understand how the market will consume or react to the introduction of a new product. The aerospace and defense, construction and oil and gas industries are primarily susceptible to political risks: as the political climate changes in economies ridden by war, terrorism and dictatorship, it affects the entire global economy. Compliance, from financial regulations to SOX, is also critical for pharmaceutical and other industries where the traceability of a product has to be established in case of recalls. The retail industry, along with the advent of outsourced manufacturing in other industries, places a great deal of risk on logistics and the movement of goods.
Companies can take a four-pronged approach to mitigating supply chain risk. The first element is to understand the strategic implications of the supply chain. A classification of products or categories can provide insight into material risk profiles that can degrade operational performance. Diversifying procurement across a few, but multiple, suppliers can hedge the company against supply disruption. Manufacturing capacity can also be analyzed against global and regional demand, and a flexible supply chain should be designed that can buffer against uncertainties through strategic lean six sigma thinking. The second element is to create a collaborative and cooperative manufacturing and supplier network. Continuous visibility of metrics and constant communication and collaboration will enable companies to be proactive and reduce the impact of disruptions. Supply chain risk managers have to work frequently with colleagues from other departments such as purchasing, logistics, maintenance and quality to proactively create risk mitigating strategies. The third element is the ability to consider and analyze tradeoffs during decision making, addressing root causes and not symptoms. Companies have to understand the tension between service levels and cost reduction strategies that could negatively affect operational performance. The last element is to show relentless passion for risk elimination: harness the company's brainpower to come up with metrics that quantify risk and strategies that mitigate it. Supply chain risks have become deeply embedded in the fabric of the extended supply chain network. If neglected, they can have a devastating effect on the supply chain; as quoted earlier, the battle is not between enterprises but between supply chains.
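As a rough illustration of what "metrics that quantify risk" might look like, the sketch below scores each supplier as a weighted sum of normalized risk factors. The factor names, weights and 1-5 scoring scale are hypothetical assumptions for illustration, not a standard framework from this post.

```python
# Illustrative supplier risk scoring: weighted sum of factor scores (1 = low risk, 5 = high risk).
# Factor names and weights are hypothetical; tune them to the company's own risk profile.

RISK_WEIGHTS = {
    "price_volatility": 0.20,
    "supply_variability": 0.30,
    "financial_stability": 0.25,
    "lead_time": 0.15,
    "geopolitical_exposure": 0.10,
}

def supplier_risk_score(factor_scores):
    """Return a 1-5 composite risk score from per-factor scores (each 1-5)."""
    return sum(RISK_WEIGHTS[f] * factor_scores[f] for f in RISK_WEIGHTS)

suppliers = {
    "Supplier A": {"price_volatility": 2, "supply_variability": 4, "financial_stability": 3,
                   "lead_time": 5, "geopolitical_exposure": 2},
    "Supplier B": {"price_volatility": 3, "supply_variability": 2, "financial_stability": 2,
                   "lead_time": 2, "geopolitical_exposure": 4},
}

for name, scores in sorted(suppliers.items(),
                           key=lambda kv: supplier_risk_score(kv[1]), reverse=True):
    print(f"{name}: composite risk = {supplier_risk_score(scores):.2f}")
```

A composite score like this is only a starting point; the value comes from reviewing the scores with purchasing, logistics and quality and agreeing on mitigation actions for the highest-scoring suppliers.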
Monday, September 8, 2008
Lean Procurement Improvement Strategies
Global commerce has changed the dynamics of supply chain management in recent times. Global procurement has the advantage of low-cost sourcing, but it increases the overall risk and cost of the supply chain through longer lead times and higher inventory. Large companies have already initiated programs to gain visibility of the extended supply chain and squeeze out additional cost. If the goal is to utilize natural resources effectively, we should all look at material from "extraction" to "recycled".
In a demand-driven world with ever-shortening product life cycles, it has become challenging for the procurement organization to ensure that the right material is always available in the right quantity and quality. Procurement organizations can no longer afford long material lead times, which lead to increased inventory and greater exposure to forecast error. Supplier performance is continuously measured, and suppliers who cannot deliver consistently on time and with quality are quickly replaced. To reduce the overall cost of the supply network, companies have embraced lean and six sigma process improvement techniques to eliminate redundancies and wasteful practices.
The key challenges for the procurement organization are 1) preventing shortages of material when it is required, 2) maintaining high quality standards, 3) reducing inventory investment, 4) reducing supply-side lead times and 5) embarking on a continuous process improvement practice.
The top three things companies have identified as entry points to a procurement improvement program are visibility, improved business processes and inventory management. Visibility into supplier performance, inventory, quality and inspections, lead times, on-time delivery and contract compliance is extremely important. Maintaining supplier-related content and keeping it updated, though an added task, is imperative to improving the overall business processes. Visibility and real-time inventory updates by location help companies make prudent purchasing decisions and reduce purchasing process latency. Business processes are hard to change but are a key source of non-value-added tasks. Companies need to identify steps in their business processes that are redundant and add lead time to the procurement process. Reducing the need to enter purchase order information is one example of business process change: generating a purchase order based on material consumption can save a lot of time and effort. Inventory management practices also have to be in place to reduce the overall cost. Procured items should be classified into groups of high runners, low runners, high risk and low risk to determine the level of inventory that has to be maintained without risking shortages or degrading operational performance, as sketched below.
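A minimal sketch of that runner/risk classification, assuming annual consumption value separates high from low runners and lead-time variability serves as a proxy for supply risk; the thresholds are arbitrary placeholders, not values from the post.

```python
# Classify procured items into high/low runner and high/low risk quadrants.
# Thresholds are illustrative placeholders; real programs derive them from the item population.

RUNNER_THRESHOLD = 100_000   # annual consumption value above which an item is a "high runner"
RISK_THRESHOLD = 0.30        # coefficient of variation of lead time above which supply is "high risk"

def classify_item(annual_consumption_value, lead_time_cv):
    runner = "high runner" if annual_consumption_value >= RUNNER_THRESHOLD else "low runner"
    risk = "high risk" if lead_time_cv >= RISK_THRESHOLD else "low risk"
    return f"{runner} / {risk}"

items = {
    "bracket-17":  (250_000, 0.10),   # (annual consumption value, lead-time CV)
    "gasket-02":   (12_000, 0.45),
    "fastener-99": (450_000, 0.55),
}

for item, (value, cv) in items.items():
    print(f"{item}: {classify_item(value, cv)}")
```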
Companies that have excelled in lean procurement practices have moved from a "push" to a "pull" environment, with increased visibility and a collaborative environment with their suppliers. There is less reliance on ad-hoc phone calls; instead, the system generates orders based on material movement. Material depletion at the point of consumption is used as the signal for the supplier to replenish material. Companies have also improved on the past practice of ordering a large batch of material each time it falls below a critical level: frequent, smaller batches not only provide a clearer signal to suppliers but also help the supply chain reduce its level of inventory. Another strategy is to develop a responsive supply chain. Demand should be propagated not only within the enterprise but across the network, and proper processes should be in place to propagate demand variability across the network so that inventory levels are sized correctly and disruption to operations is reduced. A responsive supply chain helps reduce lead times and proactively manage shortages. Companies should also eliminate waste in the procurement process. Buyers spend a significant amount of time entering purchase orders, tracking order status and maintaining private Excel sheets, which takes time away from doing effective work. Creating a procurement hub that is transparent across the network and integrated with individual supplier systems is one of the steps taken by high-performing companies.
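The sketch below illustrates a consumption-driven pull signal of the kind described above: when on-hand inventory at the consuming location drops below a reorder point, a small fixed-quantity replenishment order is generated for the supplier. The reorder point, batch size and consumption figures are hypothetical, and the immediate receipt of the order is a deliberate simplification.

```python
# Consumption-driven pull replenishment: generate a supplier order when on-hand
# inventory falls below the reorder point. Quantities here are illustrative.

def pull_replenishment(on_hand, consumption_events, reorder_point, batch_size):
    """Yield (event_index, order_qty) each time consumption drives on-hand below the reorder point."""
    for i, consumed in enumerate(consumption_events):
        on_hand -= consumed
        while on_hand < reorder_point:          # small, frequent batches instead of one large order
            yield i, batch_size
            on_hand += batch_size               # simplification: assume the order arrives in time

daily_consumption = [8, 12, 9, 15, 7, 11]
for day, qty in pull_replenishment(on_hand=40, consumption_events=daily_consumption,
                                   reorder_point=25, batch_size=20):
    print(f"Day {day}: send replenishment signal for {qty} units to supplier")
```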
Wednesday, September 3, 2008
Demand Management Process for Steady Material Supply
During one of my stints working for a client in the US, the team was setting up the design for best practices in demand management. As most of us are aware, the performance of any company reflects how well its demand management process works. Demand is the point of entry for the rest of the operation. If inventory is building, lead times are increasing and delivery dates are being missed, then demand management is the first place to look to understand and regain control. The strategies used for demand management depend on the industry: retail companies look for solutions that capture POS data to get a real-time picture of the demand pattern, while asset-intensive industries depend on how well they manage the S&OP process to determine the addressable market based on rough-cut capacity planning.
During the design planning session a team consisting of sales, planning, marketing and IT was gathered in a room. As we have probably all realized, demand itself is an ambiguous term. People talk about forecasts (shipment forecast, sales forecast, bookings forecast, unconstrained forecast), sales orders, intercompany transfer orders, procurement forecasts and purchase orders. It took the team a while to differentiate between the various types of forecast and decide what should be the prime driver of the sales and operations planning process.
The other challenge for companies is capturing the information for all these types of demand. The most common element of the forecast is the shipment information that is historically archived. In a commodity business this is a good starting point for creating a forecast during the annual planning process. A forecast based on shipments is constrained, since it clearly indicates what the company has shipped in the past, and it is a good basis for generating a forecast of future demand. The sales and marketing teams can use this forecast as the starting point and modify it based on market intelligence reports and changes in customer buying patterns. The modified forecast becomes the actual sales forecast that the business planners have to corroborate and run through rough-cut capacity planning.
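As a simple illustration of turning archived shipment history into a baseline forecast that sales and marketing can then adjust, the sketch below uses single exponential smoothing; the smoothing factor and shipment figures are made up for the example, and any standard smoothing or seasonal method could stand in its place.

```python
# Baseline forecast from shipment history using single exponential smoothing.
# alpha and the shipment series are illustrative values, not real data.

def exponential_smoothing(shipments, alpha=0.3):
    """Return the smoothed series; the last value serves as the baseline forecast for the next period."""
    forecast = [shipments[0]]
    for actual in shipments[1:]:
        forecast.append(alpha * actual + (1 - alpha) * forecast[-1])
    return forecast

monthly_shipments = [120, 135, 128, 150, 142, 160]
smoothed = exponential_smoothing(monthly_shipments)
print(f"Baseline forecast for next month: {smoothed[-1]:.1f} units")
```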
The following discussion will be mostly about the forecasting process. The bookings forecast, prepared by the sales department, is used in many companies as a way to hedge sales forecast errors. The bookings forecast can be higher than the sales forecast if sales believes a new order may show up before the end of the month or at the beginning of the next month. However, the bookings forecast is not an evil that has to be eliminated; it should be controlled by the planning team. In the commodity business, where demand is cyclical and seasonal, it is essential to level production against the demand variability. The planner has to determine the build-up of inventory or booking of capacity needed to meet future demand.
There are several best practices in managing the forecast, but in my experience the key attributes a forecaster has to grasp are the reliability and the repeatability of the forecast. Reliability, measured as forecast accuracy against the company's business and processes, determines how much risk mitigation has to be applied. From a repeatability point of view, if the forecast is fairly level, it is easy to plan supply to meet demand and to control the level of inventory in WIP as well as in finished goods. When demand is seasonal, for example spiking at the end of a period, it should be treated carefully to ensure a steady supply without causing a stock-out. If supply is flexible and reliable, the process can be run with minimum inventory. The risk for just-in-time increases as the rate of consumption increases and supply has to keep up with consumption. Best practice suggests that a steady supply is the best way to operate and will mitigate the risk of stock-outs. Inventory planning, based on the repeatability of the forecast, plays a critical part in determining the level of inventory to maintain, which keeps the cost of inventory down while maintaining high customer service levels. The repeatability of the forecast can help the inventory planner size the finished goods inventory in the warehouses and distribution centers, thus promoting steady supply and high customer service levels. If there is any feedback or questions, please let me know.
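One way to make "reliability" and "repeatability" measurable, as a rough sketch: reliability can be tracked as the mean absolute percentage error (MAPE) of the forecast against actuals, and repeatability as the coefficient of variation of the forecast itself. Both formulas are standard; the sample numbers are invented.

```python
# Forecast reliability (MAPE vs. actuals) and repeatability (coefficient of variation).
# Sample data is invented for illustration.
from statistics import mean, pstdev

def mape(actuals, forecasts):
    """Mean absolute percentage error: lower means a more reliable forecast."""
    return 100 * mean(abs(a - f) / a for a, f in zip(actuals, forecasts))

def coefficient_of_variation(series):
    """Standard deviation relative to the mean: lower means a more level, repeatable forecast."""
    return pstdev(series) / mean(series)

forecasts = [100, 110, 105, 120, 115, 108]
actuals   = [ 98, 118, 101, 111, 122, 104]

print(f"Reliability  (MAPE): {mape(actuals, forecasts):.1f}%")
print(f"Repeatability (CV) : {coefficient_of_variation(forecasts):.2f}")
```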
Monday, June 9, 2008
Efficient Supply Chain Network
The business climate has changed over the last 15 years as it has transformed into a global economy. Manufacturing has moved to low-cost countries where environmental laws are less stringent and labor costs are low. The challenge faced by many large companies when sourcing from multiple locations is gaining efficient visibility into the extended supply chain.
There are four flows that companies have to manage: demand, supply, information and cash. Companies that manage each of these flows efficiently run the most profitable enterprises. Early detection of change and a quick response in each of these flows are critical for a company's survival. Inefficient management of any of these flows results in a buffer, which ultimately hits margins and market capitalization.
One of the largest technology companies in the US was faced with the challenge of reducing the inventory in its supplier network. The network of suppliers was highly complex, with multiple hierarchies and multiple levels of supply associated with each supplier. Since each supplier was a small or medium-sized business that could not invest in the technology required to provide visibility, the company decided to invest and share its best practices across the supplier network. It was a win-win situation and an economical decision: it not only improved the company's margin but also helped the supplier network run its operations more efficiently.
The main challenge was to reduce the build-up of inventory at each of the supply chain nodes, which made each node responsive but the chain as a whole lethargic. The inventory buffers covered the demand and supply variability, providing a quick local response to changes in demand. However, looked at end to end, the supply chain resembled a large python: very well fed and slow-moving. The solution to this problem is information flow. If the information is reliable at each of the supplier nodes, then the inventory buffers can be reduced while the network remains responsive to changes in demand.
The company's vision was to create a solution that would be implemented across the supply chain network. The key areas that had to be addressed were a) inventory planning at each of the supply chain nodes, b) the demand picture propagated and exploded to the components at each node and c) the supply picture at each node. The solution was based on lean principles, which promote the reduction of waste in the system. Inventory planning is a key component and has to be in tune with demand while ensuring a level supply plan. The level supply plan is propagated upstream to determine the supply plan for the upstream partners, as sketched below. While planning is the correct first step, execution has to be tightly controlled. A dashboard depicting the overall demand and supply plan was planned for each node and individually managed. The technology enabled a PDCA approach to control deviation and speed up decision making.
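A minimal sketch of propagating a level finished-goods plan upstream, assuming a simple single-level bill of materials and per-unit component quantities; the part names and quantities are made up, and a real network would also offset each node's requirements by its lead time.

```python
# Explode a level finished-goods supply plan into component requirements for upstream suppliers.
# BOM structure and quantities are illustrative.

bom = {                       # finished good -> {component: quantity per unit}
    "widget": {"housing": 1, "pcb": 2, "screw": 6},
}

def propagate_demand(finished_goods_plan, bom):
    """Translate a per-period finished-goods plan into per-period component requirements."""
    component_plan = []
    for period_qty in finished_goods_plan:
        requirements = {}
        for fg, qty in period_qty.items():
            for component, per_unit in bom[fg].items():
                requirements[component] = requirements.get(component, 0) + qty * per_unit
        component_plan.append(requirements)
    return component_plan

level_plan = [{"widget": 500}] * 4          # a level plan: 500 widgets per period
for period, req in enumerate(propagate_demand(level_plan, bom), start=1):
    print(f"Period {period}: {req}")
```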
It is a reasonable solution and the right approach. Green supply chain economics has put pressure on manufacturers to reduce waste not only within the walls of the facility but across the supply chain. As natural resources are consumed and depleted, we have to be smarter about using them effectively.
Monday, April 7, 2008
Stability of Supply Chains
Recently I had some interesting discussions with executives from numerous companies who are at different stages of their lean journey. The business driver for most of these clients is how to reduce cost and manufacturing lead times.
After meeting with several clients and doing a bit of research, it seems that if there is one metric companies should focus on, it is lead time reduction. Lead time is the core metric that reflects the health of the operation. People familiar with Little's Law will quickly grasp that there is a direct relationship between lead time and WIP in the system. Up to a point, steadily increasing WIP does not change the lead time; this indicates that some level of WIP is good for the supply chain, and operations can keep running at the same level of performance. But beyond the inflection point, any increase in WIP adversely affects the lead time. In other words, some level of WIP is definitely good, but in excess it drives up the cost of operations. Excess WIP means more handling cost, loss of material, increased material movement, more quality problems due to handling and an overall increase in the complexity of scheduling decisions.
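For readers who want the arithmetic behind this, the sketch below uses the textbook best-case relationship derived from Little's Law (WIP = throughput × lead time): below the critical WIP, lead time stays at the raw process time; above it, lead time grows linearly with WIP. The bottleneck rate and process time are invented numbers.

```python
# Best-case lead time vs. WIP, following Little's Law (WIP = throughput x lead time).
# Bottleneck rate and raw process time are illustrative values.

bottleneck_rate = 10.0    # units per day the slowest resource can produce
raw_process_time = 3.0    # days of pure processing with no queueing
critical_wip = bottleneck_rate * raw_process_time   # WIP level at the inflection point

def best_case_lead_time(wip):
    """Below critical WIP, lead time stays at the raw process time; above it, it grows with WIP."""
    return raw_process_time if wip <= critical_wip else wip / bottleneck_rate

for wip in (10, 20, 30, 45, 60):
    print(f"WIP = {wip:>3} units -> best-case lead time = {best_case_lead_time(wip):.1f} days")
```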
Where is your operation with respect to the inflection point? A typical company never reaches the efficiency dictated by Little's Law; it is a theoretical limit and gives companies a benchmark against which to compare their operations. The typical reasons a company deviates from the limit are a) an increase in product mix, b) batching requirements at each operation, c) volatility of supply and demand, d) setups and e) not producing parts and components based on customer requirements.
In this highly variable environment, what can companies do to run effectively? Notice that I say effective and not efficient. Companies can run their operations efficiently by producing material at the resource capacity, but that does not mean the products they are producing are based on customer needs. The customer can be the next operation, a distribution center, a channel customer or the end customer. Running effective operations means following good business practice: producing the right mix of material so that operations run efficiently while also meeting customer demand.
The crux of manufacturing effectiveness is the third principle of lean as defined by James Womack in his book "Lean Thinking": making the material flow through the system. These concepts are not new; we see them in real life with fluid dynamics. The only way to reduce the residence time of a fluid is to make it flow, thereby reducing the stagnation points, or "sinks." The same thought process can be applied in manufacturing: we want to avoid sinks, or material stagnation points. I am not preaching that sinks should not exist; they are necessary to manage the product portfolio, variability in supply and demand, or manufacturing scheduling constraints such as calendar mismatches, rates of production and the required mix of production. It is, however, good practice to determine the size of these sinks and manage them so that we do not lose control.
This brings us to a new point: how do you compensate or measure operations managers so they run "effective" operations? Do you measure them on the deviation from the predicted sinks or inventory buffer targets, or do you measure them on resource utilization?
Thursday, March 20, 2008
Global Manufacturing Challenges
The manufacturing environment has changed significantly with the advent of globalization over the last 20 years. Traditional manufacturers served the local market and had their manufacturing facilities and supplier networks within the local region. Previously, the manufacturing challenge was to manage regional demand, where the consumption pattern was well understood. A majority of international markets were closed to foreign companies in order to promote local industries. In India, the flagship car, the 1950s-model Ambassador, was the major brand on the roads until the 1980s, when the market was opened to other brands. The downside of such a market was that the local consumer had no choice, and it also killed innovation due to the lack of competition. The "one style/color fits all" mentality behind the mass assembly line is no longer an accepted approach for large-scale production. Instead, "mass customization" is now not only required but a goal for many manufacturers. Customers require a higher quality product with more functions, and usually at a lower price. Product lifecycles were once much longer: the original typewriter lasted 30 years, whereas today a computer phases out within a year. Globalization has created new markets and opened up access to new consumers who are eager to shorten the learning curve and jump to the latest technology in a heartbeat. Nowadays you can buy an iPod or a laptop in any part of the world, with the same quality standard as found in the developed world.
Manufacturers today operate in a highly complex, distributed and fragmented environment. On one hand, globalization has created tremendous opportunities to develop new products, serve new customers and pursue new markets. On the other hand, the competitive stakes are higher than ever. The new manufacturer is under unprecedented pressure to cut operating costs, deliver on time, optimize the use of available assets and adhere to regulatory and compliance edicts, while at the same time focusing marketing efforts on growing top-line revenue.
The new global manufacturer has several manufacturing facilities, several contract outsourcing facilities, several sales channels and global consumers. The challenge is how to coordinate activities across multiple plants at the enterprise level into a cohesive business and produce thousands of product variations with ever-shortening lead times and high delivery compliance. A globally dispersed supply chain and manufacturing operation requires synchronization across several time zones. Several years of continuous improvement have made these chains effective and lean, but globalization has ratcheted up the challenge a few levels. Lead time and delivery requirements have not relaxed; in fact they have become more competitive, in spite of the global network of facilities.
To meet the faster, better, cheaper mantra of today's economy, manufacturers need to look for new ideas, designs and methods, examining every aspect of production. As supply chains become leaner, the pressure on manufacturers to respond quickly, profitably and efficiently has only increased.
Monday, February 4, 2008
Where is thy Buffer?
Production planning and operations have always been a challenging task. The daily decision making around which production orders to process within the available capacity, based on customer requirements, sales team escalations, quality issues, machine setups and maintenance requirements, can be very daunting. The question planners struggle with is how to make the production process consistent and sustainable in this uncertain environment.
Planners usually play with buffers to manage the uncertainties. There are primarily two kinds of buffers: capacity and inventory. Planners have the option of booking less than the available capacity; the remaining capacity buffers against uncertainty. The problem is representing this information to the executives: sales executives will look at the bookings and conclude that they should be able to book more orders given the open capacity. Planners can use a variety of mechanisms to create a conservative plan. The run rates and yields are averages based on past information and heavily influence the capacity. Small changes to run rates and yields can influence capacity planning considerably, which directly affects how much the sales process should book.
The other way to buffer against uncertainty is managing inventory levels. Inventory levels are a function of demand and supply variability, the demand pattern and equipment uptime. The safety stock buffer is maintained primarily to absorb that variability. This helps keep production steady while letting the inventory buffer move up or down daily based on production execution and shipments.
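As a rough sketch of sizing such an inventory buffer, the commonly used formula below combines demand variability and lead-time (supply) variability under a target service level; the demand figures and lead-time statistics are invented.

```python
# Safety stock from demand and supply (lead-time) variability, for a target service level.
# All input numbers are illustrative.
import math

def safety_stock(z, avg_demand, std_demand, avg_lead_time, std_lead_time):
    """Classic combined-variability formula: z * sqrt(LT * sd_d^2 + d^2 * sd_LT^2)."""
    return z * math.sqrt(avg_lead_time * std_demand**2 + avg_demand**2 * std_lead_time**2)

z_95 = 1.65                      # service factor for roughly a 95% cycle service level
ss = safety_stock(z=z_95, avg_demand=100, std_demand=20,     # units per day
                  avg_lead_time=5, std_lead_time=1.5)        # days
print(f"Safety stock target: {ss:.0f} units")
```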
Lean principles and just-in-time production are two approaches planners employ during the production planning process. For the build-to-order industry, where the production process starts only when the order is received, planners have to plan around the capacity buffer. The challenge is to determine a consistent product mix and start times that, in effect, reduce excess work in process in the supply chain. For this industry, sales and operations planning is an important part of the process, to ensure the sales booking mix adheres to the capacity constraints. For the repetitive business, where the manufacturer produces the same material consistently, inventory buffer management becomes an important aspect of planning. The planner has to ensure that the buffer can handle spikes in demand and lapses in supply reliability while production keeps producing material consistently.
Any other buffers?
Friday, January 18, 2008
Levers of Planning
One of my colleagues recently remarked, "If we produce the same material as in the past, who needs lean?" It is an interesting thought. If the company produces the same product mix as in the past, the operations have already been time-tested to let that product mix flow through the facility.
This raises a good point: if the operation can sustain the same kind of performance and satisfy customers, why change? I think we need to investigate the operations in detail and ask what issues were faced in the past. How many production escalations were introduced? How many times were orders not delivered on time? How many times was the maintenance schedule violated? How many times was the schedule changed due to the unavailability of material? How often were resources used for rework? How many times was inventory built ahead and left sitting on the floor? And the list goes on.
It is good to know the past in order to predict the future. The past is a learning experience, and that experience is obviously very important so that we do not make the same mistakes again. However, the future is unpredictable and unmanageable. It is like a Christmas present: you can certainly hope that your rich Grandpa gives you a present similar to last year's, but a change in his lifestyle may affect this year's gift.
As Eli Goldratt has discussed in several forums, there are only a few variables that can be managed effectively by an individual or an organization, and the more complex the system, the fewer variables can be managed effectively. The key levers for manufacturing are capacity, forecast and material availability. During planning, capacity and forecast are the main variables; material availability should be an outcome of the process and should not be a constraint on planning. The question planners need to answer is: if a customer requires a part on a particular day, is the planner going to go back to the customer to change the request date based on material availability? That not only causes changes in the plan but affects many of the decisions the planner has to manage. Material availability usually becomes a constraint during execution due to several factors, such as quality issues, vendor problems, a material handler not finding the material, or production variability. Lean principles can be extremely valuable in determining the right amount of material to hold against such uncertainties. For capacity planning, lean manufacturing promotes level production and lets inventory vary to sustain the shipment of material to customers. The kanban system, as astutely implemented by Toyota, keeps production steady and helps manage inventory levels while making the supply chain responsive to deviations of actuals from plan.
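A minimal sketch of how a kanban loop is commonly sized, using the rule of thumb that the number of cards must cover demand during the replenishment lead time plus a safety allowance; the demand, lead time, container size and safety factor are invented numbers, not Toyota's.

```python
# Rule-of-thumb kanban sizing: cards must cover demand over the replenishment lead time
# plus a safety allowance. Input values are illustrative.
import math

def kanban_cards(daily_demand, lead_time_days, container_size, safety_factor=0.10):
    """Number of kanban cards = demand during lead time * (1 + safety) / container size, rounded up."""
    return math.ceil(daily_demand * lead_time_days * (1 + safety_factor) / container_size)

cards = kanban_cards(daily_demand=200, lead_time_days=2, container_size=50, safety_factor=0.15)
print(f"Kanban cards in the loop: {cards}")
```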
Lean has previously been treated as synonymous with repetitive manufacturing. It is not; it can be implemented in a make-to-order industry, as confirmed by many research institutes. The benefits of lean are also well documented, but the biggest benefit to manufacturers is helping them build a steady production plan that deviates minimally due to forecast inaccuracy. For the production folks, having a steady, repetitive daily plan is a relief from the discomfort of figuring out what to build next. As discussed in earlier articles, reducing Mura (unevenness) through steady production helps reduce Muri (overburden and stress) in planning.
I would like to hear about your experiences. My next article will be about material management and inventory.
Wednesday, January 2, 2008
Efficiency at the Cost of Flexibility
In my past discussions with clients, the focus was primarily on efficiency rather than flexibility. I have to admit that most of the clients I worked with belonged to asset-intensive industries where millions of dollars were invested to install the assets. The focus of the operation was to keep these expensive assets running to reduce unit cost (based on utilization), but sometimes at the cost of losing control over unwanted inventory build-up.
We had a unique experience the other day at McDonald's. Dallas was unusually cold during the holidays, and we decided to take the kids to the neighborhood McDonald's to burn some calories (sounds like an oxymoron) in the indoor play area. We had a group of four boys who wanted to play tag. The play area at a McDonald's is a colorful assembly of slides, walk-in pipes, climbing nets and steps that together form a pleasing place to play. The two 11-year-olds formed one tag team and the two 6-year-olds formed the other.
At first it seemed a slam dunk that the 11-year-olds would win easily. They were stronger, faster and bigger; they could climb more easily, take bigger steps and overpower the younger boys. What we did not realize was that the walk-in pipes had a limited diameter. It was easy for the 6-year-olds to run through them, while the 11-year-olds had a tougher time and their movement was at best very sluggish. It was soon proven that the 6-year-olds won the tag game as a team, despite being the physically weaker of the two teams.
It is a good example of how flexibility is a key enabler in speeding up an operation. Manufacturing philosophies have changed since Frederick Taylor promulgated the art of improving efficiency through mass production in his work on "Scientific Management." Henry Ford and Andrew Carnegie were early adopters who took advantage of these philosophies, building standardized, massive installations of highly efficient machines. The role of manufacturing was based on push production and thrived on low mix and little global competition. Ford could keep efficiently building its black Model T's on dedicated lines and still sell them in the market. But in today's market the customer has taken center stage. Customers dictate the product requirements and attributes. They have become accustomed to better customer service and have consistently challenged producers to shorten their lead times. How to shorten lead times in a high-mix environment is a topic in its own right.
Flexibility is a virtue that manufacturers have learned to adopt in the developing global economy. There are several ways to improve flexibility in operations. One is installing several smaller machines instead of a single large machine. It may cost more up front, but it increases flexibility: in a smaller-machine environment, some of the machines will always be up while others are being maintained, and several different kinds of products can be produced simultaneously through the manufacturing lines. Flexibility also improves if you plan to your effective capacity, based on past reliability records of the equipment, instead of the rated capacity. The rule of thumb used by planners is to plan to around 85% of capacity, leaving some room to accommodate unreliable machines and labor, machine setups due to product mix changes and customer order expedites, as sketched below.
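A small sketch of planning to effective rather than rated capacity, assuming effective capacity is derived from rated capacity, historical availability and yield, with the 85% rule of thumb applied on top; all figures are illustrative.

```python
# Plan bookings against effective capacity (derived from reliability history),
# not the rated nameplate capacity. Figures are illustrative.

def effective_capacity(rated_units_per_day, availability, yield_rate, planning_factor=0.85):
    """Effective capacity = rated capacity x historical availability x yield x planning factor."""
    return rated_units_per_day * availability * yield_rate * planning_factor

plannable = effective_capacity(rated_units_per_day=1000, availability=0.92, yield_rate=0.97)
print(f"Plannable capacity: {plannable:.0f} units/day (vs. 1000 rated)")
```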
Based on this observation, I think I would rather have the two 6-year-olds on my team than the two 11-year-olds: they are smaller and more agile in the given environment. One could argue that if the walk-in pipes were bigger the result would be different; well, "that" is the market demand, and we have to learn to be flexible and adjust to it.
We had a unique experience the other day at McDonald. Dallas was unusually cold during the holidays and we planned to take the kids to the neighborhood McDonalds to burn some calories (sounds like an oxymoron) – in the indoor play area. We had a bunch of 4 boys who wanted to play tag. The play area at a McDonalds is a colorful group of slides, walk-in pipes, climbing nets and steps combined together to form a pleasing area to play. The two 11 year olds formed a tag team and the two 6 year olds formed the other team.
At first thought it seemed that it was a slam dunk where the 11 year olds would win easily. They were stronger, faster and bigger. They could climb up easily, take bigger steps and could overpower their brethrens to surrender easily. What we did not realize that the walk-in pipes had a limited diameter. It was easy for the 6 year olds to run through them while the 11 year old had a tougher time and their movement was at best very sluggish. It was soon proven that the 6 year olds won the tag as a team – while being physically weaker of the two teams.
It is a good example of how flexibility is a key enabler of speed in an operation. Manufacturing philosophies have changed since Frederick Taylor promulgated the art of improving efficiency through mass production in his work on “Scientific Management”. Henry Ford and Andrew Carnegie were early adopters who took advantage of those philosophies, building standardized, massive installations to run highly efficient operations. The role of manufacturing was based on push production, and it thrived on low mix and limited global competition. Ford could keep efficiently building black Model-Ts on dedicated lines and still sell them in the market. In today's market, however, customers have taken center stage. Customers dictate the product requirements and attributes. They have become accustomed to better service and have consistently challenged producers to shorten their lead times. How to shorten lead times in a high product mix environment is a topic of discussion in its own right.
Flexibility is a virtue that manufacturers have learned to adopt in the developing global economy. There are several ways to improve flexibility in operations. One is to install several smaller machines instead of a single large machine. It may cost more upfront, but it increases flexibility: in a smaller-machine environment some machines will always be up while others are being maintained, and the plant can run several different kinds of products simultaneously through its manufacturing lines. Flexibility also improves if you plan against effective capacity, derived from the past reliability records of the equipment, rather than rated capacity. The rule of thumb used by planners is to plan at around 85% of capacity, leaving room to accommodate unreliability of machines and labor, machine setups due to product mix changes and customer order expedites.
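Here is a minimal sketch of that rule of thumb in Python, with purely hypothetical uptime and allowance figures; the only point is that planning to effective rather than rated capacity lands close to the 85% guideline.

```python
# Minimal sketch (hypothetical numbers): plan to effective capacity
# rather than rated capacity, roughly the 85% rule of thumb.

RATED_HOURS_PER_WEEK = 120.0   # assumed: 3 machines x 40 h
HISTORICAL_UPTIME = 0.92       # assumed: from past reliability records
SETUP_ALLOWANCE = 0.05         # assumed: changeovers for product mix
EXPEDITE_BUFFER = 0.05         # assumed: room for customer expedites

def effective_capacity(rated_hours):
    """Hours a planner should actually load, given known losses."""
    usable = rated_hours * HISTORICAL_UPTIME
    usable *= (1.0 - SETUP_ALLOWANCE - EXPEDITE_BUFFER)
    return usable

plan_to = effective_capacity(RATED_HOURS_PER_WEEK)
print(f"Rated: {RATED_HOURS_PER_WEEK:.0f} h, plan to: {plan_to:.0f} h "
      f"({plan_to / RATED_HOURS_PER_WEEK:.0%} of rated capacity)")
```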
Based on that observation, I would rather have the two 6 year olds on my team than the two 11 year olds. They are smaller and more agile in the given environment. One could argue that if the walk-in pipes were bigger the result would be different. Well, “that” is the market demand, and we have to learn to be flexible and adjust to it.
Wednesday, December 26, 2007
The Control Room – Next Generation of Managers
Computing horsepower has increased tremendously over the years. I was at NASA Houston recently and was amazed to see how little technology was used to put the first man on the moon. I was told that the entire operation was controlled by the equivalent horsepower of three PCs, and that the copper wiring laid for it was long enough to go around the earth almost three times.
I am not sure how to interpret that: if three PCs can put a man on the moon, do we really need all that extra memory to run a factory? A typical manufacturing plant will have at least 10 to 20 servers, each with 2 to 10 GB of RAM.
The main thrust of change over the years has been the transformation of information sharing. One of my clients claimed that it would take 3 to 4 weeks to get a report of the entire spend in the organization. That induces latency in the decision making process and reduces the company's flexibility to react to changes quickly.
Technology's main contribution is not only in solving complex problems but in disseminating gigabytes of information quickly, in real time, to enable fast decision making. Moving that much data between equipment, operations, maintenance, engineering and corporate requires substantial bandwidth and memory to process and distribute the information to the stakeholders.
The role of “autonomation” – automation with a human touch, in which operations are given enough intelligence to catch their own problems – will become an important aspect of manufacturing. Toyota developed the technique of self-correcting operations, where the operation stops automatically when a defect is detected. This allows fewer people to manage multiple machines. The role of technology in automating operations is also tied to the processing of information: capturing the real-time status of the equipment, including signals from active and passive sensors, and processing them intelligently into a digestible metric enhances the company's ability to run the floor with fewer individuals.
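As a rough illustration, here is a minimal Python sketch in that spirit: hypothetical sensor names and spec limits, rolled up into a single health metric, with the operation stopped when anything drifts out of spec.

```python
# Minimal sketch (hypothetical sensors and limits): turn raw readings
# into a digestible metric and stop the operation automatically when a
# defect condition is detected, in the spirit of autonomation (jidoka).

from statistics import mean

SPEC_LIMITS = {"temperature_c": (180.0, 220.0),   # assumed limits
               "vibration_mm_s": (0.0, 4.5)}

def health_metric(readings):
    """Share of monitored signals whose recent average is within spec."""
    in_spec = [SPEC_LIMITS[s][0] <= mean(vals) <= SPEC_LIMITS[s][1]
               for s, vals in readings.items()]
    return sum(in_spec) / len(in_spec)

def check_and_maybe_stop(readings):
    score = health_metric(readings)
    if score < 1.0:   # any signal out of spec -> stop the line
        return f"STOP: health {score:.0%}, operator attention required"
    return f"RUN: health {score:.0%}"

print(check_and_maybe_stop({"temperature_c": [199, 201, 203],
                            "vibration_mm_s": [5.1, 5.3, 4.9]}))
```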
The classic example is Wall Street. Currency values change by the second depending on a multitude of factors from around the world. Valuations are available in real time to everyone, and buy and sell decisions are made almost in real time, even by an ordinary person sitting in a remote place while sipping a margarita in Cancun. If Wall Street can do it, so can manufacturing companies. I see the next generation of manufacturing managers sitting in a control room, making decisions based on real-time feedback from the operation. The decisions will be implemented and the results will be visible in real time for the next set of corrective actions.
Friday, December 14, 2007
CIA of Operations Management
Now that I have your attention, you may be wondering what the CIA has to do with operations. CIA – “Central Item Allocation” – was a term coined by one of my clients during an enterprise transformation project. Large manufacturing companies face three prominent challenges that are rated very highly in customer surveys: cultural resistance to change, managing the proliferation of IT systems and coping with mergers and acquisitions.
This article discusses the third challenge, mergers and acquisitions, particularly with respect to operational efficiency. In the last 10 years there has been a proliferation of companies going global. The reasons vary and range from 1) access to low cost resources, 2) access to new markets and 3) a way to hedge the market by diversifying. These acquisitions, however, have created a monster of a problem for the COO and the CIO. Millions of dollars are being spent on building common, standard business processes across cultural and national boundaries. One of the key decisions that comes up during due diligence is whether to build a single instance or a distributed set of systems across the enterprise. A single instance is beneficial from a TCO (Total Cost of Ownership) point of view, but it causes two problems. First, it takes away the individual sites' ability to manage their own manufacturing and strips the individuality from plants that have unique characteristics and strengths. The best people to manage the operation are usually the people at the plant who live and breathe the production daily, yet with a single standard across the enterprise they have to comply with corporate norms. Second, if the company later decides to divest a manufacturing unit, separating the IT infrastructure becomes a highly complex process.
Integrating an acquisition is a nightmare and has to be managed diligently. The challenge faced by the COO is to determine the best way to use the distributed capacity “effectively” to serve variable market demand. CIA was conceived as a common order-taking process that considers the demand characteristics and allocates orders intelligently to the individual manufacturing sites.
Some of the key factors considered during the design phase were 1) how to create level capacity loading, 2) how to promise based on profitability, 3) how to promise based on logistics costs, 4) how to accommodate customer preferences and 5) whether small-lot orders can be aggregated into bigger lots before promising. The CIA function can become very complex if all of these decisions have to be weighed during order promising. The challenge is to run all of this logic within a short interval of time, so that customers and sales representatives do not wait an inordinate amount of time for a promise date. To make this happen, some basic fundamentals have to be in place in a multi-plant environment. It cannot be done without the standardization of products and processes – which is why a single instance of the enterprise application plays a critical part.
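To make the idea concrete, here is a minimal, hypothetical sketch of the core allocation step: choose the plant with the lowest landed cost that still has capacity, breaking ties toward the less loaded plant. A real CIA engine would also have to handle the profitability rules, customer preferences and lot aggregation listed above.

```python
# Minimal sketch (hypothetical data): allocate an order to the plant
# with the lowest total landed cost that still has capacity, with a
# level-loading tie-break.

from dataclasses import dataclass

@dataclass
class Plant:
    name: str
    available_hours: float   # remaining capacity this week
    unit_cost: float         # manufacturing cost per unit
    freight_per_unit: dict   # logistics cost by customer region
    load: float = 0.0        # share of capacity already booked

def allocate(order_qty, hours_per_unit, region, plants):
    feasible = [p for p in plants
                if p.available_hours >= order_qty * hours_per_unit]
    if not feasible:
        raise ValueError("No single plant can take the order; split or re-promise")
    # Primary criterion: landed cost; secondary: keep loading level.
    return min(feasible, key=lambda p: (p.unit_cost + p.freight_per_unit[region],
                                        p.load))

plants = [Plant("Dallas", 400, 10.0, {"NA": 1.0, "EU": 4.0}, load=0.70),
          Plant("Gdansk", 250, 9.0, {"NA": 4.5, "EU": 1.2}, load=0.55)]
print(allocate(order_qty=100, hours_per_unit=0.5, region="EU", plants=plants).name)
```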
A comprehensive, successful implementation of CIA helps to better utilize capacity across multiple plants, balance production and level the load, reduce delays and shorten cycle times, aggregate small-lot orders and improve customer service.
Tuesday, November 13, 2007
Bookings-Billings-Backlog (BBB) – Performance Metric
I came across this term during my tenure leading a consulting services group. Bookings, Billings and Backlog were used extensively to understand the pulse of the services business. Looking back to my engineering days, the concept is similar to a fluid flow problem. Take a water tank with two taps attached to it. One tap (the Bookings tap) pours water into the tank and the other (the Billings tap) drains water out of it. The Backlog is the amount of water inside the tank. The challenge for companies is to make sure that the level of water inside the tank is steady, or changing steadily. A change in the Backlog is the first indicator that the business is changing and that corrective actions have to follow. A fast or abrupt change indicates that the data was inherently inaccurate or that visibility was being hidden.
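The tank arithmetic is simple enough to sketch; the numbers below are hypothetical, and the only point is the relationship backlog[t] = backlog[t-1] + bookings[t] - billings[t].

```python
# Minimal sketch (hypothetical numbers): the water-tank view of
# Bookings-Billings-Backlog. Backlog behaves like the water level.

bookings = [10, 12, 9, 15, 8]     # assumed monthly bookings ($M)
billings = [9, 10, 11, 10, 12]    # assumed monthly billings ($M)

backlog, level = [], 20.0         # assumed starting backlog ($M)
for b_in, b_out in zip(bookings, billings):
    level += b_in - b_out         # backlog[t] = backlog[t-1] + in - out
    backlog.append(level)

for month, lvl in enumerate(backlog, start=1):
    trend = "growing" if month > 1 and lvl > backlog[month - 2] else "flat/shrinking"
    print(f"Month {month}: backlog {lvl:.0f} ({trend})")
```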
If the backlog is rising, it is an indication that the business is growing. The decision maker has to decide how to meet the changing requirements. While a healthy Backlog is a good sign, it usually has a limited lifespan in a competitive world: decide quickly how to serve the backlog, or bookings may be cancelled and the backlog may disappear. There are two ways to service backlog. You can either open the billings tap wider or spread the bookings into future time buckets. To drain billings faster, put more resources on the project or projects. If you believe the increase in billings is sustainable and cannot be serviced by the current structure of the organization, then hire more people.
If the backlog is decreasing, it is an indication that the business is shrinking; emphasis should go to increasing bookings and rethinking the billings strategy. Bookings depend on sales, which carries a high level of uncertainty. In the services industry there is more control over billings, so action has to be taken to manage the workforce based on billings projections. I have lived through a company where the bookings kept steadily decreasing over time. The initial reaction was that it was a fleeting market reaction, and no major corrective actions were taken. In the long run the number of projects went down, the number of people on the bench grew, and so did the cash burn rate of the company. A smart manager would look at the metric (a shrinking backlog) and take quick corrective action. As I wrote in an earlier article (The Art of Prediction – Manage Risk), all the metrics can be made visible, but if they are not correlated with experience the benefit of early visibility is negated.
Spreading or leveling the bookings is a way to manage risk by smoothing out unevenness (mura, a term borrowed from Japanese). It applies in any industry, manufacturing or services. Leveling bookings ensures that the growth or downturn of the business is manageable and that revenue risks are mitigated. A rapid change in bookings or billings can create an unsustainable environment. It causes muri (another Japanese term), meaning overburden – stress on the organization, people and machines. In such an environment, people are challenged to be creative and do things against the norm, which more often than not leads to poorly thought out decisions being executed.
My learnings from managing a services organization were threefold. The first was to manage the organization based on BBB and always stay on top of the Backlog metric; Backlog is the early warning system that indicates whether the business is growing or shrinking. The second was to take quick corrective action to mitigate the risk of failing to meet changes in bookings. The last was to level out the bookings and billings and manage the growth or the downturn to guarantee a soft landing.
Saturday, November 10, 2007
The Art of Prediction – Manage Risk
We have all heard about the famous Deming Cycle – PDCA (Plan, Do, Check, Act). The closed-loop process was developed as a preemptive way to mitigate manufacturing and quality related risks. Globalization of manufacturing has recently taken the battle for efficiency to another level. As we have heard before, "the battle is no longer between companies or organizations but between supply chains." The outsourcing of manufacturing to low cost countries has challenged manufacturers in the developed world to squeeze additional cost out of their supply chains to remain competitive in the new world.
Can we predict failure? Prediction is an art, not a science. However, with new technology and tools we can come close to predicting failure, and the successful companies are the ones that can take corrective actions faster. The financial industry has developed a series of metrics to evaluate a company's performance and constantly monitors them to make buy or sell decisions. Some quantitative hedge funds base their decisions purely on the numbers from their models, but the outstanding ones correlate the metrics with experience and make better and faster decisions. I like to start with the analogy of the human body (or an automobile). As we grow older, predictive maintenance becomes an important facet of our lives. We go to the doctor at regular intervals to judge the condition of our health, mainly to reduce future cost and lead a healthy lifestyle. The doctor runs a series of tests and measurements and provides warnings based on the results. These tests have been standardized and are customized to the individual's age and gender. The best doctors take the data and make recommendations based on their experience.
The key learnings from visiting a doctor are similar to those of supply chain risk management. Several components have to be in place to manage risk in operations. The five key components are 1) metrics, 2) real-time data gathering and monitoring, 3) experience (history), 4) corrective actions and 5) perfection.
The most difficult task is defining a standard set of metrics. Identifying the right metrics at the right level of detail is the first step toward better decision making. Just as in the medical community, the metrics vary for manufacturing companies; companies can be segmented by industry, size, innovation and market perception. Today many software vendors have developed standardized metrics for financial and operational effectiveness. Companies should nevertheless understand each metric and revisit it at regular intervals to ensure it is still valid for the current state of the company.
The challenge 30 years ago was the ready availability of the data needed to monitor the pulse of the company. With new technology it is possible to constantly monitor resources (equipment, inventory, orders, customers, suppliers) and generate results that are useful in decision making. The interval of monitoring and real-time data gathering can be set based on the risk of failure or the impact of failure. The technical challenge of processing massive volumes of data and making sense of it is already being met in the current internet world and will only improve.
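A minimal sketch of that idea follows, with hypothetical metrics and control limits: check intervals scale with the assumed risk and impact of failure, and anything outside its limits is flagged for corrective action.

```python
# Minimal sketch (hypothetical metrics and limits): monitoring interval
# set from risk x impact, out-of-limit metrics flagged for action.

METRICS = {
    # name: (lower limit, upper limit, failure_risk 0-1, impact 0-1)
    "on_time_delivery_pct": (95.0, 100.0, 0.3, 0.9),
    "scrap_rate_pct":       (0.0,   2.0,  0.6, 0.5),
}

def check_interval_minutes(risk, impact):
    """Higher risk x impact -> check more often (bounds are assumptions)."""
    return max(5, int(240 * (1.0 - risk * impact)))

def review(observed):
    for name, value in observed.items():
        lo, hi, risk, impact = METRICS[name]
        status = "OK" if lo <= value <= hi else "CORRECTIVE ACTION NEEDED"
        print(f"{name}: {value} [{status}], recheck in "
              f"{check_interval_minutes(risk, impact)} min")

review({"on_time_delivery_pct": 92.5, "scrap_rate_pct": 1.4})
```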
As we have all heard before, learn from history. History and experience are valuable commodities we cannot ignore. I still remember the dot-com boom era, when for a period all of us thought we were better at investing than the professionals. We were quickly brought back to earth during the bust, when most of us were not prepared to get out of the market at the right time. Manufacturing companies likewise have executives and managers with precious past experience. All the data and analysis, without context and perspective, can go to waste.
The dot-com boom and bust made me realize that the basic art of investing is knowing when to get out. The art of making gutsy calls and quick corrective actions rests mostly with the individual. It is like driving a car in snow: the driver slows down to prevent mishaps, and as soon as the car skids, takes quick action to minimize the loss of control. The same actions are valid at the level of a company, on a much larger scale. Corrective actions require real-time metrics, data and experience to make judicious decisions.
Perfection is a word I have borrowed from the Lean community. All of the above processes can be built and can deliver significant benefits, but to create a sustainable engine, the processes of measuring, monitoring and taking corrective action should be built into the DNA of the company. The pursuit of perfection can only succeed if the processes are standardized and accepted as the norm by the entire workforce.
We must remember that, like Deming's PDCA cycle, this is a closed-loop process. With each cycle the learnings should be incorporated into the knowledge base, and the cycle should continue. Risk management is about proactively measuring and monitoring the right data and taking corrective actions when required. There is a cost to laying such a foundation, but the benefit of preventing a failure far outweighs the cost of the implementation.
Wednesday, November 7, 2007
Support Structure for Sales Effectiveness
I have been involved in several sales cycles for software and services over the last 11 years. It has usually been a grueling affair trying to manage the whole process, working 80-hour weeks to bring it to completion. Over the years, having worked in consulting and sales, I have come to the conclusion that selling is the most difficult task in an organization. I have grown to respect sales organizations and the effort they put in to bring a deal to closure, not only for the effort itself but for doing it in an environment of uncertainty.
This piece is not about the sales process but about the support structure required to make the sales process effective. The three key dimensions that organizations should focus on are Marketing, References/Case Studies and Customer Testimonials.
Customers have become savvier buyers of software and services over the years. Although some customers fall into the category of leaders and innovators, open to new ideas and willing to try out innovative solutions, the great majority are followers who are primarily looking for success stories. Armed with a list of references, the sales team can quickly gain the confidence of the buyer. Customers usually prefer references in similar solution areas and similar industries. References and case studies help clear the first hurdle: has the software and services experience been successful in a similar environment? They remove the first line of skepticism, and the customer moves to the next step of analysis.
Once the reference veil has been lifted, the next step in putting the customer at ease is existing customer testimonials. To create a positive customer testimonial, three things have to be in place. The first is a successful implementation and experience, where the existing customer is willing to stand up and vouch for the product and service. The second is a strong personality within the existing customer base who can help influence the community, and the third is a user community – blogs and discussion sites – that the prospective customer can visit to gather perspectives. A positive customer testimonial goes a long way toward increasing the effectiveness of the sales process.
The marketing organization in a software and services company plays a significant role in creating the messaging and disseminating it across the community. The first step for the marketing group is to create an effective, simple message that comprehensively describes the company's offerings – simple enough that the sales team can deliver it as an elevator speech. The second step is to work with analysts and publication editors to get the message across. Analysts and publications are very influential and have existing audiences eager to hear and read about new solutions and practices. On several occasions when I have visited customers and we discussed new trends and technology, the customer would inevitably produce some document written by an analyst and want to understand how our solutions fit into that model. Being in the top quadrant requires not only implementation successes but close alignment of the company's message with the analysts. The other medium for disseminating the message is community conferences and organizations. Executives and managers attend conferences to hear about new trends, and this is the best place to talk one-on-one outside the office.
In summary, Marketing, References and Customer Testimonials play a key role in enabling sales. SG&A costs in software and services are among the highest of any industry, so a small change in the effectiveness of the process can reduce the overall cost of sales and increase margins.
I would like to hear about other support organizations and structures that increase sales effectiveness.
Thursday, November 1, 2007
Promising in Finer Time Buckets
Depending on the industry vertical and the “lead time” of manufacturing or services, orders are promised in days, weeks or months. The aerospace industry will promise the next airplane delivery in months, the steel industry in weeks, the auto parts industry in days and an IT project implementation in months. The question that the VP of Operations often poses is how to promise better, in finer time buckets.
During one of my stints in Asia, a CIO who came from a manufacturing background wanted a solution and strategy to promise his customers in days instead of the usual weeks. The benefit of promising and delivering in days was very simple: it would take a week of inventory out of the supply chain, which would reduce lead times and also free up cash that could be put to more productive use rather than being tied up in inventory.
Though the reasons for promising delivery in finer buckets are simple, the task of achieving it is quite complex. On further investigation into why the company was promising in weekly buckets, we found three main reasons. The first was that the extra buffer of time covered the unreliability of the operations; the second was that each operation in the manufacturing chain was scheduled to maximize the utilization of the resource, sometimes resulting in the production of unwanted material or producing material beforehand; and the third was that they wanted to aggregate shipments into large quantities to reduce shipping costs.
The first task at hand is to investigate the root causes of unreliability. The reasons can be varied, ranging from quality problems and machine downtime to poor planning and demand variability. The key is to systematically eliminate all of these root causes. The other way to manage reliability issues is to create a buffer of safety stock that can absorb the variability. Although safety stock is additional capital tied up in inventory, it does enable a flow of material through the manufacturing chain. The key for any manufacturer is to decide on the safety stock level – not just any stock, but a predetermined amount for a specific mix of stock. Safety stock is definitely not a panacea for unreliability, but it does help bring stability and consistency to production. As further actions are taken to eliminate unreliability, the safety stock can be reduced and the capital released to be used elsewhere.
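As a rough illustration of "a predetermined amount", here is a minimal sketch using the common z * sigma * sqrt(lead time) approximation; the demand and lead-time figures are hypothetical.

```python
# Minimal sketch (hypothetical demand figures): size safety stock to
# buffer demand variability over the replenishment lead time.

from math import sqrt

Z_FOR_95_PCT_SERVICE = 1.65   # standard normal value for ~95% service level
daily_demand_std = 40.0       # assumed units/day standard deviation
lead_time_days = 9.0          # assumed replenishment lead time

safety_stock = Z_FOR_95_PCT_SERVICE * daily_demand_std * sqrt(lead_time_days)
print(f"Safety stock at 9 days lead time: {safety_stock:.0f} units")

# Shrinks as lead time or variability is reduced, releasing working capital.
print(f"Safety stock at 4 days lead time: "
      f"{Z_FOR_95_PCT_SERVICE * daily_demand_std * sqrt(4.0):.0f} units")
```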
The second factor is the operations manager's fascination with the utilization of machines and people. A relentless focus on utilization reduces the focus on flexibility. No operation is perfect in an imperfect world, and planning for 100% utilization increases the exposure to unplanned disruptions. A small hiccup in the operation can trigger a ripple effect that lasts a significant amount of time, with loss of customer service, escalation of orders, complex decision making over available capacity and overall stress on people and machines. All of this creates nervousness in the supply chain and drives inventory up. In an imperfect world, the advantages of running an operation at 85-90% of capacity far outweigh the benefits of running at 100%. It provides flexibility to absorb unforeseen interruptions and supports the goal of excellent customer service at lower cost. Industrial engineers' process and line designs take demand variability into account before the manufacturing process is set up; the premise of the design is to enable the flow of material through the line, reducing latency within the production line. However, the rate of change in the business climate has increased severalfold over the years. The original typewriter had a lifecycle of over 30 years, while current technologies have much shorter lifecycles. Business processes and manufacturing lines have to be reconfigured every so often to keep themselves tuned to the new dance. Running an operation at less than 100% capacity provides the ability to react to and accommodate incremental changes in the business while maintaining excellent customer service.
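Basic queueing theory backs up this intuition. In a simple M/M/1 model the average number of jobs in the system is rho / (1 - rho), so waiting work explodes as utilization approaches 100%; the sketch below just tabulates that relationship with illustrative utilization levels.

```python
# Minimal sketch: why planning to 100% utilization is risky. In an M/M/1
# queueing model the average number of jobs in the system (waiting plus
# in service) is rho / (1 - rho), which blows up as rho -> 1.

def jobs_in_system(utilization):
    """Average number of jobs in an M/M/1 system at a given utilization."""
    return utilization / (1.0 - utilization)

for rho in (0.80, 0.85, 0.90, 0.95, 0.99):
    print(f"utilization {rho:.0%}: ~{jobs_in_system(rho):.1f} jobs in system")
```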
The third factor was the consolidation of shipments into large batches. This is another practice that looks efficient but can be counterproductive. Larger batches do reduce the shipping cost per unit, but there are inherent costs in accumulating goods to get there. Material has a tendency not to adhere to the planned timeline, and a delay to a portion of the shipment delays the entire shipment. Keeping goods in the warehouse while waiting for the batch to be completed increases the risk of goods being stolen, lost or damaged, and more resident goods increase overall material handling costs – tracking and tracing, inventory data entries, not finding the right material, and so on. A smaller shipment can keep the customer operating, while waiting for a large batch before shipping increases the risk for the customer: an unplanned event in transit can derail an entire large shipment, whereas shipping in smaller batches reduces the overall risk.
The benefits of promising in finer time buckets are great, but they require the discipline to bring stability, consistency, repeatability and sustainability to manufacturing. Operations managers should systematically address and eliminate the root causes of unreliability, run the operation smoothly and below 100% capacity, and ship in smaller batches.
I would love to hear from you about your experiences.
Monday, October 15, 2007
Running Operations Lean
The story told by Eliyahu Goldratt in his famous book “The Goal”, about a troop of scouts on a hiking trip, is an excellent example of the role of a bottleneck and its impact on the overall progress of the team. The slowest member in the chain controls the pace of the whole group, and the key focus of TOC (Theory of Constraints) is to manage and improve that slowest member, the bottleneck. TOC helps manufacturing companies identify the bottleneck and focus resources on elevating the constraint. Lean experts look at the same example a different way. Lean fundamentals explore and seek to understand the whole supply chain. Lean would have taken the same troop of scouts, determined the pace of each member, analyzed the “takt” time and balanced the carrying load across the team so that everyone moves at a uniform pace. The Lean approach to manufacturing is to ensure that processes are balanced and run to the takt time determined by customer demand.
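For concreteness, here is a minimal sketch of the takt-time view with hypothetical demand and cycle times: compute takt from demand, then compare each station against it. The station over takt is the one TOC would elevate and Lean would rebalance.

```python
# Minimal sketch (hypothetical times): takt time from customer demand,
# and each station's cycle time checked against it.

available_minutes_per_day = 450        # assumed: one shift minus breaks
daily_demand_units = 90                # assumed customer demand
takt = available_minutes_per_day / daily_demand_units   # minutes per unit

station_cycle_minutes = {"cut": 4.2, "weld": 5.8, "paint": 3.9, "pack": 4.6}

print(f"Takt time: {takt:.1f} min/unit")
for station, cycle in station_cycle_minutes.items():
    verdict = "OK" if cycle <= takt else "bottleneck - rebalance or elevate"
    print(f"{station}: {cycle:.1f} min ({verdict})")
```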
Lean principles enable the team to move at a uniform pace with a minimum amount of inventory (distance) between members. Lean takes a holistic approach to designing an effective supply chain (line balance) and recognizes the nuances of the supply chain constraints to create uniform flow across every unit in the chain. TOC, on the other hand, is a more focused approach that addresses the bottleneck. TOC is a good solution for factories that have less control of, or foresight into, the demand product mix and are continually re-planning to determine the optimal way to schedule the demand and meet delivery performance. In this model, the variability in production typically causes reliability issues across people, processes, machines and materials, making it harder to satisfy the four requirements of an efficient and effective supply chain: cost, quality, safety and delivery performance.
A good analogy for Lean and TOC practice is the human body. Every person has a different metabolism for absorbing calorie intake. In a Lean model, the person knows his or her metabolism and creates a corresponding diet plan. The diet plan, if maintained, ensures that the body remains lean. The diet becomes a habit, and all the corresponding activities – ensuring the right diet is available, planning regular health checkups – fall into place easily. A regular exercise schedule is laid out which ensures that the body stays in shape; the schedule is based on the variability in calorie intake, and if the variability is high a more strenuous routine is planned, which buffers the body against the variability. Variability in the diet is taken into account while planning the exercise routine. The TOC model is useful when there is no regimented schedule and the body has to react to changes in the diet. The emphasis is not on the calorie intake but on the process of burning the calories. If on a certain day the calorie intake is high, the body meter indicates that the exercise has to be more strenuous; if TOC determines that exercise is the bottleneck, it will plan for extra time on the stair-master to burn the extra calories. Lean and TOC both address the metabolism and try to keep the body lean. Lean dictates that the body's metabolism be understood and that all activities, including the diet and exercise plan, be set up to meet its requirements, while TOC quickly addresses the variability by attacking the bottleneck and making it more efficient. Both processes require continual evaluation to reset the regimen. Although it may look better on paper to focus only on the bottleneck rather than take a holistic approach, doing so causes uncertainty and irregularity in the schedule, which can become cost prohibitive in the long run.
The human body is the embodiment of a perfect manufacturing environment. It is a learning system that processes massive amounts of data, provides visibility, makes decisions and reacts to variability.
I would love to hear whether you agree or disagree.