“The history of lean manufacturing shows that it is usually feasible, if not downright simple, to eliminate waste from a process. This cannot happen, though, until the waste is recognized for what it is. One of Henry Ford’s key success secrets was the ability to identify waste that others overlooked even though it was (with the benefit of hindsight) in plain view…”
William A. Levinson, here.
In Henry Ford’s day, waste could be identified as something physical (e.g. inventory) or perhaps as the loss of our most precious commodity: time.
Our OSS/BSS can track the entire life-cycle of inventory items, so we have the information required to analyse physical waste. They can also track the time a process takes to step through its constituent activities, so we can analyse wasted time (to some extent, acknowledging that we only record a fraction of the possible time-efficiency metrics).
However, in the information age, there’s another measure of waste that is commonly overlooked even though it is (with the benefit of hindsight) in plain view – that of data. What would happen if we treated data waste as seriously as lost inventory?
“I recently heard that the typical organisation uses 0.05% of the data it collects. I haven’t been able to find the research that backs this up, but let’s assume that this is correct. This implies that 99.95% of the data that is collected, stored and (perhaps) curated is never utilised,” as indicated in this article.
OSS is primed for increased use of tools that can sift through data at scale (i.e. analytics, machine learning, artificial intelligence, etc.). If you could reduce the data waste from 99.95% to 95% (still not a big number), do you get a corresponding 100x increase in insights? Utilisation would rise from 0.05% to 5% of the data collected, a hundred-fold improvement. Even if the payoff is only 10x, an investment in data waste reduction still seems attractive.
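The arithmetic behind that claim can be sanity-checked in a few lines (a sketch only; the 0.05% baseline is the unverified figure quoted in the article above):

```python
# Sanity-check the data-waste arithmetic.
# Assumption (from the quoted article): a typical organisation
# utilises only 0.05% of the data it collects.

baseline_waste = 99.95   # % of collected data never utilised today
improved_waste = 95.0    # % waste after a hypothetical improvement

baseline_used = 100 - baseline_waste   # 0.05% utilised
improved_used = 100 - improved_waste   # 5% utilised

uplift = improved_used / baseline_used
print(f"Utilisation rises from {baseline_used:.2f}% to "
      f"{improved_used:.0f}%: roughly a {uplift:.0f}x increase")
```

Even a far more modest improvement in utilisation scales the same way, which is what makes the investment case interesting.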