“You are in the business of testing your guesses, not convincing yourself that you are right.”
Nathan Furr, in his book "Nail It Then Scale It"
OSS projects tend to have lengthy implementation timelines (months or even years on tier-one CSPs), so ours is not exactly the most responsive of industries. However, it's inevitable that the pace of change in the OSS industry (and the industries we support) will accelerate to the point where competitive advantage is derived from arbitrage, where fleeting opportunities (ie business models, supply chains, third-party offerings, cost differentials, etc) have to be leveraged for immediate gain. These opportunities may not be for us, or even for our customers, but could be for other individuals or organisations that aren't even our customers' customers (yet).
This means speed must become an imperative for the OSS industry. We need new paradigms built around speed, lateral vision and flexibility. We need to look across other verticals (eg high-frequency trading) for new techniques, as well as processing our internal data (and externally sourced data) in new ways. Machine learning is already in use within the OSS industry, but I can only see it being used more heavily in the future. In what ways do you think we can leverage it more?
Information arbitrage can only come about by getting instant feedback from your OSS and continually testing concepts that might lead to opportunities. Next-generation OSS can't come from guessing at desirable product features for CSP operational resources. It has to look further than that. It also has to take a data perspective rather than a feature perspective. We have to find ways of measuring and testing features/data rather than convincing ourselves that our guesses are right.
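As one hedged sketch of what "testing rather than guessing" can look like in practice, the snippet below runs a simple two-proportion z-test on a hypothetical feature experiment. All the names and numbers here are illustrative assumptions (not from any real OSS), but the technique itself, comparing a measured conversion rate between a control and a variant, is a standard way to let data, rather than conviction, decide whether a guess was right.

```python
import math

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Two-sided z-test for a difference between two observed rates."""
    p_b = successes_b / n_b
    p_a = successes_a / n_a
    # Pooled rate under the null hypothesis that A and B perform the same.
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical example: a redesigned self-service flow (variant B)
# completes 120 of 1000 attempts vs 90 of 1000 for the control (A).
z, p = two_proportion_z_test(90, 1000, 120, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A low p-value (conventionally below 0.05) suggests the measured difference is unlikely to be noise, so the variant's improvement is worth acting on; a high one says the guess hasn't yet been confirmed by the data.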