There is no question that technology is changing at a rapid rate, but so is what we expect from the technology we use. Previously, when purchasing a software package, our focus was on features and functionality, even at the expense of stability and quality. Today, consumers of enterprise software weigh User Experience (UX) and stability as heavily as an application's set of features. This shift in what users prioritize when selecting software directly influences where software vendors concentrate their efforts.
We often hear that technology is changing at an exponential rate and that what is possible today was not conceivable just 10 years ago. This growth has done more than make us more efficient or give us more ways to have fun. As our tools have become more and more sophisticated, what you and I expect from the software applications we use today, and the way they are delivered, has changed significantly.
Not too long ago it was common to use software applications that were unstable and crashed often. We all have fond memories of losing hours of work because the application we were using crashed or we were blessed with the “blue screen of death.”
In hindsight it seems absurd that we tolerated the crashes, loss of work, and instability. However, after reading The Innovator's Dilemma by Clayton M. Christensen, I realized we accepted the instability because most software was still considered to be in its disruptive, innovative phase. Any technology at this stage of immaturity has a user base that tolerates quality issues in exchange for more features and functionality.
When we used to buy software, we compared the number of features and did not put as much weight on the stability of the product. Since we as customers were making our buying decisions mainly on features, software vendors focused on adding more and more features without concentrating on the quality of those features, or going back to pay down the technical debt accumulated in previous releases.
Each release, with its growing set of unstable features, amplified our frustration with updating to the latest version. We finally reached the point where we no longer updated at all, despite all the “new features” available. We were more comfortable with the subset of features we knew how to use, and with their workarounds.
I do think this strategy of not always updating to the latest version was the right one for many organizations, as the efficiency gained from new features is sometimes offset by the new “bugs” introduced.
Times are changing, and in most cases have already changed; we as consumers are less tolerant of crashes and instability. This has pushed vendors to focus on quality, user experience, and usability rather than just adding new features. Each new release or update contains fewer “features” but many more productivity enhancements: bug fixes, improved stability, user workflow improvements, more automation, and a simplified GUI, all without sacrificing the quality and stability of the software.
We are now much more accustomed to updating our applications without worrying about how the update may negatively impact us. I do not remember the last time an update caused me an issue. This is especially true with the apps on my mobile phone: I have set most of my apps to update automatically to ensure I have the latest and greatest. Of course there is the occasional problem, but in those cases a fix usually ships within a day or two.
For enterprise organizations it is a bit more complicated, as there are factors beyond stability that cause updating to the latest version to happen at a much slower pace:
- Each update of a product needs to be propagated throughout the organization in one seamless rollout
- Cost of deploying products
- Changes in hardware requirements
- Other software requirements
- Custom-built tools need to be verified and updated
- Current workflows need to be verified to still work with the new update
However, enterprise organizations are realizing that keeping up-to-date with the latest software now offers much more benefit at much lower risk. These companies are changing their strategies and processes so they can update more regularly and leverage the new enhancements available with each release. Enterprise companies will never (at least in the near future) be able to upgrade as fast as smaller, more agile companies, because a single bug or workflow change can cost millions of dollars. Still, with more stable releases there has been a trend toward keeping applications as current as possible.
The landscape for enterprise organizations has changed just as much as the technology they use. Technology is allowing a new breed of competitors to enter their space, and to compete with these agile entrants, an enterprise organization needs to keep its software applications up-to-date so it can gain the full benefit of its investment. The risks of not updating have become higher than the risks of staying at the status quo.
The increased focus on quality during the buying decision has indirectly made the software we use, as well as its updates, more stable. With the reduced risk of applying updates, the benefit-versus-cost/risk calculation has changed, and we are now comfortable updating more frequently. Some enterprise companies have acknowledged this transition and are in the process of creating an environment that allows them to update regularly. The laggards who decide to stay with the status quo will have a much harder time staying competitive.