
Terminal Value

Technology Scaling – There’s No School Like The Old School

Doug Utberg

Business Growth Authority | Technology Strategy & Resourcing | Cost Optimization Expert | Business Process Architect | Financial Strategist | Founder - Terminal Value Podcast


Many companies in the current environment are struggling with how to deploy technology.

This makes sense: technology changes at an exponential pace, while company culture tends to change at a speed that fluctuates between slow and stop.  The result is a litany of pitfalls that companies can fall into during their technology transformation.

This is a big deal because technology transformation has become a critical priority for a multitude of companies; it is now ‘table stakes’ for competitiveness in the 21st century.  Effective technology makes it possible to execute business transactions far faster and at a much lower per-transaction cost.

So what’s the problem?

It turns out that getting to the point of fast, smooth, low-cost business transactions is a lot harder than you would think.  For the whole thing to feel ‘easy,’ all of the pieces need to be perfectly aligned and automated, with backup plans and contingency scenarios built in and active.

Technology in an ‘Old School’ Industry

A perfect example of this phenomenon takes place in the insurance industry.  Insurance has been around for hundreds of years and was previously executed with almost no technology at all, outside of pens and paper.

Unfortunately, this is a formula for an extremely expensive cost structure in the current business environment.  This has made ‘digital transformation’ a top priority for many insurance and financial companies.  However, there is a hitch … the processes that need to be transitioned into the digital realm are frequently very complex and some have very little standardization.

The problem this creates is that the data solutions required to accommodate these types of processes need to be highly powerful (i.e., expensive), and there needs to be a lot of sophisticated routing in place to ensure the right things are done in the right way (i.e., complicated).

Shiny Objects

In response to this situation, many technologists will look to solutions such as NoSQL, Cloud, or Blockchain to apply cutting-edge techniques to these technological transformation problems.

The issue is that these technologies all have specific use cases where they ‘can’ be useful, but it is not a foregone conclusion that they will always be the right solution.  Let’s walk through them one by one.

NoSQL

NoSQL is a broad term used to describe databases that store data in more than just a tabular format.  Their advantage is that they have much more flexibility in the way that data can be parsed or queried.  The reason for their rise in popularity was a rapid decrease in storage costs during the late 2000s that shifted the prime constraint of data systems from storage capacity to development resourcing.

The key factor that one has to consider when it comes to NoSQL is that flexibility comes at the cost of … higher cost.  (Pun intended)

Flexible data systems will be much more expensive to operate per machine cycle than systems that work with a more structured framework.  When working with small data sets, this difference is usually negligible, but it can escalate rapidly as the size and scope of your data set increases.
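The trade-off above can be sketched in plain Python (the records and field names are illustrative, not any real product's schema): a tabular store fixes the columns up front so a query is a cheap positional lookup, while a document store lets every record carry its own shape, so a query must probe each document individually.

```python
# Tabular: every row has the same columns, so a query reads a fixed position.
rows = [
    ("POL-1", "auto", 1200.0),
    ("POL-2", "home", 900.0),
]
auto_premiums = [premium for _, kind, premium in rows if kind == "auto"]

# Document-style: records vary freely, so a query must inspect each document,
# guarding against fields that may or may not be present.
docs = [
    {"policy": "POL-1", "kind": "auto", "premium": 1200.0},
    {"policy": "POL-2", "kind": "home", "premium": 900.0, "riders": ["flood"]},
]
auto_docs = [d for d in docs if d.get("kind") == "auto"]

print(auto_premiums)                      # [1200.0]
print([d["policy"] for d in auto_docs])   # ['POL-1']
```

On two records the difference is invisible; across billions of records and ad-hoc queries, the per-document inspection is the "higher cost per machine cycle" described above.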

Cloud

Another key technology that has become a contemporary buzzword is cloud computing.

Fundamentally, migrating to the ‘cloud’ means running your systems in somebody else’s data center instead of managing one yourself.  For many companies, this is a perfectly rational choice, since they lack the expertise or the scale of need to justify building and maintaining their own data center.

However, it is important to understand that executing compute cycles in the cloud involves increased transmission latency and process overhead while your software runs underneath the base software of the cloud service provider.  Since cloud providers are for-profit enterprises, it makes sense that all of these costs need to be passed along to the end-customer and marked-up to generate a net profit margin for the provider.

The end result is a solution that is far more flexible than traditional data solutions, but that also has the potential to be more costly.
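The cost pass-through logic above is simple arithmetic. A rough sketch (all numbers are made-up assumptions for illustration, not any provider's actual pricing):

```python
# Illustrative cloud pricing: the provider's raw cost plus its transmission
# and process overhead, marked up to produce a profit margin.
raw_compute_cost = 0.040       # provider's cost per compute-hour (assumed)
transmission_overhead = 0.005  # network latency / hypervisor overhead (assumed)
markup = 1.30                  # 30% provider margin (assumed)

price_per_hour = (raw_compute_cost + transmission_overhead) * markup
print(round(price_per_hour, 4))  # 0.0585
```

Every cost layer, including the overhead you would not incur on your own hardware, gets multiplied by the margin, which is why per-cycle cloud pricing can exceed on-premises cost even when the provider is more efficient.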

Blockchain

Another item that has been in the news lately is blockchain, primarily because it is the engine behind Bitcoin and many other cryptocurrencies.  The problem created by the mania for blockchain is that many people ignore its overhead requirements.

Blockchain, like any other ‘proof of work’ protocol, is built around a validation methodology that intentionally discards a significant portion of the work done to validate the transaction ledger.  This means that the compute overhead to execute blockchain will be significantly higher than that of a method with a more linear design.
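A minimal proof-of-work sketch makes the discarded work concrete. This is a toy version of the hashing puzzle, not Bitcoin's actual implementation; the data string and difficulty are arbitrary:

```python
import hashlib

def mine(data: str, difficulty: int = 4):
    """Find a nonce whose SHA-256 hash starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1  # every failed attempt is compute that gets thrown away

nonce, digest = mine("ledger-block-1")
# `nonce` equals the number of hashes attempted and discarded before success;
# at difficulty 4 that averages tens of thousands of wasted hashes per block.
print(nonce, digest[:8])
```

Only the winning hash survives; every other cycle spent in the loop is the overhead the text refers to, and real networks multiply it across thousands of competing miners.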

The reason this has not come into focus thus far is that Bitcoin and other cryptocurrencies went through an exponential price expansion that made the blockchain overhead seem irrelevant.  However, no asset price can keep expanding exponentially forever.  At some point, there is always a mean reversion.

In the end, what we frequently see are ‘shiny objects’ that distract the attention of technologists away from business value. 

(Spoiler alert – business value is the only thing that really matters)

There’s Gold in the Old

This brings us to older protocols that are enjoying a resurgence in utility because of their ability to scale cost-efficiently.

One example of this is mainframe computers. 

Mainframes are one of the oldest compute solutions in existence, but there are situations where they simply cannot be beaten. 

The use case that comes to mind most prominently is for brokerage houses that need to execute financial trades in fractions of a second.  In this kind of environment, any unnecessary overhead that causes transaction latency cannot be tolerated.  The net result is scaled-out legacy technology solutions that are built on very old protocols with archaic user interfaces. 

At first blush, this seems to be antithetical to the purpose of technology, but it actually demonstrates it perfectly.  The objective of technology is to optimize business value – period.

Any use of technology that does not optimize business value should not be implemented.

Simple Solutions

In many cases, the best way to optimize business value is to simplify the business process instead of implementing more technology.

A prime example is health insurance claim processing.

Insurance claims are typically processed in batches based on inflow throughout the day.  This methodology dates back to the ‘pen and paper’ days when forms were manually collected throughout the day and entered all at once.

The result is that when all the claims are processed at once, an enormous strain is placed on the computing and network infrastructure to process and transmit them during a narrow time window.  What ends up happening is that the data center will be operating at less than 20% utilization for most of the day and at over 80% utilization during peak processing time.

In order to accommodate this process, enormous amounts of capital investment are required to build out the needed computing infrastructure.
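The capital problem falls out of simple capacity math. A sketch with illustrative numbers (claim volumes, window length, and server throughput are all assumptions):

```python
# Batch processing forces you to buy capacity for the peak, not the average.
claims_per_day = 100_000
batch_window_hours = 2          # all claims processed in a narrow window (assumed)
claims_per_server_hour = 5_000  # throughput of one server (assumed)

peak_rate = claims_per_day / batch_window_hours               # 50,000 claims/hour
servers_for_batch = peak_rate / claims_per_server_hour        # 10 servers

steady_rate = claims_per_day / 24                             # ~4,167 claims/hour
servers_for_continuous = steady_rate / claims_per_server_hour # under 1 server

print(servers_for_batch, round(servers_for_continuous, 2))  # 10.0 0.83
```

Under these assumptions the batch design needs roughly twelve times the hardware of a smoothed workload, and that hardware sits mostly idle outside the window, which matches the under-20%-utilization pattern described above.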

One way to address this problem is to migrate the process to the cloud.  This will certainly reduce the capital investment required, but it is likely to drive extremely high data fees during peak processing times, since those peaks will almost certainly coincide with batch processing being done by the cloud provider’s other clients.

Another way to address the problem is to change the processes so that claims are processed continually instead of batching them until a certain time of day.

There is no particular technological barrier to a change like this, other than the inertia of the status quo and the process/programming changes required to implement continual claim processing.
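In code terms, the change is small: instead of accumulating claims for an end-of-day run, claims are pulled off a work queue as they arrive. A minimal sketch (the claim IDs and `process_claim` function are hypothetical placeholders, not a real claims API):

```python
from queue import Queue

def process_claim(claim_id: str) -> str:
    """Placeholder for whatever adjudication work a real system would do."""
    return f"{claim_id}:processed"

inbox: Queue = Queue()
for claim_id in ["CLM-1", "CLM-2", "CLM-3"]:
    inbox.put(claim_id)  # claims arrive spread across the day

results = []
while not inbox.empty():
    results.append(process_claim(inbox.get()))  # handled immediately, no batching

print(results)  # ['CLM-1:processed', 'CLM-2:processed', 'CLM-3:processed']
```

The queue absorbs arrival bursts while the workers run at a steady rate, so utilization flattens toward the daily average instead of spiking at a cutoff time.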

In the end, technology is and always must be about business value: how do we create, enhance, and optimize it?  Any conversation you are involved in that does not revolve around this one central idea is almost certain to be another shiny object that will distract attention away from the only thing that has ever made technology valuable in the first place.  (Spoiler alert – business value is the only thing that really matters.)

Key Contributor – Tony Rapp (Cambia Health)

https://www.linkedin.com/in/tony-rapp-6880a21/
