Wednesday, November 18, 2009

Virtualization

In today's enterprise-level data centers, virtualization is the term that describes the abstraction of tangible computer resources. It is essentially the separation of physical computing equipment (processors and hard drives) from the applications that rely on it. A "virtual" application allows a piece of software compiled for a specific computer to run unmodified on different computers and operating systems; multiple operating systems can even safely coexist on one physical machine. In an economic contraction such as the one we're currently experiencing in the U.S., the virtualization of computing resources is becoming increasingly important to organizations. Because virtualization reduces costs and decreases dependence on physical hardware, a company's IT infrastructure no longer hinges on buying new servers and storage devices; instead, it uses existing hardware to increase productivity. While such productivity gains are an obvious, near-term advantage of virtualization, the technology has considerable implications if we consider how the computing landscape will fundamentally change with the separation of physical resources from applications.

Consider first that business innovation has accelerated in recent years with faster technology deployment, greater integration between disparate systems, and a rapid surge of world-class human talent into the high-tech industry. Because virtualization reduces the need for organizations to ramp up expensive physical resources in order to deploy new technology, new applications can be adopted and put to use much faster. As these new applications are integrated into a company's business processes, business capabilities are enhanced and overall efficiency increases. With the resulting improvements in productivity and collaboration, the business case for virtualization extends beyond cost savings and into organizational effectiveness.

Second, an organization's architecture becomes more flexible as a result of virtualized applications. Businesses that experience "spikes" in IT requirements (ticketing agencies during big concerts, toy stores at Christmas, health-care data centers during flu season, weather centers during hurricanes, etc.) no longer need to design their IT architecture around their maximum computing requirements. With greater reliance on virtualization, IT infrastructure becomes more resilient and can absorb demand spikes, so businesses can keep operating without fear of exhausting their IT resources due to limitations in physical infrastructure.
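A toy calculation makes the point. All the numbers below are hypothetical, and the model is deliberately crude: it just compares paying for peak physical capacity every day versus virtualized capacity that tracks actual demand.

```python
# Hypothetical week of demand (in "servers needed"); day 4 is a spike,
# like a ticketing agency on the morning a big concert goes on sale.
daily_demand = [40, 45, 50, 300, 55, 48, 42]

# Fixed physical infrastructure must be provisioned for the worst case:
fixed_capacity = max(daily_demand)
fixed_cost = fixed_capacity * len(daily_demand)  # server-days paid for

# Virtualized capacity can track demand, plus a small headroom buffer:
elastic_cost = sum(int(d * 1.1) for d in daily_demand)

# The spike dominates the fixed design but barely affects the elastic one.
print(fixed_cost, elastic_cost)
```

Under these assumed numbers the fixed design pays for 2,100 server-days to the elastic design's 636, which is the whole architectural argument in miniature: one rare spike should not dictate the size of the entire infrastructure.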

Third, and most fundamentally, consider that the core of a business process no longer relies on a machine, but rather on a piece of intellectual capital. What does this do to the business world in which we operate? Would the increased "liquidity" of intellectual resources in this cloud of processing power and data storage be bound by the same rules of supply and demand as all other economically driven processes? While it may sound rather far-fetched, I would like to propose that if a company's intellectual capital IS its product, and that intellectual capital is no longer bound by resource constraints in the physical realm, the traditional axiom of supply and demand no longer applies. If companies could break free from this fundamental productivity constraint simply by shifting their products to those of a more intangible nature, the business landscape would be changed forever.

Business productivity that relies on physical materials tends to scale linearly with infrastructure investment. The increased data liquidity that virtualization enables changes the way that data centers, and consequently businesses, architect themselves for the future. In addition to the aforementioned gains in productivity and technology deployment, the fluid and highly cooperative nature of virtualized environments will inevitably lead to symbiotic relationships between different business applications...an evolution of sorts. It is interesting to note that we have already observed such transformations in biological systems...but that's another blog post.

Tuesday, November 3, 2009

Market Correction

What exactly is a market correction? What is it that the market is correcting when this happens? Today, one might argue that assets have been overvalued, and that the irresponsible trading of financial derivatives built on these overvalued assets has led us to the current market correction. So what the market is actually correcting is the fact that it has been trading on value that may not actually exist. The next natural question, then, would be: what counts as value that does exist? Merriam-Webster defines value as "a fair return or equivalent in goods for something exchanged". If we apply this definition in the financial context, we would have to define the value of a derivative as the equivalent in goods or services of the asset that the derivative represents. To apply this to an example: I get a loan from a bank to buy a house for $1m. The bank chops my loan into ten $100K pieces, each with a higher future value, since I am going to pay the bank interest on the loan, and whoever buys a piece from the bank wants an investment in its future value. These ten pieces (or mortgage-backed securities) are traded in the stock market alongside other such future-value-based financial instruments. If I ultimately default on my loan and do not pay the bank, the bank no longer has the financial value necessary to fulfill its obligations on the securities it has traded. In this scenario, when did the bank determine value? Was it when it chopped up my loan and sold it, or when it decided whether I was creditworthy enough to get a $1m loan in the first place? If the bank didn't think I could pay, it would never have made the loan. But since it wanted my business (I was willing to pay interest, after all), it took a "risk" on me. So the value here must lie in the risk, right?
The less risk the bank has to take on me (determined by my creditworthiness), the more valuable my business is to it. Conversely, the more of a risk it has to take on me, the less valuable my business is to the bank. Isn't this all so simple?
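The arithmetic of the example above can be sketched in a few lines. The pricing rule here is deliberately naive (promised payoff weighted by the chance of repayment), and the rate, term, and default probabilities are hypothetical; only the $1m loan and the ten $100K pieces come from the example itself.

```python
def security_value(principal, annual_rate, years, default_prob):
    """Expected value of one piece of the loan: the promised
    principal-plus-interest, weighted by the chance the borrower pays."""
    promised = principal * (1 + annual_rate) ** years
    return promised * (1 - default_prob)

loan = 1_000_000                   # the $1m mortgage
pieces = 10                        # the bank chops it into 10 securities
piece_principal = loan / pieces    # $100K each

# The riskier the borrower, the less each piece is worth today
# (assumed 5% rate over 30 years, with two assumed default probabilities):
low_risk = security_value(piece_principal, 0.05, 30, default_prob=0.02)
high_risk = security_value(piece_principal, 0.05, 30, default_prob=0.30)

assert low_risk > high_risk  # value falls as risk rises, as argued above
```

The last line is the whole point of the paragraph: in this framing, the "value" the market trades on is just a bet on the default probability, and a mispriced `default_prob` is mispriced value.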

So I ask again: what is a market correction? What is being "corrected"? Is it the value the bank places on risk? If so, someone is not doing a very good job of evaluating my creditworthiness. To accurately value the risk of giving me a loan, the bank would have to be able to predict the future: whether or not I am likely to default. According to current methodologies for determining creditworthiness, the best indicator of the future is the past. Given my credit history (and probably my liquidity and other such metrics), the bank would likely conclude that I have excellent creditworthiness and am thus a low-risk candidate for the loan. However, how good are we at predicting the future based on the past? Has this methodology (of basing future predictions on past events) succeeded throughout history with enough frequency that we should base an entire market system on it?
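To see how little is actually in such a prediction, here is a toy scoring rule (entirely made up for illustration, not any real bureau's formula): a borrower's "score" is simply the fraction of past payments made on time.

```python
def toy_credit_score(payment_history):
    """Hypothetical scoring rule: fraction of past payments made on time.
    payment_history is a list of booleans, True = paid on time."""
    if not payment_history:
        return 0.0  # no past data, no basis for prediction at all
    return sum(payment_history) / len(payment_history)

# A spotless history yields maximum confidence...
good = toy_credit_score([True] * 24)
# ...a spotty one, less.
spotty = toy_credit_score([True, False, True, True, False, True])
assert good > spotty
```

Note what the rule cannot do: a score of 1.0 is a summary of the past, not a guarantee about the next payment, which is exactly the gap the rest of this post is poking at.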

The truth is that predicting the future is not possible. We can reduce our risks by leveraging knowledge of the past, but by placing a monetary value on that reduction of risk, we are creating a market system that will always be plagued by market corrections and other such crises. After all, what one person thinks the future holds is likely to be very different from what another does. Further, the criteria one person uses to predict the future may not be the past at all, but something very, VERY different. Spiritual guidance, even. I have a theory here: people who base their life purpose on spiritual guidance are more creditworthy than those who don't. Don't believe me? I'll bet you a million dollars I'm right.