Nonlinear Pricing: Theory and Applications / Edition 1

by Christopher T. May
ISBN-10:
0471245518
ISBN-13:
9780471245513
Pub. Date:
02/22/1999
Publisher:
Wiley
Hardcover

$69.95

Overview

One of the most striking applications of nonlinear science in recent years, nonlinear pricing uses cutting-edge technology to identify and exploit patterns hidden within the seemingly helter-skelter rise and fall of daily stock prices. Nonlinear Pricing sheds much-needed light on the principles behind this innovative view of reality and provides clear explanations of how it is employed to predict-at least partially-the unpredictable.

Beginning with an incisive introduction to the topic, May presents the roots of nonlinearity through the examples of calendrics, geometry, and music. He then illustrates the application and integration of various nonlinear technologies, including genetic algorithms, fuzzy logic, fractal imaging, and nonlinear dynamics, to such essentials as trading strategies, asset allocation, risk management, and derivative pricing and hedging. Along with practical methodologies and a wealth of real-world examples, this comprehensive resource contains a glossary of terms, a bibliography, and in-depth information on:
* Fractal analysis-power law distributions, fractional Brownian motion, and their relationships
* The Hurst Exponent-the KAOS screen and its practical implementation
* Resonance-time domain versus frequency domain, Brownian motion, and the Gaussian distribution
* Advanced concepts-Soros's Reflexivity, non-equilibrium economics, kernel of theoretical nonlinear pricing, May's Law, resolution and resonance

Written by one of the few practitioners using this breakthrough methodology to trade the markets successfully, Nonlinear Pricing fills an important niche in investment literature. It is a must-read for anyone seeking to understand-and capitalize on-twenty-first-century financial economics.

CHRISTOPHER MAY (New York, NY) runs TLB Partners, LP, an onshore hedge fund, and May Nonlinear US Equity Fund, an offshore fund.

Product Details

ISBN-13: 9780471245513
Publisher: Wiley
Publication date: 02/22/1999
Series: Wiley Trading, #65
Pages: 384
Product dimensions: 6.30(w) x 9.38(h) x 1.18(d)

About the Author

Christopher T. May is the author of Nonlinear Pricing: Theory and Applications, published by Wiley.

Read an Excerpt

Nonlinear Pricing
Christopher T. May
0-471-24551-8

A Toy Story for Wall Street

Las ideas no se matan!
Don't kill the ideas!
-Don Domingo Faustino Sarmiento,
President of Argentina, 1875

Almost everything you believe about characterizing financial-economic relationships is wrong. Explaining this statement would fill many books. It has already filled a few; it filled this one and it will fill others. Most MBAs and CFOs (chief financial officers) are trained to believe, not to know. Belief can be manipulated. Knowledge, on the other hand, is dangerous. This book is about knowledge of how the real world works and not how theoretical constructs assume it to be. We will examine how we got to this point where theory and reality do not mesh, what the differences are between old and new interpretations, and how we can better explain the real world. In the final analysis, it does not matter whether one believes a thing is true. The only thing that matters is whether a thing is true. That is all.
The journey to the new explanation is arduous, and not everyone will make it. The challenge is not physical; it is mental because we need new technology to help us-technology with which many may not be familiar. Complacency and the inability to make up one's own mind will be the biggest hindrances. Mirrors can be painful instruments of introspection. But persevere, because great advantage awaits those who reach the other side.
There is a central concept that will help you on your journey: In the digital age, we are no longer in the financial business as we know it traditionally. We are in the information business. Money is only a commodity.
Characterize financial relationships that have yet to be articulated and you will change information. Change information and you change the business.
To win, investors have to see things from a different perspective. John Maynard Keynes, in his General Theory of Employment, Interest and Money, observed that this kind of independence is seen as "eccentric, unconventional and rash in the eyes of average opinion. . . . Worldly wisdom teaches that it is better for reputation to fail conventionally than to succeed unconventionally." Mindful of Warren Buffett's admonition not to confuse conventionality with conservatism, the present text argues that a more realistic-that is, nonlinear-picture of reality is conservative, while extant beliefs in the form of linear constructs, though widely held, are merely conventional.

DRUCKER AND CONCEPTS

Articulating this change in perspective is Peter Drucker, who is credited with founding the discipline of management and who penned a prescient article, "The Next Information Revolution," in the 28 August 1998 Forbes ASAP. The introductory paragraph reads:

The next information revolution is well underway. But it is not happening where information scientists, information executives, and the information industry in general are looking for it. It is not a revolution in technology, machinery, techniques, software, or speed. It is a revolution in CONCEPTS.

Nonlinear pricing is a revolution in concepts. Drucker goes on to say that mensuration, or what we can measure, influences if not solely determines how we think about something. In financial economics, measurement is called accounting. Although double-entry bookkeeping dates from Fibonacci's Liber Abaci (1202), Generally Accepted Accounting Principles and economic theory are industrial-era legacies of the 1930s. GAAP was created by the Securities and Exchange Act of 1933 in response to investors' need for better information following the Great Depression. Neoclassical economic theory is the result of an attempt to make economics more rigorous by quantifying it. To do this, economics purloined the concept of equilibrium from physics. As the "what is measured" and "how it is thought about" of the 1930s, GAAP accounting and equilibrium theory are related. In the information age, they do not go away, but become specialized subsets of a broader nonlinear interpretation of reality. Improvements in traditional measurements, like yield management for airlines, and new measurements, like the Hurst exponent or nonlinear simulations, build this new interpretation. Measure a nonlinear time series and you change derivative pricing, hedging, portfolio management, and risk management. Better measurement puts us at increasing odds with extant theory and compels us to find new explanations. If that is not a change in CONCEPT, I do not know what is.
In financial economics, the transition to the electronic medium has highlighted the concept of discounting, as in the case of Amazon.com for books and on-line stock trading for commissions. Optimark extends discounting to liquidity in the markets by the clandestine matching of buyers and sellers. It began operation on the Pacific Stock Exchange in the autumn of 1998 and is scheduled to take effect in the over-the-counter market a year later. But the question has not been asked: what happens next? It is like asking what happens when emerging markets "emerge." At some point discounting must become passé. After discounting, the next conceptual leap will be to leverage the electronic infrastructure and the advances in science and mathematics to give us insight into pricing financial assets.
With new nonlinear assumptions in place, we can now move beyond discounting to "pricing." Pricing includes the concept of valuation, which holds that any financial instrument's value is tied to some aspect of economic performance. Nonlinear pricing has a role to play here by discovering metrics that are more indicative of performance for information-era companies than the industrial-era legacies that now exist. The real question is: to what sort of metric is value tied?
Beyond valuation, however, lies the inclusion of the dynamics of the markets themselves. As we know it, valuation is a static concept that occurs in a dynamic environment which is highly nonlinear-that is, the cause-effect relationship is disproportionate. Because of this, deterministic formulas per se do not apply; in lieu of formulas we have simulation. The goal of nonlinear pricing is the continuous running in real time of "what if" scenarios to capture the dynamics of the interaction between values and the market. The inherent assumption is that valuation and dynamics are "reflexive"; that is, they can affect each other, although most of the time the effect of valuation on dynamics is more pronounced than vice versa. A good though unfortunate parallel of the latter is politicians who "lead" by first taking a poll and then following the poll's findings.
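The simulation-in-lieu-of-formulas idea can be sketched in a few lines. The toy "what if" engine below is purely illustrative-the function name, parameters, and even the Gaussian shocks are assumptions of this sketch, not the author's method (the book in fact argues against Gaussian returns)-but it shows the workflow: fast-forward many hypothetical price paths and summarize where they land, rather than solving a closed-form equation.

```python
import random

random.seed(42)

def what_if(price, drift, vol, days, paths=10_000):
    """Toy 'what if' engine: run many hypothetical price paths in hypertime
    (fast-forward) and summarize the outcomes. Gaussian shocks are used here
    only for brevity; a nonlinear-pricing system would use a fat-tailed or
    persistent process instead."""
    finals = []
    for _ in range(paths):
        p = price
        for _ in range(days):
            p *= 1 + random.gauss(drift, vol)  # one simulated daily return
        finals.append(p)
    finals.sort()
    median = finals[len(finals) // 2]
    prob_up = sum(f > price for f in finals) / paths
    return median, prob_up

median, prob_up = what_if(100.0, 0.0005, 0.02, 60)
print(median, prob_up)
```

The point is the shape of the computation-simulate, then ask questions of the simulated distribution-not the particular random process plugged into it.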
Nonlinear pricing is a more realistic and thus more complex view of reality. Simulating results for the purpose of making investment decisions is a more sophisticated concept than discounting. Discounting is analogous to an example discussed in greater detail later, in which non-English-speaking employees in the early days of the semiconductor business sought a mechanistic interpretation of chip making-that is, a sort of intuitive approach that required no special education. However, the information age is driven by CONCEPTS. As our counterparts in engineering know, there is nothing intuitive about how electrons, or the pricing mechanisms in markets, sometimes behave. Without a background in financial economics and the sciences, there is little in this book that will appeal to a simplistic or mechanistic interpretation and the scientific legacy of Newtonian determinism which it represents. Without the background you cannot participate.
Blame it on Sir Isaac Newton, the first fellow to be knighted for scientific achievement. He is the primary reason we need so much science in this text on financial economics. You see, after the apple dropped on his head and he discovered gravity, Newton codeveloped the calculus in 1686. He did this by making a basic assumption about the world: that we can make an observation if all points in the universe are fixed. This came to be known as determinism, because if all points are fixed and a few are known, then we can determine all the others. After Cambridge, as "Adam Smith" (George J. W. Goodman) wrote in 1981 in Paper Money, the government gave Newton a sinecure post called Master of the Mint, which roughly corresponds to Chancellor of the Exchequer, or Secretary of the Treasury in the United States today. The same mind, using the same guiding philosophy that fixed physical relationships in science, fixed or pegged the currency in 1717! Goodman tells us, "One guinea, that is, 21 shillings, would be worth 129.4 grains of gold," and thereby, he [Newton] said, "We intend this currency to hold its value." Newton also lost a fortune in the South Sea Bubble.
This held until 1931, when Britain let the pound "float" because of a world crisis. In 1944, in an attempt to reestablish order, the Bretton Woods conference fixed world currencies, and specifically the new reserve currency, the dollar, at $35 per ounce of gold. Later, in August 1971, because of the oil crisis and the unsustainable demand for gold, President Richard Nixon took the dollar off the gold standard. Newtonian determinism was discredited in the scientific world in 1926 with quantum physics. Sadly, in financial economics we have been trying to interpret a relativistic and relational world with the wrong set of assumptions. Goodman saw the future in Paper Money, but the collateral concepts were not sufficiently developed to dismount the status quo. Listen to his words, now 17 years old, and hear their prescience.

When the atomic physicists began to describe the subparticle universe of quantum mechanics, they needed a description of the existing world as a base reference. They called it "Newtonian": classical, balanced, fixed. . . . Index funds, and beta and Modern Portfolio Theory-and three more computer-derived techniques now a-borning-are all quests for certainty, for orientation, for classical science, for the lost Newtonian universe. . . . The Newtonian universe is gone.

The mere floating of currencies and the introduction of quantum physics have eroded the fringes of Newton's very successful legacy, but they have not been enough to expunge its tattered remnants. It is to be hoped that nonlinear pricing is the final nail in the coffin of Newton's legacy in financial economics. To get to the heart of our argument we have to undo three centuries of complacency.
Once upon a time Newtonian determinism was both conventional and conservative, but now, with the advent of superior technology, it is only conventional. As the nonlinear paradigm begins to permeate the practitioner's world, keeping the old paradigm will ensure that you are with the herd. But you can no longer be considered conservative.
Finance, at both the corporate and market levels, is a complex adaptive process. Ever-increasing amounts of computational horsepower and scientific development allow us to characterize relationships of which we were previously ignorant, or which we suspected but ignored because direct proof and practical application were not feasible. This results in:

  • A fundamental change in measuring and illustrating the risk-reward relationship
  • Increased use of sophisticated information management
  • Adaptation of advanced techniques, common in other industries, to finance
  • Updated skills for people in the finance industry who wish to stay competitive

We are not concerned with investing in technology stocks, how computerization will improve your billing system, or other mundane tasks. Our inquiry is more fundamental.

SIMPLE FINANCE

John Holland, father of the genetic algorithm, once said, "It is little known outside the world of mathematics that most of our mathematical tools, from simple arithmetic through differential calculus to algebraic topology, rely on the assumption of linearity." The change to nonlinearity requires a re-statement of the traditional way we think about and express some common concepts.
Think of two simple examples in finance: correlation and volatility. In English, correlation expresses the concept of how much something goes up and down vis-a-vis something else. For example, futures on the S&P index are highly positively (near perfectly) correlated with the basket of stocks which comprises the S&P index. If the stocks go up, the futures very quickly go up by an almost identical amount, and if stocks go down, futures go down as well. The linear mathematical measurement of this correlation in statistics is known as regression, and its measure is called R². If R² is 1.00, then the movement of the S&P index, the dependent variable, is perfectly accounted for by the movement of the independent variable, the basket of stocks. If R² is only 0.87, then only 87% of the movement of the S&P index is described by the stocks, and something else accounts for the other 13%. Conceptually, characterizing the movement of one variable with another variable or group of variables is a good idea. The problem is with the assumptions of the math itself. R² assumes that the underlying distribution which describes the futures contract and the basket of stocks in the S&P index is Gaussian. The assumption is wrong. It was proved wrong as far back as 1964. Conceptually we are like the tailor who assumes that one size fits all. So why do financiers use R²? Out of habit perhaps, but not out of reason.
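The limitation is easy to demonstrate numerically. In the minimal sketch below (pure Python; the helper name is ours, not from the book), R² is computed as the squared Pearson correlation. A perfect linear relationship scores about 1.0, while a perfect but nonlinear relationship, y = x², scores 0.0-even though y is completely determined by x. The linear ruler simply cannot see the dependence.

```python
import math

def r_squared(x, y):
    """Squared Pearson correlation: measures only the LINEAR part of a relationship."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return (cov / (sx * sy)) ** 2

x = [-3, -2, -1, 0, 1, 2, 3]
print(r_squared(x, [2 * v + 1 for v in x]))  # ~1.0: perfect linear tracking
print(r_squared(x, [v * v for v in x]))      # 0.0: deterministic link, invisible to R-squared
```

A nonlinear dependence measure (the fuzzy and fractal tools discussed later) is needed to capture the second case.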
A more accurate and nonlinear measure uses "fuzzy logic." Why? Fuzzy logic does not make an assumption about the underlying distribution of the things being measured. It does not try to fit reality into a presized suit or a predefined mathematical framework. The simple definition of fuzzy logic is that it is a "universal approximator." We will cover it in more depth in Chapter 6. But fuzzy logic is not new; it was invented in 1965.
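The "universal approximator" idea can be illustrated with a toy one-input fuzzy system. Everything here-the names, the 0.25 patch spacing, the choice of y = x² as a target-is an assumption of this sketch, not material from the book. Five overlapping triangular membership functions, each tied to a constant rule output, reproduce the curve to within about 0.016 without ever assuming a distribution:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

CENTERS = [i / 4 for i in range(5)]       # patches centered at 0, 0.25, 0.5, 0.75, 1
RULES = [(c, c * c) for c in CENTERS]     # rule: "if x is near c, output c squared"

def fuzzy_square(x):
    """Weighted average of rule outputs -- a one-input Sugeno-style fuzzy system."""
    num = den = 0.0
    for c, out in RULES:
        mu = tri(x, c - 0.25, c, c + 0.25)  # degree to which x belongs to this patch
        num += mu * out
        den += mu
    return num / den

# Five overlapping fuzzy patches already track y = x^2 to within ~0.016 on [0, 1]:
worst = max(abs(fuzzy_square(i / 100) - (i / 100) ** 2) for i in range(101))
print(round(worst, 4))
```

Adding more, narrower patches drives the error toward zero-that is the universal-approximation property in miniature, with no Gaussian assumption anywhere.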
Nonlinearity does not mean the futures contract and the underlying basket do not rise and fall in synchronicity. They most certainly do. The relationship is what it is. Nonlinearity merely allows us to characterize more accurately all the disproportionate influences or "noise" in the market to help reduce that 13% difference, or what the index arbitrageurs call "tracking error."
Volatility is another popular concept in finance; it expresses the degree to which something goes up and down vis-a-vis itself. For example, a stock with 50% volatility, which may be considered high, is indicative of higher risk and therefore supposedly higher potential reward. Like the previous example, volatility is expressed mathematically through a Gaussian probability distribution. Distributions are described by "moments." The first moment is the well-known mean or average. The second moment is called variance and expresses the concept of how far observations deviate from the mean-hence the term standard deviation. The linear measurement of volatility is defined in mathematical terms as variance, that is, the standard deviation squared.
Thus, variance is the second moment of a distribution. If we assume a normal distribution, then the second moment is finite; in other words, it exists. If we actually measure the returns of a stock, as Eugene Fama did in 1963, we discover that the returns are not normally distributed: the tails of the distribution do not converge to the horizontal axis the way the Gaussian requires, and therefore variance is infinite. In other words, it does not exist. Poof! There goes volatility as we know it.
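The disappearing second moment is easy to see numerically. In this sketch (standard library only; a Pareto distribution with α = 1.5 stands in for a fat-tailed return series, not real stock data), the Gaussian variance estimate settles down as the sample grows, while the fat-tailed estimate never stabilizes-each new extreme draw jolts it, because the "true" variance it is chasing is infinite:

```python
import random
import statistics

random.seed(0)

def sample_std(draws):
    """Sample (population) standard deviation of a list of draws."""
    return statistics.pstdev(draws)

# Gaussian: the estimate converges toward the true value of 1 as n grows.
# Pareto with alpha = 1.5 (< 2): theoretical variance is infinite, so the
# sample estimate keeps jumping no matter how much data we collect.
for n in (1_000, 10_000, 100_000):
    gauss = [random.gauss(0, 1) for _ in range(n)]
    fat = [random.paretovariate(1.5) for _ in range(n)]
    print(n, round(sample_std(gauss), 3), round(sample_std(fat), 3))
```

This is the practical meaning of "variance does not exist": the number you compute is an artifact of your sample, not a property of the process.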
Nonlinearity does not say that stocks do not go up and down. They most assuredly do. Nor does it deny that risk may be generally commensurate with reward. However, nonlinearity suggests that using a stable distribution-or "fractal" distribution as it is sometimes called-that better describes reality is a superior economic model.
Both correlation and volatility are good descriptive concepts. But as they are presently defined, both assume a normal distribution. That assumption is wrong. It is the imposition of a linear ruler to measure a nonlinear world and, therefore, not very good science. To reiterate what Holland said, "It is little known outside the world of mathematics that most of our mathematical tools, from simple arithmetic through differential calculus to algebraic topology, rely on the assumption of linearity." He went on to say, "Polls, project trends, or industrial statistics, all of which employ summation, are only useful if they describe linear properties of the underlying systems. It is so much easier to use mathematics when systems have linear properties that we often expend considerable efforts to justify the assumption of linearity." The previous examples illustrate Holland's point exactly.
Unfortunately, the markets are not a linear system; they are a complex adaptive system, meaning that they adapt and change slightly with time. Businesses measure the degree to which they interpret and satisfy client needs with sales. Clients respond or give feedback by purchasing the product or service or not. Sales are the clearest example of a feedback mechanism in an economy. Viewed generically, an economy is an adaptive system and the concepts developed from other adaptive systems in other disciplines may have merit in financial economics.
Yet, we continue to attempt to justify using linear concepts to describe a nonlinear system. A scientist would probably say, if the theory does not fit the facts, change the theory. An enlightened student of the markets should ask, how do I make money exploiting a more accurate measurement of the risk-reward relationship while my competitors sleep?

A MILD CRISIS IN ECONOMICS

There is a mild crisis boiling in financial economics, which is the underpinning of virtually all investment strategy. It ought to be a major crisis but it is not one yet. Many recent (and not so recent) developments in other disciplines suggest that the field of financial economics is long overdue for some major revisions, not only in the way we conceptualize theory but also-and more importantly-in the way the financial business operates. Whether financiers will be ahead of the crisis or behind it depends on whether, and how, we interpret these developments and put them into practical application.
With an unprecedented flow of money being put into mutual funds by the investing public, good corporate earnings, and the Dow at a high, one might well ask, "What crisis?" The upshot is, as Will Rogers so aptly put it, "So much of what we know just ain't so." This is an unsettling admission, because finance, for all its supposed dynamism, is really quite a staid field when it comes to new ideas. Financial economics has done a good job of providing new products; however, the mathematical assumptions-the technology-which underpin those products are often 25 years old, or older.
The fact is that many of the underlying assumptions we use to construct our explanations of the financial-economic world are beginning to look highly questionable-and most certainly, severely limited-in light of developments in other fields, particularly computer science and mathematics. Probably the two biggest sacred cows are: (1) that the distribution of returns of any financial asset is a normal or Gaussian distribution, the familiar bell-shaped curve-all three terms mean the same thing, and the Gaussian distribution is the limit of Brownian motion, the mathematical name for the randomness we assume applies to financial time series; and (2) that markets exist only in equilibrium. The concepts of equilibrium and randomness are part of linear mathematics.
In financial economics, the only time the precision of linear mathematics accurately reflects reality is when the discount rate is fixed and applied to government debt. And then the math is applicable only if we, like Walter Wriston, the former chairman of Citibank, assume that countries do not go bankrupt. And even then, the assessments are far different for a banana republic or a developing nation than they are for the world's policeman and lender of last resort. In all other instances of applying linear mathematics to characterize relationships in financial economics, Einstein's famous quote applies: "So far as the laws of mathematics refer to reality, they are not certain. And so far as they are certain, they do not refer to reality." The certitudes expressed by linear math do not apply to the nonlinear and nonequilibrium reality of financial economics.

NONLINEAR PRICING

Nonlinear pricing is defined as any technological trading aid that acknowledges the nonlinearities exhibited by markets to more accurately characterize the patterns exhibited by traded assets. Nonlinear pricing comprises new technologies such as complexity theory, fuzzy logic, abductive logic, genetic algorithms, and loosely coupled sets to model the new paradigm of evolution or adaptation.
One possible opportunity to profit is the arbitrage between the pattern depicted by nonlinear pricing and the market's inability to detect the pattern as accurately. Moreover, it is an efficiency unlikely to be arbitraged away because of the number of variables involved, the varying investment horizons, and the technology gap among market participants.
Nonlinear pricing is based on the fact that markets are adaptive; that is, they change with time. Possible trades are run in hypertime-quicker than real time, like the fast-forward button on your VCR-to get an idea of how a pattern may evolve. The goal of nonlinear pricing is to quantify the relationship in time between multiple variables and their movements to gain some degree of predictability over probable future prices. This is the goal of any and all analysis in financial economics. The primary distinction between nonlinear pricing and prevailing views is the underlying nonlinear technologies used and the understanding that goes with them. The Hurst exponent is but a single statistic; thus it is only a small part, but the most visible mathematical tool of all the tools available in nonlinear pricing.
In many important respects, nonlinear pricing is a practical implementation of George Soros's theory of reflexivity, which he first wrote about in 1987. The theory of reflexivity holds that equilibrium-the assumed norm in classical economics-is but a special case of the more common disequilibrium state in markets and the more infrequent far-from-equilibrium state (e.g., market swings and crashes). Reflexivity goes one step further in describing the warp and woof of market activity. It states that practitioners' expectations can actually influence the markets themselves. Nonlinear pricing can take many variables and aggregate them into a nonlinear framework so that the concerted effect of those variables renders a stronger statement than any single metric can make individually, or than myriad individual metrics can make in a linear analysis. There is a strong basis for the principle of reflexivity in Heisenberg's uncertainty principle, which holds that either the momentum or the position of a particle can be known, but not both simultaneously. The logical upshot is that subject and object are linked. To observe a particle requires bouncing a photon of light off of it, and at the quantum level the light is strong enough to alter the particle's trajectory. The real problem probably lies in the separation of subject and object. This is also a problem in financial economics.
The concept of reflexivity can be partially visualized within the microcosm of trading an S&P futures contract versus the basket of underlying equities that the S&P index represents. The futures and the index are equivalent, though not identical. When the target spread is reached, a buy/sell or sell/buy signal is triggered. However, executing the trade will narrow the gross spread. The net spread will be affected by liquidity and execution delay and will not be known until all the prices at which stocks in the basket were executed are received. Moreover, the fact that your competitors have the same information at the same time means that you have to anticipate. If several firms do large trades, their collective views and resultant action will have a real but temporary effect on the markets. Views of the spread affect the markets, and views of the markets affect the spread. Thus, reflexivity is easily demonstrated: it is a recursive, or reflexive, relationship.
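The trigger logic in this microcosm can be sketched in a few lines. The function name, the fair-basis parameter, and the thresholds below are illustrative assumptions of this sketch, not a real index-arbitrage system; the point is only the mechanical shape of the signal that the reflexive feedback then acts upon:

```python
def arb_signal(futures_price, basket_price, fair_basis, band):
    """Toy index-arbitrage trigger. `fair_basis` is the carry-adjusted gap we
    expect between futures and basket; `band` is the threshold beyond which
    the mispricing is worth trading. Returns a direction string or None."""
    spread = futures_price - basket_price - fair_basis
    if spread > band:
        return "sell futures / buy basket"   # futures rich vs. the basket
    if spread < -band:
        return "buy futures / sell basket"   # futures cheap vs. the basket
    return None                              # inside the band: no trade

print(arb_signal(1305.0, 1300.0, 2.0, 1.5))  # spread = 3.0 -> sell futures / buy basket
print(arb_signal(1301.0, 1300.0, 2.0, 1.5))  # spread = -1.0 -> None
```

Reflexivity enters when many firms run logic like this simultaneously: their executions move both legs, which moves the spread, which changes the next signal.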
Paradigm shifts are typically recognized by disparate parties nearly simultaneously, each formulating to the best of their understanding a brick on the path to progress. Ed Peters and Tonis Vaga wrote Fractal Market Hypothesis and Coherent Market Hypothesis, respectively. The differences between these gentlemen's views and Soros's are primarily generational. Whereas Soros cast his theory of reflexivity in terms of classical economics, Peters and Vaga, as befits the schooling of their generation, have cast their theories in terms of the Modern Portfolio Theory of the 1970s. The terms to describe nonlinear relationships have grown up and become formalized within the hard sciences in the last two decades, and Peters and Vaga are facile in using them to describe market realities. Although each of the three theories has a specific thrust, their underlying commonalities are greater than their differences, and their collective effort to shift equilibrium-based economics toward a new nonlinear and nonequilibrium paradigm is undeniable.
Academic economics has moved much more slowly, although it has recently offered the concept of increasing returns. This is the opposite of diminishing returns and holds that something gets easier to sell the larger its installed base. A good example is the fax machine, which is not worth very much if you are the only person on the planet to have one. Conversely, the more widespread the fax machine, the more of a necessity it becomes. Other recent examples are the VHS-Betamax war, the QWERTY keyboard layout, and technology standards like Windows NT and Netscape browsers. The concept of increasing returns is a very nonlinear phenomenon.
The paradigm shift to evolution or adaptation is not limited to financial economics. In fact, much is derived from the microprocessor, which has evolved from the stand-alone personal computer (PC) to the network and now to the dynamics of networks. The dynamics of networks, where thousands of autonomous entities clandestinely interact, are remarkably similar to those of financial markets. Kevin Kelly provides a thought-provoking account in Out of Control: The Rise of Neo-Biological Civilization. Since biology has been our traditional source of evolutionary study, it would seem that biology would be a more powerful paradigm than physics for finance, in keeping with Philip Anderson's funnel remark. (Anderson, who received the 1977 Nobel Prize in physics, said that no amount of studying water molecules will help you learn about an emergent property like the funnel formed by draining water.)
Nonlinear pricing totally refutes the theory that a financial time series will be random 100% of the time-a tenet of both the Capital Asset Pricing Model and the Black-Scholes option pricing model. The theory is directly disproved via the KAOS screen in Chapter 3. But this proof, which illustrates the Hurst exponent, is only one of many technologies that can be used to characterize data. Nonlinear means that the input-output relationship is disproportionate. If three hours of studying results in a B grade on an exam, it does not follow that four hours will result in an A. A linear relationship, in contrast, is proportional. As a simple robot, a Coke machine is proportional in that three quarters go in and one Coke comes out every time. Stated differently, if relationships are nonlinear some of the time, they are also partially predictable. Exploiting this partial predictability is a central concept of nonlinear pricing.
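Since the Hurst exponent is the most visible statistic in nonlinear pricing, a minimal rescaled-range (R/S) estimator is worth sketching. This is a textbook R/S implementation, not the KAOS screen itself; the window sizes and names are choices of this sketch. H near 0.5 indicates uncorrelated increments (a random walk); H above 0.5, persistence (trends tend to continue); H below 0.5, antipersistence (mean reversion):

```python
import math
import random

def hurst_rs(series, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent H by rescaled-range (R/S) analysis:
    the slope of log(R/S) against log(window size)."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(series) - n + 1, n):
            chunk = series[start:start + n]
            mean = sum(chunk) / n
            dev = [x - mean for x in chunk]
            # cumulative deviation from the window mean
            cum, z = 0.0, []
            for d in dev:
                cum += d
                z.append(cum)
            r = max(z) - min(z)                            # range of cumulative deviations
            s = math.sqrt(sum(d * d for d in dev) / n)     # within-window std deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(math.log(n))
        log_rs.append(math.log(sum(rs_vals) / len(rs_vals)))
    # least-squares slope of log(R/S) on log(n) is the Hurst estimate
    m = len(log_n)
    mx, my = sum(log_n) / m, sum(log_rs) / m
    return sum((a - mx) * (b - my) for a, b in zip(log_n, log_rs)) / \
        sum((a - mx) ** 2 for a in log_n)

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(4096)]
print(round(hurst_rs(noise), 2))  # roughly 0.5-0.6 (R/S has a small-sample upward bias)
```

A persistent series-one whose H measurably exceeds 0.5-is exactly the "partially predictable" case the text describes.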
Much of what nonlinear pricing has to say modifies extant financial theory, because theory is the gestalt, weltanschauung, or über-view of how things are supposed to work. Unfortunately, the modifications are incomplete in the sense that they are not the final answer to everything, or what physicists call "the theory of everything." To understand market or physical behavior we have concepts and theory from which predictions are derived.
For example, Einstein's general theory of relativity, published in 1916, and quantum theory in its final form replaced Newton's theory of motion, which ceased to be credible by 1926. Relativity does give superior predictions; however, that is not the primary basis for its value. Similarly, in finance we have the Black-Scholes option pricing model, which assumes that the movement in time of a financial time series is totally random. The primary basis for the value of both theories is the depth of understanding that each gives us about reality-a reality which we do not necessarily experience directly.
To wit, Galileo was challenged by the Inquisition for championing the heliocentric theory of planetary motion, in which he claimed that the Earth and the other planets moved around the Sun. This Copernican theory replaced the Ptolemaic view that the Earth was the center of the universe. The questions he reportedly received were: "How do you know the Earth moves? Can you feel the Earth move?" No. No one can feel the Earth move, and it is relativity that tells us why. His answer, which he was forced to recant, was based on the fact that we can interpret reality symbolically and that observations can be reduced to an understanding, or theory. In modern terms, those symbols are mathematics processed by computers in cyberspace. The Roman Catholic Church did not care about the planets; it rightly perceived a rival interpretation of reality, based on mathematics, versus its own, based on divinity.
This discovery may be said to be the birth of modern science. Although it took three hundred years for Galileo's argument to win, with the Vatican finally admitting its error in 1992, his is the understanding we have today. We accept Galileo's interpretation because it is the most plausible explanation that we have. To understand why, we have to take a step back and look at the most modern interpretations of reality. What gives theory its validity over mere fact memorization or prediction is the understanding derived from it.
Yet many in financial economics are concerned only with short-term predictions. In the scientific world these people are called instrumentalists. Instrumentalists do not care for theory or understanding, only for correct predictions. In a very limited sense this view may seem appealing. However, taken to its logical conclusion it leads us in the wrong direction. Consider predicting, at the turn of the 20th century, that man would walk on the moon, or that interconnected computers all over the world called the Internet would exist. In 1900, computers and rockets did not exist. Without the requisite understanding-which is the scientific and financial ideal-correct predictions cannot lead very far, because there is no basis on which to interpret them.
The misunderstanding of the role of predictions may arise from the fact that subjecting a theory to an experimental test is part of the scientific process. That is, two theories are subjected to the same test and the theory with the affirmed result is adopted. Some theories, lacking any explanatory power, are rejected outright without testing because there is no thread of logic running through them-for example, the theory that jumping up and down will improve the performance of your portfolio.
As knowledge proliferates, better theories that provide a broader and deeper understanding of the world are constantly evolving. Often, the new theory that replaces the old augments our understanding of the reality that we inhabit and seek to master. Financial economics, as a derivative phenomenon, is quite far removed from first principles. First principles are the statements that can be made about the most fundamental constituents of matter or particles. The implicit hope was that once these first principles were understood, the rest of the universe could be explained in terms of them. Practitioners have, to their great credit, achieved some success in managing risk and return. Unfortunately, they cannot express much of their knowledge in terms of extant financial theory or mathematics. It is a hodgepodge of habits, rules of thumb, superstitions, instinct, and other attributes garnered from the master-apprentice relationship and experience. The explanatory gap between the economics professor and the bond trader is a chasm. Nonetheless, that operational hodgepodge is, or contains, some form of mental framework we call financial-economics theory. Progress ensures that the knowledge is deeper and structurally different.
A century ago, we did not have as firm a grasp of fiscal or monetary policy as we do now. Nor did we have equilibrium-based economics or fundamental analysis until the 1930s, or knowledge of derivatives until 1972. To make better theories of financial economics requires a firm understanding of the reality it inhabits. Financial theory stands to benefit greatly from the explanatory power of the theories of reality, which have improved dramatically in recent years. One of the problems in creating a superior theory is the philosophical approach taken when formulating it. The most popular approach for the past 350 years has been the reductionist approach. Inspired by Newton, reductionism attempts to reduce things into ever finer parts. It has met with great success. For example, we can explain the physical constitution of an inanimate object by going from piece to molecule, to atom, to particle, to subatomic particle. The success has not extended to animate objects, typically the subject of organic chemistry-better known as life. The reason is that knowledge of atomic behavior occurs at too fundamental a level to be applicable in describing the more complex interaction of atoms in the animate form of an animal. For example, knowing the weight of the carbon molecules that make up a goodly part of bears and humans will not help you if the bear decides to chase you. The philosophical conundrum is this: Reductionism has always said to tear the thing to be studied into finer parts. When that approach fails, what is the beginning point? Is it the bear, the ecosystem, or human interaction with bears? Similarly, in financial-economic analysis we start with the world and then focus on the national economy, the sector, the industry, the company, the division, and finally the product. We rip it into ever finer pieces. Real analysis, though, involves the mental abstraction to see what an increase in employment numbers and an increase in sales may bode for the price of a stock.
In other words, sometimes stocks go up or down on good news, and sometimes they go down or up on bad news, but the analysis needs to address highly nonlinear relationships across varying levels of resolution. However, reductionism solves only part of the problem.
The other part of the problem in recent years, in both financial economics and physics, was well stated by Philip Anderson. Anderson gave the example of the funnel that forms in bathtub water when the plug is pulled. The funnel that spontaneously forms is an emergent phenomenon. Emergence is high-level simplicity arising from low-level complexity. Emergence is an example of dynamics too complex to derive from first principles or extant physical or economic knowledge, but which nevertheless occurs. The emergence results in a state of self-organized criticality, or SOC. SOC describes how the simple rules that generate complex behavior in nature evolve into a poised, or critical, state. The change to a state of SOC is mathematically catastrophic in that the changes are discrete rather than smoothly continuous.
It is the movement in discrete units or quanta that lends strength to our later use of the concept of quantum physics in the markets. The evolution to SOC is autonomous. It requires neither a design nor an outside agent; hence it is self-organized. SOC exists as a result of the dynamical interactions within a system. SOC is the only known general mechanism to generate complexity, and the most robust paradigm we have to explain this complexity is biological evolution.
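Per Bak's sandpile model is the canonical toy model of SOC, and a minimal sketch shows how purely local rules, with no designer, drive a system to a critical state. The grid size, drop count, and toppling threshold of four grains are the standard textbook choices, used here as illustrative assumptions:

```python
# Bak-Tang-Wiesenfeld sandpile: drop grains at random; any cell holding
# 4 or more grains "topples", sending one grain to each neighbor.
# Grains at the edge fall off. Illustrative sketch of SOC only.
import random

def topple(grid, n):
    """Relax the grid until every cell holds fewer than 4 grains.
    Returns the avalanche size (number of topplings)."""
    size = 0
    unstable = [(i, j) for i in range(n) for j in range(n) if grid[i][j] >= 4]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue
        grid[i][j] -= 4
        size += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n:   # edge grains are lost
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return size

def simulate(n=10, drops=2000, seed=1):
    """Drop grains one at a time, recording each avalanche size."""
    random.seed(seed)
    grid = [[0] * n for _ in range(n)]
    avalanches = []
    for _ in range(drops):
        i, j = random.randrange(n), random.randrange(n)
        grid[i][j] += 1
        avalanches.append(topple(grid, n))
    return grid, avalanches
```

Once the pile reaches its critical state, the avalanche sizes range from zero to system-spanning, with a roughly power-law distribution-the discrete, catastrophic changes described above rather than smooth, continuous ones.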
The classic example of SOC in financial economics is Adam Smith's pin factory, where performance dramatically increased with specialization. One man drew the wire, another clipped, another formed the head, and so on, rather than each man performing all steps. Henry Ford would rediscover a similar benefit over a century later with the assembly line.
However, no amount of reductionist thinking, like counting water molecules, determining their mass, measuring the shape of the bathtub, and so forth, will help you to understand an emergent phenomenon like a funnel of water. Similarly, no amount of reductionism in the form of reiterating the fact that XYZ Corporation just got a big order and can anticipate an increase in earnings per share will help you to understand the dynamics of the market. This is of concern to you because your client wants to know why the stock is dropping. It is a common rut in finance.
At the reductive level of knowledge, science and mathematics form a bedrock-whether it is counting water molecules or earnings per share. People cling to this bedrock, however useless it may be in the face of a different class or level of problem. The rut is to assume that all you wish to know and understand can be reduced to this sort of reductive thinking-the sort that is intuitive and easy to explain in a sound bite to a client who would rather be playing golf. It is not to be. Since reality is complex, we will need more advanced explanations to describe it. Money and information flows of an economy or a company may indeed be a reductionist concept in that dollars or bytes are concrete and discrete units. Unfortunately, flows, whether in the context of physics or finance, are dynamic. Moreover, these dynamics are nonlinear or disproportionate more often than they are linear or proportionate-if they are ever linear at all.
The second problem with reductionism is that it accepts explanations only in terms of cause and effect. This is a chicken-and-egg problem, which implies that an earlier cause is superior to a recent one-for instance, the trite "butterfly effect" in chaos theory in classical physics, where a butterfly in one part of the world causes a storm in another part because the weather is highly sensitive to initial conditions. Since you can never go back to the beginning of time and know the initial states with infinite precision, this avenue does not seem very fruitful. The reductionist cause-and-effect approach does not lead to any theory, either. To be an explanation or a theory means to be falsifiable. Since "the tape" of history or the stock market cannot be replayed, we are left with only a historical narrative or, worse, a justification for an ill-posed course of action. Higher-level or derivative sciences like evolution, financial economics, politics, and psychology cannot be treated in such a manner.
Perhaps of greatest importance in the type of thinking needed to fully embrace nonlinear pricing is the realization that the power to model is the power to experiment. And the power to experiment is the power to move from the retrospective of a historical narrative to a prospective or forward-looking analysis. By simulating scenarios and testing the effects of their interventions, regulators and investors can better tailor their impact.
In fact, the study of higher-level phenomena may enable us to understand emergent phenomena, of which the most important are life, thought, and computation. All three will have a role in the restated physics we need in order to begin to address financial economics in the nonequilibrium and nonlinear reality which it inhabits.
There is a counterpart to reductionism called holism. Holism would have us look at only the whole and not the parts. That is okay, but if we can reduce anything to its most fundamental constituent parts we can learn something and are thus obliged to do so, if only to form a substrate of knowledge. The danger is the extreme approach in which observations about properties of a system, exhibited at the most fundamental level or a higher level, are excluded in favor of convenience. In sum, if it happens, there must be a reason and that reason has to be explained.
At this point we will need to explore a little more about the links between neoclassical economic theory and physics. There are two direct links between financial economics and the space-time physics of Einstein. Space-time is the three dimensions of space and one of time; in reality, it is the same stuff. The introduction of space-time here is important since financial economics occurs strictly in the temporal realm. Later we will need to build on this concept to create a time-dependent arbitrage.
First is the concept of equilibrium, which is the basis for neoclassical economics. In physics, equilibrium refers to a stasis or period of stability. In financial economics, equilibrium says that for every buyer there is a seller. That is, the change in price is assumed to be smooth and continuous. Under this assumption, a market opening limit down, "trading limits" such as those on the New York Stock Exchange, and discontinuous prices, such as those that gap from 48 1/2 to 41 1/4, do not exist. Equilibrium also makes no allowance for markets that can and do occasionally crash, or that more often exist in a far-from-equilibrium state.
Second is the concept of diffusion, as in a gas diffusing randomly and evenly throughout a room, or milk in your coffee. Stocks are assumed to follow this diffusion in one dimension (because they move up and down), or two if time is included, rather than the three of a gas (length, width, and height). The mathematical model of this randomness is Brownian motion.
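The diffusion assumption can be sketched as geometric Brownian motion, the random-walk price model underlying Black-Scholes. This is a minimal sketch; the drift, volatility, and starting price are illustrative assumptions, not figures from the book:

```python
# One simulated year of daily prices under geometric Brownian motion,
# the diffusion model assumed by Black-Scholes. Parameter values are
# illustrative assumptions only.
import math
import random

def gbm_path(s0=100.0, mu=0.05, sigma=0.2, days=252, seed=0):
    """Simulate a daily price path: dS/S = mu*dt + sigma*dW."""
    random.seed(seed)
    dt = 1.0 / days
    prices = [s0]
    for _ in range(days):
        z = random.gauss(0.0, 1.0)   # the Brownian increment
        prices.append(prices[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                            + sigma * math.sqrt(dt) * z))
    return prices
```

Every step is an independent Gaussian draw: the model has no memory and no gaps, which is precisely the 100%-of-the-time randomness that nonlinear pricing disputes.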
The third link is so obvious that it is usually never stated. It is the limits of classical physics and not mathematics that determine what we can compute. The most evident example of this to date in conventional computing is the original Moore's Law, which states that the density of semiconductors will double every 18 months. Moore subsequently amended this in 1997 to increase its potency.
We will see that all three are wrong. Quantum physics, even though it has not yet encompassed gravity, is a more powerful paradigm than classical physics. In financial economics, equilibrium and randomness do not exist 100% of the time. Physics has a direct effect on Wall Street because of quantum computation. That is, quantum computers are more efficient than normal computers, which means that numbers can be factorized quickly. This efficiency means that none of the encrypted transactions based on factorization, such as those provided by the industry leader RSA Security Inc., are as secure as one might have previously thought. Factorization is the inverse of multiplication. It is trivial to multiply two large primes together, but it is hard to recover the two unique primes from their product. It is possible to factorize a 125-digit key. Such keys already exist and are classified by the U.S. government as "munitions" and forbidden for export, which is why you have to attest to Netscape that you are a U.S. citizen before you can download that functionality. With the addition of every digit, the difficulty increases by a factor of three. Thus, a 126-digit key takes triple the time of a 125-digit key to factorize. It is the unique factorization of two primes that enables the cryptographic key to work. It is not trivial to factor a 250-digit number. This problem is computable in that a solution exists. However, it is not tractable with normal computers, since it would take roughly a million years using a network of a million computers, or so estimated the computer scientist Donald Knuth. In the example above, computing is given a very specific meaning. If the term compute can be interpreted more liberally, then it can also include the last major economic expansion, caused by the steam engine in the mid-19th century, which is, literally, thermodynamics in motion.
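The asymmetry described above-multiplication is instant, factoring is slow-can be sketched with naive trial division. The function and the small primes chosen are illustrative assumptions, not a real cryptographic routine; actual RSA moduli are vastly larger and real factoring attacks far more sophisticated:

```python
# Multiplying two primes is trivial; recovering them from the product
# by trial division is slow. Toy sketch of the asymmetry RSA-style
# encryption relies on; not a cryptographic routine.
def factor_semiprime(n):
    """Return the two prime factors of n = p * q, smallest first."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n itself is prime

p, q = 104729, 1299709        # two known primes
n = p * q                     # the easy direction: instant
print(factor_semiprime(n))    # the hard direction: cost grows rapidly with n
```

Trial division already needs on the order of 100,000 divisions for this toy product; scale the operands up to hundreds of digits and the problem stays computable in principle but becomes intractable on classical machines, which is exactly the gap a quantum factoring algorithm would close.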
From this example we can see that in general, as technology progresses, physical laws and theories become more relevant not only to the development of technology itself but also to the realm of financial economics, where those physical laws find their expression in the form of pricing goods and services in the free market.
A financial industry that is increasingly moving on-line for cost reasons, and which needs security in the face of the reality of quantum physics, pits two huge forces against each other. Moreover, this example highlights the mismatch between two ways of looking at something: Wall Street-comprising reductionists who want to tear things into little parts, instrumentalists who want only predictions, and a group that wants only simple explanations for nontrivial problems-versus the counterintuitiveness of quantum reality. To be more precise, we have accepted quantum reality in limited applications such as lasers and microprocessors. What we have not done is logically conclude what those isolated instances mean for a broader view of reality. Extending existing facts to a logical conclusion may give us some crucial insight.
Ultimately, financial necessity will mandate an improved understanding, although I cannot say when, any more than I can say when quantum computing will become a commercial reality. The fact that quantum computing exists, though, means that the threat to encryption exists. If experience is any guide, then the understanding in financial economics will also be very uneven. We see the effects of this understanding in Intel's early experience in hiring people. Gordon Moore said something like, "When we first started out, we hired non-English speakers because of the simplicity of what we were doing. It was intuitive-you could sort of figure it out, like automobile parts. We can no longer do that because the technology has become too sophisticated." The implication is clear. There is nothing intuitive about how electrons behave. You cannot see them with the naked eye. The work is concept-driven and therefore intellectual. It is the province of the knowledge worker rather than the laborer. He concluded, "Without the background you cannot participate." The same is true in financial economics. In terms of nonlinear techniques, at our current stage of development, we are like the non-English speakers without formal education. We are trying, understandably but incorrectly, to deal with very complex phenomena in terms that are intuitive. After all, the linear approach, which is simple and intuitive, has gotten us this far.
It is rarely stated, but trust is also an essential element in progress. It is impossible, in terms of time or money, to verify every body of knowledge that affects us in modern life. For economic reasons alone, we simply trust knowledgeable people by their reputation, credentials, and previous successes, and believe that they will do their best on our behalf. Of course, assessing levels of trust is highly subjective. We implicitly rely on experts. The reality is like medicine, or any specialized body of knowledge: when you go for a checkup, you don't want to know all that the doctor knows; if you did, you would have gone to medical school, too. All you want to know is whether the doctor knows what he or she is supposed to know. It usually happens over time that we come to trust the people who are supposed to know. After all, you have never felt the Earth move.
Some of the concepts in this book may be far removed from the experience of daily life, yet they are logically constructed. They are no more visible to the naked senses than is an electron, but they are real and a contributory part of their acceptance will be based on trust. Refusing to understand and even accept is not a successful long-term strategy. To the client it means poor financial products, or as the late Emile Peynaud, professor at the Bordeaux Institute, said, "You drink the wine you deserve." The implication is chillingly clear. The client of financial products and services who wants superior results will have to demand them, but first will have to understand them. The free market works for the entire spectrum of quality. Bad wine and bad financial products exist only for buyers of inferior goods and services. The professional who admits ignorance and is no more capable than the client, leads the client to wonder, "What am I paying you for?" From a competitive point of view, if your competitor understands and you do not, what does that say about you?

A NEW UNDERSTANDING

Theories that apply to fundamental or emergent phenomena may thus be said to be low-level, like physics, or high-level, like biology and economics, respectively. But that does not imply a privilege in the sense of hierarchy. Rank is based solely on explanatory power. It is not a reductive "theory of everything" that we will find, but one taken from David Deutsch's The Fabric of Reality that is quantum-based. More specifically, Deutsch combines quantum physics with three other theories, which are high-level in relation to it. They are:

1. The theory of evolution as articulated by Darwin and Dawkins
2. Epistemology (the theory of knowledge) as articulated by Popper
3. The theory of computation as set forth by Turing
4. Quantum physics as set forth by Bohr

Deutsch's view may not be right in the sense of being final, but it is cogent and contains all the elements we need to look at economics for the 21st century. It also performs a great service in attempting to unify some seemingly disparate bodies of knowledge. His Fabric of Reality is far richer than boring old Newtonian physics.
Financial economics, as a complex adaptive process, exists in a web of all four theories. Historically, financial economics has been cast in terms of Newtonian physics. That explanation was right for its era. Science dropped the Newtonian paradigm in 1926, yet financial economics still clings to it. It is time to move on. Thus, to look at financial economics again, we first have to look at a modern interpretation of the world, and much has changed. It must also be noted that while each of the four theories has been accepted in part, their full ramifications have never been embraced, probably because the change would be too big. Moreover, the combination of the four theories of reality as set forth by Deutsch is by no means accepted at the household level, either. However, they made sense to me, which of course does not necessarily make them right. The conclusions drawn from them for financial economics are supported, but they are by no means generally accepted in the broad sense, either.
There is an important point to be made here. In the previous paragraph I said that economics was cast in terms of Newtonian physics. This is true specifically, since economics borrows equilibrium and diffusion, and it is true in general in the sense that we look at something by itself. It is as if we can step outside reality and examine it to the exclusion of everything else. Can we view the vital organ of a body independent of the body? This sentiment was humorously expressed by a mechanic who said to the doctor, "You know, doc, we really do the same thing." "Yes," the doctor replied, "but try it while the engine is running."
We need to be less like the mechanic and more like the doctor in our views. The markets are an extraordinarily complex phenomenon. No theory of financial economics exists independent of the physical laws of the universe or of the biological laws to which humans are subject. There is no fixed backdrop to view something because one cannot arbitrarily step outside of existence and acquire it like a target. Once we accept that we cannot do this, our Newtonian explanation of things, however comforting, crumbles. The view becomes more relational and we have to view things in terms of each other, not just as an extracted entity. Simply put, one does not get to stand on the bottom of the pool; one gets to tread water. The relational world foregoes the anchor of a Newtonian fixed backdrop that the mechanic enjoyed.
Fortunately, there is a sort of renaissance going on now and it is thought by many leading scientists that complexity in nature is the next big challenge. Complexity is the variety in a system, which in financial economics is sometimes called bounded rationality. Since financial economics does not exist independent of the world, it therefore seems natural and appropriate to introduce concepts from other disciplines that may help us understand our discipline better. Among the sciences, physics has done the best job of this, probably because physicists perceive everything as their subject. Perhaps we should be affronted by an intrusion, but personally I welcome a catholic taste in problem solving, regardless of its source. I once deadpanned to a scientist friend that just as astronomy became astrophysics in my lifetime, we will see economics become "ecophysics." He did not get the joke but after a solemn moment, nodded his head in agreement.
Our quest means we have to encompass some broad fields of knowledge and ask some big questions. Given an interdisciplinary approach in an attempt to formally describe the world, every reference is fair game. Remember, in choosing the nonlinear path, we are deviating from the linear path set into motion by Newton 350 years ago and the Western academic tradition built upon it. The panoramic nature of such an inquiry typically makes it remote from those people with a narrow or simplistic interpretation and those who tend to be on the consuming side of knowledge rather than on the producing side. One of the subjects raised will be religion. The link between the measurement of financial instruments and religion is trivial. The mathematics used to characterize relationships in financial economics is linear, in equilibrium, and based on the absolute view of Newton rather than the relative view of Leibniz, the other founder of the calculus. This absolute view has a parallel in Christianity and Judaism, where the Ten Commandments are absolute laws rather than relative ones, in that "Thou shalt not steal" makes no reference to situational ethics. By contrast, Eastern thought and religion tend to be more relational than absolute. We need to know more about the relational.
The errant thrust of many of these sorts of books is that in the supreme effort to "just tell the reader how to make money" and thus sell books, the scaffolding of understanding that goes with that moneymaking process tends to be omitted. Moneymaking without understanding is a dangerous business, as the contemporary examples of large derivatives losses attest. The other crutch is to throw a bunch of formulas at the reader, thus satisfying everyone. The editor, reader, and teacher all agree on formulas, like recipes in a cookbook. Unfortunately, nonlinear techniques are computational rather than analytic. There are no formulas to memorize. There are techniques, like genetic algorithms, and there are relationships, like the price of a bond to an index. In illustrating the interplay between techniques and relationships, the written word often does better than mathematical symbols, but not as well as three-dimensional graphics. If one wishes to extend the analogy of cookery, then real cookery is not about following any recipe; real cookery is about cognizing the relationships among ingredients, texture, portions, temperature, flavor, and so on for yourself.
This lack of formulas is one of the largest hurdles for most people in cognizing nonlinear pricing. Three hundred and fifty years of intellectual tradition has led them to expect that a formula always exists. In a larger sense, it has led them to expect that the nonlinear approach is sort of like the linear approach, with just a tweak here or there. It is more involved than that. Upon receiving notification of the Nobel Prize, Richard Feynman was asked by a Time reporter, "Can you tell us in a minute what you got the Nobel Prize for?" Feynman retorted, "Buddy, if I could tell you in a minute, it wouldn't be worth the Nobel Prize." Yes, you can try to "just make money" by pulling up a software program on your screen, but with nonlinear techniques, the better you want to be, the better your understanding will need to be.

SACRED COWS

It seems customary for the writers of popular books on science to put their cards on the table so you know their assumptions. One of the most fundamental subjects is God. Since we will slaughter a few sacred cows in this book, in the interests of full disclosure we need to look at all assumptions. Religion is often the first arena in a young person's life where the really big questions are addressed and encouraged. To a Westerner's eyes, I am an Episcopalian, although my religious studies are influenced by having lived in the Far East and on the subcontinent for several years. The lesson is that science and religion are not mutually exclusive. In fact, spiritual achievement is gained by inquiry and doubt, as achievement is in science, as well as by practice, as in any other discipline, like jogging. John Polkinghorne, a former particle physicist cum Anglican priest and president of Queens' College, Cambridge, makes this point in his book The Faith of a Physicist. It is a revealing and erudite account of a man who exists at the rare intersection of science and religion. Frank Tipler's The Physics of Immortality is also a thought-provoking account for readers who want to deal with high levels of abstraction.
From Hinduism, I commend Vedic Mathematics by Jagadguru Swami Sri Bharati Krishna Tirthaji Maharaja, the late Sankaracharya of Govardhana Math, Puri, India. This little gem reinforces the underlying unity of science and religion by deriving mathematics from Vedic scripture-certainly an alien intersection of subjects to most Westerners. Vedic Mathematics is interesting because it deals with the patterns of numbers. In nonlinear pricing we also detect patterns. Since the subjects we are addressing-risk, return, and time-are borderless, perhaps the origin of any knowledge should not be viewed with undue prejudice. Further study here may provide some insight into Ramanujan's thinking. Ramanujan was an early-20th-century savant who kept finding patterns in numbers, particularly the number 24. Ramanujan's work in the mathematics of renormalization group theory plays an important part in another advanced description of reality called string theory. Many such numbers exist, such as 8 and 18 in electron shells; 1/137, known as the fine structure constant in quantum physics; or, from chaos, Feigenbaum's constant, 4.669. More approachable for the Westerner are the slim but potent volumes The Holy Science by Swami Sri Yukteswar and The Science of Religion by Paramahansa Yogananda. In sum, I do not accept the answer "because that is the way God made it" as a substitute for scientific understanding. Nor do I see a conflict. There is a God. I believe in Him. There you have it.
The previous paragraphs are important for two reasons: first, because in the face of great uncertainty we, as a discipline, are bound socially to a common view, and second, to highlight the stance of the author on the anthropic principle. We will address them sequentially. Per Bak, the physicist who pioneered self-organized criticality, once queried geophysicists on the first point:

"Why is it that you guys are so conservative in your views, in the face of the almost complete lack of understanding of what is going on in your field?" I asked. The answer was as simple as it was surprising. "If we don't accept some common picture of the universe, however unsupported by the facts, there would be nothing to bind us together as a scientific community. Since it is unlikely that any picture of reality we use will be falsified in our lifetime, one theory is as good as any other." The explanation was social, not scientific.

Of course, there is a difference between earthquake prediction and financial economics, but the fact remains that financial economics and thus trade, by definition, are a form of social intercourse. While parochialism in interpreting phenomena in the information age is not our friend, woe to the person who dismisses commonly held social beliefs like the assumptions of linear mathematics in financial economics now held as articles of faith by a couple of generations of MBAs. That person's pronouncements may be met with a social response which in effect says, "You are cast out because you are not like us," rather than a scientific one.
Regarding the second point: The anthropic principle is the first of the three major lines of reasoning that can be pursued in attempting to explain something as broad and deep as nonlinear pricing. The anthropic principle reasons from man's existence that something exists. Examples include "because that is the way God made it" and the guilt experienced by survivors of a devastating experience such as war. The anthropic principle is illustrated because, had they not survived, they would not be able to question why they did. This is not our path. The second line of reasoning is that the laws of the universe are fixed. The failure of deterministic Newtonian physics highlights this problem. If laws are fixed for all time, how does anything evolve? If everything is fixed, how do we explain change? The third line of reasoning, that the laws of the universe actually evolve, is posited by Lee Smolin in The Life of the Cosmos. This avenue seems to be the most promising because economies and markets are complex adaptive processes and those processes are not static. They evolve. True, all time scales of change may not be that o

Table of Contents

A Toy Story for Wall Street.

Nonlinearity: A Retrospective.

Nonlinearity: A Prospective.

Fractal Analysis.

Results of the Hurst Exponent.

Nonlinear Technology.

Biology and the S&P.

Father Time.

Nonlinear Pricing-Advanced Concepts.

The Last Word-Resonance.

Appendix.

Glossary.

Bibliography.

Index.