herzmeister (OP)
Legendary
Offline
Activity: 1764
Merit: 1007


May 21, 2013, 10:25:34 PM 

Worth a discussion or two. http://arxiv.org/abs/1002.2284v2

I prove that if markets are weak-form efficient, meaning current prices fully reflect all information available in past prices, then P = NP, meaning every computational problem whose solution can be verified in polynomial time can also be solved in polynomial time. I also prove the converse by showing how we can "program" the market to solve NP-complete problems. Since P probably does not equal NP, markets are probably not efficient. Specifically, markets become increasingly inefficient as the time series lengthens or becomes more frequent. An illustration by way of partitioning the excess returns to momentum strategies based on data availability confirms this prediction.
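To make the "verified in polynomial time" vs. "solved in polynomial time" distinction from the abstract concrete, here is a minimal sketch (my own illustration, not from the paper) using subset sum, a classic NP-complete problem: checking a proposed answer is fast, but the only known general solving methods enumerate exponentially many candidates.

```python
from itertools import combinations

def verify(nums, target, certificate):
    # Verification: checking a proposed subset takes polynomial time.
    return all(c in nums for c in certificate) and sum(certificate) == target

def solve(nums, target):
    # Solving: brute force over all 2^n subsets -- exponential time.
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 34, 4, 12, 5, 2]
answer = solve(nums, 9)
print(verify(nums, 9, answer))  # True
```

If P = NP, something as cheap as `verify` would exist for `solve` itself, which is exactly the computational power the paper says an efficient market would have to wield.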




townf
Newbie
Offline
Activity: 42
Merit: 0


May 21, 2013, 10:49:07 PM 

Can you dumb that down for me?




townf
Newbie
Offline
Activity: 42
Merit: 0


May 21, 2013, 10:50:19 PM 

Pretend that my IQ is just in the 100's




notme
Legendary
Offline
Activity: 1904
Merit: 1002


May 21, 2013, 11:18:58 PM 

This paper will not be accessible to you if you aren't familiar with formal mathematics and computational complexity theory.
TL;DR: Either economists are wrong about their efficient market hypothesis, or computers have a shitload of computational ability that computer scientists just haven't been clever enough to exploit yet.




townf
Newbie
Offline
Activity: 42
Merit: 0


May 21, 2013, 11:30:18 PM 

This paper will not be accessible to you if you aren't familiar with formal mathematics and computational complexity theory.
TL;DR: Either economists are wrong about their efficient market hypothesis, or computers have a shitload of computational ability that computer scientists just haven't been clever enough to exploit yet.
As long i can still gits my DrUgZez with them ther bzitk01nz, sall good dawg. Word




notme
Legendary
Offline
Activity: 1904
Merit: 1002


May 21, 2013, 11:49:44 PM 

This paper will not be accessible to you if you aren't familiar with formal mathematics and computational complexity theory.
TL;DR: Either economists are wrong about their efficient market hypothesis, or computers have a shitload of computational ability that computer scientists just haven't been clever enough to exploit yet.
As long i can still gits my DrUgZez with them ther bzitk01nz, sall good dawg. Word

Way to represent the community.




Seal
Donator
Hero Member
Offline
Activity: 848
Merit: 1078


May 22, 2013, 12:09:27 PM 

This paper will not be accessible to you if you aren't familiar with formal mathematics and computational complexity theory.
TL;DR: Either economists are wrong about their efficient market hypothesis, or computers have a shitload of computational ability that computer scientists just haven't been clever enough to exploit yet.
As long i can still gits my DrUgZez with them ther bzitk01nz, sall good dawg. Word

Lol, i totally don't get this at all.




caveden
Legendary
Offline
Activity: 1106
Merit: 1004


May 22, 2013, 12:20:53 PM 

Either economists are wrong about their efficient market hypothesis
This hypothesis doesn't come from those who practice economics the way it should be practiced, i.e., aprioristically. They are commonly called "Austrian economists" or economists of the Austrian School, since the first members of this school of economics were Austrians. A quick search found me this article from Tom Woods: https://mises.org/daily/4904/Following-the-Efficient-Markets-Hypothesis-into-Absurdity (haven't read it entirely though, just linking it here to make it explicit that actual economists don't agree with this theory)




aigeezer
Legendary
Offline
Activity: 1450
Merit: 1013
Cryptanalyst castrated by his government, 1952


May 22, 2013, 01:09:05 PM 

Assuming that the author's proof is valid (I haven't checked), the important part would be: "Since P probably does not equal NP, markets are probably not efficient."
It is close to certain that P is not equal to NP; the author's "probably" is an understatement.
I used to know this stuff pretty well. It was a big deal in grad school, but that was a long time ago. Here's what I remember, with apologies to anyone who is really up on this stuff.
Mathematicians have been looking at this issue closely for generations. The basic issue is whether "nondeterministic" (massively parallel; infinitely parallel, if you want to get picky) problem-solving approaches can "break through" to solve very difficult problems of the type that get much harder as the size of the data set increases. Generally, problems that can be dealt with in "polynomial time" are considered tractable, say where the difficulty can be expressed with a formula such as 100n^3 + 53n^2 + some constant, where n is the size of the problem space. However, problems where n appears in an exponent are considered intractable: the difficulty increases so rapidly as the size of the data set increases that they in effect cannot be solved by any known method. They might be easily solved where n is a small number, but the issue comes into play as the data set size increases.
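A quick numerical illustration of that tractable/intractable divide (my own sketch, using the cost formulas above as stand-ins for running time):

```python
def poly_cost(n):
    # A polynomial cost, like the 100n^3 + 53n^2 + constant example above.
    return 100 * n**3 + 53 * n**2 + 1

def exp_cost(n):
    # An exponential cost: n appears in the exponent.
    return 2**n

for n in (10, 30, 100):
    print(n, poly_cost(n), exp_cost(n))
```

At n = 10 the exponential term is still smaller than the polynomial one, but by n = 100 the exponential cost exceeds the polynomial by more than twenty orders of magnitude, which is why such problems are effectively unsolvable at scale.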
To dip a bit deeper...
Suppose the problem at hand is to print "all" the prime numbers. A deterministic computer would use an algorithm such as the Sieve of Eratosthenes: calculate a prime, print it, calculate the next prime, rinse and repeat forever. An intractable problem. A nondeterministic computer could "magically" have "all" the primes available, ready for printing, but the problem would still be intractable because the print job would still never end; it remains intractable even though the hardest part (calculating the primes) seems to have been magically resolved.
The academic issue is whether the "problem space" for deterministic problems is the same as for nondeterministic ones. Does P = NP? Is it possible that we just haven't found the right way of looking at the two spaces yet, and that some day someone might prove that they are equivalent? It would be a VERY big deal if that happened (all NP-complete problems are polynomial-time reducible to one another, so a proof for one would be a proof for all), but it seems extremely unlikely by now.
Anyway, the takeaway on the original paper is still just "markets are probably not efficient" (if the author's math is sound).




benjamindees
Legendary
Offline
Activity: 1330
Merit: 1000


May 22, 2013, 01:46:39 PM 

This is an obvious conclusion. Nothing is perfectly efficient. Free markets that account for negative externalities tend towards efficiency, while centrally-planned economies tend towards chaos and collapse.

Civil Liberty Through Complex Mathematics



aigeezer
Legendary
Offline
Activity: 1450
Merit: 1013
Cryptanalyst castrated by his government, 1952


May 22, 2013, 02:40:56 PM 

This is an obvious conclusion. Nothing is perfectly efficient. Free markets that account for negative externalities tend towards efficiency, while centrallyplanned economies tend towards chaos and collapse.
Beware of "obvious conclusions". It is perfectly obvious that the earth is flat if you look out your window.

Anyway, obvious or not, the "efficient market hypothesis" is a widely held view in economics, and the exposition of it is much more subtle than "perfect efficiency". I happen to think it is false, but thinking something and proving it are two different things. The point of the article under discussion is that the author claims to have proof that markets are not efficient.

Wiki backgrounder on the efficient market hypothesis if you're interested in the gory details: http://en.wikipedia.org/wiki/Efficient-market_hypothesis




marcus_of_augustus
Legendary
Offline
Activity: 3920
Merit: 2349
Eadem mutata resurgo


May 22, 2013, 03:49:08 PM Last edit: May 23, 2013, 03:45:48 AM by marcus_of_augustus 

... his "weak form" definition of 'efficiency' seems suspect.
There are also robustness, dynamic response, incorruptibility and other properties that make free markets desirable.
Casting an old chestnut from economics into a rigorous classic math-problem framework like P = NP seems pretty contrived. Brain candy, but useful? Nup.




digitalindustry


May 22, 2013, 03:54:27 PM 

This paper will not be accessible to you if you aren't familiar with formal mathematics and computational complexity theory.
TL;DR: Either economists are wrong about their efficient market hypothesis, or computers have a shitload of computational ability that computer scientists just haven't been clever enough to exploit yet.
As long i can still gits my DrUgZez with them ther bzitk01nz, sall good dawg. Word

+1 gold :D

 Twitter @Kolin_Quark




hello_good_sir


May 23, 2013, 12:20:03 AM 

Wow, I had no idea that people put so much faith in the efficient-market hypothesis. I view it more as a rule of thumb, something along the lines of "the best estimate of tomorrow's price is today's price".




JamesTaylor
Newbie
Offline
Activity: 30
Merit: 0


May 23, 2013, 12:23:38 AM 

It's widely accepted in the scientific community that P != NP. It isn't proven, but most scientists "think" so.




caveden
Legendary
Offline
Activity: 1106
Merit: 1004


May 23, 2013, 07:52:57 AM 

There are also robustness, dynamic response, incorruptibility and other properties for why free markets are desirable.
The Austrians give very solid arguments for why free markets are desirable (there's no school of economics which stands more for free markets than the Austrian school), and they reject the efficient markets hypothesis.




