Another random plot:
This plot shows the mean
yearly rate of increase of the USD price of 1 BTC at Bitstamp, computed from various past dates to the present (specifically, to about jan/17/2014 23:00 UTC).
For example, the graph reads ~1000 around oct/17/2013. It means that the average price increase from that date to the present is equivalent to a 1000-fold multiplication (a 99,900% increase)
per year. More specifically, if we compare the price at that date (138.38 USD) with the price at jan/17/2014 (795.89 USD)
and extrapolate that rate of growth over one full year, we get ~1000 times the present price -- that is, ~795,000 USD by jan/17/2015.
On the other hand, the graph reads ~0.1 at about nov/30/2013. It means that between that date and jan/17/2014 the price fell to such an extent (from ~1100 to ~795) that,
extrapolated to one year, it would give a factor of ~1/10 (a 90% decrease); namely, a prediction of ~79.5 USD/BTC by jan/17/2015.
If we look instead at the change from ~jan/06/2014 to ~jan/17/2014, we get a factor of ~0.001, that is, a prediction of ~0.795 USD/BTC by jan/17/2015.
Finally, the graph reads 1 at about nov/24/2013. It means that the price at that date was the same as it is now, i.e. the rate of (non-)increase since then was 0% per year.
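For concreteness, the factors above can be reproduced with a short sketch. This assumes the plot uses simple annualized compounding, i.e. factor = (price now / price then) raised to (365 / days elapsed); the function name is mine, and the prices are the ones quoted above:

```python
from datetime import date

def annualized_factor(p_then, p_now, d_then, d_now):
    """Growth from d_then to d_now, extrapolated to one full year
    by compounding: (p_now / p_then) ** (365 / days elapsed)."""
    days = (d_now - d_then).days
    return (p_now / p_then) ** (365 / days)

now = date(2014, 1, 17)  # "the present" in the text

# oct/17/2013: 138.38 -> 795.89 USD over 92 days; factor comes out ~1000
print(annualized_factor(138.38, 795.89, date(2013, 10, 17), now))

# nov/30/2013: ~1100 -> ~795 USD over 48 days; factor comes out ~0.1
print(annualized_factor(1100.0, 795.89, date(2013, 11, 30), now))
```

A date where the curve reads exactly 1 is then simply a date whose price equals the present price, as in the nov/24/2013 case above.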
Presently I believe that this sort of analysis is quite useless. The market does not seem to care for past prices, presumably because experienced traders believe that other experienced traders do not care for them, and so on recursively.
The graph seems to justify this indifference. Logically, the price at nov/30 should be more relevant than that of oct/17. Yet, depending on the time span considered, the analysis may yield anything from a very positive to a very negative prediction. Which time span is the right one to use?