Bitcoin Forum
Poll
Question: How would you like the data to be processed?
Download a complete dump then process - 8 (80%)
Process during download - 2 (20%)
Total Voters: 10

Pages: « 1 2 3 4 [5] 6 »  All
Author Topic: Discussion for MtGox trade data downloader  (Read 14338 times)
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 18, 2013, 01:57:34 AM
 #81

Hi,

Thanks a lot, I'm already downloading and will post when I'm done.

Some things I wanted to ask you, since you have a lot of experience with this data now: how would you rate the data quality in terms of accuracy and correctness? Are there any known bugs or oddities? Does this data reflect all trades exactly as they went through on MtGox? That's something to consider if I want to use this data to develop a strategy (I need to read up first, to get better at this).

The data should be an exact representation of what happened in real time, or at least a microsecond-resolution approximation. By that I mean that it is possible multiple trades were executed in the same tick, but for all intents and purposes this shouldn't affect your use of the data for strategy development.

There are a couple of oddities with the Money_Trade__ values, but I don't think they will be particularly relevant to you. Otherwise, I think the data is relatively accurate.

It is also possible that there are some oddities around May 23rd 2013 and around today, December 17th, purely because I collected the data from 3 sources and those were the boundaries. I'm fairly sure there shouldn't be a problem, but if you want to be really safe then you can avoid those two days.

Lastly, the Primary column -- some trades are duplicated into other orderbooks, and the duplicates are marked with their Primary field set to false. It should be easy to exclude this data from your exports (and it should be excluded by default). Alternatively, you might want to include it, as I think these non-primary trades can influence other currencies, but I'm not too sure about this -- if in doubt, just exclude it.
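To illustrate the point above, here is a minimal sketch of filtering out the duplicated (non-primary) trades before analysis. The field names and sample values are assumptions for illustration, not the dump's actual schema:

```python
# Minimal sketch: drop duplicated (non-primary) trades before analysis.
# Field names and values are hypothetical; the real dump marks mirrored
# trades with a Primary field set to false.
trades = [
    {"tid": 1, "price": 715.7, "volume": 1_210_000, "primary": True},
    {"tid": 1, "price": 715.7, "volume": 1_210_000, "primary": False},  # mirror in another orderbook
    {"tid": 2, "price": 715.7, "volume": 780_000, "primary": True},
]

primary_only = [t for t in trades if t["primary"]]
print(len(primary_only))  # → 2
```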

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 18, 2013, 05:09:10 PM
 #82

I downloaded the 2nd file (899.2 MiB, but on my computer it shows as 1.07 GiB) and exported it to a .csv file - that worked. Great! Today I was playing a little with the data and reading your manual (very well written, kudos), but many things are still unclear to me:

What really confuses me is the whole complex of issues surrounding the timestamps. When I convert the file to a .csv, for example, the last entry looks like this when I open the .csv in Notepad:
 2010-07-17,23:09:17,0.049510,2000000000

When I import the same .csv into Excel and change the formatting of the cells to hh:mm:ss:ms (I could not find smaller units than milliseconds in Excel), Excel shows me this as the timestamp: 23:09:17:917

Several things are not clear to me here:
1. How can Excel display something which isn't even contained in the .csv file? Did Excel just make this up?
2. The thread says the data has microsecond accuracy, but when I look at the file in an editor it seems to be seconds. For example, the last 3 lines of the file are:

2013-12-17,15:47:30,715.700000,1210000
2013-12-17,15:47:30,715.700000,1000000
2013-12-17,15:47:30,715.700000,780000

3. Not so important, but maybe someone here has experience with Excel displaying timestamps: as far as I can tell, Excel (I use 2010) is not able to display a better resolution than milliseconds. But some of the timestamps in my sheet have 4 digits after the seconds, e.g. 18.07.2013 17:48:56:4856 - how is that even possible?

Quote
It is also possible that there are some oddities around May 23rd 2013 and around today, December 17th, purely because I collected the data from 3 sources and those were the boundaries. I'm fairly sure there shouldn't be a problem, but if you want to be really safe then you can avoid those two days.

If I understand you right, this file is a combination of 3 data sources. The 3 data sources are not mashed up day by day, but more like this:
from 07/17/2010: data source 1 (Mark Karpeles?)
from 05/23/2013: data source 2 (API from MtGox? Bitcoincharts?)
from 12/17/2013: data source 3?

So should I cut out just May 23rd and Dec 17th, or ± some days before and after?

Quote
There are a couple of oddities with the Money_Trade__ values, but I don't think they will be particularly relevant to you. Otherwise, I think the data is relatively accurate.

Before I posted here I downloaded the files from Google BigQuery. I noticed then that there were quite large jumps in the trade IDs. Are you referring to that, or could it be that things got messed up with the change of the "primary key" of the database after trade ID 218868? Probably not, right? I mean, they closed the exchange for 6 days back then to set everything up right...





The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 18, 2013, 05:39:49 PM
 #83

...

Hmm, some really strange stuff happening... Firstly, it showing up as 1.07 GB - I just realised that the tool will create a duplicate index; I should have thought about that. It's not really a problem except for taking a few minutes the first time you load it up and increasing the file size more than necessary; otherwise it shouldn't affect your usage.

Yeah, sorry about that - the format I was asked to support only went down to second resolution, so the microsecond resolution isn't present. In fact, microsecond resolution isn't even available for the first 218868 ticks. Unfortunately there are so many different possible formats I could export to, so I picked a few and stuck with them (although obviously the dump contains all the raw data unfiltered). If there's one you really want then I could release a new version of the tool with it supported, but if you want to manipulate the data into more formats, or do more than you can in Excel, consider playing around with Python and seeing what you can do with it :)
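As a sketch of why the exported CSV only shows whole seconds: assuming (per this thread) that later TIDs are unix timestamps in microseconds, a second-resolution export simply truncates the sub-second part. The function name and column layout are made up to match the CSV lines quoted above:

```python
from datetime import datetime, timezone

def tid_to_csv_row(tid_us: int, price: float, volume_satoshi: int) -> str:
    """Render one tick in a second-resolution CSV format like the tool's.

    Assumes tid_us is a unix timestamp in microseconds, which per the
    thread holds for ticks after #218868.
    """
    dt = datetime.fromtimestamp(tid_us // 1_000_000, tz=timezone.utc)
    return f"{dt:%Y-%m-%d},{dt:%H:%M:%S},{price:.6f},{volume_satoshi}"

# TID taken from a crash log quoted later in this thread:
print(tid_to_csv_row(1366547913512370, 715.7, 780000))
# → 2013-04-21,12:38:33,715.700000,780000  (the trailing 512370 µs are dropped)
```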

Excel is being very weird - if you notice, it's taking the minutes and seconds and converting them into a new millisecond value for some very strange reason, e.g. 17:48:56 -> 17:48:56:4856. There's no reason for it to do this.

Yes, the first data source is the Google BigQuery database, the second is the MtGox HTTP API, and the third is the MtGox Socket API -- basically the socket API is just used to collect the last few trades in real time. If you're going to cut them out, then you can cut out the last few minutes of the data (half an hour to be safe), and just the one day on May 23rd (although really there shouldn't be any discrepancy; the data should be exact).

Yes, the large jump is because MtGox changed recording format -- Money_Trade__, otherwise known as the TID/trade ID, used to be a (mostly) sequential integer, and became a microsecond timestamp afterwards (coinciding with that closure, I believe). It doesn't make a difference to the data though; all ticks should still be present, it's just a curiosity. Of course, this discontinuity will probably have had some effect on the prices around that time, so you might want to exclude a few days ± around that point, as it might mess up your backtesting.
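The format change described above is easy to detect programmatically. This is a heuristic sketch (not the tool's actual logic): early TIDs are small sequential integers, later ones are microsecond unix timestamps, so any TID above ~1e15 (≈ year 2001 expressed in microseconds) must be a timestamp:

```python
# Heuristic: classify a TID as sequential-integer style or
# microsecond-timestamp style. Threshold 1e15 µs ≈ September 2001,
# far above any sequential TID and far below any real timestamp TID.
def is_timestamp_tid(tid: int) -> bool:
    return tid > 1_000_000_000_000_000

# Hypothetical TID sequence straddling the cutover near #218868:
tids = [218_867, 218_868, 1_366_547_913_512_370]
first_timestamp_idx = next(i for i, t in enumerate(tids) if is_timestamp_tid(t))
print(first_timestamp_idx)  # → 2
```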

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 18, 2013, 10:17:06 PM
 #84

Hi nitrous,

Since my charting software didn't handle the date and time as two separate columns well, I recalculated them in another column as date+time, and then saved this as a .csv. At first glance it looks good, but is there any problem with this approach? Should I beware of anything (like leap years and other stuff that could mess up the calculation)?

Quote
It doesn't make a difference to the data though, all ticks should still be present, it's just a curiosity. Of course, this discontinuity will probably have some effect on the prices around that time, so you might want to exclude a few days ± around that point for that reason as it might mess up your backtesting.

I don't know if I got you right: you mean the data is correct, but the closing of the exchange for almost a week will have had its impact on the prices as people got scared - but with the data itself all is good?

I go to bed see you tomorrow...

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 18, 2013, 11:07:54 PM
 #85

...

Hi BNO,

I don't think so -- if you calculated the date+time as just the concatenation of the two, it should be fine, as I created the columns directly from the original unix timestamp (in dump #1) using a decent conversion function. How did you compute the column? Given the weirdness from Excel before, it's possible the Excel function you used might not work well - hopefully it will at least be consistent.
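A locale-proof way to rejoin the exported Date and Time columns is to parse the concatenated strings rather than doing spreadsheet date arithmetic, so leap years and DST can't skew anything (the exported times are already UTC per this thread). The function name is my own; the row is one quoted earlier:

```python
from datetime import datetime, timezone

# Sketch: rebuild a full datetime from the exported Date and Time columns.
def combine(date_str: str, time_str: str) -> datetime:
    return datetime.strptime(
        f"{date_str} {time_str}", "%Y-%m-%d %H:%M:%S"
    ).replace(tzinfo=timezone.utc)

dt = combine("2013-12-17", "15:47:30")  # a row from earlier in the thread
print(dt.isoformat())  # → 2013-12-17T15:47:30+00:00
```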

Yeah, anytime an exchange stops or starts, even for a few hours, the data either side will probably be a bit skewed -- both for emotional reasons (like being scared), but also mainly just as the price readjusts through arbitrage to the market price elsewhere (or at least, this is what I would expect).

Cya

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 21, 2013, 01:01:51 AM
Last edit: December 21, 2013, 01:33:20 AM by BNO
 #86

Hi Nitrous,

I wanted to write to you yesterday but didn't find the time.

One problem occurred in the price field when I looked at the data:


I remember that in the BigQuery table you had to divide the price field by 10,000. Here something seems to have gone wrong - do you know what?

I divided the volume field by 100,000,000 to adjust to full coins; might something similar have happened in this field too?

Bye

Edit: I just saw that the .csv your script created is fine; it must have happened during import to Excel. Do you have an idea what might have caused this?
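For reference, the scaling described above can be sketched like this. Per the thread, USD prices in the raw BigQuery dump are integers scaled by 10,000 and volumes are in satoshi (1e8 per BTC); the function name and sample values are illustrative:

```python
# Convert the raw BigQuery integer fields to floats:
# USD price ints divide by 10,000; satoshi volumes by 100,000,000.
def to_floats(price_int: int, volume_int: int) -> tuple[float, float]:
    return price_int / 10_000, volume_int / 100_000_000

price, volume = to_floats(7_157_000, 780_000)  # illustrative values
print(price, volume)
```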

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 21, 2013, 07:58:27 PM
 #87

...

Hi BNO, Excel gets stranger and stranger!

I just did a test CSV export myself and the CSV is definitely fine; I'm not sure how or why Excel is mangling the data like this. Dividing one column by 1e8 should not affect any other column, especially as inconsistently as the price column seems to be affected. What is the exact procedure by which you imported the data into Excel?

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 21, 2013, 08:02:01 PM
 #88

I just found this resource and I thought it would be useful to some people :) -- http://api.bitcoincharts.com/v1/csv/

I can't believe I didn't find it before. Anyway, it has regularly updated (every 15 mins?) CSV files with {unix timestamp (1 s resolution), price (float), volume (float)} fields for many different exchanges and currencies.
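Parsing that three-field format is straightforward; this is a minimal sketch, and the sample line is made up rather than real bitcoincharts data:

```python
import csv
import io
from datetime import datetime, timezone

# One bitcoincharts-style row: unix timestamp (1 s), price, volume.
sample = "1366547913,715.7,0.0078\n"  # hypothetical row

for ts, price, volume in csv.reader(io.StringIO(sample)):
    when = datetime.fromtimestamp(int(ts), tz=timezone.utc)
    print(when.isoformat(), float(price), float(volume))
```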

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 22, 2013, 12:43:08 PM
 #89

Hi,

I played with this. It was my fault - sorry if I caused extra work. The explanation of the exact problem might be interesting for people from Europe (like me; I'm from Germany) importing this into Excel.

The settings have to be like this:


The "decimal separator" (that's what it's called in the German version, translated to English) has to be "."

In my defence I have to say it wasn't obvious to me that this could have caused the error. The real trick was this: the combination of the thousands-separator setting and the decimal-separator setting led to the "weird" effect that all numbers below 1 were shown correctly but all numbers above 1 were not. Finally realising this led me to the point where I thought: "Hm, OK, it shouldn't be the decimal separator, since the numbers at the beginning are right - but let's check it again."

Sorry for wasting your time...

Greetings.

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 22, 2013, 12:50:53 PM
 #90

...

Hi BNO,

Haha, no, don't worry about wasting my time - that was really confusing! It makes a lot of sense now, although I'm not sure how Excel managed to distinguish between the columns, considering they use a comma for the column separator... The RFC should really standardise the CSV number format, as this wasn't very obvious. Anyway, I'm glad you've fixed the problem, and hopefully this will be useful for other people :)
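The failure mode above can be reproduced without Excel: with a German-style convention, "715.700000" is read as 715,700,000 (the "." taken as a thousands separator), while values below 1 survive. Parsing with an explicit separator sidesteps the locale entirely; the helper function here is my own sketch, not part of the tool:

```python
# Hypothetical helper: parse a decimal string with a known separator,
# instead of trusting whatever locale the spreadsheet happens to use.
def parse_decimal(s: str, decimal_sep: str = ".") -> float:
    if decimal_sep == ",":  # e.g. a German-style re-export: "715,700000"
        s = s.replace(".", "").replace(",", ".")
    return float(s)

print(parse_decimal("715.700000"))       # ISO/US style
print(parse_decimal("715,700000", ","))  # German style, same value
```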

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 22, 2013, 01:46:12 PM
 #91

Quote
RFC should really standardise the csv number format as this wasn't very obvious. Anyway, I'm glad you've fixed the problem now, and hopefully this will be useful for other people

Another thing that is logical in hindsight but a bit confusing: Excel automatically changes the "." to a "," as the decimal separator when you do it as described above, because that is the country setting (in Europe we use "," to separate decimals). If you now save the file again as .csv, Excel does not convert the "," back to "." - it leaves the ",". This leads to the problem you mentioned, that Excel could confuse columns; but then it's smart and changes the column separators from commas to ";", which is well thought out by MS. Still, you can be confused when importing into the charting software, thinking "But why is comma not working? I saved this as comma-separated..."

Now we've been through it all with Excel, eh? ;)
It should all be good now, but I don't have time to check it out, since I have to pack my stuff fast for "Driving Home for Christmas", and there I'll probably be pretty busy...

Which brings me to the point: Merry Christmas to you, nitrous, and to everyone on this board.

till later!

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 23, 2013, 01:32:06 AM
 #92

...

Thanks, Merry Christmas to you too :D

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
gigica viteazu`
Sr. Member
****
Offline Offline

Activity: 458
Merit: 250

beast at work


View Profile
December 29, 2013, 08:12:52 PM
 #93

The (Windows version of the) tool crashes after downloading about 100 MB of data, and the crash occurs every time at the same spot.

I have tried several times; it doesn't matter if I'm downloading a new dump or just trying to resume one.

Code:
Rows downloaded: 1024000
Latest TID: 1366547913512370
Data up to: 2013-04-21 12:38:33

Update in progress - 5074502 rows to download
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 29, 2013, 09:09:07 PM
 #94

...


A few other people seem to be having this problem (see the last couple of pages of this thread). I'm not entirely sure why it's happening, but I think Google have changed their protocols subtly and it's broken the tool. As a result, I've created some full dumps, including data all the way up to mid-December 2013, here.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
gigica viteazu`
Sr. Member
****
Offline Offline

Activity: 458
Merit: 250

beast at work


View Profile
December 29, 2013, 09:14:15 PM
 #95

thank you
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 30, 2013, 10:22:25 AM
 #96

How was the timestamp of the data created again? I know I read it somewhere but don't remember. Was it a UTC timestamp? Is there a problem with daylight saving time in Japan? As far as I understand, UTC is a constant "world time" and all timezones on Earth are derived from UTC.

Is anything known about problems with the timestamps, like Mark K. changing the time settings of the servers (for whatever reason) or something like that?

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 30, 2013, 12:47:27 PM
 #97

...

Have you found an inconsistency?

I'm pretty sure they should be consistent with UTC - I take the timestamps directly from those returned by each of the APIs, and I don't think there are any problems. If you think there are, though, tell me which ones and maybe I can have a quick look. If you're just speculating, then you should be fine. UTC is pretty easy to get right; his servers should just automatically sync with a reliable timeserver to get reliable UTC.
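The kind of sanity check described here can be sketched as follows: compare the dump's last tick against the known collection time; a gap of an hour or more would suggest a timezone offset. The last tick is taken from the CSV lines quoted earlier in the thread; the collection time is an assumed value for illustration:

```python
from datetime import datetime, timezone

# Sketch: if the dump's last tick and the wall-clock collection time agree
# to within an hour, there is no whole-hour timezone offset in the data.
last_tick = datetime(2013, 12, 17, 15, 47, 30, tzinfo=timezone.utc)   # from the CSV
collected_at = datetime(2013, 12, 17, 16, 0, 0, tzinfo=timezone.utc)  # assumed
skew = (collected_at - last_tick).total_seconds()
print(skew <= 3600)  # → True: no whole-hour offset
```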

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 30, 2013, 10:00:51 PM
 #98

Quote
Have you found an inconsistency?

No, I was just thinking about it, since it would be very destructive to look at trading data and not realize that it might be looking 1 or 2 hours into the future.

Quote
UTC is pretty easy to get right, his servers should just automatically sync with a reliable timeserver in order to get reliable UTC.

I hope/guess you are right, but when I look at Mark Karpeles' "track record", I'm not sure. What he has fucked up in the last 2 years is an achievement in itself.

Good night, it's late here...


The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 30, 2013, 11:27:32 PM
 #99

...


Hmm, well the last timestamp in the 2nd dump is consistent with the time I made it, so recent data at least looks consistent with UTC. I suppose the only advice I can give is to do a dry run of any algorithm you develop first, to see whether it makes the right choices over a period of, say, a week, before risking any real assets.

Personally I've been looking at other exchanges more recently, so you might want to consider other exchanges too, especially as you're worried about whether you can trust Mark. I'm quite interested in Kraken, as it looks like it has a lot of potential, but I'm primarily using Bitstamp at the moment. There are published CSVs for many exchanges here: http://api.bitcoincharts.com/v1/csv/ - although obviously then you have to trust bitcoincharts to have collected the data properly (I haven't analysed their data yet, but I assume it's fairly reliable; from a quick look, the Bitstamp one definitely seems to include the major price spike at roughly the right time).

Good night :)

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 31, 2013, 11:45:15 AM
 #100

Quote
Hmm, well the last timestamp from the 2nd dump is consistent with the time I made it, so it looks like recent data is at least consistent with UTC.

I'll just trust this data and hope I won't regret it one day.

Quote
Personally I've been looking at other exchanges more recently, so you might want to consider other exchanges too, especially as you're worried about whether you can trust Mark. I'm quite interested in Kraken as it looks like it has a lot of potential, but I'm primarily using Bitstamp at the moment. There's some published CSVs for many exchanges here http://api.bitcoincharts.com/v1/csv/, although obviously then you have to trust bitcoincharts to have collected the data properly (I haven't analysed their data yet, but I assume it's fairly reliable; from a quick look the bitstamp one definitely seems to include the major price spike at roughly the right time).


Oh, when I became interested in BTC again this year, one thing I was really clear about was: I won't put money into MtGox again. And I was happy for Bitcoin that there are now three big USD/BTC exchanges: btc-e, Bitstamp and MtGox. I tried Bitstamp (I really like the look and feel of the site, customer service replies fast, cashing out no problem so far), btc-e (service is a bit harsh) and Bitfinex. I have to say Kraken looks nice; maybe I'll give it a shot. Volume is probably low, though, and prices are a bit higher - withdrawal problems?

The reason I want to use MtGox data is that no other source goes back so far. And since MtGox has a significantly higher price than other exchanges, it's maybe not so wise to mix data between exchanges.


Did I already say it? That's a good link. Thanks.

Do you know anything about the timestamps in the Bitstamp data? I just downloaded it and assumed a unix timestamp. I converted it like this: A1/(24*3600)+25569, which gives me as the first trade:
13.09.2011 13:53:36

Is it right that Bitstamp was already around back in 2011? Wow - I should have known that back then; it would maybe have saved me some money ;)
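The Excel formula above (=A1/(24*3600)+25569) converts a unix timestamp to an Excel serial date; 25569 is the day count from Excel's 1900 epoch to 1970-01-01. The same conversion can be checked directly in Python, assuming the Bitstamp timestamps are UTC; the timestamp here is the one corresponding to the quoted first trade, not a value from the actual file:

```python
from datetime import datetime, timezone

ts = 1315922016  # unix timestamp for 2011-09-13 13:53:36 UTC
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
excel_serial = ts / 86400 + 25569  # BNO's formula: seconds → days, shift epoch

print(dt.strftime("%d.%m.%Y %H:%M:%S"))  # → 13.09.2011 13:53:36
print(int(excel_serial))                 # → 40799 (the Excel day number)
```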

The thinking that has led us to this point will not lead beyond - Albert Einstein