Poll
Question: How would you like the data to be processed?
Download a complete dump then process - 8 (80%)
Process during download - 2 (20%)
Total Voters: 10

Author Topic: Discussion for MtGox trade data downloader  (Read 14337 times)
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
May 31, 2013, 01:27:36 PM
 #1

Hello,

As you may know, I am developing a tool to download data from MtGox's bigquery database (see here for info on this db). The tool is being written in Python, but hopefully I will also be able to release it as a multi-platform, self-contained app for non-programmers. I also plan on supporting several different formats, e.g. phantomcircuit's sqlite, CSV, and a full sqlite dump, among others (it will also be easy for anyone who can program to implement their own).

The problem is that bigquery can be quite slow to sort this dataset, and downloading (and maintaining) the data using these ordered queries can quickly use up the free bigquery limits, so it is only really feasible to obtain the raw table data. This is good because it's quick and uses 0 bytes of processing quota; however, the data also comes back unordered. For database formats this doesn't matter, but for formats like CSV it's not really desirable to have records all over the place.

  • To solve this, I propose that the tool download a full and complete sqlite dump of the data, which can then be used on its own or transcribed (in order) to other formats, e.g. CSV, phantomcircuit's format, etc. Unfortunately, this will use more storage, as you will probably have at least 2 copies of the data. I estimate that the full dump will be about 460 MB, and other formats will depend on what data is included and in what format. It does have the benefit, however, that you will always have an up-to-date copy of the bq database, and so can generate whatever format you need, even if you hadn't anticipated it.
  • The alternative is to process the data while it downloads; however, this will lock you into the format you specify, and it will be in the order of the bigquery database, so the CSV format, for example, would be difficult to make effective use of.

As you can see, there are two options with both advantages and disadvantages. Please select which option you would prefer in the poll above. It would also be good to hear your thoughts/discussion on this tool.

In the meantime, you can download the partially complete tool, which is capable of downloading a complete dump to a sqlite3 database, here: https://bitbucket.org/nitrous/bq. Its dependencies are my sqlite class (pykrete, here) and the Google Python client library here.
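
For anyone curious about what the download step does under the hood, here's a simplified sketch (not the actual source): it pages through BigQuery's tabledata.list - which is free and quick, but unordered - and appends each page of rows to a local sqlite3 file. It assumes the Google client library is installed and that `http` is an httplib2.Http object that has already been authorised via OAuth2:
Code:
# Simplified sketch of the raw download: page through BigQuery's
# tabledata.list (free, fast, but unordered) and append each page of
# rows to a local sqlite3 file. Assumes the Google client library is
# installed and `http` is an httplib2.Http already authorised via OAuth2.
import sqlite3
from apiclient.discovery import build

def download_raw(http, db_path='dump.sqlite'):
    bq = build('bigquery', 'v2', http=http)
    conn = sqlite3.connect(db_path)
    conn.execute('CREATE TABLE IF NOT EXISTS trades (row_json TEXT)')
    page_token = None
    while True:
        resp = bq.tabledata().list(
            projectId='mt-gox', datasetId='mtgox', tableId='trades',
            maxResults=10000, pageToken=page_token).execute()
        # Each row arrives as {'f': [{'v': value}, ...]}; store it
        # verbatim for now and deal with ordering/formatting at export.
        rows = [(repr([f['v'] for f in r['f']]),)
                for r in resp.get('rows', [])]
        conn.executemany('INSERT INTO trades VALUES (?)', rows)
        conn.commit()
        page_token = resp.get('pageToken')
        if not page_token:
            break
    conn.close()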

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
May 31, 2013, 03:14:30 PM
 #2

N.B. On my ~8 Mbit connection, the full download took around 2.5 hours and resulted in a 457 MB db.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Loozik
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250


Born to chew bubble gum and kick ass


View Profile
May 31, 2013, 09:13:54 PM
 #3

I vote for "download a complete dump then process"

Thanks
whydifficult
Sr. Member
****
Offline Offline

Activity: 287
Merit: 250



View Profile WWW
June 01, 2013, 09:33:06 PM
 #4

Awesome work.

I am currently doing some research on how to get historical data from different exchanges to enable backtesting features. I want to use your tool to download everything into a SQLite db so that my bot can read this data. However, I am running into trouble installing your tool:

  • I have installed Python 2.7 (a requirement of the Google API client library)
  • I have downloaded the Google API client library and updated the client_secrets, and it's in folder A
  • I have downloaded bq and pykrete into folders B and C

Do I need to point bq to pykrete and the Google library, or do I need to put certain files in certain places? Maybe there is an installation doc I missed?

Gekko a nodejs bitcoin trading bot!
Realtime Bitcoin Globe - visualizing all transactions and blocks
Tip jar (BTC): 1KyQdQ9ctjCrGjGRCWSBhPKcj5omy4gv5S
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
June 01, 2013, 09:51:31 PM
Last edit: June 01, 2013, 11:05:34 PM by nitrous
 #5

Awesome work.

I am currently doing some research on how to get historical data from different exchanges to enable backtesting features. I want to use your tool to download everything into a SQLite db so that my bot can read this data. However, I am running into trouble installing your tool:

  • I have installed Python 2.7 (a requirement of the Google API client library)
  • I have downloaded the Google API client library and updated the client_secrets, and it's in folder A
  • I have downloaded bq and pykrete into folders B and C

Do I need to point bq to pykrete and the Google library, or do I need to put certain files in certain places? Maybe there is an installation doc I missed?

Sorry I wasn't more clear on this. To install a python package, you need to run
Code:
sudo python ./setup.py install
from the package's directory using the console/terminal/command prompt. I'm assuming you're on Linux or Mac here; if you're on Windows, omit the `sudo` part. Once you've installed pykrete and the Google library, you should be able to run mtgox.py and it should start the download Smiley It should also be able to resume previous downloads if you cancel it.

EDIT: When the tool is more complete and ready to be packaged up into a self-contained GUI app, all these dependency issues should be gone, as everything will be built in Smiley My first quick attempt to do this using PyInstaller didn't work, which is slightly ominous, but hopefully other tools such as py2exe and py2app will be able to handle the Google library properly.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
June 01, 2013, 10:17:41 PM
 #6

After talking with Jordan Tigani from Google's bigquery project, it seems that this tool may stop working at some point after MtGox starts regular updates, because Google occasionally performs coalesce operations that don't respect the order of rows in a table. The initial download should always work properly, but if a coalesce occurs, then future updates may corrupt the local dump. BigQuery does have plans to implement some way of respecting order, but it is not yet on the table. Perhaps if we made a case to Google about the importance of this we could reach some kind of solution? Otherwise it will end up costing around $60/month to keep local copies regularly updated, because `order by` queries will need to be run for each update instead of a simple (and free) tabledata:list. Running SQL queries would also slow the tool down.

The only other option I can envisage is someone performing a `SELECT * FROM [mt-gox:mtgox.trades] ORDER BY Money_Trade__ ASC` query every time the database is updated, saving it to a table, and then the tool would download from this table instead. If bigquery doesn't implement some kind of row order permanence, then I'll ask MagicalTux to consider doing this officially.
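
For reference, that kind of job would look roughly like the sketch below against the BigQuery v2 API. It's something only MtGox could run, since it writes into their dataset, and it again assumes `http` is an already-authorised httplib2.Http:
Code:
# Sketch: run the ORDER BY query and write the result into a
# trades_sorted table, overwriting it each time (WRITE_TRUNCATE).
# `http` is assumed to be an already-authorised httplib2.Http.
from apiclient.discovery import build

def rebuild_sorted_table(http):
    bq = build('bigquery', 'v2', http=http)
    job = {'configuration': {'query': {
        'query': 'SELECT * FROM [mt-gox:mtgox.trades] '
                 'ORDER BY Money_Trade__ ASC',
        'destinationTable': {'projectId': 'mt-gox',
                             'datasetId': 'mtgox',
                             'tableId': 'trades_sorted'},
        'writeDisposition': 'WRITE_TRUNCATE',
        'allowLargeResults': True,  # needed when materialising a full table
    }}}
    return bq.jobs().insert(projectId='mt-gox', body=job).execute()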

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
June 01, 2013, 11:00:04 PM
 #7

Maybe there is an installation doc I missed?

Hi again, I just updated the pykrete docs to be a lot more complete, and they now include some simple installation instructions Smiley

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Loozik
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250


Born to chew bubble gum and kick ass


View Profile
June 01, 2013, 11:24:35 PM
 #8

After talking with Jordan Tigani from Google's bigquery project, it seems that this tool may stop working at some point after MtGox starts regular updates, because Google occasionally performs coalesce operations that don't respect the order of rows in a table.

I read your discussion with Jordan Tigani (I do not understand it very well)

Quote from: Jordan Tigani
Coalesce happens on the order of once every 300 times you append data to the table.

Quote from: nitrous
The database is being maintained by a financial company, and they plan to start updating it automatically (up to every 10mins).

and have one comment: if the acceleration in the number of transactions (ticks) is maintained (you know, more people performing transactions more frequently), one can expect that in a few years 300 appends will happen within one second. A year ago millisecond timestamping was the standard for reporting ticks. At the moment microsecond timestamping (one millionth of a second) is being introduced by financial data vendors and exchanges. I do not think a 10-minute update interval is sustainable in the long run.

Otherwise it will end up costing around $60/month to keep local copies regularly updated because `order by` queries will need to be run for each update instead of a simple (and free) tabledata:list. Running SQL queries would also slow down the tool speed.

Is $60/month the cost to be incurred by MtGox / you or by every single user of this service?

If bigquery doesn't implement some kind of row order permanence, then I'll ask MagicalTux to consider doing this officially.

Couldn't MtGox just invest in servers / rent a data centre and pay a programmer like you to make a dedicated, tailor-made service, instead of spending many hours talking to Google (without a guarantee of success)? Time is money.

Then MtGox can simply provide a data API service to customers who will pay e.g. BTC 0.5 per month - it is standard practice in the financial industry for a data API to be provided to users (i) for a fee or (ii) in exchange for maintaining a certain balance in the account (e.g. if you want API access from the Dukascopy forex broker you must have a balance of at least $100k).
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
June 02, 2013, 06:47:13 AM
 #9

Hi Loozik - the bq database is not updated every time a transaction occurs. Instead, MtGox will upload all new trades that occurred since the last update, roughly every 10 minutes to 1 hour. Even if trades were occurring 300 times per second, each update would simply insert all of the new 180k-1.8m trades in one go, so there's no problem there. Also note that MtGox does use microsecond timestamps (see the Money_Trade__ field).

If there is no resolution to the problem, then up to $60/month would be incurred by the users of the service, but only if a user were to update their local dump every 10 minutes. Updating every 4 hours or less often would stay within the Google bq monthly quota, and would be free. I understand the usual forex policy for API access, but I'd like to avoid any changes that would charge for access.
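
To put rough numbers on that, using the ~460 MB table size from earlier in the thread (just back-of-the-envelope arithmetic, not exact billing figures):
Code:
# Back-of-the-envelope: GB scanned per month if every update has to run
# an ORDER BY query over the whole ~0.46 GB trades table.
table_gb = 0.46
for label, minutes in [('every 10 minutes', 10), ('every 4 hours', 240)]:
    queries_per_month = 30 * 24 * 60 // minutes
    print('%s: ~%d queries, ~%.0f GB processed per month'
          % (label, queries_per_month, queries_per_month * table_gb))
# every 10 minutes: ~4320 queries, ~1987 GB processed per month
# every 4 hours:    ~180 queries,  ~83 GB processed per month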

There are a couple of solutions I've already thought of, each with pros/cons, so I think it should be resolved soon.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Loozik
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250


Born to chew bubble gum and kick ass


View Profile
June 02, 2013, 06:56:55 AM
 #10

There are a couple of solutions I've already thought of, each with pros/cons, so I think it should be resolved soon.

Can you share your thoughts on possible solutions or is it too early?
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
June 02, 2013, 10:07:52 AM
Last edit: June 02, 2013, 11:50:15 AM by nitrous
 #11

There are a couple of solutions I've already thought of, each with pros/cons, so I think it should be resolved soon.

Can you share your thoughts on possible solutions or is it too early?

Update: added a fifth option (option 2 below, using a destination table)
Update 2: assuming that option 2 is feasible, MagicalTux has confirmed he will implement it Smiley

Possible solutions (with assumptions, pros and cons):

1. There are two tables, trades1 and trades2. When it is time to update, MtGox appends to trades1, then copies this table to trades2. Next time, they append to trades2 and copy to trades1. Both tables are identical, and the tool can download from trades1 every 10 minutes.
   Assumptions: copy operations preserve row order and prevent coalesce operations.
   Pros: free, simple, quick; the tool continues to work as it does now -- basically the ideal solution.
   Cons: none.
2. There are two tables, trades and trades_sorted. When it is time to update, MtGox appends to trades, then performs a "SELECT * FROM trades ORDER BY Money_Trade__ ASC" using trades_sorted as the destination table (in WRITE_TRUNCATE mode).
   Assumptions: the sorted table is guaranteed to remain sorted and won't be coalesced.
   Pros: the tool continues to work as usual (switched to download from trades_sorted).
   Cons: MtGox must pay up to $60/month to implement this.
3. Hybrid solution. The tool checks whether the trades table has been coalesced since it was last updated. If yes, it downloads by SQL query; otherwise it downloads using the normal tabledata:list method.
   Assumptions: it is possible to check for coalescence, and checking is free.
   Pros: since coalescence only occurs about every 2 days at its most frequent, usage should remain free; downloading from a query operation could be slightly quicker.
   Cons: a more complex tool (only slightly, though); query operations are slow (it can take ~100s to sort the entire table).
4. Google makes the coalesce operation order-preserving.
   Assumptions: changing the coalesce behaviour is easy and quick.
   Pros: no changes to the tool.
   Cons: assumes Google will change the way their service works just for our benefit.
5. Google implements automatic sorting for tables, and MtGox selects "Money_Trade__ ASC, Primary DESC" as the sorting order.
   Assumptions: Google would implement an entirely new piece of functionality.
   Pros: no changes to the tool; the functionality could be useful for many other bigquery users.
   Cons: a lot to ask from Google, more so than the previous solution.

As you can see, the first two rely on possibly pre-existing behaviour, and the last two rely on Google implementing new behaviour, so the first two are preferable. Unfortunately, Jordan's and my timezones intersect at awkward times, so correspondence can be a bit slow.
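
For concreteness, the copy step in option 1 would just be a single BigQuery copy job on MtGox's side, along the lines of this sketch (same assumptions as before about an authorised `http` object):
Code:
# Sketch of option 1's copy step as a BigQuery copy job: after
# appending to the source table, copy it over the other table so
# downloads always hit a freshly written (order-stable) copy.
from apiclient.discovery import build

def copy_table(http, source='trades1', dest='trades2'):
    bq = build('bigquery', 'v2', http=http)
    job = {'configuration': {'copy': {
        'sourceTable': {'projectId': 'mt-gox',
                        'datasetId': 'mtgox',
                        'tableId': source},
        'destinationTable': {'projectId': 'mt-gox',
                             'datasetId': 'mtgox',
                             'tableId': dest},
        'writeDisposition': 'WRITE_TRUNCATE',
    }}}
    return bq.jobs().insert(projectId='mt-gox', body=job).execute()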



Update to the development:

The basic underlying tool mechanism is pretty ready, todos are:
  • Make the tool more reliable and not at risk of corruption should bq return unexpected results
  • Create a graphical interface
  • Create documentation (if necessary)
  • Test out different python packagers for creating apps/exes to see which works with the necessary dependencies, and release self-contained apps for different platforms (windows, mac), as well as an archived distribution for linux/mac
  • Take feedback and make any necessary improvements

I have 2 weeks of exams coming up followed by a family holiday, so I won't be able to put in too much effort. Any python programmers though can feel free to fork my repo and work on these if you want Smiley

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Loozik
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250


Born to chew bubble gum and kick ass


View Profile
June 02, 2013, 02:27:35 PM
 #12

I have 2 weeks of exams coming up followed by a family holiday, so I won't be able to put in too much effort.

Enjoy your family holiday  Smiley
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
June 03, 2013, 07:22:21 PM
Last edit: June 04, 2013, 10:05:48 AM by nitrous
 #13

I have 2 weeks of exams coming up followed by a family holiday, so I won't be able to put in too much effort.

Enjoy your family holiday  Smiley

Thank you Smiley



I just heard back from Jordan that copying does respect the order, so option 1 should definitely be plausible. Creating a table in a single operation should also result in stable order access, so option 2 is also possible. The only uncertainty I have is whether the table needs to be deleted and recreated to 'reset' the coalesce operation, or whether just using a WRITE_TRUNCATE will be sufficient. Either way we should be able to do this quite easily now Smiley

Update: Perfect! WRITE_TRUNCATE will be sufficient, so it should be quite easy to get this going.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
8bitPunk
Member
**
Offline Offline

Activity: 70
Merit: 10



View Profile
June 11, 2013, 10:27:13 AM
 #14

Nitrous & MagicalTux - I really appreciate your effort in getting this data source available on big query.

I ran a query to pull all trades since 1 Jan 2013 and it looks complete up until 23 May 2013. However, I've also observed some gaps in the data where periods of hours or days do not have any trades (e.g. between 1367430878865520 and 1367547410438010). Yet there are 60K trades during that period in the data source, so my code has tripped up somewhere.

I just wanted to mention it in case it relates to the WRITE_TRUNCATE method to derive the sorted table. FWIW I queried the trades_raw table as a trades_sorted table wasn't available yet. I imagine that once you do have the sorted table then trades since 23 May will start being uploaded?

In the meantime I will use the python tool from Nitrous to get the complete dataset.

BTC 18bPunkuginRBm1Xz9mcgj8mWJnHDAW5Th | Ł LTCgXEdyBdoQ9WdF6JHi7Pa2EWtzbDjG76 | Ψ ATEBiTLkLpAYeW5hQknUfSvnb7Abbgegku
whydifficult
Sr. Member
****
Offline Offline

Activity: 287
Merit: 250



View Profile WWW
June 21, 2013, 05:27:44 PM
Last edit: June 21, 2013, 07:34:47 PM by whydifficult
 #15

Sorry for hijacking your topic but people using this tool might find this handy:

Using the database downloaded by the trade data downloader I wrote a tiny candleCalculator which calculates candles based on the trades in this database. I already calculated all hourly candles from Jun 26 2011 19:00:00 up to May 23 2013 16:00:00 (for BTCUSD, 16,699 candles), you can download them here.

If anyone wants to calculate their own candles check out the thread for the script.

(it's a small and easy script but you could always port it to python to make it part of the trade data downloader.)

Gekko a nodejs bitcoin trading bot!
Realtime Bitcoin Globe - visualizing all transactions and blocks
Tip jar (BTC): 1KyQdQ9ctjCrGjGRCWSBhPKcj5omy4gv5S
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
June 21, 2013, 05:47:19 PM
 #16

Sorry for hijacking your topic but people using this tool might find this handy:

Using the database downloaded by the trade data downloader I wrote a tiny candleCalculator which calculates candles based on the trades in this database. I already calculated all hourly candles from Jun 26 2011 19:00:00 up to May 23 2013 16:00:00 (for BTCUSD, 16,699 candles), you can download them here.

If anyone wants to calculate their own candles check out the thread for the script.

(it's a small and easy script but you could always port it to python to make it part of the trade data downloader.)

Thank you, that's great work Smiley I'll look into porting it to python and including it in the downloader soon if that's ok with you.
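
In the meantime, here's a very rough sketch of what an hourly-candle pass over the sqlite dump could look like in Python. The price and amount column names here are placeholders rather than the dump's actual schema, so treat it as an outline only:
Code:
# Rough hourly OHLC pass over the sqlite dump. `dump` and the
# microsecond Money_Trade__ timestamp come from this thread; the
# `price` and `amount` column names are placeholders -- adjust them to
# whatever the real dump schema uses.
import sqlite3
from collections import OrderedDict

def hourly_candles(db_path='dump.sqlite'):
    conn = sqlite3.connect(db_path)
    candles = OrderedDict()
    cur = conn.execute('SELECT Money_Trade__, price, amount FROM dump '
                       'ORDER BY Money_Trade__ ASC')
    for stamp_us, price, amount in cur:
        hour = int(stamp_us // 1000000) // 3600 * 3600  # bucket start (unix secs)
        c = candles.get(hour)
        if c is None:
            candles[hour] = {'open': price, 'high': price, 'low': price,
                             'close': price, 'volume': amount}
        else:
            c['high'] = max(c['high'], price)
            c['low'] = min(c['low'], price)
            c['close'] = price
            c['volume'] += amount
    conn.close()
    return candles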

Just an update for anyone interested in this project: I've just finished two weeks of exams, and I'm now going on holiday, but I'll be back by the beginning of July and then I'll be able to put in some more work on this, thank you for your patience.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
bitsalame
Donator
Hero Member
*
Offline Offline

Activity: 714
Merit: 510


Preaching the gospel of Satoshi


View Profile
June 26, 2013, 05:31:08 AM
 #17

we'll be going with solution #2. It seems the most reliable option Smiley

Sorry to hijack this thread but:
WHEN ON EARTH WILL YOU FIX THE ANDROID APP?
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 02, 2013, 07:05:01 PM
Last edit: September 13, 2013, 11:53:48 AM by nitrous
 #18

Hi everyone,

While I was away SerialVelocity helped to develop the start of a GUI, and I've now filled in some basic update functionality Smiley Here are some screenshots of where we're at now:



There is still much to do, however:
  • Custom dump location
  • Dump statistics
  • Graphical error handling
  • Exporting the data to different formats
  • Choosing what data to export
  • Saving export preferences so that exports can be routinely updated
  • Testing on multiple platforms
  • Packaging the app into a self-contained bundle

As you can see, there's a fair bit to do, but hopefully it shouldn't take too long.

Thanks,
nitrous

EDIT: To use this, run:
Code:
python gui.pyw

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
whydifficult
Sr. Member
****
Offline Offline

Activity: 287
Merit: 250



View Profile WWW
July 02, 2013, 07:20:33 PM
 #19

Looks great, will test it out tonight!

Thank you, that's great work Smiley I'll look into porting it to python and including it in the downloader soon if that's ok with you.

Yes of course! It would be much better to be part of your tool instead of being a separate script (relying on a separate dev environment). It is a pretty simple script but if you need help porting it just let me know. I would have done it myself if I was able to code something in Python (learning it is on my list).

--

Thanks for your contributions to the community Smiley We definitely need more initiatives like this.

Gekko a nodejs bitcoin trading bot!
Realtime Bitcoin Globe - visualizing all transactions and blocks
Tip jar (BTC): 1KyQdQ9ctjCrGjGRCWSBhPKcj5omy4gv5S
Loozik
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250


Born to chew bubble gum and kick ass


View Profile
July 02, 2013, 07:32:42 PM
 #20

Thanks for updating.
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 03, 2013, 02:35:04 PM
Last edit: September 13, 2013, 11:54:35 AM by nitrous
 #21



It seems that some windows users have been experiencing trouble with the initial authentication, so I've overridden part of the google code to create a GUI method that should work cross-platform. Please let me know if there are still problems authenticating with Google on any platform, or if the instructions aren't clear enough.

In addition, it seems that you do not need to supply your own client_secrets.json. Apparently, it is up to me as the app developer to do this, so there is no longer any need to dig through Google's API Console; I have incorporated the codes into the program, and you just need to complete the authorisation in a browser.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 06, 2013, 06:23:37 PM
Last edit: July 06, 2013, 07:06:56 PM by nitrous
 #22

Ok everyone, I've completed the tool!

You can download a binary for mac or windows here:
https://bitbucket.org/nitrous/bq/downloads

The source is still available here:
https://bitbucket.org/nitrous/bq
And you can build your own binary if you want.


The tool is ready to download from the bigquery database, and to export to a number of formats.

  • CSV (price/volume) [ISO]
  • CSV Candle [ISO]
  • CSV Candle [Unix]
  • Sqlite3 PhantomCircuit

The first option is the format Loozik needed: a CSV file with the headers date,time,price,volume. Date and time are in ISO 8601 format.

The next two generate candlestick data using a port of https://bitcointalk.org/index.php?topic=239815.0. The first one uses ISO date formats and the second, Unix timestamps.

The last should be compatible with PhantomCircuit's script http://cahier2.ww7.be/bitcoinmirror/phantomcircuit/.
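
If you want to script against the first format, reading it back in is just standard csv handling; a quick sketch, assuming the header row described above and a hypothetical filename of export.csv:
Code:
# Sketch: read the CSV (price/volume) [ISO] export back in. Assumes the
# date,time,price,volume header row described above and a hypothetical
# filename of export.csv.
import csv

with open('export.csv') as f:
    for row in csv.DictReader(f):
        # row['date'] and row['time'] are ISO 8601 strings
        price = float(row['price'])
        volume = float(row['volume'])
        print('%s %s %f %f' % (row['date'], row['time'], price, volume))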


Feel free to download, try it out, and let me know if you find any bugs. Please note, however, that the bigquery database is not yet being updated, nor is it sorted. When it is, any current dumps will probably be incompatible, so I suggest you don't download a full dump yet unless you need/want the data. When automatic bigquery updates do start, you will need to generate a new dump (File->New Dump...).

I haven't gotten around to documenting the usage of this tool yet, but hopefully it's pretty self-explanatory.



Windows DLL requirements:

There are a number of DLL files which py2exe seems to require. I think most of them come preloaded with Windows, except for the MSVC runtime. Try running the app first without downloading anything new. If it doesn't work, you should first try installing the following:

http://www.microsoft.com/en-us/download/details.aspx?id=29

If you still can't run it, tell me and I'll try to see what the problem is.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Loozik
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250


Born to chew bubble gum and kick ass


View Profile
July 07, 2013, 11:15:39 AM
 #23

Ok everyone, I've completed the tool!

Fantastic.

Please note, however, that the bigquery database is not yet being updated, nor is it sorted. When it is, any current dumps will probably be incompatible, so I suggest you don't download a full dump yet unless you need/want the data. When automatic bigquery updates do start, you will need to generate a new dump (File->New Dump...).

Do you perhaps know when MtGox will start updating the database?
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 07, 2013, 11:31:01 AM
 #24

Ok everyone, I've completed the tool!

Fantastic.

Please note, however, that the bigquery database is not yet being updated, nor is it sorted. When it is, any current dumps will probably be incompatible, so I suggest you don't download a full dump yet unless you need/want the data. When automatic bigquery updates do start, you will need to generate a new dump (File->New Dump...).

Do you perhaps know when MtGox will start updating the database?

Sorry, not yet. I've asked a couple times but I've yet to receive an official response.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Loozik
Sr. Member
****
Offline Offline

Activity: 378
Merit: 250


Born to chew bubble gum and kick ass


View Profile
July 07, 2013, 12:28:41 PM
 #25

Ok everyone, I've completed the tool!

Fantastic.

Please note, however, that the bigquery database is not yet being updated, nor is it sorted. When it is, any current dumps will probably be incompatible, so I suggest you don't download a full dump yet unless you need/want the data. When automatic bigquery updates do start, you will need to generate a new dump (File->New Dump...).

Do you perhaps know when MtGox will start updating the database?

Sorry, not yet. I've asked a couple times but I've yet to receive an official response.

Many thanks. Please let us know when you know something  Smiley
2weiX
Legendary
*
Offline Offline

Activity: 2058
Merit: 1005

this space intentionally left blank


View Profile
July 07, 2013, 12:31:28 PM
 #26

trying this out ASAP.
might have some feature requests, happy to donate.
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 07, 2013, 03:09:37 PM
 #27

trying this out ASAP.
might have some feature requests, happy to donate.

Great Smiley Let me know and I'll see what I can do.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Diabolicus
Member
**
Offline Offline

Activity: 90
Merit: 10


View Profile
July 08, 2013, 01:55:02 PM
Last edit: July 08, 2013, 02:24:54 PM by Diabolicus
 #28

It crashes when trying to export, no matter what settings I choose.
Tested on Windows 7 64bit and 32bit.
Log says:
Quote
Traceback (most recent call last):
  File "app.py", line 80, in __call__
  File "export.pyc", line 210, in begin_export
  File "formatters\Candles.pyc", line 181, in latest
ValueError: invalid literal for float(): 2010-07-22
or
Quote
Exception in thread Thread-1:
Traceback (most recent call last):
  File "threading.pyc", line 808, in __bootstrap_inner
  File "app.py", line 98, in thread_bootstrap
  File "app.py", line 52, in fatal
NameError: global name 'time' is not defined

Traceback (most recent call last):
  File "app.py", line 96, in thread_bootstrap
  File "mtgox.pyc", line 185, in run
  File "formatters\LoozikCandles.pyc", line 117, in open
IOError: Invalid CSV file to append - wrong number of fields
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 08, 2013, 05:10:03 PM
 #29

It crashes when trying to export, no matter what settings I choose.
Tested on Windows 7 64bit and 32bit.
Log says:
Quote
Traceback (most recent call last):
  File "app.py", line 80, in __call__
  File "export.pyc", line 210, in begin_export
  File "formatters\Candles.pyc", line 181, in latest
ValueError: invalid literal for float(): 2010-07-22
or
Quote
Exception in thread Thread-1:
Traceback (most recent call last):
  File "threading.pyc", line 808, in __bootstrap_inner
  File "app.py", line 98, in thread_bootstrap
  File "app.py", line 52, in fatal
NameError: global name 'time' is not defined

Traceback (most recent call last):
  File "app.py", line 96, in thread_bootstrap
  File "mtgox.pyc", line 185, in run
  File "formatters\LoozikCandles.pyc", line 117, in open
IOError: Invalid CSV file to append - wrong number of fields

Hi Diabolicus,

Thanks, the 'time' error was a bug (though it only occurs after the program has already crashed). Your other errors appear to come from trying to update a CSV file which was created with a different format. There are five types of CSV file that can currently be produced:

CSV (price/volume) [ISO]
CSV Candle [ISO] - no volume
CSV Candle [ISO] - with volume
CSV Candle [Unix] - no volume
CSV Candle [Unix] - with volume

None of these are compatible with each other. It should work as long as you remember which options you used (I recommend saving your export settings with the Save button in the export window and then reloading them with the Load button so that the settings are exactly as they were before).

I've now fixed the time bug, and I've made the app produce some user-friendly error messages instead of crashing on a CSV file error. I haven't updated the binaries yet though in case there are any changes to make.

Hopefully there aren't any bugs specific to Windows 7, but please let me know as I've only personally tested it on OS X Mountain Lion and Windows XP.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Diabolicus
Member
**
Offline Offline

Activity: 90
Merit: 10


View Profile
July 09, 2013, 10:08:12 AM
 #30

Ok, thanks, I will test again as soon as you release the update.
Epoch
Legendary
*
Offline Offline

Activity: 922
Merit: 1003



View Profile
July 10, 2013, 04:01:50 AM
 #31

I'm also getting a crash under Win7/64 when trying to export a CSV (candles, UNIX time, do not include volume, 1800s duration ... haven't tried the others). Log says:
Code:
Traceback (most recent call last):
  File "app.py", line 80, in __call__
  File "gui.pyc", line 95, in save_dump
  File "gui.pyc", line 109, in load_dump
IndexError: list index out of range
Traceback (most recent call last):
  File "app.py", line 80, in __call__
  File "gui.pyc", line 95, in save_dump
  File "gui.pyc", line 109, in load_dump
IndexError: list index out of range

Once a new release is out with improved error logging, I'll run it again to see what it shows.

A question: (assuming the crash is fixed in the next release) does anyone know what settings should be chosen for compatibility with WhyDifficult's trading bot?
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 10, 2013, 09:02:01 AM
 #32

I'm also getting a crash under Win7/64 when trying to export a CSV (candles, UNIX time, do not include volume, 1800s duration ... haven't tried the others). Log says:
Code:
Traceback (most recent call last):
  File "app.py", line 80, in __call__
  File "gui.pyc", line 95, in save_dump
  File "gui.pyc", line 109, in load_dump
IndexError: list index out of range
Traceback (most recent call last):
  File "app.py", line 80, in __call__
  File "gui.pyc", line 95, in save_dump
  File "gui.pyc", line 109, in load_dump
IndexError: list index out of range

Once a new release is out with improved error logging, I'll run it again to see what it shows.

A question: (assuming the crash is fixed in the next release) does anyone know what settings should be chosen for compatibility with WhyDifficult's trading bot?

Ok, that bug should be fixed. Expect a new release later today (I'll post again when it's ready). Improved error logging will only apply to the bits of code found to have the potential to cause an error; unexpected errors will unfortunately still cause a crash. When I have enough time I may read through all the code to try to find any more that could happen.

Gekko should work with the format set to 'CSV Candle [Unix]', and volume unchecked. I think in the future he plans to include volume in his candles. Technically, Gekko should still work with volume left checked as his backtest loader script currently ignores any extra fields, but that could change. Make sure to select only one currency, and the other settings are up to you. Also, WhyDifficult's published CSVs included non-primary trades. I usually don't include them, but again that's up to you.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 10, 2013, 06:03:56 PM
 #33

Ok, I've released a new version Smiley It should fix the bugs that have been reported and give more helpful error messages in the cases identified.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Epoch
Legendary
*
Offline Offline

Activity: 922
Merit: 1003



View Profile
July 10, 2013, 08:18:22 PM
 #34

I've just downloaded and tried the new version; still crashes on CSV export. However, this time it did not generate any .log file (the previous version created a logfile in the same directory as the executable). Any special steps needed to get the debugging info?
Diabolicus
Member
**
Offline Offline

Activity: 90
Merit: 10


View Profile
July 10, 2013, 08:26:15 PM
 #35

Same here, still crashes and no log.
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 10, 2013, 09:35:27 PM
 #36

Weird, it worked fine on my XP machine, but I just tested it on Windows 7 and I see what you mean. For some reason, Tkinter (the library responsible for the graphics of the app) is causing a crash, and I'm not sure why at the moment. I'll keep looking into it. Also, I have access to OS X 10.8, Windows XP and Windows 7; does this problem occur on Vista as well?

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 10, 2013, 10:37:16 PM
 #37

Ok, my current conclusion is that Tkinter is broken on Windows 7 Angry I thought I had found the (sort-of) problem and fixed it, and then another bug came up, entirely due to Tkinter. (Tkinter is the graphical library I use, and I use it for around a third of my code!) I believe I have now isolated the bug though, and I think I've got around it satisfactorily. Sorry about that.

There are some other changes I also need to make (you still get an error when trying to update a CSV (price/volume) [ISO], for example), so expect a new release tomorrow, and I'll try to do all I can to make Tkinter work. In future, you'll know if it crashes because of Tkinter because you won't get a traceback, just something like 'python.exe has stopped working'. If you get that, please tell me the exact circumstances that led to the error so I can try to reproduce it, as without a traceback I have no idea where the bug is coming from.
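
For anyone else wrestling with Tkinter and threads, the usual workaround is to keep every Tkinter call on the main thread and have the worker thread push progress updates through a Queue. This is a minimal standalone sketch of that pattern (Python 2.7 module names), not the tool's actual code:
Code:
# Minimal standalone example of the pattern: the worker thread never
# touches Tkinter; it pushes progress into a Queue, and the main thread
# polls the queue with after(). Python 2.7 module names.
import threading
import time
import Queue
import Tkinter as tk

def worker(q):
    for i in range(101):
        time.sleep(0.05)   # stand-in for download/export work
        q.put(i)           # thread-safe hand-off to the GUI thread
    q.put(None)            # sentinel: finished

def poll(root, q, label):
    try:
        while True:
            item = q.get_nowait()
            if item is None:
                label.config(text='Done')
                return
            label.config(text='Progress: %d%%' % item)
    except Queue.Empty:
        pass
    root.after(100, poll, root, q, label)  # check again in 100 ms

if __name__ == '__main__':
    root = tk.Tk()
    label = tk.Label(root, text='Progress: 0%')
    label.pack()
    q = Queue.Queue()
    t = threading.Thread(target=worker, args=(q,))
    t.daemon = True        # don't keep the process alive if the window closes
    t.start()
    poll(root, q, label)
    root.mainloop()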

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 11, 2013, 02:38:38 PM
 #38

Ok, I'm now cautiously optimistic that the new version I just pushed should work on Windows 7 (at least 32-bit). I've only tested it quickly just now (OS X 10.8, XP 32-bit, 7 32-bit), but it seems to be fully functional, touch wood.

Also, I've only experienced this a couple of times, but when the Google library fails to get a connection to the Google servers quickly enough it can sometimes crash. I'll see if I can stop it, but in the meantime just open the program again and it should be fine. You'll get a logfile if this happens.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Epoch
Legendary
*
Offline Offline

Activity: 922
Merit: 1003



View Profile
July 11, 2013, 03:01:48 PM
 #39

Ok, I'm now cautiously optimistic that the new version I just pushed should now work on windows 7 (at least 32 bit). I quickly tested it right now though (OS X 10.8, XP 32bit, 7 32bit) and it seems to be fully functional, touch wood.
Nope, same problem as the previous build. It crashes on export and does not generate any .log file. I'm running Win7/64.
Diabolicus
Member
**
Offline Offline

Activity: 90
Merit: 10


View Profile
July 11, 2013, 03:26:58 PM
Last edit: July 11, 2013, 03:38:54 PM by Diabolicus
 #40

Windows 7, 64 Bit:
When trying to build 1h candles from the whole 640MB dump.sql, after about 10-15% done it crashes with the following error:
Quote
Traceback (most recent call last):
  File "app.py", line 119, in <module>
  File "gui.pyc", line 280, in run
  File "bq.pyc", line 127, in __init__
  File "bq.pyc", line 192, in complete
  File "oauth2client\util.pyc", line 128, in positional_wrapper
  File "apiclient\discovery.pyc", line 192, in build
  File "oauth2client\util.pyc", line 128, in positional_wrapper
  File "oauth2client\client.pyc", line 490, in new_request
  File "httplib2\__init__.pyc", line 1570, in request
  File "httplib2\__init__.pyc", line 1317, in _request
  File "httplib2\__init__.pyc", line 1286, in _conn_request
  File "httplib.pyc", line 1033, in getresponse
httplib.ResponseNotReady
Traceback (most recent call last):
  File "app.py", line 81, in __call__
  File "export.pyc", line 223, in begin_export
  File "formatters\PhantomCircuit.pyc", line 69, in latest
DatabaseError: file is encrypted or is not a database
Windows 7, 32 Bit:
Runtime error after ~20% done:
Quote
Traceback (most recent call last):
  File "app.py", line 119, in <module>
  File "gui.pyc", line 280, in run
  File "bq.pyc", line 127, in __init__
  File "bq.pyc", line 192, in complete
  File "oauth2client\util.pyc", line 128, in positional_wrapper
  File "apiclient\discovery.pyc", line 192, in build
  File "oauth2client\util.pyc", line 128, in positional_wrapper
  File "oauth2client\client.pyc", line 490, in new_request
  File "httplib2\__init__.pyc", line 1570, in request
  File "httplib2\__init__.pyc", line 1317, in _request
  File "httplib2\__init__.pyc", line 1286, in _conn_request
  File "httplib.pyc", line 1033, in getresponse
httplib.ResponseNotReady
Traceback (most recent call last):
  File "app.py", line 81, in __call__
  File "export.pyc", line 223, in begin_export
  File "formatters\PhantomCircuit.pyc", line 69, in latest
DatabaseError: file is encrypted or is not a database
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 11, 2013, 06:21:41 PM
Last edit: July 12, 2013, 02:36:56 PM by nitrous
 #41

For some reason bitbucket didn't replace the version with the new one I uploaded, and now it's bugging out so I can't upload the new version. For now, I'm going to just host it on Dropbox until I can make sure bitbucket is working properly. Here's the new version. To verify it, you should get some red text at the top of the export window on Windows. The red text says that I've disabled the progress bar on Windows because I believe that is responsible for most of these crashes. Hopefully I can reenable it at some point in the future, but for now try these links. I am really sorry about all this hassle, and thank you for your patience. I really hope that this version does actually work.

Mac - https://dl.dropboxusercontent.com/u/1760961/MTT/v1.0.3/Binary%20(Mac).zip
Windows - https://dl.dropboxusercontent.com/u/1760961/MTT/v1.0.3/Binary%20(Windows).zip

BitBucket seems to be working again now, so you should be able to download from here, the links above are still available if not though.

If you are doing a large export and would like to have a progress bar, try this version and please tell me if it works for you:
Windows - https://dl.dropboxusercontent.com/u/1760961/MTT/v1.0.2/Binary%20(Windows).zip
(The latest mac version already includes the progress bar)

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Epoch
Legendary
*
Offline Offline

Activity: 922
Merit: 1003



View Profile
July 11, 2013, 07:28:06 PM
 #42

I just tried the latest Windows binary (without status bar) and can confirm it has successfully exported 3600 second candles. I"ll try some other values as well and report if I encounter any errors. Win7/64.
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 11, 2013, 07:38:36 PM
 #43

I just tried the latest Windows binary (without status bar) and can confirm it has successfully exported 3600 second candles. I"ll try some other values as well and report if I encounter any errors. Win7/64.

Yes, finally! I'm tentatively hopeful that this success will continue Tongue I'll be exploring possible solutions to the progress bar problem, as exports from a full dump can take a very long time; in the meantime, you can check the filesize of the export to see that it is in fact increasing, although that won't tell you how much longer there is to go.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Diabolicus
Member
**
Offline Offline

Activity: 90
Merit: 10


View Profile
July 12, 2013, 10:14:16 AM
 #44

I will check thoroughly later, but it seems to work. Great job!
Diabolicus
Member
**
Offline Offline

Activity: 90
Merit: 10


View Profile
July 12, 2013, 05:53:59 PM
 #45

If you are doing a large export and would like to have a progress bar, try this version and please tell me if it works for you:
Windows - https://dl.dropboxusercontent.com/u/1760961/MTT/v1.0.2/Binary%20(Windows).zip
(The latest mac version already includes the progress bar)

Not working on Win7 64-bit. Want me to test it with Aero or something else disabled?
I cannot test 32-bit before Monday; I'm not at the office any more.

However, this version
is right now exporting 15min candles since 2011 like a boss (Win 7, 64 Bit)
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 12, 2013, 06:02:45 PM
 #46

If you are doing a large export and would like to have a progress bar, try this version and please tell me if it works for you:
Windows - https://dl.dropboxusercontent.com/u/1760961/MTT/v1.0.2/Binary%20(Windows).zip
(The latest mac version already includes the progress bar)

Not working on Win7 64-bit. I cannot test 32-bit before Monday; I'm not at the office any more.

Ok, I'll have to think of another way to get the progress bar working.

However, this version
is right now exporting 15min candles since 2011 like a boss

Awesome!

Thanks Epoch and Diabolicus for your patience and prompt testing Cheesy

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
2weiX
Legendary
*
Offline Offline

Activity: 2058
Merit: 1005

this space intentionally left blank


View Profile
July 12, 2013, 08:47:42 PM
 #47

trying this out ASAP.
might have some feature requests, happy to donate.

Great Smiley Let me know and I'll see what I can do.


err..
a dummie's guide to step-by-stepping to create a dump, first^^
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 12, 2013, 09:46:15 PM
 #48

trying this out ASAP.
might have some feature requests, happy to donate.

Great Smiley Let me know and I'll see what I can do.


err..
a dummie's guide to step-by-stepping to create a dump, first^^

Yeah, sorry I haven't got around to doing the readme yet. Hopefully I won't be busy tomorrow and I'll get on it.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 13, 2013, 04:17:02 PM
Last edit: July 13, 2013, 06:48:58 PM by nitrous
 #49

Ok, I've now written an in-depth readme that should explain how to do everything possible with the tool Smiley. There are screenshots for Mac, but the Windows interface is exactly the same so that shouldn't cause any problems. You can download it here:

https://bitbucket.org/nitrous/bq/downloads/Readme.pdf

I have also updated the landing page readme.md with a quickstart guide.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
100x
Jr. Member
*
Offline Offline

Activity: 30
Merit: 501


Seek the truth


View Profile
July 16, 2013, 11:08:38 PM
 #50

So I am teaching myself how to use PostgreSQL (among other things), and I thought "what better way could there be than finally getting around to messing with the MtGox historical trade data?" (I had been wanting to analyze this data for some time). After searching around for a while, I came upon this thread.

Thank you so much for this awesome tool! You have saved me much time/effort Smiley It was very easy to start a dump, and the green bar is currently moving to the right, so it looks like it is working! Just curious: the easiest way to fill in the missing data (as the Big Query db is not yet being updated regularly) would be to fetch trades with IDs after the ID of the last trade in the dump, yes? I was thinking of writing a small python script using the API commands as specified in this post: https://bitcointalk.org/index.php?topic=178336.msg2067244#msg2067244
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 16, 2013, 11:40:28 PM
 #51

So I am teaching myself how to use Postgresql (among other things), and I thought "what better way could there be than finally getting around to messing with the mtgox historical trade data?" (I had been wanting to analyze this data for sometime). After searching around for a while, I came upon this thread.

Thank you so much for this awesome tool! You have saved me much time/effort Smiley It was very easy to start a dump, and the green bar is moving to the right currently so it looks like it is working! Just curious, the easiest way to fill in the missing data (as the Big Query db is not yet being updated regularly) would be to create a small script that uses the mtgox API to fetch trades with ID's after the ID of the last trade in the dump, yes?

Glad you've found it useful Smiley Yes, that's right. Here's an example of one script that's already been created - http://cahier2.ww7.be/bitcoinmirror/phantomcircuit/. It uses a different format though, and doesn't store all the data (I did, however, port it to an export format). You'll need to determine which field in the API maps to which field in the BQ database, and fill in some fields that aren't included in the API (Bid_User_Rest_App__, for example).
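
The API side of that would look roughly like the sketch below; treat the exact path and the `since` parameter (a microsecond trade id) as things to double-check against the v2 documentation rather than as gospel:
Code:
# Rough, untested sketch: pull trades newer than a given trade id via
# the HTTP API v2 fetch call. Double-check the path and parameters
# against the API documentation before relying on them.
import json
import urllib2

def fetch_trades_since(last_trade_id, currency='BTCUSD'):
    url = ('https://data.mtgox.com/api/2/%s/money/trades/fetch?since=%s'
           % (currency, last_trade_id))
    data = json.load(urllib2.urlopen(url))
    if data.get('result') != 'success':
        raise RuntimeError('API error: %r' % data)
    return data['data']  # list of trade dicts to map onto the BQ schema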

There are some discrepancies with the API data - the missing fields, data being split into individual currencies, currencies occasionally being silently added, gaps in the API data, etc. Whilst this shouldn't be too much of a problem for your actual data processing, it could cause problems if you ever want to update from BQ again. I suggest that you instead insert the data from the API into a new table (say, dump_live). You can then create a view from a union between dump and dump_live, delete the obsolete index on dump, and index the view (Money_Trade__ ASC, [Primary] DESC, Currency__ ASC). This will allow you to still access all the data fairly quickly, but without corrupting the BQ dump. If you don't mind an extra 200 MB, you can leave the index on dump. My tool will recreate this index if it doesn't exist, so it might be easier and more efficient to just leave it.
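
Roughly, that layout would look like this in sqlite3 (only the indexed columns are shown, and note that sqlite won't index a view directly, so this version keeps an index on each base table and just unions them in a view):
Code:
# Sketch of the dump/dump_live layout. Only the indexed columns are
# shown -- the real tables have many more fields, and `dump` is the
# table the downloader itself creates. sqlite can't index a view, so
# each base table keeps its own index and a view unions them.
import sqlite3

conn = sqlite3.connect('dump.sqlite')
conn.executescript("""
CREATE TABLE IF NOT EXISTS dump (       -- normally created by the tool
    Money_Trade__ INTEGER, [Primary] TEXT, Currency__ TEXT);
CREATE TABLE IF NOT EXISTS dump_live (  -- filled from the MtGox API
    Money_Trade__ INTEGER, [Primary] TEXT, Currency__ TEXT);
CREATE INDEX IF NOT EXISTS dump_idx
    ON dump (Money_Trade__ ASC, [Primary] DESC, Currency__ ASC);
CREATE INDEX IF NOT EXISTS dump_live_idx
    ON dump_live (Money_Trade__ ASC, [Primary] DESC, Currency__ ASC);
CREATE VIEW IF NOT EXISTS all_trades AS
    SELECT * FROM dump UNION ALL SELECT * FROM dump_live;
""")
conn.commit()
conn.close()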

I've actually given some thought to putting this in the official tool, hence the insights above (I'm not sure when I'll have enough time to do that though, as it would mean changing quite a lot of the tool). So if you want any help with your script let me know, especially as I've also written documentation on the MtGox API v2 - I can help you map the API data to the BQ schema and get around the data jumps Tongue

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Diabolicus
Member
**
Offline Offline

Activity: 90
Merit: 10


View Profile
July 17, 2013, 07:43:04 AM
 #52

Any idea when the trade data will be updated?
It would be interesting to backtest trade strategies against the bear market after May 23rd.
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 17, 2013, 04:10:08 PM
 #53

Any idea when the trade data will be updated?
It would be interesting to backtest trade strategies against the bear market after May 23rd.

I tried again today but I didn't get an answer. Tomorrow I'll try to contact him earlier, hopefully during Japanese business hours.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
aspeer
Member
**
Offline Offline

Activity: 102
Merit: 10



View Profile
July 18, 2013, 04:41:43 AM
 #54

very nice tool for running scenarios! thanks!

BTC accepted: 1C3Z4aA4v1bADYrSozjfSujzmtwxD4cvVH
Don't format your wallet into oblivion, back it up.  |  I use encrypted BackBlaze: http://backblaze.apspeer.com
10% off Campbx Fees: http://goo.gl/cIRwo
dlasher
Sr. Member
****
Offline Offline

Activity: 467
Merit: 250



View Profile WWW
July 27, 2013, 12:16:17 AM
 #55


Trying, but unsuccessful so far.

windows version (on win7-x64) crashes about 50% through.
linux version (on debian7) crashes after entering auth code with:

Code:
No handlers could be found for logger "oauth2client.util"
Traceback (most recent call last):
  File "app.py", line 81, in __call__
    return apply(self.func, args)
  File "/usr/src/bq/bq.py", line 169, in complete
    credential = flow.step2_exchange(code, http)
  File "/usr/local/lib/python2.7/dist-packages/google_api_python_client-1.1-py2.7.egg/oauth2client/util.py", line 128, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/google_api_python_client-1.1-py2.7.egg/oauth2client/client.py", line 1283, in step2_exchange
    headers=headers)
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 1570, in request
    (response, content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey)
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 1317, in _request
    (response, content) = self._conn_request(conn, request_uri, method, body, headers)
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 1252, in _conn_request
    conn.connect()
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 1021, in connect
    self.disable_ssl_certificate_validation, self.ca_certs)
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 80, in _ssl_wrap_socket
    cert_reqs=cert_reqs, ca_certs=ca_certs)
  File "/usr/lib/python2.7/ssl.py", line 381, in wrap_socket
    ciphers=ciphers)
  File "/usr/lib/python2.7/ssl.py", line 141, in __init__
    ciphers)
SSLError: [Errno 185090050] _ssl.c:340: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib


Ideas?
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
July 27, 2013, 01:21:06 PM
 #56


Trying, but unsuccessful so far.

windows version (on win7-x64) crashes about 50% through.
linux version (on debian7) crashes after entering auth code with:

Code:
No handlers could be found for logger "oauth2client.util"
Traceback (most recent call last):
  File "app.py", line 81, in __call__
    return apply(self.func, args)
  File "/usr/src/bq/bq.py", line 169, in complete
    credential = flow.step2_exchange(code, http)
  File "/usr/local/lib/python2.7/dist-packages/google_api_python_client-1.1-py2.7.egg/oauth2client/util.py", line 128, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/google_api_python_client-1.1-py2.7.egg/oauth2client/client.py", line 1283, in step2_exchange
    headers=headers)
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 1570, in request
    (response, content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey)
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 1317, in _request
    (response, content) = self._conn_request(conn, request_uri, method, body, headers)
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 1252, in _conn_request
    conn.connect()
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 1021, in connect
    self.disable_ssl_certificate_validation, self.ca_certs)
  File "/usr/local/lib/python2.7/dist-packages/httplib2-0.8-py2.7.egg/httplib2/__init__.py", line 80, in _ssl_wrap_socket
    cert_reqs=cert_reqs, ca_certs=ca_certs)
  File "/usr/lib/python2.7/ssl.py", line 381, in wrap_socket
    ciphers=ciphers)
  File "/usr/lib/python2.7/ssl.py", line 141, in __init__
    ciphers)
SSLError: [Errno 185090050] _ssl.c:340: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib


Ideas?


Hi dlasher,

I haven't actually tried it on Linux, but I can see why you get the error. The app currently expects httplib2/cacerts.txt to be installed locally alongside app.py (as in the Windows and Mac binaries), and it goes wrong when the library is actually installed to Python's dist-packages. I've just pushed an update that should address this issue and fall back to the installed cacerts.txt if it can't find one locally.
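In case it helps anyone packaging this themselves, the fallback is essentially the following (a rough sketch, not the exact code from the repo -- the paths are illustrative):

Code:
import os
import httplib2

# Prefer a cacerts.txt bundled next to the app (as in the frozen binaries);
# otherwise fall back to the copy shipped inside the installed httplib2 package.
local_cacerts = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                             'httplib2', 'cacerts.txt')
if os.path.isfile(local_cacerts):
    ca_certs = local_cacerts
else:
    ca_certs = os.path.join(os.path.dirname(httplib2.__file__), 'cacerts.txt')

http = httplib2.Http(ca_certs=ca_certs)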

As for the Windows problem, was that while updating or exporting? Was there any indication as to what went wrong? Possibly a log file? I know I'm not catching every exception yet, which could cause an unexpected problem, but under normal conditions the only issues should come from Tkinter being buggy with multithreaded code.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
id10tothe9
Hero Member
*****
Offline Offline

Activity: 623
Merit: 500



View Profile
August 15, 2013, 09:21:40 PM
Last edit: August 15, 2013, 09:41:34 PM by id10tothe9
 #57

I tried it on Ubuntu; after authentication I get "Unexpected Exception" with a long message (can't be copied from the window), at the end of which it says: IOError: [Errno 13] Permission denied: '/home/username/.config/mtgox-trades-tool/creds.dat'

not sure what to make of that Undecided

Edit: ok, kinda linux n00b Grin. got it working now..

100x
Jr. Member
*
Offline Offline

Activity: 30
Merit: 501


Seek the truth


View Profile
August 16, 2013, 11:58:28 AM
 #58

Glad you've found it useful Smiley Yes, that's right. Here's an example of one script that's already been created - http://cahier2.ww7.be/bitcoinmirror/phantomcircuit/. This is a different format though, and doesn't store all the data (I did however port this to an export format). You'll need to determine which field in the API maps to which field in the BQ database, and fill in some fields that aren't included in the API (Bid_User_Rest_App__ for example).

There are some discrepancies with the API data - the missing fields, data being split into individual currencies, currencies occasionally being silently added, gaps in the API data, etc. Whilst this shouldn't be too much of a problem for your actual data processing, it could cause problems if you ever want to update from BQ again. I suggest that you instead insert the data from the API into a new table (say, dump_live). You can then create a view from a union between dump and dump_live, delete the obsolete index on dump, and index the view (Money_Trade__ ASC,[Primary] DESC,Currency__ ASC). This will allow you to still access all the data fairly fast, but without corrupting the BQ dump. If you don't mind an extra 200mb, you can leave the index on dump. My tool will recreate this index if it doesn't exist, so it might be easier and more efficient to just leave it.

I've actually given thought to putting this in the official tool, hence the insights above (I'm not sure when I'll have enough time to do that though, as it would mean changing quite a lot of the tool). So if you want any help with your script, let me know -- especially as I've also written documentation on the MtGox API v2, so I can help you map the API data to the BQ schema and get around the data jumps Tongue

I ended up putting my trade analysis work on hold for a while, but I wanted to stop by and mention that I was able to get the remainder of the data using the API and add it to my db just fine. Thanks again for all your help.

I was curious about the gaps in the API data that you mentioned, what type of gaps exactly are you talking about? I did some extremely basic verification and recreated a few candles for a few random days, and I got the same result as bitcoincharts.com (after filtering to USD trades properly).

In case anyone is interested, here is the python code I used to query the remainder of the mtgox data. You have to do it in 1000 trade blocks, so it is essentially just a nice wrapper on a long series of API calls:
Code:
import urllib2
import json
import csv
import time
import datetime

API_FETCH_URL = 'https://data.mtgox.com/api/2/BTCUSD/money/trades/fetch?since='

# mtgox API varname: postgres DB varname (from BigQuery dump)
VAR_MAP = {
    'price_currency': 'currency__',
    'trade_type': 'type',
    'price_int': 'price',
    'item': 'item',
    'primary': 'single_market',
    'tid': 'money_trade__',
    'amount_int': 'amount',
    'date': 'date',
    'properties': 'properties'
}
VARS = [var for var in VAR_MAP.keys()]

def process_date(timestamp):
    return datetime.datetime.utcfromtimestamp(timestamp).isoformat(' ')

def fetch_trades(last_id= None, outfile= '', limit=False, noise=False):
    counter = 0

    with open(outfile, 'wb') as f:
        # create CSV writer object
        writer = csv.writer(f)

        # add header of varnames at top of file
        writer.writerow([VAR_MAP[var] for var in VARS])

        while True:
            # pause for 2 seconds between API calls to prevent getting banned by anti-DDOS
            time.sleep(2)

            # fetch trades after the most recent trade id, using mtgox API
            page_data = urllib2.urlopen(API_FETCH_URL + last_id)
            # read response from urlopen GET request
            json_response = page_data.read()
            # decode JSON data into python dictionary
            response = json.loads(json_response)

            if response['result'] == 'success' and len(response['data']) > 0:
                trades = response['data']

                if noise:
                    print 'Batch %04d -- ?since= %s, num trades: %d' % (counter + 1, last_id, len(trades))

                # write each trade as a separate line, using only trade values for the vars in the list VARS
                # for date, convert from timestamp into ISO str format (UTC) to match date column in postgres DB
                writer.writerows([[trade[var] if var != 'date' else process_date(trade[var]) for var in VARS]
                                  for trade in trades])

                # set last_id to the tid of the last trade, so we can fetch next batch of trades
                last_id = trades[-1]['tid']

                counter = counter + 1
                # if limit parameter is in use, then only do as many batches as specified
                if limit != False and counter >= limit:
                    break
            else:
                print '\n**********'
                if response['result'] == 'success':
                    print 'SCRIPT END -- last trade reached'
                else:
                    print 'SCRIPT END -- API call failed'
                print '**********'
                break

if __name__ == '__main__':
    username = 'name'
    filename = 'mtgox_recent_trades.csv'
    # id of last trade from BigQuery dump
    last_dump_trade_id = '1369319572499520'
    outfile = 'c:\\users\\%s\\documents\\data\\mtgox\\%s' % (username, filename)

    fetch_trades(last_id= last_dump_trade_id, outfile= outfile, noise= True)
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
August 16, 2013, 12:22:20 PM
 #59

I ended up putting my trade analysis work on hold for a while, but I wanted to stop by and mention that I was able to get the remainder of the data using the API and add it to my db just fine. Thanks again for all your help.
No problem. Thanks, I'm sure many people here will find that useful until MtGox gets around to finishing the bq database. For anyone still interested in this tool, I'm going to post an update on the situation immediately after this post.

I was curious about the gaps in the API data that you mentioned, what type of gaps exactly are you talking about? I did some extremely basic verification and recreated a few candles for a few random days, and I got the same result as bitcoincharts.com (after filtering to USD trades properly).
For USD I wouldn't expect any gaps. Essentially, as well as the 1000 trade limit in the API, there are also other limits such as a time limit, something like 86400 seconds. What this means is that if no trades happen in a 24-hour period, then your script will break down, as no data will be returned and you won't ever update your last_id. The most well known gap is for USD across the tid transition (see my documentation on this here) between tids 218868 and 1309108565842636. I believe this is the only USD gap, and since this was back in 2011 it is covered by the bq database and is not a problem.

For other less popular currencies, however, I have found that there are many gaps, some quite long, and so without regular bq updates this is liable to break a script. Of course if you have an up to date database and can run your script regularly, at least once per day, this will not be a problem as you will catch all trades as they come in. Alternatively, you could manually advance the last_id if you haven't caught up to the current time yet, but you need to confirm the limit is indeed 86400s first.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
August 16, 2013, 12:29:39 PM
 #60

I haven't posted an update about this yet because I haven't got a definitive update from MagicalTux that bq is even still happening Sad At the moment, with US legal issues, litecoin, etc, MagicalTux isn't really focusing on bq at all, but hopefully he will finish it eventually. I don't anticipate this to be anytime soon though, and he probably needs to be reminded about bq occasionally (and shown that there is a demand for it).

Unfortunately, I don't have much time to work on this at the moment, and I'm going to university in a few weeks, so if anyone still wants regular data then perhaps someone with python experience might consider forking my project and adding in 100x's script? My idea was to use two tables in the database - one for BQ data, the other for API data. Then you could create a view into a union of both these tables, and delete API data as BQ replaces it.

Remember that the API doesn't provide all fields that BQ does, and you have to access each currency individually with the API. If you don't need these other fields though, and only need a few select currencies, you could then use this hybrid system to do live exports as well (as Loozik requested).
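To make that two-table idea a bit more concrete, here is a minimal sketch of the schema side (untested, and assuming the sqlite dump produced by my tool with its table named dump; note that SQLite can't index a view, so in this sketch each table keeps its own index instead):

Code:
import sqlite3

conn = sqlite3.connect('mtgox.db')  # illustrative path to the downloaded dump
cur = conn.cursor()

# Empty table with the same columns as the BQ dump, to hold trades from the HTTP API
cur.execute("CREATE TABLE IF NOT EXISTS dump_live AS SELECT * FROM dump WHERE 0")

# View over both tables so queries see one continuous dataset
cur.execute("CREATE VIEW IF NOT EXISTS dump_all AS "
            "SELECT * FROM dump UNION ALL SELECT * FROM dump_live")

# Index the live table the same way the main dump is indexed
cur.execute("CREATE INDEX IF NOT EXISTS ix_live ON dump_live "
            "(Money_Trade__ ASC, [Primary] DESC, Currency__ ASC)")

conn.commit()
conn.close()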

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
bitranox
Newbie
*
Offline Offline

Activity: 13
Merit: 0



View Profile
August 27, 2013, 06:53:48 AM
 #61

Quote
For other less popular currencies, however, I have found that there are many gaps, some quite long, and so without regular bq updates this is liable to break a script. Of course if you have an up to date database and can run your script regularly, at least once per day, this will not be a problem as you will catch all trades as they come in. Alternatively, you could manually advance the last_id if you haven't caught up to the current time yet, but you need to confirm the limit is indeed 86400s first.

The (very good and appreciated) "Unofficial Documentation for MtGox's HTTP API v2" has some glitches in the description of the API function money/trades/fetch:

(see https://bitbucket.org/nitrous/mtgox-api#markdown-header-moneytradesfetch)

There are not really "gaps" (except the one big gap for USD when they switched from a counter to a microtimestamp).

The MtGox API function money/trades/fetch simply works as follows:

If you pass a microtimestamp as the parameter:
- it will return at most 1000 records for the given currency pair
- it will only return trades that happened within 86400 seconds after the given microtimestamp
Meaning that if there are fewer than 1000 trades within those 86400 seconds, you will receive just the trades that happened in the 86400 seconds after the given microtimestamp.

If you don't pass a microtimestamp parameter:
- it will return the trades for the given currency pair within the last 86400 seconds (24 hours) from now. In that case the "1000 record" limit does not apply.

So if no trades happened within a span of 86400 seconds for a given currency pair, you will not get back any data rows - which will only happen for some rarely traded currency pairs.
In that case you will need to add 86400 seconds to the timestamp (respectively 86400 * 1E6 microseconds) and query again for the next day.
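In code, that retry logic is only a few lines; for example (a sketch building on 100x's script above -- the currency pair is just an example, and this is untested):

Code:
import json
import time
import urllib2

API_FETCH_URL = 'https://data.mtgox.com/api/2/BTCGBP/money/trades/fetch?since='
ONE_DAY_US = 86400 * 1000000  # the fetch window, in microseconds

def next_batch(last_id):
    """Return (trades, new_last_id); skips ahead one day when a window is empty."""
    response = json.loads(urllib2.urlopen(API_FETCH_URL + str(last_id)).read())
    trades = response.get('data', []) if response.get('result') == 'success' else []
    if trades:
        return trades, trades[-1]['tid']
    # No trades in this 24h window: advance the cursor by one day and try again,
    # but never past the present, otherwise future trades would be skipped.
    now_us = int(time.time() * 1000000)
    return [], min(int(last_id) + ONE_DAY_US, now_us)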

As nitrous pointed out in some other posts, I would not recommend that every user download the full trade history from MtGox, because of server load issues, etc...

However, since the bq database still hasn't been updated, I will provide full dumps, brought up to date, in the next couple of days (whether they will then be updated daily or hourly, I'm waiting for user requests on that).

The python script that 100x posted before will fail on some rarely traded currency pairs for that reason.
bitranox
Newbie
*
Offline Offline

Activity: 13
Merit: 0



View Profile
August 27, 2013, 11:02:09 AM
 #62

I am thinking of providing the trade data to users by another method.

I have fetched all the data for myself, so if a considerable number of users want it, I will provide the trade data in a different way (not relying on MtGox updating the data, and not relying on BigQuery).

Please check out the following post: https://bitcointalk.org/index.php?topic=282154.msg3017272#msg3017272 and let me know.

yours sincerely

bitranox
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
August 27, 2013, 11:07:06 AM
 #63

The (very good and appreciated) "Unofficial Documentation for MtGox's HTTP API v2" has some glitches in the description of the API function money/trades/fetch:

(see https://bitbucket.org/nitrous/mtgox-api#markdown-header-moneytradesfetch)

There are not really "gaps" (except the one big gap for USD when they switched from a counter to a microtimestamp).

The MtGox API function money/trades/fetch simply works as follows:

Yeah, I think in a previous commit to the docs I actually did say that, but I wanted to emphasise (a) that this was due to gaps in trades, i.e. no trades happened in a 24h window, and (b) that people should not try to circumvent this due to server load. Of course, for small date ranges (or for your centralised database project), that's fine, but I don't want to openly advocate such usage. After all, it's the reason MagicalTux introduced the bq database in the first place!

The python script that 100x posted before will fail on some rarely traded currency pairs for that reason.

Yeah I mentioned that a few posts ago. In actual fact, they don't even have to be particularly rare. I know GBP has some very long gaps near the beginning. Obviously for rare currencies this will be even more of a problem.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
arvin8
Newbie
*
Offline Offline

Activity: 9
Merit: 0


View Profile
August 30, 2013, 12:03:33 PM
 #64

got an error during download


An error occurred that shouldn't have happened. Please report this on the tool's forum thread at bitcointalk (you should be taken there when you click ok).

Traceback (most recent call last):
  File "/Applications/MtGox-Trades-Tool.app/Contents/Resources/app.py", line 97, in thread_bootstrap
  File "mtgox.pyc", line 120, in run
  File "bq.pyc", line 100, in gen2
  File "oauth2client/util.pyc", line 128, in positional_wrapper
  File "apiclient/http.pyc", line 676, in execute
  File "oauth2client/util.pyc", line 128, in positional_wrapper
  File "oauth2client/client.pyc", line 494, in new_request
  File "oauth2client/client.pyc", line 663, in _refresh
  File "oauth2client/client.pyc", line 682, in _do_refresh_request
  File "httplib2/__init__.pyc", line 1570, in request
  File "httplib2/__init__.pyc", line 1317, in _request
  File "httplib2/__init__.pyc", line 1286, in _conn_request
  File "/usr/lib/python2.7/httplib.py", line 1027, in getresponse
  File "/usr/lib/python2.7/httplib.py", line 407, in begin
  File "/usr/lib/python2.7/httplib.py", line 371, in _read_status
BadStatusLine: ''
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
August 30, 2013, 01:45:09 PM
 #65

got an error during download


An error occurred that shouldn't have happened. Please report this on the tool's forum thread at bitcointalk (you should be taken there when you click ok).

Traceback (most recent call last):
  File "/Applications/MtGox-Trades-Tool.app/Contents/Resources/app.py", line 97, in thread_bootstrap
  File "mtgox.pyc", line 120, in run
  File "bq.pyc", line 100, in gen2
  File "oauth2client/util.pyc", line 128, in positional_wrapper
  File "apiclient/http.pyc", line 676, in execute
  File "oauth2client/util.pyc", line 128, in positional_wrapper
  File "oauth2client/client.pyc", line 494, in new_request
  File "oauth2client/client.pyc", line 663, in _refresh
  File "oauth2client/client.pyc", line 682, in _do_refresh_request
  File "httplib2/__init__.pyc", line 1570, in request
  File "httplib2/__init__.pyc", line 1317, in _request
  File "httplib2/__init__.pyc", line 1286, in _conn_request
  File "/usr/lib/python2.7/httplib.py", line 1027, in getresponse
  File "/usr/lib/python2.7/httplib.py", line 407, in begin
  File "/usr/lib/python2.7/httplib.py", line 371, in _read_status
BadStatusLine: ''


Hmm, that sounds like a temporary problem with either Google or your internet connection.

For most errors, you should be able to start the program again. It should pick up where it left off and continue downloading.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
dirtyfilthy
Member
**
Offline Offline

Activity: 77
Merit: 10


View Profile
December 09, 2013, 01:50:07 AM
 #66

Hi,

I'm getting the following error when trying to download after being partially successful

Traceback (most recent call last):
  File "app.py", line 118, in thread_bootstrap
    self.thread_init()
  File "./bq/mtgox.py", line 153, in run
    raise Exception, "Insertion error: %d =/= %d" % (pos,size)
Exception: Insertion error: 1040000 =/= 1024000
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 09, 2013, 02:26:02 PM
 #67

Hi,

I'm getting the following error when trying to download after being partially successful

Traceback (most recent call last):
  File "app.py", line 118, in thread_bootstrap
    self.thread_init()
  File "./bq/mtgox.py", line 153, in run
    raise Exception, "Insertion error: %d =/= %d" % (pos,size)
Exception: Insertion error: 1040000 =/= 1024000

This means that whilst downloading, something bad happened either on Google's end, your end, or with your internet connection, and a set of trades wasn't properly inserted into the database, so you're missing 16000 entries. You have a few options:

1) I would recommend starting again if it's not too much trouble for your internet connection, as your data could be corrupted somehow.
2) Alternatively, you could resume the download as it looks like the last 16k entries just somehow weren't inserted, and it may not be corrupted at all.
3) Or to be safe, you could manually remove the last 24k rows to try to remove any potential corruption and resume from 1 million (see the sketch below).

1 is the safest, but you'll probably be fine with any of the options.
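If you go with option 3, something like this should do it (a rough sketch only: it assumes the table is named dump, that rowid order matches the order the rows were downloaded in, and an illustrative database file name):

Code:
import sqlite3

conn = sqlite3.connect('mtgox.db')  # whatever you named the downloaded dump
cur = conn.cursor()

# Drop the 24000 most recently inserted rows (1024000 - 24000 = 1000000),
# then resume the download from the 1 million mark.
cur.execute("DELETE FROM dump WHERE rowid IN "
            "(SELECT rowid FROM dump ORDER BY rowid DESC LIMIT 24000)")
conn.commit()
conn.close()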

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
toniquone
Newbie
*
Offline Offline

Activity: 4
Merit: 0


View Profile
December 12, 2013, 02:22:21 PM
 #68

Hi!

Thanks for a great tool. But I can't make it work properly. I started it several times, and I always get the same error, and my dump is always the same size - 107 MB. I haven't tried to continue updating because it looks like it's corrupted.
Is there any way to fix it?
http://s12.postimg.org/opvn4mmp9/Screen_Shot_2013_12_12_at_7_32_48_AM.png
http://s10.postimg.org/8q04iicmh/Screen_Shot_2013_12_12_at_7_32_56_AM.png
http://s16.postimg.org/xmfafdztx/Screen_Shot_2013_12_12_at_5_56_13_PM.png
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 12, 2013, 11:23:55 PM
 #69

Ok, it seems that the tool is now breaking for some unknown reason (changes to the bigquery API?), and I haven't updated it in a while so I can't be sure exactly why. Seeing as my tool hasn't changed, and it seems to be breaking around 1024000 rows for those who are having problems, I would guess it's a problem on Google's end, though more likely their API just changed and I should update my tool.

In addition, I'm sure many of you are rather frustrated that my app doesn't let you download data more up to date than May. So, rather than spend time I don't have updating the app, I'm going to release a database which is up to date to today, Thu, 12 Dec 2013 22:13:40 GMT. It is in a slightly different format, but I've tried to keep the columns as similar to the app as possible:

Money_Trade__ (int): The trade ID. As you may know, this is a sequential integer up to 218868, whereupon it is a unix micro timestamp.
Date (int): Unix timestamp (1s resolution).
Primary (str): '1' for primary (recorded in original currency), '0' for non-primary.
Item (str): Currently only contains 'BTC'.
Currency__ (str): 'USD', 'EUR', 'GBP', etc.
Type (str): 'bid' or 'ask'.
Properties (str): e.g. 'market'. May be a comma separated list if appropriate, e.g. 'limit,mixed_currency'.
Amount (int): Quantity traded in terms of Item.
Price (int): Price at which the trade occurred, in terms of Currency__.
Bid_User_Rest_App__ (str): ID specific to the authorised application used to perform the trade on the bid side, if applicable.
Ask_User_Rest_App__ (str): ID specific to the authorised application used to perform the trade on the ask side, if applicable.

The most significant difference is that Date is a unix timestamp rather than in ISO format.

Here's the link: https://docs.google.com/file/d/0B3hexlKVFpMpYmdoUUhXckRrT2s

The table is dump, and it is indexed on Money_Trade__ asc, [Primary] desc (index is named ix).
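If you want to query the dump directly rather than through my tool, something like this works with Python's built-in sqlite3 module (a quick sketch; adjust the file name to wherever you saved the download):

Code:
import sqlite3

conn = sqlite3.connect('mtgox-dump.db')  # whatever you named the downloaded file
cur = conn.cursor()

# Primary USD trades in chronological order; the ix index covers this ordering.
cur.execute("SELECT Date, Price, Amount FROM dump "
            "WHERE Currency__ = 'USD' AND [Primary] = '1' "
            "ORDER BY Money_Trade__ ASC LIMIT 10")
for date, price, amount in cur.fetchall():
    print date, price, amount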

Sorry if you're annoyed that you spent time and bandwidth downloading the data using my app, however it was my belief that MtGox would complete their side of the service so that my app would then be used to maintain and update a local copy of the database in any format. Then you wouldn't have had to rely on a third party maintaining an up-to-date copy, giving you complete control over your backtesting.

I don't think this database will be compatible with the app unfortunately, so you won't be able to export it to different formats, and I don't plan on keeping it up to date, but at least it's better than the old May database and works. Also, sorry that I've been rather less active as of late, I've got some pretty big commitments currently and, as such, I unfortunately have far less time to do stuff like this Sad

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
toniquone
Newbie
*
Offline Offline

Activity: 4
Merit: 0


View Profile
December 13, 2013, 09:42:32 AM
 #70

Big thanks, Nitrous! Smiley
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 13, 2013, 10:38:47 AM
 #71

Big thanks, Nitrous! Smiley

No problem Smiley

Also, for anyone who does want to maintain their own up to date local database and has managed to get my original app to work, siergiej has created a node.js script which you may find useful -- https://gist.github.com/siergiej/5971230 -- it should update the database in the correct format, but please note that it will probably not be compatible with the dump I just posted.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
poikkeus
Member
**
Offline Offline

Activity: 73
Merit: 10


View Profile
December 15, 2013, 01:12:21 PM
 #72

I've made a trade data downloader in Java for my upcoming trading bot. I could modify it to only download trading data from the date you want and write a CSV file. The files could then be updated by running it again.

Or if you are a Java coder you can do it yourself: https://github.com/jussirantala/goxbot
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 16, 2013, 09:33:55 PM
 #73

Hi all,

VERY interesting thread here, since I've wanted some reliable tick data for bitcoin for a long time. I must say that I am absolutely no programmer - so I can't really appreciate all the great work Nitrous, Poikkeus and others have done, but it sounds really good and somehow I will get it running - hopefully Grin

Could someone help me understand this better:

I saw that mt-gox at Google BigQuery consists of 2 tables: trades and trades_raw. What is the difference between them?
Is it possible to somehow convert the dump into a .csv file?
The tool from poikkeus has one file, history.dat - can I download this and (same question) somehow get it into a .csv file?

Greetings to you all...

 

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 16, 2013, 11:25:35 PM
Last edit: December 16, 2013, 11:37:18 PM by nitrous
 #74

Hi all,

VERY interesting thread here, since I've wanted some reliable tick data for bitcoin for a long time. I must say that I am absolutely no programmer - so I can't really appreciate all the great work Nitrous, Poikkeus and others have done, but it sounds really good and somehow I will get it running - hopefully Grin

Could someone help me understand this better:

I saw that mt-gox at Google BigQuery consists of 2 tables: trades and trades_raw. What is the difference between them?
Is it possible to somehow convert the dump into a .csv file?
The tool from poikkeus has one file, history.dat - can I download this and (same question) somehow get it into a .csv file?

Greetings to you all...

 

Hi BNO,

The 2 tables relate to a discussion I had with MagicalTux (CEO of MtGox). My tool only really works if the bigquery data is presorted, but that can't be guaranteed if data is repeatedly being inserted into the table due to the way BQ works. I asked MagicalTux if he could implement a system of two tables, where one is updated, and the other is a sorted version of the updated table, hence the two tables. Since no updates have occurred since May though, it doesn't make a difference currently. Long story short, there is no difference between the tables Tongue.

If you download the data using my tool, then there are export options which can convert into a few different formats, CSV included (you can also choose which data to use and even do candles). Obviously though, the tool can only get data up to May (and has been a bit buggy lately). As far as I know, there aren't any other tools that can give you bulk csv data, although you might want to look at http://wizb.it which may at some point provide a working data service. I believe there are some charting websites out there though which can give you csv data, though with much lower resolution.

Please note that my tool isn't compatible with the full dump I just posted, and so this cannot be used to generate CSV. I might reformat it into a compatible version though seeing as there's quite a lot of demand for this data.

Sorry that your options seem a little sparse currently, if I had more time I'd love to develop a proper tool or set up a web service to make it easy to get data for any exchange, I know there's demand for it. I think I might make this a side project of mine, though it'll probably be quite slow.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 16, 2013, 11:42:25 PM
 #75

The tool from poikkeus has one file history.dat can i download this and - same question - get this somehow into a .csv file?

I haven't looked at his source code yet, but looking at the contents of history.dat it looks to be in a binary format, which means you'll need a script to convert it. Perhaps poikkeus could add conversion to CSV to his tool? I'm sure if not someone else could quickly hack up a conversion script, assuming the format is simple enough (it looks like it's a java object dump).

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
Exther
Newbie
*
Offline Offline

Activity: 45
Merit: 0


View Profile
December 17, 2013, 03:10:38 PM
 #76

I just tried the latest Windows binary (without status bar) and can confirm it has successfully exported 3600 second candles. I'll try some other values as well and report if I encounter any errors. Win7/64.
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 17, 2013, 04:10:53 PM
Last edit: December 18, 2013, 05:17:27 PM by nitrous
 #77

I just tried the latest Windows binary (without status bar) and can confirm it has successfully exported 3600 second candles. I'll try some other values as well and report if I encounter any errors. Win7/64.


That's good Smiley (although I was sure that the dump I uploaded would be incompatible...) Anyway, in case it still is, I've just finished another dump, up to date for 2013-12-17 15:48:26.308886, which should be fully compatible (the difference with the first one is that the date column is now in ISO format rather than unix timestamp, and the primary column is [true, false] instead of [1, 0]).

Here are the two dumps:
1) https://docs.google.com/file/d/0B3hexlKVFpMpYmdoUUhXckRrT2s (2013-12-12 22:13:40.483107; 665.1 MB)
2) https://docs.google.com/file/d/0B3hexlKVFpMpdVhxd1NvQ0Fqanc (2013-12-17 15:48:26.308886; 899.2 MB) <--- fully compatible with my tool for export (don't try to update it though)

The size discrepancy is because the ISO date format (2) is stored as text while the unix timestamp (1) can be stored more compactly as an integer.
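If you already grabbed dump #1 and just want ISO-style dates, the conversion is simple enough (a sketch, mirroring the process_date helper in 100x's script above):

Code:
import calendar
import datetime

def unix_to_iso(ts):
    # e.g. 1386886420 -> '2013-12-12 22:13:40'
    return datetime.datetime.utcfromtimestamp(ts).isoformat(' ')

def iso_to_unix(s):
    # accepts '2013-12-12 22:13:40', with or without a trailing .microseconds part
    fmt = '%Y-%m-%d %H:%M:%S.%f' if '.' in s else '%Y-%m-%d %H:%M:%S'
    return calendar.timegm(datetime.datetime.strptime(s, fmt).timetuple())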

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 17, 2013, 04:38:12 PM
 #78

Hi nitrous,

i was just stopping by. Wanted to thank you for your answer first. Googled the last 2 hours to become a bit smarter on databases and how to get  this somehow done (didn't get far though Huh)

Do i understand you right, that  i could download the 2nd file and convert it with your tool? That would be awesome... Smiley


The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 17, 2013, 04:47:54 PM
 #79

Hi nitrous,

i was just stopping by. Wanted to thank you for your answer first. Googled the last 2 hours to become a bit smarter on databases and how to get  this somehow done (didn't get far though Huh)

Do i understand you right, that  i could download the 2nd file and convert it with your tool? That would be awesome... Smiley



Hi, yep, you should be able to just load it in and export Smiley

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 17, 2013, 04:54:10 PM
 #80

Hi,

thanks a lot, I'm already downloading and will post when I'm done.

Some things I wanted to ask you - since you have lots of experience with this data now.
How would you rate the data quality in terms of accuracy and "rightness"? Any known bugs or oddities? Is this data actually all the trades as they went through on MtGox? Anything to consider if I want to use this data to develop a strategy (I need to read up first, to get better with this)?

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 18, 2013, 01:57:34 AM
 #81

Hi,

thanks a lot, I'm already downloading and will post when I'm done.

Some things I wanted to ask you - since you have lots of experience with this data now.
How would you rate the data quality in terms of accuracy and "rightness"? Any known bugs or oddities? Is this data actually all the trades as they went through on MtGox? Anything to consider if I want to use this data to develop a strategy (I need to read up first, to get better with this)?

The data should be an exact representation of what happened in real time, or at least a microsecond-resolution approximation. By that I mean that it is possible multiple trades were executed in the same tick, but for all intents and purposes this shouldn't affect your use of the data for strategy development.

There are a couple oddities with the Money_Trade__ values, but I don't think this will be particularly relevant to you. Otherwise, I think the data is relatively accurate.

It is also possible that there are some oddities around May 23rd 2013 and around today, December 17th, purely because I collected the data from 3 sources, and those were the boundaries - I'm fairly sure that there shouldn't be a problem, but if you want to be really safe then you can avoid those two days.
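If you do decide to drop those two days from a CSV export, a quick filter is all it takes (a sketch; it assumes the date is the first column in YYYY-MM-DD form, as in the tool's CSV export, and the file names are illustrative):

Code:
import csv

SKIP_DATES = ('2013-05-23', '2013-12-17')  # the two source boundaries

with open('mtgox_export.csv', 'rb') as src, open('mtgox_export_filtered.csv', 'wb') as dst:
    writer = csv.writer(dst)
    for row in csv.reader(src):
        # keep any row whose date (first column) is not one of the boundary days
        if row and row[0] not in SKIP_DATES:
            writer.writerow(row)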

Lastly, the Primary column -- some trades are duplicated into other orderbooks, and their duplicates will be marked with their Primary field as false. It should be easy to exclude this data from your exports though (and it should be excluded by default). Alternatively, you might want to include it as I think these non-primary trades can influence other currencies, but I'm not too sure about this -- if you're not sure, just exclude it.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 18, 2013, 05:09:10 PM
 #82

I downloaded the 2nd file (899.2 MiB, but on my computer it shows as 1.07 GiB) and exported it to a .csv file - that worked. Great! Today I was playing a little with the data and reading your manual (very well written, kudos), but many things are still very unclear to me:

What really confuses me is the whole complex of issues surrounding the timestamps. When I convert the file to a .csv, for example, the last entry looks like this when I open the .csv in Notepad:
 2010-07-17,23:09:17,0.049510,2000000000

When I import the same .csv into Excel and change the formatting of the cells to hh:mm:ss:ms (I could not find smaller units than milliseconds in Excel), Excel shows me this as the timestamp: 23:09:17:917

Several things are not clear to me here:
1. How can Excel display something which is not even contained in the .csv file? Did Excel just "make this up"?
2. In the thread it says that the data has microsecond accuracy, but when I look at the file in an editor it seems to be seconds. For example, the last 3 lines of the file are:

2013-12-17,15:47:30,715.700000,1210000
2013-12-17,15:47:30,715.700000,1000000
2013-12-17,15:47:30,715.700000,780000

3. Not so important, but maybe someone here knows / has experience with Excel displaying timestamps: as far as I can tell, Excel (I use 2010) is not able to display a resolution better than milliseconds. But some of the timestamps in my sheet have 4 digits after the seconds, e.g. 18.07.2013 17:48:56:4856
 - how is that even possible?

Quote
It is also possible that there are some oddities around May 23rd 2013 and around today December 17th, purely because I collected the data from 3 sources, and those were the boundaries - I'm fairly sure that there shouldn't be a problem, but if you wan't to be really safe then you can avoid those two days.

If I understand you right, this file is a combination of 3 data sources. The 3 data sources are not mashed up in a day-by-day fashion but more like this:
from 07/17/2010: data source 1 (Mark Karpeles?)
from 05/23/2013: data source 2 (API from MtGox? Bitcoincharts?)
from 12/17/2013: data source 3?

So should I cut out just May 23rd and Dec 17th, or also +- some days before and after?

Quote
There are a couple oddities with the Money_Trade__ values, but I don't think this will be particularly relevant to you. Otherwise, I think the data is relatively accurate.

Before I posted here I downloaded the files from Google BigQuery. I noticed then that there are quite large jumps in the trade IDs. Are you referring to that, or could it be that with the change of the "primary key" of the database after trade ID 218868 things got messed up? Probably not, right? I mean, they closed the exchange for 6 days back then to set everything up properly..





The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 18, 2013, 05:39:49 PM
 #83

...

Hmm, some really strange stuff happening... Firstly, showing up as 1.07 GB - I just realised that the tool will create a duplicate index, I should have thought about that. It's not really a problem except for taking a few minutes the first time you load it up and increasing the file size more than necessary, otherwise though it shouldn't affect your usage.

Yeah sorry about that - the format I was asked to support only went down to second resolution, so the microsecond resolution isn't present. In fact, microsecond resolution isn't even available for the first 218868 ticks. Unfortunately there's so many different possible formats I could export to, so I picked a few and stuck with them (although obviously the dump contains all the raw data unfiltered). If there's one you really want then I could release a new version of the tool with that supported, but if you want to manipulate the data into more formats or more than you can do in Excel consider playing around with Python and seeing what you can do with it Smiley

Excel is being very weird - if you notice, it's taking the minute and seconds and converting them into a new millisecond value for some very strange reason, such as 17:48:56 -> 17:48:56:4856. There's no imperative for it to do this.

Yes, the first data source is the Google BigQuery database, the second is the MtGox HTTP API, and the third is the MtGox Socket API -- basically the socket API is used to just collect the last few trades in real time. If you're going to cut them out, then you can cut out the last few minutes, half an hour to be safe, of the data, and just the one day on May 23rd (although really there shouldn't be any discrepancy, the data should be exact).

Yes, the large jump is because MtGox changed recording format -- Money_Trade__, otherwise known as the TID/Trade ID, used to be a (mostly) sequential integer, then it became a microsecond timestamp afterwards (coinciding with that closure, I believe). It doesn't make a difference to the data though, all ticks should still be present, it's just a curiosity. Of course, this discontinuity will probably have some effect on the prices around that time, so you might want to exclude a few days ± around that point for that reason as it might mess up your backtesting.
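Incidentally, if you ever want to read one of the later-style TIDs as a date, it's just a unix timestamp in microseconds (a quick sketch; only meaningful for TIDs after the 218868 cutover):

Code:
import datetime

TID_CUTOVER = 218868  # at or below this, the TID is just a sequence number

def tid_to_datetime(tid):
    tid = int(tid)
    if tid <= TID_CUTOVER:
        raise ValueError('early sequential TID, carries no timestamp')
    return (datetime.datetime.utcfromtimestamp(tid // 1000000)
            + datetime.timedelta(microseconds=tid % 1000000))

# e.g. tid_to_datetime(1366547913512370) -> 2013-04-21 12:38:33.512370 (UTC)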

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 18, 2013, 10:17:06 PM
 #84

Hi nitrous,

Since my charting software didn't handle the Date and Time as two separate columns very well, I recalculated them in another column as Date+Time, and then saved this as .csv. At first glance it looks good. But is there any problem with this approach? Should I beware of something (like leap years and other stuff that could fuck up the calculation)...

Quote
It doesn't make a difference to the data though, all ticks should still be present, it's just a curiosity. Of course, this discontinuity will probably have some effect on the prices around that time, so you might want to exclude a few days ± around that point for that reason as it might mess up your backtesting.

I don't know if I got you right: you mean the data is correct, but the closing of the exchange for almost a week will have had its impact on the prices as people got scared - yet with the data itself all is good?

I go to bed see you tomorrow...

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 18, 2013, 11:07:54 PM
 #85

Hi nitrous,

Since my charting software didn't handle the Date and Time as two separate columns very well, I recalculated them in another column as Date+Time, and then saved this as .csv. At first glance it looks good. But is there any problem with this approach? Should I beware of something (like leap years and other stuff that could fuck up the calculation)...

Quote
It doesn't make a difference to the data though, all ticks should still be present, it's just a curiosity. Of course, this discontinuity will probably have some effect on the prices around that time, so you might want to exclude a few days ± around that point for that reason as it might mess up your backtesting.

I don't know if I got you right: you mean the data is correct, but the closing of the exchange for almost a week will have had its impact on the prices as people got scared - yet with the data itself all is good?

I go to bed see you tomorrow...

Hi BNO,

I don't think so -- if you calculated the date+time as just the concatenation of the two, it should be fine as I created the columns directly from the original unix timestamp (in dump #1) using a decent conversion function. How did you compute the column? I guess considering the weirdness from excel before, it is possible the excel function you used might not work well - hopefully it will be consistent though.
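If Excel keeps being awkward, you could also do that combination outside of it; a quick sketch (assuming the tool's CSV export with date,time,price,amount columns, as in your earlier example, and illustrative file names):

Code:
import csv

with open('mtgox_export.csv', 'rb') as src, open('mtgox_export_datetime.csv', 'wb') as dst:
    writer = csv.writer(dst)
    for row in csv.reader(src):
        # merge the separate date and time columns into one 'YYYY-MM-DD HH:MM:SS' field
        writer.writerow([row[0] + ' ' + row[1]] + row[2:])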

Yeah, anytime an exchange stops or starts, even for a few hours, the data either side will probably be a bit skewed -- both for emotional reasons (like being scared), but also mainly just as the price readjusts through arbitrage to the market price elsewhere (or at least, this is what I would expect).

Cya

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 21, 2013, 01:01:51 AM
Last edit: December 21, 2013, 01:33:20 AM by BNO
 #86

Hi Nitrous,

wanted to write you yesterday but didn't find the time to do so..

One problem occurred in the price field when I looked at the data:

I remember that in the BigQuery table you had to divide the price field by 10,000. Here something seems to have gone wrong - do you know what?

I divided the Volume field by 100,000,000 to adjust it to full coins; might something similar have happened with this field too?

Bye

Edit: I just saw that the .csv your script created is fine, so it must have happened during the import into Excel. Do you have an idea what might have caused this? Huh

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 21, 2013, 07:58:27 PM
 #87

Hi Nitrous,

wanted to write you yesterday but didn't find the time to do so..

One problem occurred in the price field when I looked at the data:
...

I remember that in the BigQuery table you had to divide the price field by 10,000. Here something seems to have gone wrong - do you know what?

I divided the Volume field by 100,000,000 to adjust it to full coins; might something similar have happened with this field too?

Bye

Edit: I just saw that the .csv your script created is fine, so it must have happened during the import into Excel. Do you have an idea what might have caused this? Huh

Hi BNO, Excel gets stranger and stranger!  Huh

I just did a test CSV export myself and the CSV is definitely fine, I'm not sure how or why excel is mangling the data like this. Dividing one column by 1e8 should not affect any other column, especially as inconsistently as the price column seems to be being affected :S What is the exact procedure by which you imported the data into excel?

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 21, 2013, 08:02:01 PM
 #88

I just found this resource and I thought it would be useful to some people Smiley -- http://api.bitcoincharts.com/v1/csv/

I can't believe I didn't find it before, anyway, it has regularly updated (15mins?) CSV files with {unix timestamp (1s resolution), price (float), volume (float)} fields for many different exchanges and currencies.
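Those files are plain, headerless CSV, so they're trivial to read; for example (a quick sketch, with an illustrative file name):

Code:
import csv

count = 0
last = None
with open('mtgoxUSD.csv', 'rb') as f:   # one of the per-exchange files from the link above
    for row in csv.reader(f):
        # each row is: unix timestamp, price (float), volume (float)
        ts, price, volume = int(row[0]), float(row[1]), float(row[2])
        count += 1
        last = (ts, price, volume)
print count, 'trades, most recent:', last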

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 22, 2013, 12:43:08 PM
 #89

Hi,

I played with this. It was my fault, sorry if I caused extra work. The explanation of what the exact problem was might be interesting for people from Europe (like me, I'm from Germany) importing this into Excel.

The settings have to be like this:

The "decimal separator" (that's what it's called in the German version, translated to English) has to be "."

In my defence I have to say it wasn't obvious to me that this could have caused the error, and this was the real trick: the combination of the "1000s separator" setting and the "decimal separator" setting led to the "weird" effect that all numbers below 1 were shown correctly but all numbers above 1 were not. Finally realising this led me to the point where I thought: "Hm, OK, it shouldn't be something with the decimal separator since the numbers at the beginning are right, but let's check it again."

Sorry for wasting your time...

Greetings.

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 22, 2013, 12:50:53 PM
 #90

Hi,

I played with this. It was my fault, sorry if I caused extra work. The explanation of what the exact problem was might be interesting for people from Europe (like me, I'm from Germany) importing this into Excel.

The settings have to be like this:
...

The "decimal separator" (that's what it's called in the German version, translated to English) has to be "."

In my defence I have to say it wasn't obvious to me that this could have caused the error, and this was the real trick: the combination of the "1000s separator" setting and the "decimal separator" setting led to the "weird" effect that all numbers below 1 were shown correctly but all numbers above 1 were not. Finally realising this led me to the point where I thought: "Hm, OK, it shouldn't be something with the decimal separator since the numbers at the beginning are right, but let's check it again."

Sorry for wasting your time...

Greetings.

Hi BNO,

Haha, no don't worry about wasting my time, that was really confusing! That makes a lot of sense now, although now I'm not sure how Excel managed to distinguish between the columns considering they use a comma for the column separator... RFC should really standardise the csv number format as this wasn't very obvious. Anyway, I'm glad you've fixed the problem now, and hopefully this will be useful for other people Smiley

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 22, 2013, 01:46:12 PM
 #91

Quote
RFC should really standardise the csv number format as this wasn't very obvious. Anyway, I'm glad you've fixed the problem now, and hopefully this will be useful for other people

Another thing that is logical in hindsight but a bit confusing: Excel automatically changes the "." to a "," as the decimal separator when you do it as described above, because that is the country setting (in Europe you use "," to separate decimals). If you now save the file again as .csv, Excel does not change the "," back to "." - it leaves the ",". This would lead to the problem you mentioned, that Excel could confuse columns, but then it's smart and uses ";" as the column separator instead of commas, which is well thought out by MS - but one can be confused when importing into the charting software and then think: "But wtf, why is comma not working, I saved this as Comma Separated..."

Now we have been through it all with Excel, eh Wink
It should all be good now, but I don't have time to check it out right now, since I have to pack my stuff quickly for "Driving home for Christmas", and there I will probably be pretty busy...

Which brings me to the point: Merry X-mas to you, nitrous, and to all of this board

till later!

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 23, 2013, 01:32:06 AM
 #92

Quote
RFC should really standardise the csv number format as this wasn't very obvious. Anyway, I'm glad you've fixed the problem now, and hopefully this will be useful for other people

Another thing that is logical in hindsight but a bit confusing: Excel automatically changes the "." to a "," as the decimal separator when you do it as described above, because that is the country setting (in Europe you use "," to separate decimals). If you now save the file again as .csv, Excel does not change the "," back to "." - it leaves the ",". This would lead to the problem you mentioned, that Excel could confuse columns, but then it's smart and uses ";" as the column separator instead of commas, which is well thought out by MS - but one can be confused when importing into the charting software and then think: "But wtf, why is comma not working, I saved this as Comma Separated..."

Now we have been through it all with Excel, eh Wink
It should all be good now, but I don't have time to check it out right now, since I have to pack my stuff quickly for "Driving home for Christmas", and there I will probably be pretty busy...

Which brings me to the point: Merry X-mas to you, nitrous, and to all of this board

till later!

Thanks, Merry Christmas to you too Cheesy

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
gigica viteazu`
Sr. Member
****
Offline Offline

Activity: 458
Merit: 250

beast at work


View Profile
December 29, 2013, 08:12:52 PM
 #93

The (Windows version of the) tool crashes after downloading about 100 MB of data, and the crash occurs every time at the same spot.

I have tried several times; it doesn't matter if I'm downloading a new dump or just trying to resume one.

Code:
Rows downloaded: 1024000
Latest TID: 1366547913512370
Data up to: 2013-04-21 12:38:33

Update in progress - 5074502 rows to download
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 29, 2013, 09:09:07 PM
 #94

The (Windows version of the) tool crashes after downloading about 100 MB of data, and the crash occurs every time at the same spot.

I have tried several times; it doesn't matter whether I'm downloading a new dump or just trying to resume one.

Code:
Rows downloaded: 1024000
Latest TID: 1366547913512370
Data up to: 2013-04-21 12:38:33

Update in progress - 5074502 rows to download


A few other people seem to be having this problem (see the last couple pages of this thread). I'm not entirely sure why it's happening, but I think Google have changed their protocols subtly and it's broken the tool. As a result, I've created some full dumps, including data all the way up to mid-December 2013 here.

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
gigica viteazu`
Sr. Member
****
Offline Offline

Activity: 458
Merit: 250

beast at work


View Profile
December 29, 2013, 09:14:15 PM
 #95

thank you
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 30, 2013, 10:22:25 AM
 #96

How was the timestamp of the data created again? I know I read it somewhere but don't remember. Was it a UTC timestamp, and is there a problem with daylight saving time in Japan? As far as I understand, UTC is a constant "world time" and all time zones on Earth are derived from UTC.

Is anything known about problems with the timestamps, for example Mark K. changing (for whatever reason) the time settings of the servers, or something like that?

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 30, 2013, 12:47:27 PM
 #97

How was the timestamp of the data created again? I know I read it somewhere but don't remember. Was it a UTC timestamp, and is there a problem with daylight saving time in Japan? As far as I understand, UTC is a constant "world time" and all time zones on Earth are derived from UTC.

Is anything known about problems with the timestamps, for example Mark K. changing (for whatever reason) the time settings of the servers, or something like that?

Have you found an inconsistency?

I'm pretty sure they should be consistent with UTC; I take the timestamps directly from what each of the APIs returns, and I don't think there are any problems. If you think there are, though, tell me which ones and maybe I can have a quick look. If you're just speculating, then you should be fine. UTC is pretty easy to get right; his servers should just automatically sync with a reliable time server to get reliable UTC.
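For what it's worth, the recent TIDs in the dump appear to be microsecond unix timestamps (the TID and the "Data up to" line in the crash report a few posts up line up exactly), so anyone can sanity-check them against UTC themselves. A quick Python sketch using that TID:

Code:
from datetime import datetime

tid = 1366547913512370                       # TID from the crash report above (microseconds)
print(datetime.utcfromtimestamp(tid / 1e6))
# -> 2013-04-21 12:38:33.512370, matching "Data up to: 2013-04-21 12:38:33"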

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 30, 2013, 10:00:51 PM
 #98

Quote
Have you found an inconsistency?

No, I was just thinking about it, since it's very destructive if you look at trading data and don't realise that the data might be shifted 1 or 2 hours into the future.

Quote
UTC is pretty easy to get right, his servers should just automatically sync with a reliable timeserver in order to get reliable UTC.

I hope/guess you are right, but when I look at Mark Karpeles' "track record", I'm not sure. What he has fucked up in the last two years is an achievement in itself.

Good night, it's late here...


The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 30, 2013, 11:27:32 PM
 #99

Quote
Have you found an inconsistency?

No, I was just thinking about it, since it's very destructive if you look at trading data and don't realise that the data might be shifted 1 or 2 hours into the future.

Quote
UTC is pretty easy to get right, his servers should just automatically sync with a reliable timeserver in order to get reliable UTC.

I hope/guess you are right, but when I look at Mark Karpeles' "track record", I'm not sure. What he has fucked up in the last two years is an achievement in itself.

Good night, it's late here...


Hmm, well the last timestamp from the 2nd dump is consistent with the time I made it, so it looks like recent data is at least consistent with UTC. I suppose the only advice I could give is to do a dry run of any algorithm you develop first, just to see whether it makes the right choices in a period of, say, a week, before risking any real assets.

Personally I've been looking at other exchanges more recently, so you might want to consider other exchanges too, especially as you're worried about whether you can trust Mark. I'm quite interested in Kraken as it looks like it has a lot of potential, but I'm primarily using Bitstamp at the moment. There are published CSVs for many exchanges here: http://api.bitcoincharts.com/v1/csv/, although obviously you then have to trust bitcoincharts to have collected the data properly (I haven't analysed their data yet, but I assume it's fairly reliable; from a quick look the Bitstamp one definitely seems to include the major price spike at roughly the right time).
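If it helps, the files there are plain CSVs with three columns per row (unix timestamp, price, amount, as far as I can tell), so loading one in Python only takes a few lines. A minimal sketch, assuming you grabbed the Bitstamp file:

Code:
import csv
from datetime import datetime

trades = []
with open("bitstampUSD.csv", newline="") as f:    # downloaded from the link above
    for unixtime, price, amount in csv.reader(f):
        trades.append((datetime.utcfromtimestamp(int(unixtime)),
                       float(price), float(amount)))

print(trades[0])   # first recorded trade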

Good night Smiley

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
December 31, 2013, 11:45:15 AM
 #100

Quote
Hmm, well the last timestamp from the 2nd dump is consistent with the time I made it, so it looks like recent data is at least consistent with UTC.

I'll just trust this data; I hope I won't regret it one day.

Quote
Personally I've been looking at other exchanges more recently, so you might want to consider other exchanges too, especially as you're worried about whether you can trust Mark. I'm quite interested in Kraken as it looks like it has a lot of potential, but I'm primarily using Bitstamp at the moment. There are published CSVs for many exchanges here: http://api.bitcoincharts.com/v1/csv/, although obviously you then have to trust bitcoincharts to have collected the data properly (I haven't analysed their data yet, but I assume it's fairly reliable; from a quick look the Bitstamp one definitely seems to include the major price spike at roughly the right time).


Oh, when I became interested in BTC again this year, one thing I was really clear about was: I won't put money into MtGox again. And I was happy for Bitcoin that there are now three big USD/BTC exchanges: btc-e, Bitstamp and MtGox. I tried Bitstamp (really like the look and feel of the site, customer service replies fast, cashing out has been no problem so far), btc-e (service is a bit harsh) and Bitfinex. I have to say Kraken looks nice; maybe I'll give it a shot. Volume is probably low though, and prices are a bit higher; any withdrawal problems?

The reason why I want to use MtGox data is that no one else goes back so far. And since MtGox has a significantly higher price than other exchanges, it's maybe not so wise to mix data between exchanges.


Quote

Did I already say it? That's a good link. Thanks.

Do you know anything about the timestamps with Bitstamp? I just downloaded the data and assumed unix timestamps. I converted them like this: A1/(24*3600)+25569, which gives me as the first trade:
13.09.2011 13:53:36

Is it right that Bitstamp was already around back then, in 2011? Wow, I should have known that back then, it would maybe have saved me some money  Wink

The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 31, 2013, 03:36:05 PM
 #101

...

Oh, when I became interested in BTC again this year, one thing I was really clear about was: I won't put money into MtGox again. And I was happy for Bitcoin that there are now three big USD/BTC exchanges: btc-e, Bitstamp and MtGox. I tried Bitstamp (really like the look and feel of the site, customer service replies fast, cashing out has been no problem so far), btc-e (service is a bit harsh) and Bitfinex. I have to say Kraken looks nice; maybe I'll give it a shot. Volume is probably low though, and prices are a bit higher; any withdrawal problems?

The reason why I want to use MtGox data is that no one else goes back so far. And since MtGox has a significantly higher price than other exchanges, it's maybe not so wise to mix data between exchanges.

...

Do you know anything about the timestamps with Bitstamp? I just downloaded the data and assumed unix timestamps. I converted them like this: A1/(24*3600)+25569, which gives me as the first trade:
13.09.2011 13:53:36

Is it right that Bitstamp was already around back then, in 2011? Wow, I should have known that back then, it would maybe have saved me some money  Wink

That's fair enough. You should probably be careful about using strategies developed for MtGox with other exchanges. Simple strategies are probably fine, but you might find subtle differences. And these will probably be amplified for more complex strategies. You should be able to compensate for this with some minor tweaks though, so using MtGox data initially should still be fine.

Yeah, Kraken is quite new. Sometimes their 24h XBTEUR volume peaks above 1000 though. I think there have been a few introductory problems as they've grown in popularity, but I think they're getting through this now.

Yes, they are unix timestamps and that was their first trade (or, at least, the first one recorded by bitcoincharts).
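For the record, your Excel formula and a straight Python conversion agree. A quick sketch of the same check (the timestamp below is simply the value that round-trips to the date you quoted, not something I've looked up in the file):

Code:
from datetime import datetime

ts = 1315922016                      # unix time corresponding to the date quoted above

# Excel-style serial date, where 25569 is the serial for 1970-01-01
excel_serial = ts / (24 * 3600) + 25569
print(excel_serial)                  # ~40799.58, i.e. 13.09.2011 13:53:36 in Excel

# Direct conversion in Python gives the same wall-clock time in UTC
print(datetime.utcfromtimestamp(ts))   # 2011-09-13 13:53:36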

Haha, hindsight is a wonderful thing Wink

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
kyotoku
Newbie
*
Offline Offline

Activity: 32
Merit: 0


View Profile
December 31, 2013, 08:04:22 PM
 #102

Hi nitrous,
I tried running the app on OS X 10.6.8 (Snow Leopard), but it crashes after I select the dump file name (File -> New dump). It prints "segmentation fault" at the prompt.

I was able to track the error down to the self.update() call at the end of the Application.load_dump function in the gui.py file (line 217). It's actually inconsistent: when I place some print statements around the update() call, it works some of the time. Anyway, it ran fine when I commented the self.update() line out. I think this function refreshes the window after the update_button state is set to normal, doesn't it? I don't have a better idea of what it could be, but I am happy to test on OS X 10.6.8 if you have any ideas.

If anyone is interested, I posted a version with the alteration described above:
https://www.dropbox.com/sh/cyql48n834n9sfq/VWuOLxYA79

Cheers
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
December 31, 2013, 09:25:00 PM
 #103

Hi nitrous,
I tried running the app on OS X 10.6.8 (Snow Leopard), but it crashes after I select the dump file name (File -> New dump). It prints "segmentation fault" at the prompt.

I was able to track the error down to the self.update() call at the end of the Application.load_dump function in the gui.py file (line 217). It's actually inconsistent: when I place some print statements around the update() call, it works some of the time. Anyway, it ran fine when I commented the self.update() line out. I think this function refreshes the window after the update_button state is set to normal, doesn't it? I don't have a better idea of what it could be, but I am happy to test on OS X 10.6.8 if you have any ideas.

If anyone is interested, I posted a version with the alteration described above:
https://www.dropbox.com/sh/cyql48n834n9sfq/VWuOLxYA79

Cheers

Ahh, thank you for posting a working version. Yeah, I used Tkinter but it doesn't work well with multithreading, hence the numerous errors we've all encountered with this. It doesn't seem to work with Mavericks either. Unfortunately, I don't really have time to maintain this anymore, but hopefully it should continue to work with the dump I posted for now.
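In case anyone wants to patch it properly: the usual workaround (a generic sketch, not how the tool is actually structured) is to never touch Tk widgets from the worker thread, and instead have the worker push results onto a Queue that the main thread polls with after():

Code:
import threading
import queue
import tkinter as tk

root = tk.Tk()
status = tk.Label(root, text="idle")
status.pack()
results = queue.Queue()

def worker():
    # The long-running download work would go here; the thread only touches
    # the queue, never the Tk widgets themselves.
    results.put("download finished")

def poll_queue():
    try:
        msg = results.get_nowait()
    except queue.Empty:
        pass
    else:
        status.config(text=msg)        # safe: this runs on the Tk main thread
    root.after(100, poll_queue)        # poll again in 100 ms

threading.Thread(target=worker, daemon=True).start()
root.after(100, poll_queue)
root.mainloop()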

Thanks again for the new update Smiley!

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
January 17, 2014, 04:15:13 PM
 #104

Hi all,

at one point nitrous posted this link:

Quote

Can anyone comment on the data quality/accuracy and known problems/oddities of the bitcoincharts data for MtGox, Bitstamp and btc-e?
Nitrous, do you have an opinion about this?

Greetings to all of you!


The thinking that has led us to this point will not lead beyond - Albert Einstein
nitrous (OP)
Sr. Member
****
Offline Offline

Activity: 246
Merit: 250


View Profile
January 18, 2014, 12:48:52 PM
 #105

Hi all,

at one point nitrous posted this link:

Quote

Can anyone comment on the data quality/accuracy and known problems/oddities of the bitcoincharts data for MtGox, Bitstamp and btc-e?
Nitrous, do you have an opinion about this?

Greetings to all of you!



Difficult to tell; it seems good, but I think you'll need to judge that for yourself. I have only looked at the Bitstamp data, so I can't comment on the other exchanges, but for Bitstamp the one oddity I found is that the trade times are given with second resolution, so it's difficult to tell the order when multiple trades happened within the same second. It's likely that they are given in the correct order, but I'm taking a VWAP of each second to avoid any potential problems with that.
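In case it helps anyone, the per-second VWAP is only a few lines of Python over one of those three-column bitcoincharts files (rough sketch, file name assumed):

Code:
import csv
from collections import defaultdict

# second -> [sum(price * amount), sum(amount)]
sums = defaultdict(lambda: [0.0, 0.0])

with open("bitstampUSD.csv", newline="") as f:
    for unixtime, price, amount in csv.reader(f):
        price, amount = float(price), float(amount)
        sums[int(unixtime)][0] += price * amount
        sums[int(unixtime)][1] += amount

vwap = {sec: pv / vol for sec, (pv, vol) in sums.items() if vol > 0}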

Donations: 1Q2EN7TzJ6z82xvmQrRoQoMf3Tf4rMCyvL
MtGox API v2 Unofficial Documentation: https://bitbucket.org/nitrous/mtgox-api/overview
MtGox API v2 Unofficial Documentation Forum Thread: https://bitcointalk.org/index.php?topic=164404.0
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
January 19, 2014, 06:51:33 PM
 #106

Hi nitrous,

good to hear from you again.
OK, that sounds good with regard to the quality. Even if the order within the one-second interval is mixed up, it's not that bad; it's just a second, I guess.

OK, I'll have a look for myself and post if I find something.

If anyone else has some more info on this, don't be shy  Wink


Greetings to all of ya...

The thinking that has led us to this point will not lead beyond - Albert Einstein
toniquone
Newbie
*
Offline Offline

Activity: 4
Merit: 0


View Profile
January 20, 2014, 07:32:09 AM
 #107

I see only one problem with the bitcoincharts data: there is no bid/ask column  Sad
Otherwise the data seems accurate enough.
BNO
Full Member
***
Offline Offline

Activity: 157
Merit: 100


View Profile
January 20, 2014, 10:20:57 AM
 #108

Quote
I see only one problem with the bitcoincharts data: there is no bid/ask column  Sad
Otherwise the data seems accurate enough.


Well, bid/ask is a whole new story. If you want the whole order book at any given point in time, that is a gigantic amount of data. That's not manageable, I guess. Even for the stock market, this is not commonly used.

The thinking that has led us to this point will not lead beyond - Albert Einstein
toniquone
Newbie
*
Offline Offline

Activity: 4
Merit: 0


View Profile
January 26, 2014, 10:02:57 AM
 #109

Quote
Well, bid/ask is a whole new story. If you want the whole order book at any given point in time, that is a gigantic amount of data. That's not manageable, I guess. Even for the stock market, this is not commonly used.
I don't know about the whole order book, but nitrous' db has this column (not included in the standard export), and it's very handy for backtesting: the strategy picks a more correct price for buy/sell actions.
blankia
Newbie
*
Offline Offline

Activity: 27
Merit: 0


View Profile
March 11, 2014, 09:55:25 AM
 #110

I'm getting an error when loading the dump:

Traceback (most recent call last):
  File "app.py", line 97, in thread_bootstrao
  File "mtgox.pyc", line 120, in run
  File "bq.pyc", line 103, in gen2
KeyError: 'pageToken'
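In case it helps with debugging: the traceback suggests the paging loop indexes response['pageToken'] directly, but BigQuery only includes pageToken in the response while there are more pages left. A defensive pattern (hypothetical sketch against the standard Google API client, not the tool's actual bq code) would be:

Code:
def fetch_all_rows(service, project, dataset, table):
    """Page through tabledata.list, stopping when no pageToken is returned."""
    request = service.tabledata().list(projectId=project,
                                       datasetId=dataset,
                                       tableId=table)
    while True:
        response = request.execute()
        for row in response.get("rows", []):
            yield row
        page_token = response.get("pageToken")   # .get() instead of ['pageToken']
        if not page_token:
            break
        request = service.tabledata().list(projectId=project,
                                           datasetId=dataset,
                                           tableId=table,
                                           pageToken=page_token)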

BitOnyx
Member
**
Offline Offline

Activity: 112
Merit: 10

Cryptocurrencies Exchange


View Profile WWW
March 11, 2014, 10:00:33 AM
 #111

I'm getting an error when loading the dump:

Traceback (most recent call last):
  File "app.py", line 97, in thread_bootstrao
  File "mtgox.pyc", line 120, in run
  File "bq.pyc", line 103, in gen2
KeyError: 'pageToken'



maybe because mt.gox is down...

intron
Sr. Member
****
Offline Offline

Activity: 427
Merit: 251


- electronics design|embedded software|verilog -


View Profile
March 11, 2014, 02:39:53 PM
 #112

I'm getting an error when loading the dump:

Traceback (most recent call last):
  File "app.py", line 97, in thread_bootstrao
  File "mtgox.pyc", line 120, in run
  File "bq.pyc", line 103, in gen2
KeyError: 'pageToken'



maybe because mt.gox is down...

LOL:-)
blankia
Newbie
*
Offline Offline

Activity: 27
Merit: 0


View Profile
March 11, 2014, 04:54:41 PM
 #113

I'm getting an error when loading the dump:

Traceback (most recent call last):
  File "app.py", line 97, in thread_bootstrao
  File "mtgox.pyc", line 120, in run
  File "bq.pyc", line 103, in gen2
KeyError: 'pageToken'



maybe because mt.gox is down...

LOL:-)

Hahaha nice one Smiley Thanks for pointing that out. I thought the old data was still hosted on Google BigQuery?
VinCeCream
Member
**
Offline Offline

Activity: 89
Merit: 10


View Profile
June 09, 2014, 10:11:54 PM
 #114

I'm getting an error when loading the dump:

Traceback (most recent call last):
  File "app.py", line 97, in thread_bootstrao
  File "mtgox.pyc", line 120, in run
  File "bq.pyc", line 103, in gen2
KeyError: 'pageToken'



maybe because mt.gox is down...

EPIC!
