Bitcoin Forum
Author Topic: Artificial Neural Network & Genetic Algorithm Library For Deja Vu  (Read 4622 times)
BadBitcoin (James Sutton)
Donator
Sr. Member
Offline

Activity: 452
Merit: 252



February 18, 2014, 03:31:46 PM
 #21

Quote
For any incredibly complex system like the bitcoin market it's possible to forecast market conditions with only one network, but holy crap is it inadvisable.
My plan wasn't to train just one net and then use it to trade for me. My plan was to train multiple nets, use them to create a committee of machines, and average the outputs from all of them. Associative neural networks seem to be an extension of that basic concept, but I'm not really sure of the exact details. Autoassociative neural networks seem to be a whole other thing and I haven't read much about how they work either. Apparently the Hopfield network is one example of an autoassociative neural network, but I'd have to read more about it to understand how it works. However, I did read that one of the main uses of the Hopfield network is to pick out patterns from noisy data, so I can see why you'd think that was a good choice.
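The committee-of-machines idea (average the outputs of several independently trained nets) can be sketched like this; the `Net` class is a hypothetical stand-in for a trained network, not the actual library code:

```python
# Sketch of a committee of machines: train several nets independently,
# then average their outputs. The Net class is a hypothetical stand-in
# for a trained network.
class Net:
    def __init__(self, weights):
        self.weights = weights

    def predict(self, inputs):
        # Stand-in forward pass: a simple weighted sum of the inputs.
        return sum(w * x for w, x in zip(self.weights, inputs))

def committee_predict(nets, inputs):
    """Average the predictions of every net in the committee."""
    return sum(net.predict(inputs) for net in nets) / len(nets)

nets = [Net([0.5, 0.5]), Net([1.0, 0.0]), Net([0.0, 1.0])]
assert committee_predict(nets, [2.0, 4.0]) == 3.0  # (3 + 2 + 4) / 3
```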

Quote
Also, regarding your bot being able to pick out trends: is it autoassociative and unsupervised?
I described exactly how my nets are structured when I described how the DNA is formatted. The design is not autoassociative as far as I know, but the nets are obviously trained using unsupervised learning because they use a genetic algorithm for training and nothing else. All I do is give them some training data and then let them evolve by themselves based on how well they perform at virtual trading (which is linked to their ability to predict future trends in the data). If I had ranked them based purely on their ability to predict what data comes next in their training data, then I would still have to build my own trading algorithms based on the predictions they make. I wanted them to evolve their own trading strategies by themselves, so I ranked them by how well they performed at virtual trading.

Back again, it's been a crazy week. My foundation net is almost functioning perfectly, but I think I need to tweak it for a few more hours to get the RMS per iteration to improve significantly.

I'm currently using a really interesting hybrid approach that I came up with: I order every individual from 0 to popsize-1, and then, based on that rank, put it through an exponentially decaying function to determine whether it's a parent or not. I then have an if statement in the reproduction function that takes the rank of the individual as the parameter for another exponentially decaying probability boolean; if it succeeds, the genes of the parent are passed directly to the child (child = parent). I've worked it out so that there's only a probability of ~10% for the best of the generation, but that's something I'll be tweaking.
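A rough sketch of that rank-based scheme, assuming the population is sorted best-first (rank 0 = best); the decay constant is an assumed value, chosen so the generation's best has the ~10% direct-copy chance mentioned above:

```python
import math
import random

# DECAY and CLONE_P_BEST are assumed values, not taken from the post.
DECAY = 0.1          # decay rate per rank
CLONE_P_BEST = 0.10  # ~10% direct-copy chance for the generation's best

def is_parent(rank):
    """Exponentially decaying chance of being selected as a parent."""
    return random.random() < math.exp(-DECAY * rank)

def clone_probability(rank):
    """Chance the parent's genes pass directly to the child (child = parent)."""
    return CLONE_P_BEST * math.exp(-DECAY * rank)

# Rank 0 is always a parent and has a 10% chance of cloning itself:
assert is_parent(0)
assert clone_probability(0) == 0.10
assert clone_probability(10) < clone_probability(0)
```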

The rest of the reproductive system is all stochastic; everything is probability based: parent selection, gene pairing, mutation, etc. I've done a few Monte Carlo simulations (i.e. I run a very large population iteration through and count, via cout, every if-statement success). For a genetic algorithm I think this method is solid, but as a backup I also have a "shake" function, which applies a single iteration of backpropagation to every child before it's tested for fitness. One thing I'm worried about with my backpropagation is that my training regimen is "all in" (i.e. I run each individual through the entire training data and then average the RMS of the output errors), so I run the risk of faulty backpropagation, since I'm averaging the values of each input over all the possible values that input can take throughout the training data.
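The Monte Carlo check described above (run a large number of trials through a probability-based if statement and count the successes) might look like this sketch:

```python
import random

# Sketch of a Monte Carlo check on a probability-based if statement:
# run many trials and count how often the branch fires.
def monte_carlo_hit_rate(prob, trials=100_000, seed=1):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if rng.random() < prob)
    return hits / trials

# The empirical rate should land close to the configured probability.
rate = monte_carlo_hit_rate(0.10)
assert 0.09 < rate < 0.11
```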

I think I might do a few tests without the backprop to see if that's the issue, and have a read through some of my textbooks to determine the best way to incorporate backprop into a genetic algorithm; I may have to change my training regimen to be stage based.

Also (I know you're using built-in libraries, but you might know): how are you calculating normalized fitness? I'm currently taking the negative exponential of the average RMS over the training data times a lambda constant (i.e. fitness = e^(-lambda*avg_rms)) for each individual, and then dividing each individual's fitness by the average fitness of the iteration, so I always have fitness values greater than 0, and they're always "different" enough for the computer to recognize the differences.
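As a sketch of that normalization: raw fitness is e^(-lambda*avg_rms), and each value is divided by the generation's mean raw fitness. The lambda constant here is an assumed value; the post doesn't give the actual one:

```python
import math

LAMBDA = 2.0  # assumed value of the lambda constant

def normalized_fitness(avg_rms_per_individual):
    """fitness = e^(-lambda * avg_rms), divided by the generation's mean."""
    raw = [math.exp(-LAMBDA * r) for r in avg_rms_per_individual]
    mean = sum(raw) / len(raw)
    return [f / mean for f in raw]

scores = normalized_fitness([0.1, 0.2, 0.4])
# Every score is positive, and lower RMS means higher fitness.
assert all(s > 0 for s in scores)
assert scores[0] > scores[1] > scores[2]
```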


I haven't gotten to the point of testing my network on real data yet; I'm still testing with an XOR table. But since I'm writing this from scratch, I feel I'll be able to reach the goal of a real bitcoin forecaster/profit maker soon. Once my foundation is finished, everything should be (knock on wood) as easy as using a library.
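Scoring a candidate net against the XOR truth table, the kind of sanity test mentioned above, could be sketched like this; the `predict` callable is a hypothetical stand-in for the net's forward pass, and the error is averaged over the whole table in the "all in" style described earlier:

```python
import math

XOR_TABLE = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def rms_error(predict):
    """RMS of the output errors averaged over the whole XOR table."""
    errors = [(predict(x) - target) ** 2 for x, target in XOR_TABLE]
    return math.sqrt(sum(errors) / len(errors))

# A perfect XOR predictor scores an RMS error of exactly 0:
assert rms_error(lambda x: x[0] ^ x[1]) == 0.0
```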
bitfreak! (OP)
Legendary
Offline

Activity: 1536
Merit: 1000


electronic [r]evolution


February 19, 2014, 05:42:57 AM
Last edit: February 19, 2014, 02:36:03 PM by bitfreak!
 #22

Quote
Also (I know you're using built-in libraries, but you might know): how are you calculating normalized fitness?
I am not using any libraries other than the ones I have created myself. The fitness of each net is calculated as a percentage of profit or loss with respect to the starting balance. If the net starts with a balance of $1000 and makes a profit of $10 during the virtual trading test then it gets a score of 1.01, but if it makes a loss of $10 then it gets a score of 0.99. If it were to double its starting balance it would obviously get a score of 2. A set number of nets with the highest scores are placed into the elite group for breeding (although the parent pairings and offspring mutations are random). To make the process faster I also included a mechanism which allows nets to "die" if they aren't performing well enough. For example, if a net seems to be making a consistent loss, or if it isn't placing enough trades, the testing process is cut short and that net incurs a large score penalty to make sure it isn't included in the elite group. Doing that speeds up the training process by a large degree, and it also mimics natural evolution, where subjects that are bad at surviving die quickly.
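A minimal sketch of that scoring scheme, using the $1000 starting balance from the example; the penalty factor for nets that "die" early is an assumed value, since the post only says the penalty is large:

```python
START_BALANCE = 1000.0
DEATH_PENALTY = 0.5  # assumed factor; the post only says "large penalty"

def fitness(final_balance, died_early=False):
    """Score a net by its final balance as a ratio of the starting balance."""
    score = final_balance / START_BALANCE
    if died_early:
        score *= DEATH_PENALTY  # keep culled nets out of the elite group
    return score

assert fitness(1010.0) == 1.01  # $10 profit
assert fitness(990.0) == 0.99   # $10 loss
assert fitness(2000.0) == 2.0   # doubled the starting balance
```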

XCN: CYsvPpb2YuyAib5ay9GJXU8j3nwohbttTz | BTC: 18MWPVJA9mFLPFT3zht5twuNQmZBDzHoWF
Cryptonite - 1st mini-blockchain altcoin | BitShop - digital shop script
Web Developer - PHP, SQL, JS, AJAX, JSON, XML, RSS, HTML, CSS
BadBitcoin (James Sutton)
Donator
Sr. Member
Offline

Activity: 452
Merit: 252



February 24, 2014, 11:17:54 PM
 #23


Quote
I am not using any libraries other than the ones I have created myself. The fitness of each net is calculated as a percentage of profit or loss with respect to the starting balance. If the net starts with a balance of $1000 and makes a profit of $10 during the virtual trading test then it gets a score of 1.01, but if it makes a loss of $10 then it gets a score of 0.99. If it were to double its starting balance it would obviously get a score of 2. A set number of nets with the highest scores are placed into the elite group for breeding (although the parent pairings and offspring mutations are random). To make the process faster I also included a mechanism which allows nets to "die" if they aren't performing well enough. For example, if a net seems to be making a consistent loss, or if it isn't placing enough trades, the testing process is cut short and that net incurs a large score penalty to make sure it isn't included in the elite group. Doing that speeds up the training process by a large degree, and it also mimics natural evolution, where subjects that are bad at surviving die quickly.

Interesting design, does it function well? It seems a tad simplistic in its parameters to function effectively, but I could see it working given enough backdata and a large enough hidden neuron count.

I'm currently converting all of my members and functions into OpenCL kernels, as I've recently mastered (or mastered enough of) OpenCL over the weekend, and I plan on optimising the shit out of my algorithms. I plan on having a 3-layered network with one output each; however, the first network will have something on the order of ~350 inputs and ~250 hidden neurons, so it could definitely get messy. Still, I'm confident my OpenCL design will make short work of the computation time.

However, with this shit-storm at MtGox, no bot (that I can think of, at least) would have been able to predict the MtGox price collapse. But I think I could get really close if I had data from multiple exchanges, with trading bots running on all of them and communicating with each other through an even higher-tier neural net.
BadBitcoin (James Sutton)
Donator
Sr. Member
Offline

Activity: 452
Merit: 252



March 08, 2014, 04:56:28 PM
 #24

Also, bitfreak or anyone else interested: I just started a blog on neural nets (and job hunting, but mostly neural nets :P). If anyone's interested in having a conversation, I'm all ears!

http://learningann.wordpress.com/