Author Topic: Neural Networks and Secp256k1  (Read 428 times)
DoinSomeStuff (OP) | Newbie | Activity: 15 | Merit: 0
May 14, 2021, 11:30:22 AM (#21)


There is no known relationship between Y and -Y, at least not a polynomial one. That's why I'm trying to use a neural network to discover one.

Do you have a sample dataset of Y and the result? What is the input? What is the output?


Of course. I've tried a bunch of them.

Code:
X1;X2;X3;X4;X5;X6;X7;X8;X9;X10;X11;X12;X13;X14;X15;X16;X17;X18;X19;X20;X21;X22;X23;X24;X25;X26;X27;X28;X29;X30;X31;X32;Y1;Y2;Y3;Y4;Y5;Y6;Y7;Y8;Y9;Y10;Y11;Y12;Y13;Y14;Y15;Y16;Y17;Y18;Y19;Y20;Y21;Y22;Y23;Y24;Y25;Y26;Y27;Y28;Y29;Y30;Y31;Y32;target
233;18;54;13;167;132;41;227;170;248;134;37;7;113;94;20;171;72;185;202;195;2;247;229;78;165;85;239;238;206;187;41;122;192;153;227;188;238;59;122;136;199;95;24;167;130;100;146;89;92;190;97;65;161;50;95;94;31;78;35;162;106;231;205;1
233;18;54;13;167;132;41;227;170;248;134;37;7;113;94;20;171;72;185;202;195;2;247;229;78;165;85;239;238;206;187;41;181;59;102;28;66;17;196;133;119;56;160;231;88;125;155;109;166;163;65;158;190;94;205;160;161;224;177;220;93;149;24;50;0
That's an example of X, Y, and whether Y is "positive" (the target column). The coordinates are converted to byte arrays (32 bytes each).
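Roughly like this (a sketch using the bit library, as in the scripts later in the thread; that "positive" means y < p/2 is my assumption, the exact rule isn't stated above, although the two rows shown are consistent with it):

Code:
# Rough sketch (not necessarily the exact generator used above) of how one such
# row could be built with the bit library: X and Y as 32-byte arrays plus a target.
import random
from bit import Key

P = 2**256 - 2**32 - 977  # secp256k1 field prime
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141  # group order

def make_row():
    k = random.randrange(1, N)
    x, y = Key.from_int(k).public_point
    xb = x.to_bytes(32, "big")
    yb = y.to_bytes(32, "big")
    target = 1 if y < P // 2 else 0    # assumed meaning of "positive"
    return ";".join(str(b) for b in xb + yb) + ";" + str(target)

print(make_row())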



Again, we don't need large sophisticated neural networks, small ones will do. Though at this point you're making more of an empirical test of already used pubkeys since the Y polarity of the entire public key space converges to 50%.

If you want to try it with a small neural network:

For those who already know about neural networks: you can do this with your Nvidia CUDA GPU, you already know what to do. There is still no sample code on GitHub to try and test, though.

For those who don't know much about neural networks: try AutoML Tables on Google Cloud
https://cloud.google.com/automl-tables
Just upload the dataset to the server and train; it's easy to use and requires no coding.

(Other services such as Microsoft Azure AutoML and Amazon AWS SageMaker offer similar automated machine learning.)

Or OpenNN / Neural Designer, which has a friendly GUI and is easy to use.
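For example, a minimal "small network" test could look like this (a sketch with scikit-learn; the file name dataset.csv and the assumption that the last column is called "target" are placeholders, not anything posted above):

Code:
# Minimal sketch of the "small network" route: a tiny MLP trained on a
# semicolon-separated CSV like the one posted above.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

df = pd.read_csv("dataset.csv", sep=";")
X = df.drop(columns=["target"]).to_numpy(dtype=np.float32) / 255.0  # scale bytes to [0, 1]
y = df["target"].to_numpy()

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=200)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))  # ~0.5 expected if there is no learnable pattern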
fxsniper | Member | Activity: 406 | Merit: 45
May 14, 2021, 12:34:11 PM (#22)


For a simple GUI, try ML.NET Model Builder with GPU support (Preview).

Download Visual Studio 2019 Community for Windows and install ML.NET Model Builder.

ML.NET Model Builder is not a neural network, though (a real NN uses perceptrons); it tries several algorithms and automatically selects the best one for prediction.

(For real neural networks you would use Keras, which has no GUI, or AutoKeras; both still require coding.)

In ML.NET Model Builder, try the text classification or value prediction scenario.


fxsniper | Member | Activity: 406 | Merit: 45
May 14, 2021, 02:02:05 PM (#23)

Here is a sample Python script that creates a dataset for neural networks.

The script is just for testing (I tested it with ML.NET); to really use it you will need to improve it and adapt it to your own setup.

In my ML.NET tests, encoding the values as binary 1s and 0s gave better results than decimal numbers.


test 1
datasetNN1.py
Code:
import random
import time
from bit import Key

# Output file name with a timestamp
timestr = time.strftime("%Y%m%d-%H%M%S")
filename = "datasetNN_" + timestr + ".csv"
print(time.strftime("%Y-%m-%d-%H:%M:%S"))
print(filename)

# CSV header: f1..f256 (one column per bit of Y) plus the label column
header = ",".join("f" + str(j) for j in range(1, 257)) + ",Label"
with open(filename, "w") as f:
    f.write(header + "\n")

i = 1
while i < 1000:          # number of rows; raise for a bigger dataset
    seed = random.randrange(2**119, 2**120)
    key = Key.from_int(seed)
    x, y = key.public_point

    # Label: 1 = Y even, 2 = Y odd
    label_output = 1 if y % 2 == 0 else 2

    # Features: the 256 bits of the Y coordinate
    y_bits = bin(y)[2:]
    if len(y_bits) == 256:           # skip Y values with leading zero bits
        feature_binary = ",".join(y_bits)
        with open(filename, "a") as f:
            f.write(feature_binary + "," + str(label_output) + "\n")
        i += 1

print(time.strftime("%Y-%m-%d-%H:%M:%S"))



test 2
datasetNN2.py
Code:
import random
import time
from bit import Key

# Output file name with a timestamp
timestr = time.strftime("%Y%m%d-%H%M%S")
filename = "datasetNN_" + timestr + ".csv"
print(time.strftime("%Y-%m-%d-%H:%M:%S"))
print(filename)

# CSV header: x1..x64 (one column per hex digit of the compressed
# public key's X coordinate) plus the label column
header = ",".join("x" + str(j) for j in range(1, 65)) + ",Label"
with open(filename, "w") as f:
    f.write(header + "\n")

i = 1
while i < 1000:          # number of rows; raise for a bigger dataset
    seed = random.randrange(2**119, 2**120)
    key = Key.from_int(seed)
    pubkey = key.public_key.hex()    # compressed pubkey: 02/03 prefix + 64 hex digits of X
    x, y = key.public_point

    # Label: 1 = Y even, 2 = Y odd
    label_output = 1 if y % 2 == 0 else 2

    # Features: the 64 hex digits of X, each converted to its decimal value 0..15
    hex_digits = pubkey[2:]
    feature_hex = ",".join(str(int(d, 16)) for d in hex_digits)

    with open(filename, "a") as f:
        f.write(feature_hex + "," + str(label_output) + "\n")
    i += 1

print(time.strftime("%Y-%m-%d-%H:%M:%S"))

DoinSomeStuff (OP) | Newbie | Activity: 15 | Merit: 0
May 14, 2021, 02:12:02 PM (#24)

Here is a sample Python script that creates a dataset for neural networks.
~
Wrong. It should be key % 2, or key > n/2, not y % 2.
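In other words, something like this labeling instead (a sketch with the bit library; that this is the intended target is my reading of the post):

Code:
# Sketch of the alternative labels suggested here: label each row by a
# property of the private key rather than of Y.
from bit import Key

# secp256k1 group order
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def labels_for(seed):
    key = Key.from_int(seed)
    x, y = key.public_point
    label_parity = seed % 2              # key % 2
    label_half = int(seed > N // 2)      # key > n/2
    return x, y, label_parity, label_half

print(labels_for(123456789))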
j2002ba2 | Full Member | Activity: 204 | Merit: 437
May 14, 2021, 02:19:45 PM (#25)

Using a NN for cracking cryptographic functions is pointless. NNs can capture only simple dependencies.

I expect the number of weights needed to capture one bit with better than negligible probability to be on the order of 2^128.

DoinSomeStuff (OP) | Newbie | Activity: 15 | Merit: 0
May 14, 2021, 02:27:54 PM (#26)

Using a NN for cracking cryptographic functions is pointless. NNs can capture only simple dependencies.

I expect the number of weights needed to capture one bit with better than negligible probability to be on the order of 2^128.



NNs can capture very hard dependencies; it depends on the type and number of hidden layers.

https://www.sciencedirect.com/science/article/pii/S0895717707000362
fxsniper | Member | Activity: 406 | Merit: 45
May 15, 2021, 12:36:17 AM (#27)


The problem with ML.NET: the dataset looks like random data with no pattern.

Training with a 1-million-row dataset gives very low accuracy, around 0.0001%. It doesn't work. I will try Keras with 5 layers of 256 neurons; the result will probably be the same.

Neural networks may only work on datasets that have a pattern the NN can find, but secp256k1 / elliptic curve data looks like random data.
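A sketch of what that Keras test could look like (5 Dense layers of 256 units; the 256 binary input features, the file name, and all hyperparameters are assumptions, not a tested setup):

Code:
# Sketch of the "Keras, 5 layers of 256 neurons" test mentioned above.
# Assumes 256 binary input features (the bits of Y, as in datasetNN1.py)
# and an even/odd label; the file name is a placeholder.
import numpy as np
import pandas as pd
from tensorflow import keras

df = pd.read_csv("datasetNN.csv")
X = df.iloc[:, :256].to_numpy(dtype=np.float32)
y = (df["Label"].to_numpy() == 2).astype(np.float32)   # 1 = odd, 0 = even

model = keras.Sequential(
    [keras.Input(shape=(256,))]
    + [keras.layers.Dense(256, activation="relu") for _ in range(5)]
    + [keras.layers.Dense(1, activation="sigmoid")]
)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=10, batch_size=128, validation_split=0.2)
# For a truly random relationship the validation accuracy should hover around 50%.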
DoinSomeStuff (OP) | Newbie | Activity: 15 | Merit: 0
May 15, 2021, 08:57:41 AM (#28)


The problem with ML.NET: the dataset looks like random data with no pattern.

Training with a 1-million-row dataset gives very low accuracy, around 0.0001%. It doesn't work. I will try Keras with 5 layers of 256 neurons; the result will probably be the same.

Neural networks may only work on datasets that have a pattern the NN can find, but secp256k1 / elliptic curve data looks like random data.


1M is nothing; I've tried 50M. I suppose it needs tests with a 1-billion-row dataset.
And to estimate the required amount of data, one should take small curves, scale the numbers up on success, and derive a formula for how much data secp256k1 would require.
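For the small-curve idea, a toy sketch (the prime p and the parity label are only illustrative; secp256k1 uses the same equation y^2 = x^3 + 7 over a 256-bit field):

Code:
# Sketch of the "start with small curves" idea: enumerate all points of a toy
# curve y^2 = x^3 + 7 over a small prime field and label each point by Y parity.
p = 10007          # small prime with p % 4 == 3, chosen only for illustration

points = []
for x in range(p):
    rhs = (x**3 + 7) % p
    y = pow(rhs, (p + 1) // 4, p)      # candidate square root (valid since p % 4 == 3)
    if (y * y) % p == rhs:             # rhs is a quadratic residue -> point(s) exist
        roots = {y, (p - y) % p}       # the two square roots (one when y == 0)
        for yy in roots:
            points.append((x, yy, yy % 2))   # features: x, y; label: parity of y

print(len(points), "points on the toy curve")
print(points[:5])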
fxsniper | Member | Activity: 406 | Merit: 45
May 15, 2021, 09:26:16 AM (#29)


1M is nothing; I've tried 50M. I suppose it needs tests with a 1-billion-row dataset.
And to estimate the required amount of data, one should take small curves, scale the numbers up on success, and derive a formula for how much data secp256k1 would require.

It would be great if you can do it with a 1-billion-row dataset.

My GPU is a small GTX 1050, so even 1 million rows is slow to train; the model code may need optimizing.

Tests at 1,000 and 10,000 rows are no problem, but at 100,000 and 1,000,000 a lot of errors show up during training.

I can only test and train; for real use it may have to be done manually.

ABCbits | Legendary | Activity: 2884 | Merit: 7516
May 15, 2021, 09:35:22 AM (#30)

My GPU is a small GTX 1050, so even 1 million rows is slow to train

A bit off-topic, but I've heard from my friend that they're using Google Colab and Kaggle, which give you access to high-end professional/data-center GPUs. I don't know the limitations, but it might be worth a try.

NotATether | Legendary | Activity: 1610 | Merit: 6761
May 15, 2021, 11:21:41 AM (#31)

My GPU is a small GTX 1050, so even 1 million rows is slow to train

A bit off-topic, but I've heard from my friend that they're using Google Colab and Kaggle, which give you access to high-end professional/data-center GPUs. I don't know the limitations, but it might be worth a try.

Colab has heavy limitations on the GPU in its free tier where they'll stop your whole notebook once you exceed a certain number of hours.


The problem with ML.NET: the dataset looks like random data with no pattern.
~

1M is nothing; I've tried 50M. I suppose it needs tests with a 1-billion-row dataset.
And to estimate the required amount of data, one should take small curves, scale the numbers up on success, and derive a formula for how much data secp256k1 would require.

What are your detection rates for the 50m dataset (true positive/negative %, false positive/negative % etc.)?

DoinSomeStuff (OP) | Newbie | Activity: 15 | Merit: 0
May 15, 2021, 11:44:38 AM (#32)

~

What are your detection rates for the 50m dataset (true positive/negative %, false positive/negative % etc.)?

Selection error was 0.9916. But in tests it was 50%, oscillating from 1000 to -1000 (good minus bad). So still close to 50%.
fxsniper | Member | Activity: 406 | Merit: 45
May 15, 2021, 12:42:21 PM (#33)

My GPU is a small GTX 1050, so even 1 million rows is slow to train

A bit off-topic, but I've heard from my friend that they're using Google Colab and Kaggle, which give you access to high-end professional/data-center GPUs. I don't know the limitations, but it might be worth a try.

Google Colab is very good. You need a $9.99/month subscription for Colab Pro, which lets you use a Tesla V100; it's a very good deal.

Colab Pro is limited to 5 notebooks at the same time and sessions of up to 24 hours, after which it reconnects; you also have to keep the browser page open and cannot close it.

You should use it together with Google Drive to save your work; subscribing to 100 GB of Drive storage as well would be good.

I have just started trying Colab Pro.
j2002ba2 | Full Member | Activity: 204 | Merit: 437
May 15, 2021, 12:46:43 PM (#34)

NNs can capture very hard dependencies; it depends on the type and number of hidden layers.

https://www.sciencedirect.com/science/article/pii/S0895717707000362

You wish, but the reality says no.

In the paper they look at 14-, 20-, and 32-bit elliptic curves. The corresponding weights storage is 2^13, 2^13.5, and 2^14.2. Theoretically the storage would be on the order of 2^7, 2^10, and 2^16. So all looks good, no breakthrough.

To even learn the secp256k1 weights you'd need at least 2^128 examples. Good luck executing that.

fxsniper | Member | Activity: 406 | Merit: 45
May 15, 2021, 11:49:58 PM (#35)

NNs can capture very hard dependencies; it depends on the type and number of hidden layers.

https://www.sciencedirect.com/science/article/pii/S0895717707000362

You wish, but the reality says no.

In the paper they look at 14-, 20-, and 32-bit elliptic curves. The corresponding weights storage is 2^13, 2^13.5, and 2^14.2. Theoretically the storage would be on the order of 2^7, 2^10, and 2^16. So all looks good, no breakthrough.

To even learn the secp256k1 weights you'd need at least 2^128 examples. Good luck executing that.



I think small neural networks cannot handle secp256k1: the problem is a curve with very large numbers, which makes it very complex. Neural networks work with small values in their neurons, and the large numbers remain the problem.

Another idea is to create some simpler prediction algorithm instead; it would be smaller and easier, and maybe only a result correct more than 50% of the time could be called a success.
fxsniper | Member | Activity: 406 | Merit: 45
May 16, 2021, 12:46:06 AM (#36)

My GPU is a small GTX 1050, so even 1 million rows is slow to train

A bit off-topic, but I've heard from my friend that they're using Google Colab and Kaggle, which give you access to high-end professional/data-center GPUs. I don't know the limitations, but it might be worth a try.

A bit off-topic as well:

I tried neural networks on Colab, working with a secp256k1 dataset: generate public keys and save them on Colab. Storage is limited to 100 GB (the system takes about 40 GB, so roughly 60 GB is usable).

Up to about 1 million rows can probably work on Colab; more than 1 million is too much for it.

Google Colab Pro is good because you can use a Tesla V100 (limited to 24-hour sessions). Colab is designed for working with neural networks, and Colab Pro is good value at $9.99. The name "lab" already tells you it is for testing and proof of concept, not for production work, because real jobs take more than 24 hours.

Kaggle is good for finding sample code (though most people use GitHub); I like Google Colab more than Kaggle.

Google Colab free requires you to finish the job within 12 hours and is limited to 1 GPU; Colab Pro ($9.99) gives you 24 hours. It is good for test jobs.

With Colab you still have to code (there is no GUI). You risk losing your job if you disconnect from the server or the session resets (you have to start a new one), and saving files to the Colab SSD is risky. You should use it together with Google Drive and keep your files on Drive (although that is slower than the SSD on Colab).

For real work with neural networks, using your own GPU is the only good way. I ran a test on a small pubkey dataset with AutoKeras (to optimize the code) and it took over 48 hours on a low-end GTX 1050 (slow).

Neural networks on secp256k1 take a very long time to train, and a lot of time goes into optimizing the code to get the best result.

Some problems:
- A pubkey dataset of 1,000 to 10,000 rows is no problem in tests.
- A pubkey dataset of 1 million rows produces a lot of error messages during training.

Optimizing the code on a small dataset is fine, but trying it on a large 1-million-row dataset is different.

Dataset encodings I tested (see the sketch below):
- hex digits (split into one feature per digit)
- converted to decimal (split into one feature per digit)
- converted to binary (a 10101... bit string, not byte code, split into one feature per bit)
My best results came from the binary encoding.

I am still only testing on small datasets. I agree with trying a dataset of over 1 billion rows: test on the large dataset and fix problems at the same time. Testing on small data and then switching to large is so different that it is like testing a new model from scratch on the large dataset.

Regarding the algorithm to apply: classification gives an answer with a probability, while value prediction gives a predicted number without a probability. I think classification training is better, because a result with a predicted probability is more useful for making a decision.
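For reference, the three encodings compared above, sketched for one X coordinate (the function names are illustrative only; the sample value is just the secp256k1 generator's X coordinate):

Code:
# Sketch of the three feature encodings compared above, for 64 hex digits of X.
def hex_features(xhex):
    # one feature per hex digit, kept as characters "0".."f"
    return list(xhex)

def decimal_features(xhex):
    # one feature per hex digit, converted to its decimal value 0..15
    return [int(d, 16) for d in xhex]

def binary_features(xhex):
    # one feature per bit of the 256-bit value (the encoding that worked best here)
    bits = bin(int(xhex, 16))[2:].zfill(256)
    return [int(b) for b in bits]

xhex = "79be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798"
print(len(hex_features(xhex)), len(decimal_features(xhex)), len(binary_features(xhex)))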
fxsniper | Member | Activity: 406 | Merit: 45
May 16, 2021, 01:22:24 AM (#37)

Is it good or not to change the 32-unit layers up to a maximum of 256?


Layer (type)                 Output Shape             
=============================
input_1 (InputLayer)         [(None, 256)]           
_____________________________________
multi_category_encoding (Mul (None, 256)               
_____________________________________
dense (Dense)                (None, 32)               
_____________________________________
re_lu (ReLU)                 (None, 32)               
_____________________________________
dense_1 (Dense)              (None, 32)               
_____________________________________
re_lu_1 (ReLU)               (None, 32)               
_____________________________________
regression_head_1 (Dense)    (None, 1)                 
=============================



I will test adding more layers (5). For the activation I use ReLU.


Layer (type)                 Output Shape             
=============================
input_1 (InputLayer)         [(None, 256)]           
_____________________________________
multi_category_encoding (Mul (None, 256)               
_____________________________________
dense (Dense)                (None, 256)               
_____________________________________
re_lu (ReLU)                 (None, 256)               
_____________________________________
dense_1 (Dense)              (None, 128)               
_____________________________________
re_lu_1 (ReLU)               (None, 128)               
_____________________________________
regression_head_1 (Dense)    (None, 1)                 
=============================



I am trying to modify sample Keras code to use the pubkey dataset, starting from the Keras Titanic and California housing prediction examples. This is just a test; it may be the wrong kind of algorithm to use.

On GitHub there are a lot of Bitcoin trading bots that use neural networks to predict the price; that is easier than predicting the elliptic curve.
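Written out as plain Keras, the second layout above would look roughly like this (the multi_category_encoding layer comes from AutoKeras and is omitted here; the inputs are assumed to already be numeric):

Code:
# Sketch of the second layer layout shown above, as plain Keras.
from tensorflow import keras

inputs = keras.Input(shape=(256,))
h = keras.layers.Dense(256)(inputs)
h = keras.layers.ReLU()(h)
h = keras.layers.Dense(128)(h)
h = keras.layers.ReLU()(h)
outputs = keras.layers.Dense(1)(h)          # regression_head_1

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.summary()
# For an even/odd target, a classification head (Dense(1, activation="sigmoid")
# with binary cross-entropy) would arguably fit better than a regression head.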
DoinSomeStuff (OP) | Newbie | Activity: 15 | Merit: 0
May 16, 2021, 01:55:27 PM (#38)

Is it good or not to change the 32-unit layers up to a maximum of 256?
~
On GitHub there are a lot of Bitcoin trading bots that use neural networks to predict the price; that is easier than predicting the elliptic curve.

Don't forget to pad your X and Y strings to a fixed length of 32 or 256, e.g. in Python:
while len(X) != 256:
    X = "0" + X
(or simply X = X.zfill(256))
NotATether | Legendary | Activity: 1610 | Merit: 6761
May 17, 2021, 11:58:39 AM (#39)

Is it good or not to change the 32-unit layers up to a maximum of 256?

~

I am trying to modify sample Keras code to use the pubkey dataset, starting from the Keras Titanic and California housing prediction examples. This is just a test; it may be the wrong kind of algorithm to use.

On GitHub there are a lot of Bitcoin trading bots that use neural networks to predict the price; that is easier than predicting the elliptic curve.

I'm no ML expert, but I don't think adding more NN layers is going to improve the accuracy of this neural network. Remember that we basically have a hexadecimal number as input (trying to feed millions of hex numbers to the same neurons makes no sense because the numbers have no relationship with each other), so it should be in byte form instead of ASCII, and then we'd do something like assign each bit of the number to one or more input neurons, and then apply layers on top of that.

If you do want to use millions of numbers as input, then you should use an ML classification algorithm and also give the polarity of these points as input for training. Neural networks are meant to work on one input only, e.g. given a drawing of the number "2", identify which digit it is.
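As a sketch of that bytes-then-bits input (numpy assumed; the hex value is just the generator's X coordinate as sample input):

Code:
# Sketch of the "bytes, then one bit per input neuron" idea described above:
# convert a hex coordinate to raw bytes and unpack it into a 0/1 array,
# one entry per input neuron.
import numpy as np

def to_input_bits(xhex):
    raw = bytes.fromhex(xhex)                                  # 32 bytes instead of 64 ASCII chars
    return np.unpackbits(np.frombuffer(raw, dtype=np.uint8))   # 256 values, each 0 or 1

bits = to_input_bits("79be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798")
print(bits.shape, bits[:16])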
