JennyR
Newbie
Offline
Activity: 5
Merit: 0
|
 |
March 14, 2019, 10:13:46 PM |
|
We have the next fork now, it's the end.
Same here. Different machines show different fork blocks after the update to version 1.1.9.9g. After such a long time it still failed to return to the right block, which is worrying. I put some hashpower on the correct chain (about 40% of the total); we must wait....
Which one is correct? I have wallets on blocks 107049, 107076 and 107021 - 107076 in most cases. A few wallets are unable to connect to peers.
Good question... hard to say which is the "real" chain at this point.
Please try not to quote banned members' posts, thanks.
Sir, I don't know who is banned or not. I just have the same problem. Captain, the ship is in a storm and we are ALL in it. Please give us clear and concise instructions on what to do when our wallets show different block counts even after multiple restarts. Where can we find the latest correct hash of the main kosha chain, updated in real time, please? We need your help and direction in times like this, Sir, to steady the ship. May God speed you! We await your instructions....
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 14, 2019, 10:15:25 PM |
|
Which one is correct? I have wallets on blocks 107049, 107076 and 107021 - 107076 in most cases. A few wallets are unable to connect to peers.
Did you re-sync the chain from zero? This is not a true complaint from these other trollers! The chain is syncing fine. Every node that was above 107,000 after the upgrade needs to be resynced from zero.
Clean install: I stopped PODC, installed the wallet and started it. Then several rescans, reindexes, delete all, reinstall, uninstall, etc... I also tried with a clean config file and with added nodes; nothing helped to get peers. Only if I copy the peers file from another wallet does it start connecting somewhere.
Ok, yeah, I think a lot of our peers banned each other due to the constant asking for blocks. Let me look at each chain in getchaintips. It appears the problem is that we have network fragmentation; the pool is not connected to any node that my home network is connected to, so it doesn't even have the block header for that chain. I believe we need to ask people to delete banlist.dat and resync the chain. In the meantime, I'll run a global command to delete banlist.dat and resync all of my sancs - since they are public, that will potentially help bridge the network back together.
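Rob mentions inspecting each chain with `getchaintips`. A minimal sketch of summarizing its output is below; the JSON sample is illustrative (only the field names follow the Bitcoin/Dash-family RPC that BiblePay inherits), and `biblepay-cli` is assumed to expose the same `getchaintips` call:

```shell
# Summarize competing forks from `getchaintips` output.
# In a live session you would pipe the daemon's answer in, e.g.:
#   biblepay-cli getchaintips | summarize_tips
# Here an illustrative sample is fed in so the parsing is shown end to end.

summarize_tips() {
  # Keep only the "height" and "status" fields, strip JSON punctuation,
  # then pair each height with the status that follows it.
  grep -E '"(height|status)"' \
    | sed 's/[",]//g' \
    | awk '/height:/ {h=$2} /status:/ {print "height " h " status " $2}'
}

cat <<'EOF' | summarize_tips
[
  {
    "height": 107086,
    "hash": "3dba45d9162957eb7feaf6e40e8fb9f55d6155236ab1dfbc4a5ab80bcb3bf0cc",
    "branchlen": 0,
    "status": "active"
  },
  {
    "height": 107076,
    "hash": "aa11aa11aa11aa11aa11aa11aa11aa11aa11aa11aa11aa11aa11aa11aa11aa11",
    "branchlen": 12,
    "status": "valid-fork"
  },
  {
    "height": 107021,
    "hash": "bb22bb22bb22bb22bb22bb22bb22bb22bb22bb22bb22bb22bb22bb22bb22bb22",
    "branchlen": 40,
    "status": "headers-only"
  }
]
EOF
```

A tip with status `active` is the chain the node itself follows; `valid-fork` and `headers-only` entries are the competing branches people in the thread are seeing.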
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 14, 2019, 10:17:25 PM |
|
We have the next fork now, it's the end.
Same here. Different machines show different fork blocks after the update to version 1.1.9.9g. After such a long time it still failed to return to the right block, which is worrying. I put some hashpower on the correct chain (about 40% of the total); we must wait.... Which one is correct? I have wallets on blocks 107049, 107076 and 107021 - 107076 in most cases. A few wallets are unable to connect to peers. Good question... hard to say which is the "real" chain at this point. Please try not to quote banned members' posts, thanks. Sir, I don't know who is banned or not. I just have the same problem. Captain, the ship is in a storm and we are ALL in it. Please give us clear and concise instructions on what to do when our wallets show different block counts even after multiple restarts. Where can we find the latest correct hash of the main kosha chain, updated in real time, please? We need your help and direction in times like this, Sir, to steady the ship. May God speed you! We await your instructions....
Thanks Jenny - please do this:
- Wait 10 minutes for me to issue the global command to my sancs
- cd %appdata%\biblepaycore
- rm banlist.dat
- rm blocks -r
- rm chainstate -r
- Restart the wallet
The theory is that by removing the banlist, we will see each other's nodes.
Thanks, Rob
|
|
|
|
macko20
Newbie
Offline
Activity: 89
Merit: 0
|
 |
March 14, 2019, 10:20:04 PM |
|
We have the next fork now, it's the end.
Same here. Different machines show different fork blocks after the update to version 1.1.9.9g. After such a long time it still failed to return to the right block, which is worrying. I put some hashpower on the correct chain (about 40% of the total); we must wait.... Which one is correct? I have wallets on blocks 107049, 107076 and 107021 - 107076 in most cases. A few wallets are unable to connect to peers. Did you re-sync the chain from zero? This is not a true complaint from these other trollers! The chain is syncing fine. Every node that was above 107,000 after the upgrade needs to be resynced from zero.
Relax, Rob. We have to recover the main chain. There is a problem and we are here to help.
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 14, 2019, 10:24:42 PM |
|
Did you see how the mean-spirited individual took this out of context and tried to imply we are smaller by pointing fingers, etc., instead of looking at what we can do to grow?
I see the context of this post as positive: it means all of the lazy armchair whiners are the ones who helped cause us to shrink, or possibly people left because of the low price.
Either way it would be nice to form a solid group with the Holy Spirit to propel us forward.
Yes, there is always a possibility for us to pull together and be greater than the sum of our parts. What kind of activities could we be doing that would spread the message and improve our impact? Could you please start a thread with TheSnat on forum.biblepay.org, and we can all get involved in the conversation when bandwidth opens up (most likely the day Evolution goes into testnet)? Until then we're pretty much working 24/7 to release Evolution to testnet, so replying here would not give the subject the time or intelligence it deserves.
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 14, 2019, 10:26:16 PM |
|
...
No, Macko, you're not - you have been banned, so please don't disguise yourself as someone who cares after stealing over 1 million BBP from the exchange and having a long history of disrespect here. You are already banned until you personally apologize to me and show that you can change.
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 14, 2019, 10:31:47 PM |
|
Ok, I deleted the banlist and I'm resyncing all my sancs. Not sure if it will help others, but let's see what happens in 30 minutes.
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 14, 2019, 10:47:59 PM |
|
I'm seeing this hash on 90% of my peers:
getblockhash 107086
3dba45d9162957eb7feaf6e40e8fb9f55d6155236ab1dfbc4a5ab80bcb3bf0cc
POW diff: 1279
It's a little too early to tell; resyncing the pool; I'll check again in 20 minutes.
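Rob's "90% of my peers" check boils down to polling each node for `getblockhash 107086` and counting how many agree on each hash. A sketch of the tallying step is below; the node names and all hashes other than the first (which is the one quoted above) are made up for the example:

```shell
# Given one "<node> <blockhash>" line per node, report how many nodes
# agree on each hash, most common first. In a live poll each line would
# come from something like:  biblepay-cli getblockhash 107086
count_agreement() {
  awk '{print $2}' | sort | uniq -c | sort -rn
}

# Illustrative poll of five nodes:
cat <<'EOF' | count_agreement
node1 3dba45d9162957eb7feaf6e40e8fb9f55d6155236ab1dfbc4a5ab80bcb3bf0cc
node2 3dba45d9162957eb7feaf6e40e8fb9f55d6155236ab1dfbc4a5ab80bcb3bf0cc
node3 3dba45d9162957eb7feaf6e40e8fb9f55d6155236ab1dfbc4a5ab80bcb3bf0cc
node4 ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
node5 3dba45d9162957eb7feaf6e40e8fb9f55d6155236ab1dfbc4a5ab80bcb3bf0cc
EOF
```

The top line of the output is the majority hash; a node whose `getblockhash 107086` differs from it is on one of the minority forks.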
|
|
|
|
Kismar
Newbie
Offline
Activity: 12
Merit: 0
|
 |
March 14, 2019, 11:07:41 PM |
|
I'm seeing this hash on 90% of my peers:
getblockhash 107086
3dba45d9162957eb7feaf6e40e8fb9f55d6155236ab1dfbc4a5ab80bcb3bf0cc
POW diff: 1279
It's a little too early to tell; resyncing the pool; I'll check again in 20 minutes.
My 2 wallets and masternode agree with this. Though my Linux wallet crashed; I didn't see any errors in the debug log, only this from the command prompt:
[1]+ Segmentation fault (core dumped) ./biblepay/src/qt/biblepay-qt
I restarted it and will let you know if it happens again.
|
|
|
|
dave_bbp
Jr. Member
Offline
Activity: 405
Merit: 3
|
 |
March 14, 2019, 11:24:24 PM |
|
I'm seeing this hash on 90% of my peers:
getblockhash 107086
3dba45d9162957eb7feaf6e40e8fb9f55d6155236ab1dfbc4a5ab80bcb3bf0cc
POW diff: 1279
It's a little too early to tell; resyncing the pool; I'll check again in 20 minutes.
This is also where my wallets (1.1.9.9g, synced from scratch) landed. Let's hope the network can agree on this chain. Also, the pool seems to accept the solutions now. Good work so far!
|
|
|
|
bbptoshi
Copper Member
Newbie
Offline
Activity: 39
Merit: 0
|
 |
March 14, 2019, 11:58:25 PM |
|
Is there a set of instructions for restarting sanctuaries? Edit: never mind, found this:
Thanks Jenny - please do this:
- Wait 10 minutes for me to issue the global command to my sancs
- cd %appdata%\biblepaycore
- rm banlist.dat
- rm blocks -r
- rm chainstate -r
- Restart the wallet
The theory is that by removing the banlist, we will see each other's nodes.
Thanks, Rob
And for my wallet, I'm banning the nodes that only have 107021 blocks.
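Manually banning peers stuck on the stale chain can be done with the `setban` RPC, which Bitcoin-family wallets (and, by inheritance, presumably the BiblePay client) expose. A dry-run sketch is below - it prints the commands rather than executing them, and the IPs are placeholders, not real peers from the thread:

```shell
# Emit setban commands for peers believed to be stuck on the 107021 chain.
# Dry run: the commands are printed, not executed.
# 86400 bans each peer for 24 hours; the IPs below are placeholders.
make_ban_commands() {
  while read -r ip; do
    echo "biblepay-cli setban $ip add 86400"
  done
}

printf '203.0.113.5\n203.0.113.7\n' | make_ban_commands
# to execute for real, pipe the output into `sh`
# later, to undo all manual bans:  biblepay-cli clearbanned
```

Note this is the mirror image of Rob's banlist.dat advice: he is asking everyone to *clear* bans so the fragmented network can reconnect, so hand-banning peers is best kept temporary and undone with `clearbanned` once the fork resolves.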
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 15, 2019, 01:10:51 AM Last edit: March 15, 2019, 01:29:33 AM by bible_pay |
|
Is there a set of instructions for restarting sanctuaries? Edit: never mind, found this:
Thanks Jenny - please do this:
- Wait 10 minutes for me to issue the global command to my sancs
- cd %appdata%\biblepaycore
- rm banlist.dat
- rm blocks -r
- rm chainstate -r
- Restart the wallet
The theory is that by removing the banlist, we will see each other's nodes.
Thanks, Rob
And for my wallet, I'm banning the nodes that only have 107021 blocks.
I realize this is very frustrating for everyone. Looking at the code, I don't see any problem with our POW (POBH) algorithm. I believe the main problem is that the legacy (ghost) chain on versions below 1.1.9.9 is still considered longer, the client is pulling those blocks into memory, and the re-organize code keeps getting called. We just need to be patient and get more people upgraded to 1.1.9.9 (over 50%). It looks like we have 25% upgraded so far. Let's keep trying.
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 15, 2019, 01:32:29 AM |
|
Ok, it makes a little more sense now; I believe I see the lion's share of the problem (i.e., bigger than the reorganizing).
I've got two nodes that don't agree here: both consider one of the forks valid, but node 2 is following a shorter chain. Looking at the log on node 2, it believes the list of sanctuaries is different than node 1's. In other words, it has a view of sanctuary payments that differs from other parts of our network.
So basically the sanctuary payment is enforced with a hard rule - and this can cause a fork. We need to either A) sync all the sancs, or B) temporarily not enforce that sancs get paid properly.
I'll work on B at a network level now.
Sancs: Please try to sync to the top so your data can synchronize.
|
|
|
|
zthomasz
Member

Offline
Activity: 489
Merit: 12
|
 |
March 15, 2019, 01:50:05 AM |
|
Give us the block number on the good chain.
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 15, 2019, 01:58:11 AM |
|
Ok, it makes a little more sense now; I believe I see the lion's share of the problem (i.e., bigger than the reorganizing).
I've got two nodes that don't agree here: both consider one of the forks valid, but node 2 is following a shorter chain. Looking at the log on node 2, it believes the list of sanctuaries is different than node 1's. In other words, it has a view of sanctuary payments that differs from other parts of our network.
So basically the sanctuary payment is enforced with a hard rule - and this can cause a fork. We need to either A) sync all the sancs, or B) temporarily not enforce that sancs get paid properly.
I'll work on B at a network level now.
Sancs: Please try to sync to the top so your data can synchronize.
Ok, I'm fully positive the problem is that some nodes are denying sanctuary payments and other nodes aren't, because they have different views (confirmed in the logs). So this means we either need 51% upgraded, or we need to disable this hard check until we are all in agreement (this is sort of a chicken-and-egg problem). I think the safer way is for me to disable this rule and then re-enable it after we are fully synced. However, the problem is that this rule's switch-over to a spork requires another upgrade. Building...
P.S. The reason we did not have to go through this before is that at Christmas of 2017 we made our cutover to sanctuaries mandatory, all at the same height (and we only had a few sancs). Now we have 500 sancs that are completely in disagreement with each other's height.
|
|
|
|
thesnat21
Jr. Member
Offline
Activity: 490
Merit: 4
|
 |
March 15, 2019, 01:58:47 AM |
|
Ok, it makes a little more sense now; I believe I see the lion's share of the problem (i.e., bigger than the reorganizing).
I've got two nodes that don't agree here: both consider one of the forks valid, but node 2 is following a shorter chain. Looking at the log on node 2, it believes the list of sanctuaries is different than node 1's. In other words, it has a view of sanctuary payments that differs from other parts of our network.
So basically the sanctuary payment is enforced with a hard rule - and this can cause a fork. We need to either A) sync all the sancs, or B) temporarily not enforce that sancs get paid properly.
I'll work on B at a network level now.
Sancs: Please try to sync to the top so your data can synchronize.
We will need to know what the "top" is... I have 3 different chains I'm seeing.
|
|
|
|
eternalenvoy
Newbie
Offline
Activity: 56
Merit: 0
|
 |
March 15, 2019, 02:31:05 AM |
|
However, no matter which chain you are on, the process exits automatically after a period of time, whether on Linux or Windows.
Tested on different machines, deleting the biblepaycore folder completely every time.
|
|
|
|
zthomasz
Member

Offline
Activity: 489
Merit: 12
|
 |
March 15, 2019, 02:35:40 AM |
|
7 nodes total, 6 different block numbers, 2 nodes with the same number (107021).
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 15, 2019, 02:39:56 AM |
|
Ok, it makes a little more sense now; I believe I see the lion's share of the problem (i.e., bigger than the reorganizing).
I've got two nodes that don't agree here: both consider one of the forks valid, but node 2 is following a shorter chain. Looking at the log on node 2, it believes the list of sanctuaries is different than node 1's. In other words, it has a view of sanctuary payments that differs from other parts of our network.
So basically the sanctuary payment is enforced with a hard rule - and this can cause a fork. We need to either A) sync all the sancs, or B) temporarily not enforce that sancs get paid properly.
I'll work on B at a network level now.
Sancs: Please try to sync to the top so your data can synchronize.
We will need to know what the "top" is... I have 3 different chains I'm seeing. I've confirmed we need a mandatory upgrade for the entire network to turn off the switch; then we sync to the top, the top stays the top, and in a couple of days we will re-enable the spork. So for now, please disregard my comment about the top. The new version will be out in less than one hour. Thanks for everyone's patience.
|
|
|
|
bible_pay (OP)
Full Member
 
Offline
Activity: 1176
Merit: 215
Jesus is the King of Kings and Lord of Lords
|
 |
March 15, 2019, 04:10:05 AM |
|
Ok, it makes a little more sense now; I believe I see the lion's share of the problem (i.e., bigger than the reorganizing).
I've got two nodes that don't agree here: both consider one of the forks valid, but node 2 is following a shorter chain. Looking at the log on node 2, it believes the list of sanctuaries is different than node 1's. In other words, it has a view of sanctuary payments that differs from other parts of our network.
So basically the sanctuary payment is enforced with a hard rule - and this can cause a fork. We need to either A) sync all the sancs, or B) temporarily not enforce that sancs get paid properly.
I'll work on B at a network level now.
Sancs: Please try to sync to the top so your data can synchronize.
We will need to know what the "top" is... I have 3 different chains I'm seeing. I've confirmed we need a mandatory upgrade for the entire network to turn off the switch; then we sync to the top, the top stays the top, and in a couple of days we will re-enable the spork. So for now, please disregard my comment about the top. The new version will be out in less than one hour. Thanks for everyone's patience.
One other reason this happened is that we have GIN hosting a fair amount of sancs, and they are on the old chain (as I really don't want to have them upgrade until we are synced) - exacerbating the problem. (Their sancs are not in agreement with heights from 1.1.9.9 sancs, so no masternode winners can be calculated yet.)
|
|
|
|
|