I would use casino-grade dice and test them for bias before using them. To convert rolls to binary without bias I would use this mapping: 1 → 1, 2 → 0, 3 → 00, 4 → 01, 5 → 10, 6 → 11.
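For what it's worth, a minimal sketch of that mapping in Python (the dict just transcribes the table above; the rolls come from your physical dice, the list here is only an example):

```python
# Transcription of the mapping above: faces 1 and 2 emit one bit,
# faces 3-6 emit two bits.
DICE_TO_BITS = {1: "1", 2: "0", 3: "00", 4: "01", 5: "10", 6: "11"}

def rolls_to_bits(rolls):
    """Concatenate the bit string each die face maps to."""
    return "".join(DICE_TO_BITS[r] for r in rolls)

print(rolls_to_bits([1, 3, 6, 2]))  # -> 100110
```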
that sounds sketchy, but it could be valid. i have no idea. the thing is, though, think about it: you're trying to get a 256-bit binary string. 3, 4, 5 and 6 waste entropy because they take two bits to encode yet have the same probability of occurring as the one-bit numbers, so you extract fewer bits per roll than a die could give you. that seems problematic, maybe. i'm not sure.
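to put a number on the waste (assuming the mapping from the post above): a fair d6 carries log2(6) ≈ 2.585 bits of entropy per roll, but the mapping averages only (1+1+2+2+2+2)/6 ≈ 1.667 output bits per roll. a quick check:

```python
import math

DICE_TO_BITS = {1: "1", 2: "0", 3: "00", 4: "01", 5: "10", 6: "11"}

# Each face is equally likely, so the average output length is just the
# mean of the codeword lengths.
avg_bits_per_roll = sum(len(b) for b in DICE_TO_BITS.values()) / 6
max_bits_per_roll = math.log2(6)  # entropy of one fair d6 roll

print(f"{avg_bits_per_roll:.3f} bits/roll extracted")   # 1.667
print(f"{max_bits_per_roll:.3f} bits/roll available")   # 2.585
```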
also, you have no idea how many dice rolls are going to be needed to generate your 256-bit number, which seems problematic too. it could be as many as 256 rolls or as few as 128, or anything in between. which brings up another question: what if you've collected 255 bits and then, on your final roll, you happen to roll a 3, 4, 5 or 6? then you have 257 bits, which is a problem.
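a quick simulation of that variability (using python's `random` as a stand-in for the physical die, seeded so it repeats; the mapping is the one proposed above):

```python
import random

DICE_TO_BITS = {1: "1", 2: "0", 3: "00", 4: "01", 5: "10", 6: "11"}

def rolls_until_256_bits(rng):
    """Roll until at least 256 bits are collected; return the roll count
    and the number of bits actually produced (256 or 257)."""
    bits = ""
    rolls = 0
    while len(bits) < 256:
        bits += DICE_TO_BITS[rng.randint(1, 6)]
        rolls += 1
    return rolls, len(bits)

rng = random.Random(42)
counts = [rolls_until_256_bits(rng)[0] for _ in range(1000)]
print(min(counts), max(counts), sum(counts) / len(counts))
# always between 128 and 256 rolls; the average lands near 256/1.667 ≈ 154
```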
it means you have to start all over again, i guess. or maybe you could just throw away the extra bit.
partly correct, but you would not need to do all that fancy stuff you mentioned above, since ian coleman's tool has a "base 6" option. you can just enter your rolls using the digits 1-6. for example, 2342356533533225644331....
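for scale: 100 rolls is enough, since 100 × log2(6) ≈ 258 bits. a rough sketch of the base-6 idea (this is only an illustration of interpreting rolls as a base-6 number, not necessarily how ian coleman's tool parses the digits; mapping a rolled 6 to the digit 0 is my own assumption):

```python
import math

def dice_to_int(rolls: str) -> int:
    """Interpret a string of d6 rolls as a base-6 number.
    A rolled '6' is treated as digit 0 (assumption for illustration)."""
    value = 0
    for ch in rolls:
        digit = int(ch) % 6  # '6' -> 0, '1'..'5' -> 1..5
        value = value * 6 + digit
    return value

# 100 rolls carry roughly 258 bits, comfortably over 256:
print(math.ceil(256 / math.log2(6)))  # -> 100
```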
Do you think this is a valid method?
not as valid as using o_e_l_e_o's coin flipping bias eliminator method. it's in this thread...
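i don't know exactly what's in that thread, but the classic coin-flip bias eliminator is von neumann's trick: flip twice, output 0 for heads-tails, 1 for tails-heads, and discard both heads-heads and tails-tails. a sketch (the biased coin here is simulated just to demo it):

```python
import random

def von_neumann(bits):
    """Debias a bit stream: for each non-overlapping pair,
    01 -> 0, 10 -> 1, and 00/11 pairs are discarded."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

# demo with a heavily biased simulated coin (80% ones)
rng = random.Random(0)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(10000)]
fair = von_neumann(biased)
print(sum(fair) / len(fair))  # close to 0.5 despite the 80/20 input
```

the cost is throughput: with an 80/20 coin only about a third of the pairs survive, but the kept bits are unbiased as long as the flips are independent.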