Bitcoin Forum
Author Topic: [deleted]  (Read 455 times)
moukafih.adil (OP)
Newbie · Activity: 1 · Merit: 0
December 06, 2017, 12:15:48 PM
Last edit: January 12, 2018, 12:26:06 AM by moukafih.adil
 #1

[deleted] (to mod: please delete this thread and the account with it for my privacy)
hasmukh_rawal
Copper Member · Full Member · Activity: 490 · Merit: 105
December 06, 2017, 12:46:16 PM
 #2

I am not much into mathematics, since I was weak at the subject back in my school days, but I do know quite a few things about the digital world. I have studied computer science all my life and know there are ways to solve any problem with the right mindset and intelligence. I have learned that data can sometimes be compressed without losing any information.
So I think it may be possible to compress the raw data (the transactions in a block) and use that to free up some space in the blocks. Lossless compression might be achievable by modifying the algorithm, but it would take many coding experts to find a feasible way to compress the raw data and make room for more data in the block. The question is: if it is possible, why hasn't anybody tried it yet? Undecided
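
As an aside on why generic compression buys so little here: most of the bytes in raw transaction data are hashes, signatures, and public keys, which look statistically random, and random-looking data is essentially incompressible. A minimal sketch with Python's standard zlib module (random bytes stand in for raw transaction data; the repetitive text is only there for contrast):

Code:
import os
import zlib

# Random bytes stand in for raw transaction data, which is dominated by
# hashes, signatures, and public keys and therefore looks statistically random.
random_like = os.urandom(100_000)

# Highly repetitive text, included only for contrast.
repetitive = b"the quick brown fox jumps over the lazy dog " * 2500

for label, data in [("random-like", random_like), ("repetitive", repetitive)]:
    compressed = zlib.compress(data, 9)
    print(f"{label:12s} {len(data):7d} -> {len(compressed):7d} bytes "
          f"({len(compressed) / len(data):.2%} of original)")

# Typical result: the random-like data comes out slightly larger than the
# input, while the repetitive text shrinks to a few percent of its size.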

nullius
Copper Member · Hero Member · Activity: 630 · Merit: 2614
December 06, 2017, 03:56:45 PM
 #3

Quote from: moukafih.adil
One thing that always comes to my mind is that a block is like a bus that passes every 10 minutes. In real life the bus has a theoretical, limited number of seats (the 1 MB limit analogy), but it is possible to fit more people into the bus with a little additional work, or "compression" work.

[...]

And in physics it is possible to compress anything into a small, limited volume, but that compression always needs additional work depending on the initial volume (black holes, for example).

So my theory is that it may be possible to find an algorithm that does the same thing with data and information. Given an arbitrarily large file, is it mathematically possible to compress it to less than 1 MB?

What an excellent idea!!  May I ask a humble question, maybe to improve your genius.  Why not feed the output of the compression program back into the compression program recursively?  You could compress the whole blockchain to be printed in a QR code for backup!  Or even the whole Internet!

Possible prior art:  WEB compressor, U.S. Patent 5,533,051, U.S. Patent 5,488,364, etc.  Tell me, is your method patented??

(Forum, please forgive me.  I never had the pleasure of suffering these in comp.compression.)


mda
Member · Activity: 144 · Merit: 13
December 06, 2017, 05:02:52 PM
 #4

In short, if this proposition is true:
"Every positive integer is a sum of Superior Highly Composite Numbers"
Quote
The first 15 superior highly composite numbers: 2, 6, 12, 60, 120, 360, 2520, 5040, 55440, 720720, 1441440, 4324320, 21621600, 367567200, 6983776800

1 is a positive integer, so the proposition isn't true, sorry.
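
The counterexample can be checked mechanically, using only the fifteen SHCNs quoted above. A minimal sketch in Python (the bound of 100 is an arbitrary choice): since every SHCN in the list is even, no odd number, starting with 1, can be written as a sum of them, with or without repetition.

Code:
# Which positive integers up to a small bound are sums of superior highly
# composite numbers, with repetition allowed?
SHCN = [2, 6, 12, 60, 120, 360, 2520, 5040, 55440, 720720, 1441440,
        4324320, 21621600, 367567200, 6983776800]

LIMIT = 100
reachable = {0}  # the empty sum
for n in range(1, LIMIT + 1):
    if any(n - s in reachable for s in SHCN if s <= n):
        reachable.add(n)

print([n for n in range(1, LIMIT + 1) if n not in reachable])
# Prints every odd number from 1 to 99: none of them is a sum of SHCNs,
# so the proposition already fails at 1.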
nullius
Copper Member · Hero Member · Activity: 630 · Merit: 2614
December 06, 2017, 05:26:13 PM
 #5

Quote from: nullius
What an excellent idea!!  May I ask a humble question, maybe to improve your genius.  Why not feed the output of the compression program back into the compression program recursively?  You could compress the whole blockchain to be printed in a QR code for backup!  Or even the whole Internet!

Possible prior art:  WEB compressor, U.S. Patent 5,533,051, U.S. Patent 5,488,364, etc.  Tell me, is your method patented??

(Forum, please forgive me.  I never had the pleasure of suffering these in comp.compression.)


Quote from: moukafih.adil
Yeah, I think it's possible, by compressing the compression, until the whole blockchain or the whole Internet fits in a QR code. And if you want to decompress, you let it grow back inside the program, like a tree growing from a small seed. That is, if the compression works.

My method is just a theory written on some pieces of paper; I haven't made any program yet to test whether it works.

No, I haven't patented it yet. I have no idea how to proceed as a Moroccan citizen, if you have any experience with patent paperwork.

To be honest, I don't care whether it is patented or not. The important thing for me is to make it work and keep it open to everyone. I don't care who gets the credit in the end.

I think the first important step for this method is to prove the mathematics behind it, the proposition that "every positive integer is a sum of Superior Highly Composite Numbers", then to prove that the summation/factorisation algorithm is computationally feasible for large numbers, and finally to write an open-source program to test it and make it real.

EDIT:
I just forgot the constraint of a dictionary of prime numbers.

Compressing a file needs a dictionary of prime numbers, and the same goes for decompressing. So compressing the blockchain or the Internet would need a huge dictionary of prime numbers in storage, which I don't think is feasible, unless some new formula to construct prime numbers is discovered, like the one for SHCNs. Who knows, in the future.
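
"Compressing the compression" is easy to test on a desktop, with no new mathematics. A minimal sketch with Python's zlib (any general-purpose lossless compressor behaves the same way): after the first pass the output is already close to random, so further passes stop shrinking it and start adding overhead instead.

Code:
import zlib

# Start from something very compressible, then feed each output back in.
data = ("All work and no play makes Jack a dull boy. " * 5000).encode()

for pass_no in range(1, 6):
    data = zlib.compress(data, 9)
    print(f"after pass {pass_no}: {len(data)} bytes")

# Typical result: a large drop on pass 1, then the size creeps back up,
# because pass 1 already produced high-entropy output.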

I want to frame this and hang it on my wall.  Tell me, have you ever considered designing a motor which drives an electrical generator to power the motor?  Or a train locomotive which holds a giant magnet in front, to pull the whole train forward?  Or a water wheel which powers a pump which moves the water back to the top of the water wheel?  Fascinating ideas.

I will grant that your Bitcoin scaling idea is superior to the scaling ideas behind BCH, BU, 2X, and their ilk.

Now, you go read Section 9 of the comp.compression FAQ.  Please don’t come back unless you’ve read that, else the forum may never forgive me.

Quote from: comp.compression FAQ
It is mathematically impossible to create a program compressing without loss all files by at least one bit (see below and also item 73 in part 2 of this FAQ). Yet from time to time some people claim to have invented a new algorithm for doing so. Such algorithms are claimed to compress random data and to be applicable recursively, that is, applying the compressor to the compressed output of the previous run, possibly multiple times. Fantastic compression ratios of over 100:1 on random data are claimed to be actually obtained.

[...]

9.2 The counting argument

Theorem:
No program can compress without loss all files of size >= N bits, for any given integer N >= 0.

Proof:
Assume that the program can compress without loss all files of size >= N bits.  Compress with this program all the 2^N files which have exactly N bits.  All compressed files have at most N-1 bits, so there are at most (2^N)-1 different compressed files [2^(N-1) files of size N-1, 2^(N-2) of size N-2, and so on, down to 1 file of size 0]. So at least two different input files must compress to the same output file. Hence the compression program cannot be lossless.

The proof is called the "counting argument". It uses the so-called pigeon-hole principle: you can't put 16 pigeons into 15 holes without using one of the holes twice.

The overcrowded pigeons were a hint.  I did not waste my time reading your equations.  What you are trying to do has been proved mathematically impossible.

By the way, JPEG is a lossy compression format.  It throws away information.  Your example of a compressed file was a JPEG.  Learn the difference before you try designing a compression algorithm.
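
The counting argument quoted above is small enough to check exhaustively for a tiny N. A minimal sketch in Python (N = 8 is an arbitrary choice): there are 2^N files of exactly N bits but only 2^N - 1 possible shorter files, so a scheme that shortens every N-bit file would have to send two different inputs to the same output.

Code:
from itertools import product

N = 8  # small enough to enumerate everything instantly

inputs = [''.join(bits) for bits in product('01', repeat=N)]
shorter_outputs = sum(2 ** k for k in range(N))  # strings of length 0 .. N-1

print(f"{len(inputs)} inputs of exactly {N} bits")
print(f"{shorter_outputs} possible outputs shorter than {N} bits")
# 256 inputs versus 255 possible shorter outputs: by the pigeonhole principle,
# at least two inputs would collide, so the scheme cannot be lossless.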

mda
Member · Activity: 144 · Merit: 13
December 06, 2017, 09:31:49 PM
 #6

There is a gap between 12×2 + 6×2 + 2×2 + 1 = 41 and 60.
Why not use a product of primes instead of SHCN addition?
After all, a product is just repeated addition, and it's the miner's job to repeat operations.