Megaupload shut down by the FBI
Jan 19 2012, 8:22 pm
By: Aristocrat
 

Feb 9 2012, 9:15 pm O)FaRTy1billion[MM] Post #81

👻 👾 👽 💪

Quote from Lanthanide
Quote from rockz
Take Farty's example: the MP3 is 1.47 MB; the PNG is 1.42 MB without decent compression.
Where does Farty say how big either of the files are?
In my only post. ;o

Actually, I guess not. Maybe it was in the shoutbox ... I was talking about it a bit somewhere at least. /Edit

Quote from Lanthanide
Also this simply doesn't make sense from a theoretical standpoint, unless the MP3 compression simply wasn't as "compressed" as possible. I think you'd see a similar drop in size if you put it into a .zip or .rar.
Compression isn't the major issue here ... other than being able to actually upload the image, the files would be the same size as you'd download anyway. And if you really want to get fancy you could just chunk the files into small pieces with any archiving tool, though that would be a pain to retrieve (reassembly could be automated).
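Something like this rough sketch would do it (Python; the filenames and chunk size are made up for illustration):

CHUNK_SIZE = 50 * 1024 * 1024  # hypothetical 50 MB pieces

def split_file(path, chunk_size=CHUNK_SIZE):
    # Write numbered .partNNN files next to the original.
    with open(path, "rb") as src:
        index = 0
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            with open(f"{path}.part{index:03d}", "wb") as out:
                out.write(chunk)
            index += 1
    return index  # number of pieces written

def join_file(path, part_count):
    # Concatenate the pieces back into the original file.
    with open(path, "wb") as out:
        for index in range(part_count):
            with open(f"{path}.part{index:03d}", "rb") as src:
                out.write(src.read())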



TinyMap2 - Latest in map compression! ( 7/09/14 - New build! )
EUD Action Enabler - Lightweight EUD/EPD support! (ChaosLauncher/MPQDraft support!)
EUDDB - topic - Help out by adding your EUDs! Or Submit reference files in the References tab!
MapSketch - New image->map generator!
EUDTrig - topic - Quickly and easily convert offsets to EUDs! (extended players supported)
SC2 Map Texture Mask Importer/Exporter - Edit texture placement in an image editor!
:farty: This page has been viewed [img]http://farty1billion.dyndns.org/Clicky.php?img.gif[/img] times!

Feb 10 2012, 7:34 am EzDay281 Post #82



Quote
Actually, I guess not. Maybe it was in the shoutbox ... I was talking about it a bit somewhere at least. /Edit
Don't know about that other guy, but you mentioned the sizes to me over MSN.

Quote
Also this simply doesn't make sense from a theoretical standpoint, unless the MP3 compression simply wasn't as "compressed" as possible. I think you'd see a similar drop in size if you put it into a .zip or .rar.
Well, the most "'compressed' as possible" algorithm would yield a one-byte file (and be pretty useless, but that's irrelevant).
No compression is perfect; any file can be compressed further by introducing a new algorithm.




Feb 10 2012, 8:27 am Lanthanide Post #83



Quote from EzDay281
Well, the most "'compressed' as possible" algorithm would yield a one-byte file (and be pretty useless, but that's irrelevant).
No compression is perfect; any file can be compressed further by introducing a new algorithm.
Please go do some information theory or something, because you don't know what you're talking about.

As usual, Wikipedia is a useful starting point: http://en.wikipedia.org/wiki/Information_entropy and http://en.wikipedia.org/wiki/Lossless_data_compression#Limitations




Feb 10 2012, 4:10 pm Aristocrat Post #84



Quote from Lanthanide
Please go do some information theory or something, because you don't know what you're talking about.

Actually, he's right. An algorithm told to interpret a single bit as "01001111101010010000010101111100101010001111111111110100100..." would achieve "compression" by reducing the size of the corresponding file to one bit (as on any device with said algorithm, the one-bit file would be "uncompressable"). It just happens to be useless because the size of the decompressor would be larger than the original file.
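To make that concrete, here's a toy sketch (Python; the payload bytes are a made-up stand-in, not the actual bit string above): the "decompressor" carries the entire file inside itself, which is why it can never pay for its own size.

# Toy single-file "decompressor": the payload is hard-coded, so the
# program is necessarily larger than the one-bit "archive" it expands.
PAYLOAD = bytes([0x4F, 0xA9, 0x41, 0x05, 0x7C, 0xFF])  # hypothetical file contents

def decompress(archive: bytes) -> bytes:
    # The one-bit file merely selects the stored payload.
    if archive == b"\x01":
        return PAYLOAD
    raise ValueError("this 'algorithm' only knows one file")

print(decompress(b"\x01"))  # reproduces the original bytes exactly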




Feb 10 2012, 8:01 pm FatalException Post #85



Hey, look what Reddit found




Feb 10 2012, 8:05 pm EzDay281 Post #86



Quote from Lanthanide
Quote from EzDay281
Well, the most "'compressed' as possible" algorithm would yield a one-byte file (and be pretty useless, but that's irrelevant).
No compression is perfect; any file can be compressed further by introducing a new algorithm.
Please go do some information theory or something, because you don't know what you're talking about.
I made a specific claim. You should explain specifically in what way it is irrelevant or false.
I'm quite confident the latter is not the case.

Quote from Aristocrat
Quote from Lanthanide
Please go do some information theory or something, because you don't know what you're talking about.

Actually, he's right. An algorithm told to interpret a single bit as "01001111101010010000010101111100101010001111111111110100100..." would achieve "compression" by reducing the size of said corresponding file to one bit (as on any device with said algorithm, the one-bit file would be "uncompressable". It just happens to be useless because the size of the decompressor would be larger than the original file.
Why would we be making a decompressor for a single file? It'd be simpler to just make a program that spits out the desired file with no input. A decompressor larger than a single instance of its output isn't necessarily useless if given multiple, different inputs (granted, there are only two possible ones for a single-bit compression).
On the other hand, a single-bit compression's corresponding decompression would, as stated, have only two possible outputs. Lossiness there is the problem.

Post has been edited 3 time(s), last time on Feb 10 2012, 8:49 pm by EzDay281.




Feb 10 2012, 9:07 pm Lanthanide Post #87



Quote from EzDay281
I made a specific claim. You should explain specifically in what way it is irrelevant or false.
I'm quite confident the latter is not the case.
I provided you with wikipedia links, did you read them?

You will discover that a "lossless compression algorithm", which is what we are discussing, must be able to reproduce the original document, otherwise it is not a "lossless compression algorithm". Therefore reducing a file to a single bit (or byte) is not a lossless compression algorithm.

Furthermore, if you read the Wikipedia pages, you will see that they say that for any compression algorithm, there are some inputs for which the resulting output will always be larger than the input; it is IMPOSSIBLE to make an algorithm that makes 100% of all inputs smaller. So your claim that you can take any file and give it to another algorithm that will always make it smaller is also wrong. If you read the page on Shannon's theory, which is about the entropy of information, you would know that there is a mathematical limit to compression which cannot be breached: when a file is already at this limit, using another algorithm on it will NOT make it smaller.

Consider the actual implications of what you're suggesting: we could find some string of algorithms that we could use to make any file 1/10th of its original size (or even smaller). And yet this doesn't happen in reality, because it is in fact impossible.
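A quick sanity check along these lines (a sketch using Python's standard zlib; random bytes stand in for a file that is already near the entropy limit):

import os
import zlib

original = os.urandom(100_000)        # incompressible, like well-compressed data
once = zlib.compress(original, 9)
twice = zlib.compress(once, 9)        # "compress harder" with a second pass

print(len(original), len(once), len(twice))
# Typically each pass ADDS a few bytes of container overhead rather than
# shrinking anything: there is no redundancy left to squeeze out.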




Feb 10 2012, 9:37 pm Sacrieur Post #88

Still Napping

A universal lossless compressor is impossible.

You must create a specialized algorithm that only reduces the files you need; the nature of the algorithm means that other, non-ideal inputs must increase in size.
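The counting argument behind this is short: there are 2^n distinct files of exactly n bits, but only 2^n - 1 files of fewer than n bits, so no lossless scheme can map every n-bit input to a shorter output. A brute-force check on tiny sizes (sketch in Python):

# Pigeonhole: compare the number of n-bit strings with all shorter strings.
for n in range(1, 9):
    inputs = 2 ** n                          # files of exactly n bits
    shorter = sum(2 ** k for k in range(n))  # files of 0..n-1 bits
    assert shorter == inputs - 1             # always one slot short
    print(f"n={n}: {inputs} inputs, only {shorter} possible shorter outputs")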




Feb 10 2012, 9:43 pm BiOAtK Post #89



Stop talking about compression. This has nothing to do with what the topic is about...




Feb 10 2012, 9:52 pm O)FaRTy1billion[MM] Post #90

👻 👾 👽 💪

Quote from Lanthanide
Furthermore, if you read the Wikipedia pages, you will see that they say that for any compression algorithm, there are some inputs for which the resulting output will always be larger than the input; it is IMPOSSIBLE to make an algorithm that makes 100% of all inputs smaller.
He never claimed otherwise to any of this. ;o

Quote
Consider the actual implications of what you're suggesting: we could find some string of algorithms that we could use to make any file 1/10th of its original size (or even smaller). And yet this doesn't happen in reality, because it is in fact impossible.
He never suggested that either.

Post has been edited 1 time(s), last time on Feb 10 2012, 9:58 pm by FaRTy1billion.




Feb 10 2012, 9:56 pm Lanthanide Post #91



Quote from O)FaRTy1billion[MM]
Quote from Lanthanide
Furthermore, if you read the Wikipedia pages, you will see that they say that for any compression algorithm, there are some inputs for which the resulting output will always be larger than the input; it is IMPOSSIBLE to make an algorithm that makes 100% of all inputs smaller.
He never claimed otherwise to any of this. ;o
Quote
Consider the actual implications of what you're suggesting: we could find some string of algorithms that we could use to make any file 1/10th of its original size (or even smaller). And yet this doesn't happen in reality, because it is in fact impossible.
He never suggested that either.
He said this: "No compression is perfect; any file can be compressed further by introducing a new algorithm," which has the implications of both of the above statements.




Feb 10 2012, 9:58 pm O)FaRTy1billion[MM] Post #92

👻 👾 👽 💪

For a given file (not any/every file) you can find better algorithms more suited to that file (or type of file), so that particular file will get compressed more than with other algorithms. Never was it suggested that "a single better algorithm for every file ever" could exist.

EDIT:
By the way, I edited the hell out of this trying to make myself more clear. xD "Any" seemed pretty ambiguous here, so I was replacing it with "every".
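For instance (a sketch using three standard-library Python codecs; the sample input is arbitrary), the same file compresses to noticeably different sizes under different algorithms, which is the whole point of picking one suited to the file:

import bz2
import lzma
import zlib

# Arbitrary repetitive sample; real inputs (maps, images, audio) vary far more.
data = b"the quick brown fox jumps over the lazy dog\n" * 1000

for name, compress in [("zlib", zlib.compress), ("bz2", bz2.compress), ("lzma", lzma.compress)]:
    print(f"{name}: {len(data)} -> {len(compress(data))} bytes")
# The winner changes with the input, so "best algorithm" is a per-file question.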

Post has been edited 1 time(s), last time on Feb 10 2012, 10:04 pm by FaRTy1billion.




Feb 10 2012, 10:00 pm Lanthanide Post #93



That's not necessarily true, because we are taking an *already compressed* file and trying to make it smaller. It is not always possible to find an algorithm for a given file that makes that file smaller; see the Wikipedia page:
Quote
Mark Nelson, frustrated over many cranks trying to claim having invented a magic compression algorithm appearing in comp.compression, has constructed a 415,241 byte binary file ([1]) of highly entropic content, and issued a public challenge of $100 to anyone to write a program that, together with its input, would be smaller than his provided binary data yet be able to reconstitute ("decompress") it without error.[7]





Feb 10 2012, 10:01 pm EzDay281 Post #94



Quote
I provided you with wikipedia links, did you read them?
https://www.google.com/
This website I just linked you to explains why you are wrong.
Somewhere.
Find it yourself.

Quote
You will discover that a "lossless compression algorithm", which is what we are discussing
...
mp3.
Lossless.
The hell are you talking about?

Quote
So your claim that you can take any file and give it to another algorithm that will always make it smaller is also wrong.
I never said that.
I referred to "a new algorithm" - not "a single algorithm", not "one of the algorithms we've already applied to a file", but "any one of the uncountable millions of millions of possible algorithms that can be defined on a modern computer in a few mega- or gigabytes."

Quote
Consider the actual implications of what you're suggesting: we could find some string of algorithms that we could use to make any file 1/10th of it's original size (or even smaller).
That's because to be useful in any way, we need to keep track of what algorithms we used so that we know how to decompress it.
And yes, in the vast majority of the set of all possible data strings, this data plus the compressed data will be as large or larger than the input.
Fortunately for me, that's not what we're talking about. We're talking about the result of an mp3 compression of what I assume is mostly naturally-recorded sound (though I suppose farty could have used white noise, techno music, a videogame recording or something as the source), which is an extremely limited subset of all possible data strings of that given size. So no, the output is not random noise; it is a predictable, pattern-rich string.
Unless you want to try to tell me that mp3 is a practically perfect compression (i.e. the output is effectively random noise for all practically useful decompression algorithms except mp3).

edit:
Ninja'd. Actually, ninja'd^5 or so.
Quote
Mark Nelson, frustrated over many cranks trying to claim having invented a magic compression algorithm appearing in comp.compression, has constructed a 415,241 byte binary file ([1]) of highly entropic content, and issued a public challenge of $100 to anyone to write a program that, together with its input, would be smaller than his provided binary data yet be able to reconstitute ("decompress") it without error.[7]
It took us 358 years to solve Fermat's Last Theorem.
"It's hard" and "it's mathematically impossible" are very different things.

Post has been edited 1 time(s), last time on Feb 10 2012, 10:07 pm by EzDay281.




Feb 10 2012, 10:05 pm Aristocrat Post #95



:facepalm:




Feb 13 2012, 2:16 am Oh_Man Post #96

Find Me On Discord (Brood War UMS Community & Staredit Network)

OK, Megashares' search function is back up. False alarm! *phew*




Feb 21 2012, 7:41 pm Gigins Post #97



This pretty much sums up the piracy thing.

http://www.escapistmagazine.com/videos/view/jimquisition/5268-Piracy-Episode-One-Copyright




Mar 9 2012, 2:17 am Zycorax Post #98

Grand Moderator of the Games Forum

Looks like Hotfile might be going down next: http://www.bbc.co.uk/news/technology-17300225



