2 questions about winrar!

Discussion in 'Software' started by jollyr0ger, Mar 13, 2021.

  1. jollyr0ger

    jollyr0ger Guest

    Hello everybody. I was wondering what software is used to create these checksum files? (see attachment 1.jpg)
    And also: what settings should I use to get the most compression, especially for sample packs?
     
  3. 5teezo

    5teezo Audiosexual

    Joined:
    Feb 2, 2012
    Messages:
    2,063
    Likes Received:
    1,173
    WinRar can create the checksum as well.

    Try .rar, .zip, and .7z for compression and compare them. Think about convenience as well: people don't like having to install extra software because you used an obscure packing algorithm to save 5 MB of space.
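    If you want a rough feel for how the common algorithm families compare on your own files, here is a minimal Python sketch (the file name is a placeholder; RAR itself is proprietary so it isn't in the standard library, but DEFLATE is what .zip uses and LZMA is what .7z uses):
    Code:
    import lzma, zlib
    from pathlib import Path

    data = Path("samples.wav").read_bytes()   # any test file, placeholder name

    sizes = {
        "deflate (.zip)": len(zlib.compress(data, 9)),
        "lzma (.7z)":     len(lzma.compress(data, preset=9)),
    }
    for name, size in sizes.items():
        print(f"{name}: {size / len(data):.1%} of original size")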
     
    • Interesting x 1
  4. jollyr0ger

    jollyr0ger Guest

    I read the manual but didn't find how to create those checksum .sfv files! I think they use an external program.
    I like WinRAR. What weird formats are you referring to? You can also compress to .zip if you need to share files on the web, etc.
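    For what it's worth, the .sfv format itself is trivial: one CRC-32 per file, written as eight hex digits after the file name. A minimal Python sketch of what such an external tool produces (the folder name is just an example; real tools read large files in chunks):
    Code:
    import zlib
    from pathlib import Path

    folder = Path("Sample Pack")              # example folder to checksum
    with open(folder.name + ".sfv", "w", encoding="utf-8") as sfv:
        sfv.write("; simple SFV generator sketch\n")
        for f in sorted(folder.iterdir()):
            if f.is_file():
                crc = zlib.crc32(f.read_bytes())
                sfv.write(f"{f.name} {crc:08X}\n")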
     
  5. Ariel Gonzalez

    Ariel Gonzalez Platinum Record

    Joined:
    Nov 26, 2020
    Messages:
    523
    Likes Received:
    209
    Location:
    Somewhere Out There
    When you add files to a new archive, there is an option to split it into volumes of a set size (in MB).
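    If the rar command-line tool is installed, the same split-into-volumes option can be scripted; a minimal sketch, assuming rar is on PATH and with placeholder names:
    Code:
    import subprocess

    subprocess.run(
        ["rar", "a",        # a = add files to a new archive
         "-v500m",          # split into 500 MB volumes (.part1.rar, .part2.rar, ...)
         "SamplePack.rar",
         "Sample Pack/"],
        check=True,
    )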
     
  6. DoubleTake

    DoubleTake Audiosexual

    Joined:
    Jul 16, 2017
    Messages:
    2,197
    Likes Received:
    1,152
    I used to just use the default settings with "best" compression.
    When I first began to pay attention, I found that most archives I downloaded were using RAR4, and I figured that was for backwards compatibility.

    I compared RAR4 vs RAR5, and RAR5 seemed to compress a little better, but I got different results depending on dictionary size.
    I am not sure of the best way, but I settled on RAR5 with best compression.
    I believe I have the other settings all at default, except that I increased the dictionary size to 1024 MB (1 GB) because I expect to have plenty of memory.
    "Compression needs about 6 times more memory than dictionary size. Decompression takes slightly more memory than just 1 dictionary size."
    So, with at least 6 GB of memory it should not be a problem. These days at least 16 GB is more common (even my old WinXP machine had 8 GB).
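    As a quick sanity check of that rule of thumb:
    Code:
    # WinRAR help rule of thumb: compression ~6x dictionary size, decompression ~1x
    dict_size_gb = 1024 / 1024            # 1024 MB dictionary = 1 GB
    print(f"compressing:   ~{dict_size_gb * 6:.0f} GB of RAM")   # ~6 GB
    print(f"decompressing: just over {dict_size_gb:.0f} GB of RAM")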
    ---
    I began to make solid archives of Kontakt libraries, since it is not often that I need to pull an individual file out of them.
    " * WinRAR provides functionality for creating a 'solid' archive,
    which can raise the compression ratio by 10% - 50% over more common
    methods, particularly when packing large numbers of small files."

    The WinRAR help says that it takes longer to extract individual items from solid archives and there is less chance of recovering a corrupted file, but I don't include recovery volumes anyway; instead I mirror my archive drive to a second drive.
    ----
    So, for big archives I make them solid: 1024 MB dictionary size, best compression, store symbolic links and hard links "as links", and add 'quick open' information for larger files (I did not realize I had that one selected, but it may be the default).
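    Those settings should map onto the rar command line roughly like this; a hedged sketch with placeholder names (the switch spellings are from the rar manual as I remember them, so double-check against rar's built-in help):
    Code:
    import subprocess

    subprocess.run(
        ["rar", "a",
         "-ma5",     # create a RAR5-format archive
         "-m5",      # "best" compression method
         "-md1g",    # 1 GB (1024 MB) dictionary
         "-s",       # solid archive
         "-ol",      # store symbolic links as links
         "-oh",      # store hard links as links
         "-qo",      # add quick open information for larger files
         "KontaktLib.rar",
         "Kontakt Library/"],
        check=True,
    )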
     
    • Like x 1
    • Useful x 1
  7. Haliax

    Haliax Guest

    If you want better compression, LZMA often outperforms the most common compression algorithms (WinZip, WinRAR, etc.). You could also consider FreeArc, which is comparable. 7-Zip uses LZMA.
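    Python's standard library happens to ship the same LZMA family, so you can try it on a folder without installing anything extra; a minimal sketch with a placeholder folder name:
    Code:
    import tarfile

    # pack a folder into a .tar.xz (xz = LZMA2); preset=9 is the strongest level
    with tarfile.open("samples.tar.xz", "w:xz", preset=9) as tar:
        tar.add("Sample Pack")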
     
    • Interesting x 1
    • Useful x 1
  8. jollyr0ger

    jollyr0ger Guest

    Thanks for all the compression tips! I already use RAR5, the "best" method and a 1024 MB dictionary; I thought it could compress more! Example: 1.jpg
     
    • Interesting x 1
  9. DoubleTake

    DoubleTake Audiosexual

    Joined:
    Jul 16, 2017
    Messages:
    2,197
    Likes Received:
    1,152
    Something I forgot: I kept finding R2R packages with compression ratios I could not match, no matter what settings I tried.
    I wondered if they had somehow hacked WinRAR to get better compression ratios.
    Then I also found a few 7-Zip files I could not match, so I am wondering... there must be some better ways, depending on the source files, or what?
     
    • Interesting x 2
    • Like x 1
  10. Haliax

    Haliax Guest

    Compression algorithms depend on the source; some work better than others on a given kind of data. You could find the best algorithm for one file, then use it on a different file and it performs poorly. It's a mix of trial and error and experience.

    I've seen 100 GB compressed down to less than 30 GB, but the decompression takes hours.
     
    Last edited by a moderator: Mar 13, 2021
    • Agree x 2
    • Interesting x 2
  11. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    6,988
    Likes Received:
    3,862
    Location:
    Europe
    As Haliax says, there are a lot of factors that affect the compression ratio.
    For instance, when I download "no install" releases they are always self-extracting 7-Zip files (the counterpart of WinRAR's self-extracting EXEs) with a huge compression ratio. But who knows how much time they take.
    Considering you seem to know quite a lot from your posts, I wouldn't sweat it unless it's really important.

    The only thing I'd add is about solid compression. That one can do wonders when you compress many small files of the same format. The reason is that it treats all the files as one big block of data (think of it like compressing an ISO file),
    so you avoid the per-file overhead and everything is compressed as one continuous stream.

    Also, a list of already-compressed formats has become indispensable for me, e.g. .flac files. Compressed file formats can't be compressed much further, even when the original method is old and dated; of course this includes .rar and .zip files themselves. You tell WinRAR which extensions are already compressed and it only "stores" them (like the 'Store' compression method). This speeds things up a lot.

    This is my list, which includes Kontakt's and Toontrack's huge compressed sample formats:
    Code:
    *.gp7bank *.rar *.zip *.lzma *.tgz *.gz *.z *.tbz *.tbz2 *.bz *.bz2 *.7z *.mov *.mpeg *.mpg *.mkv *.avi *.flv *.mp4 *.m4v *.m4a *.mp3 *.flac *.swf *.aac *.ac3 *.ogg *.ogm *.jpeg *.jpg *.gif *.png *.jp2 *.pdf *.nkx *.nkc *.nks *.nxp *.f4v *.msi *.ncw *.xpak *.obw *.db *.bfdca *.n2v *.n2p *.n3v *.n3p *.cab *.msi *.esd *.wim *.vsix *.cab *.msi *.mshc *.klz
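    A quick way to see why storing these pays off: run a compressor over its own output and nothing more is gained. A small stdlib sketch (DEFLATE here, but the same logic applies to .flac, .mp3, .nkx and the rest of the list):
    Code:
    import os, zlib

    raw = b"some repetitive audio-ish data " * 100_000 + os.urandom(1 << 20)
    once = zlib.compress(raw, 9)
    twice = zlib.compress(once, 9)           # second pass works on high-entropy data

    print(len(raw), len(once), len(twice))   # the second pass saves next to nothing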
     
    • Love it! x 1
    • Useful x 1
  12. DoubleTake

    DoubleTake Audiosexual

    Joined:
    Jul 16, 2017
    Messages:
    2,197
    Likes Received:
    1,152
    Nice! I am missing something, though.
    I can't see a place to have WinRAR only store those types.
    Do I place the extension list in a setting somewhere?
     
  13. jollyr0ger

    jollyr0ger Guest

    Hey @Xupito, in my example I compressed a folder of Loopmasters .wav samples. Do you think that is decent compression?
     
    • Interesting x 1
  14. ArticStorm

    ArticStorm Moderator Staff Member

    Joined:
    Jun 7, 2011
    Messages:
    7,240
    Likes Received:
    3,525
    Location:
    AudioSexPro
    I am using Total Commander by Ghisler, which can create all sorts of checksums and hashes.
     
  15. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    6,988
    Likes Received:
    3,862
    Location:
    Europe
    Open WinRAR
    Options menu -> Settings
    Compression tab
    Create default profile button
    Files tab
    "Files to store without compression" text input

    Damn, what a bunch of steps now that I write it out. No wonder most people don't know about it.
    I don't know, it depends on the kind of sounds. Usually 60% or lower is good compression, but it varies.
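    For anyone scripting this instead, the command-line counterpart of that setting appears to be the -ms switch (a semicolon-separated list of extensions to store uncompressed); a hedged sketch with placeholder names, worth verifying against rar's own help:
    Code:
    import subprocess

    subprocess.run(
        ["rar", "a", "-ma5", "-m5", "-s",
         "-msflac;mp3;ogg;zip;rar;7z;ncw;nkx",   # store these types instead of compressing
         "SamplePack.rar",
         "Sample Pack/"],
        check=True,
    )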

    Finally, this is a nice Windows shell extension to compute or verify hashes:
    https://github.com/gurnec/HashCheck/releases/latest
     
    • Love it! x 1
    • Useful x 1
  16. phumb-reh

    phumb-reh Guest

    Here's some fancy nighttime reading: https://peazip.github.io/fast-compression-benchmark-brotli-zstandard.html

    Note that it is mainly about speed with regard to Zstd and Brotli, but in the last chapter you'll see comparisons against RAR/ZIP/7-Zip.

    My settings for archival are: RAR5 format, 512 MB dictionary size, "Good" compression, solid archive, recovery record.

    For "casual use" or sending files, then smaller dictionary size will do, recovery record (non-solid archive)

    Also: encryption if it's going anywhere near a cloud drive.
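    Expressed as a single rar command, those archival settings would look roughly like this; a hedged sketch (placeholder names, and the 5% recovery record size is only an example):
    Code:
    import subprocess

    subprocess.run(
        ["rar", "a",
         "-ma5",      # RAR5 format
         "-m4",       # "good" compression
         "-md512m",   # 512 MB dictionary
         "-s",        # solid archive
         "-rr5p",     # recovery record, ~5% of archive size
         "-hp",       # encrypt file data *and* names (prompts for a password)
         "Backup.rar",
         "Projects/"],
        check=True,
    )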

    *edit:* Oh, and for checksums, if they're needed, I use "sha256sum", though WinRAR can create BLAKE2 hashes directly.
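    Both of those hashes are also one call away in Python's hashlib, if you ever need them in a script (the folder name is a placeholder):
    Code:
    import hashlib
    from pathlib import Path

    def file_digest(path, algo="sha256"):
        h = hashlib.new(algo)                 # "sha256", "blake2b", ...
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):   # 1 MB chunks
                h.update(chunk)
        return h.hexdigest()

    for p in sorted(Path("Sample Pack").iterdir()):
        if p.is_file():
            print(file_digest(p, "sha256"), p.name)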
     
    Last edited by a moderator: Mar 13, 2021
    • Interesting x 1
    • Love it! x 1
    • Useful x 1
  17. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    6,988
    Likes Received:
    3,862
    Location:
    Europe
    • Like x 1
    • Agree x 1
    • Interesting x 1
  18. phumb-reh

    phumb-reh Guest

    Well, they're not doing too badly on compression ratio, but I think Brotli was originally designed to replace gzip compression in HTTP transfers, where you want fast compression and decompression, or for other on-line compression uses.

    But yeah, so far they're not close to RAR/7z efficiency for archiving.
     
    • Like x 1
    • Interesting x 1
  19. DoubleTake

    DoubleTake Audiosexual

    Joined:
    Jul 16, 2017
    Messages:
    2,197
    Likes Received:
    1,152
    OK, great!
    I had a few profiles already set up for different compression types and formats, with and without passwords, so I will need to add this to each profile and re-save them.
     
  20. jollyr0ger

    jollyr0ger Guest

    Thanks for the information on compression. I was already using HashCheck; I hadn't noticed that I could also create checksum files with it!
     
  21. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    6,988
    Likes Received:
    3,862
    Location:
    Europe
    That makes a lot of sense. Sometimes one forgets that many web files are (or can be) gzip-compressed between the server and the browser.
     