New servers and file optimisation request!

Power Uploader
Joined
Feb 1, 2018
Messages
56
If you optimize all uploaded images, I hope you’ll only use the output if it’s actually smaller than the original file. My groups’ uploads are already fully optimized using various techniques. ;)

JPEGs can be optimized too. I use jpegrescan to losslessly optimize JPEGs. Yes, JPEGs are lossy, but it’s possible to reduce their file size without any additional quality loss by rearranging the encoded data. This reduces the size of non-optimized JPEGs by 25-33%.
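If the site wants to automate that, here's a minimal sketch of the "optimize, but only keep the result if it shrank" idea. It uses jpegtran (which covers the Huffman/progressive part of what jpegrescan does; jpegrescan itself additionally searches multiple scan layouts), and the file handling is illustrative:

[code]
# Lossless JPEG re-encode: same pixels, smaller file -- but only keep
# the output if it actually beats the original.
import os
import shutil
import subprocess
import tempfile

def optimize_jpeg_in_place(path: str) -> None:
    fd, tmp_path = tempfile.mkstemp(suffix=".jpg")
    os.close(fd)
    try:
        # Same DCT coefficients, optimized Huffman tables and a
        # progressive scan order. No pixel data is changed.
        subprocess.run(
            ["jpegtran", "-optimize", "-progressive", "-copy", "none",
             "-outfile", tmp_path, path],
            check=True,
        )
        if os.path.getsize(tmp_path) < os.path.getsize(path):
            shutil.move(tmp_path, path)   # optimized version won
        else:
            os.remove(tmp_path)           # original was already smaller
    except Exception:
        if os.path.exists(tmp_path):
            os.remove(tmp_path)
        raise
[/code]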

I’ve seen many groups save their edited grayscale pages as full color PNGs/JPEGs. Throwing away the empty color channels of a JPEG usually saves only a bit of space, but converting a colorless 24 bit RGB PNG to an 8 bit grayscale PNG often saves up to 20% (not 67%, even though you go from 24 to 8 bit, because the three identical color channels can be compressed somewhat efficiently).

Even more stupid is that some groups save their PNGs as 32 bit RGB+alpha or 16 bit grayscale+alpha, because that’s what Photoshop does by default, even when the alpha channel is fully opaque! Throwing away useless alpha channels can save up to 10%. Color and transparency detection would need some additional computing power during an automatic optimization run, though.
But seriously. Anyone who saves a grayscale page as 32 bit RGB+alpha should be forced to personally apologize to any user on a metered Internet connection. Worldwide. I’ve shaved 75% off some chapters I downloaded from some groups. That means that metered users pay four times as much as they need to. /rant
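For what it's worth, the detection isn't expensive. Here is a rough Pillow sketch of both checks described above (the exact-equality channel comparison is the simplest possible test; a real pass might allow a tiny tolerance):

[code]
# Drop a fully opaque alpha channel, and collapse "colourless" RGB
# to 8-bit grayscale, as described above.
from PIL import Image

def strip_useless_channels(path: str) -> None:
    img = Image.open(path)

    # Drop the alpha channel if every pixel is fully opaque.
    if img.mode in ("RGBA", "LA"):
        lo, _hi = img.getchannel("A").getextrema()
        if lo == 255:
            img = img.convert(img.mode.rstrip("A"))

    # Collapse RGB to grayscale if all three channels are identical.
    if img.mode == "RGB":
        r, g, b = img.split()
        if r.tobytes() == g.tobytes() == b.tobytes():
            img = r  # any channel will do; result is 8-bit "L" mode

    img.save(path, optimize=True)
[/code]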

Please don’t reduce the colors of PNGs to 16, though. That would make all the editors who paid extra attention to screentones and gradients cry (hint: dithering and banding) and quite often black and white won’t be 0 and 255 anymore, which looks shit and makes the editors cry more.
 
Group Leader
Joined
Mar 8, 2018
Messages
567
pngout shouldn't result in any change in the image, AFAIK (other than potential bitdepth changes) except that deleting all metadata also means removing the colourspace information. So if you strip out the metadata that says an image is sRGB, and your display isn't natively sRGB (or by extension, if the PNG was saved as Adobe RGB and you strip that out so the browser interprets it as sRGB), then yeah, there will be a visual difference. It's not exactly a change in image quality, it's just removing information needed to interpret the colourspace correctly.

Other advice for getting smaller image files:

1) Denoise your raws before starting work. Most raws are JPEG, and even high-quality raws benefit from a level 0 waifu2x denoise, while the vast majority need a level 1 waifu2x denoise. Some low-quality raws may even require a level 3 denoise. Even if you don't need to upscale your raws, you should still be using a level 0 or level 1 waifu2x denoise (or some equivalent), as the removal of JPEG compression artifacts results in a vastly more compressible image when it comes time to export as PNG. This also generally produces much nicer image quality for the readers. But don't go overboard! Only apply as much denoising as is required, no more. Level 1 normally does not have much if any loss of image detail, but level 3 is very heavy handed and will definitely lose fine detail. Level 3 should only be used as a last resort on really bad raws where the compression artifacts are much more distracting than any loss in detail.
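To batch that, something like the following works, assuming the waifu2x-ncnn-vulkan command-line build (-n is the noise level, -s the scale factor; -s 1 keeps the original resolution so only the JPEG artifacts go):

[code]
# Batch-denoise a folder of JPEG raws to PNG before editing.
import pathlib
import subprocess

def denoise_raws(src_dir: str, dst_dir: str, noise_level: int = 1) -> None:
    src, dst = pathlib.Path(src_dir), pathlib.Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    for page in sorted(src.glob("*.jpg")):
        out = dst / (page.stem + ".png")
        subprocess.run(
            ["waifu2x-ncnn-vulkan", "-i", str(page), "-o", str(out),
             "-n", str(noise_level), "-s", "1"],
            check=True,
        )
[/code]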

2) The monochrome part of colour manga pages often isn't truly monochrome, and compression suffers. In some of the series I work on, a page might have just one small area with some colour. As a result, you need to export the page as a 24-bit PNG instead of an 8-bit PNG. In theory, monochrome image data should compress equally well in 8-bit or 24-bit because the extra bit depth is redundant. But in practice, as soon as you have some colour on the page, what looks like black and white image content probably has veeeery subtle colour variations. Like, instead of a shade of gray being 87/87/87, it might be 87/88/87. You can't really *see* that difference, but it absolutely murders lossless compression.

The solution is a bit of editing. If the colour is constrained to a part of your image, the following steps can dramatically improve compression (I'm talking a 50%+ reduction in file size) in Photoshop, and probably works in other image editing tools:

1) Select just the colour part of the image, as tightly as possible, and copy it to the clipboard
2) Change the image mode from RGB to monochrome (this removes all the colour data)
3) Change the image mode back to RGB (now you have a true monochrome image in colour mode)
4) Paste the colour part of the image back from the clipboard ("Paste in place", SHIFT-CONTROL-V, to avoid having to reposition the pasted stuff)

Now when you go to export the PNG, everything except the colour bits is fully monochrome, and the PNG compression will work much better on it (particularly after using something like PNGGauntlet).
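For anyone scripting this instead, a rough Pillow equivalent of those four steps, assuming the coloured area is a known rectangle (the example box coordinates are made up):

[code]
from PIL import Image

def flatten_except_colour(path: str, colour_box: tuple) -> None:
    img = Image.open(path).convert("RGB")
    colour_part = img.crop(colour_box)       # step 1: grab the colour region
    grey = img.convert("L")                  # step 2: drop all colour data
    flattened = grey.convert("RGB")          # step 3: back to RGB, truly neutral
    flattened.paste(colour_part, colour_box[:2])  # step 4: paste in place
    flattened.save(path, optimize=True)

# e.g. flatten_except_colour("page_05.png", (120, 340, 480, 700))
[/code]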
 
Power Uploader
Joined
Feb 1, 2018
Messages
56
@Guspaz said:
If the colour is constrained to a part of your image, the following steps can dramatically improve compression (I'm talking a 50%+ reduction in file size) in Photoshop, and probably works in other image editing tools:

1) Select just the colour part of the image, as tightly as possible, and copy it to the clipboard
2) Change the image mode from RGB to monochrome (this removes all the colour data)
3) Change the image mode back to RGB (now you have a true monochrome image in colour mode)
4) Paste the colour part of the image back from the clipboard ("Paste in place", SHIFT-CONTROL-V, to avoid having to reposition the pasted stuff)
Hint: you can do that a lot more easily. Select the area without colours in the image (or select the coloured part and invert the selection), then use the Desaturate tool. Done.
(That’s what it’s called in Gimp, but I’m sure Photoshop has it too.)

The Desaturate tool is also great for single-colour pages (like this): Desaturate the whole page, level to black and white as if it were a normal greyscale page, then add a screen layer with the desired colour. Voilà, super clean solid-colour areas.
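If you'd rather script that than do it in an editor, here is a minimal Pillow sketch of the same trick (the levels cutoffs and the ink colour are made-up example values):

[code]
from PIL import Image, ImageChops

def colourize_page(path: str, colour=(255, 64, 96)) -> None:
    grey = Image.open(path).convert("L")
    # Crude levels pass: everything <=48 goes to black, >=208 to white.
    grey = grey.point(lambda v: 0 if v <= 48
                      else 255 if v >= 208
                      else int((v - 48) * 255 / 160))
    base = grey.convert("RGB")
    # Screen blend: black ink takes on the colour, white stays white.
    layer = Image.new("RGB", base.size, colour)
    ImageChops.screen(base, layer).save(path)
[/code]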
 
Power Uploader
Joined
Feb 8, 2018
Messages
14
Tip: FileOptimizer. It runs a TON of other programs, like Pingo, pngout, pngwolf, ECT, etc. Basically, it runs all the known optimizers in the most optimal order. It takes A LOT of time, but generates by far the best results.
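The core of what such chaining tools do is easy to replicate if you only need a couple of optimizers: run each one, keep the output only when it's smaller, and feed the winner to the next tool. The two commands below are examples; check each tool's actual flags before relying on this:

[code]
import os
import shutil
import subprocess
import tempfile

TOOLS = [
    lambda src, dst: ["zopflipng", "-y", src, dst],
    lambda src, dst: ["pngout", src, dst],
]

def crunch(path: str) -> None:
    for make_cmd in TOOLS:
        fd, tmp = tempfile.mkstemp(suffix=".png")
        os.close(fd)
        os.remove(tmp)  # some tools refuse to write over an existing file
        result = subprocess.run(make_cmd(path, tmp))
        if (result.returncode == 0 and os.path.exists(tmp)
                and 0 < os.path.getsize(tmp) < os.path.getsize(path)):
            shutil.move(tmp, path)  # this tool won; the next one chains from it
        elif os.path.exists(tmp):
            os.remove(tmp)
[/code]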
 
Group Leader
Joined
Mar 8, 2018
Messages
567
That's a good point, that would save a few keystrokes. I normally don't need to bother with desaturate because 99% of the pages I see are monochrome so the Photoshop macro I have to set up my work environment on each page has a step to change the image mode from RGB to monochrome, and then it just stays there. You want to be in monochrome mode (rather than just desaturating) if possible, because a lot of PNG optimization tools are not smart enough to notice that your 24-bit RGB image has three identical channels, so you want them to start from a point where they already know it's a monochrome image because it's an 8-bit PNG they're optimizing. Fully monochrome 24-bit PNG images are still a bit larger than 8-bit PNG images even once optimized.
 
Member
Joined
Feb 15, 2018
Messages
437
@XXXXXXXXXIII
Wouldn't it be easier to just convert to and store lossy JPGs when users upload a chapter?
A 100%/high-quality JPG is bigger than PNG-8 in 99.9 cases out of 100, unless you start lowering the quality to the point of visible JPG artifacts. I.e. grayscale b/w pages as PNG-8 and color pages as JPG is the most sane option.

@Dher
It takes A LOT of time, but generates by far the best results.
Usually it's not worth it. After a first optimization pass with https://css-ig.net/pingo or things like http://www.mediafire.com/file/hlr1xszqildnyqn/png_for_dummies.zip/file (most of the time all you need is to convert the scans to grayscale), any further optimizations gain around 0.5-1% at best.
The best optimization is to clean the scans of JPG artifacts and any additional garbage from the filters used, change the scan's mode to grayscale, and save as PNG-8 with 32-256 colors (depending on the quality of the raws). That leaves further optimization still useful, but nowhere near as effective as going from RGB to grayscale.
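A Pillow sketch of that "grayscale + PNG-8 with a limited palette" export (the colour count is a per-series judgment call, as said above; quantizing via RGB keeps the code on Pillow's well-trodden median-cut path):

[code]
from PIL import Image

def to_png8(src: str, dst: str, colors: int = 64) -> None:
    grey = Image.open(src).convert("L")
    # With a grayscale source, the median-cut palette entries are all
    # neutral grays, so the result is a true grayscale PNG-8.
    pal = grey.convert("RGB").quantize(
        colors=colors, dither=Image.Dither.FLOYDSTEINBERG)
    pal.save(dst, optimize=True)
[/code]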
 
Active member
Joined
Mar 5, 2019
Messages
761
Damnit, can't find my well-written tutorial on how to minimize file size while increasing quality. I could probably make an example image set, but the only things I have scanned at high enough quality to do this with are porn...

Anyway, let me go through the process.

Assuming you have control over the scans:
1) Get the image scanned at literally the highest detail you can. When I still had a working scanner, that would come out to around 200-400MB for an A4-sized piece of paper; with a digital camera, a macro lens, and an image stitching program, I could probably go far, FAR higher detail now. Needless to say, if you can see individual fibers in the paper, you're good enough to proceed; if you can't, redo the damn scan until you can.
2) If you want top quality, you will have to do this page by page. You can batch it, but you will run into problems if you go too light, or degrade quality if you overcompensate.
You want to completely remove any grain visible from the scan, right up to the edge of eating into the dots. Once you have that, pull the darkness down until all the black is solid black, because there is no grey in manga scans, only dots that make it look grey. This effectively makes a binary image: every pixel is either white or black.
3) Scale this stupidly large image up 2-9x and do all typesetting there. The extra size tends to make the text look better, and when you scale it down (IF you scale it down) it will ultimately look better.
4) Manga pages are 5x7 or 6x8 inches, so at 600 DPI the resulting images should be 3000x4200 or 3600x4800; that's just under and just over the pixel height of 8K (4320), respectively. The image you end up with sounds gigantic, but really it's about 1-2MB, and that's before putting it through any kind of PNG crunching program.

If you do not have control:
1) Look at your scan quality. Can you see the dots on even the worst of the images? Are they sharp? If yes, good: follow the process above, since your scans are good enough for it. If they are blurry, see whether adjusting the white and black levels fixes it; if not, keep going.
2) OK, your scan is shit. Nothing you can do about that besides scanning it yourself, but you aren't doing that, so it's "piss with the cock ya got" time. Run it through waifu2x and increase the size 4-9 times (maybe 15, depending on how horrific the scan is). This WILL result in a stupidly large image; deal with it. You didn't want to do the scan yourself, so you use HDD space and RAM to compensate.
Do all the editing and typesetting here.
3) Now you get a very fun job: you have to run your worst page through this process, and keep doing it till you get a good result.
-1 Get IrfanView and load the image into the batch converter.
-2 Output PNG, compression 9.
-3 Use the advanced settings: change color depth, custom colors 16, use Floyd-Steinberg dithering, use best color quality, convert to greyscale.
Take a look at the output file and compare it to the untouched file. Are there any visual differences? No? Then do -3 again with 8 colors, and from there reduce by 1 until you see differences you don't want to live with. (A scripted version of this loop is sketched at the end of this post.)
4) With most manga, if you have a good enough base image, you can get away with 4 or 5 colors; I have gotten one down to 3. This is a very tedious process, but now it gets more tedious: go to the resize option, preserve aspect ratio, use resample, set long or short side only, and shrink the image until quality issues come up. You had a shit scan, so you aren't getting an easy target number to hit. Doing this may also throw off the image, so you have to readjust the colors too. Do this until it's as small as you can make it without killing the image quality, likely in increments of 1000 pixels to start off and hundreds at the end.
5) Batch all greyscale images the same way and do a final check. You'll likely end up with files around 400KB-1.5MB each if you did this right.

Doing this produces images that are stupidly large in resolution, but modern browsers and image viewers will resample them on the user's end, so you get an image that scales with time and is as close to lossless quality as you can get this side of the mangaka being your friend and letting you scan their finished pages before they send them off.
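The scripted version of the colour-reduction loop from step 3, assuming Pillow: it writes one candidate per colour count so you can eyeball them side by side and keep the lowest count with no visible differences (filenames are examples):

[code]
import pathlib
from PIL import Image

def make_candidates(src: str, out_dir: str = "candidates") -> None:
    out = pathlib.Path(out_dir)
    out.mkdir(exist_ok=True)
    grey = Image.open(src).convert("L").convert("RGB")
    for colors in (16, 8, 7, 6, 5, 4, 3):
        cand = grey.quantize(colors=colors,
                             dither=Image.Dither.FLOYDSTEINBERG)
        dst = out / f"{pathlib.Path(src).stem}_{colors:02d}c.png"
        cand.save(dst, optimize=True)
        print(colors, "colors ->", dst.stat().st_size, "bytes")
[/code]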
 
Power Uploader
Joined
Dec 16, 2019
Messages
225
@alidan said:
Doing this produces images that are stupidly large in resolution, but modern browsers and image viewers will resample them on the user's end, so you get an image that scales with time and is as close to lossless quality as you can get this side of the mangaka being your friend and letting you scan their finished pages before they send them off.
The sheer amount of work. Also, 400KB is still larger than your usual compressed page.
 
Member
Joined
Aug 25, 2019
Messages
82
@BzzBzz Except Google actually uses it extensively on sites they control where it makes sense?
(i.e. where big bandwidth savings can be made)

As for browser support, technical sources show no compatibility issues with the most used browsers.

What you might be noticing though is that unless you're on mobile, using mobile apps or using specific browsers, Google won't give you the WebP version.

If you look at its stats, it's not used by many sites but it's preferred by high traffic websites, like big e-commerce sites.

Even Cloudflare (in 2018 they were handling 10 trillion requests per month, nearly 10 percent of all Internet requests) has offered to optionally serve your images as WebP to save on bandwidth since late 2016, and they point out massive 38% savings against PNG (vs 16% if they just optimize the original image).

And in lossless compression it wins against both newer formats that have yet to mature and old ones.
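A quick way to sanity-check that claim on your own pages, assuming a Pillow build with WebP support: re-encode a PNG losslessly as WebP and compare byte counts.

[code]
import os
from PIL import Image

def webp_savings(png_path: str) -> float:
    webp_path = png_path.rsplit(".", 1)[0] + ".webp"
    # In lossless mode, "quality" trades encode time for compression effort.
    Image.open(png_path).save(webp_path, lossless=True, quality=100, method=6)
    return 1 - os.path.getsize(webp_path) / os.path.getsize(png_path)

# e.g. print(f"{webp_savings('page_01.png'):.1%} smaller as lossless WebP")
[/code]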

Besides, why do you care if it's popular or not? If it works and it solves the problem of consuming less bandwidth what exactly is the problem?

In any case, optimizing the existing PNGs is still the first logical step. WebP is at best a consideration for the future, and entirely up to whoever is maintaining the site.
 
Active member
Joined
Mar 5, 2019
Messages
761
@hyoretsu The amount of work is minimal if you can dictate scan quality; if you can't, the process is more tedious than it is work:
16 colors
8 colors
4 colors
compare images to the original
if you can't go lower:
5 colors
6 colors
7 colors
and compare again. The thing that takes time here is image rendering rather than anything else.
From there, you take the image down in 1000-pixel increments until you see quality problems, and then go 100 pixels up or down from a 500-pixel increment mark.

There is a lot of hentai I like, and there are a lot of morons who can't save an image to save their lives and think 16MB is reasonable for a 1200-pixel-wide PNG. How those people loaded the image into Photoshop and did any editing at all baffles me, given their judgement of what an acceptable file size is.
 
Dex-chan lover
Joined
Mar 24, 2018
Messages
599
@ixlone
I did a whole discussion on this exact thing a while back. Nobody (well, maybe a few) understood the whole problem of media bloat. I wish I could link to that thread; thread search by user post name is not a thing yet.
Things to look out for with Pinga/pingo:
1. The author of the software does not respond well to feedback, and can't see what's wrong even when given a perfect process and examples.
2. The software has issues when dealing with a large number of files. The GUI will crash out mid-compression and not compress the remaining files.
3. Multi-threading is the only way to get anything done in a timely manner, but the resulting CPU workload goes to 100% across all processes, bogging the CPU down.
4. Because of problem #2, when it crashes out you cannot tell which file(s) caused the problem, and you can't trust pinga's claim that the remaining files aren't compressible.
5. All stats reported by pinga are borked. Never treat the bottom-line report in the program as trustworthy.
6. Never use pinga on JPG files; it does a very bad job and artifacts that file type. Use NXPowerLite Desktop 8.
7. If using NXPowerLite, delete its external PNG DLL; NXPowerLite does an even worse job on PNG than pinga does on JPG.
8. Some groups are already using a compression program like pinga. Their files will not compress much further.

Good stuff about pinga:
1. File compression is outstanding. I've tried them all; it's the best one that works and doesn't bork the media.
2. Sometimes the author will fix things. Rare, but it happens.
3. It's labor intensive, but: copy files out to a working directory, run pinga, and move the results back to the original location (overwriting the bigger file with the smaller one). Repeat until you're satisfied it won't do better.

#3 can resolve the crashing-out effect (through your labor). You just have to try different directories (orderings) to get around the problem files.
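That copy-out/optimize/copy-back-if-smaller routine is also easy to script, so a crash never touches the originals and the problem file identifies itself. The pingo invocation below is only an example; check your tool's actual flags:

[code]
import pathlib
import shutil
import subprocess

OPTIMIZER_CMD = ["pingo", "-s4"]  # example invocation

def optimize_batch(src_dir: str, work_dir: str = "work") -> None:
    work = pathlib.Path(work_dir)
    work.mkdir(exist_ok=True)
    for orig in pathlib.Path(src_dir).glob("*.png"):
        copy = work / orig.name
        shutil.copy2(orig, copy)
        # One file per invocation: if the tool crashes, you know which file.
        if subprocess.run(OPTIMIZER_CMD + [str(copy)]).returncode != 0:
            print("problem file:", orig.name)
            continue
        if copy.stat().st_size < orig.stat().st_size:
            shutil.move(str(copy), str(orig))  # overwrite bigger with smaller
        else:
            copy.unlink()
[/code]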

I have spent a lot of time on pinga/pingo. I have seen 13MB files reduced to 1.5MB. Not much, but then again, ant bites don't hurt much until you get a lot of them.

Lastly: just for laughs, look up the Urban Dictionary result for pinga on Google. I did!
BTW, you will get nothing but grief about this subject; I did months ago.
PS: I am against using WebP as a media type. I have seen it explode a file's size compared to the original.
 
Joined
Dec 11, 2018
Messages
2
@Aeder Your technical source shows Safari doesn't support it, so you're gonna meme some people by doing it. (IMO that's what they get for using Safari 😀)
 
Staff
Admin
Joined
May 29, 2012
Messages
594
@doppler

Maybe the crashing is due to a lack of resources?

I dropped over 5k images of various sizes and resolutions on it, and while the GUI locked up, it finished all of them just fine.

A chapter doesn't have that many pages though, so it's generally done super fast for me.
 
The One
Staff
Admin
Joined
Jan 18, 2018
Messages
1,088
The issue with using webp is that many meme browsers still don't support it. Be it older browsers where people don't update because of reasons, or obscure FOSS browsers that just don't support it yet. (Apparently we have multiple people reading primarily on their PS4s)
 
Miku best girl
Admin
Joined
May 29, 2012
Messages
1,441
Pretty sure Safari doesn't support it... And half of our users are on iOS lol
 
Dex-chan lover
Joined
Mar 24, 2018
Messages
599
@ixlone
There is no rhyme or reason to the crash-outs. Could be resources, could be solar flares, or GOD playing dice and hiding the dice. <-- Look that one up. I have proven it to myself (found files with a problem): I re-saved a problem file with GIMP and tried pinga again. Same problem. The problem is in his GUI: it will pull more than 8 files to process even though my i7 only has 8 threads. The results are bog-down worthy across the board; it commits too many threads.
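One way around a GUI that over-commits threads is to drive the optimizer yourself with a capped worker pool (the command is a placeholder, as before):

[code]
import os
import subprocess
from concurrent.futures import ThreadPoolExecutor

def optimize_all(files, cmd=("pingo", "-s4")):
    # Leave a couple of threads free so the machine stays responsive.
    workers = max(1, (os.cpu_count() or 4) - 2)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda f: subprocess.run(list(cmd) + [f]), files)
        for f, r in zip(files, results):
            if r.returncode != 0:
                print("problem file:", f)
[/code]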

Reducing files will save on space and bandwidth; that bullet must be bitten. My recommendation: do it on an offline system. Start with the newest and work toward the oldest. The oldest will only save space, since nobody is pulling those records.
 