...making Linux just a little more fun!
Ben Okopnik [ben at linuxgazette.net]
I've always been curious about the huge disparity in file sizes between certain images, especially when they have - oh, more or less similar content (to my perhaps uneducated eye). E.g., I've got a large list of files on a client's site where I have to strike some balance between a good average display size (the pic that pops up when you click the thumbnail) and a reasonable file size (something that won't crash PHP/GD - a 2MB file brings things right to a halt.)
Here's the annoying thing, though:
ben at Jotunheim:/tmp$ ls -l allegro90_1_1.jpg bahama20__1.jpg; identify allegro90_1_1.jpg bahama20__1.jpg
-rwxr-xr-x 1 ben ben   43004 2010-09-28 19:43 allegro90_1_1.jpg
-rwxr-xr-x 1 ben ben 1725638 2010-09-28 14:37 bahama20__1.jpg
allegro90_1_1.jpg JPEG 784x1702 784x1702+0+0 8-bit DirectClass 42kb
bahama20__1.jpg[1] JPEG 2240x1680 2240x1680+0+0 8-bit DirectClass 1.646mb
The first image, which is nearly big enough to cover my entire screen, is 42k; the second one, while admittedly about 3X bigger in one dimension, is 1.6MB+, over 40 times the file size. Say *what*?
And it's not like the complexity of the content is all that different; in fact, visually, the first one is more complex than the second (although I'm sure I'm judging it by the wrong parameters. Obviously.) Take a look at them, if you want:
https://okopnik.com/images/allegro90_1_1.jpg
https://okopnik.com/images/bahama20__1.jpg
So... what makes an image - seemingly of the same type, according to what "identify" is reporting - that much bigger? Does anybody here know? And is there any way to make the file sizes closer without losing a significant amount of visual content?
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Anderson Silva [afsilva at gmail.com]
> So... what makes an image - seemingly of the same type, according to
> what "identify" is reporting - that much bigger? Does anybody here know?
> And is there any way to make the file sizes closer without losing a
> significant amount of visual content?
It has to do with the compression of the image. I am going to guess someone else will have a scientific explanation, so I am not even going to bother... There is an interesting graph on Wikipedia here: https://en.wikipedia.org/wiki/JPEG#JPEG_compression
But related to your photo, one thing that I noticed is that if you open yours up with an image viewer and keep zooming in, you will quickly notice that the smaller photo starts getting 'pixelated' first.
This is very useful for what digital cameras call 'digital zoom', which is basically the process of zooming into the pixel density of the image instead of using optics.
Not sure if it helps at all, but I thought I'd say something :-p
AS
Michael SanAngelo [msanangelo at gmail.com]
I believe it has something to do with the file compression and resolution. If you were to compress the image by lowering the image quality to something like 75%, you would find the file size reduced even more. Also, the resolution plays a part: the higher the resolution, the larger the file size, because there are more pixels to store.

--------------------------------------------------------------
Want real freedom to use your PC the way you want? Get Linux. It's not just for geeks anymore, and it's free! --- Go to Ubuntu.com
On Tue, Sep 28, 2010 at 7:04 PM, Ben Okopnik <ben at linuxgazette.net> wrote:
> [...]
>
> So... what makes an image - seemingly of the same type, according to
> what "identify" is reporting - that much bigger? Does anybody here know?
> And is there any way to make the file sizes closer without losing a
> significant amount of visual content?
Jimmy O'Regan [joregan at gmail.com]
On 29 September 2010 01:14, afsilva at gmail.com <afsilva at gmail.com> wrote:
>> So... what makes an image - seemingly of the same type, according to
>> what "identify" is reporting - that much bigger? Does anybody here know?
>> And is there any way to make the file sizes closer without losing a
>> significant amount of visual content?
>
> It has to do with the compression of the image. I am going to guess someone
> else will have a scientific explanation, so I am not even going to bother...
> There is an interesting graph on Wikipedia here:
> https://en.wikipedia.org/wiki/JPEG#JPEG_compression
On top of that, most digital cameras add a thumbnail of about 1/20th of the file size, lots of cameras duplicate information in EXIF, IPTC, and XMP (which is uncompressed XML), etc., but the compression - and, as it's a lossy compression, the homogenisation of similar areas - is the main factor.
--
<Leftmost> jimregan, that's because deep inside you, you are evil.
<Leftmost> Also not-so-deep inside you.
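A quick way to see that metadata overhead for yourself - a sketch, using one of the files from this thread - is to grep ImageMagick's verbose output for the embedded profiles:

# Lists each embedded profile and its size in bytes, e.g.
#   Profile-exif: 868 bytes
#   Profile-xmp: 5439 bytes
identify -verbose bahama20__1.jpg | grep 'Profile-'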
Ben Okopnik [ben at linuxgazette.net]
On Tue, Sep 28, 2010 at 08:14:13PM -0400, Anderson Silva wrote:
> So... what makes an image - seemingly of the same type, according to
> what "identify" is reporting - that much bigger? Does anybody here know?
> And is there any way to make the file sizes closer without losing a
> significant amount of visual content?
>
> It has to do with the compression of the image. I am going to guess someone
> else will have a scientific explanation, so I am not even going to bother...
> There is an interesting graph on Wikipedia here:
> https://en.wikipedia.org/wiki/JPEG#JPEG_compression
>
> But related to your photo, one thing that I noticed is that if you open
> yours up with an image viewer and keep zooming in, you will quickly notice
> that the smaller photo starts getting 'pixelated' first.
The thing is that the maximum "zoom" at which these are going to get displayed is 1. However, there's no way to make a safe guess for what's still going to look OK at a given compression.
> This is very useful for what digital cameras call 'digital zoom', which is
> basically the process of zooming into the pixel density of the image
> instead of using optics.
Right - which is why buying digital cameras based on just "zoom ratio" is silly; you have to distinguish between optical and digital zoom.
> Not sure if it helps at all, but I thought I'd say something :-p
Hey, bits of knowledge are a good thing. They can often be assembled.
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Anderson Silva [afsilva at gmail.com]
> The thing is that the maximum "zoom" at which these are going to get
> displayed is 1. However, there's no way to make a safe guess for what's
> still going to look OK at a given compression.
Well, not necessarily (maybe yours, yeah) - but in the photography that I do, a lot of the time I will crop an image to give the 'digital zoom' effect, and on a computer screen it still looks OK.
For example:
https://www.flickr.com/photos/afsilva/5018924879/
https://www.flickr.com/photos/afsilva/4884510746/
With a very high-resolution original, I was able to crop in and give the impression that I was shooting with some major zoom lens.
AS
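The cropping trick Anderson describes is easy to reproduce from the command line. A minimal sketch with ImageMagick - the file names, region size, and offsets here are made up for illustration:

# Cut an 800x600 region starting 400 pixels from the left edge and 300
# from the top, then reset the virtual canvas so the offset isn't kept.
convert original.jpg -crop 800x600+400+300 +repage zoomed.jpg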
Ben Okopnik [ben at linuxgazette.net]
On Tue, Sep 28, 2010 at 07:14:15PM -0500, Michael SanAngelo wrote:
> I believe it has something to do with the file compression and resolution.
> If you were to compress the image by lowering the image quality to
> something like 75%, you would find the file size reduced even more. Also,
> the resolution plays a part: the higher the resolution, the larger the
> file size, because there are more pixels to store.
I understand your point; however:
# Using 'sed' to get rid of the leading whitespace
ben at Jotunheim:/tmp$ identify -verbose allegro90_1_1.jpg|sed 's/^ *//' > a
ben at Jotunheim:/tmp$ identify -verbose bahama20__1.jpg |sed 's/^ *//' > b
ben at Jotunheim:/tmp$ diff -y a b
Image: allegro90_1_1.jpg           | Image: bahama20__1.jpg
Class: DirectClass                   Class: DirectClass
Geometry: 784x1702+0+0             | Geometry: 2240x1680+0+0
Resolution: 100x100                | Resolution: 72x72
Units: Undefined                   | Units: PixelsPerInch
Type: TrueColor                      Type: TrueColor
Colorspace: RGB                      Colorspace: RGB
Depth: 8-bit                         Depth: 8-bit
Channel depth:                       Channel depth:
red: 8-bit                           red: 8-bit
green: 8-bit                         green: 8-bit
blue: 8-bit                          blue: 8-bit
Interlace: None                      Interlace: None
Compression: JPEG                    Compression: JPEG
jpeg:colorspace: 2                   jpeg:colorspace: 2
jpeg:sampling-factor: 2x2,1x1,1x1  | jpeg:sampling-factor: 1x1,1x1,1x1
Profiles:                            Profiles:
Profile-APP12: 15 bytes            | Profile-8bim: 10192 bytes
                                   > Profile-exif: 868 bytes
                                   > Profile-icc: 3144 bytes
                                   >   IEC 61966-2.1 Default RGB colour space - sRGB
                                   > Profile-iptc: 34 bytes
                                   > Profile-xmp: 5439 bytes
Filesize: 42kb                     | Filesize: 1.646mb
Number pixels: 1.273mb             | Number pixels: 3.589mb
                                   > Pixels per second: 3.589mb
(I've clipped a lot of irrelevant info, like the EXIF data from 'bahama' and red/blue/green skew, kurtosis, etc. for both images.) Please note that the compression type is the same for both - and the number of pixels between the two has only about a 3:1 ratio. Moreover, 'bahama' has a lower resolution than 'allegro'.
GIMP tells me that the smaller image takes 12.8MB in memory, and the larger one 34.6MB; again, about a 3:1 ratio - all reasonable, if it weren't for that 40:1 file size ratio.
Trying to save them from GIMP shows that 'allegro' has a current compression of 85, while 'bahama' is at 98. That's about the only thing I can see that would make a significant difference... saving 'bahama' at a compression of 85 gives me a 451kB file. A lot smaller, but still huge by comparison.
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Jimmy O'Regan [joregan at gmail.com]
On 29 September 2010 02:39, Ben Okopnik <ben at linuxgazette.net> wrote:
> jpeg:sampling-factor: 2x2,1x1,1x1 | jpeg:sampling-factor: 1x1,1x1,1x1
This one stood out to me. A quick google leads here: https://www.groupsrv.com/computers/about6852.html
"Common sampling factors used are 2x2, 2x1, 1x2, and 1x1. 2x2 is the default setting used in the IJG library and is therefore quite common. 2x1 is the "traditional" setting coming from video. Nearly all digital camera firmwares today still use this setting. Note also that the sample quantization tables given in the JPEG standard where derived with this setting. 1x2 is the result of lossless rotation of 2x1 sampled JPEG files. 1x1 is the recommended setting for higher quality JPEG images.
For more recent results see also
https://jpegclub.org/foveon/
https://kb-bmts.rz.tu-ilmenau.de/gcg/GCGMAT1.HTM (-> first paper)"
Looking at the first link leads to: https://jpegclub.org/foveon/index2.html which explains that 1x1 is 'no color subsampling reduction'
--
<Leftmost> jimregan, that's because deep inside you, you are evil.
<Leftmost> Also not-so-deep inside you.
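Checking which subsampling a given JPEG uses doesn't require wading through the full verbose dump; on reasonably recent ImageMagick versions, something like this should work (a sketch, using one of Ben's files):

# Prints e.g. '2x2,1x1,1x1' - chroma stored at half resolution in both
# dimensions - whereas '1x1,1x1,1x1' means no subsampling at all.
identify -format '%[jpeg:sampling-factor]\n' allegro90_1_1.jpg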
Ben Okopnik [ben at linuxgazette.net]
On Wed, Sep 29, 2010 at 02:10:16AM +0100, Jimmy O'Regan wrote:
> On 29 September 2010 01:14, afsilva at gmail.com <afsilva at gmail.com> wrote:
> >> So... what makes an image - seemingly of the same type, according to
> >> what "identify" is reporting - that much bigger? Does anybody here know?
> >> And is there any way to make the file sizes closer without losing a
> >> significant amount of visual content?
> >
> > It has to do with the compression of the image. I am going to guess someone
> > else will have a scientific explanation, so I am not even going to bother...
> > There is an interesting graph on Wikipedia here:
> > https://en.wikipedia.org/wiki/JPEG#JPEG_compression
>
> On top of that, most digital cameras add a thumbnail of about 1/20th
> of the file size, lots of cameras duplicate information in EXIF, IPTC,
> and XMP (which is uncompressed XML), etc., but the compression - and,
> as it's a lossy compression, the homogenisation of similar areas - is
> the main factor.
OK, that makes sense. And you're right, EXIF isn't a major factor here: 'jpegoptim --strip-all', which strips EXIF and comments, only reports a minimal difference.
ben at Jotunheim:/tmp$ jpegoptim -nt --strip-all bahama20__1.jpg
bahama20__1.jpg 2240x1680 24bit Exif [OK] 1725638 --> 1704954 bytes (1.20%), optimized.
Average compression (1 files): 1.20% (20k)
I think I'm starting to close in on an idea: recommending, say, a quality setting of 90 or so to my client. At 1:1, that'll probably look fine, and it'll keep all of his images protected from crashing by a safe margin.
ben at Jotunheim:/tmp$ for n in `seq 80 90`; do convert -quality $n bahama20__1.jpg $n-bahama20__1.jpg; done
ben at Jotunheim:/tmp$ ls -lS [0-9]*bahama20*
-rw-r--r-- 1 ben ben 652859 2010-09-28 21:51 90-bahama20__1.jpg
-rw-r--r-- 1 ben ben 595572 2010-09-28 21:51 89-bahama20__1.jpg
-rw-r--r-- 1 ben ben 570463 2010-09-28 21:51 88-bahama20__1.jpg
-rw-r--r-- 1 ben ben 524618 2010-09-28 21:51 87-bahama20__1.jpg
-rw-r--r-- 1 ben ben 496748 2010-09-28 21:51 86-bahama20__1.jpg
-rw-r--r-- 1 ben ben 466398 2010-09-28 21:51 85-bahama20__1.jpg
-rw-r--r-- 1 ben ben 440749 2010-09-28 21:51 84-bahama20__1.jpg
-rw-r--r-- 1 ben ben 422661 2010-09-28 21:51 83-bahama20__1.jpg
-rw-r--r-- 1 ben ben 395218 2010-09-28 21:51 82-bahama20__1.jpg
-rw-r--r-- 1 ben ben 376780 2010-09-28 21:51 81-bahama20__1.jpg
-rw-r--r-- 1 ben ben 356326 2010-09-28 21:51 80-bahama20__1.jpg
Hmm, that's starting to look pretty reasonable. Even at 90, shrinking the image down to 1/3 the size (with, presumably, a corresponding file size reduction) would get it down into the 200k range - which would be great. Heck, even 650kB is a big improvement already.
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Ben Okopnik [ben at linuxgazette.net]
On Wed, Sep 29, 2010 at 02:53:35AM +0100, Jimmy O'Regan wrote:
> On 29 September 2010 02:39, Ben Okopnik <ben at linuxgazette.net> wrote:
> > jpeg:sampling-factor: 2x2,1x1,1x1 | jpeg:sampling-factor: 1x1,1x1,1x1
>
> This one stood out to me. A quick google leads here:
> https://www.groupsrv.com/computers/about6852.html
>
> "Common sampling factors used are 2x2, 2x1, 1x2, and 1x1.
> 2x2 is the default setting used in the IJG library and is therefore quite
> common.
> 2x1 is the "traditional" setting coming from video. Nearly all digital
> camera firmwares today still use this setting. Note also that the sample
> quantization tables given in the JPEG standard were derived with this
> setting.
> 1x2 is the result of lossless rotation of 2x1 sampled JPEG files.
> 1x1 is the recommended setting for higher quality JPEG images.
Oh - interesting!
> Looking at the first link leads to:
> https://jpegclub.org/foveon/index2.html which explains that 1x1 is 'no
> color subsampling reduction'
Seems that's the only page on the Net that mentions it, though.
On the other hand, searching for it without quotes leads to
https://registry.gimp.org/node/33?page=1
which presents a GIMP plugin called "save_for_web" [laugh]. Someone there actually asked about changing color subsampling, and the author mentioned that it's in the works! Awesome stuff for the future, then.
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Jimmy O'Regan [joregan at gmail.com]
On 29 September 2010 03:00, Ben Okopnik <ben at linuxgazette.net> wrote:
> ben at Jotunheim:/tmp$ for n in `seq 80 90`; do convert -quality $n bahama20__1.jpg $n-bahama20__1.jpg; done
try adding -sampling-factor 2:1:1 (or 4:1:1) to that
--
<Leftmost> jimregan, that's because deep inside you, you are evil.
<Leftmost> Also not-so-deep inside you.
Jimmy O'Regan [joregan at gmail.com]
On 29 September 2010 03:09, Ben Okopnik <ben at linuxgazette.net> wrote:
> which presents a GIMP plugin called "save_for_web" [laugh]. Someone > there actually asked about changing color subsampling, and the author > mentioned that it's in the works! Awesome stuff for the future, then.
I think my last answer must have passed yours on the street and neglected to wave. ImageMagick supports changing colour subsampling, and it makes quite a difference.
--
<Leftmost> jimregan, that's because deep inside you, you are evil.
<Leftmost> Also not-so-deep inside you.
Jimmy O'Regan [joregan at gmail.com]
On 29 September 2010 03:13, Jimmy O'Regan <joregan at gmail.com> wrote:
> On 29 September 2010 03:09, Ben Okopnik <ben at linuxgazette.net> wrote:
>> which presents a GIMP plugin called "save_for_web" [laugh]. Someone
>> there actually asked about changing color subsampling, and the author
>> mentioned that it's in the works! Awesome stuff for the future, then.
>
> I think my last answer must have passed yours on the street and
> neglected to wave. ImageMagick supports changing colour subsampling,
> and it makes quite a difference.
Huh. On my sample image, it reduced it by more than half, but with yours (with -quality 100) it actually increased the size :/ Using -quality 90 -sampling-factor 2:1:1 brings it to 556k, 4:1:1 to 506k.
--
<Leftmost> jimregan, that's because deep inside you, you are evil.
<Leftmost> Also not-so-deep inside you.
Jimmy O'Regan [joregan at gmail.com]
On 29 September 2010 03:28, Jimmy O'Regan <joregan at gmail.com> wrote:
> On 29 September 2010 03:13, Jimmy O'Regan <joregan at gmail.com> wrote:
>> On 29 September 2010 03:09, Ben Okopnik <ben at linuxgazette.net> wrote:
>>> which presents a GIMP plugin called "save_for_web" [laugh]. Someone
>>> there actually asked about changing color subsampling, and the author
>>> mentioned that it's in the works! Awesome stuff for the future, then.
>>
>> I think my last answer must have passed yours on the street and
>> neglected to wave. ImageMagick supports changing colour subsampling,
>> and it makes quite a difference.
>
> Huh. On my sample image, it reduced it by more than half, but with
> yours (with -quality 100) it actually increased the size :/ Using
> -quality 90 -sampling-factor 2:1:1 brings it to 556k, 4:1:1 to 506k.
Oh. As well as the 2:1:1 stuff, you can just give it -sampling-factor 2x2 etc., which is a little easier to remember.
--
<Leftmost> jimregan, that's because deep inside you, you are evil.
<Leftmost> Also not-so-deep inside you.
Ben Okopnik [ben at linuxgazette.net]
On Tue, Sep 28, 2010 at 10:09:34PM -0400, Benjamin Okopnik wrote:
> On the other hand, searching for it without quotes leads to
>
> https://registry.gimp.org/node/33?page=1
>
> which presents a GIMP plugin called "save_for_web" [laugh]. Someone
> there actually asked about changing color subsampling, and the author
> mentioned that it's in the works! Awesome stuff for the future, then.
Whoa. Rockin'. I just compiled this plugin and installed it, opened the image in GIMP and clicked 'Save for Web' on the 'File' menu; without changing compression or anything else that I could see, it brought the file size down to 334kB. Impressive.
Not useful for scripting, which is what I'd need here, but nice to have.
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Ben Okopnik [ben at linuxgazette.net]
On Wed, Sep 29, 2010 at 03:11:10AM +0100, Jimmy O'Regan wrote:
> On 29 September 2010 03:00, Ben Okopnik <ben at linuxgazette.net> wrote:
> > ben at Jotunheim:/tmp$ for n in `seq 80 90`; do convert -quality $n bahama20__1.jpg $n-bahama20__1.jpg; done
>
> try adding -sampling-factor 2:1:1 (or 4:1:1) to that
Y'know, I actually looked at the ImageMagick options to see if something like that was available, as soon as you mentioned it. Grrrr.
Love their programs (although the syntax can be really strange and their error messages are as weird as snake suspenders); hate their documentation.
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Ben Okopnik [ben at linuxgazette.net]
On Wed, Sep 29, 2010 at 03:11:10AM +0100, Jimmy O'Regan wrote:
> On 29 September 2010 03:00, Ben Okopnik <ben at linuxgazette.net> wrote:
> > ben at Jotunheim:/tmp$ for n in `seq 80 90`; do convert -quality $n bahama20__1.jpg $n-bahama20__1.jpg; done
>
> try adding -sampling-factor 2:1:1 (or 4:1:1) to that
Just did. 2:1:1 doesn't seem to make much difference in size; 4:1:1 gets it down a fair bit. E.g., with a quality of 90, it goes from 650kB (no -sampling-factor specified) to about 505kB. Cool and excellent!
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Ben Okopnik [ben at linuxgazette.net]
On Wed, Sep 29, 2010 at 03:13:13AM +0100, Jimmy O'Regan wrote:
> On 29 September 2010 03:09, Ben Okopnik <ben at linuxgazette.net> wrote:
> > which presents a GIMP plugin called "save_for_web" [laugh]. Someone
> > there actually asked about changing color subsampling, and the author
> > mentioned that it's in the works! Awesome stuff for the future, then.
>
> I think my last answer must have passed yours on the street and
> neglected to wave
I hope you'll speak to it sternly.
> ImageMagick supports changing colour subsampling,
> and it makes quite a difference.
Yeah, looks that way. Applying it without any reduction in quality cuts the file size by more than 2/3rds:
ben at Jotunheim:/tmp$ convert -sampling-factor 4:1:1 bahama20__1.jpg bahama20__1-ss411.jpg
ben at Jotunheim:/tmp$ ls -l baha*
-rw-r--r-- 1 ben ben 1725638 2010-09-28 14:37 bahama20__1.jpg
-rw-r--r-- 1 ben ben  573111 2010-09-28 22:49 bahama20__1-ss411.jpg
THAT is great coolness. Thanks, Jimmy!
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
Jim Jackson [jj at franjam.org.uk]
> Trying to save them from GIMP shows that 'allegro' has a current
> compression of 85, while 'bahama' is at 98. That's about the only thing
  ^^^^^^^^^^^
Not compression per se, but quality - how much detail in the image to save. The less detail, the smaller the file size.
> I can see that would make a significant difference... saving 'bahama' at
> a compression of 85 gives me a 451kB file. A lot smaller, but still huge
> by comparison.
Sometime before, you said the image was 3 times as large; that's 9x the number of pixels. So the file size here is about right. If you are displaying on a computer screen, resize the image to 1024xwhatever and use the resized image.
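A command-line version of Jim's suggestion might look like this (a sketch; the output filename is made up):

# Shrink to fit within 1024x1024; the '>' flag - quoted so the shell
# doesn't treat it as a redirect - means 'only shrink, never enlarge'.
convert bahama20__1.jpg -resize '1024x1024>' bahama20__1-web.jpg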
Ben Okopnik [ben at linuxgazette.net]
On Wed, Sep 29, 2010 at 09:33:42AM +0100, Jim Jackson wrote:
> > Trying to save them from GIMP shows that 'allegro' has a current
> > compression of 85, while 'bahama' is at 98. That's about the only thing
>   ^^^^^^^^^^^
>
> Not compression per se, but quality - how much detail in the image to save.
> The less detail, the smaller the file size.
Right; I misspoke (miswrote?) - that's what I meant. I used the term 'quality' later in describing the same thing.
> > I can see that would make a significant difference... saving 'bahama' at
> > a compression of 85 gives me a 451kB file. A lot smaller, but still huge
> > by comparison.
>
> Sometime before, you said the image was 3 times as large; that's 9x the
> number of pixels.
No, I mentioned that was in one dimension (it's actually a bit smaller in the other one). That's less than 3x the number of pixels - and that's before you consider that the smaller image is actually higher resolution (100x100 vs. 72x72).
> So the file size here is about right. If you are displaying on a computer
> screen, resize the image to 1024xwhatever and use the resized image.
OK, but that's not the problem - if it were just a single image, I wouldn't even have posted the question. It's an entire class of problems that I keep running into and would like to solve on a broader basis - i.e., how do you "normalize" an average image file size while still keeping fairly good quality, particularly when you have several thousand images in a wide variety of sizes and resolutions to deal with?
The parameters in this case are roughly the following:
Max thumbnail size: 245x245
Preferred actual size for "zooming": 1024x768 would be nice...
Max file size: Unknown, but 1.6MB kills PHP::GD with an OOM error
Current display sizes and file sizes:

find -iname '*jpg' -size +200k|xargs identify -format "%wx%h x %b\n"|sort -un -tx -k1,1 -k2,2
199x300 x 16587
200x300 x 41264
201x300 x 19176
203x300 x 53850
[...]
1020x1403 x 111763
1024x768 x 359313
1057x768 x 67972
1118x712 x 63760
1118x810 x 61841
1118x813 x 62618
1119x809 x 99894
1119x816 x 95016
1123x816 x 45381
1152x768 x 105284
1176x854 x 62942
1202x1045 x 63249
1215x1800 x 700621
1680x2240 x 695381
1800x2701 x 170426
1818x2520 x 1775954
1832x1614 x 1460378
2092x1680 x 428419
2240x1680 x 1725638
If you see an easy solution here, Jim, then I'm all ears.
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *
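Pulling together the resize, quality, subsampling, and metadata-stripping ideas from this thread, a batch pass over a collection like Ben's might look like the sketch below. The 1024-pixel bound, quality 90, and 2x2 subsampling are assumptions, not tested recommendations:

# Re-encode every JPEG over 200kB: fit within 1024x1024 (shrink only),
# quality 90, 2x2 chroma subsampling, all metadata profiles stripped.
find . -iname '*.jpg' -size +200k | while read -r f; do
    convert "$f" -resize '1024x1024>' -quality 90 \
            -sampling-factor 2x2 -strip "${f%.jpg}-web.jpg"
done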
Jimmy O'Regan [joregan at gmail.com]
On 29 September 2010 03:31, Jimmy O'Regan <joregan at gmail.com> wrote:
>> Huh. On my sample image, it reduced it by more than half, but with
>> yours (with -quality 100) it actually increased the size :/ Using
>> -quality 90 -sampling-factor 2:1:1 brings it to 556k, 4:1:1 to 506k.
>
> Oh. As well as the 2:1:1 stuff, you can just give it -sampling-factor
> 2x2 etc., which is a little easier to remember.
Google have released a new open source lossy image format, because they've decided JPEG isn't efficient enough: https://code.google.com/speed/webp/index.html
--
<Leftmost> jimregan, that's because deep inside you, you are evil.
<Leftmost> Also not-so-deep inside you.
Ben Okopnik [ben at linuxgazette.net]
On Thu, Sep 30, 2010 at 09:48:02PM +0100, Jimmy O'Regan wrote:
> On 29 September 2010 03:31, Jimmy O'Regan <joregan at gmail.com> wrote:
> >> Huh. On my sample image, it reduced it by more than half, but with
> >> yours (with -quality 100) it actually increased the size :/ Using
> >> -quality 90 -sampling-factor 2:1:1 brings it to 556k, 4:1:1 to 506k.
> >
> > Oh. As well as the 2:1:1 stuff, you can just give it -sampling-factor
> > 2x2 etc., which is a little easier to remember.
>
> Google have released a new open source lossy image format, because
> they've decided JPEG isn't efficient enough:
> https://code.google.com/speed/webp/index.html
Nice. But:
Since this is a new image format, you won't be able to view your WebP images in browsers or image viewers until support for the format has been provided for those programs. However, you can view WebP images by converting them into PNG format and using your favorite PNG viewer. Since PNG is a lossless compression, it will produce an exact rendition of the WebP image content.
That seems just a bit pointless: PNGs aren't known for scrimping on file size. Well, let's see:
ben at Jotunheim:/tmp$ identify 0mnia90_2.jpg
0mnia90_2.jpg JPEG 720x1123 720x1123+0+0 8-bit DirectClass 1.062mb
ben at Jotunheim:/tmp$ webpconv 0mnia90_2.jpg
processing 0mnia90_2.jpg
Output file 0mnia90_2.webp
Used quality=85
ben at Jotunheim:/tmp$ ls -l 0mnia90_2.webp
-rw-r--r-- 1 ben ben 166576 2010-09-30 20:56 0mnia90_2.webp
"quality=85"? Seems like a cheat. Let's see how it does at 100.
ben at Jotunheim:/tmp$ webpconv -quality 100 0mnia90_2.jpg
Target quality 100
processing 0mnia90_2.jpg
Output file 0mnia90_2.webp
Output file exists. You may change the output_dir
[Sigh] I hate bondage-and-domination UIs.
ben at Jotunheim:/tmp$ rm 0mnia90_2.webp
ben at Jotunheim:/tmp$ webpconv -quality 100 0mnia90_2.jpg
Target quality 100
processing 0mnia90_2.jpg
Output file 0mnia90_2.webp
ben at Jotunheim:/tmp$ ls -l 0mnia90_2.webp
-rw-r--r-- 1 ben ben 410258 2010-09-30 21:04 0mnia90_2.webp
About 2.7:1 reduction in size. Not bad, assuming the image quality doesn't decrease significantly. Now, just for fun:
ben at Jotunheim:/tmp$ webpconv -quality 100 -format png 0mnia90_2.jpg
Target quality 100
Output format png
processing 0mnia90_2.jpg
Output file 0mnia90_2.png
ben at Jotunheim:/tmp$ ls -l 0mnia90_2.png
-rw-r--r-- 1 ben ben 1605314 2010-09-30 21:06 0mnia90_2.png
About 45% bigger than the original - woohoo!
In other words, it ain't quite there yet.
(Along the same lines, I just checked on the GuruPlug and the Ionics Plug, which we discussed here about three months ago. Same story as before, weeks and weeks of shipping time delays. One of these days, I guess...)
--
* Ben Okopnik * Editor-in-Chief, Linux Gazette * https://LinuxGazette.NET *