Image optimization reports often celebrate percentages: 40 percent smaller, 68 percent smaller, 90 percent smaller. Those numbers are useful, but they can also mislead. A huge percentage reduction on a tiny icon may not matter. A smaller reduction on a large hero image may improve a real page. A dramatic reduction that damages product quality is not a win.
The goal is not the highest savings percentage. The goal is the best balance of user experience, visual quality, and operational reliability.
"Compress images and look for 80 percent savings" is not a review process. It treats every image as if it has the same page role, quality risk, and publishing path. A real review records the file, the page that uses it, the output size, the visual decision, and whether the optimized file is actually shipped to visitors.
## Start With Absolute Bytes
Percentages need context. Reducing a 10 KB file by 80 percent saves 8 KB. Reducing a 900 KB hero image by 25 percent saves 225 KB. The second change matters more to page weight even though the percentage is lower.
Track both values:
```text
Original: 900 KB
Output: 675 KB
Savings: 225 KB
Savings ratio: 25%
```
This keeps the discussion grounded in actual transfer size.
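The arithmetic is trivial, but encoding it once keeps every report computing both values the same way; a minimal sketch:

```python
def savings(original_kb, output_kb):
    """Return (absolute KB saved, savings ratio) for one file."""
    saved = original_kb - output_kb
    return saved, saved / original_kb

icon_saved, icon_ratio = savings(10, 2)      # small icon: big ratio, few bytes
hero_saved, hero_ratio = savings(900, 675)   # large hero: smaller ratio, more bytes
```

The hero image wins on bytes even though the icon wins on percentage, which is exactly the distinction the report should surface.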
For a GetWebP CLI review pass, generate machine-readable output instead of copying numbers from a UI by hand:
```shell
npx -y getwebp ./review-source -o ./review-output --recursive --format webp --json
```
The CLI command options are documented in GetWebP CLI commands. The JSON output format is documented in GetWebP JSON output. Use the fields together:
| Field | How to use it in the review |
|---|---|
| `originalSize` | Starting bytes for the source file |
| `newSize` | Output bytes after conversion |
| `saved` | Absolute bytes removed, which is usually more useful than a percentage |
| `savedRatio` | Relative savings; useful only after checking file role and visual quality |
| `outputPath` | The exact generated file to inspect or publish |
| `status` | Whether the row should be counted, retried, or excluded |
Do not hide awkward rows. A negative `savedRatio` can happen when an already efficient source becomes larger after re-encoding. A very high ratio on a tiny decorative image can be mathematically true and still irrelevant to user experience. Both cases belong in the report because they keep the team from turning image optimization into a vanity metric.
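A short script can keep those awkward rows visible instead of silently dropping them. The sketch below assumes the CLI emits one JSON object per converted file with the fields above; the sample rows and the top-level array shape are illustrative, so check the real JSON output before relying on it:

```python
import json

# Illustrative sample rows using the fields documented above; the real
# GetWebP JSON shape may differ, so adapt the parsing to actual output.
rows = json.loads("""[
  {"outputPath": "review-output/hero.webp", "originalSize": 921600,
   "newSize": 698368, "saved": 223232, "savedRatio": 0.24, "status": "ok"},
  {"outputPath": "review-output/icon.webp", "originalSize": 4096,
   "newSize": 5120, "saved": -1024, "savedRatio": -0.25, "status": "ok"},
  {"outputPath": "review-output/broken.webp", "originalSize": 0,
   "newSize": 0, "saved": 0, "savedRatio": 0, "status": "error"}
]""")

report, excluded = [], []
for row in rows:
    if row["status"] != "ok":
        excluded.append((row["outputPath"], "retry or exclude"))
    elif row["saved"] < 0:
        # Bigger outputs stay in the report instead of being hidden
        report.append((row["outputPath"], row["saved"], "bigger output"))
    else:
        report.append((row["outputPath"], row["saved"], "candidate win"))
```

Every source file ends up in exactly one list, so the totals in the final summary can be traced back to individual rows.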
## Weight Images by Page Impact
Not every image affects the page equally. Prioritize:
- above-the-fold images
- likely LCP images
- large product or gallery images
- repeated thumbnails
- images on high-traffic pages
- assets loaded on slow connections
An unused media-library file with excellent savings does not improve the user experience. A modest improvement to the largest visible image may matter more.
PageSpeed Insights and browser DevTools can help identify image-related performance opportunities: use PageSpeed Insights for page-level checks, and Google's WebP documentation for the format background.
Use a weighted review table instead of a single "percent saved" column:
| Image role | How to score the savings |
|---|---|
| LCP hero | Count only if the rendered image is smaller, visually approved, and still selected at the intended breakpoint |
| Product zoom | Count only after checking texture, edges, color, and zoom state |
| Repeated thumbnail | Count the aggregate transfer reduction across the repeated component |
| Inline editorial image | Count if the page actually loads the file and the crop still supports the article |
| Unused media-library file | Count only for storage cleanup, not for page-speed improvement |
| Decorative icon | Usually low priority unless it appears many times or blocks rendering |
This table changes the conversation. A 22 percent reduction on the LCP image may be worth more than a 90 percent reduction on a hidden archive image.
## Add Visual Approval to the Savings Table
A savings report should include quality status:
```text
Image: product-chair-hero.jpg
Savings: 31%
Bytes saved: 184 KB
Visual review: approved
Notes: texture checked in zoom view
```
If the visual review fails, the savings number should not be counted as a win. The team can retry with different settings or keep the original format.
Make the review state explicit:
| State | Meaning |
|---|---|
| Approved | Count the bytes in the performance or storage summary |
| Retry | Do not count yet; test a different quality setting, size, or format |
| Keep original | Do not count the attempted savings; the source is better for this use |
| Remove from page | Count as page cleanup only after the page no longer requests the asset |
| Storage only | Count for repository or media-library cleanup, not for live page transfer |
This prevents a common reporting problem: a spreadsheet says the team saved megabytes, but some of those "savings" came from files that were never approved, never published, or never loaded by a real page.
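The state table can drive the totals directly, so unapproved rows never leak into the summary. The row format, byte figures, and lower-case state names here are illustrative, not GetWebP output:

```python
# Review states that count toward a summary bucket; "retry" and
# "keep original" contribute nothing until they are resolved.
COUNTED_STATES = {
    "approved": "page transfer reduction",
    "storage only": "storage cleanup",
}

def summarize(rows):
    """Sum attempted savings (bytes) per report bucket, skipping
    rows whose review state is not countable yet."""
    totals = {}
    for path, saved_bytes, state in rows:
        bucket = COUNTED_STATES.get(state)
        if bucket is not None:
            totals[bucket] = totals.get(bucket, 0) + saved_bytes
    return totals

rows = [
    ("hero.webp", 184_000, "approved"),
    ("zoom.webp", 37_000, "retry"),            # not counted yet
    ("old-export.webp", 512_000, "storage only"),
]
```

Because unresolved states simply fall through, the spreadsheet can never claim megabytes that were never approved.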
## Separate Resize Savings From Compression Savings
A file may become smaller because it was resized, because metadata was removed, because the format changed, or because lossy compression was applied. If you report only one final savings number, stakeholders may think WebP alone did all the work.
For larger projects, separate the stages:
- original source
- resized working file
- compressed WebP or AVIF output
- published file after CMS processing
This makes results more honest and helps the team understand which changes are worth repeating.
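One way to keep the stages honest is to record the size at each step and report per-stage deltas. The byte figures below are hypothetical:

```python
# Hypothetical pipeline stages for one hero image, in bytes.
stages = [
    ("original source", 1_400_000),
    ("resized working file", 620_000),
    ("compressed WebP output", 310_000),
    ("published file after CMS processing", 335_000),
]

def stage_deltas(stages):
    """Bytes removed at each stage (negative means the stage grew the file)."""
    return [
        (name, prev_size - size)
        for (_, prev_size), (name, size) in zip(stages, stages[1:])
    ]
```

In this example the resize does most of the work, and the CMS re-processing step actually adds bytes back, which is exactly the kind of detail a single final savings number would hide.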
For WordPress work, keep conversion and delivery separate. Batch conversion can create WebP or AVIF siblings, but the page is faster only when the frontend sends those variants to visitors. The GetWebP WordPress docs describe batch conversion and frontend delivery as separate concerns. A useful report should say whether the file was generated, whether the rendered markup references it, and whether the browser receives the expected format.
Example row:
```text
Source: uploads/2025/12/category-hero.jpg
Generated: uploads/2025/12/category-hero.jpg.webp
CLI saved: 218 KB
Rendered page: /chairs/
Frontend delivery: confirmed in picture source
Visual review: approved at desktop and mobile crop
Report bucket: page transfer reduction
```
## Watch for Low-Quality Sources
A highly compressed source image may not get much smaller without visible damage. That does not mean the tool failed. It may mean the image was already near the limit of acceptable compression.
For these files, focus on:
- whether dimensions are appropriate
- whether the image is used
- whether a better source exists
- whether the file should be replaced rather than recompressed
Do not force a savings target onto every image.
The rejection note should be specific enough that another reviewer would reach the same decision:
```text
Rejected: source already has visible JPEG blocking in the blue fabric.
Reason: WebP output saves 37 KB but makes the product texture look smeared in zoom.
Next action: keep original until a cleaner source is available.
```
That line is better than "quality looked bad" because it identifies the defect, the tradeoff, and the next action.
## Use Category-Level Decisions
Review savings by asset type:
- hero photos
- product photos
- screenshots
- transparent graphics
- thumbnails
- diagrams
Each group has different quality tolerance. Screenshots may save less because text must remain clear. Thumbnails may allow more compression. Product photos may need texture preservation.
Category-level reporting prevents one average number from hiding important tradeoffs.
Use category rules before the review starts:
| Category | Approval rule |
|---|---|
| Screenshots | Text, cursor edges, UI borders, and small labels must remain clean |
| Product photos | Texture, color, specular highlights, and zoom detail must survive |
| Portraits | Skin tone and hair detail must not look waxy |
| Diagrams | Thin lines and labels must stay readable |
| Transparent graphics | Edges must stay clean on light and dark backgrounds |
| Thumbnails | More compression is acceptable only when the component is not a detail view |
These rules are part of editorial quality. They stop the article, report, or client deliverable from sounding like generic optimization advice.
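Encoding the rules as data lets a review script attach the right checklist to each file before anyone eyeballs the output. The category names and check items below mirror the table and are only a sketch:

```python
# Approval checklists keyed by asset category, mirroring the table above.
CATEGORY_CHECKLIST = {
    "screenshot": ["text", "cursor edges", "UI borders", "small labels"],
    "product photo": ["texture", "color", "specular highlights", "zoom detail"],
    "portrait": ["skin tone", "hair detail"],
    "diagram": ["thin lines", "labels"],
    "transparent graphic": ["edges on light background", "edges on dark background"],
    "thumbnail": ["component is not a detail view"],
}

def checklist_for(path, category):
    """List the visual checks a reviewer must sign off for this file."""
    return [f"{path}: check {item}" for item in CATEGORY_CHECKLIST[category]]
```

A generated checklist also makes rejection notes more consistent, because every reviewer starts from the same list of things to look at.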
## Avoid Vanity Totals
Total bytes saved across a whole media folder can look impressive while including unused images, duplicates, and old exports. For performance reporting, focus on files loaded by real pages.
If the goal is storage cleanup, folder-level totals are useful. If the goal is faster pages, page-level impact matters more.
This distinction is important when reporting work to clients or non-technical stakeholders. A cleanup pass may remove hundreds of megabytes from a repository without changing a single live page. A performance pass may save fewer total bytes but improve the pages visitors actually load. Label those outcomes separately so nobody confuses operational housekeeping with front-end speed work.
Split the final report into buckets:
| Bucket | Count it when |
|---|---|
| Page transfer reduction | The optimized asset is loaded by a reviewed page |
| LCP candidate improvement | The image is the likely LCP element and remains visually approved |
| Storage cleanup | The file exists in the repo, uploads folder, or media library but does not prove page speed impact |
| Rejected savings | The conversion was smaller but failed visual or layout review |
| Bigger output | The conversion increased file size and should usually be excluded from wins |
| Delivery mismatch | The optimized file exists but the frontend still sends the original |
The "delivery mismatch" bucket is especially useful. It turns a vague complaint into an action item: fix the template, cache, CDN rule, or CMS delivery path before claiming the savings.
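The bucket rules can be written down as one function so every row lands in exactly one bucket. The fields beyond the CLI output (`visual_ok`, `page_loads_it`, `frontend_delivers_it`, `is_lcp`) are hypothetical names for the manual review results, not GetWebP fields:

```python
def bucket(row):
    """Assign one review row to exactly one report bucket, per the
    table above. Fields beyond `saved` come from the manual review."""
    if row["saved"] < 0:
        return "bigger output"
    if not row["visual_ok"]:
        return "rejected savings"
    if not row["page_loads_it"]:
        return "storage cleanup"
    if not row["frontend_delivers_it"]:
        return "delivery mismatch"
    if row["is_lcp"]:
        return "lcp candidate improvement"
    return "page transfer reduction"
```

The ordering matters: a smaller file that fails visual review is a rejection even if the page loads it, and delivery is only checked once the file is known to be used at all.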
## A Better Review Summary
A useful image optimization summary looks like this:
```text
Pages reviewed: 12
User-facing images optimized: 86
Total transfer reduction on reviewed pages: 4.8 MB
Failed visual review: 7
Largest remaining issue: oversized product zoom images
```
For a more audit-ready version, include the evidence trail:
```text
Command: npx -y getwebp ./review-source -o ./review-output --recursive --format webp --json
Rows parsed from JSON output: 124
Approved page-transfer reduction: 4.8 MB
Storage-only reduction: 312 MB
Rejected after visual review: 7 files / 431 KB attempted savings
Bigger WebP outputs kept out of wins: 5 files
Delivery mismatches found: 3 templates
Next fix: update product zoom responsive image markup
```
This gives the team a real picture of progress.
Savings percentages are helpful when they support a broader review. They become harmful when they replace judgment. Measure bytes, page impact, and visual quality together.

Jack
GetWebP Editor

Jack writes GetWebP guides about local-first image conversion, WebP workflows, browser compatibility, and practical performance checks for teams that publish images on the web.