While video has seemingly taken over the web, there’s no denying that images are still everywhere online. It’s hard to visit a single website without seeing an image or two, and given the vast variety of screen resolutions out there, it can be difficult for websites and designers to deliver the right quality of image to their audience. Now that data allowances are a big deal once again, customers are often looking for ways to cut down on their data usage, and websites earn brownie points for trimming the amount their pages use, too. The problem, however, is that an image with a smaller data footprint is usually lower in resolution, and therefore in quality, too. Now, Google wants its new RAISR initiative to help bridge this gap.
Over on the Google Research Blog, the team behind RAISR, or Rapid and Accurate Image Super-Resolution, explains the technique. In a nutshell, the team used machine learning to “train” a computer algorithm to fill in the gaps more convincingly when enlarging a smaller image. Usually, when we enlarge a small image, things become quite blocky: traditional enlarging techniques simply fill in the new pixels with squares of the same color as their neighbors, because they don’t know any better. To fix this, the team fed RAISR 10,000 pairs of images, one lower-resolution and one higher-resolution, to teach the algorithm how to fill in the gaps when enlarging smaller images while keeping the overall look of the image the same.
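That blocky look comes from the simplest enlarging approach, where each source pixel is just duplicated into a square block of copies. As a minimal sketch (the function name here is our own, not Google’s):

```python
import numpy as np

def nearest_neighbor_upscale(img, factor):
    """Naive enlargement: each source pixel becomes a factor x factor
    block of identical values -- the 'blocky' look described above."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

# A tiny 2x2 "image": each value stands in for a pixel intensity.
small = np.array([[10, 200],
                  [60, 120]])

big = nearest_neighbor_upscale(small, 2)
print(big)
# -> [[ 10  10 200 200]
#     [ 10  10 200 200]
#     [ 60  60 120 120]
#     [ 60  60 120 120]]
```

Every new pixel is a flat copy of its neighbor, which is exactly why edges in a naively enlarged photo look like staircases.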
Super-resolution techniques like these have come a long, long way, and Google’s RAISR has become something of a filter with a pair of methods it has been trained to use. Both of them – as we can see below – do a great job of improving the overall quality of an enlarged image without using any more data, even if the result can appear a little “soft” in some cases.
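The Google Research Blog post is the authority on how RAISR’s filters actually work; as a rough, hypothetical illustration of the “train on low/high-resolution pairs” idea, the sketch below learns a single 3×3 correction filter by least squares from one synthetic pair. RAISR itself trains many filters and selects among them per pixel based on local image structure, so treat this purely as a toy:

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_patches(img, size=3):
    # Every size x size patch, flattened, one per interior pixel.
    h, w = img.shape
    r = size // 2
    return np.array([img[y - r:y + r + 1, x - r:x + r + 1].ravel()
                     for y in range(r, h - r)
                     for x in range(r, w - r)])

# Synthetic "training pair": a high-res image and a cheaply
# re-enlarged version of its downscaled copy.
high = rng.random((16, 16))
low = high[::2, ::2]                                      # crude 2x downscale
cheap = np.repeat(np.repeat(low, 2, axis=0), 2, axis=1)   # blocky re-enlarge

# Learn one 3x3 filter mapping cheap patches to the true center pixels.
A = extract_patches(cheap)       # (N, 9) patch matrix
b = high[1:-1, 1:-1].ravel()     # matching ground-truth pixels
filt, *_ = np.linalg.lstsq(A, b, rcond=None)

# On the training pair, the learned filter fits the ground truth at
# least as well as leaving the cheap enlargement untouched.
cheap_err = np.mean((cheap[1:-1, 1:-1].ravel() - b) ** 2)
filt_err = np.mean((A @ filt - b) ** 2)
print(filt_err <= cheap_err)
```

The filter learned here would then be applied to cheap enlargements of new images; scaling this idea to thousands of image pairs and many structure-specific filters is the part that takes real research.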
The source link below goes into a lot of detail about the new project, and while this might seem like the sort of strange lab work we never see in the real world, there are plenty of real-world applications for it. Imagine responsive design delivering a great experience no matter the size or resolution of the display, without having to use large, data-heavy images for assets. It could lead to improved magnification of older photographs to get a better look at the past, and to improvements in the photo editing software on our smartphones, too.