CIGaussianBlur image size
A faster solution is to avoid CGImageRef altogether and perform all transformations at the lazy CIImage level.
So, instead of this, which produces an image of the wrong size:
// create UIImage from filtered image (but size is wrong)
blurredImage = [[UIImage alloc] initWithCIImage:resultImage];
A nice solution is to write:
Objective-C
// cropping rect because blur changed size of image
CIImage *croppedImage = [resultImage imageByCroppingToRect:imageToBlur.extent];
// create UIImage from filtered cropped image
blurredImage = [[UIImage alloc] initWithCIImage:croppedImage];
Swift 3
// cropping rect because blur changed size of image
let croppedImage = resultImage.cropping(to: imageToBlur.extent)
// create UIImage from filtered cropped image
let blurredImage = UIImage(ciImage: croppedImage)
Swift 4
// cropping rect because blur changed size of image
let croppedImage = resultImage.cropped(to: imageToBlur.extent)
// create UIImage from filtered cropped image
let blurredImage = UIImage(ciImage: croppedImage)
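For context, here is the whole lazy pipeline pulled together as a single Objective-C method. This is just a sketch; the method name and radius value are illustrative, not taken from the original code:

// Minimal sketch of the crop-after-blur approach; method name and radius are illustrative.
- (UIImage *)blurredImageFromImage:(UIImage *)sourceImage {
    CIImage *imageToBlur = [CIImage imageWithCGImage:sourceImage.CGImage];

    CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blurFilter setValue:imageToBlur forKey:kCIInputImageKey];
    [blurFilter setValue:@10.0 forKey:kCIInputRadiusKey];   // illustrative radius
    CIImage *resultImage = blurFilter.outputImage;

    // cropping rect because blur changed size of image
    CIImage *croppedImage = [resultImage imageByCroppingToRect:imageToBlur.extent];
    return [[UIImage alloc] initWithCIImage:croppedImage];
}

Note that the result is still a CIImage-backed UIImage; nothing is actually rendered until something draws it.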
The issue isn't that it's not blurring all of the image, but rather that the blur is extending the boundary of the image, making the image larger, and it's not lining up properly as a result.
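You can see this for yourself by comparing the input and output extents; a quick sketch, with an illustrative radius of 10:

// Quick check: the blur's output extent is larger than its input extent.
CIImage *inputImage = [CIImage imageWithCGImage:viewImage.CGImage];
CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:inputImage forKey:kCIInputImageKey];
[blur setValue:@10.0 forKey:kCIInputRadiusKey];
NSLog(@"input extent:  %@", NSStringFromCGRect(inputImage.extent));
NSLog(@"output extent: %@", NSStringFromCGRect(blur.outputImage.extent));   // extends past the input on all sides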
To keep the image the same size, after the line:
CIImage *resultImage = [gaussianBlurFilter valueForKey: @"outputImage"];
You can grab the CGRect for a rectangle the size of the original image in the center of this resultImage:
// note, adjust rect because blur changed size of image
CGRect rect = [resultImage extent];
rect.origin.x += (rect.size.width - viewImage.size.width ) / 2;
rect.origin.y += (rect.size.height - viewImage.size.height) / 2;
rect.size = viewImage.size;
And then use a CIContext to grab that portion of the image:
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgimg = [context createCGImage:resultImage fromRect:rect];
UIImage *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
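Putting those pieces together, this second approach might be wrapped up like so; a minimal sketch, assuming the filter setup from the question, with an illustrative method name:

// Sketch wrapping the steps above into one helper; method name is illustrative.
- (UIImage *)blurredImageWithImage:(UIImage *)viewImage radius:(CGFloat)radius {
    CIImage *inputImage = [CIImage imageWithCGImage:viewImage.CGImage];

    CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [gaussianBlurFilter setValue:inputImage forKey:kCIInputImageKey];
    [gaussianBlurFilter setValue:@(radius) forKey:kCIInputRadiusKey];
    CIImage *resultImage = [gaussianBlurFilter valueForKey:@"outputImage"];

    // note, adjust rect because blur changed size of image
    CGRect rect = [resultImage extent];
    rect.origin.x += (rect.size.width  - viewImage.size.width ) / 2;
    rect.origin.y += (rect.size.height - viewImage.size.height) / 2;
    rect.size = viewImage.size;

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimg = [context createCGImage:resultImage fromRect:rect];
    UIImage *blurredImage = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
    return blurredImage;
}

If you are blurring repeatedly (e.g. for a live preview), create the CIContext once and reuse it; context creation is comparatively expensive.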
Alternatively, for iOS 7, if you go to Apple's UIImageEffects sample code and download iOS_UIImageEffects.zip, you can grab the UIImage+ImageEffects category. That category provides a few new methods:
- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;
- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius tintColor:(UIColor *)tintColor saturationDeltaFactor:(CGFloat)saturationDeltaFactor maskImage:(UIImage *)maskImage;
So, to blur an image and lighten it (giving that "frosted glass" effect), you can then do:
UIImage *newImage = [image applyLightEffect];
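If you want more control than the canned effects, you can call the designated method directly; the parameter values below are purely illustrative, not Apple's defaults:

// Illustrative parameter values, not Apple's defaults.
UIImage *customImage = [image applyBlurWithRadius:20.0
                                        tintColor:[UIColor colorWithWhite:1.0 alpha:0.3]
                            saturationDeltaFactor:1.8
                                        maskImage:nil];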
Interestingly, Apple's code does not employ CIFilter, but rather calls vImageBoxConvolve_ARGB8888 from the vImage high-performance image processing framework. This technique is illustrated in the WWDC 2013 video Implementing Engaging UI on iOS.
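For a sense of what that looks like, here is a minimal single-pass sketch of calling vImageBoxConvolve_ARGB8888 directly. Apple's category is more involved (for example, it runs multiple box-convolution passes to approximate a Gaussian), and the method name here is illustrative:

#import <Accelerate/Accelerate.h>

// Single-pass box blur sketch; ignores image scale/orientation for brevity.
- (UIImage *)boxBlurSketchForImage:(UIImage *)inputImage kernelSize:(uint32_t)kernelSize {
    CGImageRef cgImage = inputImage.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // Render the image into a bitmap context with a known 32-bit pixel layout.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace,
                                             kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);

    vImage_Buffer src  = { CGBitmapContextGetData(ctx), height, width, CGBitmapContextGetBytesPerRow(ctx) };
    void *destData = malloc(src.rowBytes * height);
    vImage_Buffer dest = { destData, height, width, src.rowBytes };

    // Kernel dimensions must be odd.
    uint32_t kernel = kernelSize | 1;
    vImageBoxConvolve_ARGB8888(&src, &dest, NULL, 0, 0, kernel, kernel, NULL, kvImageEdgeExtend);

    // Copy the blurred pixels back and extract a CGImage.
    memcpy(src.data, dest.data, src.rowBytes * height);
    free(destData);
    CGImageRef blurredCG = CGBitmapContextCreateImage(ctx);
    UIImage *result = [UIImage imageWithCGImage:blurredCG];

    CGImageRelease(blurredCG);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    return result;
}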