iOS - Reduce unwanted motion blur while using GPUImage to capture


I'm writing an app in Swift and using GPUImage to capture and manipulate images. I'm looking for a way to decrease the exposure time in order to reduce motion blur: if anything moves in the frame, it looks blurry. I have good lighting, so I'm not sure why the exposure isn't fast enough.

This is how I set up GPUImage:

self.stillCamera = GPUImageStillCamera(sessionPreset: AVCaptureSessionPreset640x480, cameraPosition: .front)
self.stillCamera!.outputImageOrientation = .portrait

I then set up the filters I want (a crop and, optionally, some effects) and start the preview:

self.stillCamera?.startCameraCapture()

And to capture a frame:

self.finalFilter?.useNextFrameForImageCapture()
let capturedImage = self.finalFilter?.imageFromCurrentFramebuffer()

The reason you're seeing such long exposure times is that you're using a GPUImageStillCamera both to preview and to capture frames. GPUImageStillCamera uses an AVCaptureStillImageOutput under the hood, and enables the live photo preview feed for that. That photo preview feed runs at ~15 fps or lower on various devices, and doesn't provide as clear an image as a GPUImageVideoCamera will.
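
For comparison, here is a minimal sketch of the same pipeline driven by a GPUImageVideoCamera instead. It assumes an "import GPUImage" at the top of the file and the same view-controller context as the question; videoCamera, cropFilter and previewView are illustrative property names, not from the original code:

// GPUImageVideoCamera drives the filters from the video feed, which runs at a
// higher frame rate and gives a sharper preview than the still-image preview feed.
self.videoCamera = GPUImageVideoCamera(sessionPreset: AVCaptureSessionPreset640x480, cameraPosition: .front)
self.videoCamera!.outputImageOrientation = .portrait

// Attach the existing filter chain (crop plus optional effects) and a GPUImageView for the preview.
self.videoCamera!.addTarget(self.cropFilter)
self.cropFilter?.addTarget(self.finalFilter)
self.finalFilter?.addTarget(self.previewView)

self.videoCamera!.startCameraCapture()

// Capturing a single filtered frame then works exactly as in the question:
self.finalFilter?.useNextFrameForImageCapture()
let capturedImage = self.finalFilter?.imageFromCurrentFramebuffer()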

You either want to capture photos from the AVCaptureStillImageOutput by triggering an actual photo capture (via -capturePhotoProcessedUpToFilter: or the like), or use a GPUImageVideoCamera and capture individual frames the way you do above.
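
For the first option, a hedged sketch of triggering a real photo capture while keeping the GPUImageStillCamera. The underlying Objective-C method on GPUImageStillCamera is capturePhotoAsImageProcessedUpToFilter:withCompletionHandler:; the Swift spelling below is an assumption and may differ between GPUImage and Swift versions, so check GPUImageStillCamera.h. handleCapturedPhoto is a hypothetical handler:

// Trigger a full still-image capture through the filter chain rather than
// reading a preview frame. The Swift-bridged method name is assumed here;
// verify it against the GPUImageStillCamera header in your project.
self.stillCamera?.capturePhotoAsImageProcessed(upToFilter: self.finalFilter) { processedImage, error in
    guard let image = processedImage, error == nil else {
        print("Photo capture failed: \(String(describing: error))")
        return
    }
    // image is the filtered, full-resolution UIImage from the still image output.
    self.handleCapturedPhoto(image)   // hypothetical handler, not from the original code
}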

