

GPUImage API Documentation: The GPUImageFramebuffer Class

2019-11-14 17:55:55

  The GPUImageFramebuffer class manages framebuffer objects: it is responsible for creating and destroying them and for reading back the framebuffer's contents.
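  In practice, a filter rarely allocates a GPUImageFramebuffer directly; it fetches one from the shared framebuffer cache and keeps it alive through the lock/unlock reference counting described below. A minimal sketch of that lifecycle (the cache-fetch method name is taken from GPUImage's GPUImageFramebufferCache; treat it as an assumption if your GPUImage version differs):

```objc
// Sketch: typical framebuffer lifecycle inside a filter's render pass.
// Assumes GPUImageFramebufferCache's fetch API; verify against your GPUImage version.
GPUImageFramebuffer *outputFramebuffer =
    [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:CGSizeMake(640.0, 480.0)
                                                       textureOptions:self.outputTextureOptions
                                                          onlyTexture:NO];
[outputFramebuffer activateFramebuffer];  // bind the FBO and set the viewport before drawing
[outputFramebuffer lock];                 // keep it alive while downstream targets read it

// ... issue OpenGL ES draw calls here ...

[outputFramebuffer unlock];               // when the count reaches 0, it returns to the cache
```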

  Properties

  @property(readonly) CGSize size

  Description: Read-only. The size of the framebuffer, set internally when the buffer is created.

 

  @property(readonly) GPUTextureOptions textureOptions

  Description: The texture options (filtering, wrap modes, and pixel format) used for the backing texture.

 

  @property(readonly) GLuint texture

  Description: The OpenGL texture that backs this framebuffer.

 

  @property(readonly) BOOL missingFramebuffer

  Description: Indicates whether this object was created with only a texture and has no underlying framebuffer object.

  

  Methods

   - (id)initWithSize:(CGSize)framebufferSize

    Description: Creates a framebuffer object of size framebufferSize, using default texture options.

    Parameter: framebufferSize, the size of the framebuffer.

    Returns: the initialized framebuffer object.

    Implementation

- (id)initWithSize:(CGSize)framebufferSize;
{
    GPUTextureOptions defaultTextureOptions;
    defaultTextureOptions.minFilter = GL_LINEAR;
    defaultTextureOptions.magFilter = GL_LINEAR;
    defaultTextureOptions.wrapS = GL_CLAMP_TO_EDGE;
    defaultTextureOptions.wrapT = GL_CLAMP_TO_EDGE;
    defaultTextureOptions.internalFormat = GL_RGBA;
    defaultTextureOptions.format = GL_BGRA;
    defaultTextureOptions.type = GL_UNSIGNED_BYTE;

    if (!(self = [self initWithSize:framebufferSize textureOptions:defaultTextureOptions onlyTexture:NO]))
    {
        return nil;
    }

    return self;
}

 

     - (id)initWithSize:(CGSize)framebufferSize textureOptions:(GPUTextureOptions)fboTextureOptions onlyTexture:(BOOL)onlyGenerateTexture

    Description: Creates a framebuffer object of size framebufferSize with the given texture options.

    Parameters: framebufferSize is the size of the framebuffer; fboTextureOptions is the detailed texture configuration; onlyGenerateTexture indicates whether to create only a texture, without a framebuffer object.

    Returns: the initialized framebuffer object.

    Implementation

- (id)initWithSize:(CGSize)framebufferSize textureOptions:(GPUTextureOptions)fboTextureOptions onlyTexture:(BOOL)onlyGenerateTexture;
{
    if (!(self = [super init]))
    {
        return nil;
    }

    _textureOptions = fboTextureOptions;
    _size = framebufferSize;
    framebufferReferenceCount = 0;
    referenceCountingDisabled = NO;
    _missingFramebuffer = onlyGenerateTexture;

    if (_missingFramebuffer)
    {
        runSynchronouslyOnVideoProcessingQueue(^{
            [GPUImageContext useImageProcessingContext];
            [self generateTexture];
            framebuffer = 0;
        });
    }
    else
    {
        [self generateFramebuffer];
    }
    return self;
}

 

   - (id)initWithSize:(CGSize)framebufferSize overriddenTexture:(GLuint)inputTexture

    Description: Creates a framebuffer object of size framebufferSize that wraps an existing texture.

    Parameter: inputTexture, the existing texture to use for rendering.

    Returns: the initialized framebuffer object.

    Implementation

- (id)initWithSize:(CGSize)framebufferSize overriddenTexture:(GLuint)inputTexture;
{
    if (!(self = [super init]))
    {
        return nil;
    }

    GPUTextureOptions defaultTextureOptions;
    defaultTextureOptions.minFilter = GL_LINEAR;
    defaultTextureOptions.magFilter = GL_LINEAR;
    defaultTextureOptions.wrapS = GL_CLAMP_TO_EDGE;
    defaultTextureOptions.wrapT = GL_CLAMP_TO_EDGE;
    defaultTextureOptions.internalFormat = GL_RGBA;
    defaultTextureOptions.format = GL_BGRA;
    defaultTextureOptions.type = GL_UNSIGNED_BYTE;

    _textureOptions = defaultTextureOptions;
    _size = framebufferSize;
    framebufferReferenceCount = 0;
    referenceCountingDisabled = YES;

    _texture = inputTexture;

    return self;
}

 

   - (void)activateFramebuffer

    Description: Activates (binds) the framebuffer and sets the viewport. Rendering only targets this framebuffer after it has been activated.

    Implementation

- (void)activateFramebuffer;
{
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glViewport(0, 0, (int)_size.width, (int)_size.height);
}

 

   - (void)lock

    Description: Reference counting: increments the framebuffer's reference count.

    Implementation

- (void)lock;
{
    if (referenceCountingDisabled)
    {
        return;
    }

    framebufferReferenceCount++;
}

 

   - (void)unlock

    Description: Reference counting: decrements the reference count; when it drops below 1, the framebuffer is returned to the shared framebuffer cache.

    Implementation

- (void)unlock;
{
    if (referenceCountingDisabled)
    {
        return;
    }

    NSAssert(framebufferReferenceCount > 0, @"Tried to overrelease a framebuffer, did you forget to call -useNextFrameForImageCapture before using -imageFromCurrentFramebuffer?");
    framebufferReferenceCount--;
    if (framebufferReferenceCount < 1)
    {
        [[GPUImageContext sharedFramebufferCache] returnFramebufferToCache:self];
    }
}

 

   - (void)clearAllLocks

    Description: Reference counting: resets the reference count to 0.

    Implementation

- (void)clearAllLocks;
{
    framebufferReferenceCount = 0;
}

 

   - (void)disableReferenceCounting

    Description: Disables reference counting.

    Implementation

- (void)disableReferenceCounting;
{
    referenceCountingDisabled = YES;
}

 

   - (void)enableReferenceCounting

    Description: Enables reference counting.

    Implementation

- (void)enableReferenceCounting;
{
    referenceCountingDisabled = NO;
}

 

   - (CGImageRef)newCGImageFromFramebufferContents

    Description: Reads back the framebuffer contents and returns a newly created CGImage; the caller is responsible for releasing it.

    Implementation

- (CGImageRef)newCGImageFromFramebufferContents;
{
    // a CGImage can only be created from a 'normal' color texture
    NSAssert(self.textureOptions.internalFormat == GL_RGBA, @"For conversion to a CGImage the output texture format for this filter must be GL_RGBA.");
    NSAssert(self.textureOptions.type == GL_UNSIGNED_BYTE, @"For conversion to a CGImage the type of the output texture of this filter must be GL_UNSIGNED_BYTE.");

    __block CGImageRef cgImageFromBytes;

    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext];

        NSUInteger totalBytesForImage = (int)_size.width * (int)_size.height * 4;
        // It appears that the width of a texture must be padded out to be a multiple of 8 (32 bytes) if reading from it using a texture cache

        GLubyte *rawImagePixels;

        CGDataProviderRef dataProvider = NULL;
        if ([GPUImageContext supportsFastTextureUpload])
        {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
            NSUInteger paddedWidthOfImage = CVPixelBufferGetBytesPerRow(renderTarget) / 4.0;
            NSUInteger paddedBytesForImage = paddedWidthOfImage * (int)_size.height * 4;

            glFinish();
            CFRetain(renderTarget); // I need to retain the pixel buffer here and release in the data source callback to prevent its bytes from being prematurely deallocated during a photo write operation
            [self lockForReading];
            rawImagePixels = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
            dataProvider = CGDataProviderCreateWithData((__bridge_retained void*)self, rawImagePixels, paddedBytesForImage, dataProviderUnlockCallback);
            [[GPUImageContext sharedFramebufferCache] addFramebufferToActiveImageCaptureList:self]; // In case the framebuffer is swapped out on the filter, need to have a strong reference to it somewhere for it to hang on while the image is in existence
#else
#endif
        }
        else
        {
            [self activateFramebuffer];
            rawImagePixels = (GLubyte *)malloc(totalBytesForImage);
            glReadPixels(0, 0, (int)_size.width, (int)_size.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);
            dataProvider = CGDataProviderCreateWithData(NULL, rawImagePixels, totalBytesForImage, dataProviderReleaseCallback);
            [self unlock]; // Don't need to keep this around anymore
        }

        CGColorSpaceRef defaultRGBColorSpace = CGColorSpaceCreateDeviceRGB();

        if ([GPUImageContext supportsFastTextureUpload])
        {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
            cgImageFromBytes = CGImageCreate((int)_size.width, (int)_size.height, 8, 32, CVPixelBufferGetBytesPerRow(renderTarget), defaultRGBColorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst, dataProvider, NULL, NO, kCGRenderingIntentDefault);
#else
#endif
        }
        else
        {
            cgImageFromBytes = CGImageCreate((int)_size.width, (int)_size.height, 8, 32, 4 * (int)_size.width, defaultRGBColorSpace, kCGBitmapByteOrderDefault | kCGImageAlphaLast, dataProvider, NULL, NO, kCGRenderingIntentDefault);
        }

        // Capture image with current device orientation
        CGDataProviderRelease(dataProvider);
        CGColorSpaceRelease(defaultRGBColorSpace);
    });

    return cgImageFromBytes;
}
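Because the method follows the Core Foundation create rule (the "new" prefix), the returned CGImageRef carries a +1 retain that the caller must balance. A minimal usage sketch, assuming a framebuffer variable named outputFramebuffer:

```objc
// Sketch: wrapping the returned CGImage in a UIImage, then releasing it.
CGImageRef cgImage = [outputFramebuffer newCGImageFromFramebufferContents];
UIImage *image = [UIImage imageWithCGImage:cgImage]; // UIImage retains the CGImage
CGImageRelease(cgImage);                             // balance the +1 from the 'new' method
```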

 

   - (void)restoreRenderTarget

    Description: Restores the render target by unlocking the pixel buffer and releasing the reference retained for reading.

    Implementation

- (void)restoreRenderTarget;
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    [self unlockAfterReading];
    CFRelease(renderTarget);
#else
#endif
}

 

   - (void)lockForReading

    Description: Locks the pixel buffer's base address for CPU reading.

    Implementation

- (void)lockForReading
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    if ([GPUImageContext supportsFastTextureUpload])
    {
        if (readLockCount == 0)
        {
            CVPixelBufferLockBaseAddress(renderTarget, 0);
        }
        readLockCount++;
    }
#endif
}

 

 

   - (void)unlockAfterReading

    Description: Unlocks the pixel buffer after reading.

- (void)unlockAfterReading
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    if ([GPUImageContext supportsFastTextureUpload])
    {
        NSAssert(readLockCount > 0, @"Unbalanced call to -[GPUImageFramebuffer unlockAfterReading]");
        readLockCount--;
        if (readLockCount == 0)
        {
            CVPixelBufferUnlockBaseAddress(renderTarget, 0);
        }
    }
#endif
}

 

   - (NSUInteger)bytesPerRow

    Description: Returns the number of bytes per row of the pixel buffer.

    Implementation

- (NSUInteger)bytesPerRow;
{
    if ([GPUImageContext supportsFastTextureUpload])
    {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
        return CVPixelBufferGetBytesPerRow(renderTarget);
#else
        return _size.width * 4; // TODO: do more with this on the non-texture-cache side
#endif
    }
    else
    {
        return _size.width * 4;
    }
}

 

   - (GLubyte *)byteBuffer

    Description: Returns the base address of the pixel buffer.

    Implementation

- (GLubyte *)byteBuffer;
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    [self lockForReading];
    GLubyte * bufferBytes = CVPixelBufferGetBaseAddress(renderTarget);
    [self unlockAfterReading];
    return bufferBytes;
#else
    return NULL; // TODO: do more with this on the non-texture-cache side
#endif
}
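Because the texture-cache-backed CVPixelBuffer may pad each row, raw pixels should be addressed via -bytesPerRow rather than width * 4. A hedged sketch of reading one pixel, assuming a framebuffer variable named outputFramebuffer and the fast-texture-upload path (where the format is GL_BGRA, so bytes are ordered B, G, R, A):

```objc
// Sketch: reading the pixel at (x, y) from the framebuffer's raw bytes.
// The row stride comes from -bytesPerRow because pixel-buffer rows may be padded.
GLubyte *bytes = [outputFramebuffer byteBuffer];
NSUInteger rowStride = [outputFramebuffer bytesPerRow];
NSUInteger x = 10, y = 20;
GLubyte *pixel = bytes + y * rowStride + x * 4;
GLubyte blue  = pixel[0]; // BGRA ordering on the texture-cache path
GLubyte green = pixel[1];
GLubyte red   = pixel[2];
GLubyte alpha = pixel[3];
```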

 

Complete Code

#import <Foundation/Foundation.h>

#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>
#else
#import <OpenGL/OpenGL.h>
#import <OpenGL/gl.h>
#endif

#import <QuartzCore/QuartzCore.h>
#import <CoreMedia/CoreMedia.h>

typedef struct GPUTextureOptions {
    GLenum minFilter;
    GLenum magFilter;
    GLenum wrapS;
    GLenum wrapT;
    GLenum internalFormat;
    GLenum format;
    GLenum type;
} GPUTextureOptions;

@interface GPUImageFramebuffer : NSObject

@property(readonly) CGSize size;
@property(readonly) GPUTextureOptions textureOptions;
@property(readonly) GLuint texture;
@property(readonly) BOOL missingFramebuffer;

// Initialization and teardown
- (id)initWithSize:(CGSize)framebufferSize;
- (id)initWithSize:(CGSize)framebufferSize textureOptions:(GPUTextureOptions)fboTextureOptions onlyTexture:(BOOL)onlyGenerateTexture;
- (id)initWithSize:(CGSize)framebufferSize overriddenTexture:(GLuint)inputTexture;

// Usage
- (void)activateFramebuffer;

// Reference counting
- (void)lock;
- (void)unlock;
- (void)clearAllLocks;
- (void)disableReferenceCounting;
- (void)enableReferenceCounting;

// Image capture
- (CGImageRef)newCGImageFromFramebufferContents;
- (void)restoreRenderTarget;

// Raw data bytes
- (void)lockForReading;
- (void)unlockAfterReading;
- (NSUInteger)bytesPerRow;
- (GLubyte *)byteBuffer;

@end

 

 

#import "GPUImageFramebuffer.h"
#import "GPUImageOutput.h"

@interface GPUImageFramebuffer()
{
    GLuint framebuffer;
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    CVPixelBufferRef renderTarget;
    CVOpenGLESTextureRef renderTexture;
    NSUInteger readLockCount;
#else
#endif
    NSUInteger framebufferReferenceCount;
    BOOL referenceCountingDisabled;
}

- (void)generateFramebuffer;
- (void)generateTexture;
- (void)destroyFramebuffer;

@end

void dataProviderReleaseCallback (void *info, const void *data, size_t size);
void dataProviderUnlockCallback (void *info, const void *data, size_t size);

@implementation GPUImageFramebuffer

@synthesize size = _size;
@synthesize textureOptions = _textureOptions;
@synthesize texture = _texture;
@synthesize missingFramebuffer = _missingFramebuffer;

#pragma mark -
#pragma mark Initialization and teardown

- (id)initWithSize:(CGSize)framebufferSize textureOptions:(GPUTextureOptions)fboTextureOptions onlyTexture:(BOOL)onlyGenerateTexture;
{
    if (!(self = [super init]))
    {
        return nil;
    }

    _textureOptions = fboTextureOptions;
    _size = framebufferSize;
    framebufferReferenceCount = 0;
    referenceCountingDisabled = NO;
    _missingFramebuffer = onlyGenerateTexture;

    if (_missingFramebuffer)
    {
        runSynchronouslyOnVideoProcessingQueue(^{
            [GPUImageContext useImageProcessingContext];
            [self generateTexture];
            framebuffer = 0;
        });
    }
    else
    {
        [self generateFramebuffer];
    }
    return self;
}

- (id)initWithSize:(CGSize)framebufferSize overriddenTexture:(GLuint)inputTexture;
{
    if (!(self = [super init]))
    {
        return nil;
    }

    GPUTextureOptions defaultTextureOptions;
    defaultTextureOptions.minFilter = GL_LINEAR;
    defaultTextureOptions.magFilter = GL_LINEAR;
    defaultTextureOptions.wrapS = GL_CLAMP_TO_EDGE;
    defaultTextureOptions.wrapT = GL_CLAMP_TO_EDGE;
    defaultTextureOptions.internalFormat = GL_RGBA;
    defaultTextureOptions.format = GL_BGRA;
    defaultTextureOptions.type = GL_UNSIGNED_BYTE;

    _textureOptions = defaultTextureOptions;
    _size = framebufferSize;
    framebufferReferenceCount = 0;
    referenceCountingDisabled = YES;

    _texture = inputTexture;

    return self;
}

- (id)initWithSize:(CGSize)framebufferSize;
{
    GPUTextureOptions defaultTextureOptions;
    defaultTextureOptions.minFilter = GL_LINEAR;
    defaultTextureOptions.magFilter = GL_LINEAR;
    defaultTextureOptions.wrapS = GL_CLAMP_TO_EDGE;
    defaultTextureOptions.wrapT = GL_CLAMP_TO_EDGE;
    defaultTextureOptions.internalFormat = GL_RGBA;
    defaultTextureOptions.format = GL_BGRA;
    defaultTextureOptions.type = GL_UNSIGNED_BYTE;

    if (!(self = [self initWithSize:framebufferSize textureOptions:defaultTextureOptions onlyTexture:NO]))
    {
        return nil;
    }

    return self;
}

- (void)dealloc
{
    [self destroyFramebuffer];
}

#pragma mark -
#pragma mark Internal

- (void)generateTexture;
{
    glActiveTexture(GL_TEXTURE1);
    glGenTextures(1, &_texture);
    glBindTexture(GL_TEXTURE_2D, _texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, _textureOptions.minFilter);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, _textureOptions.magFilter);
    // This is necessary for non-power-of-two textures
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, _textureOptions.wrapS);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, _textureOptions.wrapT);

    // TODO: Handle mipmaps
}

- (void)generateFramebuffer;
{
    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext];

        glGenFramebuffers(1, &framebuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);

        // By default, all framebuffers on iOS 5.0+ devices are backed by texture caches, using one shared cache
        if ([GPUImageContext supportsFastTextureUpload])
        {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
            CVOpenGLESTextureCacheRef coreVideoTextureCache = [[GPUImageContext sharedImageProcessingContext] coreVideoTextureCache];
            // Code originally sourced from http://allmybrain.com/2011/12/08/rendering-to-a-texture-with-ios-5-texture-cache-api/

            CFDictionaryRef empty; // empty value for attr value.
            CFMutableDictionaryRef attrs;
            empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks); // our empty IOSurface properties dictionary
            attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
            CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

            CVReturn err = CVPixelBufferCreate(kCFAllocatorDefault, (int)_size.width, (int)_size.height, kCVPixelFormatType_32BGRA, attrs, &renderTarget);
            if (err)
            {
                NSLog(@"FBO size: %f, %f", _size.width, _size.height);
                NSAssert(NO, @"Error at CVPixelBufferCreate %d", err);
            }

            err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, renderTarget,
                                                               NULL, // texture attributes
                                                               GL_TEXTURE_2D,
                                                               _textureOptions.internalFormat, // opengl format
                                                               (int)_size.width,
                                                               (int)_size.height,
                                                               _textureOptions.format, // native iOS format
                                                               _textureOptions.type,
                                                               0,
                                                               &renderTexture);
            if (err)
            {
                NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
            }

            CFRelease(attrs);
            CFRelease(empty);

            glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
            _texture = CVOpenGLESTextureGetName(renderTexture);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, _textureOptions.wrapS);
            glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, _textureOptions.wrapT);

            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);
#endif
        }
        else
        {
            [self generateTexture];
            glBindTexture(GL_TEXTURE_2D, _texture);

            glTexImage2D(GL_TEXTURE_2D, 0, _textureOptions.internalFormat, (int)_size.width, (int)_size.height, 0, _textureOptions.format, _textureOptions.type, 0);
            glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _texture, 0);
        }

#ifndef NS_BLOCK_ASSERTIONS
        GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
        NSAssert(status == GL_FRAMEBUFFER_COMPLETE, @"Incomplete filter FBO: %d", status);
#endif

        glBindTexture(GL_TEXTURE_2D, 0);
    });
}

- (void)destroyFramebuffer;
{
    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext];

        if (framebuffer)
        {
            glDeleteFramebuffers(1, &framebuffer);
            framebuffer = 0;
        }

        if ([GPUImageContext supportsFastTextureUpload] && (!_missingFramebuffer))
        {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
            if (renderTarget)
            {
                CFRelease(renderTarget);
                renderTarget = NULL;
            }

            if (renderTexture)
            {
                CFRelease(renderTexture);
                renderTexture = NULL;
            }
#endif
        }
        else
        {
            glDeleteTextures(1, &_texture);
        }
    });
}

#pragma mark -
#pragma mark Usage

- (void)activateFramebuffer;
{
    glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
    glViewport(0, 0, (int)_size.width, (int)_size.height);
}

#pragma mark -
#pragma mark Reference counting

- (void)lock;
{
    if (referenceCountingDisabled)
    {
        return;
    }

    framebufferReferenceCount++;
}

- (void)unlock;
{
    if (referenceCountingDisabled)
    {
        return;
    }

    NSAssert(framebufferReferenceCount > 0, @"Tried to overrelease a framebuffer, did you forget to call -useNextFrameForImageCapture before using -imageFromCurrentFramebuffer?");
    framebufferReferenceCount--;
    if (framebufferReferenceCount < 1)
    {
        [[GPUImageContext sharedFramebufferCache] returnFramebufferToCache:self];
    }
}

- (void)clearAllLocks;
{
    framebufferReferenceCount = 0;
}

- (void)disableReferenceCounting;
{
    referenceCountingDisabled = YES;
}

- (void)enableReferenceCounting;
{
    referenceCountingDisabled = NO;
}

#pragma mark -
#pragma mark Image capture

void dataProviderReleaseCallback (void *info, const void *data, size_t size)
{
    free((void *)data);
}

void dataProviderUnlockCallback (void *info, const void *data, size_t size)
{
    GPUImageFramebuffer *framebuffer = (__bridge_transfer GPUImageFramebuffer*)info;

    [framebuffer restoreRenderTarget];
    [framebuffer unlock];
    [[GPUImageContext sharedFramebufferCache] removeFramebufferFromActiveImageCaptureList:framebuffer];
}

- (CGImageRef)newCGImageFromFramebufferContents;
{
    // a CGImage can only be created from a 'normal' color texture
    NSAssert(self.textureOptions.internalFormat == GL_RGBA, @"For conversion to a CGImage the output texture format for this filter must be GL_RGBA.");
    NSAssert(self.textureOptions.type == GL_UNSIGNED_BYTE, @"For conversion to a CGImage the type of the output texture of this filter must be GL_UNSIGNED_BYTE.");

    __block CGImageRef cgImageFromBytes;

    runSynchronouslyOnVideoProcessingQueue(^{
        [GPUImageContext useImageProcessingContext];

        NSUInteger totalBytesForImage = (int)_size.width * (int)_size.height * 4;
        // It appears that the width of a texture must be padded out to be a multiple of 8 (32 bytes) if reading from it using a texture cache

        GLubyte *rawImagePixels;

        CGDataProviderRef dataProvider = NULL;
        if ([GPUImageContext supportsFastTextureUpload])
        {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
            NSUInteger paddedWidthOfImage = CVPixelBufferGetBytesPerRow(renderTarget) / 4.0;
            NSUInteger paddedBytesForImage = paddedWidthOfImage * (int)_size.height * 4;

            glFinish();
            CFRetain(renderTarget); // I need to retain the pixel buffer here and release in the data source callback to prevent its bytes from being prematurely deallocated during a photo write operation
            [self lockForReading];
            rawImagePixels = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
            dataProvider = CGDataProviderCreateWithData((__bridge_retained void*)self, rawImagePixels, paddedBytesForImage, dataProviderUnlockCallback);
            [[GPUImageContext sharedFramebufferCache] addFramebufferToActiveImageCaptureList:self]; // In case the framebuffer is swapped out on the filter, need to have a strong reference to it somewhere for it to hang on while the image is in existence
#else
#endif
        }
        else
        {
            [self activateFramebuffer];
            rawImagePixels = (GLubyte *)malloc(totalBytesForImage);
            glReadPixels(0, 0, (int)_size.width, (int)_size.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);
            dataProvider = CGDataProviderCreateWithData(NULL, rawImagePixels, totalBytesForImage, dataProviderReleaseCallback);
            [self unlock]; // Don't need to keep this around anymore
        }

        CGColorSpaceRef defaultRGBColorSpace = CGColorSpaceCreateDeviceRGB();

        if ([GPUImageContext supportsFastTextureUpload])
        {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
            cgImageFromBytes = CGImageCreate((int)_size.width, (int)_size.height, 8, 32, CVPixelBufferGetBytesPerRow(renderTarget), defaultRGBColorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst, dataProvider, NULL, NO, kCGRenderingIntentDefault);
#else
#endif
        }
        else
        {
            cgImageFromBytes = CGImageCreate((int)_size.width, (int)_size.height, 8, 32, 4 * (int)_size.width, defaultRGBColorSpace, kCGBitmapByteOrderDefault | kCGImageAlphaLast, dataProvider, NULL, NO, kCGRenderingIntentDefault);
        }

        // Capture image with current device orientation
        CGDataProviderRelease(dataProvider);
        CGColorSpaceRelease(defaultRGBColorSpace);
    });

    return cgImageFromBytes;
}

- (void)restoreRenderTarget;
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    [self unlockAfterReading];
    CFRelease(renderTarget);
#else
#endif
}

#pragma mark -
#pragma mark Raw data bytes

- (void)lockForReading
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    if ([GPUImageContext supportsFastTextureUpload])
    {
        if (readLockCount == 0)
        {
            CVPixelBufferLockBaseAddress(renderTarget, 0);
        }
        readLockCount++;
    }
#endif
}

- (void)unlockAfterReading
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    if ([GPUImageContext supportsFastTextureUpload])
    {
        NSAssert(readLockCount > 0, @"Unbalanced call to -[GPUImageFramebuffer unlockAfterReading]");
        readLockCount--;
        if (readLockCount == 0)
        {
            CVPixelBufferUnlockBaseAddress(renderTarget, 0);
        }
    }
#endif
}

- (NSUInteger)bytesPerRow;
{
    if ([GPUImageContext supportsFastTextureUpload])
    {
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
        return CVPixelBufferGetBytesPerRow(renderTarget);
#else
        return _size.width * 4; // TODO: do more with this on the non-texture-cache side
#endif
    }
    else
    {
        return _size.width * 4;
    }
}

- (GLubyte *)byteBuffer;
{
#if TARGET_IPHONE_SIMULATOR || TARGET_OS_IPHONE
    [self lockForReading];
    GLubyte * bufferBytes = CVPixelBufferGetBaseAddress(renderTarget);
    [self unlockAfterReading];
    return bufferBytes;
#else
    return NULL; // TODO: do more with this on the non-texture-cache side
#endif
}

- (GLuint)texture;
{
//    NSLog(@"accessing texture: %d from FB: %@", _texture, self);
    return _texture;
}

@end

 

  

