Tuesday, February 28, 2012

Filter4Cam Implementation: 5. Drawing the Raw Image Content

since: 2012/02/28
update: 2012/02/28

reference: 1. I touchs: Filter4Cam 學習之 Getting Raw Video Data
           2. I touchs: Filter4Cam 學習之 Saving a Photo

A. Drawing the Raw Image Content

   1. Open the ViewController.m file and modify it as follows:
....
#pragma mark Implementation Delegate Method

//@add implementation method for <AVCaptureVideoDataOutputSampleBufferDelegate>
// Receives the raw pixel data delivered back via the delegate callback.
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
   
    //@TODO
    // Convert the sampleBuffer we received into a CVPixelBuffer.
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
   
    // Next, turn the CVPixelBuffer into a CIImage using Core Image's initializer.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
   
    // Then use the CIContext object to draw its content into the render buffer.
    [self.coreImageContext drawImage:ciImage atPoint:CGPointZero fromRect:[ciImage extent]];
   
    // Finally, present it on screen.
    [self.glContext presentRenderbuffer:GL_RENDERBUFFER];
    
}
....
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    //@add:helper Test
    [self helperTest];
   
    //@add: establish Render
    [self establishRender];

    //@add: establishCamera
    [self establishCamera:kCameraBack];
   
    //@add: startRunning
    [self startRunningSession];
}

---------------------------------------------------------------------------------------------

B. Adjusting for Orientation Changes
    1. Open the Filter4CamHelper.h file and modify it as follows:
#import <Foundation/Foundation.h>
//@add
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
....
//@add for Utility Tools
+ (CIImage *)orientationTransform:(CIImage *)sourceImage; // adjust for orientation changes

@end

    2. Open the Filter4CamHelper.m file and modify it as follows:
....
#pragma mark Utility Tools

//@add: adjust for orientation changes
+ (CIImage *)orientationTransform:(CIImage *)sourceImage
{
    UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
    CGAffineTransform affineTransform; // affine transform
   
    if (orientation == UIDeviceOrientationPortrait)
    {
        affineTransform = CGAffineTransformMakeRotation(-M_PI / 2);
    }
    else if (orientation == UIDeviceOrientationPortraitUpsideDown) {
        affineTransform = CGAffineTransformMakeRotation(M_PI / 2);
    }
    else if (orientation == UIDeviceOrientationLandscapeRight) {
        affineTransform = CGAffineTransformMakeRotation(M_PI);
    }
    else {
        affineTransform = CGAffineTransformMakeRotation(0);
    }
   
    return [sourceImage imageByApplyingTransform:affineTransform];
}

@end

    3. Open the ViewController.m file and modify it as follows:
....
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // Convert the sampleBuffer we received into a CVPixelBuffer.
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
   
    // Next, turn the CVPixelBuffer into a CIImage using Core Image's initializer.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
      
    //@add: adjust for orientation changes
    ciImage = [Filter4CamHelper orientationTransform:ciImage];

    // Then use the CIContext object to draw its content into the render buffer.
    [self.coreImageContext drawImage:ciImage atPoint:CGPointZero fromRect:[ciImage extent]];
   
    // Finally, present it on screen.
    [self.glContext presentRenderbuffer:GL_RENDERBUFFER];
}
....

    4. Build and run:
    Note: at the moment, when the iOS device's orientation changes, the following error messages appear (to be handled later):
            Failed to make complete framebuffer object 8cd6
            CoreImage: EAGLContext framebuffer or renderbuffer incorrectly configured!

---------------------------------------------------------------------------------------------

C. Filter Test
    1. Open the ViewController.h file and modify it as follows:
....
//@add for test
- (void)helperTest;
- (CIImage *)filterTest:(CIImage *)sourceImage;

@end

    2. Open the ViewController.m file and modify it as follows (an alternative way to configure the filter is sketched after step 3 below):
....
#pragma mark Test
....
//@add
- (CIImage *)filterTest:(CIImage *)sourceImage
{
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone" keysAndValues:
                        kCIInputImageKey, sourceImage,
                        @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
   
    return [filter outputImage];
}
....

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
   
    // Convert the sampleBuffer we received into a CVPixelBuffer.
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
   
    // Next, turn the CVPixelBuffer into a CIImage using Core Image's initializer.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
       
    //@add: filter test
    ciImage = [self filterTest:ciImage];
   
    //@add: adjust for orientation changes
    ciImage = [Filter4CamHelper orientationTransform:ciImage];
 
    // Then use the CIContext object to draw its content into the render buffer.
    [self.coreImageContext drawImage:ciImage atPoint:CGPointZero fromRect:[ciImage extent]];
   
    // Finally, present it on screen.
    [self.glContext presentRenderbuffer:GL_RENDERBUFFER];
}
....

    3. Build and run.
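
    Side note to step C.2: the same CISepiaTone setup can also be written with explicit
    setValue:forKey: calls instead of filterWithName:keysAndValues:. The sketch below is an
    equivalent alternative, not part of the original steps; kCIInputImageKey and the
    @"inputIntensity" key are the same ones already used above.

//@sketch (alternative, illustrative only): configure CISepiaTone via setValue:forKey:
- (CIImage *)filterTest:(CIImage *)sourceImage
{
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:sourceImage forKey:kCIInputImageKey];
    [filter setValue:[NSNumber numberWithFloat:0.8] forKey:@"inputIntensity"];

    return [filter outputImage];
}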

Filter4Cam Implementation: 4. Adding Camera-Related Methods

since: 2012/02/28
update: 2012/02/28

reference: 1. The iOS 5 Developer's Cookbook
           2. I touchs: Filter4Cam 學習之 Getting Raw Video Data

A. Open the ViewController.h file and modify it as follows:
....
//@add
#import "Filter4CamHelper.h"

//@add:Cameras defined
enum {
    kCameraNone = -1, // no camera
    kCameraFront,     // front camera
    kCameraBack,      // back camera
} availableCameras;

//@interface ViewController : GLKViewController
//@update
@interface ViewController : GLKViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
{
....
    // AVCaptureSession: Used to coordinate the flow of data from
    //                                     AV input devices to outputs.

    AVCaptureSession *session;
   
    BOOL isUsingFrontCamera;
}

//@add
@property (strong, nonatomic) EAGLContext *glContext;
@property (strong, nonatomic) CIContext *coreImageContext;
@property (strong, nonatomic) GLKView *glView;
@property (strong) AVCaptureSession *session;
@property (assign) BOOL isUsingFrontCamera;

//@add for Render
- (void)establishRender; // establish Render

//@add for Camera
- (void)establishCamera:(uint)whichCamera; // set up the camera session
- (void)startRunningSession;               // start the camera session
- (void)stopRunningSession;                // stop the camera session

//@add for test
- (void)helperTest;

-----------------------------------------------------------------------------------

B. Open the ViewController.m file and modify it as follows:
@implementation ViewController

//@add
@synthesize glContext = _glContext;
@synthesize coreImageContext = _coreImageContext;
@synthesize glView = _glView;
@synthesize session = _session;
@synthesize isUsingFrontCamera = _isUsingFrontCamera;
....
#pragma mark Camera
//@add: start the camera session
- (void)startRunningSession
{
    if (self.session.running) return;
    [self.session startRunning];
}

//@add: stop the camera session
- (void)stopRunningSession
{
    [self.session stopRunning];
}

//@add: set up the camera session
- (void)establishCamera:(uint)whichCamera
{
    NSError *error;
   
    // Is a camera available
    if (![Filter4CamHelper numberOfCameras]) return;
   
    // Set up the camera input: create the session and configure it. Here the session is
    //              set to 640 pixels wide by 480 pixels high. Other options exist,
    //              including 720p and 1080p. The higher the resolution, the lower the
    //              Core Image performance, although a single simple filter can still
    //              handle high resolutions. (See the preset sketch after this listing.)

    // Create a session
    self.session = [[AVCaptureSession alloc] init];
   
    // begin
    [self.session beginConfiguration];
    [self.session setSessionPreset:AVCaptureSessionPreset640x480];   
   
   
    // Input device: to target the front or back camera, call devicesWithMediaType:,
    //          which returns an array of devices. To get the front camera, iterate over
    //          the array looking for AVCaptureDevicePositionFront in each device's
    //          AVCaptureDevicePosition property.
    //
   
    // Choose camera
    self.isUsingFrontCamera = NO;
    if ((whichCamera == kCameraFront) && [Filter4CamHelper frontCameraAvailable])
    {
        self.isUsingFrontCamera = YES;
    }
   
    // Configure the input device
    // Retrieve the selected camera
    //
    // Using the default device would look like:
    // AVCaptureDevice *device = [AVCaptureDevice
    //                               defaultDeviceWithMediaType:AVMediaTypeVideo];

    //
    AVCaptureDevice *device = self.isUsingFrontCamera ? [Filter4CamHelper frontCamera] : [Filter4CamHelper backCamera];
   
    // Create the capture input
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

    if (!captureInput)
    { 
        NSLog(@"Error establishing device input: %@", error);
        return;
    }
   
    [self.session addInput:captureInput];
   
    /*********************************************************************/
   
    // Configure the output and discard late frames. If the changes need to be recorded,
    // the color format of the incoming data can also be specified.

    // Configure the output device
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];

    [captureOutput setAlwaysDiscardsLateVideoFrames:YES];   
   
    // Establish settings
    NSDictionary *settings = [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
   
    [captureOutput setVideoSettings:settings];
   
   
    /*********************************************************************/
   
    // Configure the delegate: it will receive the callback for each frame,
    // dispatched on the main queue.
   
    // Create capture output
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:captureOutput];
   
    // Finish the configuration
    //
   
    // commit
    [self.session commitConfiguration];
}

#pragma mark Implementation Delegate Method

//@add implementation method for
//  <AVCaptureVideoDataOutputSampleBufferDelegate>

// Receives the raw pixel data delivered back via the delegate callback.
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    //@TODO
}
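
Side note on the session preset chosen in establishCamera: above: higher-resolution presets exist
(e.g. 720p). A minimal sketch of selecting one only when the device supports it, falling back to
640x480 otherwise, might look as follows. This is illustrative only and not part of the original
configuration; canSetSessionPreset: and AVCaptureSessionPreset1280x720 are standard
AVCaptureSession API.

    // Sketch only (assumption): prefer 720p when the device supports it.
    NSString *preferredPreset = AVCaptureSessionPreset1280x720;
    if ([self.session canSetSessionPreset:preferredPreset]) {
        [self.session setSessionPreset:preferredPreset];
    } else {
        [self.session setSessionPreset:AVCaptureSessionPreset640x480];
    }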

Monday, February 27, 2012

Filter4Cam Implementation: 3. Adding Rendering-Related Methods

since: 2012/02/27
update: 2012/02/28

reference: I touchs: Filter4Cam 學習之 Getting Raw Video Data

A. Open the ViewController.h file and modify it as follows:
#import <UIKit/UIKit.h>
#import <GLKit/GLKit.h>

//@add
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <QuartzCore/QuartzCore.h>
#import <CoreImage/CoreImage.h>
#import <ImageIO/ImageIO.h>

//@add
#import "Filter4CamHelper.h"

@interface ViewController : GLKViewController
{
    //@add
    // CIContext: Provides an evaluation context for rendering a CIImage object
    //                      through Quartz 2D or OpenGL.

    CIContext *coreImageContext;
   
    // EAGLContext: Manages the state information, commands, and resources
    //                             needed to draw using OpenGL ES.

    EAGLContext *glContext;
   
    // GLuint: an unsigned four-byte integer type, holding values from 0 to 4,294,967,295
    GLuint _renderBuffer;
   
    // GLKView: Simplifies the effort required to create an OpenGL ES application
    //                     by providing a default implementation of an OpenGL ES-aware view

    GLKView *glView;
}

//@add
@property (strong, nonatomic) EAGLContext *glContext;
@property (strong, nonatomic) CIContext *coreImageContext;
@property (strong, nonatomic) GLKView *glView;

//@add for Render
- (void)establishRender; // establish Render

@end

----------------------------------------------------------------------------------------

B. Open the ViewController.m file and modify it as follows:
#import "ViewController.h"

@implementation ViewController

//@add
@synthesize glContext = _glContext;
@synthesize coreImageContext = _coreImageContext;
@synthesize glView = _glView;
....
#pragma mark Getter

//@add: initialize glContext
- (EAGLContext *)glContext
{
    if (_glContext == nil) {
        _glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    }
   
    return _glContext;
}

//@add: initialize coreImageContext
- (CIContext *)coreImageContext
{
    if (_coreImageContext == nil) {
        _coreImageContext = [CIContext contextWithEAGLContext:self.glContext];
    }
   
    return _coreImageContext;
}

#pragma mark Render

//@add: establish Render
- (void)establishRender
{
    // create EAGLContext: it is created automatically by the glContext "getter".
   
    // setup: glView
    self.glView = (GLKView *)self.view;
    self.glView.context = self.glContext; // call glContext "getter" here.
    self.glView.drawableDepthFormat = GLKViewDrawableDepthFormat24;
   
    // setup: render buffer
    glGenRenderbuffers(1, &_renderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);
   
    // init Core Image context: it is created automatically by the coreImageContext "getter".
    // In practice, that getter is first invoked inside the
    // captureOutput:didOutputSampleBuffer:fromConnection: method.

}

- (void)viewDidUnload
{   
    [super viewDidUnload];
   
    //@add
    if ([EAGLContext currentContext] == self.glContext) {
        [EAGLContext setCurrentContext:nil];
    }
    self.glContext = nil;
}
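
A possible companion cleanup for the render buffer generated in establishRender (a sketch based
on an assumption, not something the original viewDidUnload does):

    //@sketch (assumption): also release the OpenGL ES render buffer created in establishRender.
    if (_renderBuffer) {
        glDeleteRenderbuffers(1, &_renderBuffer);
        _renderBuffer = 0;
    }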

Thursday, February 23, 2012

Filter4Cam Implementation: 2. Creating the Helper Class

since: 2012/02/23
update: 2012/02/27

reference: The iOS 5 Developer's Cookbook

A. Create the Helper Class
     Xcode > File > New > New File...
     iOS > Cocoa Touch > Objective-C class > Next
     Class: Filter4CamHelper
     Subclass of: NSObject
     > Next > Create

----------------------------------------------------------------------------------

B. Querying For and Retrieving Cameras
     1. Open the Filter4CamHelper.h file and modify it as follows:
#import <Foundation/Foundation.h>
//@add
#import <AVFoundation/AVFoundation.h>

@interface Filter4CamHelper : NSObject
{

}

//@add for Available Cameras
+ (int)numberOfCameras;           // number of cameras
+ (BOOL)backCameraAvailable;      // is the back camera available?
+ (BOOL)frontCameraAvailable;     // is the front camera available?
+ (AVCaptureDevice *)backCamera;  // back camera
+ (AVCaptureDevice *)frontCamera; // front camera

@end


     2. Open the Filter4CamHelper.m file and modify it as follows:
#import "Filter4CamHelper.h"

#pragma mark Filter4Cam Helper

@implementation Filter4CamHelper

//@add for Available Cameras
#pragma mark Available Cameras

//@add: number of cameras

+ (int)numberOfCameras
{
    return [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo].count;
}

//@add: is the back camera available?
+ (BOOL)backCameraAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in videoDevices)
        if (device.position == AVCaptureDevicePositionBack) return YES;

    return NO;
}

//@add: is the front camera available?
+ (BOOL)frontCameraAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in videoDevices)
        if (device.position == AVCaptureDevicePositionFront) return YES;

    return NO;
}

//@add: back camera
+ (AVCaptureDevice *)backCamera
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in videoDevices)
        if (device.position == AVCaptureDevicePositionBack) return device;
   
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}

//@add: front camera
+ (AVCaptureDevice *)frontCamera
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in videoDevices)
        if (device.position == AVCaptureDevicePositionFront) return device;
   
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}

@end  

----------------------------------------------------------------------------------

C. Testing
    1. Open the ViewController.h file and modify it as follows:
#import <UIKit/UIKit.h>
#import <GLKit/GLKit.h>
//@add
#import "Filter4CamHelper.h"
....

//@add for test
- (void)helperTest;

@end

    2. Open the ViewController.m file and modify it as follows (an optional extension of helperTest is sketched after the notes below):
#import "ViewController.h"
....

#pragma mark Test
//@add

- (void)helperTest
{
    NSLog(@"numberOfCameras = %d", [Filter4CamHelper numberOfCameras]);

    NSLog(@"backCameraAvailable = %@", [Filter4CamHelper backCameraAvailable] ? @"YES" : @"NO");


    NSLog(@"frontCameraAvailable = %@", [Filter4CamHelper frontCameraAvailable] ? @"YES" : @"NO");

}

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    //@add:helper Test
    [self helperTest];
}
....

    3. Build and run:

    4. Notes:
        If problems occur when deploying to a physical device for testing, check whether the
        "provisioning profiles" have become cluttered; redundant ones can be deleted:
         iOS Device > Settings > General > Profiles:
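
    As mentioned in step C.2, an optional extension of helperTest (a sketch only, not part of the
    original steps) could also log each detected camera by name; localizedName and position are
    standard AVCaptureDevice properties.

//@sketch (optional, illustrative only): list the available video devices inside helperTest
for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo])
{
    NSLog(@"camera: %@ (position = %d)", device.localizedName, (int)device.position);
}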

Sunday, February 19, 2012

Xcode 4.3

since: 2012/02/19
update: 2012/02/19

reference: Updating to Xcode 4.3 - Blog - Use Your Loaf

A. Overview
      Xcode 4.3 is now a standard Mac App Store package that installs directly into the
      Applications folder. Previously, installing from the App Store was a little confusing:
      it first downloaded an Xcode installer into the Applications folder, and only then did
      the actual installation into the /Developer directory begin.

-----------------------------------------------------------------------------------------

B. Installation
     Xcode can be downloaded for free from Mac App Store - Xcode. Once the download completes,
     it installs straight into the /Applications folder; you will not even see an installation
     dialog asking where to put it. (Note: if you are currently developing against the iOS 5.1
     beta, you should keep using the Xcode 4.3 Developer Preview provided on the iOS Dev Center.)

     The first time Xcode runs, it prompts you to remove the old Xcode version installed under
     /Developer (4.2.1 in this case) and to delete the now-redundant Xcode installer in /Applications.


-----------------------------------------------------------------------------------------

C. Installing Optional Components

   1. Xcode > Preferences...
      Includes the iOS 4.3 Simulator, the legacy Device Debugging Support,
      and the Command Line Tools.

   2. Other optional tools are available from Downloads for Apple developers and are no longer
       bundled with Xcode. They include the Command Line Tools (so you can use the compiler
       or the OS X header files on their own).

-----------------------------------------------------------------------------------------

D. The /Developer Directory Is Gone
   1. Switching Xcode versions:
      If you previously used command-line tools with scripts under /Developer,
      for example using agvtool to manage build version numbers, running them now produces the following error:
      $ agvtool
      Error: No developer directory found at /Developer. Run /usr/bin/xcode-select
      to update the developer directory path.

      Explanation: the xcode-select utility lets you switch between different versions of Xcode, so the fix is:
               $ sudo /usr/bin/xcode-select -switch /Applications/Xcode.app

   2. Finding the files that used to live under /Developer:
       Applications folder > Xcode.app > right-click > Show Package Contents:
       the old /Developer/usr/bin directory is now located at:
        /Applications/Xcode.app/Contents/Developer/usr/bin

   3. Adjusting environment variables.
       If you previously added /Developer/usr/bin to your PATH, or /Developer/usr/share/man
       to your MANPATH, you now need to add /Applications/Xcode.app/Contents to the front of
       those settings.

-----------------------------------------------------------------------------------------

E. The Instruments Tool
    One consequence of this reorganization is that you can no longer use the Finder to reach
    additional developer tools such as Instruments. Open it in either of the following ways:
    1. Xcode > Open Developer Tool > Instruments

    2. Dock > Xcode > Open Developer Tool > Instruments

    Note: to keep the Instruments icon permanently in the Dock for convenient future use:
               Dock > Instruments > Options > Keep in Dock.

Friday, February 17, 2012

Filter4Cam Implementation: 1. Creating the Project

since: 2012/02/17
update: 2012/03/10

reference:
1. I touchs: My Second iApp Developement
2. I touchs: Git 單一開發者與遠端備份機制
3. I touchs: Filter4Cam 學習之 Getting Raw Video Data

A. Creating the Project
       Xcode > File > New > New Project... > iOS > Application > OpenGL Game > Next
        Product Name: Filter4Cam
        Device Family: iPhone
        Use Storyboard: checked
        Use Automatic Reference Counting: checked
        > Next > Create

-------------------------------------------------------------------------------------

B. Adjusting Related Settings
   1. Filter4Cam-Info.plist file
       a. Set the Value of the Bundle identifier Key to: com.blogspot.Filter4Cam

       b. Declare the iOS device requirements for installing this app.
           (This tells iTunes which requirements a user's iOS device must meet before the app can be installed.)
           > Click the arrow to the left of the Required device capabilities Key to expand its Items, then click
              the "plus" button to the right of the Item. In the newly created Item, set the Value to still-camera,
              which means the app requires a built-in camera that can take photos through the image-capture
              picker interface.

   2. TARGETS (Filter4Cam) > Build Settings (All, Combined)
       a. Base SDK: iOS 5.0

       b. Code Signing Identity >
           Debug / Release : Any iOS SDK: iPhone Developer

       c. Deployment >
           iOS Deployment target: iOS 5.0

-------------------------------------------------------------------------------------

C. Adding Frameworks
   1. The default frameworks are:

     GLKit
     - Provides libraries of commonly needed functions and classes to reduce the effort
       needed to create a new OpenGL ES 2.0 application or the effort required to port
       an existing OpenGL ES 1.1 application to OpenGL ES 2.0.

     OpenGLES
     - Is used for visualizing 2D and 3D data.

   2. Add the following frameworks:

        AssetsLibrary
        - Access the pictures and videos managed by the Photos application.

        AVFoundation
        - Manage and play audio-visual media in your iOS app.

        CoreImage
        - Use pixel-accurate near-real-time image processing.

        CoreLocation
        - Determine the current location or heading associated with a device.

        CoreMedia
        - Handle time-based AV assets.

        CoreVideo
        - Play and process movies with frame-by-frame control.

        ImageIO
        - Provides interfaces for reading and writing most image formats.

        MobileCoreServices
        - Gives you access to constants and additional APIs for dealing with time-based
           media.

        QuartzCore
        - Add 2D graphics rendering support.

-------------------------------------------------------------------------------------

D. Removing Unused Template Code
      Open the ViewController.m file, keep only the following code, and delete the rest:
#import "ViewController.h"

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
}

- (void)viewDidUnload
{   
    [super viewDidUnload];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Release any cached data, images, etc. that aren't in use.
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    // Return YES for supported orientations
    /*
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown);
    } else {
        return YES;
    }
    */

    //@update
    return YES;

}

@end

-------------------------------------------------------------------------------------

E. Project Backup and Restore

     1. Backup
        a. Open Terminal and change into the project directory (where the .git folder lives):
            $ cd /Lanli/RD/Projects/Filter4Cam/

        b. Run the backup:
            $ git bundle create Filter4Cam_2012_02_24_01.BUNDLE --all

        c. Upload the bundle to Google Docs:

     2. Restore
         a. Download the xxx.BUNDLE file from Google Docs
         b. Run the following steps:
             $ cd /Lanli/RD
             $ git clone xxx.BUNDLE myProject
             $ cd myProject
             $ git fetch
             $ git pull
             Note: myProject is the name you choose for the new project directory.

-------------------------------------------------------------------------------------

F. Notes
     With a project created the way described in "Filter4Cam 參考: 新增專案", the following
     error appears when OpenGL ES is actually used to produce image output:

      2012-02-24 09:07:28.216 Filter4Cam[2331:707] *** Terminating app due
      to uncaught exception 'NSInternalInconsistencyException', reason:
      '-[GLKViewController loadView] loaded the "qIN-Z6-fbv-view-rsN-32-Zl3"
      nib but didn't get a GLKView.'

     Solution (use the GLKit framework):
     After adding the View Controller, change its subclass from UIViewController to
     GLKViewController, and change the View in the .storyboard from UIView to GLKView.

     Although the camera image then displays correctly, it becomes even less stable when the
     orientation changes, so in the end the project is still created in the current way.

Wednesday, February 15, 2012

Filter4Cam Reference: Creating the Render Class

since: 2012/02/15
update: 2012/02/17

reference: I touchs: Filter4Cam 學習之 Getting Raw Video Data

A. Creating the Render Class
    1. Xcode > File > New > New File...
        iOS > Cocoa Touch > Objective-C class > Next
        Class: Filter4CamRender
        Subclass of: GLKViewController
        > Next > Create

    2. Open the Filter4CamRender.h file and modify it as follows:
//@add
#import <GLKit/GLKit.h>

@interface Filter4CamRender : GLKViewController
{
}
@end

--------------------------------------------------------------------------------

B. Adding Render-Related Variables and Methods
   1. Open the Filter4CamRender.h file and modify it as follows:
//@add
#import <GLKit/GLKit.h>

@interface Filter4CamRender : GLKViewController
{
    //@add
    // CIContext: Provides an evaluation context for rendering a CIImage object
    // through Quartz 2D or OpenGL.
    CIContext *coreImageContext;
   
    // EAGLContext: Manages the state information, commands, and resources needed
    // to draw using OpenGL ES.
    EAGLContext *glContext;
   
    // GLKView: Simplifies the effort required to create an OpenGL ES application
    // by providing a default implementation of an OpenGL ES-aware view
    GLKView *glView;
   
    // GLuint: an unsigned four-byte integer type, holding values from 0 to 4,294,967,295
    GLuint _renderBuffer;
}

//@add
@property (strong, nonatomic)CIContext *coreImageContext;
@property (strong, nonatomic)EAGLContext *glContext;
@property (strong, nonatomic)GLKView *glView;

//@add:init method
+ (id)RenderInit;

//@add:establish Render
- (void)establishRender;

@end

   2. Open the Filter4CamRender.m file and modify it as follows:
#import "Filter4CamRender.h"

@implementation Filter4CamRender

//@add
@synthesize coreImageContext = _coreImageContext;
@synthesize glContext = _glContext;
@synthesize glView = _glView;

//@add
- (id)initWithRender
{
    if (!(self = [super init])) return self;
    [self establishRender];
   
    return self;
}

//@add:establish Render
- (void)establishRender
{
    // create EAGLContext
    self.glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

    if (!self.glContext) {
        NSLog(@"Failed to create ES context");
    }
   
    // setup: glView
    self.glView = (GLKView *)self.view;
    self.glView.context = self.glContext;
    self.glView.drawableDepthFormat = GLKViewDrawableDepthFormat24;
   
    // setup: render buffer
    glGenRenderbuffers(1, &_renderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);
   
    // init: Core Image context
    self.coreImageContext = [CIContext contextWithEAGLContext:self.glContext];
}

//@add:init method
+ (id)RenderInit
{
    Filter4CamRender *render = [[Filter4CamRender alloc] initWithRender];
    return render;
}

@end
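
A minimal usage sketch of the RenderInit factory method above (illustrative only, not part of the
original post):

// Sketch: create a render controller via the factory method defined above.
Filter4CamRender *render = [Filter4CamRender RenderInit];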

Monday, February 13, 2012

Filter4Cam Reference: Creating the Helper Class

since: 2012/02/11
update: 2012/02/17

reference: The iOS 5 Developer's Cookbook

A. Declaring the iOS Device Requirements for Installing This App
    1. Explanation: this tells iTunes which requirements a user's iOS device must meet before the app can be installed.

    2. Select the Filter4Cam-Info.plist file, then click the "plus" button to the right of the
        Required device capabilities Key. In the newly created Item, set the Value to still-camera,
        meaning the app requires a built-in camera that can take photos through the image-capture
        picker interface.

-----------------------------------------------------------------------------------

B. Creating the Helper Class
    1. Xcode > File > New > New File...
        iOS > Cocoa Touch > Objective-C class > Next
        Class: Filter4CamHelper
        Subclass of: NSObject
        > Next > Create

-----------------------------------------------------------------------------------

C. Querying For and Retrieving Cameras
     1. Open the Filter4CamHelper.h file and modify it as follows:
#import <Foundation/Foundation.h>
//@add
#import <AVFoundation/AVFoundation.h>

@interface Filter4CamHelper : NSObject
{
}

//@add for Available Cameras
+ (int)numberOfCameras;           // number of cameras
+ (BOOL)backCameraAvailable;      // is the back camera available?
+ (BOOL)frontCameraAvailable;     // is the front camera available?
+ (AVCaptureDevice *)backCamera;  // back camera
+ (AVCaptureDevice *)frontCamera; // front camera

@end


     2. Open the Filter4CamHelper.m file and modify it as follows:
#import "Filter4CamHelper.h"

#pragma mark Filter4Cam Helper

@implementation Filter4CamHelper

//@add for Available Cameras
#pragma mark Available Cameras
//@add: number of cameras

+ (int)numberOfCameras
{
    return [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo].count;
}

//@add: is the back camera available?
+ (BOOL)backCameraAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in videoDevices)
        if (device.position == AVCaptureDevicePositionBack) return YES;

    return NO;
}

//@add: is the front camera available?
+ (BOOL)frontCameraAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in videoDevices)
        if (device.position == AVCaptureDevicePositionFront) return YES;

    return NO;
}

//@add: back camera
+ (AVCaptureDevice *)backCamera
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in videoDevices)
        if (device.position == AVCaptureDevicePositionBack) return device;
   
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}

//@add: front camera
+ (AVCaptureDevice *)frontCamera
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in videoDevices)
        if (device.position == AVCaptureDevicePositionFront) return device;
   
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}

@end  

-----------------------------------------------------------------------------------

D. Creating the Camera Session
   1. Open the Filter4CamHelper.h file and modify it as follows:
#import <Foundation/Foundation.h>
//@add
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h> // needed for the CIImage ivar/property declared below

//@add:Cameras defined
enum {
    kCameraNone = -1, // no camera
    kCameraFront,     // front camera
    kCameraBack,      // back camera
} availableCameras;

//@interface Filter4CamHelper : NSObject
//@update
@interface Filter4CamHelper : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    //@add
    BOOL isUsingFrontCamera;
    AVCaptureSession *session;
    CIImage *ciImage;

}

//@add
@property (readonly)BOOL isUsingFrontCamera;
@property (strong)AVCaptureSession *session;
@property (strong)CIImage *ciImage;

//@add for Available Cameras
+ (int)numberOfCameras;
+ (BOOL)backCameraAvailable;
+ (BOOL)frontCameraAvailable;
+ (AVCaptureDevice *)backCamera;
+ (AVCaptureDevice *)frontCamera;

//@add for Camera
- (void)establishCamera:(uint)whichCamera; // set up the camera session
- (void)startRunningSession;               // start the camera session
- (void)stopRunningSession;                // stop the camera session

   2. Open the Filter4CamHelper.m file and modify it as follows:
#import "Filter4CamHelper.h"

@implementation Filter4CamHelper

//@add
@synthesize isUsingFrontCamera;
@synthesize session;
@synthesize ciImage;

....

#pragma mark Capture
//@add: implement the <AVCaptureVideoDataOutputSampleBufferDelegate> method
//      (a sketch for filling in this stub appears at the end of this section)
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    //@TODO
}

//@add for Setup
#pragma mark Setup

//@add: start the camera session
- (void)startRunningSession
{
    if (session.running) return;
    [session startRunning];
}

//@add: stop the camera session
- (void)stopRunningSession
{
    [session stopRunning];
}

//@add: set up the camera session
- (void)establishCamera:(uint)whichCamera
{
    NSError *error;
  
    // Is a camera available
    if (![Filter4CamHelper numberOfCameras]) return;
  
    // Create a session
    self.session = [[AVCaptureSession alloc] init];
  
    //@begin
    [session beginConfiguration];
    [session setSessionPreset:AVCaptureSessionPreset640x480];
  
    // Choose camera
    isUsingFrontCamera = NO;
    if ((whichCamera == kCameraFront) && [Filter4CamHelper frontCameraAvailable])
        isUsingFrontCamera = YES;
  
    //@ configure the input device
    // Retrieve the selected camera
    AVCaptureDevice *device = isUsingFrontCamera ? [Filter4CamHelper frontCamera] : [Filter4CamHelper backCamera];
  
    // Create the capture input
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!captureInput)
    {  
        NSLog(@"Error establishing device input: %@", error);
        return;
    }
  
    [session addInput:captureInput];
  
  
    //@ configure the output device
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    [captureOutput setAlwaysDiscardsLateVideoFrames:YES];
  
    // Establish settings
    NSDictionary *settings = [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];

    [captureOutput setVideoSettings:settings];
  
    // Create capture output: not to use the main queue
    char *queueName = "com.blogspot.Filter4Cam.grabFrames";
    dispatch_queue_t queue = dispatch_queue_create(queueName, NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
  
    [session addOutput:captureOutput];
  
  
    //@commit
    [session commitConfiguration];
}
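
As noted at the delegate stub above, with this configuration the sample buffers arrive on the
background queue created in establishCamera:. One possible way to fill in the //@TODO (a sketch
only, not the post's implementation) is to convert each buffer and hand it to the main queue via
the ciImage property declared earlier:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Sketch (assumption): wrap the raw buffer in a CIImage and hop back to the main
    // queue before anything UI- or OpenGL-related touches it.
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    dispatch_async(dispatch_get_main_queue(), ^{
        self.ciImage = image; // keep the latest frame; drawing would happen elsewhere
    });
}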
 
-----------------------------------------------------------------------------------

E. Creating Initialization Methods
    1. Open the Filter4CamHelper.h file and modify it as follows:
....
//@add:init method 
+ (id)helperWithCamera:(uint)whichCamera;
....

    2. Open the Filter4CamHelper.m file and modify it as follows:
....
#pragma mark Creation
//@add
- (id)init
{
    if (!(self = [super init])) return self;
    [self establishCamera: kCameraBack];
    return self;
}

//@add
- (id)initWithCamera:(uint)whichCamera
{
    if (!(self = [super init])) return self;   
    [self establishCamera: whichCamera];
    return self;
}

//@add:init method 
+ (id)helperWithCamera:(uint)whichCamera
{
    Filter4CamHelper *helper = [[Filter4CamHelper alloc] initWithCamera:whichCamera];

    return helper;
}
....
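
    A minimal usage sketch of the initializers above (illustrative only, not part of the original
    steps):

// Sketch: create a helper that prefers the front camera, then start its capture session.
Filter4CamHelper *helper = [Filter4CamHelper helperWithCamera:kCameraFront];
[helper startRunningSession];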

-----------------------------------------------------------------------------------

F. Switching Cameras
    1. Open the Filter4CamHelper.h file and modify it as follows:
....
//@add for Camera
- (void)establishCamera:(uint)whichCamera; // set up the camera session
- (void)startRunningSession;               // start the camera session
- (void)stopRunningSession;                // stop the camera session
- (void)switchCameras;                     // switch cameras
....

    2. Open the Filter4CamHelper.m file and modify it as follows:
....
//@add: switch cameras
- (void)switchCameras
{
    if ([Filter4CamHelper numberOfCameras] < 2) return; // need at least two cameras to switch
   
    isUsingFrontCamera = !isUsingFrontCamera;

    AVCaptureDevice *newDevice = isUsingFrontCamera ? [Filter4CamHelper frontCamera] : [Filter4CamHelper backCamera];
   
    [session beginConfiguration];
   
    // Remove existing inputs
    for (AVCaptureInput *input in [session inputs])
        [session removeInput:input];
   
    // Change the input
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:newDevice error:nil];

    [session addInput:captureInput];
   
    [session commitConfiguration];
}
....
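
    A hypothetical way to trigger the switch, for example from a button action on the owning view
    controller (a sketch only: the switchCameraPressed: action and the helper instance variable are
    assumptions, not part of the original code):

//@sketch: toggle between the front and back cameras from a hypothetical button action
- (IBAction)switchCameraPressed:(id)sender
{
    [helper switchCameras]; // "helper" is assumed to be a Filter4CamHelper held by the controller
}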