
I need to take a picture with the camera, but I have to do it from inside my own UIViewController, where I have a custom button for taking the photo. In short: take a photo from the camera on iOS without a modal view controller.

I want to build a custom view controller that shows the live camera stream, the way Instagram does. This is what I have now:

-(void) takePicture:(id) sender 
{ 
    UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init]; 

    if([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) 
    { 
     [imagePicker setSourceType:UIImagePickerControllerSourceTypeCamera]; 
    } 
    else 
    { 
     [imagePicker setSourceType:UIImagePickerControllerSourceTypePhotoLibrary]; 
    } 
    [imagePicker setDelegate:self]; 
    [self presentModalViewController:imagePicker animated:YES]; 
} 

but I don't want to use something like this. How can I do it? Thanks.


Read the documentation for UIImagePickerController. Look at the section "Fully Customized Media Capture and Browsing". – rmaddy

Answers


You can use AVFoundation to take the picture. See Apple's documentation for AVCaptureDevice.

Here is Apple's sample code: AVCam

I also found a tutorial: link

Another Stack Overflow question that, I think, has good example code: link

Another useful link for you: link
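For orientation, here is a minimal sketch of the flow those links describe: an AVCaptureSession with a camera input, a preview layer embedded in your own view controller (the Instagram-style live stream), and a still image output. It assumes the iOS 7-era AVCaptureStillImageOutput API; self.session, self.stillImageOutput, and self.previewView are placeholder properties for your own class, not part of any sample:

#import <AVFoundation/AVFoundation.h>

// Sketch only: self.session, self.stillImageOutput, and self.previewView
// are assumed properties/outlets on your own view controller.
- (void)setupCamera
{
    self.session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input && [self.session canAddInput:input])
        [self.session addInput:input];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    self.stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
    if ([self.session canAddOutput:self.stillImageOutput])
        [self.session addOutput:self.stillImageOutput];

    // The preview layer is what gives you the live camera stream inside
    // your own view hierarchy - no modal picker involved.
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    previewLayer.frame = self.previewView.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.previewView.layer addSublayer:previewLayer];

    [self.session startRunning];
}

- (void)captureStill
{
    AVCaptureConnection *connection =
        [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
            if (!sampleBuffer) return;
            NSData *data = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:data];
            // Hand the image to your own code on the main queue.
            dispatch_async(dispatch_get_main_queue(), ^{
                // e.g. display or save the image here
            });
        }];
}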


Here is AVCam-based code that takes a picture from the camera without a modal view controller. Thank you for linking to this code in your answer.

With the code below, the user taps a 'takePhoto' button and the device takes the picture without a photo-preview screen and without any change to the UI. It is Apple's AVCam sample from https://developer.apple.com/LIBRARY/IOS/samplecode/AVCam/Introduction/Intro.html with the 'extra' functionality (everything not related to taking a still photo) commented out. Thank you for suggesting this code in your answer.

- (IBAction)takePhoto:(id)sender { 
    [self snapStillImage:sender]; 
} 


#pragma mark - AVFoundation 

- (BOOL)isSessionRunningAndDeviceAuthorized 
{ 
    return [[self session] isRunning] && [self isDeviceAuthorized]; 
} 

+ (NSSet *)keyPathsForValuesAffectingSessionRunningAndDeviceAuthorized 
{ 
    return [NSSet setWithObjects:@"session.running", @"deviceAuthorized", nil]; 
} 

- (void)viewDidLoad 
{ 
    [super viewDidLoad]; 

    // getDeviceId, saveImageToCloud: and checkMarkTakePhoto are this project's
    // own helpers/outlets; they are not part of Apple's AVCam sample.
    [self getDeviceId];

    // Create the AVCaptureSession 
    AVCaptureSession *session = [[AVCaptureSession alloc] init]; 
    [self setSession:session]; 

    // Check for device authorization 
    [self checkDeviceAuthorizationStatus]; 

    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time. 
    // Why not do all of this on the main queue? 
    // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive). 

    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL); 
    [self setSessionQueue:sessionQueue]; 

    dispatch_async(sessionQueue, ^{ 
     [self setBackgroundRecordingID:UIBackgroundTaskInvalid]; 

     NSError *error = nil; 

     AVCaptureDevice *videoDevice = [CMRootViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionFront]; 
     AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error]; 

     if (error) 
     { 
      NSLog(@"%@", error); 
     } 

     if ([session canAddInput:videoDeviceInput]) 
     { 
      [session addInput:videoDeviceInput]; 
      [self setVideoDeviceInput:videoDeviceInput]; 

      dispatch_async(dispatch_get_main_queue(), ^{ 
       // Why are we dispatching this to the main queue? 
       // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread. 
       // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer’s connection with other session manipulation. 
      }); 
     } 

     AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject]; 
     AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error]; 

     if (error) 
     { 
      NSLog(@"%@", error); 
     } 

     if ([session canAddInput:audioDeviceInput]) 
     { 
      [session addInput:audioDeviceInput]; 
     } 

     AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init]; 
     if ([session canAddOutput:movieFileOutput]) 
     { 
      [session addOutput:movieFileOutput]; 
      AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo]; 
      if ([connection isVideoStabilizationSupported]) 
       [connection setEnablesVideoStabilizationWhenAvailable:YES]; 
      [self setMovieFileOutput:movieFileOutput]; 
     } 

     AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init]; 
     if ([session canAddOutput:stillImageOutput]) 
     { 
      [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}]; 
      [session addOutput:stillImageOutput]; 
      [self setStillImageOutput:stillImageOutput]; 
     } 
    }); 
} 

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];

    dispatch_async([self sessionQueue], ^{
     [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext]; 
     [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext]; 
     [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext]; 
     [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]]; 

     __weak CMRootViewController *weakSelf = self; 
     [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) { 
      CMRootViewController *strongSelf = weakSelf; 
      dispatch_async([strongSelf sessionQueue], ^{ 
       // Manually restarting the session since it must have been stopped due to an error. 
       [[strongSelf session] startRunning]; 
      }); 
     }]]; 
     [[self session] startRunning]; 
    }); 
} 

- (void)viewDidDisappear:(BOOL)animated
{
    [super viewDidDisappear:animated];

    dispatch_async([self sessionQueue], ^{
     [[self session] stopRunning]; 

     [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]]; 
     [[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]]; 

     [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext]; 
     [self removeObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" context:CapturingStillImageContext]; 
     [self removeObserver:self forKeyPath:@"movieFileOutput.recording" context:RecordingContext]; 
    }); 
} 

- (BOOL)prefersStatusBarHidden 
{ 
    return YES; 
} 

- (BOOL)shouldAutorotate 
{ 
    // Disable autorotation of the interface when recording is in progress. 
    return ![self lockInterfaceRotation]; 
} 

- (NSUInteger)supportedInterfaceOrientations 
{ 
    return UIInterfaceOrientationMaskAll; 
} 

- (void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration 
{ 
// [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)toInterfaceOrientation]; 
} 

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context 
{ 
    if (context == CapturingStillImageContext) 
    { 
     BOOL isCapturingStillImage = [change[NSKeyValueChangeNewKey] boolValue]; 

     if (isCapturingStillImage) 
     { 
      [self runStillImageCaptureAnimation]; 
     } 
    } 
    else if (context == RecordingContext) 
    { 
     BOOL isRecording = [change[NSKeyValueChangeNewKey] boolValue]; 

     dispatch_async(dispatch_get_main_queue(), ^{ 
      if (isRecording) 
      { 
//    [[self cameraButton] setEnabled:NO]; 
//    [[self recordButton] setTitle:NSLocalizedString(@"Stop", @"Recording button stop title") forState:UIControlStateNormal]; 
//    [[self recordButton] setEnabled:YES]; 
      } 
      else 
      { 
//    [[self cameraButton] setEnabled:YES]; 
//    [[self recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal]; 
//    [[self recordButton] setEnabled:YES]; 
      } 
     }); 
    } 
    else if (context == SessionRunningAndDeviceAuthorizedContext) 
    { 
     BOOL isRunning = [change[NSKeyValueChangeNewKey] boolValue]; 

     dispatch_async(dispatch_get_main_queue(), ^{ 
      if (isRunning) 
      { 
//    [[self cameraButton] setEnabled:YES]; 
//    [[self recordButton] setEnabled:YES]; 
//    [[self stillButton] setEnabled:YES]; 
      } 
      else 
      { 
//    [[self cameraButton] setEnabled:NO]; 
//    [[self recordButton] setEnabled:NO]; 
//    [[self stillButton] setEnabled:NO]; 
      } 
     }); 
    } 
    else 
    { 
     [super observeValueForKeyPath:keyPath ofObject:object change:change context:context]; 
    } 
} 

#pragma mark Actions 

- (IBAction)toggleMovieRecording:(id)sender 
{ 
// [[self recordButton] setEnabled:NO]; 

    dispatch_async([self sessionQueue], ^{ 
     if (![[self movieFileOutput] isRecording]) 
     { 
      [self setLockInterfaceRotation:YES]; 

      if ([[UIDevice currentDevice] isMultitaskingSupported]) 
      { 
       // Setup background task. This is needed because the captureOutput:didFinishRecordingToOutputFileAtURL: callback is not received until AVCam returns to the foreground unless you request background execution time. This also ensures that there will be time to write the file to the assets library when AVCam is backgrounded. To conclude this background execution, -endBackgroundTask is called in -recorder:recordingDidFinishToOutputFileURL:error: after the recorded file has been saved. 
       [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]]; 
      } 

      // Update the orientation on the movie file output video connection before starting recording. 
//   [[[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]]; 

      // Turning OFF flash for video recording 
      [CMRootViewController setFlashMode:AVCaptureFlashModeOff forDevice:[[self videoDeviceInput] device]]; 

      // Start recording to a temporary file. 
      NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"movie" stringByAppendingPathExtension:@"mov"]]; 
      [[self movieFileOutput] startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self]; 
     } 
     else 
     { 
      [[self movieFileOutput] stopRecording]; 
     } 
    }); 
} 

- (IBAction)changeCamera:(id)sender 
{ 
// [[self cameraButton] setEnabled:NO]; 
// [[self recordButton] setEnabled:NO]; 
// [[self stillButton] setEnabled:NO]; 

    dispatch_async([self sessionQueue], ^{ 
     AVCaptureDevice *currentVideoDevice = [[self videoDeviceInput] device]; 
     AVCaptureDevicePosition preferredPosition = AVCaptureDevicePositionUnspecified; 
     AVCaptureDevicePosition currentPosition = [currentVideoDevice position]; 

     switch (currentPosition) 
     { 
      case AVCaptureDevicePositionUnspecified: 
       preferredPosition = AVCaptureDevicePositionBack; 
       break; 
      case AVCaptureDevicePositionBack: 
       preferredPosition = AVCaptureDevicePositionFront; 
       break; 
      case AVCaptureDevicePositionFront: 
       preferredPosition = AVCaptureDevicePositionBack; 
       break; 
     } 

     AVCaptureDevice *videoDevice = [CMRootViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:preferredPosition]; 
     AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil]; 

     [[self session] beginConfiguration]; 

     [[self session] removeInput:[self videoDeviceInput]]; 
     if ([[self session] canAddInput:videoDeviceInput]) 
     { 
      [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:currentVideoDevice]; 

      [CMRootViewController setFlashMode:AVCaptureFlashModeAuto forDevice:videoDevice]; 
      [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:videoDevice]; 

      [[self session] addInput:videoDeviceInput]; 
      [self setVideoDeviceInput:videoDeviceInput]; 
     } 
     else 
     { 
      [[self session] addInput:[self videoDeviceInput]]; 
     } 

     [[self session] commitConfiguration]; 

     dispatch_async(dispatch_get_main_queue(), ^{ 
//   [[self cameraButton] setEnabled:YES]; 
//   [[self recordButton] setEnabled:YES]; 
//   [[self stillButton] setEnabled:YES]; 
     }); 
    }); 
} 

- (IBAction)snapStillImage:(id)sender 
{ 
    dispatch_async([self sessionQueue], ^{ 
     // Update the orientation on the still image output video connection before capturing. 
//  [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]]; 

     // Flash set to Auto for Still Capture 
     [CMRootViewController setFlashMode:AVCaptureFlashModeAuto forDevice:[[self videoDeviceInput] device]]; 

     // Capture a still image. 
     [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo] completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) { 

      if (imageDataSampleBuffer)
      {
       NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
       UIImage *image = [[UIImage alloc] initWithData:imageData];

       [self saveImageToCloud:image];
       // The completion handler may run on an arbitrary queue, and UIKit
       // must only be touched on the main queue, so hop over for the UI update.
       dispatch_async(dispatch_get_main_queue(), ^{
        self.checkMarkTakePhoto.hidden = NO;
       });
      }
     }]; 
    }); 
} 

- (IBAction)focusAndExposeTap:(UIGestureRecognizer *)gestureRecognizer 
{ 
// CGPoint devicePoint = [(AVCaptureVideoPreviewLayer *)[[self previewView] layer] captureDevicePointOfInterestForPoint:[gestureRecognizer locationInView:[gestureRecognizer view]]]; 
// [self focusWithMode:AVCaptureFocusModeAutoFocus exposeWithMode:AVCaptureExposureModeAutoExpose atDevicePoint:devicePoint monitorSubjectAreaChange:YES]; 
} 

- (void)subjectAreaDidChange:(NSNotification *)notification 
{ 
    CGPoint devicePoint = CGPointMake(.5, .5); 
    [self focusWithMode:AVCaptureFocusModeContinuousAutoFocus exposeWithMode:AVCaptureExposureModeContinuousAutoExposure atDevicePoint:devicePoint monitorSubjectAreaChange:NO]; 
} 

#pragma mark File Output Delegate 

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error 
{ 
    if (error) 
     NSLog(@"%@", error); 

    [self setLockInterfaceRotation:NO]; 

    // Note the backgroundRecordingID for use in the ALAssetsLibrary completion handler to end the background task associated with this recording. This allows a new recording to be started, associated with a new UIBackgroundTaskIdentifier, once the movie file output's -isRecording is back to NO — which happens sometime after this method returns. 
    UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID]; 
    [self setBackgroundRecordingID:UIBackgroundTaskInvalid]; 

    [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) { 
     if (error) 
      NSLog(@"%@", error); 

     [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil]; 

     if (backgroundRecordingID != UIBackgroundTaskInvalid) 
      [[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID]; 
    }]; 
} 

#pragma mark Device Configuration 

- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposeWithMode:(AVCaptureExposureMode)exposureMode atDevicePoint:(CGPoint)point monitorSubjectAreaChange:(BOOL)monitorSubjectAreaChange 
{ 
    dispatch_async([self sessionQueue], ^{ 
     AVCaptureDevice *device = [[self videoDeviceInput] device]; 
     NSError *error = nil; 
     if ([device lockForConfiguration:&error]) 
     { 
      if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:focusMode]) 
      { 
       [device setFocusMode:focusMode]; 
       [device setFocusPointOfInterest:point]; 
      } 
      if ([device isExposurePointOfInterestSupported] && [device isExposureModeSupported:exposureMode]) 
      { 
       [device setExposureMode:exposureMode]; 
       [device setExposurePointOfInterest:point]; 
      } 
      [device setSubjectAreaChangeMonitoringEnabled:monitorSubjectAreaChange]; 
      [device unlockForConfiguration]; 
     } 
     else 
     { 
      NSLog(@"%@", error); 
     } 
    }); 
} 

+ (void)setFlashMode:(AVCaptureFlashMode)flashMode forDevice:(AVCaptureDevice *)device 
{ 
    if ([device hasFlash] && [device isFlashModeSupported:flashMode]) 
    { 
     NSError *error = nil; 
     if ([device lockForConfiguration:&error]) 
     { 
      [device setFlashMode:flashMode]; 
      [device unlockForConfiguration]; 
     } 
     else 
     { 
      NSLog(@"%@", error); 
     } 
    } 
} 

+ (AVCaptureDevice *)deviceWithMediaType:(NSString *)mediaType preferringPosition:(AVCaptureDevicePosition)position 
{ 
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:mediaType]; 
    AVCaptureDevice *captureDevice = [devices firstObject]; 

    for (AVCaptureDevice *device in devices) 
    { 
     if ([device position] == position) 
     { 
      captureDevice = device; 
      break; 
     } 
    } 

    return captureDevice; 
} 

#pragma mark UI 

- (void)runStillImageCaptureAnimation 
{ 
/* 
    dispatch_async(dispatch_get_main_queue(), ^{ 
     [[[self previewView] layer] setOpacity:0.0]; 
     [UIView animateWithDuration:.25 animations:^{ 
      [[[self previewView] layer] setOpacity:1.0]; 
     }]; 
    }); 
*/ 
} 

- (void)checkDeviceAuthorizationStatus 
{ 
    NSString *mediaType = AVMediaTypeVideo; 

    [AVCaptureDevice requestAccessForMediaType:mediaType completionHandler:^(BOOL granted) { 
     if (granted) 
     { 
      //Granted access to mediaType 
      [self setDeviceAuthorized:YES]; 
     } 
     else 
     { 
      //Not granted access to mediaType 
      dispatch_async(dispatch_get_main_queue(), ^{ 
       [[[UIAlertView alloc] initWithTitle:@"AVCam!" 
              message:@"AVCam doesn't have permission to use Camera, please change privacy settings" 
              delegate:self 
            cancelButtonTitle:@"OK" 
            otherButtonTitles:nil] show]; 
       [self setDeviceAuthorized:NO]; 
      }); 
     } 
    }]; 
} 

Can someone explain the method 'keyPathsForValuesAffectingSessionRunningAndDeviceAuthorized' in more detail? I don't understand why it has to be added to the code or what it is used for. –


@DinhNhat This method is used to trigger KVO notifications automatically for a dependent ("to-one relationship") key: sessionRunningAndDeviceAuthorized depends on both [session isRunning] and [self isDeviceAuthorized], so observers of the combined key must be notified when either of them changes. You can find more details at https://developer.apple.com/library/ios/documentation/Cocoa/Conceptual/KeyValueObserving/Articles/KVODependentKeys.html –
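To make the dependent-keys mechanism concrete, here is a minimal sketch of the same pattern (the Person class and its properties are illustrative, not part of AVCam): observers of fullName are automatically notified whenever firstName or lastName changes, because the class method declares the dependency.

// Minimal illustration of KVO dependent keys (hypothetical names).
@interface Person : NSObject
@property (copy) NSString *firstName;
@property (copy) NSString *lastName;
@end

@implementation Person

- (NSString *)fullName
{
    return [NSString stringWithFormat:@"%@ %@", self.firstName, self.lastName];
}

// Any change to firstName or lastName also fires a KVO notification
// for the derived key "fullName".
+ (NSSet *)keyPathsForValuesAffectingFullName
{
    return [NSSet setWithObjects:@"firstName", @"lastName", nil];
}

@end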
