Accessing the built-in Image Picker Controller is a quick and easy way to get image and video capture into your app. However, when you need style and functionality beyond what the stock Image Picker Controller offers, you will need to create a Custom Camera View.
Add the following view elements to the ViewController in Storyboard: a UIView to serve as the camera's viewfinder, a UIImageView to hold the captured still image, and a UIButton to take the photo.

At the top of your ViewController file, import AVFoundation (in Objective-C, #import <AVFoundation/AVFoundation.h>).

Create Outlets for the UIView and UIImageView, named previewView and captureImageView. Create an Action for the UIButton, named didTakePhoto.

Above the viewDidLoad method, where you declare variables you want to be accessible anywhere in the ViewController file, create the following instance variables:
var captureSession: AVCaptureSession!
var stillImageOutput: AVCapturePhotoOutput!
var videoPreviewLayer: AVCaptureVideoPreviewLayer!
@interface CameraViewController ()
@property (nonatomic) AVCaptureSession *captureSession;
@property (nonatomic) AVCapturePhotoOutput *stillImageOutput;
@property (nonatomic) AVCaptureVideoPreviewLayer *videoPreviewLayer;
@end
The bulk of the camera setup will happen in viewDidAppear. NOTE: Don't forget to call super.viewDidAppear(animated) as well.

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Setup your camera here...
}
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Setup your camera here...
}
The session will coordinate the input and output data from the device's camera.

Still in viewDidAppear:
captureSession = AVCaptureSession()
captureSession.sessionPreset = .medium
self.captureSession = [AVCaptureSession new];
self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
NOTE: If you plan to upload your captured photos to Parse, you will likely need to use AVCaptureSession.Preset.high or AVCaptureSession.Preset.medium to keep the file size under the 10 MB Parse max (see the preset-check sketch below).

In this example we will be using the rear camera; the front camera and microphone are additional input devices at your disposal. We print a debug message in case fetching the rear camera fails.
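If you are unsure whether a device supports the preset you want, you can check before assigning it. A minimal sketch of that guard, assuming it runs right after creating captureSession above and that you prefer the .photo preset used in the Objective-C snippet:

// Prefer .photo for stills; fall back to .medium if this device can't do it.
if captureSession.canSetSessionPreset(.photo) {
    captureSession.sessionPreset = .photo
} else {
    captureSession.sessionPreset = .medium
}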
Still in viewDidAppear:
guard let backCamera = AVCaptureDevice.default(for: AVMediaType.video) else {
    print("Unable to access back camera!")
    return
}
AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (!backCamera) {
    NSLog(@"Unable to access back camera!");
    return;
}
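A common reason the camera fails to come up on a real device is permission: iOS requires an NSCameraUsageDescription entry in your Info.plist, and you can also check the authorization status yourself before configuring the session. A minimal sketch, not part of the numbered steps:

switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
    break // permission already granted; safe to configure the session
case .notDetermined:
    AVCaptureDevice.requestAccess(for: .video) { granted in
        // If granted, kick off your camera setup (back on the main thread)
    }
default:
    print("Camera access denied or restricted")
}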
We now need to make an AVCaptureDeviceInput. The AVCaptureDeviceInput will serve as the "middle man" to attach the input device, backCamera, to the session. In Swift we wrap the creation of the input in a do/try/catch to handle any potential errors we might encounter; in Objective-C, errors are handled using the traditional NSError pattern.

Still in viewDidAppear:
do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    //Step 9
}
catch let error {
    print("Error Unable to initialize back camera: \(error.localizedDescription)")
}
NSError *error;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error];
if (!error) {
    //Step 9
}
else {
    NSLog(@"Error Unable to initialize back camera: %@", error.localizedDescription);
}
Just like we created an AVCaptureDeviceInput to be the "middle man" to attach the input device, we will use AVCapturePhotoOutput to help us attach the output to the session.
stillImageOutput = AVCapturePhotoOutput()
self.stillImageOutput = [AVCapturePhotoOutput new];
If there are no errors from the last step and the session is able to accept our input and output, then go ahead and add both to the session.
stillImageOutput = AVCapturePhotoOutput()

if captureSession.canAddInput(input) && captureSession.canAddOutput(stillImageOutput) {
    captureSession.addInput(input)
    captureSession.addOutput(stillImageOutput)
    setupLivePreview()
}
self.stillImageOutput = [AVCapturePhotoOutput new];

if ([self.captureSession canAddInput:input] && [self.captureSession canAddOutput:self.stillImageOutput]) {
    [self.captureSession addInput:input];
    [self.captureSession addOutput:self.stillImageOutput];
    [self setupLivePreview];
}
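As an aside, if you ever need to reconfigure a session that is already running (for example, to switch between the front and back cameras), AVCaptureSession lets you batch the changes so the session only commits them once. A sketch of that pattern, not required for this tutorial; oldInput and newInput are hypothetical placeholders:

captureSession.beginConfiguration()
// Swap inputs (or outputs) here, e.g. when flipping cameras:
// captureSession.removeInput(oldInput)
// captureSession.addInput(newInput)
captureSession.commitConfiguration()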
Now that the input and output are hooked up to our session, we just need to get our live preview going so we can actually display what the camera sees on the screen in our UIView, previewView.
func setupLivePreview() {
    videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    videoPreviewLayer.videoGravity = .resizeAspect
    videoPreviewLayer.connection?.videoOrientation = .portrait
    previewView.layer.addSublayer(videoPreviewLayer)
    //Step12
}
- (void)setupLivePreview {
    self.videoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    if (self.videoPreviewLayer) {
        self.videoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
        self.videoPreviewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        [self.previewView.layer addSublayer:self.videoPreviewLayer];
        //Step12
    }
}
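Note that .resizeAspect letterboxes the video to fit inside previewView. If you would rather have the preview fill the view edge to edge (cropping the excess), .resizeAspectFill is the usual alternative:

// Fill previewView completely, cropping whatever doesn't fit.
videoPreviewLayer.videoGravity = .resizeAspectFill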
We need to call -startRunning on the session to start the live view. However, -startRunning is a blocking method, which means it will block the UI if it runs on the main thread. If the session takes a while to start, users will want the UI to remain responsive so they can cancel out of the camera view.
DispatchQueue.global(qos: .userInitiated).async {
    self.captureSession.startRunning()
    //Step 13
}
dispatch_queue_t globalQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
dispatch_async(globalQueue, ^{
    [self.captureSession startRunning];
    //Step 13
});
Once the live view starts, let's set the preview layer to fit, but we must return to the main thread to do so!
DispatchQueue.main.async {
    self.videoPreviewLayer.frame = self.previewView.bounds
}

dispatch_async(dispatch_get_main_queue(), ^{
    self.videoPreviewLayer.frame = self.previewView.bounds;
});
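Setting the frame once covers the simple case. If your layout can change after the camera is up (rotation, size class changes), one option is to keep the preview layer in sync from viewDidLayoutSubviews. A sketch, assuming the outlets and instance variables declared above:

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Keep the preview layer matched to previewView whenever layout changes.
    videoPreviewLayer?.frame = previewView.bounds
}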
Let's create an IBAction for the Take Photo button and capture a JPEG by calling capturePhoto(with:delegate:) (Swift) or -capturePhotoWithSettings:delegate: (Objective-C) on stillImageOutput, our instance of AVCapturePhotoOutput. This method requires us to provide it with settings and a delegate to deliver the captured photo to. That delegate will be this ViewController, so we also need to conform to the AVCapturePhotoCaptureDelegate protocol.
class CameraViewControllerSwift: UIViewController, AVCapturePhotoCaptureDelegate {
    ....
    @IBAction func didTakePhoto(_ sender: Any) {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        stillImageOutput.capturePhoto(with: settings, delegate: self)
    }
}
//CameraViewController.h
@interface CameraViewController : UIViewController <AVCapturePhotoCaptureDelegate>
//CameraViewController.m
- (IBAction)didTakePhoto:(id)sender {
    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettingsWithFormat:@{AVVideoCodecKey: AVVideoCodecTypeJPEG}];
    [self.stillImageOutput capturePhotoWithSettings:settings delegate:self];
}
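AVCapturePhotoSettings carries more than the codec. Flash is a commonly set option; since not every device has one, it is safest to consult supportedFlashModes first. A sketch of how didTakePhoto could enable auto flash, under those assumptions:

let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
if stillImageOutput.supportedFlashModes.contains(.auto) {
    settings.flashMode = .auto // let the system decide based on lighting
}
stillImageOutput.capturePhoto(with: settings, delegate: self)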
The AVCapturePhotoOutput will deliver the captured photo to the assigned delegate, our current ViewController, via the delegate method photoOutput(_:didFinishProcessingPhoto:error:). The photo is delivered to us as an AVCapturePhoto, which is easy to transform into Data/NSData and then into a UIImage.
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    guard let imageData = photo.fileDataRepresentation() else { return }
    let image = UIImage(data: imageData)
    captureImageView.image = image
}
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(nullable NSError *)error {
    NSData *imageData = [photo fileDataRepresentation];
    if (imageData) {
        UIImage *image = [UIImage imageWithData:imageData];
        // Add the image to captureImageView here...
        self.captureImageView.image = image;
    }
}
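If you also want the photo saved to the user's photo library, UIImageWriteToSavedPhotosAlbum is the simplest route; note it requires an NSPhotoLibraryAddUsageDescription entry in Info.plist. A sketch you could drop inside the delegate method above:

if let image = UIImage(data: imageData) {
    // nil target/selector means fire-and-forget; pass a completion
    // selector if you want to surface save errors to the user.
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
}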
Let's not forget to stop the session when we leave the camera view!
override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    self.captureSession.stopRunning()
}
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [self.captureSession stopRunning];
}
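Like -startRunning, -stopRunning is a blocking call, so if you notice a hitch when dismissing the view, the same background-queue pattern used to start the session applies on the way out. A sketch:

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Stop the session off the main thread to keep the dismissal smooth.
    DispatchQueue.global(qos: .userInitiated).async {
        self.captureSession.stopRunning()
    }
}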
NOTE: The simulator does NOT have a camera, so you need to run your app on an actual device to see the magic!