AVFoundation + AssetWriter: Generate Movie With Images and Audio

I ended up exporting the video separately using the above code and adding the audio files afterwards using AVMutableComposition and AVAssetExportSession. Here is the code:

-(void)addAudioToFileAtPath:(NSString *)filePath toPath:(NSString *)outFilePath {
    NSError *error = nil;
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:filePath] options:nil];
    AVAssetTrack *videoAssetTrack = … Read more
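The same composition-and-export approach can be sketched in Swift (a simplified, hypothetical version, not the answer's full code; error handling and the export preset are illustrative assumptions):

```swift
import AVFoundation

// Sketch: build an AVMutableComposition with the video track from one
// file and the audio track from another, then export the result with
// AVAssetExportSession. Paths, preset and file type are placeholders.
func addAudio(toVideoAt videoURL: URL, audioURL: URL, outputURL: URL,
              completion: @escaping (Bool) -> Void) {
    let composition = AVMutableComposition()
    let videoAsset = AVURLAsset(url: videoURL)
    let audioAsset = AVURLAsset(url: audioURL)

    guard
        let videoTrack = videoAsset.tracks(withMediaType: .video).first,
        let audioTrack = audioAsset.tracks(withMediaType: .audio).first,
        let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid),
        let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    else { completion(false); return }

    // Lay both tracks over the video's full time range.
    let range = CMTimeRange(start: .zero, duration: videoAsset.duration)
    try? compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
    try? compAudio.insertTimeRange(range, of: audioTrack, at: .zero)

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality)
    else { completion(false); return }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
```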

Face Detection with Camera

There are two ways to detect faces: CIFaceDetector and AVCaptureMetadataOutput. Depending on your requirements, choose whichever is relevant for you. CIFaceDetector has more features: it gives you the location of the eyes and mouth, a smile detector, and so on. AVCaptureMetadataOutput, on the other hand, is computed on the frames, and the detected faces are tracked and … Read more
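The AVCaptureMetadataOutput route can be sketched as follows (a minimal, hypothetical setup; camera-permission handling and session configuration details are omitted):

```swift
import AVFoundation

// Sketch: ask an AVCaptureMetadataOutput for face metadata. The
// delegate receives AVMetadataFaceObject instances, which carry the
// face bounds plus yaw/roll angles when available.
final class FaceMetadataReceiver: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()

    func configure() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera)
        else { return }
        session.addInput(input)

        let output = AVCaptureMetadataOutput()
        session.addOutput(output)
        // metadataObjectTypes must be set after the output joins the session.
        output.metadataObjectTypes = [.face]
        output.setMetadataObjectsDelegate(self, queue: .main)
        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for case let face as AVMetadataFaceObject in metadataObjects {
            print("face at \(face.bounds), has yaw: \(face.hasYawAngle)")
        }
    }
}
```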

How to solve the warning: Sending ‘ViewController *const __strong’ to parameter of incompatible type ‘id&lt;AVAudioPlayerDelegate&gt;’?

Conform to the AVAudioPlayerDelegate protocol in your header file and the warning will go away. Without declaring that a class conforms to a given protocol, the compiler cannot guarantee (or at least warn) that you have implemented the methods the protocol requires. The following code is a corrected version that will suppress the … Read more

ios capturing image using AVFramework

Add the following line:

output.minFrameDuration = CMTimeMake(5, 1);

below the comment

// If you wish to cap the frame rate to a known value, such as 15 fps, set
// minFrameDuration.

but above [session startRunning];.

Edit: use the following code to preview the camera output:

AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
UIView *aView = self.view;
… Read more
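For reference, CMTimeMake(value, timescale) denotes value/timescale seconds, so minFrameDuration directly encodes the minimum time between frames. A small standalone helper (illustrative, not part of the answer) shows the arithmetic:

```swift
// CMTimeMake(value, timescale) represents value/timescale seconds, so a
// minFrameDuration of CMTimeMake(5, 1) means at most one frame every
// 5 s, while a 15 fps cap corresponds to CMTimeMake(1, 15).
func frameDurationSeconds(value: Int, timescale: Int) -> Double {
    return Double(value) / Double(timescale)
}

print(frameDurationSeconds(value: 5, timescale: 1))   // seconds per frame for CMTimeMake(5, 1)
print(frameDurationSeconds(value: 1, timescale: 15))  // one frame every ~0.067 s, i.e. 15 fps
```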

How to use AVCapturePhotoOutput

Updated to Swift 4. It’s really easy to use AVCapturePhotoOutput: you need the AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer. You can also get a preview image if you give the AVCapturePhotoSettings a previewFormat:

class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {
    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let … Read more
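The capture path can be sketched end to end like this (a hypothetical sketch assuming the session and camera input are wired up elsewhere; on iOS 11+ the delegate callback delivers an AVCapturePhoto, while earlier SDKs used the CMSampleBuffer-based callback mentioned above):

```swift
import AVFoundation

// Sketch: request a photo with a preview format, then receive it in
// the AVCapturePhotoCaptureDelegate callback. Preview dimensions are
// illustrative.
final class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {
    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        if let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first {
            settings.previewPhotoFormat = [
                kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                kCVPixelBufferWidthKey as String: 160,
                kCVPixelBufferHeightKey as String: 160
            ]
        }
        cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    // iOS 11+ delegate callback: the processed photo arrives here.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        print("captured \(data.count) bytes")
    }
}
```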

Streaming live camera video from iOS (iPhone/iPad) to remote PC / server

I have developed such a library, and you can find it at github.com/jgh-/VideoCore. I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a camera/mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h. Additionally, VideoCore is now available in CocoaPods.