Depth delivery or camera calibration enabled? #15

@megatran

Description

Hello, I tried to modify the capturePhoto() function to allow depth data delivery, but in the photo callback the depth data is still nil.

I also tried modifying the configuration code to use either the LiDAR or TrueDepth camera, following this sample project (https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_depth_using_the_lidar_camera), but I received neither depth nor calibration data. I'm wondering if you have any suggestions on how to get depth/camera calibration data when a photo is taken as well. Thank you

    public func capturePhoto() {
        /*
         Retrieve the video preview layer's video orientation on the main queue before
         entering the session queue. This ensures that UI elements are accessed on
         the main thread and session configuration is done on the session queue.
         */
        if self.setupResult != .configurationFailed {
            let videoPreviewLayerOrientation: AVCaptureVideoOrientation = .portrait
            self.isCameraButtonDisabled = true

            sessionQueue.async {
                if let photoOutputConnection = self.photoOutput.connection(with: .video) {
                    photoOutputConnection.videoOrientation = videoPreviewLayerOrientation
                }
                // AVCapturePhotoSettings is a class, so `let` suffices here.
                let photoSettings = AVCapturePhotoSettings()
                photoSettings.isDepthDataDeliveryEnabled = true
                // ... delegate setup and the capturePhoto(with:delegate:) call follow
            }
        }
    }
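For reference, here is a minimal sketch (modeled on Apple's sample linked above, with placeholder names like `session` and `photoOutput` rather than this project's actual properties) of the session configuration I understand to be required: `isDepthDataDeliveryEnabled` on the per-photo settings only works if depth delivery was first enabled on the `AVCapturePhotoOutput` itself, which in turn requires a depth-capable device and format.

    ```swift
    import AVFoundation

    // Sketch only: `session` and `photoOutput` are placeholders, not names from
    // the project above. Depth must be enabled at session-configuration time;
    // enabling it per photo afterwards is not enough.
    func configureForDepth(session: AVCaptureSession,
                           photoOutput: AVCapturePhotoOutput) throws {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // Pick a device that can actually produce depth (LiDAR or TrueDepth).
        guard let device = AVCaptureDevice.default(.builtInLiDARDepthCamera,
                                                   for: .video,
                                                   position: .back)
            ?? AVCaptureDevice.default(.builtInTrueDepthCamera,
                                       for: .video,
                                       position: .front) else {
            fatalError("No depth-capable camera on this device")
        }

        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

        // This must be set before requesting depth on AVCapturePhotoSettings;
        // otherwise the per-photo flag raises an exception or yields nil depth.
        if photoOutput.isDepthDataDeliverySupported {
            photoOutput.isDepthDataDeliveryEnabled = true
        }
    }
    ```

With that in place, the depth data (and `cameraCalibrationData` inside it) should arrive in the `AVCapturePhoto` passed to the capture delegate.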
