
ios - AVCapture session freezes when locking/unlocking the iPhone

coder · 2024-01-28 · original post

I'm implementing a scanner feature in my Xamarin.Forms app, using the native iOS AVCaptureSession for it. My problem: if the device is locked while a scan/capture session is active, then after unlocking the device the capture session is frozen, which is strange.

I tried to handle it with UIApplication.DidEnterBackgroundNotification / UIApplication.WillEnterForegroundNotification, where I stop and then restart the capture session. But the freeze still happens.
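For reference, AVFoundation also posts dedicated session-interruption notifications (I experimented with these too — see the commented-out block in `CameraSetup` below). A minimal sketch of observing them, assuming an existing `captureSession` field; the reason values are from `AVCaptureSessionInterruptionReason`, where (as far as I understand) reason 1, VideoDeviceNotAvailableInBackground, is the one fired when the device is locked:

```csharp
// Sketch: observe session interruption/resume instead of app lifecycle events.
// Assumes captureSession, interuptStartNoti, interuptEndNoti fields as in the code below.
interuptStartNoti = AVCaptureSession.Notifications.ObserveWasInterrupted((sender, e) =>
{
    // Reason is an AVCaptureSessionInterruptionReason value (1 = not available in background).
    var reason = e.Notification.UserInfo?
        .ValueForKey(new NSString("AVCaptureSessionInterruptionReasonKey"));
    Console.WriteLine($"Capture session interrupted, reason: {reason}");
});

interuptEndNoti = AVCaptureSession.Notifications.ObserveInterruptionEnded((sender, e) =>
{
    // Restart the existing session rather than rebuilding inputs/outputs.
    Device.BeginInvokeOnMainThread(() => captureSession?.StartRunning());
});
```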

using System;
using AVFoundation;
using CoreFoundation;
using CoreGraphics;
using Foundation;
using UIKit;
using Xamarin.Forms;
using Xamarin.Forms.Platform.iOS;

namespace MarginPointApp.iOS
{
    public class ScannerViewController : UIViewController
    {
        AVCaptureDevice captureDevice;
        AVCaptureVideoPreviewLayer videoPreviewLayer;
        AVCaptureSession captureSession;
        UIView viewfinderView;
        MetadataObjectsDelegate metadataObjectsDelegate;
        public event EventHandler<String> OnScanSuccess;
        UIView redLineCenter;
        UIView overlayView;
        UIView overlay;
        UIView bottomBarView;
        UIButton flashLightButton;
        UIButton cancelButton;
        UIButton cameraButton;
        UILabel bottomTextLabel;
        bool isCameraDismissed;
        NSObject interuptStartNoti, interuptEndNoti;
        AVMetadataObjectType metaTypes = AVMetadataObjectType.Code128Code |
                           AVMetadataObjectType.Code39Code | AVMetadataObjectType.Code39Mod43Code |
                               AVMetadataObjectType.DataMatrixCode | AVMetadataObjectType.EAN13Code | AVMetadataObjectType.EAN8Code |
                           AVMetadataObjectType.Interleaved2of5Code | AVMetadataObjectType.PDF417Code |
                           AVMetadataObjectType.QRCode | AVMetadataObjectType.UPCECode;
        public override void ViewDidLoad()
        {
            base.ViewDidLoad();
            UIView statusBar = UIApplication.SharedApplication.ValueForKey(new NSString("statusBar")) as UIView;
            if (statusBar.RespondsToSelector(new ObjCRuntime.Selector("setBackgroundColor:")))
            {
                statusBar.BackgroundColor = UIColor.Black;
            }

            NavigationItem.Title = "Scanner";
            this.View.BackgroundColor = UIColor.White;

            if (marginpoint.im.iOS.AppDelegate.camPosition)
            {
                captureDevice = GetCameraDevice(AVCaptureDevicePosition.Front);
            }
            else
            {
                captureDevice = GetCameraDevice(AVCaptureDevicePosition.Back);
            }
            CameraSetup();
        }

        public void CameraSetup()
        {

            NSError error = null;
            if (captureDevice != null)
            {
                try
                {
                    var input = new AVCaptureDeviceInput(captureDevice, out error);

                    captureSession = new AVCaptureSession();
                    if (captureSession == null) { return; }
                    if (captureSession.CanAddInput(input))
                        captureSession.AddInput(input);

                    var captureMetadataOutput = new AVCaptureMetadataOutput();
                    if (captureSession.CanAddOutput(captureMetadataOutput))
                    {
                        captureSession.AddOutput(captureMetadataOutput);
                        // captureMetadataOutput.MetadataObjectTypes = captureMetadataOutput.AvailableMetadataObjectTypes;
                        captureMetadataOutput.MetadataObjectTypes = metaTypes;
                    }

                    var metadataQueue = new DispatchQueue("com.AVCam.metadata");
                    metadataObjectsDelegate = new MetadataObjectsDelegate
                    {
                        DidOutputMetadataObjectsAction = DidOutputMetadataObjects
                    };
                    captureMetadataOutput.SetDelegate(metadataObjectsDelegate, metadataQueue);

                    videoPreviewLayer = new AVCaptureVideoPreviewLayer(session: captureSession);
                    videoPreviewLayer.VideoGravity = AVLayerVideoGravity.ResizeAspectFill;
                    videoPreviewLayer.Frame = View.Layer.Bounds;
                    View.Layer.AddSublayer(videoPreviewLayer);
                }
                catch (Exception e)
                {
                    //Console.WriteLine("error device input" + e.ToString());
                }
            }

            // Prepare device for configuration
            captureDevice.LockForConfiguration(out error);
            if (error != null)
            {
                // There has been an issue, abort
                //Console.WriteLine("Error: {0}", error.LocalizedDescription);
                captureDevice.UnlockForConfiguration();
                return;
            }
            addOverlayOnScreen();

            /*
            string reason = string.Empty;           
            if (interuptStartNoti == null)
            {
                interuptStartNoti = AVCaptureSession.Notifications.ObserveWasInterrupted((sender, e) =>
                    {
                        reason = e.Notification.UserInfo?.ValueForKey(new NSString("AVCaptureSessionInterruptionReasonKey"))?.ToString();
                        if (captureSession != null && !reason.Equals(string.Empty) && !reason.Equals("3"))
                        {
                            captureSession.StopRunning();
                            //captureSession.Dispose();
                            //captureSession = null;
                        }                       
                    });
            }

            if (interuptEndNoti == null)
            {
                interuptEndNoti = AVCaptureSession.Notifications.ObserveInterruptionEnded((sender, e) =>
                  {
                      Device.BeginInvokeOnMainThread(() =>
                           {
                               if (marginpoint.im.iOS.AppDelegate.camPosition)
                               {
                                   captureDevice = GetCameraDevice(AVCaptureDevicePosition.Front);
                               }
                               else
                               {
                                   captureDevice = GetCameraDevice(AVCaptureDevicePosition.Back);
                               }
                               if (!reason.Equals(string.Empty) && !reason.Equals("3"))
                               {
                                   //CameraSetup();
                                   captureSession.StartRunning();
                               }
                              });
                  });
            }
            */

        }

        NSObject didEnterBackgroundNoti, willEnterForegroundNoti;
        public override void ViewWillAppear(bool animated)
        {
            base.ViewWillAppear(animated);
            if (didEnterBackgroundNoti == null)
            {
                didEnterBackgroundNoti = NSNotificationCenter.DefaultCenter.AddObserver(UIApplication.DidEnterBackgroundNotification, (obj) =>
                  {
                      Device.BeginInvokeOnMainThread(() =>
                      {
                          AddBlurEffect();
                          captureSession?.StopRunning();
                      });
                  });
            }
            willEnterForegroundNoti = NSNotificationCenter.DefaultCenter.AddObserver(UIApplication.WillEnterForegroundNotification, (obj) =>
            {
                Device.BeginInvokeOnMainThread(() =>
                {
                    RemoveBlurEffect();
                    CameraSetup();
                });
            });
        }

        public override void ViewWillDisappear(bool animated)
        {
            base.ViewWillDisappear(animated);

            NSNotificationCenter.DefaultCenter.RemoveObserver(didEnterBackgroundNoti);
            NSNotificationCenter.DefaultCenter.RemoveObserver(willEnterForegroundNoti);
        }

        private AVCaptureDevice GetCameraDevice(AVCaptureDevicePosition position)
        {
            AVCaptureDevice captureDevice = null;
            if (UIDevice.CurrentDevice.CheckSystemVersion(10, 0))
            {
                captureDevice = AVCaptureDevice.GetDefaultDevice(AVCaptureDeviceType.BuiltInWideAngleCamera, AVMediaType.Video, position);//AVCaptureDevice.GetDefaultDevice(AVMediaTypes.Video);
            }
            else
            {
                var devices = AVCaptureDevice.DevicesWithMediaType(AVMediaType.Video);
                foreach (var device in devices)
                {
                    if (device.Position == position)
                    {
                        captureDevice = device;
                    }
                }
            }
            return captureDevice;
        }

        UIVisualEffectView blurView;
        /// <summary>
        /// Adds the blur effect to camera preview.
        /// </summary>
        void AddBlurEffect()
        {
            if (blurView == null)
            {
                var blur = UIBlurEffect.FromStyle(UIBlurEffectStyle.Light);
                blurView = new UIVisualEffectView(blur);
                blurView.Frame = View.Frame;
                blurView.AutoresizingMask = UIViewAutoresizing.FlexibleHeight | UIViewAutoresizing.FlexibleWidth;
                View.AddSubview(blurView);
            }
        }

        void RemoveBlurEffect()
        {
            if (blurView != null)
            {
                blurView.RemoveFromSuperview();
                blurView = null;
            }
        }

        void addOverlayOnScreen()
        {
            overlayView = new UIView();
            overlayView.Frame = new CGRect(x: 0, y: 0, width: View.Frame.Width, height: View.Frame.Height);
            View.AddSubview(overlayView);
            View.BringSubviewToFront(overlayView);

            var overlayWidth = Application.Current.MainPage.Width * 0.7;
            overlay = new UIView();
            overlay.Layer.BorderColor = UIColor.Green.CGColor;
            overlay.Layer.BorderWidth = 4;
            overlay.Frame = new CGRect(x: View.Center.X - overlayWidth / 2, y: View.Center.Y - overlayWidth / 2, width: overlayWidth, height: overlayWidth);
            overlayView.AddSubview(overlay);
            overlayView.BringSubviewToFront(overlay);


            redLineCenter = new UIView();
            redLineCenter.BackgroundColor = UIColor.Red;
            redLineCenter.Alpha = 0.5f;
            redLineCenter.Frame = new CGRect(x: overlay.Frame.X + 4, y: overlay.Center.Y - 2, width: overlay.Frame.Width - 9, height: 4);
            overlayView.AddSubview(redLineCenter);
            overlayView.BringSubviewToFront(redLineCenter);


            // to find Qr code
            viewfinderView = new UIView();
            viewfinderView.Frame = new CGRect(x: overlay.Frame.X, y: overlay.Center.Y - 50, width: overlay.Frame.Width, height: 100);
            overlayView.AddSubview(viewfinderView);
            overlayView.BringSubviewToFront(viewfinderView);

            bottomBarView = new UIView();
            bottomBarView.BackgroundColor = UIColor.White;
            bottomBarView.Frame = new CGRect(x: 0, y: View.Frame.Height - 50, width: View.Frame.Width, height: 50);
            overlayView.AddSubview(bottomBarView);
            overlayView.BringSubviewToFront(bottomBarView);

            var centerPoint = (bottomBarView.Frame.Top - overlay.Frame.Bottom) / 2 - 15;
            bottomTextLabel = new UILabel
            {
                Frame = new CGRect(x: View.Frame.X, y: overlay.Frame.Bottom + centerPoint, width: View.Frame.Width, height: 30),
                Text = AppResources.ScanAutomatically,
                TextColor = UIColor.White,
                Font = UIFont.FromName("TitilliumWeb-Regular", 17),
                TextAlignment = UITextAlignment.Center
            };

            View.AddSubview(bottomTextLabel);
            View.BringSubviewToFront(bottomTextLabel);

            if (captureDevice.Position == AVCaptureDevicePosition.Back)
            {
                flashLightButton = new UIButton();
                flashLightButton.SetImage(new UIImage(filename: "flash_white_light.png"), UIControlState.Normal);
                flashLightButton.Frame = new CGRect(x: 0, y: 0, width: 50, height: 50);
                bottomBarView.AddSubview(flashLightButton);

                flashLightButton.TouchUpInside += async (object sender, EventArgs e) =>
                {
                    NSError error = null;
                    if (captureDevice == null) return;
                    captureDevice.LockForConfiguration(out error);
                    if (error != null)
                    {
                        captureDevice.UnlockForConfiguration();
                        return;
                    }
                    else
                    {
                        if (!captureDevice.TorchAvailable)
                        {
                            var alert = new UIAlertView
                            {
                                Title = AppResources.MarginPoint,
                                Message = AppResources.CameraFlash
                            };
                            alert.AddButton(AppResources.OkButtonTitle);
                            alert.Show();

                            return;
                        }
                        if (captureDevice.TorchMode != AVCaptureTorchMode.On)
                        {
                            captureDevice.TorchMode = AVCaptureTorchMode.On;
                        }
                        else
                        {
                            captureDevice.TorchMode = AVCaptureTorchMode.Off;
                        }
                        captureDevice.UnlockForConfiguration();
                    }
                };
            }

            string blueColor = "#1273B6";
            cancelButton = new UIButton();
            cancelButton.SetTitleColor(Color.FromHex(blueColor).ToUIColor(), UIControlState.Normal);
            cancelButton.SetTitle(AppResources.PickerCancelLabel, UIControlState.Normal);
            cancelButton.Font = UIFont.FromName("TitilliumWeb-Regular", 18);
            cancelButton.Frame = new CGRect(x: bottomBarView.Center.X - 50, y: 0, width: 100, height: 50);
            bottomBarView.AddSubview(cancelButton);
            cancelButton.TouchUpInside += (object sender, EventArgs e) =>
            {
                if (captureSession != null)
                    captureSession.StopRunning();
                DismissViewController(true, null);
            };

            Device.BeginInvokeOnMainThread(() =>
            {
                if (captureSession != null)
                    captureSession.StartRunning();
            });

            cameraButton = new UIButton();
            cameraButton.SetImage(new UIImage(filename: "camera.png"), UIControlState.Normal);
            cameraButton.Frame = new CGRect(x: bottomBarView.Frame.Width - 50, y: 0, width: 50, height: 50);
            bottomBarView.AddSubview(cameraButton);
            cameraButton.TouchUpInside += (object sender, EventArgs e) =>
            {
                if (captureDevice.Position == AVCaptureDevicePosition.Back)
                {
                    if (captureDevice.TorchAvailable)
                        captureDevice.TorchMode = AVCaptureTorchMode.Off;
                    captureDevice = GetCameraDevice(AVCaptureDevicePosition.Front);
                    marginpoint.im.iOS.AppDelegate.camPosition = true;
                }
                else
                {
                    if (captureDevice.TorchAvailable)
                        captureDevice.TorchMode = AVCaptureTorchMode.Off;
                    captureDevice = GetCameraDevice(AVCaptureDevicePosition.Back);
                    marginpoint.im.iOS.AppDelegate.camPosition = false;
                }

                if (captureDevice != null)
                {
                    try
                    {
                        NSError error;
                        var input = new AVCaptureDeviceInput(captureDevice, out error);

                        captureSession = new AVCaptureSession();
                        if (captureSession == null) { return; }
                        if (captureSession.CanAddInput(input))
                            captureSession.AddInput(input);

                        var captureMetadataOutput = new AVCaptureMetadataOutput();
                        if (captureSession.CanAddOutput(captureMetadataOutput))
                        {
                            captureSession.AddOutput(captureMetadataOutput);
                            //  captureMetadataOutput.MetadataObjectTypes = captureMetadataOutput.AvailableMetadataObjectTypes;

                            captureMetadataOutput.MetadataObjectTypes = metaTypes;
                        }
                        var metadataQueue = new DispatchQueue("com.AVCam.metadata");
                        metadataObjectsDelegate = new MetadataObjectsDelegate
                        {
                            DidOutputMetadataObjectsAction = DidOutputMetadataObjects
                        };
                        captureMetadataOutput.SetDelegate(metadataObjectsDelegate, metadataQueue);

                        captureSession.StartRunning();

                        videoPreviewLayer = new AVCaptureVideoPreviewLayer(session: captureSession);
                        videoPreviewLayer.VideoGravity = AVLayerVideoGravity.ResizeAspectFill;
                        videoPreviewLayer.Frame = View.Layer.Bounds;
                        View.Layer.AddSublayer(videoPreviewLayer);
                    }
                    catch (Exception ex)
                    {
                        // Console.WriteLine("error device input" + ex.ToString());
                    }

                    addOverlayOnScreen();
                }
            };
        }

        public void DidOutputMetadataObjects(AVCaptureOutput captureOutput,
                               AVMetadataObject[] metadataObjects,
                               AVCaptureConnection connection)
        {
            Device.BeginInvokeOnMainThread(() =>
            {
                if (metadataObjects != null && metadataObjects.Length == 0)
                {
                    //codeLabel.Text = "No Data";
                    //displayScanResult(string.Empty);
                    return;
                }

                var metadataObject = metadataObjects[0] as AVMetadataMachineReadableCodeObject;

                if (metadataObject == null) { return; }

                var visualCodeObject = videoPreviewLayer.GetTransformedMetadataObject(metadataObject);
                if (metadataObject.Type == AVMetadataObjectType.QRCode)
                {
                    if (viewfinderView.Frame.Contains(visualCodeObject.Bounds))
                    {
                        captureSession.StopRunning();
                        displayScanResult(metadataObject.StringValue);
                    }

                }
                else
                {
                    captureSession.StopRunning();
                    displayScanResult(metadataObject.StringValue);
                }
            });
        }

        private async void displayScanResult(string metadataObjectVal)
        {
            OnScanSuccess?.Invoke(this, string.IsNullOrWhiteSpace(metadataObjectVal) ? string.Empty : metadataObjectVal as String);
            captureSession.StopRunning();
            DismissViewController(true, null);
            //codeLabel.Text = metadataObject.StringValue;
        }

    }
    class MetadataObjectsDelegate : AVCaptureMetadataOutputObjectsDelegate
    {
        public Action<AVCaptureMetadataOutput, AVMetadataObject[], AVCaptureConnection> DidOutputMetadataObjectsAction;

        public override void DidOutputMetadataObjects(AVCaptureMetadataOutput captureOutput, AVMetadataObject[] metadataObjects, AVCaptureConnection connection)
        {
            if (DidOutputMetadataObjectsAction != null)
                DidOutputMetadataObjectsAction(captureOutput, metadataObjects, connection);
        }
    }
}


Has anyone run into this problem and found a solution?

Accepted answer

Instead of creating a brand-new captureSession every time the user returns to the foreground, try just starting the existing captureSession:

willEnterForegroundNoti = NSNotificationCenter.DefaultCenter.AddObserver(
    UIApplication.WillEnterForegroundNotification, (obj) =>
{
    Device.BeginInvokeOnMainThread(() =>
    {
        RemoveBlurEffect();
        captureSession?.StartRunning();   // note: C# binding is StartRunning, not startRunning
    });
});
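One additional point: `StartRunning()` blocks until capture actually starts (or fails), so Apple's sample code dispatches it to a dedicated serial queue rather than the main thread. A sketch of that variant — the `sessionQueue` field is hypothetical, not part of the code above:

```csharp
// Hypothetical serial queue; route all session start/stop calls through it.
DispatchQueue sessionQueue = new DispatchQueue("com.myapp.sessionQueue");

willEnterForegroundNoti = NSNotificationCenter.DefaultCenter.AddObserver(
    UIApplication.WillEnterForegroundNotification, (obj) =>
{
    // UI work stays on the main thread.
    Device.BeginInvokeOnMainThread(RemoveBlurEffect);

    sessionQueue.DispatchAsync(() =>
    {
        // StartRunning() is blocking, so keep it off the main thread.
        if (captureSession != null && !captureSession.Running)
            captureSession.StartRunning();
    });
});
```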

Regarding "ios - AVCapture session freezes when locking/unlocking the iPhone", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/54684301/
