Saturday, February 26, 2022

ARKit (SceneKit) Face Texture Mapping

  1. Modify the texture "wireframeTexture.png" in the Tracking and Visualizing Faces sample.
  2. Assign the texture to the ARSCNFaceGeometry (see the sketch after this list).

    faceMesh?.firstMaterial?.diffuse.contents = UIImage(named: "wireframeTexture")
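
    A minimal sketch of step 2, assuming sceneView is the ARSCNView outlet and the geometry is created in the ARSCNViewDelegate nodeFor callback (the same callback used in the tracking notes below):

    import ARKit
    import SceneKit
    import UIKit

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor, let device = sceneView.device else { return nil }
        let faceMesh = ARSCNFaceGeometry(device: device)
        // Apply the edited texture from the asset catalog to the face mesh.
        faceMesh?.firstMaterial?.diffuse.contents = UIImage(named: "wireframeTexture")
        faceMesh?.firstMaterial?.lightingModel = .physicallyBased
        return SCNNode(geometry: faceMesh)
    }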

ARKit (SceneKit) Facial Expression Tracking

  1. Create the face-tracking session

    // Create a session configuration
    let configuration = ARFaceTrackingConfiguration()

    // Run the view's session
    sceneView.session.run(configuration)

  2. Create the face model

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        if anchor is ARFaceAnchor {
            // Build the face geometry on the view's Metal device.
            let faceMesh = ARSCNFaceGeometry(device: sceneView.device!)
            faceMesh?.firstMaterial?.lightingModel = .physicallyBased
            let node = SCNNode(geometry: faceMesh)
            return node
        } else {
            return nil
        }
    }

  3. Update the facial expression

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        if let faceAnchor = anchor as? ARFaceAnchor, let faceGeometry = node.geometry as? ARSCNFaceGeometry {
            // Push the latest face mesh from the anchor into the node's geometry.
            faceGeometry.update(from: faceAnchor.geometry)
        }
    }
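
    Besides updating the mesh, the same ARFaceAnchor exposes blend-shape coefficients for individual expressions (eye blink, jaw open, and so on), which is what expression tracking usually consumes. A minimal sketch, assuming a hypothetical helper called from the didUpdate callback in step 3:

    import ARKit

    // Hypothetical helper: read a few blend-shape coefficients (0 = neutral, 1 = fully expressed).
    func logExpressions(for faceAnchor: ARFaceAnchor) {
        let blendShapes = faceAnchor.blendShapes
        let blinkLeft  = blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let blinkRight = blendShapes[.eyeBlinkRight]?.floatValue ?? 0
        let jawOpen    = blendShapes[.jawOpen]?.floatValue ?? 0
        print("blink L/R: \(blinkLeft) \(blinkRight), jaw open: \(jawOpen)")
    }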

Friday, February 11, 2022

RealityKit: Detecting a Finger Touching an Object

  1. In Reality Composer, edit the object's Properties and check Physics > Participates.
  2. Grab the CVPixelBuffer from ARFrame.capturedImage

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let arView = arView else { return }
        // Do not enqueue other buffers for processing while another Vision task is still running.
        // The camera stream has only a finite amount of buffers available; holding too many buffers for analysis would starve the camera.
        guard currentBuffer == nil, case .normal = frame.camera.trackingState else {
            return
        }
        // Retain the image buffer for Vision processing.
        currentBuffer = frame.capturedImage
    }

  3. Use Vision to detect the index finger

    let requestHandler = VNImageRequestHandler(cvPixelBuffer: currentBuffer, orientation: orientation)
    visionQueue.async {
        do {
            // Release the buffer when done so the next frame can be processed.
            defer { self.currentBuffer = nil }
            // Perform VNDetectHumanHandPoseRequest
            try requestHandler.perform([self.handPoseRequest])
            // Continue only when a hand was detected in the frame.
            // Since we set the maximumHandCount property of the request to 1, there will be at most one observation.
            guard let observation = self.handPoseRequest.results?.first else {
                return
            }
            // Get points for the index finger.
            let indexFingerPoints = try observation.recognizedPoints(.indexFinger)
            // Look for the tip point.
            guard let indexTipPoint = indexFingerPoints[.indexTip] else {
                return
            }
            // Ignore low-confidence points.
            guard indexTipPoint.confidence > 0.3 else {
                return
            }
        } catch {
            print("Error: Vision request failed with error \"\(error)\"")
        }
    }

  4. Convert the Vision coordinates to ARView coordinates

    // Convert points from Vision coordinates (normalized, origin at lower left)
    // to ARView coordinates, where width and height are the arView's size.
    let point = indexTipPoint.location
    let tip = CGPoint(x: point.x * width, y: (1 - point.y) * height)
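
    The snippets in steps 2 and 3 assume a few properties on the view controller (currentBuffer, visionQueue, handPoseRequest) that are not shown above. A minimal sketch of those pieces plus the final hit test, with hypothetical names, where the converted tip point is used to look up the touched entity via arView.entity(at:):

    import ARKit
    import RealityKit
    import UIKit
    import Vision

    class ViewController: UIViewController, ARSessionDelegate {

        @IBOutlet var arView: ARView!

        // Pending camera frame; cleared after each Vision pass (see the defer in step 3).
        var currentBuffer: CVPixelBuffer?

        // Serial queue so only one Vision request runs at a time.
        let visionQueue = DispatchQueue(label: "com.example.handPoseQueue")

        // Detect at most one hand per frame.
        lazy var handPoseRequest: VNDetectHumanHandPoseRequest = {
            let request = VNDetectHumanHandPoseRequest()
            request.maximumHandCount = 1
            return request
        }()

        // Call on the main thread with the converted tip point from step 4.
        func checkTouch(at tip: CGPoint) {
            if let entity = arView.entity(at: tip) {
                // entity is the object the index finger is over.
                print("Finger over entity: \(entity.name)")
            }
        }
    }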


sample code

Thursday, February 10, 2022

RealityKit: Detecting a Tap on an Object, Method 2

  1. In Reality Composer, edit the object's Properties and check Physics > Participates.
  2. Add a tap gesture recognizer to the arView

    arView.addGestureRecognizer(
        UITapGestureRecognizer(
            target: xxxx,
            action: #selector(XXXX.handleTap(recognizer:))
        )
    )

  3. Hit-test the tap location to find the tapped entity

    @objc func handleTap(recognizer: UITapGestureRecognizer) {
        let tapLocation = recognizer.location(in: arView)
        let entity = arView.entity(at: tapLocation)
        guard entity == boxAnchor?.steelBox else { return }
        // entity is the tapped object
    }
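
    Putting steps 2 and 3 together, a minimal sketch assuming the default Xcode RealityKit template naming (Experience.rcproject with a Box scene whose steelBox entity has Physics > Participates enabled) and self as the gesture target:

    import RealityKit
    import UIKit

    class ViewController: UIViewController {

        @IBOutlet var arView: ARView!
        var boxAnchor: Experience.Box?

        override func viewDidLoad() {
            super.viewDidLoad()
            // Load the Reality Composer scene; entity(at:) only hits entities
            // that participate in physics (i.e. have collision shapes).
            boxAnchor = try? Experience.loadBox()
            if let boxAnchor = boxAnchor {
                arView.scene.anchors.append(boxAnchor)
            }
            arView.addGestureRecognizer(
                UITapGestureRecognizer(target: self, action: #selector(handleTap(recognizer:)))
            )
        }

        @objc func handleTap(recognizer: UITapGestureRecognizer) {
            let tapLocation = recognizer.location(in: arView)
            guard let entity = arView.entity(at: tapLocation),
                  entity == boxAnchor?.steelBox else { return }
            // entity is the tapped object
            print("Tapped the steel box")
        }
    }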

RealityKit: Detecting a Tap on an Object, Method 1

 Edit the object's behaviors directly in Reality Composer

  1. Select the object
  2. Show Behaviors (⌥⌘B)
  3. Add the "Tap & Flip" behavior