Wednesday, June 15, 2022

Swift Decodable: using a default value when a key is missing

When using JSONDecoder, a missing key produces an error like the following:

▿ DecodingError
  ▿ keyNotFound : 2 elements
    - .0 : CodingKeys(stringValue: "b", intValue: nil)
    ▿ .1 : Context
      - codingPath : 0 elements
      - debugDescription : "No value associated with key CodingKeys(stringValue: \"b\", intValue: nil) (\"b\")."
      - underlyingError : nil

This can be solved by adding an overload of KeyedDecodingContainer's decode(_:forKey:):

protocol Init {
    init()
}

extension KeyedDecodingContainer {
    // For types that are both Codable and Init, this more-constrained overload
    // is picked by the synthesized Decodable code; it falls back to the default
    // value instead of throwing when the key is missing.
    func decode<T: Codable & Init>(_ type: T.Type,
                                   forKey key: Key) throws -> T {
        try decodeIfPresent(type, forKey: key) ?? .init()
    }
}

After that, just add the Init protocol alongside Codable:

enum B: Int, Codable {
    case one = 1
    case two = 2
}

extension B: Init {
    init() {
        self = .one
    }
}
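
A minimal usage sketch (the struct A and the JSON literal here are hypothetical, not from the original post):

    import Foundation

    struct A: Codable {
        let a: Int
        let b: B   // resolved by the decode(_:forKey:) overload above
    }

    // "b" is missing from the JSON, but decoding no longer throws keyNotFound:
    let json = Data(#"{"a": 1}"#.utf8)
    let decoded = try! JSONDecoder().decode(A.self, from: json)
    print(decoded.b)   // .one, the default from B's Init conformance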

Thursday, March 17, 2022

Reading and writing NTAG with Swift


  1. In Info.plist, add Privacy - NFC Scan Usage Description
  2. Under Capabilities, add Near Field Communication Tag Reading
  3. Scan

    func startScan() {
        let session = NFCNDEFReaderSession(delegate: self, queue: nil, invalidateAfterFirstRead: false)
        session.alertMessage = "Hold your iPhone near an NFC transit card."
        session.begin()
    }

  4. Connect and update the payload

    func readerSession(_ session: NFCNDEFReaderSession, didDetect tags: [NFCNDEFTag]) {
        // The SMS body (Taiwan's 1922 contact-tracing message) is kept verbatim;
        // it becomes the URI payload written to the tag.
        guard
            let sms = "場所代碼:111111111111111 本次實聯簡訊限防疫目的使用。".addingPercentEncoding(withAllowedCharacters: .urlHostAllowed),
            let payload = NFCNDEFPayload.wellKnownTypeURIPayload(string: "sms:1922&body=\(sms)"),
            let tag = tags.first
        else {
            session.invalidate(errorMessage: "Could not process tag.")
            return
        }
        session.connect(to: tag) { error in
            guard error == nil else {
                session.invalidate(errorMessage: "Could not connect to tag.")
                return
            }
            tag.queryNDEFStatus { status, capacity, error in
                guard error == nil else {
                    session.invalidate(errorMessage: "Could not query status of tag.")
                    return
                }
                switch status {
                case .notSupported:
                    session.invalidate(errorMessage: "Tag is not supported.")
                case .readOnly:
                    session.invalidate(errorMessage: "Tag is only readable.")
                case .readWrite:
                    let message = NFCNDEFMessage(records: [payload])
                    tag.writeNDEF(message) { error in
                        if error != nil {
                            session.invalidate(errorMessage: "Failed to write message.")
                        } else {
                            session.alertMessage = "Successfully configured tag."
                            session.invalidate()
                        }
                    }
                @unknown default:
                    session.invalidate(errorMessage: "Unknown status of tag.")
                }
            }
        }
    }
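
The title says reading and writing, but the post only shows a write. A minimal read sketch (assumed, not in the original): inside the same queryNDEFStatus callback, the tag's current NDEF message could be read back like this:

    tag.readNDEF { message, error in
        guard error == nil, let message = message else {
            session.invalidate(errorMessage: "Failed to read message.")
            return
        }
        for record in message.records {
            // wellKnownTypeURIPayload() decodes a well-known URI record back into a URL
            print(record.wellKnownTypeURIPayload()?.absoluteString ?? "non-URI record")
        }
    }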


PS: If Near Field Communication Tag Reading doesn't show up, switch your developer account to a paid one.

Saturday, March 5, 2022

ARKit (SpriteKit) face orientation tracking


  1. Create the 😘 label. Because the front camera is mirrored left-to-right, the X axis must be flipped.

    let labelNode = SKLabelNode(text: "😘")
    labelNode.horizontalAlignmentMode = .center
    labelNode.verticalAlignmentMode = .center
    labelNode.xScale = -1

  2. Put the 😘 inside an SKTransformNode for the subsequent 3D transforms.

    let face = SKTransformNode()
    face.addChild(labelNode)
    self.face = face

  3. On each update, rotate the SKTransformNode to match the ARFaceAnchor (a sketch of the rotate helper follows this list).

    let rotate = faceAnchor.transform.rotate   // rotate: a helper from the full code; see the sketch below
    face?.setRotationMatrix(rotate)

  4. Finally, remember to flip the node back, or the texture will appear mirrored.
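
rotate in step 3 is not a built-in member of simd_float4x4, so it is presumably a helper defined in the full code. A sketch of what it likely does, extracting the upper-left 3x3 rotation from the anchor's 4x4 transform:

    import simd

    extension simd_float4x4 {
        var rotate: simd_float3x3 {
            // Drop the translation column, keeping only the 3x3 rotation part.
            simd_float3x3(
                SIMD3(columns.0.x, columns.0.y, columns.0.z),
                SIMD3(columns.1.x, columns.1.y, columns.1.z),
                SIMD3(columns.2.x, columns.2.y, columns.2.z)
            )
        }
    }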

Full code

Saturday, February 26, 2022

ARKit (SceneKit) face texturing

  1. Modify the texture "wireframeTexture.png" from Apple's Tracking and Visualizing Faces sample
  2. Assign the texture to the ARSCNFaceGeometry

    // faceMesh is the ARSCNFaceGeometry created in renderer(_:nodeFor:), as in the next section
    faceMesh?.firstMaterial?.diffuse.contents = UIImage(named: "wireframeTexture")

ARKit (SceneKit) facial expression tracking

  1. Set up face tracking

    // Create a session configuration
    let configuration = ARFaceTrackingConfiguration()
    // Run the view's session
    sceneView.session.run(configuration)

  2. Create the face model

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        if anchor is ARFaceAnchor {
            let faceMesh = ARSCNFaceGeometry(device: sceneView.device!)
            faceMesh?.firstMaterial?.lightingModel = .physicallyBased
            let node = SCNNode(geometry: faceMesh)
            return node
        } else {
            return nil
        }
    }

  3. Update the facial expression

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        if let faceAnchor = anchor as? ARFaceAnchor, let faceGeometry = node.geometry as? ARSCNFaceGeometry {
            faceGeometry.update(from: faceAnchor.geometry)
        }
    }
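
For the two renderer callbacks above to fire, the view controller must be set as the ARSCNView's delegate. A typical setup in the view controller, assumed here to follow the standard ARKit template:

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self   // required for renderer(_:nodeFor:) and renderer(_:didUpdate:for:)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        sceneView.session.run(ARFaceTrackingConfiguration())
    }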

Friday, February 11, 2022

RealityKit: detecting a finger touching an object

  1. In Reality Composer, edit the object's Properties and check Participates under Physics
  2. Retain the CVPixelBuffer from ARFrame.capturedImage

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let arView = arView else { return }
        // Do not enqueue other buffers for processing while another Vision task is still running.
        // The camera stream has only a finite amount of buffers available; holding too many buffers for analysis would starve the camera.
        guard currentBuffer == nil, case .normal = frame.camera.trackingState else {
            return
        }
        // Retain the image buffer for Vision processing.
        // (currentBuffer is a stored property; see the sketch after this list.)
        currentBuffer = frame.capturedImage
    }

  3. Use Vision to detect the index finger

    let requestHandler = VNImageRequestHandler(cvPixelBuffer: currentBuffer, orientation: orientation)
    visionQueue.async {
        do {
            // Perform VNDetectHumanHandPoseRequest
            try requestHandler.perform([self.handPoseRequest])
            // Continue only when a hand was detected in the frame.
            // Since we set the maximumHandCount property of the request to 1, there will be at most one observation.
            guard let observation = self.handPoseRequest.results?.first else {
                return
            }
            // Get points for index finger.
            let indexFingerPoints = try observation.recognizedPoints(.indexFinger)
            // Look for tip points.
            guard let indexTipPoint = indexFingerPoints[.indexTip] else {
                return
            }
            // Ignore low confidence points.
            guard indexTipPoint.confidence > 0.3 else {
                return
            }
            // ... hand indexTipPoint off to step 4
        } catch {
            print("Error: Vision request failed with error \"\(error)\"")
        }
    }

  4. Convert the Vision coordinates to ARView coordinates (a sketch tying the pieces together follows this list)

    // Convert points from Vision coordinates to ARView coordinates.
    // (width and height are assumed to be the dimensions of the target view.)
    let point = indexTipPoint.location
    let tip = CGPoint(x: point.x * width, y: (1 - point.y) * height)
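
Steps 2 and 3 reference a few properties the post never declares (currentBuffer, visionQueue, handPoseRequest), and the code stops just before the actual touch test. A sketch of those assumed pieces inside the same class, plus one hypothetical way to finish by hit-testing the converted point:

    // Assumed stored properties used by the snippets above:
    var currentBuffer: CVPixelBuffer?
    let visionQueue = DispatchQueue(label: "visionQueue")   // serial queue for Vision work
    lazy var handPoseRequest: VNDetectHumanHandPoseRequest = {
        let request = VNDetectHumanHandPoseRequest()
        request.maximumHandCount = 1   // step 3's comment relies on this
        return request
    }()

    // Hypothetical final step: hit-test the point from step 4, as in method 2 below.
    func handleFingerTip(_ tip: CGPoint) {
        if let entity = arView?.entity(at: tip) {
            // The fingertip is over this entity.
            print("Touched \(entity.name)")
        }
    }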


sample code

Thursday, February 10, 2022

RealityKit: detecting object taps, method 2

  1. In Reality Composer, edit the object's Properties and check Participates under Physics
  2. Add a tap gesture recognizer to the arView

    arView.addGestureRecognizer(
        UITapGestureRecognizer(
            target: xxxx,
            action: #selector(XXXX.handleTap(recognizer:))
        )
    )


  3. Handle the tap

    @objc func handleTap(recognizer: UITapGestureRecognizer) {
        let tapLocation = recognizer.location(in: arView)
        let entity = arView.entity(at: tapLocation)
        guard entity == boxAnchor?.steelBox else { return }
        // entity is the tapped object
    }
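
Note that arView.entity(at:) only finds entities that have collision shapes, which is exactly what step 1's Physics checkbox provides. If the entity were built in code rather than in Reality Composer, the shapes could be generated manually (a hypothetical sketch):

    // An entity created in code needs collision shapes before
    // arView.entity(at:) can hit-test it.
    let box = ModelEntity(mesh: .generateBox(size: 0.1))
    box.generateCollisionShapes(recursive: true)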

RealityKit: detecting object taps, method 1

Edit the object's behaviors directly in Reality Composer

  1. Select the object
  2. Show Behaviors (⌥⌘B)
  3. Add the "Tap & Flip" behavior