AtelierClockwork

WWDC23 Week 4

July 2, 2023

Week 4 Progress

Progress report: 126 of 177 videos watched and summarized, or 71.2%. I managed to keep a slightly higher average of 4.14 videos a day compared to last week. More of the features in these sessions are starting to fall into the “things to keep an eye on for future projects” bucket than things I’ll use day to day, but it’s good to have a mental index of what’s out there.

Build robust and resumable file transfers

This session covers how to use the existing capabilities to resume a failed download, and also how to use the new resumable uploads. Most of the support is built into URLSession, and resumable uploads require a server that supports them, since they use a brand new specification.
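
As a rough sketch of what the existing download-resume flow looks like (the URL handling and storage here are placeholders, not the session’s example):

```swift
import Foundation

// Minimal sketch of pausing and resuming a download with URLSession.
final class DownloadResumer: NSObject {
    private let session = URLSession(configuration: .default)
    private var task: URLSessionDownloadTask?
    private var resumeData: Data?

    func start(url: URL) {
        task = session.downloadTask(with: url) { [weak self] location, _, error in
            if let error = error as NSError?,
               let data = error.userInfo[NSURLSessionDownloadTaskResumeData] as? Data {
                // Keep the resume data so a failed transfer can pick up where it left off.
                self?.resumeData = data
            }
            // On success, move `location` somewhere permanent here.
        }
        task?.resume()
    }

    func pause() {
        task?.cancel { [weak self] data in
            self?.resumeData = data
        }
    }

    func resume() {
        guard let data = resumeData else { return }
        // Continues the earlier transfer instead of starting over from byte zero.
        task = session.downloadTask(withResumeData: data) { _, _, _ in }
        task?.resume()
    }
}
```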

Design dynamic Live Activities

This session covers many of the considerations for crafting a Live Activity that fits well into the Dynamic Island, and also maintains the personality of your app when placed in that space.

Design considerations for vision and motion

This session explains how to design for visionOS in a way that won’t make the user uncomfortable. It focuses on how to avoid eyestrain by sending the right messages about depth to the user, and understanding which eye movements take more energy. It also covers how to move elements on screen to avoid triggering motion sickness.

Keep up with the keyboard

This session explains the changes coming as the keyboard moves out of process from the currently running app and more of the operations it performs become asynchronous. It also shows off the improvements to the keyboard layout guide, and how to handle the keyboard in contexts like Stage Manager where the app isn’t full screen.
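
A minimal sketch of using the keyboard layout guide, assuming a simple UIKit view controller with a text field pinned above whatever keyboard is on screen:

```swift
import UIKit

final class ComposeViewController: UIViewController {
    private let textField = UITextField()

    override func viewDidLoad() {
        super.viewDidLoad()
        textField.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(textField)

        // Let the guide track the keyboard even when it is undocked or floating,
        // which matters in Stage Manager-style layouts.
        view.keyboardLayoutGuide.followsUndockedKeyboard = true

        NSLayoutConstraint.activate([
            textField.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 16),
            textField.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -16),
            // Keeps the field just above whatever keyboard is currently showing.
            textField.bottomAnchor.constraint(equalTo: view.keyboardLayoutGuide.topAnchor, constant: -8)
        ])
    }
}
```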

Create a great ShazamKit experience

This session goes over improvements to ShazamKit, in particular the new managed session for audio recognition, which makes the setup for in-app use much easier.
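
Assuming the SHManagedSession API shape from the session, a rough sketch of a one-shot match where ShazamKit handles the microphone capture itself:

```swift
import ShazamKit

// The managed session records and matches in one call, so there is no
// audio engine setup in app code.
func identifyCurrentSong() async {
    let session = SHManagedSession()

    let result = await session.result()
    switch result {
    case .match(let match):
        print("Matched:", match.mediaItems.first?.title ?? "Unknown title")
    case .noMatch:
        print("No match found")
    case .error(let error, _):
        print("Match failed:", error)
    }
}
```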

What’s new in VisionKit

Lift subjects from images in your app

These sessions show off the improvements in VisionKit, in particular the ability to recognize and lift subjects out of images. Given how hard it used to be to do this manually, getting a decent mask of the subject in an image for free is quite impressive.
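
A sketch of turning subject lifting on for an image view with ImageAnalysisInteraction; the configuration options here are my assumption of a reasonable setup rather than the session’s exact recipe:

```swift
import UIKit
import VisionKit

@MainActor
func enableSubjectLifting(on imageView: UIImageView, image: UIImage) async throws {
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)

    let analyzer = ImageAnalyzer()
    let analysis = try await analyzer.analyze(
        image,
        configuration: ImageAnalyzer.Configuration([.visualLookUp])
    )

    interaction.analysis = analysis
    // .imageSubject lets the user lift the subject out of the photo,
    // the same interaction the system offers in Photos.
    interaction.preferredInteractionTypes = [.imageSubject]
}
```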

Integrate your media app with HomePod

This session explains how to integrate your media app with HomePod, including how commands are routed to the phone, which helps in understanding potential issues with that handoff process.

What’s new in StoreKit 2 and StoreKit Testing in Xcode

Explore testing in-app purchases

What’s new in App Store server APIs

Meet the App Store Server Library

These sessions all cover how to integrate with the App Store, both on the app side and the server side. It’s nice to see that there are more tools for testing, and that there’s a library to help with server implementation that’s available in Swift, Node, and Python.
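
As a reminder of how small the StoreKit 2 purchase path is on the app side (the product identifier below is a placeholder):

```swift
import StoreKit

func buyPremium() async throws {
    guard let product = try await Product.products(for: ["com.example.premium"]).first else { return }

    let result = try await product.purchase()
    switch result {
    case .success(let verification):
        // StoreKit 2 delivers a signed transaction; unwrap the verification result.
        let transaction = try verification.payloadValue
        // Grant the entitlement here, then tell the App Store the content was delivered.
        await transaction.finish()
    case .userCancelled, .pending:
        break
    @unknown default:
        break
    }
}
```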

Explore enhancements to App Intents

This session shows how to provide more data to App Intents, and how to use new features like naming the data produced at the end of a step in Shortcuts, which makes it easier to reason about the flow of data.
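
A small, made-up example of an intent that returns a value, so the next step in Shortcuts has a named piece of output to work with:

```swift
import AppIntents

struct MakeGreetingIntent: AppIntent {
    static var title: LocalizedStringResource = "Make Greeting"

    @Parameter(title: "Name")
    var name: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // The returned value shows up as this intent's output in Shortcuts.
        return .result(value: "Hello, \(name)!")
    }
}
```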

What’s new in App Clips

This session goes over improvements to App Clips. The interesting news is that the size limit is now 50 MB for digital invocations, but still 15 MB for physical invocations. There are now also default App Clip links hosted on an Apple domain, so you can create an App Clip link without having to perform all of the setup work on a live domain.

Meet Core Location for spatial computing

This is a session explaining how Core Location works on visionOS. It only supports while-in-use location data, and “in use” means the app is being looked at or is in the user’s peripheral vision, which is interesting.

What’s new in Wallet and Apple Pay

This covers all of the new features in Wallet and Apple Pay. It’s interesting to see how many categories Apple Pay is moving into, including allowing transfers of funds. There are also improvements to order tracking that integrate more deeply into the system. The ID information is also interesting, but is of limited use to me until there is digital ID support in the EU.

What’s new in Core Motion

This session covers new Core Motion features, most of which are focused on watchOS. Newer watches can capture high frequency motion data, and also data related to water depth and temperature.

Update Live Activities with push notifications

This session explains how to use push notifications to update Live Activities, including important concepts like how different priority levels of notifications are handled and how the system may throttle updates.
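
A sketch of the app-side setup, with an invented DeliveryAttributes type for illustration: the key part is requesting a push token for the activity and forwarding it to your server.

```swift
import ActivityKit

// Hypothetical attributes type, just for this example.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var eta: Date
    }
    var orderNumber: String
}

func startPushUpdatedActivity(attributes: DeliveryAttributes,
                              initialState: DeliveryAttributes.ContentState) throws {
    // pushType: .token asks the system for a token this activity can be updated with.
    let activity = try Activity<DeliveryAttributes>.request(
        attributes: attributes,
        content: .init(state: initialState, staleDate: nil),
        pushType: .token
    )

    Task {
        // Tokens can change over time, so keep listening and re-register as needed.
        for await tokenData in activity.pushTokenUpdates {
            let token = tokenData.map { String(format: "%02x", $0) }.joined()
            print("Register this Live Activity push token with the server:", token)
        }
    }
}
```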

What’s new in Background Assets

This session describes improvements to Background Assets. Among other things, it covers how to mark assets as “essential” so that they attempt to download before the app is first launched, while still allowing the app to launch without those assets if the download fails.

Discover streamlined location updates

Meet Core Location Monitor

These sessions show off the improvements to Core Location, in particular a new API for subscribing to updates that delivers them as an AsyncSequence and aims to simplify the process of monitoring location, including more robust background tracking.
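
A sketch of the new AsyncSequence flavor of location updates; the logging is obviously a placeholder:

```swift
import CoreLocation

func streamLocations() async {
    do {
        for try await update in CLLocationUpdate.liveUpdates() {
            if let location = update.location {
                print("Moved to:", location.coordinate)
            }
            // The update also reports when the device is stationary,
            // so the sequence can quiesce instead of burning battery.
            if update.isStationary {
                print("Device is stationary")
            }
        }
    } catch {
        print("Location stream ended:", error)
    }
}
```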

Meet Assistive Access

This session describes how Assistive Access works, and what you can do to improve support for it in your app by adding a configuration key that lets the system know your app has a layout that will work with the large back button that’s always on screen in Assistive Access.

Extend Speech Synthesis with personal and custom voices

This session shows how to add annotations to better control speech synthesis, and also what you need to do to request permission from the user to use a personal voice.
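
A sketch of the permission-plus-speech flow, assuming the Personal Voice authorization API shape from the session; the fallback voice and wrapper class are just illustrative:

```swift
import AVFoundation

final class PersonalVoiceSpeaker {
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [synthesizer] status in
            guard status == .authorized else { return }

            // Personal voices appear alongside the system voices once authorized.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }

            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = personalVoice ?? AVSpeechSynthesisVoice(language: "en-US")
            synthesizer.speak(utterance)
        }
    }
}
```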

Create accessible spatial experiences

This session goes over the new accessibility needs that are specific to visionOS, including how to support app hand gestures, and how to integrate RealityKit content into the accessibility system.
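
As a rough sketch, assuming the AccessibilityComponent API shape shown in the session, making a RealityKit entity visible to the accessibility system might look something like this (the label is a placeholder):

```swift
import RealityKit

func makeAccessible(_ entity: Entity) {
    var accessibility = AccessibilityComponent()
    // Expose this entity to VoiceOver with a descriptive label.
    accessibility.isAccessibilityElement = true
    accessibility.label = "Cloud"
    entity.components.set(accessibility)
}
```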

Build accessible apps with SwiftUI and UIKit

Lots of detail about basic accessibility features; the most interesting detail in the session was the ability to use closures to set UIKit accessibility values so that they’re evaluated on read and are therefore always up to date.
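
A sketch of what that closure-based approach looks like, assuming the new block setters on UIKit views; the zoom control and wording are invented for illustration:

```swift
import UIKit

final class ZoomControl: UIView {
    private var zoomLevel = 100

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Zoom"

        // Evaluated every time VoiceOver reads the value, so it never goes stale.
        accessibilityValueBlock = { [weak self] in
            guard let self else { return nil }
            return "\(self.zoomLevel) percent"
        }
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```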

What’s new in App Store pre-orders

What’s new in App Store pricing

Explore App Store Connect for spatial computing

Not much of interest here. Having per-region pre-orders is nice, and the sessions went over the extended pricing options that were previously announced, doing a nice job of explaining how to manage the different price points.

Modeling Progress

Evangelion Unit 0 is close to complete; the last rack of parts is being painted and waiting for assembly. After the kit is assembled, there’s a bunch of stickers + weathering to do to make the kit look finished. Plus I need to figure out what to do for a diorama for the kit.

Assembled, Front
Assembled, Side