Explore Packages and Projects with Xcode Playgrounds
Xcode Playgrounds helps developers explore Swift and framework APIs and provides a scratchpad for rapid experimentation. Learn how Xcode Playgrounds utilizes Xcode's modern build system, provides improved support for resources, and integrates into your projects, frameworks, and Swift packages to improve your documentation and development workflow.
Hello and welcome to WWDC.
Welcome to "Explore Packages and Projects with Xcode Playgrounds." I'm Chris Miles, an engineering manager on the Xcode team.
Developers love using Xcode Playgrounds to quickly try out new ideas and to learn and explore APIs and frameworks. Today, I'm excited to tell you about enhancements we've made to Xcode 12 to make Playgrounds work seamlessly with Swift packages and frameworks in your projects. In Xcode 12, Playgrounds is fully integrated with Xcode's modern build system. That integration brings with it a lot of improvements to how Playgrounds works with packages, projects and resources.
First, I'll show you how Playgrounds now works well in Swift packages and makes a great option for package documentation containing runnable code.
Then I'm going to demonstrate improvements for Playgrounds importing project targets, like frameworks and package dependencies.
In the final part of the presentation, I'm going to demonstrate working with resources that need build support, like ML models and asset catalogs, in a playground.
So, let's get started by looking at how Xcode Playgrounds works with Swift packages.
Here I have a Swift package checked out called NutritionFacts. I'm gonna open it in Xcode by double-clicking "Package.swift." NutritionFacts provides accurate nutritional information for many common types of food. It's a really handy package to use in recipe and food apps.
What's great about NutritionFacts is the authors have included rich documentation explaining how to use the API, and, even better, the documentation was written as a playground. Playgrounds are great for documentation, and they now work seamlessly with packages in Xcode 12. So, let's take a look.
In the navigator, I'm gonna open the Playgrounds folder and select the NutritionFacts playground. Here, we see a description of the package followed by detailed documentation about how to use the API, along with example code snippets. And being a playground, we can run that code. Let's do that now by hitting "execute" in the toolbar.
The playground will execute, and on the right, we see a live view.
That is a view provided by the package. It shows NutritionFacts in a familiar format.
In the editor, we see an example of looking up an item by identifier and fetching nutritional information about it. The playground helps us understand the code by showing the result of each statement in the results bar on the right.
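A minimal sketch of what such runnable documentation code might look like, assuming hypothetical NutritionFacts names such as FoodItem, lookup(identifier:), and nutritionFact (the package's real API may differ):

import NutritionFacts

// Hypothetical API: look up an item by its identifier...
let banana = FoodItem.lookup(identifier: "banana")

// ...then fetch nutritional information about it. Each statement's
// value appears in the playground's results bar as it executes.
let fact = banana?.nutritionFact
let calories = fact?.calories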
As you can see, Playgrounds in Xcode 12 work seamlessly with packages. This was made possible because of Playgrounds' integration with the modern build system, which knows how to build packages and make them available for importing into playground code. I recommend making use of a playground the next time you need to write documentation or a tutorial for your Swift package. Don't forget to take advantage of the fact that users can run the example code and see live results in the playground. Next, I'd like to show you how Xcode 12 makes it easier for Playgrounds to use framework and package dependencies from a project.
I'm gonna use a project called Fruta. Let's open it in Xcode and run it in the simulator. Fruta is an iOS and Mac app for browsing and ordering smoothies. You can even purchase smoothie recipes with Apple Pay so you can make your own smoothies at home.
Let's take a look at the app in the simulator. We see a list of some of the smoothies available, and if we select one, we can see an option to buy it with Apple Pay. Scrolling up, we see some of the ingredients that are in the smoothie. Let's select one to see a detailed image, and tap the info button to see nutritional facts about that ingredient. Now, you may recognize this as the view provided by the NutritionFacts package. This project has a dependency on that package, as you can see on the left in the navigator.
Even as a project dependency, any playgrounds in that package will still be available. Let's take a look by opening the package and opening the NutritionFacts playground that we opened earlier. Here, we see the same content, and, like before, we can execute the playground to see the results.
So, playgrounds in packages are available whether you open the package directly, like we did at the start, or whether it's part of your project as a package dependency, like it is here. Just remember that, like all files in a package dependency, the playground will be read-only.
This project also contains a framework called UtilityViews.
Let's take a look at the project editor, where we can see the UtilityViews framework target. It is now easier than before to use your frameworks in Playgrounds, as long as they are in the same workspace. Let me show you an example.
In the navigator, I'm gonna open the Playgrounds folder, and we see a playground called SmoothieLab. Let's open SmoothieLab. Notice some build activity at the top.
This was due to a new playground option in Xcode 12 called Build Active Scheme. Let me show you by opening the inspector, where we see the new option at the bottom of the playground settings. When "Build Active Scheme" is enabled, it tells Xcode to automatically build all targets in the currently selected scheme and make any modules importable to the playground.
So, that's why when we opened the playground, Xcode built all of the targets for the currently selected scheme, and the playground could import the UtilityViews framework and the NutritionFacts package. SmoothieLab is a utility playground that we use to design new smoothies. We can select some raw ingredients, combine them in various quantities and then inspect their nutritional values.
We can even chart a calorie breakdown using a chart view from the UtilityViews framework.
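As a rough sketch of what a utility playground like SmoothieLab might contain, assuming made-up nutritional figures and a hypothetical ChartView type in UtilityViews (the session's actual code may differ):

import PlaygroundSupport
import NutritionFacts   // package dependency, built by the active scheme
import UtilityViews     // framework target, built by the active scheme

// Hypothetical calorie breakdown for the smoothie; the real playground
// derives values like these from NutritionFacts data.
let breakdown = [
    ("Carbs", 52.0),
    ("Protein", 15.0),
    ("Fat", 33.0),
]

// Chart the breakdown in the playground's live view, assuming a
// hypothetical ChartView initializer from UtilityViews.
PlaygroundPage.current.setLiveView(ChartView(slices: breakdown))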
Let's run the playground to see the calorie breakdown for this smoothie.
In the live view, we see a chart of the calorie breakdown. Now, we usually like to keep the fat content below about 25%, so let's tap the chart to take a look at the percentages.
It's about 33% in this case, so perhaps we could lower the amount of peanut butter, but I'll work on that later.
As you can see, this playground is a handy tool for helping us to combine ingredients to design well-balanced smoothies, at least from a nutritional perspective. We haven't designed an algorithm for calculating how good a smoothie might taste, but I'm fine with that as tasting is my favorite part of smoothie design. Playgrounds can now work seamlessly with project targets, such as frameworks and packages. Xcode will take care of building those targets so that they can be imported into your playground code. Just make sure to enable "Build Active Scheme." The good news is that this option is enabled by default for new playgrounds.
Also make sure that the module you want to import is either part of the active scheme or a dependency of another target built by that scheme.
Another great enhancement in Xcode 12 is the availability of full build logs for playgrounds. Let's take a look at the build log for SmoothieLab. We'll switch to the report navigator, and at the top, we see the build log for the SmoothieLab playground. So, let's select it to open the build logs. In the build logs, we see all of the build details for the targets that were built as part of the active scheme, as well as the module that was built to support the playground, containing compiled sources and resources from within the playground. Build logs are invaluable for diagnosing build issues, and we're happy that they're available for playgrounds.
Another benefit of Playgrounds' integration with Xcode's modern build system is that all resource types supported by the build system are now supported in Playgrounds. Let me show you an example of using resources that require build system support in a playground.
Returning to the Fruta project, I'd like to add a feature where the user can point their iPhone's camera at their favorite fruit, and Fruta would magically suggest smoothies that match the fruit it sees. To do this, I'd like to use machine learning to perform object detection and image classification.
There are a number of classifiers available, so I'd like to try one of them with a few sample images to see if it would be suitable for the task, and I'm going to do that using a playground in Xcode. First, let's get some sample fruit images to test with, and I think we happen to have some already available in the Fruta project.
Let's switch back to the simulator and navigate back, and remember we saw the image of some oranges before? And if we dismiss that, we see images of the other ingredients. So, let's find these in the project.
I'm gonna hide the simulator, as we don't need that anymore, expand the Shared folder, and let's take a look in the asset catalog.
We see an app icon, some color sets, and in the Ingredients folder, we see all the images representing the ingredients that we put in our smoothies. I think this will be a good sample set for testing our machine-learning model. So, let's make a copy of this by Option-dragging this out to the desktop.
And now we don't need the project open, so we'll close that and instead create a new playground with File > New > Playground. We'll use the iOS Blank playground template.
And let's call it ML Fruta, and store it on the desktop.
I'm gonna expand the size of the window to give us some more working space. Now, let's drag the asset catalog into the Resources folder of the playground. Notice that the playground compiles the asset catalog for us so that we can access the resources from our code.
We can do that using UIImage(named:), specifying the name of the folder within the asset catalog and the name of an image. And now let's execute the playground to see the results.
In the results, we see the width of the image, and if we QuickLook the image, we see a preview. That looks great. So, we have access to images from that asset catalog. Now we just need a machine-learning model.
I'm gonna switch to Safari where I've been browsing the machine learning section of the Apple Developer website.
On the Core ML models page, we see a number of image classification and object detection models. I'd like to try out this model called YOLOv3, which is able to detect multiple objects in a scene and classify each of them. I think that'd be good for pointing the camera at a bowl of fruit and having it identify all the fruit it can see. So, let's click "View models," and we can download the full-precision model at the top. To save us time, I've already done this, so let's minimize Safari.
And I'll drag in the model from the desktop into the resources section of the playground. Xcode will automatically compile the model for us, and while it's doing that, let's select the model to see details about it in the editor. An important detail is the model class up near the top here. It tells us the name of the class that will be automatically generated by the build system for use in our Swift code. So, let's make use of that now.
Returning to the playground, let's add the code to import CoreML. Then I'll call my variable yoloModel, and we'll use the class that was automatically generated for us, YOLOv3, to create a new instance with a default configuration, and we'll fetch the model property.
Let's run the playground to see the results.
We get back a model description, which we can preview here, and see details about the inputs, outputs and other properties. So, that's great. We now have a machine-learning model and images to test with; we just need to combine them. An easy way to do that is using the Vision framework. I'm gonna replace this code and instead import the Vision framework.
I'll define an array with three ingredients to start testing with, and then we can use our yoloModel to create an instance of the Vision framework's VNCoreMLModel. And with that, we can create a VNCoreMLRequest that we can use to make machine-learning requests on images.
Now, I'd like to visualize the results of these requests by showing each image, and drawing a rectangle around any objects detected in the image. To help do that, I've got some code that I've used in the past. Let me QuickLook it for you. It defines a couple of structs to hold the results of our object detection and some SwiftUI views that we use to show the images and draw rectangles around any objects detected in the image. We won't go into detail about this code, but we'll make it available on our session website.
So, after I drag the support code into the Sources section of the playground, Xcode will compile it, and then we can make use of the code.
We'll iterate over the ingredient names, fetch an image for each one from the asset catalog, and then we'll create an image request handler with that image.
We can use that to perform our machine-learning request. Then we just need to parse the results. Each result is a VNRecognizedObjectObservation.
We'll enumerate the observations, we'll fetch the label with the highest confidence and the bounding box of that object within the image, and return that as a RecognizedObject.
Finally, for each image, we'll return the results as an ObjectDetectionResult, and let's preview the results that we get at the bottom.
The playground will iterate over the three images, and then we can preview the results. Indeed, we have a result for each image, so let's visualize them.
We can do that by setting the playground's live view to the RecognizedObjectVisualizer and passing it the results. Let's continue running.
On the right, we see the results of our object detection. The first image we passed in was of a banana, and the overlay indicates that a banana was detected with 100% confidence. So, that's a pretty good result.
The second image was of three oranges, and we can see that all three oranges were detected with high confidence. So, that worked really well.
The third image was of a bottle of almond milk, and we can see that the machine-learning model detected a bottle with pretty high confidence, although it didn't classify it specifically as a bottle of milk or almond milk. And that would be because this model has been trained to detect bottles but hasn't been specifically trained to detect bottles of milk or almond milk. And I think that's fine. For what this model's been trained for, it performed really well. I think this would be the model to use for our app. The next step would be to go ahead and train that model even further with the types of ingredients that we want to detect for our use case. To find out more about how to train your ML models, I suggest watching the WWDC 2019 session called "Training Object Detection Models in Create ML." So, that was an example of using resources that require build system support in Playgrounds. Let's summarize what I've covered in this presentation. In Xcode 12, Playgrounds is fully integrated with the modern build system.
When the new option "Build Active Scheme" is enabled, all targets in the currently selected scheme will be automatically built and made importable into playgrounds that are in the same workspace. Any frameworks or Swift packages can then be imported into your playground code.
All resource types the build system knows about can now be used in Playgrounds, such as the asset catalogs and ML models I demonstrated.
Full playground build logs are now available in the report navigator, making it much easier to diagnose playground build issues.
Playgrounds are great for documenting packages and frameworks with runnable code. And don't forget that playgrounds are a handy tool for quickly coding up new ideas or writing utility code. I look forward to hearing how playgrounds are able to help you make your packages and projects even better. Thank you for watching.
-
8:53 - Playgrounds and resources Demo: Part 1
import UIKit

let image = UIImage(named: "ingredient/orange")
-
10:18 - Playgrounds and resources Demo: Part 2
import CoreML

// YOLOv3 is the model class Xcode auto-generates from the .mlmodel file.
let yoloModel = try YOLOv3(configuration: MLModelConfiguration()).model
-
10:54 - Playgrounds and resources Demo: Part 3
import UIKit
import CoreML
import Vision

let ingredientNames = [
    "banana",
    "orange",
    "almond-milk",
]

// Wrap the generated Core ML model class for use with the Vision framework.
let yoloModel = try YOLOv3(configuration: MLModelConfiguration()).model
let model = try VNCoreMLModel(for: yoloModel)
let request = VNCoreMLRequest(model: model) { _, _ in }
-
11:24 - Recognized Object Visualizer
import Foundation
import SwiftUI
import UIKit

// MARK: Model

/// The result of object detection on an image.
public struct ObjectDetectionResult: Identifiable {
    public var name: String
    public var image: UIImage
    public var id: String
    public var objects: [RecognizedObject]

    public init(name: String, image: UIImage, id: String, objects: [RecognizedObject]) {
        self.id = id
        self.name = name
        self.image = image
        self.objects = objects
    }
}

/// An object recognized by an image classifier.
public struct RecognizedObject: Identifiable {
    public var id: Int
    public var label: String
    public var confidence: Double
    public var boundingBox: CGRect

    public init(id: Int, label: String, confidence: Double, boundingBox: CGRect) {
        self.id = id
        self.label = label
        self.confidence = confidence
        self.boundingBox = boundingBox
    }
}

// MARK: Views

public struct RecognizedObjectVisualizer: View {
    public var results: [ObjectDetectionResult]
    public var imageSize: CGFloat = 400

    public init(withResults results: [ObjectDetectionResult]) {
        self.results = results
    }

    public var body: some View {
        List(results) { result in
            Spacer()
            VStack(alignment: .center) {
                RecognizedObjectsView(
                    image: result.image,
                    objects: result.objects
                )
                .frame(width: imageSize, height: imageSize)
                Text(result.name.capitalized)
                Spacer(minLength: 20)
            }
            Spacer()
        }
    }
}

struct RecognizedObjectsView: View {
    var image: UIImage
    var objects: [RecognizedObject]

    var body: some View {
        GeometryReader { geometry in
            Image(uiImage: image)
                .resizable()
                .overlay(
                    ZStack {
                        ForEach(objects) { object in
                            Rectangle()
                                .stroke(Color.red)
                                .shadow(radius: 2.0)
                                .frame(
                                    width: object.boundingBox.width * geometry.size.width / image.size.width,
                                    height: object.boundingBox.height * geometry.size.height / image.size.height
                                )
                                .position(
                                    x: (object.boundingBox.origin.x + object.boundingBox.size.width / 2.0) * geometry.size.width / image.size.width,
                                    y: geometry.size.height - (object.boundingBox.origin.y + object.boundingBox.size.height / 2.0) * geometry.size.height / image.size.height
                                )
                                .overlay(
                                    Text("\(object.label.capitalized) (\(String(format: "%0.0f", object.confidence * 100.0))%)")
                                        .foregroundColor(Color.red)
                                        .position(
                                            x: (object.boundingBox.origin.x + object.boundingBox.size.width / 2.0) * geometry.size.width / image.size.width,
                                            y: geometry.size.height - (object.boundingBox.origin.y - 20.0) * geometry.size.height / image.size.height
                                        )
                                )
                        }
                    }
                )
        }
    }
}
-
11:48 - Playgrounds and resources Demo: Part 4
let results = ingredientNames.compactMap { ingredient -> ObjectDetectionResult? in
    guard let image = UIImage(named: "ingredient/\(ingredient)") else { return nil }

    // Run the Vision request on this image.
    let handler = VNImageRequestHandler(cgImage: image.cgImage!)
    try? handler.perform([request])
    let observations = request.results as! [VNRecognizedObjectObservation]

    let detectedObjects = observations.enumerated().map { (index, observation) -> RecognizedObject in
        // Select only the label with the highest confidence.
        let topLabelObservation = observation.labels[0]
        // Convert the normalized bounding box into image coordinates.
        let objectBounds = VNImageRectForNormalizedRect(observation.boundingBox,
                                                        Int(image.size.width),
                                                        Int(image.size.height))
        return RecognizedObject(id: index,
                                label: topLabelObservation.identifier,
                                confidence: Double(topLabelObservation.confidence),
                                boundingBox: objectBounds)
    }

    return ObjectDetectionResult(name: ingredient, image: image, id: ingredient, objects: detectedObjects)
}

results
-
12:33 - Playgrounds and resources Demo: Part 5
import PlaygroundSupport

// Show the detection results in the playground's live view.
PlaygroundPage.current.setLiveView(
    RecognizedObjectVisualizer(withResults: results)
        .frame(width: 500, height: 800)
)