We’ve been using UniFi Talk for a while for our home phone. It’s kinda… basic. But they got SMS support a while ago! Yay! But the SMS to email relay has a delay of usually 10 minutes. Boo! Not great when the repair guy texts that he’s 10 minutes out and you get the text 10 minutes later. But, turns out you can also be notified via Slack webhook, which happens immediately. Now we get instant delivery of SMS messages to our home number and only have to go through like three different systems. Yay?
Callisto, Jupyter and Mac Optimized Machine Learning – Part 2
In my last post, I looked at how to install TensorFlow optimized for Apple Silicon. This time around, I’ll explore Apple Silicon support in PyTorch, another wildly popular library for machine learning.
Setting up Callisto for PyTorch is easy! The suggested pip command is:
pip install torch torchvision torchaudio
And we can do that directly in the Callisto package manager. Remember, you can install multiple packages at a time by adding a space-separated list, so paste torch torchvision torchaudio into the install field and away we go!
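Once the install finishes, a quick sanity check (assuming a reasonably recent PyTorch build, 1.12 or later) is to ask PyTorch whether the Apple Silicon MPS backend is available:

import torch

# Should print True on an Apple Silicon Mac if the install went well
print(torch.backends.mps.is_available())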
I was looking for a little example to run to compare the performance of PyTorch on the Apple Silicon CPU with its performance on the GPU. To be quite honest, it was difficult to find a straightforward example. Fortunately, I ran across this notebook by Daniel Bourke. Daniel works through an example training a model on both the CPU device and the MPS device. MPS is the Metal Performance Shaders backend, which uses Apple’s Metal framework to harness the power of the M1’s graphics hardware. In this example, he creates a Convolutional Neural Network (CNN) for image classification and compares the performance of the CPU and MPS backends.
The bottom line? MPS is at least 10x faster than using the CPU. In Daniel’s posted notebook, he saw a speedup of around 10.6x. On my machine, I saw a performance increase of about 11.1x. The best thing about Apple Silicon support in PyTorch is that it doesn’t require any extra packages: the standard macOS build ships with the MPS backend, so everyone can grab the performance boost just by selecting the mps device.
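If you just want a rough feel for the difference on your own machine without running a whole training notebook, here’s a minimal sketch (a made-up toy workload, not Daniel’s CNN) that times the same little model on the CPU and on MPS:

import time
import torch
import torch.nn as nn

def time_device(device_name, steps=50):
    # Build a small toy model and a batch of random data on the requested device
    device = torch.device(device_name)
    model = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU(), nn.Linear(1024, 1024)).to(device)
    x = torch.randn(4096, 1024, device=device)
    start = time.perf_counter()
    total = 0.0
    for _ in range(steps):
        # .item() pulls the result back to the CPU, forcing the GPU work to finish before we stop the clock
        total += model(x).sum().item()
    return time.perf_counter() - start

print(f"cpu: {time_device('cpu'):.2f}s")
if torch.backends.mps.is_available():
    print(f"mps: {time_device('mps'):.2f}s")

The exact speedup will vary with the model and batch size – tiny workloads may not show much of a gap at all.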
In addition to TensorFlow and PyTorch, I checked some other popular Python ML libraries to see how they take advantage of Apple Silicon. While some libraries have chosen not to pursue Apple Silicon specific optimization, all of them run correctly in CPU mode.
- Keras
- Built on TensorFlow, Keras should show significant performance improvements when you use an optimized version of TensorFlow
- FastAI
- Built on PyTorch, fastai should show significant performance improvements when you use an optimized version of PyTorch
- Scikit-learn
- To avoid the management overhead and complexity, scikit-learn doesn’t support GPU acceleration
- Numpy
- It may be possible to improve numpy performance by compiling it against an optimized BLAS library that uses Apple’s Accelerate framework. The Accelerate framework provides high-performance, vector-optimized mathematical functions tuned for Apple Silicon. This is a bit involved and will require more research to see what impact it can have; there’s a quick way to check which BLAS your build is using in the sketch after this list.
- XGBoost
- XGBoost seems to be focused on GPUs that support CUDA for hardware acceleration and currently has no plans to support Apple Silicon.
- Numba
- Numba also seems to focus only on CUDA-based GPU acceleration.
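As a starting point for that numpy research, here’s a quick way to see which BLAS your current numpy build is linked against (the plain PyPI wheel will typically report OpenBLAS rather than Accelerate):

import numpy as np

# Prints the build configuration, including the BLAS/LAPACK libraries numpy was compiled against
np.show_config()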
Callisto, Jupyter and Mac Optimized Machine Learning
We build Callisto with the mindset that it should be the best way to do data science on a Mac. Part of that is helping users get the most out of their Mac hardware by using computational libraries optimized for Apple Silicon chips. TensorFlow is a very popular library for machine learning, so let’s take a look and see what it takes to use an M1 optimized version of TensorFlow with a Jupyter notebook in Callisto.
TensorFlow has a feature called PluggableDevice which lets developers create plugins for different pieces of ML hardware. Conveniently for us, Apple has written a plugin for Metal which is heavily optimized for Apple Silicon devices like the M1 and M2 chips. Now we just have to get it installed.
You should be able to just install the TensorFlow library for the Mac and then the PluggableDevice for Metal, which you’d do with these commands:
pip install tensorflow-macos
pip install tensorflow-metal
With Callisto, you can use our fancy package manager interface to install tensorflow-macos and tensorflow-metal. Unfortunately, other package dependencies mean that pip won’t install the latest tensorflow-macos, version 2.12.0, but instead falls back one version to 2.11.0. On the other hand, pip will install the latest version of tensorflow-metal, but the PluggableDevice interface is a C API and is tightly bound to the TensorFlow version. While both modules install, at runtime there’s a symbol mismatch error and the Metal plugin fails to load.
Cue montage of trying to install several permutations of these two packages.
To jump to the end, as suggested in this post on the Apple Dev Forum, more recent versions seem to have issues; falling back to tensorflow-macos version 2.9.0 and tensorflow-metal version 0.5.0 works with no issues. Pip will install those versions with the following commands:
pip install tensorflow-macos==2.9.0
pip install tensorflow-metal==0.5.0
Don’t forget, you can specify versions using Callisto’s package manager right in the package field by adding the version specifier. Instead of just tensorflow-macos, use tensorflow-macos==2.9.0.
Now that we’re up and running, let’s do some tests! We want to compare just running on the CPU versus running with the hardware accelerated Metal GPU. Here’s a little bit of code to disable the GPU accelerated device in TensorFlow:
import tensorflow as tf
tf.__version__

# Flip this to False (and restart the kernel) to re-enable the Metal GPU
disable_gpu = True
if disable_gpu:
    tf.config.set_visible_devices([], 'GPU')

tf.config.get_visible_devices()
When disable_gpu is true, you should only see one CPU device in the output. When not disabling the GPU, you should see both the CPU and GPU in the output. TensorFlow doesn’t deal well with changing the visibility after the library is up and running, so to switch the state of the GPU, remember to restart your Jupyter kernel.
Now we’re ready to test! First I tried this Quickstart for Beginners from the TF website. Running this example on the CPU, it completed in 7 seconds. With the GPU enabled, it took 42 seconds. What, what?! It’s slower using the fancy Metal optimized GPU driver? Yep, turns out that’s right. As noted on Apple’s tensorflow-metal page, the CPU can be faster for small jobs. Well, that’s a little disappointing.
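If you’d like to reproduce the comparison, here’s roughly how you might time it, using the small MNIST model from the beginner quickstart – run it once with disable_gpu = True and once with False, restarting the kernel in between. Nothing fancy, it just wraps model.fit in a timer:

import time
import tensorflow as tf

# The small MNIST model from the TensorFlow beginner quickstart
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train / 255.0
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(10),
])
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

start = time.perf_counter()
model.fit(x_train, y_train, epochs=5)
print(f"Training took {time.perf_counter() - start:.1f} seconds")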
Now if we look at Apple’s example on that same page, it’s got a little more heavy lifting to do. Running that on my M1 CPU, it takes just under half an hour at 29 minutes and 12 seconds. On the GPU, it blazes through the job in 5 minutes and 10 seconds! Cutting my run time to 1/6 of the original is definitely a solid improvement. That kind of performance spike makes all the installation headaches worth it!
With tensorflow-metal on the cusp of a 1.0.0 release, we’re excited to see how we can integrate this into our builds and include this out of the box with Callisto, but until then, these instructions should help shepherd you through a manual install.
Christian (and others) got a raw deal, a lot like the folks behind Tweetbot and Twitterrific. As a small dev, it’s become a huge gamble to create an app based on an API that you do not own. They can (and clearly will) rip the rug out from under you.
So glad to see the emphasis on testing in the Sync to iCloud with CKSyncEngine session! More of this please.
#WWDC
#WWDC 2023 – Keynote Post Mortem
Wow. That was a lot.
I was pleasantly surprised to see the Apple Silicon Mac Pro. Some folks online expressed a bit of sticker shock, but it’s a high end machine and comes with a high end price. Part of that is due to the Mac Studio. Before the release of the Studio, your desktop Mac choices were between a Mac mini and a Mac Pro. In that lineup, the mini had to stretch to the mid-range and the old Intel Mac Pro started off in the upper mid-range. Now the Studio is positioned to take up that “medium” position, where you’re doing serious work, but not making Avatar A20: The Final Avataring. With the Studio providing coverage for those mid-range workflows, the Pro is really only for top end jobs, especially those that require PCIe cards like a Fibre Channel interface.
Siri didn’t get the big upgrade I was hoping for. The whole ChatGPT thing has really exploded in the last few months and that’s a little too soon for Apple to act on it in time for this year’s OS release. Siri is getting the ability to handle back-to-back requests. We’ll have to see how that plays out with the betas this summer.
Looking forward to today’s “What’s New in Xcode 15” to get a better feel for the Xcode improvements. The State of the Union seemed to focus on a lot of support for visionOS, but that may just be the shock and awe talking after the big reveal yesterday. Browsing through all the sessions, I’ve bookmarked 35 I’d like to see, which is a ton. Hopefully I’ll get through ten this week.
Scanning through the session topics, I see a lot of Swift / SwiftUI talks and, as you might expect, a lot of visionOS sessions. Sadly, a lot of topics I’m interested in aren’t getting much attention. tvOS has one session, about Continuity Camera support. There are only a handful of sessions about iOS, iPhone, or iPad, and most of those are about running your app in the visionOS environment. HomeKit isn’t mentioned at all. CarPlay and HomePod get one session each. The reality is that even at a week-long, online developer conference, there’s only so much bandwidth. There’s so much for Apple to tell us about visionOS and this is their one chance, so it’s all magic goggles, all the time.
Now if they just handed out some free samples.
Turns out if you register a Mac UUID on the Apple Dev Portal, but forget to set the platform to macOS, the portal decides that it is… an iPod. So I’ve got that going for me, which is nice.
California Dreamin’ – WWDC 2023
It’s about a week before WWDC 2023 kicks off in sunny California, so here’s a list of things I’m hoping to see come out of Cupertino. I’m skipping the AR/VR stuff, since it’s been speculated about ad nauseam. I’m sure it’ll be cool and awesome and weird and probably not cheap. And maybe it’ll make me dizzy.
TLDR; Better Xcode, Better Siri, Better Mac Catalyst
Xcode
Mostly, I just want Xcode to be better. Don’t crash. Don’t be slow. Is that too much to ask?
SPM Handling
We’ve all seen it. Switch branches, hit build and be faced with
Build operations are disabled: Package loading in progress. Please try again later.
I spend a lot of time working on Callisto, which is kind of a big app. We have dependencies split out into frameworks and it can take SPM a while to resolve all the packages when switching branches. So I hit this quite a lot. Usually, Xcode does a great job of queueing up actions when it’s busy. Like if you do a clean build and then run unit tests, it will finish the clean and then do the testing. That’s all I want for SPM resolution.
Better SwiftUI Tooling
I haven’t written a lot of SwiftUI. We toyed with it early on building Callisto, but it wasn’t ready for a big project like that. A couple of months ago, we thought about trying to build a screen here or there with SwiftUI, but ran into roadblocks. One of the key features of SwiftUI is the live preview. To make those work, Xcode compiles bits of your code behind the scenes and shows them in the preview pane. Callisto is a Catalyst app but has an AppKit plugin for doing Mac system things. Xcode could not handle that when trying to make a SwiftUI preview. Xcode would try to compile the AppKit code against UIKit and become very upset when it didn’t work. That left our foray into SwiftUI dead in the water.
Siri for Xcode
I’ve dabbled a bit with ChatGPT as a coding assistant. It’s great for small tasks with fiddly parameters. For instance, I needed to get the timestamp of some files. That’s the kind of thing that’s straightforward, but you don’t do it often, so you have to look up the specific API to stat the file and which attributes correspond to the creation date / modification date, etc. ChatGPT spit that code right out and I could move on to other things. But there’s friction there and the opportunity for a bespoke Xcode experience. I’m cautiously optimistic Apple will do something in this space, but I’m afraid it won’t be groundbreaking. Because, you know… Siri.
Catalyst and iOS Updates
Since Callisto is built with Mac Catalyst, these are the sorts of updates we’d love to see as developers.
Dynamic Type on Mac
Dynamic Type on iOS has been around for 10+ years. That’s the bit in iOS Settings that lets you make the text on your device a little bigger or a lot bigger than standard. All the apps that support Dynamic Type will pick up the change and text across the whole device changes size. That’s great! A boon for aging eyes everywhere.
But it isn’t supported at all on macOS. This recent announcement about new accessibility features coming in iOS 17 mentions
For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar, and Notes.
That sounds a lot like Dynamic Type on the Mac, so fingers crossed.
Auxiliary Window Replacement
Back in the day, lots of Mac apps used auxiliary windows that served as things like inspectors and floating tool panels. Apps like Photoshop had a bunch of these – paintbrushes, color panels, etc. You can still see these in AppKit apps like Preview and QuickTime Player, where the inspector is an auxiliary window. When you switch apps, it disappears to cut down on clutter. Right now, there’s nothing like that for Catalyst. The UIScene API for managing windows doesn’t have that level of distinction. Hopefully, at WWDC we’ll see a more mature Stage Manager implementation and at least some way of distinguishing main content windows from helper windows.
Multiple Processes on iOS
Callisto includes an embedded Python distribution for running the Jupyter Python Kernel. On the Mac, we just spawn a separate process and run Python normally. On iPad, we use a heavily modified Python to run in the same process space as the app to comply with App Store requirements. In practice, that hamstrings Callisto on the iPad a great deal. With a renewed interest in Pro apps on the iPad, it’d be incredible if we were allowed to run multiple processes and level the playing field between an M1 iPad and an M1 MacBook.
Yeah, never going to happen.
Other Stuff
Passwords
A lot of folks have called for a dedicated Passwords app instead of burying password management in Settings. I’m all for that, but I’d also love to see password sharing via your Apple Family. That’s really the last feature I’d need to leave 1Password behind. 1Password has been fantastic, but they’ve really pivoted to a corporate focus, so they’re less of a good fit when I just want to share the Netflix password with my family. (Only for persons in my immediate family, residing in the same household. Netflix, if you’re reading this, I promise.)
HKSV Streaming API
I’m pretty heavily invested in the HomeKit ecosystem and I’m mostly happy with it. (I’m looking at you, Siri.) I’ve got several third party apps that will stream video from my cameras, but they can only show live feeds. HKSV (HomeKit Secure Video) records events from these cameras and saves them to iCloud, but scrolling back through time is only available via Apple’s Home app. Third party apps aren’t able to access the history of clips because there’s just no API for it. With HomeKit (hopefully) maturing a little more this year, Apple should make that available to developers.
Better Siri for Mac
Yes, macOS has Siri. But as underwhelming as Siri is on iOS, it’s worse on Mac.
In my notes, I wrote down “Not Brain Dead”. Siri just fails at the most basic things sometimes. When I try to open an app in my Applications folder with the phrase “Open AppName”, it fails maybe 2 out of 3 times. I’m not sure why it’s so bad and I don’t know of any way to debug it.
I usually use my laptop with the lid closed, hooked up to an external monitor, keyboard, webcam, etc. That means I can’t use “Hey Siri” for security reasons. I agree being able to turn off an always-on microphone is important, but if Siri is a serious feature, why can’t I explicitly grant permission for Siri to listen with an external microphone? It is interesting to note that “Hey Siri” does work with the ($1600) Studio Display. That’s attributed to the onboard A13 chip.
Another bit of low hanging fruit for Siri is menu commands. At least it seems low hanging to me. If I’m using an app like Xcode, that has a menu command called ‘Build’, I would expect Siri to understand if I said “Hey Siri, Build”. But like the double meat burger, Siri is strictly off the menu.
But the real dream is a conversational Siri. We’ve all seen Iron Man. We’ve seen Tony talking to Jarvis like a person as he works. Could Siri ever do that in the context of Xcode? Playing with ChatGPT has teased this kind of reality. “Write a method to delete all the files in a given directory.” That’s a thing ChatGPT can do in a web browser window. Why can’t Siri do that in Xcode? Imagine an Apple trained LLM that especially knew about Swift, UIKit, and all the Apple technologies. Add in the context of the project you’re currently working on. Talk about a developer accelerant. It’s a big leap, so it’s doubtful we’ll see it this year, but maybe the first step?
Things We Won’t See at WWDC
Apple Silicon Mac Pro
Apple announced Apple Silicon for Mac at WWDC of 2020, three years ago, and the first commercial M1 Macs shipped in November of 2020. At the time, Apple estimated that the transition to Apple Silicon would take about two years. It’s been two and a half years since that first M1 MacBook Air shipped but the Mac Pro still sports an Intel chip. Something has clearly run amok.
But the rumor mill is pretty quiet on the Mac Pro front. Usually at this point, there would be at least some mention of the debut of a high profile, flagship Mac. It seems like we’ll be waiting until later in the year to see what kind of behemoth you can build out of lots of iPhone chips.
HomePod + AppleTV
Years ago, HomePod ran its own fork of iOS called AudioOS. Apple merged AudioOS with tvOS several revisions back, so now both of these home devices run tvOS. Because they do similar things – play media and power HomeKit – that makes a lot of sense. But the two devices remained separate at a hardware level – a smart speaker and a TV streamer. I’ve been waiting for hybrid devices for years now. There’s the HomePod with a screen, like an Amazon Echo Show, that does HomePod things, but augmented with a display. And there’s the AppleTV with a mic and speaker. The mic lets you talk to the TV without holding the button on the remote, and the speaker acts as a center channel for a multi-HomePod sound stage.
Those are both consumer products, so not the kind of thing they’d squeeze into the WWDC Keynote, especially with the AR/VR announcement, but maybe this fall, just in time for Christmas.
Will we see any of this at WWDC? I certainly hope so. But there’s only one way to find out.
No sleep ‘til DubDub!
Got some spam email today promising to use AI to increase my sales meetings by 10x! Thank you AI, you know just what I want. Coming soon – recreational root canals, weekend tax audits and more sauerkraut.
Pricing schemes for streaming services are weird. Most of them charge more for higher quality streams. So you get the same content, just an inferior version of it. It’s the equivalent of going to the Louvre and being allowed closer to the Mona Lisa if you paid more.
"There's something so human about taking something great, and ruining it a little so you can have more of it." -- Michael, The Good Place
About Final Cut for iPad… so many MacRumors commenters are writing it off because it’s a subscription - $5/mth or $50/yr. But Mac FCP is $300! So the part where Apple “wins” on this is at 6 years. If FCP is so good that you’re still using it in 6 years, then they totally deserve the money. It also means dabblers like me can pay $10 a year and do the two little hobby projects that come up where multi-cam would be handy. I can only imagine the fury if Apple had dropped a one-time $300 iPad app.
Better App Launching with SwiftUI for Unit Tests
TL;DR -- In SwiftUI, use a fake testing `App` instead of your real `App` to make sure you're actually testing your code.
For unit testing, test coverage is an important metric. How much of your code base are you exercising during unit tests? Xcode is, unfortunately, not too bright when it comes to measuring coverage. It doesn’t have the intelligence to know if you’re “testing” a line of code, just that the line of code was executed. This is a problem right off the bat.
I’ve got a new app project in Xcode. The app doesn’t do much yet, and I want to start adding tests before it gets too big. So I add a unit test target and Xcode plops in some empty tests. I run the unit tests, and they all pass since they’re empty. I check the coverage and it’s at 36%! How can that be? I didn’t test anything!
Xcode, the blissful idiot, is really reporting that 36% of the code was executed while running the unit tests. Usually, when an app starts up, it does some bootstrapping. You might set up your persistent storage, draw a couple of Views on screen, and maybe talk to the network. Xcode counts all that as “testing” because it ran during a unit test.
To make the test coverage more accurate, we need to do as little as possible outside of our actual unit tests. Jon Reid has a write-up of how to do this with a UIKit app by swapping out the app delegate during tests. But SwiftUI introduces a whole new startup sequence so we need a new approach for SwiftUI’s new app lifecycle.
After some futzing around, turns out it’s easy!
import Foundation
import SwiftUI

struct TestApp: App { // 1
    var body: some Scene {
        WindowGroup {
            Text("I'm running tests!")
        }
    }
}

@main // 2
struct TestDriver {
    static func main() {
        if NSClassFromString("XCTestCase") != nil { // 3
            TestApp.main()
        } else {
            MyRealApp.main()
        }
    }
}
There are three key points that make this work:
- We need a dummy `App` struct to use instead of the real app. This simple stand-in circumvents all your usual app startup machinery. Instead of all the normal bootstrapping, we'll just get a window with the text "I'm running tests!".
- Remove the `@main` from your `App` implementation and add it here to `TestDriver`. Swift uses `@main` to figure out how to start your app. The `App` protocol provides a default implementation that, according to the docs, 'manages the launch process in a platform-appropriate way'. But by inserting our own wrapper layer here, we can control _which_ `main()` is called.
- That brings us to the final point: use good old `NSClassFromString` to decide if the testing bundle has been injected into our process. `XCTestCase` is only available during testing, so this is a reliable way to decide if unit testing is underway. Based on that, we can call the `main` method of either our real app or our testing stand-in. It turns out that the default implementation of `main` knows to use its parent struct to bootstrap the SwiftUI app.
Now when I run unit tests, my coverage is at 0.8%! That’s more like it. In order to boost my test coverage, I now have to actually test code. And the code coverage metric really starts to mean something.
Life, uh… finds a way
Kindle Unlimited porn discovered by parents; Apple ‘concerned’
Unit testing with UIDocumentPickerViewController – An Un-Googlable Bug
TLDR – If your unit tests crash with DocumentManager service tried to send a message to a deallocated host proxy, make sure you’re dismissing any presented instance of UIDocumentPickerViewController.
In our Callisto Xcode project, we’ve got a lot of unit tests. Like, over a thousand. We’re at the point where if a test has a 0.1% chance of failing, then it happens every time. Our tests need to be really rock solid, or there’s no way we’ll get a clean run, which is the only way CI will let a build through.
Some time ago, we started noticing the occasional test failure with an uncaught exception:
DocumentManager service tried to send a message to a deallocated host proxy
It would crop up when running tests both locally and in CI. With so many tests, we get the weird edge case now and then, but they’re not worth the time to track down. We ignored it. With the recent update to Xcode 14.3 and macOS 13.3, the DocManager exception went from occasional annoyance to ‘omg, this happens every time I unit test on iOS’. So now we have to fix it.
But what’s a DocManager? IDK – there’s nothing with that name in our code and nothing in the docs about an Apple framework called DocManager. Looking through the traceback, it’s pretty obvious that it’s some kind of internal Apple thing. Surprisingly, a Google search for this error returns absolutely no results. That’s never a good sign.
But what does a DocManager do? Callisto is UIDocument based, so maybe DocManager manages documents? A bunch of the unit tests open instances of a UIDocument and aren’t 100% diligent about cleaning those up, so maybe some housekeeping will help. Cue the plumbing montage. We added some bookkeeping to make sure that any open UIDocuments were closed at the end of each test, so now we’re sure there are no dangling UIDocuments. No impact – DocManager still throws the exception.
This particular problem is a huge pain to track down. Somewhere in the tests, things get into a bad state. Later, while another test is running, some background thread discovers the bad state and throws an exception. The cause of the crash and the actual crashing are quite loosely coupled, making it hard to pin down the culprit. It also means that the offending test will run just fine by itself, and only causes a crash when a large number of tests run, giving the background issue time to percolate to the surface.
After a couple hours of testing the tests and narrowing down which ones fail, it started to look like UIDocumentPickerViewController might be involved. We’ve got some ViewControllers that open a docPicker and get feedback via the docPicker’s delegate methods. To test those, we programmatically tap a button to cause the docPicker to be presented, get a handle to that docPicker, and call the delegate with the docPicker and some fake results. This works great for testing! Except for that pesky exception that gets thrown now and then.
If we take a closer look at the exception, we’ll see there’s a little more info attached in the userInfo dict, specifically a file and line number: DOCRemoteViewController.m line 42. Not that we have access to the source, but the name of the file, “Remote View Controller”, does offer a hint of its purpose. Those docPicker view controllers offer our app an escape hatch out of the sandbox and into the rest of the file system on the device. In my limited understanding, the docPickers interface with a separate process that manages file system access, so they actually represent some state from another (remote) long-lived process. In the end, these dangling, un-dismissed docPickers were the root of the problem. Make sure all your UIDocumentPickerViewControllers are dismissed properly and the problem goes away.
Whew!
MarsEdit 5
Tried to pay for an upgrade to MarsEdit 5 this morning, but it wouldn’t let me. The web store said I had purchased MarsEdit 4 too recently, and it gave me a free upgrade to version 5. Nice!
Greg Garcia’s Sprung on Amazon Prime Video is quite good. He’s the creator behind My Name is Earl and Raising Hope. If you liked those, Sprung will be right up your alley. Some familiar faces too. So good.
A New Watch!
I took the plunge and ordered a new Apple Watch Ultra sight unseen. I’m very excited for a new wrist computer, especially one with a big orange button! It seems weird, though, that there’s no titanium watch band to dress it up. I guess Apple is going full tilt on the outdoor adventure lifestyle to start. The fancy matching titanium band will probably show up in the spring.
(I desperately want to make an Ultraman joke, but obviously I’m no Ultraman.)
Holy Shift! Koenigsegg’s New Transmission Is a 6-Speed Manual and a 9-Speed Automatic
I always wondered if someone would build a clutch-by-wire system like this where there’s a gear shift and a clutch, but not physically linked to the drivetrain. This Koenigsegg gearbox sounds like just the thing. The problem now is that we’re all switching to gear-less electric motors, so will this come to electrics too? Make ‘em feel like a Miata? I guess Dodge is going to try with the eRupt.
Holy Shift! Koenigsegg’s New Transmission Is a 6-Speed Manual *and* a 9-Speed Automatic:
"We still are in the process of developing it, but it's already crazy good. When we are done with it, I don't think anyone will be able to tell it apart from a traditional manual. That's the objective. It should feel like a mix between a Mazda Miata and a Ferrari gated shifter. The best of the two worlds."
Serendipity
On the occasion of a blind squirrel finding a nut.
Wordle 426 2/6
⬜🟩⬜🟨⬜
🟩🟩🟩🟩🟩