WWDC 20 – Day 1 – Platform State of the Unions

The one thing that you don't see when you are at home following WWDC is the "Platform State of the Unions". This year it is streaming via the Developer app, so everyone can watch it. It will be interesting to see how non-developers react to the more detailed technology discussions.

One of the cool things that happens at WWDC is the streaming video playing on the large screens as you come into each session. We now have that with the streams… I'm digging it.

Lots of demos coming up: iOS, iPadOS, watchOS, tvOS, and macOS. From a developer perspective I am really excited to learn more about Xcode 12, SwiftUI, and App Extensions. So now people can change their default email and web browser apps. Allowing others to use the Find My network means that the challenges to Apple Tags must have been real. Go to developer.apple.com/find-my.

Demo of Big Sur on a new Apple Silicon Mac, showing the details from the keynote video. The key thing about this demo is that it shows things working, but as anyone who does demos knows, there's always the "Happy Path". So hopefully my code will run on the test hardware. Speaking of which, I am thinking about applying for this.

Key idea behind building the chips is to customize the chip to the task at hand. The architecture of the design was done to give them the ability to do this, and this same architecture will be used to build Apple Silicon for the Mac. This will be a family of SoCs unique for the Mac. This should drive improved performance.

Balancing energy efficiency and performance is a key challenge. Years ago one of my mentors showed me a heat sink for a leading-edge performance chip from the late '90s – it was almost the size of a cinder block. So getting that balance right is always a challenge. Personally, I think Apple has done a great job with this over the years.

The new Mac will have a dedicated inference chip – meaning that AI will get much better. Additionally, dedicated caching for cloud content, improved image processing, and other improvements to GPU performance will change workflows on the Mac. If Apple lives up to the potential, things will certainly accelerate based on the architecture.

The Quick Start Program is how you get started on the transition. The DTK will ship with the OS and Xcode. Some software features will not be supported; to get a unit, register under the Universal App Quick Start Program. The new universal apps will have two versions of the code in the same bundle, so the end user won't notice any difference regardless of which platform the app runs on.

Testing of open source projects has already begun after Apple ported many of them to Apple Silicon. This includes Python 3 and many others. Apple has also been working with Unity, and demonstrated a version that can target both Intel and Apple Silicon at the same time. Next month Unity will release a preview version of this system, and later this year they will release it for production.

Rosetta 2 will be released with the new macOS Big Sur and includes the ability to run Intel apps. Most apps will be translated during installation. During the example of how this works, they went to pains to show that things would be "the same" for developers.

They showed Affinity Photo as the image-editor example to show that running under Rosetta 2 didn't add noticeable overhead. The overall approach here reminds me a lot of Wine, which swaps out Windows calls for native calls. I wonder if the Apple development teams looked at this model.

A key message was that if you've already switched to DriverKit and ExtensionKit, your device drivers will be supported on Apple Silicon. If you haven't, it sounds like these frameworks are going to be critical for the future.

Another demo showed Debian running under a ported version of Parallels on Apple Silicon, with an Apache server running to demonstrate cross-platform development environments. Apple is working with Docker to make sure it runs on Apple Silicon environments.

Catalyst is the underlying process to allow iPhone and iPad apps to run on the new Apple Silicon devices. As a developer, you can choose whether you want your apps to run on the Mac. iOS apps will get some default menu options, etc. There are a couple of tricks that Apple plays: 1) they add a new wrapper for the app package, so that the user can rename the app, etc.; 2) the app runs in another wrapper that sanitizes the path information. iOS app extensions work "where appropriate" – for example, an iOS Photos extension will show up in the Mac Photos app.

macOS Big Sur – this is a huge software release.

The new design language will be provided as templates for developers. If you provide an image and text, you get a new document-icon template – Apple generates it for you automatically. I think I'm liking the new layouts and design language, but I am sure that my mother-in-law and parents may have trouble with the auto-hiding information. There are some new enhancements for SF Symbols that I am planning on watching later this week. Apple is putting color back into many of the menus, etc.

Mac Catalyst – Apple is updating it for their own apps, so we should see new features in catalyst.

The new Mac Idiom will allow Catalyst apps to address screen views and sizes that are more like Mac Apps. A great example of this is the new version of Messages.

It is such a change that Apple is giving Big Sur a new version number.

Mail showing off the new Sidebar API

Now that iPadOS has keyboard and trackpad support, there are new APIs for sidebars, so you can slide over additional content. This allows for simplified navigation in your app. New pickers have been exposed: calendar, color, emoji, and more.

Action sheets have been moved over to new menus, which will allow for more commonality between the iPad and the Mac.

Hardware extensions on the iPad allow for more experiences for users. We then learned more about LiDAR. Scene geometry allows for building a 3D mesh of the room. The depth map uses the LiDAR to get much more detailed room measurements, and the positioning of virtual objects will better handle people walking in front of or behind things. Apple Pencil is really expanding capabilities with Scribble. Your apps should get these features by default: text fields automatically convert scribble to text. If you want additional features in your app, you will need to enable PencilKit. Drawing with the Pencil uses a new canvas view in PencilKit. You can now get stroke data as the Pencil moves across the screen, and you can tell when the Pencil is on the screen versus a finger.
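To make the stroke-data idea concrete, here is a toy Foundation-only model. The real API is PencilKit (the per-point type is PKStrokePoint on a PKCanvasView); the `StrokePoint` struct and the force-based pencil/finger heuristic below are my own invention for illustration:

```swift
import Foundation

// Hypothetical model of per-point stroke data (the real PencilKit type
// is PKStrokePoint); these field names are illustrative only.
struct StrokePoint {
    let x: Double
    let y: Double
    let force: Double          // the Pencil reports pressure; a finger does not
    let timeOffset: TimeInterval
}

enum InputKind { case pencil, finger }

// In this toy model, a stroke whose points all report zero force
// was drawn with a finger rather than the Pencil.
func inputKind(of points: [StrokePoint]) -> InputKind {
    points.contains { $0.force > 0 } ? .pencil : .finger
}

// Total path length of a stroke, summing straight-line segment distances.
func pathLength(of points: [StrokePoint]) -> Double {
    zip(points, points.dropFirst()).reduce(0) { total, pair in
        let (a, b) = pair
        let dx = b.x - a.x, dy = b.y - a.y
        return total + (dx * dx + dy * dy).squareRoot()
    }
}
```

A real app would get these samples from PencilKit as the stroke is drawn, rather than constructing them by hand.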

iOS 14 – Widgets and App Clips are the key areas of focus.

There are only three sizes of the new widgets. They are available in the "Today" view, but you can also add them to the home screen. They are written in SwiftUI, which makes it easier to share them across iOS, iPadOS, watchOS, and macOS. You can archive the view of the widget, which allows for better drawing efficiency. Users can stack widgets so you don't lose screen space; similar to the Siri face on the Watch, stacks will put the "most important" widget on top of each stack at the right time. This is based on a timeline that you define in your widget archive. Sounds like Emoji Rangers may be the sample code for understanding widgets. WidgetKit is using the Intents framework. (I've been working on adding Siri Intents in my app, so it certainly seems like a good thing that I should get working.)
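The timeline idea can be sketched in plain Swift. The real API is WidgetKit's TimelineProvider handing the system dated TimelineEntry values; this Foundation-only stand-in (the `ScoreEntry` type and `hourlyTimeline` function are invented here) just models the shape of that data:

```swift
import Foundation

// A minimal stand-in for a WidgetKit timeline entry; in a real widget
// you would conform your type to TimelineEntry and return entries from
// a TimelineProvider's getTimeline(in:completion:) callback.
struct ScoreEntry {
    let date: Date
    let headline: String
}

// Build one entry per hour for the next `count` hours — the schedule
// the system uses to redraw the widget without waking your app.
func hourlyTimeline(from start: Date, count: Int) -> [ScoreEntry] {
    (0..<count).compactMap { offset in
        Calendar.current.date(byAdding: .hour, value: offset, to: start).map {
            ScoreEntry(date: $0, headline: "Update #\(offset)")
        }
    }
}
```

The point of the design is that the widget is not a running mini-app: it hands the system a batch of pre-dated snapshots and the system swaps them in on schedule.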

App Clips are all about getting access to things quickly. The pop-up is autogenerated by Apple based on your metadata, and if the user sees the pop-up, Apple will download the app in the background. You can also set up "offers" to encourage users to upgrade to the full version of your app. For App Clips you can set up an 8-hour notification permission: users don't get the prompt, and after 8 hours of the App Clip being on the device, it automatically loses notification permission.

In App Store Connect you update your privacy information. Very simple.

Apple Watch now allows for SwiftUI-based complications. By setting up different identifiers for your complication, you can provide a user with multiple complications from your app. Xcode 12 now supports complication previews in SwiftUI previews. Very cool! When you share a watch face, it comes across as a text message, and if you don't have all the apps on your watch, it will automatically download them and set the face on the watch.

And finally – the Xcode 12 review. Document tabs are the new way to organize your workspace: double-clicking on a document will show it as a new tab. You can also show things like your Git commit log. New SwiftUI templates let you set things up automatically. Test coverage can also include UI responsiveness, so you can work on performance improvements. Also added is a StoreKit testing environment that does not require deployment of the app.

Swift Packages now include the ability to have assets, resources, and previews. This should increase the ability to share code and features with others.
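In the Swift 5.3 manifest format, a package declares bundled resources on its target. This is a minimal sketch; the package name, platform list, and `Assets` folder are made up for illustration:

```swift
// swift-tools-version:5.3
// Package.swift — Swift 5.3 manifests can declare resources to be
// bundled with a target. Names here are hypothetical.
import PackageDescription

let package = Package(
    name: "WidgetHelpers",
    platforms: [.iOS(.v14), .macOS(.v11)],
    products: [
        .library(name: "WidgetHelpers", targets: ["WidgetHelpers"])
    ],
    targets: [
        .target(
            name: "WidgetHelpers",
            resources: [
                .process("Assets")   // images, colors, etc. shipped with the package
            ]
        )
    ]
)
```

Code inside the target can then load those files via `Bundle.module`, which is what makes sharing assets alongside code practical.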

SwiftUI updates prioritize source stability, so all the new features are additive and existing code does not break. (YAY!!) The key is things like the new LazyVStack and lazy grids, so you can reduce memory usage. There is now switch and if-let logic in SwiftUI view builders. Apple also added a bunch of SwiftUI versions of existing kits, like Maps, etc.

You can now define your app's structure via SwiftUI – the new text-editor sample app is an example of this. By defining the app structure in your @main section, you get appropriate platform-specific behavior. This should allow for native experiences across platforms.
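A minimal sketch of what that looks like, assuming the iOS 14 / macOS 11 SDK (the `NotesApp`, `Note`, and `NoteListView` names are invented here); it also uses the new LazyVStack and if-let support mentioned above:

```swift
import SwiftUI

// Hypothetical app showing the new scene-based structure: the @main
// App type replaces the old AppDelegate/SceneDelegate boilerplate.
struct Note: Identifiable {
    let id = UUID()
    let text: String
}

@main
struct NotesApp: App {
    var body: some Scene {
        WindowGroup {   // the system supplies platform-appropriate windows and menus
            NoteListView(notes: [Note(text: "Hello, Big Sur")])
        }
    }
}

struct NoteListView: View {
    let notes: [Note]
    var body: some View {
        ScrollView {
            // LazyVStack only builds rows as they scroll into view,
            // and if-let now works directly inside view builders.
            LazyVStack(alignment: .leading) {
                ForEach(notes) { note in
                    Text(note.text)
                }
                if let first = notes.first {
                    Text("First note: \(first.text)").font(.caption)
                }
            }
        }
    }
}
```

The same declaration compiles for iOS, iPadOS, and macOS, which is where the "native experiences across platforms" claim comes from.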

So much to learn!

WWDC 2020 – Day 1

7:24am WWDC-time



After being able to make it to WWDC for two years in a row, it really feels strange getting ready for WWDC while sitting at home. At least I am on vacation from the day job for the week, so I am letting the excitement build, while social distancing. I am wearing my favorite shirt from last year’s WWDC, the mind blown dragon!



I am really looking forward to getting my mind blown again, with hopefully enhancements to SwiftUI, Catalyst, and Combine. These three big changes from last year's WWDC have all been a bit rough around the edges, but exciting nonetheless. I will check back in on this post as we get closer to the time!

=== 30 minutes later ===

So last year my Mac went belly up when I tried to install the latest macOS, so this year I created a new partition, installed the latest GA version of Catalina, and put up a giant background to warn me that this is the beta environment. The second thing I will do is make sure I don't turn on iCloud Drive. For the last three years there have been iCloud Drive synchronization issues with each new version of macOS. So hopefully this year I will be safe.



The keynote is about to start.

Really enjoying the ambient music playing along with the keynote start video. If you look really closely, you will see that all of the dots are actually Memoji from around the world. I wonder if they are doing a real-time query, or showing the last time people checked in. If they zoomed out, it would be easier to make out each country.

And my first streaming hiccup happened right at the beginning. But glad to see Tim Cook start the keynote with #BlackLivesMatter and their commitment to racial justice, and a great shout-out to COVID-19 heroes. They are also doing all the stream production out of Apple Park.

Over to Craig!

iOS 14. Rethinking the iconic elements – starting with the home screen. Yup… customized home screen right off the bat.

New home screen – the App Library

It starts by auto-sorting all the content into default folders – and notice, it seems like folders within folders. Jiggle mode lets you tap and hide pages you no longer need. You can search apps, which are listed alphabetically. Most used apps are at the top level of the App Library screen.

Widgets on the home screen.

Next – widgets – reimagined with different sizes, and you can take them out of the widget view onto the home screen. The widget gallery allows you to see the different sizes available.

Picture in picture on the iPhone! This is really starting to be cool… I am sure that my Android friends will talk about how some of these features have been there before.

Siri updates now – the focus is on improving the UI. A compact design allows you to keep seeing what you're working on while you talk to Siri.

Siri handles over 25 billion requests per month. Siri is being expanded with Translate for private conversations; it will work completely offline – keeping your data private – and supports 11 languages.

Messages has seen a 40% increase in the last year, with 2x usage in groups, so they are updating group conversations and letting you pin your most important conversations. Memoji is being updated to add even more styles, including face coverings. Inline replies with threading will make it easier to keep track of the conversation… and you'll only be notified when you are mentioned. Nice to see some of these updates reflecting things I've been using in Slack for years.

Maps is being updated, having finished rolling out the new maps earlier this year. The changes have gotten them praise from Fast Company on UI and privacy. They're adding new countries this year, along with the new feature called Guides, plus features for greener options, beginning with cycling. Nice feature showing when you have to carry your bike or climb a steep hill. If you have an electric car, they are adding EV Routing to help remove range anxiety. Additional green features for China track when the user can drive their car within certain congested areas.

CarPlay is now available on 97% of all new cars in the US. They are adding a bunch of new app types for the car. But the rumors are true – digital car keys in the iPhone. The first car to support it is a 2020 BMW. Place the phone on the Qi charger to start your car. You can share your key (digitally) with specific driving profiles, and the feature is being enabled early via iOS 13. To expand usage, Apple is working with the CarKey consortium. New cars will support it in 2021.

App Store – extending support with "App Clips" to get the app right when you need it. Looks like it uses NFC to get the data, and then the app has to support it. You can use Apple Pay for payments so you don't need to give them your credit card information, and if you use Sign in with Apple, you don't need to provide other information either. Very cool privacy focus!

Next up – iPadOS!

Now that iPadOS is separate, they can focus on its features separately. New design elements for iPadOS. There are now over 1 million apps on the App Store designed just for iPad. Of course, we have redesigned widgets.

Updating photos with a new sidebar, which makes it behave a bit more like photos on the Mac. And that is being added across many of the iPad Apps – Photos, Notes, Files, Music, etc.

Yay!! Finally sort order changes in Files. Fullscreen player in music.

Siri – the new compact views make it better by allowing a compact notification to sit on top of other apps. This feature will come to iOS too. Of course, Search is also updated and now matches the macOS search feature; it will be universal across all of iPadOS.

Apple Pencil gets its own section. Handwriting is automatically translated into text. Scribble recognizes multiple languages in the same line. You can copy handwriting and paste it as text.

And another new section on AirPods – automatic switching between devices: Mac, iPad, iPhone, etc. Also adding spatial audio in the AirPods Pro, tracking your head so the audio sound field doesn't shift when you move, matching motion data between head, screen, and AirPods Pro. That's a hell of a lot of calculations.
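The core of the head-tracking math can be illustrated in a few lines. This is my own toy model, not Apple's actual algorithm: to keep a virtual sound source fixed in the room, the perceived direction is the source azimuth minus the head's yaw, normalized back into -180…180 degrees:

```swift
import Foundation

// Toy model of head-tracked spatial audio: rotating your head by
// `headYaw` degrees must rotate the rendered sound field the opposite
// way so the virtual source stays anchored in the room.
// (Illustrative only — not Apple's implementation.)
func perceivedAzimuth(sourceAzimuth: Double, headYaw: Double) -> Double {
    var angle = (sourceAzimuth - headYaw).truncatingRemainder(dividingBy: 360)
    if angle > 180 { angle -= 360 }
    if angle < -180 { angle += 360 }
    return angle
}
```

The real system does this in three dimensions, continuously, fusing motion data from the AirPods and the device – which is why "a hell of a lot of calculations" is about right.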

watchOS.

Now enabling multiple complications from the same app. And as usual, updated watch faces, including an improved method for searching for complications. You can also share watch faces with – Face Sharing! (What a name – didn't we see this in Agents of Shield this season?) Developers can now also add pre-configured watch faces, or even share them across social media.

Adding cycling directions to maps on the watch.

The Workout app is one of the most used apps on the watch. They are adding Dance as a new workout, using sensor fusion to figure out the differences between arms-only, arms-and-legs, and full-body dancing. The app on the iPhone has been renamed "Fitness".

A few new features on the watch – Sleep: adding Wind Down to help you get to bed on time, with small shortcuts and reminders to get you ready for bed. The screen is off while you sleep, and a tap brings up a very simple watch face. Sleep tracking is also available on the iPhone.

Handwashing detection automatically recognizes when you are washing your hands, and adds features to help you wash long enough.

A whole new section in the keynote on Privacy!

  1. Data minimization
  2. On-device intelligence
  3. Security protections
  4. Transparency and control

These are the principles Apple uses in developing its devices, services, and software. There are over 200 million users of Sign in with Apple. Developers can now let users convert their existing accounts to Sign in with Apple.

You can change location sharing to approximate rather than precise. Apple is also adding indicators to let you know if an app is using the microphone or camera. Apps must ask if they will be tracking you, and app privacy practices will be exposed as a nutrition-style label. The information will be included in all of the App Stores. Really happy to see this.

In the home, all of the new features have the following requirements:

  1. Ease of use
  2. Privacy
  3. Better together.

HomeKit – partnered with Amazon, Google, and others to expand the ecosystem. Adding automations to quickly set up new HomeKit equipment. The Home app is expanded to give you a quick view of what is going on. Lighting gains adaptive lighting. HomeKit Secure Video will now enable activity zones, so you don't get notifications when people walk by, but do when they come to your door. You can also add face recognition of who is at your door.

Apple TV – expanding multi-user support for games, so that you can swap out players. tvOS 14 will add picture in picture across all of your services, so you can watch the news while doing your workout. On Apple TV+ they are now working on Isaac Asimov's Foundation. I have to confess I have never read these books, but the trailer looks pretty cool.

macOS – Big Sur

An entirely new design. This always worries me… but let's see the video. First, design elements have been cleaned up and made comparable to what you've seen on the iPad. Things appear and disappear so you only see them when you "need" them. Of course the icons are different, but yet the same. Nice to see Craig is using a Mac Pro. Looking at the design, it really does have the iPadOS look.

iWork has been updated. Funny, I haven't heard that name in a long time. Many of the controls have been reworked to expose features. And Control Center is now on the Mac. Notifications are now exposed by clicking on the time, and they are grouped like on the iPhone. Widgets are exposed there as well, and developers can bring their own to the Mac too.

Messages on the Mac – search has been added to help you find things. You can create Memoji on the Mac, use effects, and pin conversations. Looks like they have finally brought Messages up to parity with the iPhone.

Apple Maps – another example of making it comparable to the iPhone, like: ETA, indoor mapping, and favorites.

Mac Catalyst updates – apps can fully utilize the native resolution of the screen. Controls have been updated. Maps and Messages were created with Catalyst.

Safari is being updated to be 50% faster than Chrome on average. But the big thing is the privacy updates: a Privacy Report in the toolbar, and password monitoring to make sure your passwords have not been compromised. Extension updates make it easier for developers to support Safari, and the privacy controls let you restrict extensions to specific pages, allow them to run only once, etc.

You can now build your own customized start page for Safari. For privacy on the web, you can see the intelligent tracking prevention and improved information. This is something you previously needed add-ons like the EFF's Privacy Badger to do.

Tab hovering allows you to see previews. And you have built in translation for websites on the fly. And that’s macOs!

And now we have the ARM transition, using their own Apple Silicon. Too funny – the video trip to the "undisclosed location".

Nice to see racks and racks of Mac Pros in the "undisclosed" lab. Each of their devices – iPhone, iPad, and Apple Watch – is considered "best in class" because of Apple Silicon. A key thing to consider is power: if Apple can get to the upper-left corner of the performance-per-watt chart, the Apple Silicon Mac would truly be innovative.

By building a custom system on a chip, the SoC can be optimized for its major features.

And now developers can quickly develop across all of these platforms. Many of the foundations have already been enabled in Big Sur. All of Apple's apps are enabled as "native" apps. When you recompile, you can flag the build to be a "Universal 2" app. Microsoft and Adobe have already started this work.

All the earlier Big Sur features were demoed on the Apple SoC demo system. Craig then showed MS Office, Lightroom, and Photoshop running on this machine. The Photoshop document was a 5GB image with many layers.

And then a 4K video in Final Cut Pro with real-time color correction, animated titles, and lens flare. To show off the A12Z processor, they had three full-resolution Apple ProRes videos playing back at the same time.

To ease the transition, they will offer a Rosetta-like feature, called Rosetta 2, that automatically translates all non-native apps. It will translate most apps during install, but translation can also happen just in time for things like websites.

Virtualization options will be enabled so that you can run Linux and other environments. They demoed Maya and a Tomb Raider game to show how well a translated app can run, and also demoed Parallels running a server. And the one more thing: iPhone and iPad apps will run natively on the new Macs.

Quick Start Program – gives developers access to DTKs (Developer Transition Kits) so they can get started with the new hardware. You must apply, and units will ship this week!

The timeline for the transition: the first customer systems by the end of this year, with the full transition complete in two years. BUT Apple will still support Intel Macs for the foreseeable future. Developer betas are available today, and public betas will be available in July.

Time to code!!!!

iPad only Podcast Editing

This week, I finally did an iPad-only edit of my podcast, GamesAtWork.Biz. I've been wanting to see if I could pull it off, and after last week's edit (which was only partially done on the iPad), I took the plunge. Thank you to the team over at Ferrite for an amazing product, and to Jason Snell over at Six Colors for showing his process.

Before going through the process, let me describe my prior workflow. I use Logic Pro X from Apple, Loopback from Rogue Amoeba, and Audacity to put the edit together, and I ask my co-host and guests to use a local recording solution like Audio Hijack from Rogue Amoeba. I record myself and the remote participants in Logic Pro X, using Loopback so that my audio is one track and all others are a second track (without picking up any system sounds), while the remote participants place their audio in Dropbox. Their audio has two channels: the left is only their voice, and the right is everyone else. Once I get their files, I use Audacity to split each file into two mono tracks, level the sound, and keep only their independent track.
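That Audacity step — splitting a stereo recording into two mono tracks and keeping only one channel — can be modeled in plain Swift on raw samples. A real implementation would read and write audio files via AVFoundation's AVAudioFile, which I'm not showing here; this sketch just de-interleaves a stereo buffer:

```swift
import Foundation

// Split an interleaved stereo buffer [L0, R0, L1, R1, ...] into two
// mono channels — the same operation as "split stereo to mono" in an
// audio editor, here on raw Float samples for illustration.
func splitStereo(_ interleaved: [Float]) -> (left: [Float], right: [Float]) {
    var left: [Float] = []
    var right: [Float] = []
    left.reserveCapacity(interleaved.count / 2)
    right.reserveCapacity(interleaved.count / 2)
    for (index, sample) in interleaved.enumerated() {
        if index % 2 == 0 { left.append(sample) } else { right.append(sample) }
    }
    return (left, right)
}
```

In my workflow the left channel is the guest's own voice, so after the split the right channel can simply be discarded.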

After putting all the audio in Logic Pro, I trim the tracks, cut out any major screw-ups we may have had, and do a final leveling and compression before exporting the final edit to Forecast – to add the cover art and compress the file for final upload to my website. On a good day without any major problems, I can pull together the final edit in about 2x the length of the podcast.

Overall it's not too bad a workflow, but getting rid of background noise can be problematic, and not being an expert at Logic Pro X means I still have issues with filters, etc. The biggest one: when I try to use Remove Silence, I get too many clipped words or other audio artifacts. So I decided to try Ferrite from beginning to end.

The first step is to make sure everyone uploads their captured audio to the same Dropbox folder. I changed my local process so that I also capture my own sound directly to that folder.

Captured files for Episode 275

I have created a template in Ferrite that has the show intro and outro music and titles. This template has appropriate ducking for the various voice overs, but does not include any tracks for the speakers. We’ll come to that later.

Games At Work dot Biz Template

Now I have to import the audio and split the tracks so I only get each individual speaker's track. Ferrite makes this pretty easy: I tap Import Audio, go to my Dropbox folder in the Files app, and select the individual tracks.

Import files

This creates four new files in my main view. I select each file individually, open the share sheet, and choose "Convert to Mono" and then the "Split to Separate Tracks" option.

Convert to Mono

This creates two new files in Ferrite with the original name of the file plus either "- Left" or "- Right". I can now delete the ones named "- Right" as they are not needed.

Left and Right Files

I create a new project using the template above, choose the episode information from the pick list (year and air date), and then update the various metadata, including the artwork, show title, and tags. At this point I have a base into which I can add the individual tracks to finalize the podcast.

After adding in all the files, aligning the beginnings, and removing any major problem sections as agreed with my co-hosts, I can select each speaker's track and choose Strip Silence.
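Ferrite's actual Strip Silence algorithm isn't public, but the idea can be sketched in plain Swift: drop any run of samples that stays below an amplitude threshold for long enough, while keeping short pauses so words aren't clipped (the problem I kept hitting in Logic). The function and its parameters are my own illustration:

```swift
import Foundation

// A toy version of "strip silence": remove any run of consecutive
// samples whose magnitude stays below `threshold` for at least
// `minRun` samples, keeping shorter pauses intact.
// (Illustrative model only — not Ferrite's actual algorithm.)
func stripSilence(_ samples: [Float], threshold: Float, minRun: Int) -> [Float] {
    var output: [Float] = []
    var quietRun: [Float] = []
    for sample in samples {
        if abs(sample) < threshold {
            quietRun.append(sample)
        } else {
            if quietRun.count < minRun {
                output.append(contentsOf: quietRun)  // short pause: keep it
            }
            quietRun.removeAll()
            output.append(sample)
        }
    }
    if quietRun.count < minRun {
        output.append(contentsOf: quietRun)          // trailing short pause
    }
    return output
}
```

Tuning the threshold and the minimum run length is exactly the trade-off between removing dead air and clipping the ends of words.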

When I am done with the rest of the edits, I go back to the main menu, open the share sheet again, and select "Save to Dropbox". This creates the final edit with the show art, base tags, and the right file name. The only step left is to upload it to the website so my co-host Michael Martine can do the show notes.

Export to Dropbox

Simples!

Twitter is Back – Wasted Time

It took me a while to figure out all the ins and outs of callback URLs and the new (well, a few years old) Twitter APIs. But I have it working in both the Catalyst app and the iOS app. I've not figured out how to get it working in the watchOS version… but I'm excited to have submitted the app to Apple for review. Let me know what you think!

Apple’s iPad Magic Keyboard thoughts

I've been a huge iPad fan since it first came out. I bought the first four versions sight unseen, and with each one I loved it more. I have the original iPad keyboard stand, which plugged into the 30-pin slot and only worked in portrait mode. When the first iPad Pro came out, I immediately picked it up, along with the Keyboard Folio. It was amazing! It showed the promise that was yet to come in the iPad – a device that was not only great for reading and consuming content, but a true creation device.

But to do a lot of development for iOS you need a mouse. Yes, you can use text editors and write code, but the ability to run simulators wasn't there yet. And not many apps have added keyboard shortcuts to their code.

When the 11-inch iPad Pro was released, and iPadOS was announced at WWDC last year, I envisioned that Xcode was coming soon. The support of a mouse for accessibility showed that something was coming, and now we see how that vision was fulfilled.

So how is the new Magic Keyboard? There are a bunch of great detailed reviews; my favorite is from Federico Viticci over at MacStories. His level of detail in all his reviews is fantastic! I highly recommend you read it before forming your own perspective. Having said that:

  • Is it worth the money? – Hate to give a consultant’s answer to this one, but “it depends.” What do you plan to do with it? What are you doing with your iPad today, that would require you to get a keyboard and trackpad? Do you already have a keyboard? (As I mentioned above, I’ve used the iPad Pro keyboard case from Apple).
  • What's the most annoying thing about the Magic Keyboard? I had to think hard about this one. I am using the 11-inch 2018 iPad Pro. I have a 13-inch MacBook Pro from my day job, a 16-inch MacBook Pro as my personal development machine, and a 27-inch iMac for fun. Each of these devices has a different keyboard, so the annoying part is moving back and forth between the devices and taking a few seconds to get my finger positioning right. I am a touch typist, so the muscle memory takes a few minutes to kick in. The other thing is, I touch the trackpad more on the iPad Magic Keyboard than I do on any other device. I feel that is because of the keyboard size. If I had the 12.9-inch iPad Pro, I don't think I would have this issue.
  • What’s the most surprising thing? This is easy, how quickly I was able to figure out all the gestures. I was surprised that much of the experience just seemed natural.

I am hoping that the rumors that Xcode is coming to the iPad are true. I am OK if it has limited functionality. I am used to that as apps come to the iPad, as long as they add functionality over time. Just look at what Adobe has been doing with Photoshop, or Microsoft with Office.

I don’t think the iPad is a Laptop, even with the keyboard and trackpad. It is a different experience and when I am using it, I do things differently. I feel I am much more focused, and in some cases more productive. Now if I could only have a few more of these Magic Keyboards, I would put them around the house!!!

Time keeps on slippin', slippin', slippin'

I’ve been amazed by all the posts and stories from people having time to do all the things they have been putting off. They are taking the “extra” time made available during our current lockdowns, and learning new skills, playing old games, and basically just relaxing. Unfortunately, I’ve not had that ability.

On March 12th, I presided over an emergency meeting of the board and production team of the non-profit theatre group where I am currently president of the board. We were two weeks from opening night of this year's performance, and the governor of our state had just announced that groups of 250 people or more could not meet. The group decided that we could not keep rehearsing, and had to quickly develop a plan for how we could restart rehearsals and the production when restrictions were removed. We've moved the show date to early August, and I am hoping that the dates hold.

I had hoped that this break would free up my evenings to relax a bit, but instead I still had to submit the group's main yearly grant proposal. Given that this is my last year as president of the group, I've been working hard to document and automate many of the processes, something I wish had been done before. Due to COVID-19, the deadline for the grant was moved out a few weeks, which was a good thing for my automation, but meant that I filled the time with more work on the grant.

The other automation I've been working on is completing the transition from Quicken to QuickBooks for the non-profit's financial management. The amount of manual work the group does to generate reports and budgets for each production has been amazingly high. Transitioning to QuickBooks will greatly improve the reporting, and also help with generating grant proposals.

These two activities easily filled the vacuum of time that opened due to the show shifting. (In my best Ronco – Ron Popeil voice) But wait, there’s more! I’ve been working to nominate new board members for the non-profit, cancelling all our March ad buys for the show, and rescheduling them for the August date. So that takes a bit more time.

I had planned on working on re-adding Twitter support to my app – Wasted Time. If you follow my blog, you know that last year I completely rewrote the app in SwiftUI and Combine, making it available for the Mac and Apple Watch on top of the original iPhone version. The Twitter API I was using is no longer supported, and I would need to completely rewrite that part of the app, which I didn’t want to let delay the release. Of course, I’ve not had time to circle back and work on this, but I have plans. 😉

It seems that the extra time has been filled with extra work.

Mac App Store = Tighter checks on new apps

I’ve been playing with Catalyst since WWDC. Well, not so much playing with it as thinking about it while completely rewriting my Wasted Time app in SwiftUI. The main catalyst for rewriting the app was all the frustrations I had with Storyboards and Autolayout. Every time I wanted to add a new UI element to my app, it seemed that Autolayout would freak out! I also used this opportunity to write a simple WatchOS-only version of Wasted Time, and surprisingly it translated well. (It has been available on the Watch App Store since Oct.)

After writing the WatchOS version of Wasted Time, I used it as the basis for the full rewrite of the iOS app. I learned that some SwiftUI elements didn’t translate well, just as some iOS elements didn’t work on WatchOS (tabbed menu items). (Available on the iOS and iPadOS App Store since Jan. 2nd.)

One of the key differences between this version and prior versions is that I need to rewrite my Twitter code, which I have not yet done. As such, I don’t have Twitter enabled in the new version, and will probably not have it done for a few months (I am currently rehearsing for a new light opera which I will perform in late March, and it consumes all my free time).

I also submitted the Mac version, using Catalyst, at the same time as the iOS version; however, it has been rejected from the App Store multiple times! This has been an incredible learning experience. The most surprising aspect of the rejections is that the reviewers find one issue and immediately reject. Each of the items so far could have been easily seen in the first submitted version. Let’s go through them:

1) Rejection due to name mismatch. I had set up the app for the App Store as Wasted Time Mac! But when using Catalyst, it just used the name of the app in the About window, so it was listed as “Wasted Time”. I was able to go to iTunes Connect, rename the app, and resubmit the same binary.

2) Rejection due to the Twitter switch not “working”. This was on me completely. I had added the UI element for the Twitter feature as I started to work on it, but had not yet completed the Twitter code. So the App Store reviewer toggled the switch and nothing happened. (My plan had been to get this working, but after half of my vacation coding time was taken away by illness, the code wasn’t ready.) The funny thing is, the iOS version has the same switch, and it passed App Store review.

3) Rejection due to the Help menu. When I first started working on the Mac version of the app, I did a truly native version. After issues with the App Store upload, I decided it would be cooler to do the Catalyst version instead. In the native version, I had gone into the Storyboard and removed a bunch of menu items; in the Catalyst version, I had not removed the non-functioning ones. The basic app layout has a tab which explains how to use the app, since iOS devices have no real help system. In Catalyst you can’t use the Storyboard “fix” to remove the menu items, but by using the menu builder function, I was able to remove all the items that do not function.
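For anyone hitting the same rejection, the menu builder approach looks roughly like this. It’s a minimal sketch, assuming a standard Catalyst app delegate; `.help` and `.format` are the built-in `UIMenu.Identifier` values for menus the app doesn’t implement.

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    // Catalyst calls this when building the Mac menu bar.
    override func buildMenu(with builder: UIMenuBuilder) {
        super.buildMenu(with: builder)

        // Only modify the main menu bar, not contextual menus.
        guard builder.system == .main else { return }

        // Remove standard menus the app doesn't back with real
        // functionality; the in-app help tab replaces the Help menu.
        builder.remove(menu: .help)
        builder.remove(menu: .format)
    }
}
```

Because the builder runs every time the menu system is rebuilt, the removals persist without touching the Storyboard at all.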

4) Fourth time is the charm! Here’s the URL for Wasted Time – on the Mac!

Cracking the Code

Today’s been very productive so far:

  • I’ve been able to fix a few UI issues that have really bugged me in my Mac version of WastedTime.
  • I’ve uploaded version 9.0 of WastedTime for iOS to the App Store – the total rewrite of Wasted Time using SwiftUI and Combine
  • I’ve built both a Mac Native and a Mac Catalyst version of Wasted Time

Things I still need to do:

    Figure out how to use the Twitter API with the callback URL

Year End Thoughts

Finally taking some time off and getting time to reflect on the year/decade in technology. Einstein’s idea of relative time is very appropriate for thinking about the last decade. Every year that I am on this planet, the time between years seems to run quicker and quicker.

It’s been an amazing decade for technology: we’ve seen the launch of AR and VR in a meaningful way – remember Google Glass and the original Oculus Rift? Now we have HoloLens and Oculus Quest, and we are seeing meaningful usage in maintenance within the enterprise. We’ve seen Apple shift from Objective-C to Swift, and expand into new ways to manage the explosion of screens (who has played with SwiftUI?).

We’ve gone from a web-based / desktop world to an app-based / mobile world. Personally, I spend more time on my iPad for reading, writing, and some content creation than I do on my MacBook Pro.

The Internet of Things has gone from hype to the trough of disillusionment. This is actually great news, as this means that it is common enough that we will now start building out those solutions that are meaningful.

Electric vehicles have gone from rare to the Tesla Model 3 being one of the best-selling cars in the world. The number of them I see in the lot at the office seems to reinforce this.

On the negative side, all this connectivity and innovation has driven an explosion of ransomware targeting cities, hospitals, and other government agencies. Security and privacy continue to be under attack, as most people still do not understand how much information they leak online.

And finally, we still don’t have cheap and affordable flying cars! Well maybe this coming decade!

Status update on WastedTime (all platforms)

Over the last few weeks, I’ve been working to do two things on the WastedTime set of apps. (Wow! I can say set of Apps now).

While the basic functionality is working on all three platforms – iOS, macOS, and WatchOS – I am not happy with the way SwiftUI renders the various graphics. SwiftUI has introduced some really cool images with the SF Symbols library, and using those simplifies the sizing of elements across platforms.

Well, not really, given that SF Symbols is not supported correctly on macOS. That means my buttons to add and remove attendees in a meeting cannot be the same on macOS. That wouldn’t be a problem if SwiftUI would let me use Image() for buttons. My plan was to go back to the original images used in the iOS app and just leverage them for the macOS version. Nope – they don’t render at all.
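The workaround I’m leaning toward is conditional compilation: keep the SF Symbols on iOS and WatchOS, and swap in something that does render on macOS. A minimal sketch, assuming a hypothetical `AttendeeButton` view (the symbol names `person.badge.plus` / `person.badge.minus` are real SF Symbols; the text fallback is a stand-in for whatever renders reliably on the Mac):

```swift
import SwiftUI

// A button whose label is chosen per platform at compile time.
struct AttendeeButton: View {
    let adding: Bool          // true = add attendee, false = remove
    let action: () -> Void

    var body: some View {
        Button(action: action) {
            #if os(macOS)
            // SF Symbols don't render reliably here, so fall back to text.
            Text(adding ? "+" : "−")
            #else
            Image(systemName: adding ? "person.badge.plus"
                                     : "person.badge.minus")
            #endif
        }
    }
}
```

The nice part is that the rest of the view hierarchy stays identical across targets; only the label differs.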

So, I decided I would go back and start working on the Twitter integration instead. This seemed like it would be pretty easy, since I had that working fine using TwitterKit on iOS for years now.

Well, not really, given that Twitter discontinued that service… So I would have to use a new library. I decided to try Swifter. I had heard good things about this library, and it supports both iOS and macOS. When I set up the code to enable Twitter and create a Tweet, nothing happened. Evidently, you need to set up a callback URL to leverage the new Twitter service. Having not dealt with callback URLs at all, I took to StackOverflow to get more information. Net-net, there is no information. So I went to the git repo for Swifter and, again, no help. So I posted an issue to the repo, and have been in a wait state until I hear back.

Guess, I am going to have to do a lot more reading and learning. More to come…