So, I did it, I wrote my first ever macOS application. It is WastedTime for Mac, and I am amazed it works. It is still in beta, and I am going to add in the Twitter integration. I also need to add keyboard shortcuts so you can use the (+) and (-) keys on your keyboard to add and remove attendees, but it works!
The most amazing part was that I could take my watchOS app and extend the project to add a new target, reusing 90% of the program logic. Once I have the macOS version completed, I will recreate the iOS and iPadOS versions. Wish me luck!!
Oh, and by the way, here’s a picture of the app – yeah, I could really use a designer:
Now that WastedTime Watch! has been released, and so far I’ve not heard any new bug reports, or even seen any crash reports, I’ve started rewriting WastedTime completely. What I mean by that is I am working on converting the iPhone/iPad version to a Catalyst application. I will then release a Mac version.
What does this mean? Well, I’ve been using Twitter’s own Swift-based APIs that were made available a few years back via CocoaPods. That project was killed off a while back, and it certainly is not supported on the Mac. This means I am learning a whole new API to support the Twitter integration. It is called Swifter.
The biggest transition, however, is converting from RxSwift to Combine. Luckily, when I wrote WastedTime Watch!, I did it from the ground up, so I had already written a Combine version of most of the logic. So hopefully I will get a few hours each week for the next few weeks to complete the rewrite.
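To give a flavor of the conversion (the names here are illustrative, not WastedTime’s actual code), an RxSwift subject-and-subscribe chain maps almost one-for-one onto a Combine subject-and-sink chain:

```swift
import Combine

// RxSwift version (roughly):
//   let attendeeCount = BehaviorSubject(value: 0)
//   attendeeCount.map { "\($0) attendees" }
//       .subscribe(onNext: { label in /* update UI */ })
//
// Combine equivalent:
let attendeeCount = CurrentValueSubject<Int, Never>(0)

let cancellable = attendeeCount
    .map { "\($0) attendees" }           // same operator, new framework
    .sink { label in print(label) }      // sink replaces subscribe(onNext:)

attendeeCount.send(3)   // the sink fires for the initial 0, then for 3
```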
After much struggle and gnashing of teeth, I had a few people help me on StackOverflow, and I am proud to announce that the big bug has been squashed. The issue was that the app would not store the settings. So now, after I get approval from Apple, you can finally save your settings!
WOW! I am excited to announce that Wasted Time for the Apple Watch is now available. While there is one bug I need to squash, I would love it if people would download it.
The bug is that the cost information is not being saved, but if you don’t kill the app you will be OK. Go to the setup screen and enter the rate, using a keyboard or the Scribble function, and don’t enter commas. After you enter the rate, toggle the Hourly/Salary buttons… and then start your first meeting!
Over the last almost two months I’ve been staying crazy busy: at work (the day job), at my non-profit (the opera group), and dealing with problems in the various Apple betas from this summer. The biggest issue of the Apple betas has been my experience with Catalina. While Catalina itself hasn’t been a big change for users, it has had numerous problems on my 2016 MBP with Touch Bar.
As I mentioned during WWDC, I had lost my entire hard drive during the initial install and had to completely rebuild the machine. This isn’t a major issue by itself. Sometimes it’s nice to get rid of all the cruft and rebuild a machine, but the machine has been extremely unstable ever since. A few issues have been the most troublesome:
1) iCloud Drive – as others have mentioned, Apple has had major issues with iCloud Drive during these betas. For me it has shown itself in two major ways. First, I’ve not been able to actually restore all my data from iCloud Drive. iCloud will start the download and then get into what seems like a loop, uploading the same data and never finishing. I keep all my source code in iCloud Drive (and yes, also in Git) so that I can go from one machine to the next and pick up where I was when working on a program. The second issue is that when I’m working in a file, especially in my source code, iCloud Drive seems to want to back up the file under the name “Name #.extension”, littering the drive with multiple copies of files. I have seen upwards of 6 versions of a file. This can cause both Xcode and Git to add a bunch of stray files to my projects. Of course things get confused, and I have to work out which copy is “current” so that I don’t back-level my code.
2) The “open from the internet” issue – while working on my code across machines, Xcode will see that I’ve updated code on my Mojave machine, and since iCloud comes from the internet, it gets confused, deciding that the code I am trying to open is from the internet and causing very long open times for some projects. While this is very, very annoying, so far it hasn’t caused any data loss.
3) To wind down, I like to play World of Warcraft at times, but Apple’s new security model won’t let it run, since it doesn’t have a signing certificate. I’ve reached out to Blizzard multiple times to let them know, but so far no fix is coming.
4) The splitting of iTunes into multiple apps seems to be fine, BUT they have not yet implemented Home Sharing. This is a major showstopper for me, as I have a NAS full of my ripped music, TV shows, and movies. I cannot use Home Sharing to stream them to my MBP, and I am very worried that this will be a big issue for my Apple TV when Catalina goes GA. I’ve raised feedback/a bug report on this one. I hope others raise Cain about it too, as this is a major step backwards.
What are your experiences with Catalina? I’d love to hear how it is working out for you.
I’ve been working on learning SwiftUI and Combine. I figured the best way to do this was to completely rewrite Wasted Time as a native, standalone watch app. To begin with, I am working on the main Meeting screen. The following video shows how cool SwiftUI is in making this UI just “work” on the Apple Watch. Amazing!
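To show what I mean, here is a minimal sketch of the kind of SwiftUI code involved (view and property names are mine, not the shipping app’s):

```swift
import SwiftUI

// A stripped-down stand-in for the main Meeting screen.
struct MeetingView: View {
    @State private var attendees = 2
    @State private var cost = 0.0

    var body: some View {
        VStack {
            Text("Attendees: \(attendees)")
            HStack {
                // SwiftUI re-renders automatically when @State changes.
                Button("-") { if attendees > 1 { attendees -= 1 } }
                Button("+") { attendees += 1 }
            }
            Text(String(format: "$%.2f wasted", cost))
        }
    }
}
```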
I decided to wait a week to write my WWDC wrap-up, not because I haven’t had any time due to the day job, non-profit work, and dealing with the WWDC flu (okay, that’s most of the reason), but also to give myself time to reflect on all the things I saw at WWDC.
The first thing is the new Mac Pro. As promised, I put together a little video from the demo room of the new Mac Pro. Having gotten a degree in journalism and television production as an undergraduate, I fully understand the cost structure of the new Mac. It is a high-end machine designed for a studio. It’s not for you and me, but if you are a movie editor, this is the machine for you. The price is not an issue for a studio, and the power I saw in use in the demo space was amazing. The 28-core machine would render the ENTIRE set of the movie Toy Story (not final rendering, we’ll get to that) on a single machine, allowing you to move around and zoom in to any section to visualize how to set up a shot, etc. For final rendering, they showed an incredibly high-res render that the camera could move around, producing a final shot within 2-5 seconds. Normally a single frame like that would take 2 hours of CPU time to render. Amazing.
I put together the above little video just for fun (on my iPad – but somehow the rendering is messing up and not all of it is showing – oh well, Beta 1 of iPadOS).
I’ve been thinking thru what I would like to work on this summer. There’s so much to do, with the introduction of SwiftUI, Catalyst, and Combine. I am thinking about a few projects:
Create a standalone watch version of Wasted Time – this would let me start from scratch and learn Combine and SwiftUI
Create a Mac version of Wasted Time – this would require that I clean up more code in Wasted Time, and possibly remove all RxSwift support. Doing that would require me to move to Combine to keep the features I have added around Siri Shortcuts.
While I am at it, I would go ahead and swap the UI over to SwiftUI. So, basically, a rewrite.
Completely rewrite my draft Holiday Card app with SwiftUI and change the paradigm. I am thinking that what should happen is: you start by either taking a picture or picking a picture of a card, then (if I can do this on device) machine learning suggests what kind of card it is, and you pick who it is for.
Now I just need to get more hours in the day… between a ton of work at the day job and a ton of work with my non-profit, I hope I get to do at least one of the above items.
Thursday at WWDC ends with the “Bash”. This year the band was Weezer. I’ve liked their music in the past, and they were enjoyable. I didn’t stay for the entire show, since I was tired after an incredibly long day of really good sessions. Oh yeah, and I was carrying my book bag, which had gotten very, very heavy after three hours of standing with it on my back.
So what were all the sessions about today? I started with a session on Data Flow with SwiftUI. It talked about how you would use Combine and SwiftUI together to make a highly reactive UI. The example was based on a watchOS app.
Next I went to the introduction session on Combine. This is Apple’s new framework for asynchronous communication between data streams and UI elements. Basically, this is the Apple-specific replacement for RxSwift. The basic idea is that you have three kinds of pieces: Publishers (which emit values and events), Subscribers (which receive them and do things like update UI elements), and Operators such as .map that transform data along the way (https://developer.apple.com/documentation/combine). Last summer I changed WastedTime to use RxSwift for a similar process. I will be updating it to Combine shortly.
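The anatomy in code, as I understand it so far (a toy pipeline of my own, not the session’s example):

```swift
import Combine

let minutes = PassthroughSubject<Int, Never>()   // publisher: emits values

let subscription = minutes
    .map { $0 * 60 }                             // operator: minutes -> seconds
    .sink { print("\($0) seconds wasted") }      // subscriber: consumes values

minutes.send(90)                                 // prints "5400 seconds wasted"
```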
At 11am, I went to a session on Testing in Xcode. Last year’s session was good, but this year’s was even more detailed, and I got more out of it based on the foundation from last year. One thing I really enjoyed was the fact that at WWDC the testing session was PACKED. Who says developers don’t want to test their code? As a developer, just because it compiles doesn’t mean the work is done. Having a strong test plan, unit tests, UI tests, performance tests, and a continuous integration setup makes for a much better experience with your app. Xcode has done a great job of building this into the tool and enhancing the capabilities every year.
After this I grabbed a very quick lunch in order to come back and listen to Dr. Ayanna Howard talk about robotics and her work at JPL, NASA, Georgia Tech, and now with kids with disabilities. Her work on empathy and bias in robotics was fascinating. The most interesting part was a set of experiments showing that people will trust a robot even when they shouldn’t. They would set up an experiment where a robot brought a person into a room to take a test. During the test, they would set off a fire alarm, and when the person opened the door, the hall would be filled with smoke. They would have the same robot direct the person on how to get out, but it would send people to areas that were obviously wrong. Most people trusted the robot anyway, even standing in the middle of a hallway as if it were the safe space. Amazing!
After this I got some time in the lab with the Siri Intents team. I fixed a bug with Siri Intents in Wasted Time and have a bit better understanding of Siri Shortcuts. We shall see.
The afternoon started with a more detailed look at Combine. The session used NotificationCenter as the sample, and the presenter did touch on Core Data, but it seems that is less of a major match right now; they referenced a “passthrough” type of processing for Core Data. I hope to learn more about this, as my new Holiday Card Tracker uses Core Data.
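From my notes, the NotificationCenter flavor looks roughly like this (the notification name and payload are my own invention):

```swift
import Foundation
import Combine

extension Notification.Name {
    static let meetingEnded = Notification.Name("meetingEnded")
}

// NotificationCenter exposes a built-in Combine publisher.
let cancellable = NotificationCenter.default
    .publisher(for: .meetingEnded)
    .compactMap { $0.userInfo?["cost"] as? Double }  // pull the payload out
    .sink { cost in print("Meeting cost: \(cost)") }

NotificationCenter.default.post(
    name: .meetingEnded,
    object: nil,
    userInfo: ["cost": 1234.56])
```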
The session on integrating with SwiftUI was about having a combination of SwiftUI and non-SwiftUI elements within the same app. Basically, the idea is how to migrate your app, or add SwiftUI to an existing app. I am thinking I should use this in my new app as a way of redoing the “getting started” flow. Perhaps this will let me add that feature without breaking the basic flow. The idea that the data model exists outside of SwiftUI came up again, and it links SwiftUI and Combine in order to create the UI you want in your app: you use @Bindable and @ObjectBinding to connect your data model, and the $datavalue prefix to connect a data element to its binding. A rough sketch of that wiring is below.
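(I’m writing the sketch with the names from the current documentation, where the beta’s @ObjectBinding wrapper appears as @ObservedObject; the model and property names are mine.)

```swift
import SwiftUI
import Combine

// The data model lives outside SwiftUI; @Published values feed the view.
final class MeetingModel: ObservableObject {
    @Published var hourlyRate = 100.0
}

struct RateView: View {
    @ObservedObject var model: MeetingModel

    var body: some View {
        VStack {
            // The $ prefix hands the control a Binding,
            // so edits write back into the model.
            Slider(value: $model.hourlyRate, in: 0...500)
            Text("Rate: \(Int(model.hourlyRate)) per hour")
        }
    }
}
```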
I skipped the next session because I wanted to actually start working through the SwiftUI tutorials. This worked pretty well, though I ran into a few bugs. Check it out yourself at https://developer.apple.com/tutorials/swiftui/interfacing-with-uikit.
The final session of the day was Taking iPad Apps to the Mac and to the Next Level. This session really walked through a lot of the actual differences. The advice Apple kept giving was: if you want a good Mac app, first build a good iPad app, and that makes the mapping a lot easier. While this is great advice, they then showed a lot of preprocessor checks for whether you are running on the Mac before executing certain code (sketched below). This makes sense if you really want to take advantage of Mac-unique functions, but it seemed a bit jarring in the examples. Overall, the design of SwiftUI to enable this level of cross-platform development is very promising. So between SwiftUI, Catalyst, Catalina, and Combine, we will either see a resurgence of new Mac apps or a migration of more Mac apps to the iPad. The Xcode settings and App Store integration were also shown, making it very easy to distribute your Mac (iPad) app via the store or through traditional methods.
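The checks look roughly like this (my own example; the compilation condition is the one documented for Catalyst):

```swift
import UIKit

func configurePlatformExtras() {
    #if targetEnvironment(macCatalyst)
    // Mac-only behavior, e.g. wiring up an NSToolbar or menu items.
    print("Running as a Mac (Catalyst) app")
    #else
    // The regular iPad code path.
    print("Running on iOS/iPadOS")
    #endif
}
```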
The evening ended with the “Bash” … The food was good, the beer was cold, and the band was Weezer. After such a long day, I didn’t make it thru the entire show, as I had to walk back to my car (about a mile away) and then drive back to the hotel. But the crowd had a great time.
Today turned into a pretty productive day. After the problems of Monday and the recoveries of yesterday, I was looking forward to a day when I could just enjoy the sessions and possibly take to the labs to figure things out.
I began the day with a session on Sign In with Apple. I am really looking forward to this being adopted by more websites. The number of websites, apps, and games that require logins drives me nuts. I always create new accounts with email, and refuse to use login with Google or Facebook. Being able to take advantage of Apple’s approach to privacy and its dynamically generated email addresses should make some of the tracking situation better. I am sure the sites will still send emails and put tracking pixels in them.
Apple has made this service available on all of their platforms, and will require that if an app offers sign-in with Google or Facebook, it MUST also offer Sign In with Apple to remain in the App Store. They have also made a JavaScript version available for websites. This will hopefully drive adoption quickly.
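The client side is pleasantly small. Here is a minimal sketch of kicking off the flow (delegate and presentation-context handling omitted for brevity):

```swift
import AuthenticationServices

func startSignInWithApple() {
    let request = ASAuthorizationAppleIDProvider().createRequest()
    request.requestedScopes = [.fullName, .email]   // only ask for what you need

    let controller = ASAuthorizationController(authorizationRequests: [request])
    // controller.delegate = self   // receives the credential or an error
    controller.performRequests()
}
```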
My next session was on implementing Dark Mode. I’ve been working on my Holiday Card Tracker app this week, and after this session I spent lunch completely enabling Dark Mode in it, and in most of Wasted Time too! The basics are very easy to do. The session also showed the process you need to follow if you really want custom colors (beyond just using the default system colors). I will have to do some more reading on that, once I finally create a meaningful design for my apps.
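The heart of the custom-color approach is a trait-aware color. A small sketch (the color values are placeholders, since I have no real design yet):

```swift
import UIKit

// Resolves differently in light and dark appearance, for cases
// the built-in semantic colors don't cover.
let cardBackground = UIColor { traits in
    traits.userInterfaceStyle == .dark
        ? UIColor(white: 0.15, alpha: 1.0)
        : UIColor(white: 0.95, alpha: 1.0)
}
```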
These two sessions got me ready for the SwiftUI Essentials session at 11am. I am really excited about SwiftUI as a way of improving the design and behavior of both of my apps – WastedTime and Holiday Card Tracker. Last year I rewrote WastedTime to use RxSwift. This was a lot of work and not too intuitive for me. Once I finally got it done, the app correctly updates based on Siri, and I have cleaned up some of the code. But the UI is still the same old tired UI. So if I can switch to SwiftUI, I may actually improve the look and feel. And Holiday Card Tracker is such a basic app that I should be able to just “switch” it to SwiftUI right now, and thereby fix the image view problems I have going on. We shall see.
I then had lunch and implemented Dark Mode!!
After lunch, there was another session on SwiftUI, this time on watchOS. There were some really great examples of how to sort and store data for the Watch, but my head was stuck on a crash I’ve been chasing in my Holiday Card Tracker app. So I stepped out and went to another lab. The lab was a great success, as the Apple engineer confirmed that my on-device debugging was right. I had been thinking that unwrapping a nil could only crash on the right side of an assignment operator. I was wrong, and I was able to fix the app. I have posted a new version to TestFlight, but it is currently dependent on iOS 13, so I can’t have my testers use it. I will try to fix that later this week.
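In case it saves someone else a lab appointment: a force unwrap traps on the left side of an assignment just as readily as on the right (hypothetical types, not my actual model):

```swift
struct Address { var city: String }
struct Contact { var address: Address? }

var contact = Contact(address: nil)

// This crashes at runtime: address! is unwrapped on the LEFT side
// before the write can happen.
// contact.address!.city = "Cupertino"

// The safe version writes only when there is something to write into:
contact.address?.city = "Cupertino"   // silently does nothing here
```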
The next session I sat in on was all about this year’s Apple Design Award winners and what made their apps special. What an incredible talk. It really changes the way I am looking at my own code. Then at dinner tonight I was talking to a developer from the UK who said that when he saw this same talk last year, he completely redesigned his app. I think I should take another look at how the Holiday Card Tracker actually works.
The final session today was on modernizing your UI for iOS 13. This talk went thru multiple key changes in the UI, from both a design and a function perspective. Many things come by default when you update your app to iOS 13, but to really take advantage of the power of the new version, there is work to be done. It’s sessions like these that make me wish I were independently wealthy and could spend my days coding. Not that I don’t get time to code now, but not as much as I’d like.
All in all today was great! I am excited to get some time this summer to do the work on my apps.
Well, the team at the install lab were awesome. I wish I had left the machine alone when it went belly up, for multiple reasons: I wouldn’t have spent most of Monday night getting frustrated, I wouldn’t have lost any data, and I would have given them a problem to debug. Instead, I got a rebuild of my machine with a full install of the beta. I was able to finish setting up the machine, restore all my data from iCloud backups, etc. I was then able to update my iPad and iPhone, and reconnect my Watch to the iPhone. So by about 10:45 this morning everything was updated as it should be.
I then got into the 11 o’clock session on building your first SwiftUI application. Unfortunately, my machine was still restoring all my data, so I wasn’t able to follow along; however, it became clear to me that this will be the replacement for RxSwift. That realization would play out thru many other sessions during the day.
A few cool things about SwiftUI: it is the new UI framework for Swift, built in Swift. It lets you connect up your data (by making it “Identifiable”), so that when something changes in the data or in the UI, the other side updates. When working in SwiftUI, you work simultaneously in code and in a live, simulator-like preview. This looks amazing, and I can’t wait to update WastedTime to use this framework. Doing so will simplify the code a lot and will also let me remove an external framework.
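A tiny sketch of what “Identifiable” buys you in practice (illustrative names):

```swift
import SwiftUI

// A stable id lets SwiftUI diff the data and update rows on its own.
struct Attendee: Identifiable {
    let id = UUID()
    var name: String
}

struct AttendeeList: View {
    @State private var attendees = [Attendee(name: "Pat"),
                                    Attendee(name: "Sam")]

    var body: some View {
        List(attendees) { attendee in
            Text(attendee.name)
        }
    }
}
```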
I spent lunch talking with people, working thru more updates on my machines, and helping a few other people get thru some of the same problems I had.
At 2pm, I went to the session on updating an iPad app for the Mac. I was so excited for this one and had fired up a copy of WastedTime to test it out. At the simplest, you just flip a bit in Xcode and build the app for the Mac. Well, since I am using RxSwift, Xcode tried to recompile it for the Mac, and it failed: RxSwift uses many old APIs that are no longer supported there. So I will have to update to SwiftUI before this will work. I am looking forward to making it work; it will be one of my summer projects. Given that I only added RxSwift in the last year, removing it shouldn’t be too hard (I hope).
The next session was on creating an independent watch app. A few years ago, I tried to add Watch support to Wasted Time. I ultimately gave up on it after realizing that watchOS really wasn’t ready. Now I think it is ready – and, for that matter, so am I. SwiftUI will make the UI much easier to develop, and my updates to WastedTime to use a more realistic object model, along with a better understanding of the whole platform, will make this so much easier.
The last session I went to was about RealityKit and Reality Composer.
RealityKit expands on ARKit to provide even more realistic lighting, physics, and scenes. The amount of code you need for a RealityKit app is pretty minimal if you use the right tools to build the packages. The number of things Apple has done to make models and scenes more realistic is pretty amazing. I am always fascinated by AR, and in my day job I believe that AR is critical for the Internet of Things. My basic belief is that AR and voice are THE new UI of the future. In this session there was no discussion of SwiftUI.
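To show how minimal, here is about the smallest RealityKit scene I can sketch (a box anchored to a detected surface; my own example, not Apple’s sample code):

```swift
import RealityKit
import UIKit

let arView = ARView(frame: .zero)

// A small, lit, physically rendered box.
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .systemBlue, isMetallic: true)]
)

// Anchor it to the first horizontal plane ARKit finds.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```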
I ended up skipping the 5pm session because I got an appointment to go check out the new Mac Pro.
The area was set up with multiple workstations showing all the ways you can use the new Mac Pro. The highest-end machine in the demo area had a 28-core CPU and was doing studio-level image rendering at basically real time. Final rendering would still be 15 minutes per image, versus the 22 hours that many studios need right now. Amazing!
I took some videos, and I hope to put them together into a little showcase.