My Apple Watch Series 4 Prediction

So, I’ve been thinking about WWDC’s demos and the rumors that the Watch Series 4 is going to be bigger… so let me be the first to blog this out loud…

The Apple Watch Series 4 will become your Dick Tracy watch. Here’s why I say this: if you watched the keynote, they showed that Apple Messages will be adding multi-party video chat, available both on iOS and on Mojave.

If you add a few millimeters to the size of the Apple Watch, you can fit a front-facing camera into the watch and then use the new UI, which lets you see the current speaker and scroll up and down through the other participants on the video chat.

The crown on the Watch is the perfect interface for this scrolling, and now that the Watch Series 3 has LTE, you can see all the pieces on the board.

Enjoy!

Holiday Card App and other progress

So, I finally have had a few things slow down enough that I could do some more coding. I am working on two projects at once.

The first is adding Siri support to Wasted Time v8.0. This has been really fun, as I am learning about Siri Intents, changing the app to run without a UI, and learning about RxSwift to make many of the events “observable”. This will really clean up the code for Wasted Time. I have added in all my intents and am now fighting with Xcode and CocoaPods to support RxSwift. Hopefully I will solve that issue soon.

In the meantime, I’ve added support for the following features in the Holiday Card Tracking App:

  1. Search (YEAH!! you can now pick people from your contacts list)
  2. Edit (you can now correct addresses on your recipient list)
  3. Full-screen image view (want a better look at the front of the card you sent? Press on the card and it will go full screen)

I have a few other minor updates I want to do, but I am trying to prioritize the work among the many things I am working on:

  1. Pan and zoom of the full-screen image, so you can zoom in on the cost of a card
  2. Saving images in the camera roll, instead of in the app
  3. Picking a picture from the camera roll – this is important if you are sending a ton of the same card for an event; you don’t want to have to take 100 pictures and use up 100 pictures’ worth of storage
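On the Wasted Time side, the “observable” idea I’m chasing with RxSwift looks something like this minimal sketch (the attendee-count stream is my own illustration, not the app’s shipping code):

```swift
import RxSwift

let disposeBag = DisposeBag()

// A stream of attendee counts. The UI (or a Siri intent handler)
// pushes values in, and any interested code observes them.
let attendeeCount = PublishSubject<Int>()

attendeeCount
    .subscribe(onNext: { count in
        print("Meeting now has \(count) attendees")
    })
    .disposed(by: disposeBag)

attendeeCount.onNext(3)   // e.g. a UI button tap
attendeeCount.onNext(4)   // e.g. Siri says "add an attendee"
```

The win is that the meeting logic no longer needs to know whether the event came from a button or from a voice command.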
I am releasing this version without the last three features to my TestFlight team for feedback. Hopefully they will provide some!

Don’t Change your Account on your iMac

When I replaced my old iMac (after its graphics card died), I set up my new one and then migrated the old one via Apple’s account migration tools. Of course I had messed up and could not use my old account name, so in an effort to just get a working iMac I appended -blank to the account name. This got me up and working quickly, and I then deleted the temporary account that I had set up to test the machine (which had my old account name).

So on the US 4th of July holiday I finally had time to rename the account and get rid of the annoying -blank at the end of the account name. Since I was not going to change the home directory, I figured life would be good and I would not have any issues with installed software, etc. Boy, was I wrong.

Much of the non-App Store software required me to re-register (not really a big surprise), and I had to log into the App Store for some of the App Store-based software to re-authenticate with my account. These were not big deals, but when I started my iTunes library, all hell broke loose.

I have a very large library of ripped movies, TV shows, and albums (from my purchased Blu-rays, DVDs, LPs, and cassette tapes), as well as a significant amount of iTunes and Movies Anywhere content – so large, in fact, that my library takes up terabytes of storage on my Drobo5N. When iTunes starts, it points at an iTunes library on the Drobo5N that contains 6-8TB of media content. The first thing that happened was iTunes started re-organizing the library, creating additional copies of the files on the Drobo5N. While I have 6TB of free space, duplicating the library again would likely fill the Drobo5N and cause issues. I canceled the re-organization and discovered that iTunes now couldn’t find any of the content. Every song required me to “locate” the file, which would then scan the drive and still not find the rest of the library. Same with movies and TV shows. I looked, and iTunes had created a new iTunes library file, which was having issues. I figured this was an easy fix – just point back at the old library file. Nope! That made it worse.

After three more trial-and-error fits and starts, I exited iTunes and started looking at my directory structure on the Drobo5N. There were multiple copies of the Music, TV Shows, and Movies directories, all in various stages of completeness. Music was duplicated, TV Shows were duplicated, and Movies had both duplicates and strange directory setups. So over the last weekend, I decided to create a new manual directory structure and de-dupe all the music, movies, and TV shows. This is where the ditto command comes in: ditto -V source_directory destination_directory (the -V will show you each copy as it happens). If you’ve never used the ditto command, it merges directories and keeps all the metadata for the files. If you try to merge directories via the Finder, it has been known to replace the directories, thereby deleting some files you want to keep. The music library took three ditto commands, to merge all the directories together, and two days. The TV Shows were moved in the Finder, as the shows were much more all-or-nothing in the splits. The Movies were done via ditto (a parallel two days).
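For anyone who hasn’t used it, the merge boiled down to invocations like these (the directory names are made-up examples; ditto merges the source tree into the destination, preserving file metadata):

```shell
# Merge a duplicate Music directory into the canonical copy (macOS).
# -V logs every file as it is copied.
ditto -V "/Volumes/Drobo5N/Music copy" "/Volumes/Drobo5N/Music"
ditto -V "/Volumes/Drobo5N/Music 2"    "/Volumes/Drobo5N/Music"

# Sanity-check the merged tree before deleting the duplicates:
diff -rq "/Volumes/Drobo5N/Music copy" "/Volumes/Drobo5N/Music"
```

Only after the diff comes back clean do the duplicate directories get deleted.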

Today, when I have time, I will delete the iTunes “library” file and build a new one, importing all the movies, music, and TV shows into the library. The directory structure will be the already-defined iTunes library, so hopefully it won’t create yet another copy of all the content. If you follow my blog, you will remember the music sort discussion last year. I really, really hope that the metadata is correctly attached to the files and I will not have to do too much cleanup.

The lesson after all of this: don’t rename your account.

Siri Intents and Wasted Time

This past weekend I was able to get Siri to start seeing Siri Shortcuts for Wasted Time. I’ve not implemented the shortcuts themselves yet, but I am “donating” certain actions, so that when iOS 12 launches this fall, you will be able to use your voice with Wasted Time. I am hoping that over the next few weekends I can add the processing for shortcuts and allow you to say, “Hey Siri, start a meeting with X people.” What do you think?
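For the curious, the handler for that phrase would look something like this sketch, assuming a custom StartMeetingIntent generated from an Intents Definition file (the intent, its property, and the MeetingStore call are all hypothetical names of mine):

```swift
import Intents

// Sketch of handling a custom "start meeting" intent (iOS 12 SiriKit).
// StartMeetingIntentHandling is the protocol Xcode generates for a
// custom intent named StartMeetingIntent.
class StartMeetingIntentHandler: NSObject, StartMeetingIntentHandling {
    func handle(intent: StartMeetingIntent,
                completion: @escaping (StartMeetingIntentResponse) -> Void) {
        let attendees = intent.attendeeCount?.intValue ?? 1
        MeetingStore.shared.startMeeting(attendees: attendees) // hypothetical model call
        completion(StartMeetingIntentResponse(code: .success, userActivity: nil))
    }
}
```

The point of routing through a model object here is that the meeting can start with no UI on screen at all.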

Looking for Testers

I am working this summer to add Siri Intents and Siri Shortcuts to Wasted Time. To that end, I am looking for people who are running any version of the iOS 12 beta who would like to be added to a TestFlight group for testing out the new features. Right now I have updated the internals to prepare for the new APIs, and I will need people to help test it to make sure there has been no regression. If you are interested, please let me know – either drop me a comment on the site here, or DM me on Twitter. Thanks!

Adding Siri Intents to Wasted Time

As we finished packing up the warehouse this weekend for the upcoming (unplanned) move of my non-profit, my thoughts turned to how I can upgrade Wasted Time this summer. Obviously, the key item to add to the app is support for Siri Shortcuts. While I could do this via NSUserActivity, I thought that was a stopgap approach, and I should focus on supporting the more forward-looking API, Siri Intents. This focus would also allow me to build my own custom intents.

I have created six intents:

  1. Start a meeting with XX attendees. (This would actually address one of the big shortcomings of the application to date: quickly adding a bunch of attendees to a meeting and starting it.)
  2. Add an attendee.
  3. Remove an attendee.
  4. Quorum Reached.
  5. End a Meeting.
  6. Reset Meeting.

These intents pretty much address all of the major functions, excluding the Tweet function. I did not want to add a reset-history option, as this is more appropriately done within the app itself.

Creating the Siri Intents and donating them to Siri for Siri Shortcuts was pretty trivial. I am trying to decide if Add and Remove Attendee make sense for donating – I will probably remove those donations – but for now I am learning.
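The donation itself is only a few lines. A sketch (StartMeetingIntent and its attendeeCount property are my custom intent’s generated names; INInteraction is the real SiriKit API):

```swift
import Intents

// Donate a shortcut after the user starts a meeting in the app, so
// Siri can learn the pattern and suggest it later.
func donateStartMeeting(attendees: Int) {
    let intent = StartMeetingIntent()
    intent.attendeeCount = NSNumber(value: attendees)
    intent.suggestedInvocationPhrase = "Start a meeting"

    let interaction = INInteraction(intent: intent, response: nil)
    interaction.identifier = "start-meeting"   // lets the app delete this donation later
    interaction.donate { error in
        if let error = error {
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}
```

Setting an identifier up front matters, because it is what you use later if you ever need to delete the donation.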
The bigger issue is all the technical debt that has built up in the app over the seven years since it was first released. I built the app using the standard MVC technique, and since the meeting-started view is designed to handle all of its messages, it doesn’t really lend itself to deep linking. There is a lot of duplicate code, a big no-no in object-oriented design, and the UI really does drive the logic flow.

I believe the best approach for refactoring the app would be to create a Meeting object. This object could be started and stopped, have attendees added and removed, and so on. The UI would just show the state of the object at any given time. This approach would allow me to reduce duplicate code, have a meeting “run” in the background purely from voice, and allow for deep linking so that you could jump to a meeting via Siri Shortcuts.
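A sketch of what I have in mind for that Meeting object (all names here are my own illustration, not shipping code):

```swift
import Foundation

struct Attendee {
    let name: String
}

// The model owns the state; the UI (or a Siri intent handler)
// just calls these methods and renders whatever the state is.
final class Meeting {
    enum State { case notStarted, running, ended }

    private(set) var state: State = .notStarted
    private(set) var attendees: [Attendee] = []
    private(set) var startTime: Date?

    func start() {
        guard state == .notStarted else { return }
        state = .running
        startTime = Date()
    }

    func add(_ attendee: Attendee) { attendees.append(attendee) }

    func remove(named name: String) {
        attendees.removeAll { $0.name == name }
    }

    func end() -> TimeInterval? {
        guard state == .running, let start = startTime else { return nil }
        state = .ended
        return Date().timeIntervalSince(start)   // wasted time, in seconds
    }

    func reset() {
        state = .notStarted
        attendees = []
        startTime = nil
    }
}
```

With this in place, “Hey Siri, end the meeting” and tapping the End button become the same one-line call.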

I am hoping that I can get enough time to pull this off this summer, while still working on my new app – Holiday Card Tracker.

WWDC Day 5 – It’s a Wrap!

Thank you to Apple for putting on a developer conference that kept almost everyone to the end. I am used to going to large enterprise conferences, and usually by day 3 or 4 the crowds really thin out. Today was day 5, and the lunch session was PACKED! The room held 3,000 people, and the breakout sessions both before and after probably held over 1,000 people each and were full too.

WWDC hosted 6,000 developers from around the globe, and there were only about 4 major sessions at any given time, plus labs. The labs were designed to get you one-on-one time with engineers, designers, and App Store admins, while the sessions hit the key capabilities that Apple wants developers to focus on over the next year. What was really cool was that every session also pointed you back to prior years’ WWDC sessions for more information. This lets you build up your knowledge across multiple WWDCs.

Today was a set of sessions that really helped round out my understanding of key technologies. First was a session on using Collections effectively. Collections are language constructs – Arrays, Dictionaries, etc. – that conform to a set of basic protocols, allowing you to get at the various entries. The key thing I learned was that you can’t take for granted that [0] is always the first element, nor can you assume +1 will advance you to the second element in the collection.
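A quick illustration of the “[0] isn’t always first” point, using an ArraySlice, which keeps the indices of the array it came from:

```swift
let numbers = [10, 20, 30, 40]
let tail = numbers[1...]          // ArraySlice<Int>

// tail's first element is 20, but its first valid index is 1, not 0;
// tail[0] would crash at runtime. Use the collection's own indices:
let first = tail[tail.startIndex]                      // 20
let second = tail[tail.index(after: tail.startIndex)]  // 30
```

Writing `tail[tail.startIndex]` (or just `tail.first`) instead of `tail[0]` is exactly the habit the session was pushing.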

Next I went to a session on designing notifications. While I don’t really do notifications in my apps, I may want to one day. Apple has done a lot in iOS 12 to let you better group and respond to notifications. This session gave tips and tricks so that your notifications are more meaningful, engaging, and relevant. One thing that you will see soon is the practice of delivering a quiet notification that just shows up in Notification Center and lets you decide if it is worthwhile. If not, as a user you can kill them easily for that app. And once you say no, the app can’t bug you again… Nice!
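That quiet-delivery behavior maps to the new provisional authorization option in iOS 12. A sketch:

```swift
import UserNotifications

// With .provisional, no permission prompt is shown: notifications
// arrive quietly in Notification Center until the user decides to
// keep them (promote to full delivery) or turn them off.
UNUserNotificationCenter.current().requestAuthorization(
    options: [.alert, .sound, .badge, .provisional]
) { granted, error in
    print("Authorized: \(granted), error: \(String(describing: error))")
}
```

The app gets a foot in the door without interrupting the user, which seems like the right trade.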

One thing that has happened with the new app I am working on is that on iOS 12 it crashes on my iPhone when trying to take a picture. It didn’t crash on the iPad, and the logs that I got from TestFlight didn’t really make sense. The session on understanding crashes and crash logs was amazing! Midway through the session I was going deep into the crash log, identifying the issue, and fixing my app. I posted a new version to TestFlight for my testers. Can’t wait to learn more here!

I had planned on going to the lab to talk to the team about my crash log, but since I had gone to the session, I was able to go to the lunch talk instead. The talk was from the Director of Lighting at Pixar. She took us through how they built the world of the dead in the movie Coco, and how she came to work at Pixar. No pictures or recordings were allowed… but if you want to see her first job at Pixar, go back and watch Monsters, Inc. Look for the leaves waving in the background of one of the nightmare scenes in a kid’s bedroom – see if you can spot them. I’ve been into ray tracing since 1985, when I saw a picture of a train on a track produced via 24 hours of computation on a “super-computer”. We now do that at animation speeds (30-60 frames a second) on our iPhones. This talk was inspiring and engaging.

The last two sessions of WWDC that I went to had to do with adding delight to your apps (via performance improvements) and addressing apps regardless of the size of the display (via UIKit changes). I’ve not fully gone through my notes on them, but both sessions were really well attended and had nuggets that I am going to use in future apps of mine.

Overall, WWDC has been a great experience and I am so glad to have finally made it. I hope I can come again. See ya next year???

WWDC Day 4 – All About Debugging

I spent yesterday in sessions all day looking at Xcode, LLVM, and LLDB and it was amazing.

I had plans to go to the Panic! at the Disco party and beer bash (I even got the wristband confirming I could have a beer), but decided it would be more productive to keep looking at Xcode.

I started the day in the session on building faster in Xcode. The basic idea here was explaining how to set up your build dependencies to ensure that Xcode’s parallel and incremental compiler technology can take advantage of as many cores as you have on your machine. My apps tend to be pretty simple, so this wasn’t a big issue for me; however, understanding how it all holds together will be key as I move to more complex projects and start including more frameworks.

The next session, “What’s New in LLVM”, went into the depths of the compiler itself. The session explained what certain complex error messages meant, and other internals. To be honest, this was the one session this week that pretty much went straight over my head. I am sure that if I spent more time doing very complex projects with multiple languages and frameworks, this session would be very informative. But I am not there… yet.

I took a break from deep technology to get into a session about creating great AR experiences. This was pretty much a best-practices session and went through how Apple builds their AR experiences to make the applications feel natural for users. The most insightful part was a discussion of VR and 2D apps that still use ARKit – not all AR apps have to have a camera into the real world. While I understand the point the speaker was trying to make, I don’t consider an app that watches you raise your eyebrows to be an AR experience.

The next session was “Core Data Best Practices” – this was GREAT! Not only because the speakers took us through enough concepts to help me get my head around Core Data, but because of the guy from Microsoft I was sitting next to, who works on the Outlook app and helped me understand how to start versioning my database. I’ve already edited the app I am working on and put in database versioning, so hopefully my testers will no longer have to deal with database crashes.
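For reference, here is a sketch of the lightweight-migration setup that model versioning buys you (“HolidayCards” is a stand-in for my actual model name):

```swift
import CoreData

// With versioned models, simple schema changes migrate old stores
// automatically when these two options are on.
let container = NSPersistentContainer(name: "HolidayCards")
if let description = container.persistentStoreDescriptions.first {
    description.shouldMigrateStoreAutomatically = true      // on by default, but explicit
    description.shouldInferMappingModelAutomatically = true // lightweight migration
}
container.loadPersistentStores { _, error in
    if let error = error {
        fatalError("Store failed to load: \(error)")
    }
}
```

The key discipline is never editing a shipped model in place – always add a new model version and make the changes there.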

The next part of the build/deploy pipeline I looked at was a session on automating App Store Connect. The App Store Connect team has started exposing much of their process through standard REST APIs. This change is great news and will allow for improved pipelines for adding and removing testers, creating reports, and managing your app. While I don’t need this process yet, it was another session that shows the maturing of the App Store process.
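As a taste of the new API: once you generate an API key in App Store Connect and mint a short-lived JWT from it (the token generation is elided here), the calls are plain REST:

```shell
# List your apps via the App Store Connect API.
# $ASC_TOKEN is a JWT signed with your App Store Connect API key.
curl -s \
  -H "Authorization: Bearer $ASC_TOKEN" \
  "https://api.appstoreconnect.apple.com/v1/apps"
```

Because it is ordinary HTTP plus JSON, any CI system can drive it – no Mac required for the reporting side.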

The penultimate session of the day was on building “Better Apps through Better Privacy”. This is a topic that is near and dear to my heart, and while the speaker did a great job going through the concepts and big ideas, there was not much technical detail in this session. He did point to other sessions that went through the technical details, so to that end it was worthwhile. As with the segment I mentioned earlier in the week on deleting the learning that Siri Shortcuts may have, the number of developers in this session was less than I would like to see. I think the bigger issue is that many of the developers I have talked to this week are focused on the enterprise, and they don’t consider privacy as much for internal-facing apps.

Finally, the most AMAZING session of the day was “Advanced Debugging with Xcode and LLDB”. I’ve been using IDEs for my development since the days of Turbo Pascal and Borland C. I hadn’t realized how set in my ways I’ve become about how a debugger should work. This session showed how you can effectively change your code to test bug fixes through breakpoints (including making screen changes!!). You can even build Python scripts and attach them to your Xcode debugging startup. I had not seen this type of debugging in Xcode before, and was really happy to have gone to this session. (Oh, and a free tip for people who may not have access to the replay video: if you are at a breakpoint, you see a green bar off to the side with three horizontal lines. If you want to skip a line of code, just grab it and move it to where you want to go… but beware, you can put your app into an unknown state.)
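The “change your code from a breakpoint” trick looks roughly like this in the LLDB console (the label name is a made-up example):

```
(lldb) # Evaluate new code in the paused process, e.g. tweak the UI:
(lldb) expression self.titleLabel.text = "Testing a fix"
(lldb) # Flush Core Animation so the change is visible while paused:
(lldb) expression CATransaction.flush()
(lldb) # Load your own Python commands into the debugging session:
(lldb) command script import ~/lldb_helpers.py
```

None of this requires recompiling, which is the whole point: you can audition a fix before touching the source.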

Today is the last day of WWDC, and I can’t wait to see what I learn today!

WWDC and Relay.FM Day 3

I love podcasts, and have way too many that I listen to. I’m going through them lately to re-prioritize. I realize that I’m getting stale with some of them, mostly the TWiT ones – MacBreak Weekly, TWiT, and iOS Today have gone from being informative to being a bit of an echo chamber. Relay.FM has had some really good ones I listen to: Upgrade and Canvas. And I am thinking it is time to give a few other shows on that network a chance.

The reason I bring this up as part of my WWDC thoughts is that last night I went to a live taping of Relay.FM and it was really enjoyable. Here are a few pictures from that show:

Probably not too exciting to see people record a podcast, but it was fun.

The rest of the day was really enjoyable too. I started the morning learning about testing on the Mac for iOS. I really should start building test cases into my personal apps. The ability to kick off multiple parallel tests on different simulators is amazing. Years ago, when I was looking at the mobile industry for my day job, the idea of testing across different phones required a rack of phones and automation tools. We’ve come a long way on this, and it looks amazing.
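The parallel simulator testing can also be driven from the command line. A sketch, assuming a scheme named WastedTime (the flags are from Xcode 10’s xcodebuild):

```shell
# Run the test suite on several simulators in parallel (Xcode 10+).
xcodebuild test \
  -scheme WastedTime \
  -parallel-testing-enabled YES \
  -destination 'platform=iOS Simulator,name=iPhone X' \
  -destination 'platform=iOS Simulator,name=iPad Pro (12.9-inch)'
```

This is the same mechanism the IDE uses, so it slots straight into a CI job.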

Next I went to a few sessions on Siri: one on how to build out the voice of your Siri interactions, which focused on best practices for creating your user interactions, and a second on building Siri Shortcuts on the Siri watch face. These sessions continue to show how important Siri is this year for Apple. The work they are doing on automation with Siri Shortcuts, all the sessions on Siri across different devices, and the fact that the HomePod is out there and HomeKit is now on the Mac tell me that 2018-2019 will be Siri’s coming-out party. I am hoping that it will quickly get to parity with the other voice assistants in the market.

In the afternoon I only went to two sessions, as I wanted some time to work on my app. The sessions were all about performance improvements and best practices. The first was on image and graphics best practices. This session took me through the way the iPhone actually processes images in memory, and the impact on battery and CPU performance. It really helped me understand how to tune an application that is highly graphics-intensive. The app I am working on does process a lot of graphics, and I will look and see if I am handling mine correctly.
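The technique that session drove home is downsampling with Image I/O: decode the image at display size instead of loading the full bitmap into memory. A sketch of the idea as I understood it (my code, not the session’s):

```swift
import ImageIO
import UIKit

// Decode an image from disk at no more than maxPixelSize on its
// longest side, without ever materializing the full-size bitmap.
func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        return nil
    }
    let thumbOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbOptions) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

For an app full of card photos like mine, decoding thumbnails at thumbnail size is where the memory savings are.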

The final session was on Auto Layout. The biggest issue I’ve had with Auto Layout in the past was not really understanding how it actually worked. This session talked about how the Auto Layout engine works: what churn is, how to improve your app’s performance with Auto Layout, and what Interface Builder does for you versus setting up your screen in code. Another great session, one that explained how to address a lot of the issues I’ve had with Auto Layout and hopefully improve how my users interact with my app. After seeing this session, I got so excited that I went back to my hotel room and updated my app yet again, before heading to the Hammer Theatre to listen to the Relay.FM live show mentioned above.
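One churn fix from the session, as I understood it: build constraints once and toggle them, instead of removing and re-adding them on every change. A sketch (the view names are my own illustration):

```swift
import UIKit

final class CardViewController: UIViewController {
    let cardView = UIImageView()
    private var fullScreenConstraints: [NSLayoutConstraint] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        cardView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(cardView)
        // Built once; activated and deactivated as needed – no churn.
        fullScreenConstraints = [
            cardView.topAnchor.constraint(equalTo: view.topAnchor),
            cardView.bottomAnchor.constraint(equalTo: view.bottomAnchor),
            cardView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            cardView.trailingAnchor.constraint(equalTo: view.trailingAnchor)
        ]
    }

    func setFullScreen(_ on: Bool) {
        if on {
            NSLayoutConstraint.activate(fullScreenConstraints)
        } else {
            NSLayoutConstraint.deactivate(fullScreenConstraints)
        }
    }
}
```

Activating a batch with `NSLayoutConstraint.activate` is also cheaper than flipping `isActive` one constraint at a time.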

Can’t wait for today’s sessions. I will be focused on debugging, with sessions on Xcode, Core Data, and crash logs. We also end the day with a performance by Panic! at the Disco.

WWDC Day Two

The best advice I was given by people who have been at WWDC before was to focus on getting time with the engineers. All the sessions are streamed as videos, and you can spend time later to watch the ones you missed. This is excellent advice!

I started the day yesterday by getting time with the Auto Layout engineers. I’ve been working on a new app and having issues laying out a simple address input screen. I got in and was the 10th or so person to talk to the engineers. The guy I was assigned was really strongest in asset catalogs, but he tried to help me, and he got the City, State, ZIP line laid out correctly. I took his advice and went to a development area and started working on my app. I wasn’t able to get much more fixed in Auto Layout, and wanted to hit a few more sessions. I did get some design advice on moving some of my buttons up to the navigation bar.

I then went and sat through a session called “I Have This Idea for an App”. It was a simple getting-started session, which talked you through how to create a simple game. They did a good job of explaining the basics of Xcode, Interface Builder, and even a bit of UITableView. The best part of the session was the list of other sessions you may want to watch to go deeper on other topics.

I made it down to lunch, which I wasn’t able to do on Monday, and sat next to a pair of developers from Romania. We started talking, and I discussed the issues I was having with Auto Layout. One of them lit up, and we spent the next 90 minutes or so working through the issues I was having with one of the screens in my new app. It was amazing! He had a great understanding of Auto Layout and of Interface Builder, and he spent time explaining the concepts and why you would do certain things. By the end of lunch the screen was perfect! I decided I would find time later in the day to fix the rest of the app. (By the way, my images now resize to fill the remainder of the screen.)

The afternoon was filled with sessions. I went to the session on Create ML – Apple’s new machine learning framework, which builds models from data you feed it. It was amazingly fast, and it builds much smaller models for your apps. While I’ve not done much ML work in my apps, I am interested in seeing how this helps iOS and Mac apps improve their prediction models. The next session was What’s New in watchOS 5. This session was interesting not in what it said, but in what it implied. They didn’t mention in the session that Series 0 watches will not support watchOS 5 – I read that later in the day – but this has to be because of all the new Siri functions that are coming to the Watch. Apps will be able to surface their own shortcuts and intents on watchOS even if the app doesn’t run on the watch. We are going to get much better integration of ML and Siri. That led me to the final session I went to in the afternoon: how to create your own Siri Shortcuts. This was great! Last summer I tried to add Siri to my Wasted Time app, with the idea of simply allowing someone to say “Start a meeting” and launch my app. Now I can do it – though the likelihood of it showing up automatically for people is going to be pretty light.
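To show how small the Create ML workflow is, here is a sketch of training an image classifier in a macOS playground (the paths are hypothetical; each subfolder of the training directory becomes a class label):

```swift
import CreateML
import Foundation

// Train an image classifier from a folder of labeled images and
// write out a Core ML model ready to drop into an app.
let trainingDir = URL(fileURLWithPath: "/Users/me/CardPhotos")
let classifier = try MLImageClassifier(
    trainingData: .labeledDirectories(at: trainingDir)
)
try classifier.write(to: URL(fileURLWithPath: "/Users/me/CardClassifier.mlmodel"))
```

Because Create ML builds on top of models already in the OS, the .mlmodel file it emits stays small.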

Shortcuts and intents are registered by an app, and every time a user performs certain functions in your app, you “donate” this knowledge to the OS. This allows Siri to “learn” patterns and ultimately predict what a user may want to do. The most interesting part was that Apple showed how you can “unlearn”, or delete, the learned data. The number of developers who left during this topic was interesting. I wonder if GDPR will get them to pay attention next year!
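The “unlearn” piece maps to real SiriKit calls. A sketch (the identifier string is a made-up example of one you would have set when donating):

```swift
import Intents

// Remove specific donations by the identifiers set at donation time:
INInteraction.delete(with: ["start-meeting"]) { error in
    if let error = error { print("Delete failed: \(error)") }
}

// Or wipe everything the app has ever donated:
INInteraction.deleteAll { error in
    if let error = error { print("Delete-all failed: \(error)") }
}
```

This is exactly the hook a GDPR-style “forget me” request needs, which is why the walkouts during that segment surprised me.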

I grabbed a pizza on the way back to my room and then spent the evening relaxing. Today’s going to be great…