This past weekend I was able to get Siri to start seeing Siri Shortcuts for Wasted Time. I've not implemented the shortcuts themselves yet, but I am "donating" certain actions, so that when iOS 12 launches this fall, you will be able to use your voice with Wasted Time. I am hoping that over the next few weekends I can add the processing for shortcuts, and allow you to say "Hey Siri, start a meeting with X people". What do you think?
Looking for Testers
I am working this summer to add Siri Intents and Siri Shortcuts to Wasted Time. To that end, I am looking for people who are running any version of the iOS 12 beta and who would like to be added to a TestFlight build for testing out the new features. Right now I have updated the internals to prepare for the new APIs, and will need people to help test and make sure there have been no regressions. If you are interested please let me know… either drop me a comment on the site here, or DM me on Twitter. Thanks!
Adding Siri Intents to Wasted Time
As we finished packing up the warehouse this weekend for the upcoming, unplanned move of my non-profit, my thoughts turned to how I can upgrade Wasted Time this summer. Obviously, the key item to add to the app is support for Siri Shortcuts. While I could do this via NSUserActivity, that would be a stopgap approach, and I should focus on supporting the more forward-looking API: Siri Intents. This focus would also allow me to build my own custom intents.
I have created six intents:
- Start a meeting with XX attendees. (This would actually address one of the big shortcomings of the application to date: quickly adding a bunch of attendees to a meeting and starting it.)
- Add an attendee.
- Remove an attendee.
- Quorum Reached.
- End a Meeting.
- Reset Meeting.
These intents pretty much address all of the major functions, excluding the Tweet function. I did not want to add a reset-history option, as this is more appropriately done within the app itself.
Creating the Siri Intents and donating them to Siri for Siri Shortcuts was pretty trivial. I am trying to decide if Add Attendee and Remove Attendee make sense to donate, and I will probably remove those donations, but for now I am learning.
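To give a flavor of what a donation looks like, here is a minimal sketch. It assumes a custom intent definition file from which Xcode generates a class; the `StartMeetingIntent` name and its parameter are my own guesses, not the actual names in Wasted Time.

```swift
import Intents

// Hypothetical generated intent class from an .intentdefinition file.
let intent = StartMeetingIntent()
intent.suggestedInvocationPhrase = "Start a meeting"

// Donating the interaction tells Siri the user just performed this action,
// so it can suggest the shortcut later.
let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Donation failed: \(error)")
    }
}
```

The donation call is fire-and-forget; the app donates every time the user actually performs the action, and Siri builds its predictions from the accumulated donations.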
The bigger issue is all the technical debt that has built up in the app over the 7 years since it was first released. I built the app using the standard MVC technique, and since the meeting-started view is designed to handle all of its messages, it doesn't really lend itself to deep linking. There is a lot of duplicate code, a big no-no in object-oriented design, and the UI really does drive the logic flow.
I believe the best approach for refactoring the app would be to create a Meeting object. This object could be started and stopped, and could add and remove attendees, etc. The UI would just show the state of the object at any given time. This approach would allow me to reduce duplicate code, have a meeting "run" in the background purely from voice, and allow for deep linking so that you could jump to a meeting via Siri Shortcuts.
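The refactoring described above could start as small as this. All of these names are my own invention, a sketch of the idea rather than the app's actual code:

```swift
import Foundation

// A minimal model object that owns meeting state, so the UI (or Siri)
// just reflects it rather than driving the logic.
final class Meeting {
    private(set) var attendeeCount = 0
    private(set) var startDate: Date?
    private(set) var isRunning = false

    // Start a meeting with a batch of attendees at once.
    func start(withAttendees count: Int) {
        attendeeCount = count
        startDate = Date()
        isRunning = true
    }

    func addAttendee() { attendeeCount += 1 }

    func removeAttendee() { attendeeCount = max(0, attendeeCount - 1) }

    // End the meeting and return its elapsed duration in seconds.
    func end() -> TimeInterval {
        isRunning = false
        guard let start = startDate else { return 0 }
        return Date().timeIntervalSince(start)
    }

    func reset() {
        attendeeCount = 0
        startDate = nil
        isRunning = false
    }
}
```

Each of the six intents would then map onto one method of this object, which is exactly what makes voice-only operation and deep linking possible.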
I am hoping that I can get enough time to pull this off this summer, while still working on my new app – Holiday Card Tracker.
WWDC Day 5 – It’s a Wrap!
Thank you to Apple for putting on a developer conference that kept almost everyone to the end. I am used to going to large enterprise conferences, where usually by day 3 or 4 the crowds really thin out. Today was day 5, and the lunch session was PACKED! The room held 3,000 people, and the breakout sessions both before and after probably held over 1,000 people and were full too.
WWDC was 6,000 developers from around the globe, and there were only about 4 major sessions at any given time, plus labs. The labs were designed to get you one-on-one time with engineers, designers, and App Store admins, while the sessions hit the key capabilities that Apple wants developers to focus on over the next year. What was really cool was that every session also pointed you back to prior years' WWDC sessions for more information. This lets you build up your knowledge across multiple WWDCs.
Today was a set of sessions that really helped round out my understanding of key technologies. First was a session on using collections effectively. Collections are language constructs, Arrays, Dictionaries, etc., that conform to a set of basic methods, allowing you to get to various entries. The key thing I learned was that you can't take for granted that [0] is always the first element. Nor can you assume +1 will advance you to the second element in the collection.
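A quick way to see this for yourself is with an `ArraySlice`, which shares its indices with the array it came from, so index 0 may not even be valid:

```swift
let numbers = [10, 20, 30, 40, 50]
let tail = numbers[2...]   // an ArraySlice over the last three elements

// tail[0] would crash: the slice's first valid index is 2, not 0.
let first = tail[tail.startIndex]
// Use index(after:) rather than +1; not every collection has integer indices.
let second = tail[tail.index(after: tail.startIndex)]
```

Here `first` is 30 and `second` is 40, even though the slice "feels" like a fresh three-element array. The same `startIndex`/`index(after:)` discipline is what makes code work on String and other non-integer-indexed collections.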
Next I went to a session on designing notifications. While I don't really do notifications in my apps, I may want to one day. Apple has done a lot in iOS 12 to allow you to better group and respond to notifications. This session gave tips and tricks so that your notifications are more meaningful, engaging, and relevant. One thing that you will see soon is the practice of delivering a silent notification that just shows up in Notification Center and then lets you decide if it is worthwhile. If not, as a user you can kill notifications easily for that app. And once you say NO, the app can't bug you again… Nice!
One thing that has happened with the new app I am working on is that on iOS 12 it crashes on my iPhone when trying to take a picture. It didn't crash on the iPad, and the logs that I got from TestFlight didn't really make sense. The session on understanding crashes and crash logs was amazing! Midway thru the session I was going deep into the crash log, identifying the issue, and fixing my app. I posted a new version to TestFlight for my testers. Can't wait to learn more here!
I had planned on going to the lab to talk to the team about my crash log, but since I had gone to the session, I was able to go to the lunch talk instead. The talk was from the Director of Lighting at Pixar. She took us through how they built the world of the dead in the movie Coco, and how she came to work at Pixar. No pictures or recordings were allowed… but if you want to see her first job at Pixar, go back and watch Monster's Inc. Look for the leaves waving in the background of one of the nightmare scenes in a kid's bedroom. See if you can focus on it. I've been into ray-tracing since 1985, when I saw a picture of a train on a track produced by 24 hours of computation on a "super-computer". We now do that at animation speeds (30-60 frames a second) on our iPhones. This talk was inspiring and engaging.
The last two sessions of WWDC that I went to had to do with adding delight to your apps (via performance items) and addressing apps regardless of the size of the display (via UIKit changes). I've not fully gone thru my notes on them, but both sessions were really well attended and had nuggets that I am going to use in future apps of mine.
Overall, WWDC has been a great experience and I am so glad to have finally made it. I hope I can come again. See ya next year???
WWDC Day 4 – All About Debugging
I spent yesterday in sessions all day looking at Xcode, LLVM, and LLDB and it was amazing.
I had plans to go to the Panic! at the Disco party and beer bash (I even got my wristband to confirm I could have a beer), but decided it would be more productive to keep looking at Xcode.
I started the day in the session on building faster in Xcode. The basic idea here was explaining how to set up your build dependencies to ensure that Xcode's parallel and incremental compilation can take advantage of as many cores as you have on your machine. My apps tend to be pretty simple, so this wasn't a big issue for me; however, understanding how it all holds together will be key as I move to more complex projects and start including more frameworks.
The next session, "What's New in LLVM", went into the depths of the compiler itself. The session explained what certain complex error codes meant, and other internals. To be honest, this was the one session this week that pretty much went straight over my head. I am sure that if I spent more time doing very complex projects with multiple languages and frameworks, this session would be very informative. But I am not there… yet.
I took a break from deep technology to get into a session about creating great AR experiences. This was pretty much a best-practices session and went thru how Apple builds their AR experiences to make the applications feel natural to users. The most insightful part was a discussion of VR and 2D apps that still use ARKit. Not all AR apps have to have a camera view into the real world. While I understand the point the speaker was trying to make, I don't consider an app that watches you raise your eyebrows to be an AR experience.
The next session was "Core Data Best Practices" – this was GREAT! Not only because the speakers took us thru enough concepts to help me get my head around Core Data, but because the guy from Microsoft I was sitting next to, who works on the Outlook app, helped me understand how to start versioning my database. I've immediately edited the app I am working on and put in database versioning, so hopefully my testers will no longer have to deal with database crashes.
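For anyone curious what opting into versioned migrations looks like, here is a sketch. It assumes an `NSPersistentContainer` model named "Model" (my placeholder, not the app's real name); with these options set, Core Data performs a lightweight migration automatically when you add a new, compatible model version:

```swift
import CoreData

// Placeholder model name; substitute your .xcdatamodeld name.
let container = NSPersistentContainer(name: "Model")

if let description = container.persistentStoreDescriptions.first {
    // Ask Core Data to migrate old stores to the current model version,
    // inferring the mapping between versions where it can.
    description.shouldMigrateStoreAutomatically = true
    description.shouldInferMappingModelAutomatically = true
}

container.loadPersistentStores { _, error in
    if let error = error {
        fatalError("Failed to load store: \(error)")
    }
}
```

The key habit is never editing a shipped model in place: add a new model version in Xcode, make it the current version, and let the migration options above carry existing users forward.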
The next part of the build/deploy pipeline I looked at was a session on automating App Store Connect. The App Store Connect team has started exposing much of their process through standard REST APIs. This change is great news and will allow for improved pipelines for adding and removing testers, creating reports, and managing your app. While I don't need this yet, it was another session which shows the maturing of the App Store process.
The penultimate session for the day was "Better Apps through Better Privacy". This is a topic that is near and dear to my heart, and while the speaker did a great job of going thru the concepts and big ideas, there was not much technical detail in this session. He did point to other sessions that went through the technical details, so to that end it was worthwhile. As with the Siri Shortcuts session earlier in the week, where developers left during the discussion of deleting learned data, the number of developers in this session was lower than I would like to see. I think the bigger issue is that many of the developers I have talked to this week are focused on enterprise, and they don't consider privacy as much for internal-facing apps.
Finally, the most AMAZING session of the day was "Advanced Debugging with Xcode and LLDB". I've been using IDEs for my development since the days of Turbo Pascal and Borland C. I hadn't realized how set in my ways I've become in how I think a debugger should work. This session showed how you can basically change your code to test bug fixes through breakpoints (including making screen changes!!). You can build Python scripts and attach them to your Xcode debugging startup. I had not seen this type of debugging in Xcode before, and was really happy to have gone to this session. (Oh, and a free tip for people who may not have access to the replay video: if you are at a breakpoint, you'll see a green bar off to the side with three horizontal lines. If you want to skip a line of code, just grab it and move it to where you want to go… but beware, you can put your app into an unknown state.)
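As a sketch of the "change your code through breakpoints" idea, here is what it looks like in the LLDB console (the file name and line number are made up for illustration): a breakpoint runs an expression that mutates live state, then continues, so you test a fix without recompiling.

```
(lldb) breakpoint set --file MeetingViewController.swift --line 42
(lldb) breakpoint command add 1
> expr self.view.backgroundColor = UIColor.red
> continue
> DONE
```

The same console can load your own Python scripts with `command script import`, which is how the custom debugging commands shown in the session are wired into Xcode's startup.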
Today is the last day of WWDC, and I can’t wait to see what I learn today!
WWDC and Relay.FM Day 3
I love podcasts, and have way too many that I listen to. I've been going thru them lately to re-prioritize. I realize that I'm getting stale with some of them, mostly the TWiT network ones; MacBreak Weekly, TWiT, and iOS Today have gotten to the point where they've gone from being informative to being a bit of an echo chamber. Relay.FM has had some really good ones I listen to: Upgrade and Canvas. And I am thinking it is time to give a few other shows on that network a chance.
The reason I bring this up as part of my as WWDC thoughts is that last night I went to a live taping of Relay.FM and it was really enjoyable. Here are a few pictures from that show:
Probably not too exciting to see people record a podcast, but it was fun.
The rest of the day was really enjoyable too. I started the morning learning about testing for iOS on the Mac. I really should start building test cases into my personal apps. The ability to kick off multiple parallel tests on different simulators is amazing. Years ago, when I was looking at the mobile industry for my day job, testing across different phones required a rack of devices and automation tools. We've come a long way on this, and it looks amazing.
Next I went to a few sessions on Siri: one on how to build out the voice of your Siri interactions, which focused on best practices for creating your user interactions, and a second on building Siri Shortcuts on the Siri watch face. These sessions continue to show how important Siri is this year for Apple. The work they are doing on automation with Siri Shortcuts, all the sessions on Siri across different devices, and the fact that the HomePod is out there and HomeKit is now on the Mac, tell me that 2018-2019 will be Siri's coming-out party. I am hoping that it will quickly get to parity with the other voice assistants in the market.
In the afternoon I only went to two sessions, as I wanted some time to work on my app. The sessions were all about performance improvements and best practices. The first was on image and graphics best practices. This session took me thru the way the iPhone actually processes images in memory, and the impact on battery and CPU performance. This really helped me understand how to tune a graphics-intensive application. The app I am working on does process a lot of graphics, and I will look and see if I am handling them correctly.
The final session was on Auto Layout. The biggest issue I've had with Auto Layout in the past was not really understanding how it actually worked. This session talked about how the Auto Layout engine works: what churn is, how to improve your app's performance with Auto Layout, and what Interface Builder does for you versus setting up your screen in code. Another great session that explained how to address a lot of the issues I've had with Auto Layout, and hopefully improve how my users interact with my app. After seeing this session, I got so excited that I went back to my hotel room and updated my app, yet again, before heading to the Hammer Theatre to listen to the Relay.FM live show mentioned above.
Can't wait for today's sessions. I will be focused on debugging, with sessions on Xcode, Core Data, and LLDB. We also end the day with a performance by Panic! at the Disco.
WWDC Day Two
The best advice I was given by people who have been at WWDC before was to focus on getting time with the engineers. All the sessions are streamed as videos, and you can spend time later to watch the ones you missed. This is excellent advice!
I started the day yesterday by getting time with the Auto Layout engineers. I've been working on a new app and having issues laying out a simple address input screen. I was the 10th or so person to get to talk to the engineers. The guy I got assigned to was really good at addressing asset catalogs, and tried to help me. He helped me get the City, State, ZIP line laid out correctly. I took his advice and went to a development area and started working on my app. I wasn't able to get a lot more fixed in Auto Layout, and wanted to hit a few more sessions. I did get some design advice on moving some of my buttons up to the navigation bar.
I then went and sat thru a session called "I Have This Idea for an App". It was a simple getting-started session, which talked you thru how to create a simple game. They did a good job of explaining the basics of Xcode, Interface Builder, and even a bit of UITableView. The best part of the session was the list of other sessions you may want to watch to go deeper on other topics.
I made it down to lunch, which I wasn't able to do on Monday, and sat down next to a pair of developers from Romania. We started talking, and I discussed the issues I was having with Auto Layout. The eyes of one of them lit up, and we spent the next 90 minutes or so working thru the issues I was having with one of the screens in my new app. It was amazing! He had a great understanding of Auto Layout and of Interface Builder. He spent time explaining the concepts, and why you would do certain things. By the end of lunch the screen was perfect! I decided I was going to find time later in the day to fix the rest of the app. (By the way, now my images resize to fill the remainder of the screen.)
The afternoon was filled with sessions. I went to the session on Create ML, Apple's new machine learning framework, which builds models based on data ingestion. It was amazingly fast and built much smaller models for your apps. While I've not done much work in my apps using ML, I am interested in seeing how this helps iOS and Mac apps improve their prediction models. The next session was what's new in watchOS 5. This session was interesting, not in what it said, but in what it implied. They didn't mention in the session that Series 0 watches will not support watchOS 5, but I read that later in the day. This has to be because of all the new Siri functions that are coming to the Watch. Apps will be able to surface their own shortcuts and intents on watchOS even if the app doesn't run on the watch. We are going to get much better integration of ML and Siri. That led me to the final session I went to in the afternoon: how to create your own Siri Shortcuts. This was great! Last summer I tried to add Siri to my Wasted Time app, with the idea of simply allowing someone to say "Start a meeting" and launch my app. Now I can do it, but the likelihood of it showing up automatically for people is going to be pretty low.
Shortcuts and intents are registered by an app, and every time you do certain functions in your app you "donate" this knowledge to the OS. This allows Siri to "learn" patterns and ultimately predict actions a user may want to take. The most interesting part was that Apple did show how you can "unlearn", or delete, the learned data. The number of developers who left during this topic was interesting. I wonder if GDPR will get them to pay attention next year!
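The deletion side of this is a short API call. As a sketch (the group identifier here is a hypothetical example, since donations can be tagged with the IDs of the data they relate to):

```swift
import Intents

// Delete the donations tied to one piece of user data, e.g. one meeting.
INInteraction.delete(with: "meeting-1234") { error in
    if let error = error {
        print("Failed to delete donation: \(error)")
    }
}

// Or wipe every donation this app has ever made.
INInteraction.deleteAll { error in
    if let error = error {
        print("Failed to delete donations: \(error)")
    }
}
```

Wiring this into a "delete my data" path is exactly the kind of thing GDPR pushes apps toward: when the underlying record goes away, the Siri learning about it should go away too.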
I grabbed a pizza on the way back to my room and then spent the evening relaxing. Today's going to be great…
A few keynote thoughts
So, I've had dinner, and am currently skipping the Loop party that I had gotten tickets to. I am tired, exhausted, and really excited. The following picture shows the key things that the Apple team talked about at the keynote for iOS:
Let’s break it down:
The first icon is their focus for iOS 12 – performance. This was a great story for people with older devices. While 50% of people updated to iOS 11 within a week of it being available, and right now over 93% of active iOS devices are running iOS 11 (compared to 6-7% of active Android devices running the current release), it is not necessarily a great experience. Apple is showing significant performance improvements for older devices which upgrade to iOS 12. And they will continue to support all the same devices as iOS 11. The more I thought about this, the more I realized that Apple is working on improving performance so that all their new machine learning services still function. I am really excited to see what this means in practice.
The second icon (going left to right, top to bottom) is about AR. Apple announced a new file format that is supported by major AR vendors, Adobe, and many others. The one that was interesting to see was PTC. I will have to learn more about what they are really doing. The format is much smaller, with all the data you need for high-powered AR objects. Apple also used this to showcase their new Measure app, which allows for AR measuring of real-world 3D objects. Given we are currently remodeling a bathroom, this app should come in handy.
The third icon is major updates to the Photos app; most of this is driven by the upgrades in machine learning that Apple is introducing. You now get much better image recognition and classifiers. I am hoping that the app updates my library soon, so I can search for dragons.
The fourth icon is Siri updates. Basically, Siri is getting better, and not just better: I can now go back and build a "Siri Shortcut" for my Wasted Time application. So we are going to see a lot more apps being able to use Siri.
The fifth icon is a new app (basically, the Workflow team has been integrated into Siri) that lets you build recipes, or workflows, that string together actions. This looks really powerful. The question that was not addressed on stage was: can you interrupt a workflow and have Siri ask for more information? I hope to learn more on this over the summer.
The sixth icon is updates to the Voice Memos app: 1) it will now be on the iPad, and 2) it will sync up the data via iCloud. (Spoiler alert: it will also be on the Mac.)
The seventh icon is the new Apple Books app, or as it used to be called, iBooks. A major UI rewrite, an improved store, and more. I didn't really focus on this, since I tend to have all my books on Kindle.
The eighth icon (first one on the second row) is the News app. Again, UI fixes and better split-view support on iPad (another spoiler: this will also be on the Mac). I have installed it on both iPad and Mac, and so far it doesn't seem to be sharing my settings, etc., but I am sure that is a timing issue.
The ninth icon (by now do you think we are singing the 12 days of Christmas?) is the Stocks app… another UI fix, and support for the Mac.
The tenth icon is Do Not Disturb. They are allowing for much finer-grained control, and more importantly, when you look at your phone during the night it will only show you the time, not any other notifications that have popped up overnight.
The eleventh icon is notifications. You can have them grouped, and you can disable them when they pop up; the people around me were really happy. I have already killed most of the notifications on my devices, so this is not a big deal for me; however, I may turn them back on and see how it works.
The twelfth icon is Screen Time. You can now see how you use apps and control your usage time, as well as your kids'.
The thirteenth icon is Messages. You now have Memoji! This is all about creating your own Animoji, and it looks like loads of fun. It definitely means that I expect all new phones to have Animoji support on front-facing cameras.
The fourteenth and final iOS icon is FaceTime. Here's where it gets great: group FaceTime video calls with up to 32 people! Sounds great to me.
The watch and macOS updates were also great, but I am focused on iOS for now. We can at least say: Walkie-Talkie and Dark Mode. I bet you can figure out which is on what platform.
They also talked Apple TV.
WWDC Keynote First Hand Experience – Quick note.
This is my first year actually getting to WWDC. And I can say, the two hours sitting in the keynote went a lot faster than I expected. I am really excited to download some new code and get things tested out. I will keep my home iMac on the current High Sierra so I can keep working on things over the summer, but I will be updating my devices, as usual. I am really excited to be testing out dark mode for my own personal eyesight, but as a developer I am looking forward to adding my own Siri Shortcuts to my meeting tracking app.
Can’t wait!
I now know the way to San Jose
Ok, I am sure many people use this same joke… but when I got here, I realized the last time I was in San Jose was 2008, for a virtual worlds conference. It's amazing how memories cause you to think things look one way, when they don't really look that way. I guess that is why there are so many problems with eyewitnesses. But that's not why I am here…
Yup… I am here for WWDC 2018! I have picked up my badge and jean jacket. A nice, stylish black denim that I can't wait to wear in about 10 more pounds. Looking at the world of developers, I really don't understand how companies get away with all the skinny sizes. I would wear it, but I can't button it up… maybe soon.
I've already had a great talk with one of the developers of a game; he's made it to WWDC 6 years running. I wonder if, once you go, you're more likely to make it thru the lottery. I hope so…
I had to get up really really early to catch my flight to San Jose. So I am sure I will easily stay on East Coast time and get in line early tomorrow to get a great seat for the keynote. Wish me luck!!