Open Sourcing Greeting Keeper

My post announcing that Greeting Keeper was available somehow got stuck in draft, so I am posting a second blog entry to talk about the challenges the app has had and what I am trying to do about them.

First, shortly after releasing the app, and buying the first copy myself, I put out a quick TestFlight fix to address a problem when there are too many cards in the Gallery: my gallery view didn’t scroll! I took the time to properly add a ScrollView and also add the name of the card to the view. This made the Greeting Card picker so much nicer! The unfortunate thing was that the app suddenly started having a background crash. I have not yet figured that one out, as it only happens when you are not doing anything.

Second, I discovered that the GA version of the code introduced a very frustrating bug. One feature I had added in the shift to SwiftData was editing. This would allow you to edit the card for a specific recipient and the descriptive data about any specific card in the gallery. Suddenly SwiftUI started processing background updates that caused the view to commit and return as soon as you tried to change anything, which effectively disabled editing!

To address the second issue, I got some great feedback in my Swift Slack channels: re-implement local state variables in the view and manually process the edit. That did fix the problem, but after that I started seeing major performance issues.
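A minimal sketch of that workaround, assuming a hypothetical Recipient model: the view edits local @State copies and only writes them back to the model when you tap Save, so background SwiftData updates can’t kick the view out mid-edit:

import SwiftUI
import SwiftData

// Hypothetical model, for illustration only.
@Model final class Recipient {
    var name: String
    var address: String
    init(name: String, address: String) {
        self.name = name
        self.address = address
    }
}

struct EditRecipientView: View {
    @Environment(\.dismiss) private var dismiss
    var recipient: Recipient

    @State private var name = ""        // local copies of the model values,
    @State private var address = ""     // insulated from background updates

    var body: some View {
        Form {
            TextField("Name", text: $name)
            TextField("Address", text: $address)
            Button("Save") {
                recipient.name = name       // manually commit the edit
                recipient.address = address
                dismiss()
            }
        }
        .onAppear {                         // seed the local state once
            name = recipient.name
            address = recipient.address
        }
    }
}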

I am sure the performance issues are not related to the edit feature, but to the fact that I finally started loading my real history of data into the system. Now whenever you try to load a view with more than a few cards, the app hangs while loading. To that end, I have been trying to adopt SwiftUI’s AsyncImage so that image loads can happen in the background, giving the app a snappier feel. However, that doesn’t seem to be working at this time.
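For reference, this is roughly the shape of the AsyncImage usage I am after, assuming each card can expose a URL for its image (the imageURL property is hypothetical):

import SwiftUI

// The image loads asynchronously while a spinner holds its place.
struct CardThumbnail: View {
    let imageURL: URL?

    var body: some View {
        AsyncImage(url: imageURL) { image in
            image
                .resizable()
                .scaledToFit()
        } placeholder: {
            ProgressView()   // shown until the load finishes
        }
    }
}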

I think the big issue is that SwiftData is loading all the columns of data on the fetch, and I should exclude the image data. I can then do a separate fetch of the image data within the AsyncImage view. There may be better ways to address this issue, and as such I am going to let others take a look at the code, which is currently hosted on GitHub.
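The idea would look something like this sketch, with a hypothetical GreetingCard model: fetch only the lightweight columns via propertiesToFetch and leave the image blob for a later, separate fetch:

import SwiftData

// Hypothetical model; cardFront holds the large image blob.
@Model final class GreetingCard {
    var cardName: String
    var cardFront: Data?
    init(cardName: String) { self.cardName = cardName }
}

func fetchCardList(context: ModelContext) throws -> [GreetingCard] {
    var descriptor = FetchDescriptor<GreetingCard>(
        sortBy: [SortDescriptor(\.cardName)]
    )
    descriptor.propertiesToFetch = [\.cardName]   // leave \.cardFront out of the fetch
    return try context.fetch(descriptor)
}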

If you are interested in taking a look and helping me improve this app, please drop me a line at Michael Rowe

Greeting Keeper Released

Well, after writing, rewriting, refactoring, and replatforming for six years, I have finally released a personal project to the App Store. Greeting Keeper has gone thru UIKit, SwiftUI, CoreData, CloudKit, and SwiftData to become a pretty useful app. It does, however, have two major bugs that crept in during the release, and I am trying to deal with them when I can.

The bugs are as follows:

  1. If you add a new card to your card gallery while viewing the gallery for that type of card, the UI freezes up. I can recreate this bug every time, but I can’t yet figure out what is causing it.
  2. My various edit screens will let you change one selection and then automatically return you to the higher menu; or, if it is a text field that you want to type into, then when you select the text field, it automatically returns you to the prior screen.

Both of these bugs make the app unusable for the average user. To discourage anyone from downloading it, I made it a paid app, and so far only one person has bought it… ME. Which is fine.

So this is the weirdest launch post ever: basically I am telling people not to download the app. Not yet. I need to resolve these two issues first.

A VisionPro-ductive Week

It’s been a solid week now with the Apple Vision Pro. When I got it last Saturday, I provided a before-and-after view of the experience as part of my weekly podcast, Games At Work dot Biz (you can find it here: Episode 452 – Before and After). Since then, in order to see if it can be a productive device, I have tried to spend as much time as possible using the headset. In this post, I hope to give you a little look into how that experience has been.

Using Business Applications on the Apple Vision Pro

Let’s begin with how I was able to use the Apple Vision Pro for my day job. I currently work in a large multinational company, and in my capacity as a Technical Strategist, I spend at least half my day on Webex meetings. To that end, I installed the native Webex Vision Pro app and surprised a few teammates by joining in with my persona. When I tried it on Monday, a few people told me that I looked angry; I guess resting persona face is really not flattering.

Persona Version 1.0

After the 1.1 upgrade I captured a new Persona and was told that it didn’t make me look as angry. I think adding the glasses helped.

Persona Version 1.1

The most productive aspect of using the Vision Pro was that I could get my work MacBook Pro mirrored and then use the environments to block out all the distractions while I was working on building presentations, doing email, and working on some development tasks.

As someone who is easily distracted by others, being able to really focus allowed me to be more productive. I also liked the ability to have music playing in the Vision Pro without having to wear my AirPods Pro or AirPods Max. It blocked just the right amount of noise, while still allowing me to be aware of my environment.

Doing Development on the Apple Vision Pro

While my day job coding does not include any work on Apple platforms, I do some personal coding in Xcode, including having an app on the Vision Pro (launch app!), so I was really excited to see how easy it was to deploy to the headset. Net-net, it is the same as doing wireless deployment to the iPhone or iPad. The only unique part was how you enable developer mode on the Vision Pro to connect to your development Mac. After enabling the developer setting as you would on any iOS device, while wearing the headset, go to Settings > General > Remote Devices and select your computer. You can then see your Vision Pro in the Devices and Simulators window of Xcode.

I had looked into the Apple Developer Strap (a $299 add-on only available thru the developer program), but I don’t think it is worth the cost at this time. If I were more worried about bricking the device, I believe this is the hardware you’d need to run Apple Configurator and restore the headset. Apple had a similar piece of hardware for early Apple Watches, which was used by the Genius Bar to fix a broken watch.

Reading on the Apple Vision Pro

The other productivity aspect I wanted to hit on is reading manuals and other technical documentation. I am a big user of Calibre, to make sure that any ebook I purchase is readable on any device of my choosing. I buy books from Amazon for the Kindle, from Apple in the Books app, and from independent ebook sellers like Cory Doctorow, who does not put DRM on his books. Depending on how I am working, I like to have access to the books in the most platform-native manner. While the Books app on the Apple Vision Pro is just the iPad version, I figured having a reference book up next to my Mac screen share would be a great test for development.

So far, the iPad app is not quite up to par. It has the same issue I’ve had with other iPad apps: the eye-tracking targets are not consistently visible. I am not sure if this is a VisionOS bug or just bad UI implementation by the developers of iPad apps. I personally had issues with some of the custom buttons in my own iOS app that I ported to VisionOS, and I had to remove some of my custom button designs to allow VisionOS to correctly handle targeting.
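In case it helps anyone else, here is a rough sketch of the direction I took, with hypothetical names: keep a standard Button and add an explicit hover effect, so VisionOS can draw its own gaze highlight instead of fighting a fully custom-drawn control:

import SwiftUI

struct CardButton: View {
    var open: () -> Void

    var body: some View {
        Button(action: open) {
            Image(systemName: "doc.text.image")
                .padding()              // a larger padded area is easier to target
        }
        .hoverEffect(.highlight)        // system gaze highlight on VisionOS
    }
}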

Headset Placement and Comfort

I tried the default Solo Knit Band to start with, and while the headset did pull forward some, it wasn’t too uncomfortable. Once I switched to the Dual Loop Band, though, I could wear the headset as long as I wanted without any discomfort. What did happen, about midway thru the week, was that I started getting a hot forehead, along with a deep red mark. This came about the same time that I tried switching from the W to the W+ Light Seal Cushion. I tried switching back to the W after a day and was still having issues. I then finally realized that I had been slowly tightening the bands to hold the headset more and more stationary. This wasn’t because it felt loose or anything; I guess subconsciously I felt it needed to be tight to the face. After loosening the straps of the Dual Loop Band, things have gotten back to feeling good.

What was strange was that after installing the VisionOS 1.1 beta, the eye tracking seems to have gotten worse. I am having issues with some of the targets being recognized. I am hoping that this is a beta issue and will be fixed soon.

The other comfort issue I wanted to touch on is the Zeiss Optical Inserts. My regular glasses are progressives, and I also have astigmatism. I sent in my prescription, and so far the lenses I got are working great.

Overall Verdict

Right now I am loving the Apple Vision Pro. As I mentioned, I am easily distracted, and the ability to focus on my work is a huge benefit. I have started playing a few of the games, and if you have a big enough space for Fruit Ninja, it is fun. I really enjoyed playing Battleship (reminded me of being a kid again). 3D movies and videos are amazing. And finally, I really need to take more panoramic pictures. I found about 75 in my photo library: some going back to 2009, some hand-stitched together in Photoshop while on vacation in Prague, some taken at a rocket launch in Florida, and every one of them transported me back to the time and place they were taken. I had even taken one the last time I met my brother, his wife, and my sister for a great dinner in Georgia. It felt like we were sitting in the restaurant, enjoying the wine, and getting ready for another great conversation.

Always learning new things

Yesterday I went to #Unwrap Live 2024. This online, all-day SwiftUI programming class by Paul Hudson, of Hacking With Swift fame, was all about the Apple Vision Pro this year.

I love the way Paul explains code, techniques, and APIs. The other valuable thing is that he is willing to rathole on a question someone brings up during the sessions. His knowledge of Apple’s APIs is amazing. During the session yesterday, someone asked about rendering a video on a 3D object, and he was able to add the feature live, explaining how and why you might make certain choices along the way. Amazing.

Well, during the session, Paul explained a few things about Apple’s materials choices for VisionOS. I realized that the version of Wasted Time that I had made available for day-one release would not only violate much of the material design, but could also cause eye strain. Today, I spent time redesigning the UI of Wasted Time Pro, along with all the related versions of the app. I have submitted it to Apple for review, and hopefully the newer version will be available on launch day too. Fingers crossed!

Wasted Time Pro – Launch Date

Well, I’ve finally done it. I have submitted an app that will launch on the same day as a new piece of Apple hardware. I took my simple application, Wasted Time, and did a total rewrite over the last year to better take advantage of SwiftUI and the latest Swift APIs.

Wasted Time version 1.0 (2010)

I changed the underlying architecture to use the new Swift @main entry point and even got rid of the old AppDelegate model I used to use, the launch model that dates back to Objective-C programs.
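For anyone who hasn’t seen it, the @main entry point is pleasantly small; here is a minimal sketch with illustrative names:

import SwiftUI

// The whole app launches from this one struct, with no AppDelegate.
@main
struct WastedTimeApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()          // the app's root SwiftUI view
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("Wasted Time")        // stand-in for the real root view
    }
}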

The basic functionality of Wasted Time hasn’t changed since 2010, though I did remove the Twitter integration a few years back.

Here’s what it will look like on the Apple Vision Pro.

I can’t wait to get ahold of the Apple Vision Pro hardware, so I can actually see my code running outside of the simulator.

SwiftData / SwiftUI Combo Frustrations

The last few posts have been about the rewrite of my personal greeting card application.

This has been a very frustrating effort as I’ve tried to restructure the app’s UI to better adhere to Apple’s design principles, in order to allow the user to edit a recipient’s address and to add their own event types. This has required me to do the following things:

  • Restructure the database itself, going from only two tables:
    • A recipient of a card
    • An instance of a card that a recipient has been sent
  • To four tables (sketched in SwiftData below):
    • An event type, i.e. Christmas, Birthday, etc.
    • A recipient, with name and address
    • A gallery of greeting cards
      • With pictures, event types, a descriptive name, and a URL where you can find the card
    • An instance of a specific card that has been sent to a recipient
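For the curious, here is a rough sketch of what those four tables might look like as SwiftData models; the names, types, and relationship details are illustrative, not the exact shipping schema:

import SwiftData

@Model final class EventType {
    var eventName: String                      // e.g. Christmas, Birthday
    init(eventName: String) { self.eventName = eventName }
}

@Model final class Recipient {
    var name: String
    var address: String
    init(name: String, address: String) {
        self.name = name
        self.address = address
    }
}

@Model final class GreetingCard {
    var cardName: String
    var cardFront: Data?                       // one image, shared by many sends
    var cardURL: URL?                          // where you can find the card
    var eventType: EventType?
    init(cardName: String, eventType: EventType? = nil) {
        self.cardName = cardName
        self.eventType = eventType
    }
}

@Model final class Card {                      // one card sent to one recipient
    var cardDate: Date
    var recipient: Recipient?
    var greetingCard: GreetingCard?            // points at the shared gallery entry
    init(cardDate: Date) { self.cardDate = cardDate }
}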

This approach would allow me to dramatically reduce the size of the database by keeping only one image for each greeting card. It would also dramatically increase load speed, since loading the image would only occur in an AsyncImage (and could be cached, since the same image is used by multiple instances).

Apple is pushing people to start using NavigationStack with a navigationPath, so that you can push to any screen and pop back thru the stack. However, I would like to structure the app with a NavigationSplitView, since it is designed for both iPad and iPhone. The basic idea is to let the user start by picking whether they want to add a recipient, an event type, or a card to the card gallery. Each of those would be a separate NavigationStack, so you would get a list of all the existing items (events, recipients, or greeting cards) in column two, and then a detailed listing of all the specific cards that match in the third column. Simple, right?
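Here is a minimal sketch of that structure, with hypothetical view names, using a sidebar enumeration for the first column:

import SwiftUI

enum SidebarItem: String, CaseIterable, Identifiable {
    case recipients = "Recipients"
    case eventTypes = "Event Types"
    case gallery = "Card Gallery"
    var id: Self { self }
}

struct MainSplitView: View {
    @State private var section: SidebarItem? = .recipients

    var body: some View {
        NavigationSplitView {
            // Column one: pick a section; List selection drives the highlight
            List(SidebarItem.allCases, selection: $section) { item in
                Text(item.rawValue)
            }
        } content: {
            // Column two: the list for the chosen section
            switch section {
            case .recipients: Text("Recipient list")   // hypothetical list views
            case .eventTypes: Text("Event type list")
            case .gallery:    Text("Gallery list")
            case nil:         Text("Pick a section")
            }
        } detail: {
            Text("Matching cards")                      // the LazyVGrid goes here
        }
    }
}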

The third column would be the “Detailed” view. For a recipient, you’d get a LazyVGrid with all the cards they have received, sorted by date. For an event type, you’d get a LazyVGrid with all the cards for all recipients for that specific event, sorted by date. And finally, for the card gallery, you’d get all the cards by type.

While there are still a lot of UX improvements I can make here, the basic structure is done using the original approach.

As I try to figure out the three-column view with the NavigationSplitView, I am wondering if I can still use the NavigationPath. Also, since my split view starts with a list of the enumeration states (I’ve tried both buttons and plain lists), I can’t seem to highlight the selected item, so for now I have added a title in the center column. It’s a little bit better. But for some reason the “Cards Sent” title in the by-recipient view shows up in black; it blinks for a second in the right color.

The other feature I have added is a custom picker for the images in the gallery, filtered by event type.

As you can see, the images show up at the bottom of the window for some reason. It is very frustrating, as I am trying to use a SwiftUI Form to get the nice gray background and grouping of the various sections of information.

VStack {
    Form {
        Section("Card Information") {
            Picker("Select type", selection: $selectedEvent) {
                Text("Unknown Event").tag(Optional<EventType>.none)
                if events.isEmpty == false {
                    Divider()
                    ForEach(events) { event in
                        Text(event.eventName)
                            .tag(Optional(event))
                    }
                }
            }
            DatePicker(
                "Event Date",
                selection: $cardDate,
                displayedComponents: [.date])
        }
    }
    .padding(.bottom, 5)
    // The picker lives outside the Form; inside a Section it collapses (see below)
    if selectedEvent != nil {
        GreetingCardsPicker(
            eventType: selectedEvent ?? EventType(eventName: "Unknown"),
            selectedGreetingCard: $selectedGreetingCard)
    }
}

If I place the GreetingCardsPicker in a Section, it collapses to a single line (completely unusable), so for now I am stuck with the cards all showing up at the bottom. Hopefully I can figure this one out. At the same time, the GreetingCardsPicker is not returning the selected item; as a matter of fact, the .onTapGesture code doesn’t even execute.
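For reference, the usual fix I’ve seen for taps not registering in a grid looks like this sketch (names hypothetical): give each cell an explicit content shape so the whole tile is hit-testable, not just its opaque pixels:

import SwiftUI

struct GalleryGrid: View {
    let cards: [String]                           // stand-in for the card models
    @Binding var selected: String?

    private let columns = [GridItem(.adaptive(minimum: 120))]

    var body: some View {
        LazyVGrid(columns: columns) {
            ForEach(cards, id: \.self) { card in
                Text(card)                        // stand-in for the thumbnail
                    .frame(width: 120, height: 120)
                    .contentShape(Rectangle())    // make the full cell tappable
                    .onTapGesture { selected = card }
            }
        }
    }
}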

Hope to make more progress soon.

Card Tracker Rewrite Update

This weekend I hit a major milestone in the rewrite: all features are working (except edits). I also succeeded in adding a new feature that lets you look at all the cards sent for a specific event, i.e. show me all the Christmas cards by date and recipient.

This new feature is something I’ve been wanting to add for a long time. What’s great about it is that all the hard work I did recently to allow printing all the cards for a single recipient now also works for all the cards for a specific event.

After this update, I finally started moving five years of card history over to the new version of the app. This reinforced a big issue I have been ignoring: space bloat! The app currently takes every card’s picture and attaches it to a specific recipient’s event, meaning that if two people are getting the same thank-you card, it takes up twice as much space. As you can imagine, Christmas cards would take up a ton of space.

To that end, I’ve decided to implement a Card Gallery feature. This will allow a user to take a picture (or scan) of a specific card and then use it for all the people who get the same card. This should dramatically improve performance (thanks to caching by SwiftData) and radically reduce storage requirements.

Of course, adding this feature means I am going to have to adjust the UI and delay my desired release of the app until I get that (and the edit feature) working. Hopefully the next few weeks will allow me to make major progress.

Card Tracker Rewrite

For the last five years or so, I’ve been working on a pet project to track the various greeting cards we send out each year. The initial version had a lot of UI problems, so I used it as an opportunity to learn SwiftUI.

The second major rewrite was to add CloudKit integration so that I could sync my cards across devices. That showcased significant issues with my database design. So I’ve been trying to work thru rewriting it lately with SwiftData.

Before I could do that, I had to build a way to print out all the information in the application. After five years of usage, we had hundreds of cards we’d sent to people, and I didn’t want to lose this data; nor was I sure I’d be able to successfully migrate from CoreData and CloudKit to SwiftData. I also knew that my database redesign was going to be very aggressive, and it was time to clean up the storage requirements.

Today, in less than a day, I was able to finish a significant portion of the problematic aspects of the application. I can now create new card types, list the card types, list all the recipients of the cards, and filter each of these lists.

It has been loads of fun working on this over the course of a weekend, and I am hoping during my upcoming year-end vacation to not only completely rewrite the application, but also to reenter five years of cards. This effort will allow me to validate that the new design is not only faster, but easier to use!

Can’t wait to share more!

Deletes in Holiday Card Tracker

I’ve been working on a greeting card tracking app in my spare time for years now, five to be exact. I may get it on the App Store one day, but primarily it is just used to track the cards my wife and I send to friends and family. I’ve rewritten it a few times, going from UIKit to SwiftUI, and then improving the Swift code to use CoreData with CloudKit syncing. I am hoping to rewrite it fully in SwiftData sometime in the next year, but right now I am struggling with deleting cards correctly.

Conceptually, deleting a child object in a parent-child relationship in CoreData is not hard: you just delete the child, and CoreData should handle the rest. However, I am showing all the children in a LazyVGrid, so that you can get a quick and easy overview of all the cards you have sent to a single recipient.

As you can see above, we have at least six cards visible at once, and each card lets you edit, view in detail, or delete. Each of these “events” is a separate SwiftUI view, and on top of each is the MenuOverlayView:

//
//  MenuOverlayView.swift
//  Card Tracker
//
//  Created by Michael Rowe on 4/16/22.
//  Copyright © 2022 Michael Rowe. All rights reserved.
//

import SwiftUI

struct MenuOverlayView: View {
    @Environment(\.managedObjectContext) var moc
    @Environment(\.presentationMode) var presentationMode

    @State var areYouSure: Bool = false
    @State var isEditActive: Bool = false
    @State var isCardActive: Bool = false

    private let blankCardFront = UIImage(contentsOfFile: "frontImage")
    private var iPhone = false
    private var event: Event
    private var recipient: Recipient

    init(recipient: Recipient, event: Event) {
        if UIDevice.current.userInterfaceIdiom == .phone {
            iPhone = true
        }
        self.recipient = recipient
        self.event = event
    }

    var body: some View {
        HStack {
            Spacer()
            NavigationLink {
                EditAnEvent(event: event, recipient: recipient)
            } label: {
                Image(systemName: "square.and.pencil")
                    .foregroundColor(.green)
                    .font(iPhone ? .caption : .title3)
            }
            NavigationLink {
                CardView(
                    cardImage: (event.cardFrontImage ?? blankCardFront)!,
                    event: event.event ?? "Unknown Event",
                    eventDate: event.eventDate! as Date)
            } label: {
                Image(systemName: "doc.text.image")
                    .foregroundColor(.green)
                    .font(iPhone ? .caption : .title3)
            }
            Button(action: {
                areYouSure.toggle()
            }, label: {
                Image(systemName: "trash")
                    .foregroundColor(.red)
                    .font(iPhone ? .caption : .title3)
            })
            .confirmationDialog("Are you Sure", isPresented: $areYouSure, titleVisibility: .visible) {
                Button("Yes", role: .destructive) {
                    withAnimation {
                        deleteEvent(event: event)
                    }
                }
                Button("No") {
                    withAnimation {
                    }
                } .keyboardShortcut(.defaultAction)
            }
        }
    }

    private func deleteEvent(event: Event) {
        let taskContext = moc
        taskContext.perform {
            taskContext.delete(event)
            do {
                try taskContext.save()
            } catch {
                let nsError = error as NSError
                fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
            }
        }
    }
}

It’s a pretty simple view, with a menu overlay to edit, view, or delete. The delete function just calls CoreData’s delete() and save() methods on the current managed object context. No problems here.

The problem is actually in the parent view. I won’t show all the code, but here is a simplified version of the LazyVGrid:

LazyVGrid(columns: gridLayout, alignment: .center, spacing: 5) {
    ForEach(events, id: \.self) { event in
        HStack {
            VStack {
                Image(uiImage: (event.cardFrontImage ?? blankCardFront)!)
                    .resizable()
                    .aspectRatio(contentMode: .fit)
                    .scaledToFit()
                    .frame(width: iPhone ? 120 : 200, height: iPhone ? 120 : 200)
                    .padding(.top, iPhone ? 2 : 5)
                HStack {
                    VStack {
                        Text("\(event.event ?? "")")
                            .foregroundColor(.green)
                        Spacer()
                        HStack {
                            Text("\(event.eventDate ?? NSDate(), formatter: ViewEventsView.eventDateFormatter)")
                                .fixedSize()
                                .foregroundColor(.green)
                            MenuOverlayView(recipient: recipient, event: event)
                        }
                    }
                    .padding(iPhone ? 1 : 5)
                    .font(iPhone ? .caption : .title3)
                    .foregroundColor(.primary)
                }
            }
        }
        .padding()
        .frame(minWidth: iPhone ? 160 : 320, maxWidth: .infinity,
               minHeight: iPhone ? 160 : 320, maxHeight: .infinity)
        .background(Color(UIColor.systemGroupedBackground))
        .mask(RoundedRectangle(cornerRadius: 20))
        .shadow(radius: 5)
        .padding(iPhone ? 5 : 10)
    }
}

As you can see, it just displays all the events, where each event is a card. Since the delete method lives in the MenuOverlayView defined above, when the overlay delete occurs I should pass back a message to reload the events in the LazyVGrid. But I don’t have that working yet, as I can’t figure out the correct way to do it.
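One direction I am considering, sketched here with hypothetical names, is handing the overlay a closure so the parent owns the delete; if the parent’s ForEach is driven by an @FetchRequest, saving the deletion in the same context should refresh the grid on its own:

import SwiftUI

// The overlay no longer deletes directly; the parent supplies the behavior.
struct DeleteButton: View {
    @State private var areYouSure = false
    var onDelete: () -> Void               // supplied by the parent view

    var body: some View {
        Button {
            areYouSure.toggle()
        } label: {
            Image(systemName: "trash")
                .foregroundColor(.red)
        }
        .confirmationDialog("Are you sure?", isPresented: $areYouSure) {
            Button("Yes", role: .destructive) { onDelete() }
            Button("No", role: .cancel) { }
        }
    }
}

In the parent, the button would be created as DeleteButton { moc.delete(event); try? moc.save() }, keeping the context work next to the @FetchRequest that drives the grid.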

At least this week I was able to update Wasted Time in TestFlight to see if I have fixed an issue with the Gauge complication not displaying its data correctly. So far so good.

Wasted Time’s Latest Rewrite

As is my habit each summer after WWDC, I rewrite my Wasted Time app to try to take advantage of the latest changes for iOS, macOS, tvOS, and watchOS. This summer has been no different; however, I also took some time to completely refactor the application to no longer use the AppDelegate feature of the older Swift programming models. This change was the most dramatic and is still causing me a few problems.

In prior years, the AppDelegate required a different set of code to handle launching on watchOS versus macOS and iOS. I also ended up with lots of compiler directives to handle the fact that on watchOS the delegate was a different type than on iOS. This level of #if os(watchOS) really caused a lot of crud in the code. Getting rid of that, using @State variables and injecting them into the .environment, has made things a lot cleaner. It has also helped me fix a lot of long-running bugs in my statistics page.
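A sketch of what the new entry point looks like, with hypothetical names; the point is that one @main struct now serves every platform, with shared state injected through the environment rather than platform-specific delegates:

import SwiftUI
import Observation

@Observable final class MeetingState {
    var meetingCost = 0.0                  // stand-in for the real statistics
}

@main
struct WastedTimeApp: App {
    @State private var meetingState = MeetingState()

    var body: some Scene {
        WindowGroup {
            MainView()
                .environment(meetingState) // same injection on every platform
        }
    }
}

struct MainView: View {
    @Environment(MeetingState.self) private var meetingState
    var body: some View {
        Text("Cost so far: \(meetingState.meetingCost)")
    }
}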

Another challenge I hope to resolve over the course of the year is the removal of @UIApplicationMain and full conversion to @main in all versions of the app. While I don’t fully have this one worked out in my mind, it is a necessity, since my macOS and iPadOS versions have broken their keyboard handling. The keyboard handling was using App Intents, which I have also messed up this summer, and I am sure solving this problem will give me improved capabilities for my widgets.
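For context, a minimal App Intent looks something like this sketch (the intent name and its behavior are hypothetical); both the keyboard handling and future interactive widgets would call into something like it:

import AppIntents

struct ResetWastedTimeIntent: AppIntent {
    static var title: LocalizedStringResource = "Reset Wasted Time"

    func perform() async throws -> some IntentResult {
        // reset the shared meeting counter here (app-specific logic)
        return .result()
    }
}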

My widget code has been completely rewritten due to the change over to WidgetKit. I am hoping to build an interactive widget and a Live Activity over the course of this year. If I can pull that off, I may actually bump the major version twice this year!

In the meantime, the investment over the last few years in SwiftUI, along with this summer’s change to @main, has allowed me to easily create a native VisionOS version of the app. I can’t wait to get my hands on a Vision Pro to test it out, but I assume I will not have the opportunity to do so before Apple launches the device early next year.