Upcoming


Most digital calendar representations are rooted in the metaphor of a traditional paper calendar. They find themselves constrained by this metaphor instead of being freed by their digital manifestation. How unsatisfying. We started thinking about how calendars help people use their time and, through research, we started to see the calendar differently.

We began asking what kinds of user interface components we could use to help visualize time differently on iOS – that’s when UICollectionView came to mind. UICollectionView is a new component Apple introduced in iOS 6. Collection views are incredibly powerful, allowing developers to more easily visualize data in interesting ways. So far, the component has mainly been used to display grids of photos, but it’s capable of so much more.

We Saw an Opportunity

We’ve been exploring UICollectionView – it’s still very new and its bounds are largely untested. It can organize data in interesting and unique ways. Calendar data is accessible via the iOS SDK – the opportunity to represent a user’s calendar data in a novel collection view was irresistible.
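As a rough sketch of what “accessible via the iOS SDK” means here: on iOS 6, reading a user’s events goes through the EventKit framework – request access, build a predicate over a date range, then fetch. The code below is an illustrative sketch, not the app’s actual data layer.

```objectivec
#import <EventKit/EventKit.h>

// Illustrative sketch: fetch the next 24 hours of events with EventKit.
// On iOS 6+, access must be requested before the store returns any data.
EKEventStore *store = [[EKEventStore alloc] init];
[store requestAccessToEntityType:EKEntityTypeEvent completion:^(BOOL granted, NSError *error) {
    if (!granted) return;

    NSDate *now = [NSDate date];
    NSDate *tomorrow = [now dateByAddingTimeInterval:60 * 60 * 24]; // rough 24-hour window

    NSPredicate *predicate = [store predicateForEventsWithStartDate:now
                                                            endDate:tomorrow
                                                          calendars:nil]; // nil = all calendars
    for (EKEvent *event in [store eventsMatchingPredicate:predicate]) {
        NSLog(@"%@ at %@", event.title, event.startDate);
    }
}];
```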

Meanwhile, we also noticed ReactiveCocoa, a new way to write apps declaratively, was gaining interest from developers around the Internet. We were intrigued and wanted to explore its potential.

Could we represent event data in a way that departs from the paper calendar metaphor while pushing the envelope of iOS technology?

Coming up with a new metaphor for representing calendar data would prove difficult. We were all familiar with the traditional paper calendar, so that’s where we started. We involved strategists and planners to help us explore how we use time in our day-to-day lives. Without their help, we would have found ourselves falling back on traditional calendar metaphors instead of exploring alternative representations of time. We ultimately settled on a presentation that strips away as much noise as possible in order to focus only on upcoming events.

The day/week/month calendar metaphors prevent other, more useful views—such as a detailed look at today’s events, while also providing a gloss on future commitments.
— Jason Snell

Our plan was to remain as focused as possible on the use of time, so we chose to display only the current day. We could have made a calendar replacement, but chose to stay hyper-focused and create a supplement to the system calendar.

Eventually, we decided to represent a single day’s timeline linearly. However, we also wanted to show an entire day at a glance. The interface would divide a day into 24 equal segments, one for each hour. When the user touched the interface, the hours surrounding their touch would grow while other hours would shrink. This allowed us to display the event title in the expanded row. We abandoned the traditional scrolling interface metaphor in favour of scrubbing over events to see their details.

The decision to display only the current day presented a problem: what if someone has an early meeting the next morning? How can we provide that information to them while only focusing on the current day?

To solve this problem, we let the user pull up from the bottom of the screen to reveal tomorrow’s first event. As a clue to the feature’s existence, that event peeks up from the bottom at the end of the day.

Getting Our Hands Dirty

Sometimes we wanted UICollectionView to behave in a certain way, only to quickly discover its shortcomings. Specifically, we ran into limitations on the interaction design – there are some things UICollectionView just wasn’t designed to do. By iterating between design and development, we explored what was possible with UICollectionView and prototyped different interaction models. The design led the development while the development informed the design.

Additionally, there were some performance issues regarding UICollectionView’s own implementation. These are just limitations of the framework that we were using – we couldn’t work around them – so we did our best to optimize every other part of the application.

One of the effects our designers wanted was a Gaussian blur as a menu was exposed – this would provide a sense of layered depth to the interface. Gaussian blurs are not a part of UIKit or Core Animation, so we relied on some open source code. Instead of applying a blur effect to the layer, which is not supported, we take a screenshot of the layer we want to blur. Using Core Image filters, we blur and darken the screenshot to create a new image. Finally, we insert the new image into our view hierarchy, modifying its opacity as the menu is opened or closed.

// Originally created by Bryan Clark: https://github.com/bryanjclark/ios-darken-image-with-cifilter

#import <CoreImage/CoreImage.h>

@implementation UIImage (Blur)
 
+(UIImage *)darkenedAndBlurredImageForImage:(UIImage *)image {
    CGFloat scaleFactor = 1.0f;
 
    if (AppDelegate.device == TLAppDelegateDeviceIPhone3GS || AppDelegate.device == TLAppDelegateDeviceIPhone4) {
        scaleFactor = 0.25f;
    } else if (AppDelegate.device == TLAppDelegateDeviceIPhone4S) {
        scaleFactor = 0.5f;
    }
 
    // Downscale the input uniformly; the blur below is cheaper on a smaller image
    CIImage *inputImage = [[[CIImage alloc] initWithImage:image] imageByApplyingTransform:CGAffineTransformMakeScale(scaleFactor, scaleFactor)];
 
    CIContext *context = [CIContext contextWithOptions:nil];
 
    //First, create some darkness
    CIFilter *blackGenerator = [CIFilter filterWithName:@"CIConstantColorGenerator"];
    CIColor *black = [CIColor colorWithString:@"0.0 0.0 0.0 0.92"];
    [blackGenerator setValue:black forKey:@"inputColor"];
    CIImage *blackImage = [blackGenerator valueForKey:@"outputImage"];
 
    //Second, apply that black
    CIFilter *compositeFilter = [CIFilter filterWithName:@"CISourceOverCompositing"];
    [compositeFilter setValue:blackImage forKey:@"inputImage"];
    [compositeFilter setValue:inputImage forKey:@"inputBackgroundImage"];
    CIImage *darkenedImage = [compositeFilter outputImage];
 
    //Third, blur the image
    CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blurFilter setDefaults];
    [blurFilter setValue:@(2 * scaleFactor) forKey:@"inputRadius"];
    [blurFilter setValue:darkenedImage forKey:kCIInputImageKey];
    CIImage *blurredImage = [blurFilter outputImage];
 
    CGImageRef cgimg = [context createCGImage:blurredImage fromRect:inputImage.extent];
    UIImage *blurredAndDarkenedImage = [UIImage imageWithCGImage:cgimg];
    CGImageRelease(cgimg);
 
    return blurredAndDarkenedImage;
}
 
@end

This is an expensive operation, so we scale down the image before applying the blur (which itself uses a reduced blur radius). This gives us a lower-quality blur, but one that’s fast enough for older devices.
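For context, a sketch of how such a category might be used to fake a layer blur: snapshot the view, run the snapshot through the category, and fade the result in. The view names below are hypothetical, not taken from the app’s source.

```objectivec
// Hypothetical usage sketch: snapshot a view, blur it, and fade the result in.
// `contentView` is an illustrative name, not from the app.
UIGraphicsBeginImageContextWithOptions(contentView.bounds.size, YES, 0);
[contentView.layer renderInContext:UIGraphicsGetCurrentContext()]; // screenshot the layer
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

UIImageView *blurView = [[UIImageView alloc] initWithImage:[UIImage darkenedAndBlurredImageForImage:snapshot]];
blurView.frame = contentView.bounds;
blurView.alpha = 0.0f;
[contentView addSubview:blurView];

// Fade the blurred overlay in as the menu opens; reverse the animation to close.
[UIView animateWithDuration:0.25 animations:^{
    blurView.alpha = 1.0f;
}];
```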

Another challenging aspect of the design was the soft light effect applied to hours. In Photoshop, this is easy; in code, it’s much harder. We explored a lot of different techniques before eventually settling on something of a hack – take a screenshot of the background, crop it, then draw it using kCGBlendModeSoftLight.

// Crop the shared background gradient to the hour row's frame
CGImageRef imageRef = CGImageCreateWithImageInRect([rootViewController.gradientImage CGImage], CGRectMake(point.x, self.minY + self.superview.frame.origin.y, size.width, self.maxY - self.minY));
UIImage *img = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
 
// Composite the cropped background and the hour bar's own image (`aImage`)
// with a soft light blend, mimicking Photoshop's layer mode
UIGraphicsBeginImageContext(self.backgroundImage.frame.size);
[img drawAtPoint:CGPointZero blendMode:kCGBlendModeSoftLight alpha:1];
[aImage drawAtPoint:CGPointZero blendMode:kCGBlendModeSoftLight alpha:alpha];
 
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
 
self.backgroundImage.image = image;

This made the hour bars pop out more than just using semitransparent white boxes.

Our application logic was written declaratively, using ReactiveCocoa. This is a relatively new technology and we had lots of questions. Fortunately, the community was very supportive. We made some mistakes, but learned from each one. For example, we were explicitly subscribing to many signals whose values were being explicitly sent. Essentially, we were using ReactiveCocoa to program imperatively instead of truly embracing the declarative nature of functional reactive programming. ReactiveCocoa’s creators helped us and amended their coding guidelines so others could learn from our experience.
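To illustrate the kind of mistake we mean – using hypothetical property names, not the app’s actual code, and ReactiveCocoa 2.x-style macros – manually subscribing and mutating state is imperative, while binding a property to a signal is declarative:

```objectivec
#import <ReactiveCocoa/ReactiveCocoa.h>

// Imperative style – subscribing to a signal and mutating state by hand.
// (`self.titleLabel` and `self.viewModel` are hypothetical names.)
[RACObserve(self.viewModel, eventTitle) subscribeNext:^(NSString *title) {
    self.titleLabel.text = title;
}];

// Declarative style – describe the relationship once and let the framework
// keep the label up to date as the signal sends new values.
RAC(self.titleLabel, text) = RACObserve(self.viewModel, eventTitle);
```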

Our collection view was shaping up. One of our biggest departures from traditional collection views was that ours didn’t scroll. We abandoned the scrolling paradigm completely in favour of a scrubbing one.

The problem was that collection views want to scroll, so we implemented our own collection view layout. This is the code needed to stop a UICollectionView from ever scrolling:

// In our custom UICollectionViewLayout subclass: report a content size
// that exactly matches the collection view's bounds, so there is nothing
// to scroll to.
-(CGSize)collectionViewContentSize {
    return self.collectionView.bounds.size;
}

The scrubbing animation was achieved using some basic math to calculate new heights for each cell depending on where the user’s finger was. UICollectionView took care of animating itself when the user touched the interface and lifted their finger.
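The “basic math” might look something like the following sketch – the names and the falloff curve are our assumptions, not the shipped implementation. Each of the 24 rows gets a weight based on its distance from the finger, and the weights are normalized so the rows always fill the view exactly:

```objectivec
// Illustrative sketch of scrubbing math: rows near the finger expand,
// the rest shrink so the 24 rows always sum to the view's height.
// The names and the linear falloff here are assumptions.
static CGFloat const kExpansionFactor = 3.0f; // how heavily the focused row is weighted
static NSUInteger const kHoursPerDay = 24;

NSArray *HourHeightsForTouch(CGFloat touchY, CGFloat viewHeight) {
    CGFloat baseHeight = viewHeight / kHoursPerDay;
    CGFloat weights[kHoursPerDay];
    CGFloat totalWeight = 0.0f;

    for (NSUInteger hour = 0; hour < kHoursPerDay; hour++) {
        CGFloat rowCenter = (hour + 0.5f) * baseHeight;
        CGFloat distance = fabs(rowCenter - touchY) / baseHeight;
        // Linear falloff: the focused row weighs kExpansionFactor,
        // rows a couple of slots away fall back to a weight of 1.
        CGFloat weight = MAX(kExpansionFactor - distance, 1.0f);
        weights[hour] = weight;
        totalWeight += weight;
    }

    NSMutableArray *heights = [NSMutableArray arrayWithCapacity:kHoursPerDay];
    for (NSUInteger hour = 0; hour < kHoursPerDay; hour++) {
        // Normalize so the heights still fill the view exactly.
        [heights addObject:@(viewHeight * weights[hour] / totalWeight)];
    }
    return heights;
}
```

In a custom layout, heights like these would feed each cell’s layout attributes, and UICollectionView would animate between the resulting layouts.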

One final aspect of the interface that we changed was the layout of the information within the collection view. Details were originally on the right side of the screen, where they were covered by users’ hands.

We switched the details to the left side of the screen so most users – those holding the phone in their right hand – wouldn’t be affected. We learned that the scrubbing metaphor only makes sense when your data isn’t under the area being scrubbed.

Open Source

All of the code for this project has been open sourced. Go check it out. There’s now a great open source example illustrating how to use UICollectionView for more than grids of photos. We iterated through many trials of interfaces until we found one that worked, completely re-imagining what a calendar day can look like.

ReactiveCocoa helped us in this iteration process by keeping our application logic modular. Application logic flows from one module to the next, to the next.

Changing one module of application logic would not affect others, making rapid prototyping incredibly fast.

We’re proud of the delightful interaction we achieved. We connected interesting calendar data to a UICollectionView in a way that’s never been done before – we’re happy to share it with the world.

After learning a lot about ReactiveCocoa, we shared what we learned in two blog posts: Functional Reactive Programming with ReactiveCocoa and Getting Started with ReactiveCocoa. These have had over ten thousand page views in the weeks since we posted them.

The Upcoming app represents calendar data in a way that’s never been done before. We explored the cutting edge of iOS development technologies and we’re taking those lessons back to our client work.