BlendConf 2013: Immediate Takeaways

I had a blast at BlendConf this past weekend, and am still marinating on all the cool information that I gleaned from it. I want to share some things that I am already putting into action.

But first, what is BlendConf?! I’m glad you asked. The 3-day conference is primarily aimed at web designers and developers, and has 3 main tracks: User Experience, Design, and Development. The first day (Thursday) is workshops. The second day (Friday) is the traditional track talks. The third day (Saturday) is the blend day, where experts in each track speak in a track that is not their specialty. For instance, a developer talking about open source development in the design track. It was a really great opportunity to get exposure to the various aspects of software (web) development, especially considering this was the inaugural year.

Practical Application from a Talk

The first takeaway comes from Amanda Costello’s talk on working with specialists: people with PhDs and other highly specialized individuals. This struck me as particularly interesting, as I currently work with many healthcare professionals who are highly educated and specialized (PhDs and MDs). Amanda had a lot of great tips and stories about her experiences working with people in higher education.

At one point, she talked about giving her stakeholders (the PhDs) homework: a list of 4 really simple questions to help her get the information she needs to do her job. Keep in mind that she’s a content strategist, so her main goal is to extract the specialized content from her stakeholders and get that content onto a website.

Here are the 4 questions:

  1. Who is it for?
  2. Who is it not for?
  3. Are there other sites like this? What makes it different?
  4. What is its name?

When I came into work on Monday after the conference, one of the first e-mails I received was a request for the questions we had around some app requirements, where the stakeholders are PhDs and MDs. The app is a mobile app for a healthcare provider, and we recently discovered the requirements and scope had changed since we last talked about the app (about 2 months ago). When I started to get frustrated about what to ask (I don’t know what I don’t know), I thought, “What would Amanda do?” I immediately pulled out my conference notes and pulled up the 4 questions, all of which had some direct relevance. I was able to put together a list of “comprehensive questions” (see above) so my stakeholders would be more prepared for the face-to-face conversation we need to have.

Thank you, Amanda!

Practical Abstraction From One Point of a Talk

The second major lesson that I am already benefiting from is being more intentional about which activities take up my time. Cameron Moll, founder of Authentic Jobs, spoke about Authenticity in Creativity. He had several powerful stories reinforcing his point about authenticity, but something stuck out beyond the obvious topic.

One of his points was to be skeptical of what technology is and what it is not. He applauded the fact that devices were banned/discouraged in the conference sessions, and talked at length about changing his technology usage habits to ensure his own personal authenticity (e.g. keeping his phone in his pocket at dinner). This made me really consider what I hold important and what takes my time. What activities steal my time from other more useful activities?

Afterwards, I did 2 things as a follow-up for myself. The first was to make a list of all the commitments I’ve made and activities I participate in (or want to). I prioritized & refined the list, as I’m a chronic over-committer who needs to clear her plate regularly. The second thing has been much more far-reaching. Every time I go to a website or open an app, I try to think consciously about what I’m doing. Is this really how I want to spend this time? Are there other things that are more important to me? Then I think of my list.

Since that session, I have played very little Candy Crush Saga and surfed Facebook a lot less.

Practical Sense of Greater Purpose

The final thing is something I have been pondering since the keynote by Carl Smith: there really is a greater purpose in programming than simply meeting requirements or getting a paycheck. Carl talked in the keynote about a lot of things, in particular leaving a high-paying job to go do something he wanted to do. At the end of his talk, he quoted Invictus, the poem Nelson Mandela recited to fellow prisoners on Robben Island, and I came to a full realization. I am the master of my fate: I am the captain of my soul.

This idea resonated for the rest of the conference. In every session, I heard a consistent theme: we are building amazing things and changing the world one little bit at a time. Literally! One 0 and 1 at a time. I came away from the conference with a renewed purpose: not to get mired in politics or the muck of frustration. I have renewed determination to find my purpose and change the world one bit at a time.

And I was reminded of Bill Nye’s WWDC talk where he kept saying the phrase, “…we could, dare I say it, change the world!” That talk isn’t posted publicly, but this video will give you a sense of what I’m talking about.

I encourage you to remember your purpose, whatever it is, is greater than the politics you play. You could, in fact, change the world!

Testing a Location-Aware App

In order to support this post, I have published a demonstration project on GitHub. Feel free to clone it and follow along with this post.

I recently posted about how to implement location tracking via iOS Location Services, and felt a follow-up would be useful. It’s one thing to make a location-aware app, and it’s quite another to see it work in the wild. Testing is very important here. But you don’t have time to drive around, you say? You don’t have the money to fly across the country to see how your app behaves in other regions? Fortunately, Apple has provided some tools and mechanisms to test location-aware apps from within the comforts of your own development environment.

With the release of iOS 5, location simulation was added to the development tools. As a result, there are a few ways to simulate locations.

  • Xcode: GPX files, schemes
  • Simulator (iOS 5+): manually set the location
  • UI Automation Instrument: load up various lat/long points via a script to simulate movement

Xcode

GPX (GPS Exchange Format) files are XML files that conform to the GPX schema, which allows GPS data to be exchanged between systems. GPS systems generate and consume this format, and so does Xcode! Creating a location is fairly simple if you have the latitude and longitude of a point. You can also create routes and a whole host of location files that can be used to simulate locations. The following is a simple GPX file that targets a location in Lincoln, Nebraska, USA.

<?xml version="1.0"?>
<gpx version="1.1" creator="Xcode">
    <wpt lat="40.828359" lon="-96.699257">
        <name>Lincoln, NE</name>
    </wpt>
</gpx>
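
As I understand it, Xcode will also simulate movement if a GPX file contains multiple waypoints, which is handy for testing routes. A sketch of what that looks like (the second coordinate pair is illustrative):

<?xml version="1.0"?>
<gpx version="1.1" creator="Xcode">
    <wpt lat="40.828359" lon="-96.699257">
        <name>Start</name>
    </wpt>
    <wpt lat="40.821019" lon="-96.689563">
        <name>End</name>
    </wpt>
</gpx>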

Once you have added your GPX files to your Xcode project, there are 2 ways to utilize them. The first is by setting the default location in a scheme. If you have several locations you need to test regularly, you can create a scheme for each one to make it quick to test your app in each location. When you run your app under the custom scheme, Xcode will automatically simulate the app running in the location according to the scheme configuration. To do this:

  1. From the scheme menu, click New Scheme…
  2. Enter a name, and click OK. I used my project name + location. For example, CSLocationTestKit-Lincoln, NE.
  3. From the scheme menu, click Edit Scheme…
  4. In the Options tab, check Allow Location Simulation in the Core Location section.
  5. Select the GPX file for the Default Location. If you added a GPX file to your project, it should be displayed in the list for you to select. Alternatively, you can add an existing GPX file from this menu.

But what if I want to change the location in the middle of simulator testing? You can change the location in Xcode during runtime as well. After your app starts up, bring up Xcode and ensure the Debug pane is showing (the bottom view). Select the blue arrow to get a list of locations, including those specified in GPX files in your project.

Simulator

To change the location in the simulator directly, click the Debug menu, and select Location. This can be done at runtime while debugging, or while navigating through the simulator independent of any Xcode project.

The simulator options are vastly simpler and thus more limited. You can select Custom Location… to enter a set of coordinates, but this is less robust than using a GPX file. However, if you need to do some ad hoc testing of coordinates, this method is sufficient.

UI Automation Instrument

The final and arguably most powerful method of testing a location-aware app is to use the UI Automation instrument. It is also the trickiest, because Apple’s documentation of Instruments isn’t very explicit about how to use it to test movement between locations. Once you get started, though, it gets much simpler.

In my code example, I have a basic view that shows the current location coordinates. Once you click on the map button, the map view is displayed along with a button to turn region monitoring on (“Watch for the country club!”). I want to automate testing on this screen to verify that my region monitoring logic works properly without having to go anywhere. My approach, for demonstration purposes, is to use a list of GPS coordinates to simulate a driving route from my current location to the Lincoln Country Club. This route can be extracted from a GPX file and injected into a script to simulate movement. You can also simulate this movement by making the GPX file with the route your default location in the scheme, but I wanted to demonstrate changing location in the UI Automation instrument (via JavaScript). Note: I created this route in Google Maps, exported the KML, and then converted KML->GPX via GPSBabel.
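
If you’re curious what the heart of such a script looks like, here is a minimal sketch using setLocation from the UI Automation JavaScript API (the coordinates are illustrative, not my actual route):

var target = UIATarget.localTarget();

//waypoints pulled from the GPX route (illustrative values)
var route = [
    {latitude: 40.828359, longitude: -96.699257},
    {latitude: 40.821019, longitude: -96.689563}
];

//step through the route, giving the app time to react to each location update
for (var i = 0; i < route.length; i++) {
    target.setLocation(route[i]);
    target.delay(2);
}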

To use the UI Automation instrument:

  1. Click and hold the Run button in Xcode, and select Profile (or click Product > Profile from the menu at the top).
  2. Instruments will open. Select Automation and click Profile.
  3. Stop the recording that is automatically triggered by Instruments.
  4. In the Scripts section, click Add to create a new script. You can import existing scripts as well as call up scripts you’ve recently used/exported. I have included a sample script to test driving a route to a country club in Lincoln, NE. This script works best if you use the Lincoln, NE GPX file as the starting location.
  5. In Xcode, click the Profile button again to trigger a profiling restart using the script you just imported/created. Don’t forget you have to stop the recording once your script has run (this is documented).

Now you have automated the testing of your driving route! At this point, you can add other instruments to your session in order to track allocations, leaks, etc. You can also run any variety of scripts to test various movement scenarios. Automating this kind of testing can make regression testing much more efficient. And because the test scripts are written in JavaScript (and are fairly primitive), you can enlist the help of JavaScript programmers who might not know Objective-C very well (or at all). The UI Automation JavaScript Reference is very helpful for creating these scripts. The only major caveat I’ve uncovered is that the Automation instrument reads values from the accessibility fields. For instance, if you want to check a value on a label, you need to make sure the accessibilityValue property is set.
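
For example, something like this sketch (coordinatesLabel is a hypothetical outlet, not necessarily a name from the demo project):

//UI Automation only sees what accessibility exposes, so mirror the
//displayed text into accessibilityValue for the script to verify
self.coordinatesLabel.accessibilityValue = self.coordinatesLabel.text;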

Supplemental Recommended Viewing/Reading

  • Testing iPhone Location App with Automation by Plain Old Stan
  • Location Awareness Programming Guide from Apple
  • Session 500 at WWDC 2011: What’s New in Core Location – a good basic intro to location services, particularly the new features that came with iOS 5
  • Session 518 at WWDC 2011: Testing Your Location-Aware Application – this is a must watch! It has a lot of good info on how to set up your environment to test different locations and even moving from location to location.
  • Session 303 at WWDC 2012: Staying on Track with Location Services

Scroll position, UITableViews, and You

I came across a scenario on an iOS app today that didn’t have a clear answer in a single place. I basically had to cobble together a solution for my problem, and wanted to document it here.

The Problem

I have a UITableView that I’ve developed in order to mimic a form. My “form” has a single UITextField in it, but this should work with any number of text fields in a UITableView. The text field is at the very bottom of my form, and I need to set the scroll position of the table when the keyboard is shown so the field remains in the user’s view.

The Solution: Pseudocode

  • Make the controller conform to the UITextFieldDelegate protocol
  • Implement the textFieldDidBeginEditing: method of the UITextFieldDelegate protocol to add the UITapGestureRecognizer
  • Create methods to respond to the keyboard display/hide notifications that will resize the scroll content and set the scroll position
  • Add notification observers to listen for keyboard display/hide notifications
  • Create a UITapGestureRecognizer that will be used to dismiss the keyboard when the user taps on the UITableView

The Solution: Actual Code

UITextFieldDelegate Protocol

Set your class to conform to the UITextFieldDelegate protocol, and implement the textFieldDidBeginEditing:(UITextField *)textField method. This is where you’re going to add your gesture recognizer whenever the user taps in the field, so the view knows to change scroll position.
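
A minimal sketch of what that might look like, using the _activeField and _tap instance variables discussed below:

-(void)textFieldDidBeginEditing:(UITextField *)textField{
    //remember which field is active so keyboardWasShown: can scroll its cell into view
    _activeField = textField;
    //start intercepting taps on the table so they can dismiss the keyboard
    [self.myTableView addGestureRecognizer:_tap];
}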

Keyboard Notifications

Create methods to respond to the display and hiding of the keyboard. The following code was somewhat ripped from Apple’s documentation on managing scroll position for keyboards. The trick I found was to use the UITableView’s scrollToRowAtIndexPath:atScrollPosition:animated: method to get the scroll position I really wanted.

-(void)keyboardWasShown:(NSNotification *)theNotification{
    //adjust the scroll position so the zip code field is in view when the keyboard shows up
    
    NSDictionary *info = [theNotification userInfo];
    CGSize keyboardSize = [[info objectForKey:UIKeyboardFrameBeginUserInfoKey] CGRectValue].size;

    //set the insets to account for the keyboard height
    UIEdgeInsets contentInsets = UIEdgeInsetsMake(0, 0, keyboardSize.height, 0);
    self.myTableView.contentInset = contentInsets;
    self.myTableView.scrollIndicatorInsets = contentInsets;
    
    //the text field sits inside the contentView of the cell, so the cell itself is two superviews up
    UITableViewCell *cell = (UITableViewCell *)[[_activeField superview] superview];
    [self.myTableView scrollToRowAtIndexPath:[self.myTableView indexPathForCell:cell] atScrollPosition:UITableViewScrollPositionTop animated:YES];
}

-(void)keyboardWillBeHidden:(NSNotification *)theNotification{
    //reset the content insets
    UIEdgeInsets contentInsets = UIEdgeInsetsZero;
    self.myTableView.contentInset = contentInsets;
    self.myTableView.scrollIndicatorInsets = contentInsets;
}

Add observers to the notification center that listen for the keyboard display and hide events so you can adjust the scroll position. I did this in the init method of my class, but there are lots of alternatives for where to place this code.

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(keyboardWasShown:) name:UIKeyboardDidShowNotification object:nil];

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(keyboardWillBeHidden:) name:UIKeyboardWillHideNotification object:nil];
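
Also make sure the controller removes itself as an observer when it goes away; otherwise the notification center may message a deallocated object. For example:

-(void)dealloc{
    //stop listening for keyboard notifications
    //(assuming ARC; under manual retain/release you would also call [super dealloc])
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}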

Create the UITapGestureRecognizer, and set it on the UITableView of your class. I created an instance variable (_tap) so I could add/remove it at various places in my code.

_tap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tableTouched:)];
[self.myTableView addGestureRecognizer:_tap];

And here’s the tableTouched: method. As I said before, I have only 1 UITextField in my table, and I created an ivar (instance variable) so I could reference it quickly within the code. I removed the UITapGestureRecognizer at this point because I don’t want the gesture to interfere with normal table gestures.

-(void)tableTouched:(id)sender{
    [_activeField resignFirstResponder];
    [self.myTableView removeGestureRecognizer:_tap];
}
