An Update on the #Aviate / #Tasker Project

A few weeks ago I tried Aviate and although I liked the concept, I decided it wasn’t for me. I wanted more control than it offered over organizing my life into a set of contexts. And thanks to the experience of trying Aviate, I realized that I could do something similar using Tasker.

The basic idea was to create a different home screen in my launcher (I use Nova Launcher) for each context I wanted and then use Tasker’s “Go Home” action to automatically present the appropriate one when I unlocked my phone or tablet (#Gnex or #Nexus7).
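Sketched out in Tasker’s own profile/task notation, the core of that idea looks something like this (the page number is an assumption — it depends on how the home screens are ordered in Nova, and %HomePage is a hypothetical user variable that the context logic keeps up to date):

```
Profile: Context On Unlock
    Event: Display Unlocked
Enter Task:
    A1: Go Home [ Page: %HomePage ]
        ; Tasker's built-in "Go Home" action jumps the launcher
        ; to the given home-screen page
```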

Aviate uses time of day and location to trigger different contexts: Morning, Night, Home, Work, and Going Somewhere. But I don’t use either my phone or tablet in a way that meshes with that structure. And since the whole point of Aviate and context awareness is to get structure out of the way, it kind of defeats the purpose.

I started with my phone because I use it more contextually than I do the tablet, or so I thought. And keeping it simple I just tried implementing “Home” and “Not Home” which are really the only two contexts I use my phone in. You could use location to toggle between these, just like Aviate does. Or with Tasker you could also toggle them based on whether a certain wifi access point is in range. In my case, I already had an NFC tag by the kitchen door that I used with Tasker to toggle radios on and off when I come and go, so I just added this new context awareness to that mechanism. This approach has the added benefit of saving a little battery because it doesn’t rely on a location check, although you do have to keep your NFC radio on.
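One way to wire the toggle up — just a sketch, assuming the kitchen-door NFC tag fires a Tasker task (e.g. via an NFC plugin) and that the “Home” and “Not Home” screens sit at known page numbers in Nova; %ATHOME and %NEW are hypothetical user variables:

```
Task: Toggle Home Context        ; run when the NFC tag is scanned
    A1: Variable Set [ Name: %NEW, To: 0 ]   If [ %ATHOME ~ 1 ]
    A2: Variable Set [ Name: %NEW, To: 1 ]   If [ %ATHOME !~ 1 ]
    A3: Variable Set [ Name: %ATHOME, To: %NEW ]
    A4: Go Home [ Page: 1 ]   If [ %ATHOME ~ 1 ]    ; "Home" screen
    A5: Go Home [ Page: 2 ]   If [ %ATHOME !~ 1 ]   ; "Not Home" screen
    ; ...plus the existing radio on/off actions
```

The two-step toggle through %NEW avoids the classic bug where flipping %ATHOME in the first action makes the second action’s condition true as well.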

With the phone working so well, I started thinking about context on my tablet. “Home” and “Not Home” didn’t make sense there, but time of day did. Email and Evernote in the morning, when daily status reports on my Roku channels are waiting for me; news in the afternoon; social in the evening. And I could control the exact times at which each context became active. That’s all pretty simple to make happen with Tasker and I can even set a different background image for each context.
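The time-based contexts are just Tasker profiles with a Time condition — a sketch, with the page numbers, time windows, and image names all placeholders to be adjusted:

```
Profile: Morning Context
    Time: 06:00 - 12:00
Enter Task:
    A1: Go Home [ Page: 1 ]                    ; email/Evernote screen
    A2: Set Wallpaper [ Image: morning.jpg ]   ; hypothetical image

Profile: Afternoon Context
    Time: 12:00 - 18:00
Enter Task:
    A1: Go Home [ Page: 2 ]                    ; news screen
    A2: Set Wallpaper [ Image: afternoon.jpg ]

Profile: Evening Context
    Time: 18:00 - 23:00
Enter Task:
    A1: Go Home [ Page: 3 ]                    ; social screen
    A2: Set Wallpaper [ Image: evening.jpg ]
```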

Once I had that working, I started thinking about things other than time and location that might be used to trigger a context. This led me to realize that I use my tablet far more contextually than I’d thought, and the contextual awareness I’ve ended up building for the tablet is more complex than the phone’s.

Light level… If I take my tablet outside, it almost always means I’m going to read. So when a very bright environment is detected, show me a home screen with my reading apps and widgets.
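Tasker exposes the ambient light sensor as a state condition, so the “outside means reading” rule can be sketched like this (the lux threshold is a guess and needs tuning per device, and the page number is a placeholder):

```
Profile: Reading Outside
    State: Light Level [ From: 5000, To: Max ]   ; very bright = outdoors
Enter Task:
    A1: Go Home [ Page: 4 ]                      ; reading screen
```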

Screen orientation. The only time I ever turn my tablet to landscape is when I’m going to watch video, so let’s detect that automatically and present a home screen with Netflix and Beyond Pod and my other video apps.
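A sketch of the landscape trigger using Tasker’s Orientation state (one profile per side; the page number is again a placeholder):

```
Profile: Video Mode (Left)
    State: Orientation [ Is: Left Side ]    ; device held in landscape
Enter Task:
    A1: Go Home [ Page: 5 ]                 ; video screen

Profile: Video Mode (Right)
    State: Orientation [ Is: Right Side ]
Enter Task:
    A1: Go Home [ Page: 5 ]
```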

Other than NFC tags, I haven’t found an automatic way to detect different rooms within the house. In the living room, show me entertainment stuff; in the kitchen, show me recipes; in the bathroom, show me reading material; etc.

What other events/environments might be used to trigger a context?

via [G+]


4 Responses to “An Update on the #Aviate / #Tasker Project”

  1. Jose Daidone says:

    Hello, very nice blog.

    Could you please explain how to integrate Tasker + Aviate?

    • nowhereman says:

      I didn’t integrate Tasker with Aviate. I used Tasker to build a contextual experience in Android that works better for me than Aviate did. Aviate was just inspiration.

  2. mike says:

    I stumbled across your post and tried to replicate this as I was interested in context-aware home screens, but it seems my knowledge of Tasker just isn’t deep enough. Can you walk me through your setup? I also tried to post a comment on Google+, but for whatever reason, I couldn’t submit my comment.