Traditionally, a large focus of IDF has been future computing trends. For the longest time the name of the game was convergence, and the shift to mobility headlined many IDF keynotes in years past. Today, in Intel's final keynote of the show, Justin Rattner began by talking about context-aware computing.

The first demo was the most compelling - an application called the personal vacation assistant.

The personal vacation assistant starts with information your device already knows about you: your calendar, your current GPS location, and personal preferences for things like food.

The app then takes this data and can provide you with suggestions of things to do while you're on vacation. Based on your current GPS coordinates and personal preferences, the app can automatically suggest nearby restaurants for you. It can also make recommendations on touristy things to do in the area, again based on your location.
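
Intel didn't show how the recommendation logic works under the hood, but the basic recipe is easy to sketch. Here's a minimal, hypothetical Python example of filtering restaurants by distance from your GPS fix and by stored food preferences; all names and data structures here are my own illustration, not Intel's code:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Restaurant:
    name: str
    lat: float
    lon: float
    cuisine: str

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth radius ~6371 km

def suggest_restaurants(gps, preferences, restaurants, radius_km=2.0):
    """Keep restaurants within radius_km of the user that match a
    preferred cuisine, sorted nearest first."""
    lat, lon = gps
    matches = [r for r in restaurants
               if distance_km(lat, lon, r.lat, r.lon) <= radius_km
               and r.cuisine in preferences["cuisines"]]
    return sorted(matches, key=lambda r: distance_km(lat, lon, r.lat, r.lon))
```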

The personal vacation assistant also has an autoblog feature. It can automatically blog your location as you move around, post photos as you take them, and even provide links to all of the places you've visited. Obviously there are tons of privacy issues here, but the concept is cool.

The idea is that within the next few years, all devices will support this sort of context-aware computing. While the personal vacation assistant shows location-aware computing, there are other vectors to innovate upon. Your device (laptop, smartphone, tablet) can look at other factors, like who you're with or what you're doing. Your smartphone could detect a nearby contact, look at both of your personal preferences, and make dining/activity recommendations based on that information.

Modern smartphones already have hardware to detect movement (e.g. an accelerometer); the next step is using that data to figure out what the user is doing. This could apply to things like detecting when you're running, figuring out that you may be hungry afterwards, and having your phone supply you with food recommendations.
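
As a toy illustration of the idea (not anything Intel showed), a first-pass activity guess can fall out of nothing more than the variance of the accelerometer signal; the thresholds below are made up for the example:

```python
from statistics import stdev

def classify_activity(magnitudes):
    """Very rough activity guess from a window of accelerometer
    magnitude samples (in g). At rest the magnitude sits near 1g with
    little variation; walking and running add progressively larger
    oscillations. Real systems use trained classifiers on richer
    features, but the principle is the same."""
    variation = stdev(magnitudes)
    if variation < 0.05:
        return "idle"
    elif variation < 0.4:
        return "walking"
    return "running"
```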

Motion sensors in a smartphone could also detect things like whether or not the user has fallen and automatically contact people (or emergency services) in the address book.
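
A common heuristic for this (again, a sketch of the general technique rather than anything Intel demoed) is to look for a brief free-fall signature followed by an impact spike:

```python
def detect_fall(magnitudes, free_fall_g=0.4, impact_g=2.5, window=25):
    """Two-phase fall heuristic over accelerometer magnitudes (in g):
    a near-free-fall reading (well below 1g) followed within `window`
    samples by a hard impact spike. Thresholds are illustrative."""
    for i, m in enumerate(magnitudes):
        if m < free_fall_g and any(x > impact_g for x in magnitudes[i + 1:i + 1 + window]):
            return True
    return False
```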

Context-aware computing can also apply to dumb devices like remote controls. In his next demo, Justin Rattner showed a TV remote control that made show/channel recommendations based on who was holding it.

The remote control determined who the user was by the manner in which the person held it.
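
Intel didn't disclose the method, but one plausible approach is to reduce the sensor readings (grip pressure, tilt, micro-motion) to a feature vector and match it against stored per-user profiles. The sketch below uses simple nearest-neighbor matching and entirely hypothetical names:

```python
def identify_holder(features, profiles):
    """Return the stored user whose profile vector is closest
    (Euclidean distance) to the feature vector just measured."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(profiles, key=lambda user: dist(features, profiles[user]))

# e.g. identify_holder([0.7, 0.2, 0.9],
#                      {"dad": [0.8, 0.1, 0.9], "kid": [0.3, 0.6, 0.2]})
```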

This all depends on having a ton of sensors available and combining them with compute power. Intel was also careful to point out that context-aware computing must be married to security policies to keep this very personal information from automatically being shared with just anyone.

Personally I want my smartphone, notebook and desktop to work for me a little more autonomously. Intel talked about the idea that your phone could make a recommendation for what you should eat at a restaurant based on your level of activity for the day and the restaurant's online menu (determined by your GPS location and an automatic web search). Obviously this depends on nutritional information being shared in a uniform format by the restaurant itself, but the upside is pretty neat.

Hard sensing (location, physical attributes) is important, but combining it with soft sensing (currently running applications, calendar entries, etc.) is key to making context-aware computing work as well as possible.
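
One way to picture the fusion is as a single context record that mixes both kinds of signals, which downstream features then query. This is my own sketch, not Intel's design:

```python
from dataclasses import dataclass

@dataclass
class Context:
    gps: tuple                  # hard: (lat, lon)
    activity: str               # hard: e.g. "idle"/"walking"/"running"
    minutes_to_next_event: int  # soft: from the calendar
    foreground_app: str         # soft: what the user is doing on-screen

def should_suggest_lunch(ctx: Context) -> bool:
    """Fuse hard and soft signals: only interrupt with a restaurant
    suggestion if the user has stopped moving and the calendar shows
    a real gap. The 45-minute threshold is arbitrary."""
    return ctx.activity == "idle" and ctx.minutes_to_next_event >= 45
```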

Context-aware computing feels a lot like the promises of CE/PC convergence from several years ago. I do believe it will happen, and our devices will become much more aware of what we're doing, but I suspect it'll be several years before it's as commonplace as convergence is today.

Justin Rattner ended the keynote with a look at the next 5 to 10 years of computing. Intel is working with CMU researchers on sensing brain waves. Feeding the results of those types of sensors into computing devices could enable a completely new level of context-aware computing. That's the holy grail, after all: a smartphone, PC, or other computing device that's aware not only of your external context but of what you're thinking.

Comments

  • Quantumboredom - Wednesday, September 15, 2010 - link

    "That's the holy grail after all, if your smartphone, PC, or other computing device is not only aware of your external context but what you're thinking."

    That's the worst possible situation / my worst nightmare, not a holy grail. With good interfaces it is not at all necessary for my device to know anything about me, where I am, or what I'm doing. As long as I know these things, I'll be quite capable of doing everything that needs doing without this "context aware" computing crap.

    And does anyone think this can/will be done in a 100% secure manner? Almost everyone walking around with a device which tracks their location, their physical condition, and eventually even rudimentary brain activity, with all this information in the cloud. Who actually thinks that sounds like a good idea? I would be uncomfortable with anyone having that kind of access to myself and others, let alone a large company or a government.
  • anactoraaron - Wednesday, September 15, 2010 - link

    Not to mention that AI machines will have the leg-up on us once they become "aware"... They will know what we like to do, eat, who/what we are attracted to... Once they are aware they will easily be able to control us...
  • bji - Wednesday, September 15, 2010 - link

    OK the headline graphic for this article is one of the funniest things I have seen in a while. I can't help but envision a completely dazed and confused individual asking those questions one after the other while punching buttons on their cell phone looking for the answers ...
  • nermie - Wednesday, September 15, 2010 - link

    This way if you get hit in the head and suffer amnesia, you can resync your brain with the cloud using Intel's new phone.
  • Perisphetic - Wednesday, September 15, 2010 - link

    Basically it's a device for really old people with senile dementia, aka Alzheimer's (as shown in a later picture). Also, I didn't know they resurrected Lenin from the mausoleum to do presentations on remote controls... oh wait, that's just Louis CK the stand-up comedian doing a gig...
  • AnnonymousCoward - Wednesday, September 15, 2010 - link

    Hilarious.
  • bji - Wednesday, September 15, 2010 - link

    I just got to the "Sensing Human Gait" slide. They totally should have combined that with the first slide. Those blue alzheimer's bubbles from the first slide should be floating around that old guy's head ...
  • JTKTR - Wednesday, September 15, 2010 - link

    I've fallen and I can't get up.
  • Spazweasel - Wednesday, September 15, 2010 - link

    Replace the word "suggestions" with "advertisements" and "sensors" with "passive monitoring devices we can turn on remotely" and you've got it.

    This is just a vector for enforcement. If a technology can be used to impose control, it will be used for that by a government whether the law permits it or not. If a technology can be used to gather information about someone with a wallet, it will be used for that by the Googles, AdSenses and Facebooks of the world whether the person consents or not. The only way to prevent that is to choose which technologies you allow into your life.
  • prophet001 - Monday, September 27, 2010 - link

    hmm
    you stole my post lol

    scary stuff.
    personally, i don't want the machine (read liberal activist technophiles) to know my every move

    and no.. it's not the bizhub that's going to take over the world.. it's the people behind the bizhub
