Editor’s note: This is a featured post in our ongoing series “30 Days With” which outlines the use of a productivity tool, service, or product that we have used for the past 30 days. We want to provide our readers with an in depth view of tools and products that they are interested in and provide them our thoughts as well as ways to use these products faster and better. Enjoy.
This is the tale of a 2-hour-a-day commuter, 8-hour-a-day Fortune 500 company employee, and part-time Lifehack editor living in the USA, using Siri over the last 30 days.
It’s hard to believe that the iPhone 4S has already been out for a month. In that short time we’ve seen iPhone 4S battery issues come and go, Siri outages, and data usage problems (that may not really be problems). But, just like every year, the iPhone is proving to be one of the best-selling smartphones of all time. It isn’t really surprising, is it?
It’s funny just how important Siri has become in my life in the last 30 days. As a former Android user, I was already familiar with the idea of interacting with my phone by voice through Google’s Voice Actions. Google Voice Actions worked well, but for some reason my use of them never stuck.
But Siri isn’t just about commanding your phone to do things. It’s about interacting with your phone in a way that Google Voice Actions (or any other product, for that matter) never brought to the mainstream. Not only does Siri give you a way to interact with your device like never before, “she’s” also a tad snarky and has an attitude of “her” own.
This is the “special sauce” that Apple adds to its products: the touch that makes them more human.
Yes, voice interaction existed on phones before Siri arrived, and yes, Google did a hell of a job making voice work on a smartphone. But Apple is the company that brings “outlying” technologies to the mainstream by making them approachable for humans.
When I saw the demo of Siri during the iPhone 4S announcement I was super excited. Mostly because I am an Apple fanboy, but also because I am a geek and could see myself using voice to interact with my phone to get things done faster and better. But, deep down, I was scared that Siri wouldn’t be as good as it looked. I was afraid that the attractive man running next to the river, effortlessly changing his meeting appointments with his voice through Apple headphones, was fake. I mean, whose Apple headphones stay in their ears while running, anyway?
But my biggest fear was that this “personal assistant” was going to be a digital interface that only worked if I learned the perfect voice syntax to interact with it.
This is delightfully not the case.
I started with the normal things first: sending text messages, scheduling appointments, reading text messages, sending emails, checking the weather, seeing how many calories were in a bagel. Normal queries and actions, to see how well Siri worked.
I would say that 95% of the time, Siri was spot on. It transcribed my text messages and emails, added appointments correctly to my calendar (even repeating appointments), created new reminders that nagged me when I got home, played songs from my music library, and so on. Siri was so good at first that it threw me for a loop when she did mess things up, like appending a dictated line to the wrong note or mishearing the name of the artist I asked her to play.
Because of how well Siri worked “out of the box”, I quickly changed the habits and workflows I had built with my smartphones over the past three years.
Outsourcing with Siri
Here are the things that I now outsource to Siri at least 90% of the time:
- Appointment creation with the calendar
- Quick reminders and time-specific things that I need to remember (I used to put all of this in OmniFocus)
- Calling people
- Checking the weather
And here are the things that I outsource while driving or at home (not in front of co-workers):
- Everything from above
- Sending and reading text messages
- Sending emails
- Making notes
- Sending tasks to my OmniFocus inbox (with the “hack” I detailed in our last Siri post)
- Playing music
- Searching things (as long as I don’t have to dive into Mobile Safari to get it done)
Siri is a joy to use. When we talk about being and staying productive here at Lifehack, we all have this idea of staying in the flow of whatever we are currently working on and concentrating on, which keeps us in a productive state. Siri allows me to do that. Once the following three things happen, Siri will let people stay in this state even more of the time:
- Apple opens the doors to Siri for developers (which may be a good or bad idea depending on how Apple and developers identify their roles)
- “Normal” people accept the idea of people around them interacting with their devices, telling them what to do.
- The Siri service covers more ground and is more selective in the ways that it requires a data connection.
We will then be able to use a “digital assistant” that lets us capture thoughts as they come while we work and use the other apps we need with little resistance. Siri can then become a major part of one’s workflow. This is what Apple has planned for Siri’s future, and once it becomes more “acceptable” to talk to our devices in public, this type of use will be a reality.
You’d think from reading the above that Siri is a bed of roses and that Apple is the king of everything in the world. Well, that’s sort of true. But I did have some issues with Siri, some of which have yet to be resolved.
The biggest complaint I have regarding Siri is that even the simplest of actions (like calling a contact) requires a data call to Apple’s servers. That makes sense for intense queries that need backend processing, like transcription and dictation, or for queries that require a call over the network anyway (a Wolfram Alpha search), but for simple things that are native to the iPhone it seems unnecessary.
The cost of making a data call for every Siri query isn’t really noticeable until the Siri service goes down, which over the past 30 days I have experienced twice. I’m not talking about down for one query and then back up; I’m talking about Siri being down for several hours at a time.
When I wanted to send an SMS to my wife on the way home from work the other day, I got the typical “I’m having trouble connecting to the network” message. Some consumers may assume this means their carrier’s network is down, not that the Siri service is. This is truly frustrating: for all the times I used Google Voice Actions on Android, the only time I couldn’t access that service was when I couldn’t get a network connection from my data provider.
Apple needs to rethink the way that Siri uses (and requires) access to both a data network and the Siri service itself.
Where am I?
Another bad thing about Siri is that it doesn’t fully support different places around the world. Lifehack’s editor, Mike Vardy, can’t use Siri for location-based queries in Canada. Siri just doesn’t know where Canada is right now. This may be fixed sometime in 2012.
I’m not entirely sure how Siri is working in other parts of the world, but when it first launched, location data was seriously lacking. If Apple expects Siri to truly take over, how can it if it doesn’t know where the closest Pizzeria is?
Something else that seems obvious Siri should do is change settings on my iPhone. Things like “Siri, turn off WiFi” or “turn on airplane mode” simply don’t work (probably for the best in airplane mode’s case: since Siri needs a network connection, you couldn’t ask her to turn it back off!).
Another oddity is that Siri’s web searching can be lacking and inconsistent. I like how you can find a locksmith or local escort services easily through Yelp!, but searching for something like “where was the Lord of the Rings filmed” sometimes brings back a Siri-provided result (that is, a result displayed inside Siri itself) while other times it only offers the option to search the web. It seems that Apple is still trying to figure out how to make sure that the results Siri presents are correct and the best available.
This is definitely a natural language processing issue. Apple probably figures that rather than risk presenting the wrong result as the best one, it is safer to suggest that users search the web through Mobile Safari. That way they can make up their own minds about which answer is correct.
When Siri finds things and is certain of what she is presenting, it amazes me. But when I search for something that I think should just work and instead get taken into Mobile Safari, it starts to chip away at the “amazingness” of Siri.
Once again: it’s beta, right?
My 30 days with Siri have been excellent, even with the slight snafus of Siri being down and the general feature issues (both of which will get better over time). Like I said above, I now “outsource” a decent amount of what I do with my phone to Siri. And as the Siri service becomes stronger and more ubiquitous, you’d better believe that I will use it more and more.
I think that Siri is revolutionary, yet it has its flaws. When I can raise my phone to my face and simply say, “remind me to take out the trash before I leave home” and have my phone alert me when I’m leaving my house to take out the trash, it makes me feel like I’m living in a dream world. But when I do the same action and Siri says, “Christopher, I cannot connect to the network”, I’m reminded that there is still work to be done.
Siri is by far the best voice recognition and natural language software that I have ever used. No matter what happens, Siri will continue to get better and smarter, allowing us to be more productive with our iOS devices.
Note: Apple has done such a good job of giving Siri a persona that many times throughout this article I have referred to Siri as ‘her’ or ‘she’. Thanks for making me think my phone is a person, Apple.