Who Wants A Stylus?

The stylus is an overlooked and under-appreciated mode of interaction for computing devices like tablets and desktop computers, with many developers completely dismissing it without a second thought. Because of that, we’re missing out on an entire class of applications that require the precision of a pencil-like input device, a precision neither a mouse nor our fingers can match.

Whenever the stylus as an input device is brought up, the titular quote from Steve Jobs inevitably rears its head. “You have to get ‘em, put ‘em away, you lose ‘em,” he said while introducing the original iPhone at Macworld 2007. But this quote is almost always taken far out of context (and it comes from a famously myopic man: that which he hated, he hated a lot), along with his later line about other devices: “If you see a stylus, they blew it.”

What most people seem to miss, however, is that Steve was talking about a class of devices whose entire interaction required a stylus and couldn’t be operated with just fingers. If every part of the device needed a stylus, then it’d be difficult to use one-handed, and deadweight were you to misplace the stylus. These devices, like the Palm PDAs of yesteryear, were frustrating to use because of that, but that’s no reason to outlaw the input mechanism altogether.

Thus, Steve’s myopia has spread to many iOS developers. Developers almost unanimously assert that the stylus is a bad input device, but again, I believe it’s because those quotes have put an unfair, black-and-white picture in our minds: either we use a stylus or we use our fingers.

“So let’s not use a stylus.”

Let’s imagine for a moment or two a computing device quite a lot like one you might already own. It could be a computing device you use with a mouse or trackpad and keyboard (like a Mac) or it could be a device you use with your fingers (like an iPad). Whatever the case, imagine you use such a device on a regular basis, using only the input devices that came with it, just as you do today. But this computer has one special property: it can magically make any kind of application you can dream of, instantly. This is your Computer of the Imagination.

One day, you find a package addressed to you has arrived on your doorstep. Opening it up, you discover something you recognize, but are generally unfamiliar with. It looks quite a bit like a pencil without a “lead” tip. It’s a stylus. Beside it in the package is a note that simply says “Use it with your computer.”

You grab your Computer of the Imagination and start to think of applications that could only work with your newly arrived stylus. What do they look like? How do they work?

You think of the things you’d normally do with a pencil. Writing is the most obvious one, so you get your Computer of the Imagination to make you an app that lets you write with the stylus. It feels novel at first (“Hey, that’s my handwriting!” right there on the screen), but you soon grow tired of writing with it. “This is much slower and less accurate than using a keyboard,” you think to yourself.

Next, you try making a drawing application. This works much better, you think to yourself, because the stylus provides accuracy you just couldn’t get with your fingers. You may not be the best at drawing straight lines or perfect circles, but thankfully your computer can compensate for that. You hold the stylus in your dominant hand while issuing commands with the other.

Your Computer of the Imagination grows bored and prompts you to think of another application to use with the stylus.

You think. And think. And think…


If you’re drawing a blank, then you’re in good company. I have a hard time thinking of things I can do with a stylus because I’m thinking in terms of what I can do with a pencil. I’ve grown up drawing and writing with pencils, but doing little else. If the computer is digital paper, then I’ve pretty much exhausted what I can do with analog paper. But of course, the computer is so much more than just digital paper. It’s dynamic, it lets us go back and forth in time. It’s infinite in space. It can cover a whole planet’s worth of area and hold a whole library’s worth of information.

But what could this device do if we had a different way to interact with it? I’m not claiming the stylus is new, but to most developers, it’s at least novel. What kind of doors could a stylus open up?

“Nobody wants to use a stylus.”

I thought it’d be a good idea to ask some friends of mine for their thoughts on the stylus as an input device: both how they use one today, and what they think it might be capable of in the future. (Note: these interviews were done in July 2013; I’m just slow at getting things published.)

Question: How do you find support in apps for the various styluses you’ve tried?

Joe Cieplinski: I’ve mainly used it in Paper, iDraw, and Procreate, all of which seem to have excellent support for it. At least as good as they can, given that the iPad doesn’t have touch sensitivity. In other apps that aren’t necessarily for art I haven’t tried to use the stylus as much, so can’t say for sure. Never really occurred to me to tap around buttons and such with my stylus as opposed to my finger.

Ryan Nystrom: I use a Cosmonaut stylus with my iPad for drawing in Paper. The Cosmonaut is the only stylus I use, and Paper is the only app I use it in (also the only drawing app I use). I do a lot of prototyping and sketching in it on the go. I have somewhat of an art background (used to draw and paint a lot) so I really like having a stylus over using my fingers.

Dan Leatherman: Support is pretty great for the Cosmonaut, and it’s made to be pretty accurate. I find that different tools (markers, paintbrushes, etc.) adapt pretty well.

Dan Weeks: For non-pressure-sensitive styluses, support is really any app, and I’ve been known to just use the stylus because I have it in my hand. Not for typing, but I’ve played games and used other apps besides drawing with a stylus. Those all seem to work really well because of the uniformity of the nib compared to a finger.

Question: Do you feel like there is untapped (pardon the pun) potential for a stylus as an input device on iOS? It seems like most people dismiss the stylus, but it seems to me like a tool more precise than a finger could allow for things a finger just isn’t capable of. Are there new doors you see a stylus opening up?

Joe Cieplinski: I was a Palm user for a very long time. I had various Handspring Visors and the first Treo phones as well. I remember using the stylus quite a bit in all that time. I never lost a stylus, but I did find having to use two hands instead of one for the main user interface cumbersome.

The advantage of using a stylus with a Palm device was that the stylus was always easy to tuck back into the device. One of the downsides to using a stylus with an iPad is that there’s no easy place to store it. Especially for a fat marker style stylus like the Cosmonaut.

While it’s easy to dismiss the stylus, thanks to Steve Jobs’ famous “If you see a stylus, they blew it” quote, I think there are probably certain applications that could benefit more from using a more precise pointing device. I wouldn’t ever want a stylus to be required to use the general OS, but for a particular app that had good reason for small, precise controls, it would be an interesting opportunity. Business-wise, there’s also potential there to partner up between hardware and software manufacturers to cross promote. Or to get into the hardware market yourself. I know Paper is looking into making their own hardware, and Adobe has shown off a couple of interesting devices recently.

Ryan Nystrom: I do, and not just with styli (is that a word?). I think Adobe is on to something here with the Mighty.

I think there are 2 big things holding the iPad back for professionals: touch sensitivity (i.e. how hard you are pressing) and screen thickness.

The screen is honestly too thick to be able to draw accurately. If you use a Jot or any other fine-tip stylus you’ll see what I mean: the point of contact won’t precisely correlate with the pixel that gets drawn if your viewing angle isn’t perpendicular to the iPad screen. That thickness warps your view and makes drawing difficult once you’ve removed the stylus from the screen and want to tap/draw on a particular point (try connecting two 1px dots with a thin line using a Jot).

There also needs to be some type of pressure sensitivity. If you’re ever drawing or writing with a tool that blends (pencil, marker, paint), quick+light strokes should appear lighter than slow, heavy strokes. Right now this is just impossible.
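Jason: An aside on that last point: in the absence of real pressure data, about the best a drawing app can do is fake weight from stroke speed, rendering quick strokes thinner and lighter than slow, deliberate ones. Here’s a rough sketch of the idea in Swift; the mapping and all of the constants are invented for illustration, not taken from any particular app.

```swift
import UIKit

// Hypothetical sketch: approximate stroke weight from touch velocity, since the
// hardware reports no pressure. Quick, light flicks get a thin line; slow,
// deliberate strokes get a heavy one. All constants are invented.
final class VelocityWeightedStroke {
    private var lastPoint: CGPoint?
    private var lastTimestamp: TimeInterval?

    /// Returns a line width for the stroke segment ending at this touch sample.
    func width(for touch: UITouch, in view: UIView) -> CGFloat {
        let point = touch.location(in: view)
        let timestamp = touch.timestamp
        defer {
            lastPoint = point
            lastTimestamp = timestamp
        }

        guard let previousPoint = lastPoint,
              let previousTimestamp = lastTimestamp,
              timestamp > previousTimestamp else {
            return 6.0 // default width for the first sample of a stroke
        }

        // Speed between the last two samples, in points per second.
        let dx = point.x - previousPoint.x
        let dy = point.y - previousPoint.y
        let speed = (dx * dx + dy * dy).squareRoot() / CGFloat(timestamp - previousTimestamp)

        // Map speed to width: slower means thicker, faster means thinner.
        let maxWidth: CGFloat = 10.0
        let minWidth: CGFloat = 1.5
        let fastSpeed: CGFloat = 1500.0 // what counts as “fast,” invented
        let normalized = min(speed / fastSpeed, 1.0)
        return maxWidth - (maxWidth - minWidth) * normalized
    }
}
```

You’d call something like this from touchesMoved and vary the segment width as you build up the path. It’s a passable illusion, but it’s still guessing at what real pressure data would provide directly.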

Oleg Makaed: I believe we will see more support from developers as styluses and related technology for the iPad emerge (to me, the stylus is at an early stage in the life of input devices for tablets). For now, developers can focus on solving existing problems: the fluency of stylus detection, palm interference with the touch surface, and so on.

Tools like Adobe’s Project Mighty stylus and Napoleon ruler can be very helpful for creative minds. Nevertheless, touch screens were invented to make the experience more natural and intuitive, and the stylus as a mass-market product doesn’t seem right. The next stage might bring us wearable devices that extend our limbs and act in a consistent way. The finger-screen relationship isn’t perfect yet, and there is still room for new possibilities.

Dan Leatherman: I think there’s definite potential here. Having formal training in art, I’m used to using analog tools, and no app (that I’ve seen) can necessarily emulate that as well as I’d like. The analog marks made have inconsistencies, but the digital marks just seem too perfect. I love the idea of a paintbrush stylus (but I can’t remember where I saw it).

Dan Weeks: I think children are learning with fingers, but finger extensions, which any writing implement is, become very accurate tools for most people. That may just have been the way education was, with how much it focused on writing, but I think it’s natural that with something whose 3D position you can fine-tune using multiple muscles, you’ll get good results.

I see a huge area for children and information density. With a finger in a child-focused app, larger touch targets are always needed to account for clumsiness in pointing (at least so I’ve found). I imagine schoolchildren would find it easier to go with a stylus when they’re focused, maybe running math drills or something, but for sure in gesturing without blocking their view of the screen as much with hands and arms. A bit more density on screen, resulting from stylus-based touch targets, would keep things from being too simple and slowing down learning.

Jason: What about the stylus as something for enhancing accessibility?

Doug Russell: I haven’t really explored styluses as an accessibility tool. I could see them being useful for people with physical/motor disabilities. Something like those big ol’ Cosmonaut styluses would probably be well suited to people with grip-strength issues.

Dan Weeks: I’ve also met one older gentleman who has arthritis such that he can stand to hold a pen or stylus, but holding his finger out to point grows painful over time. He loves his iPad and even uses the stylus to type with.


It seems the potential for the stylus is out there. It’s a precise tool, it builds on writing and drawing skills most of us already have, and it allows for software more precise and accessible than what we can build for 44pt fingertips.
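To make that concrete: a precise nib means an app could offer smaller, denser controls whenever it knows one is in play. Here’s a rough Swift sketch of the idea. Detecting a capacitive stylus is a heuristic at best (the contact-radius threshold and target sizes below are invented numbers, not measurements), but it hints at the kind of door that opens.

```swift
import UIKit

// Hypothetical sketch: shrink hit-testing tolerance when the touch looks like a
// stylus nib. UITouch.majorRadius (iOS 8+) reports the approximate contact
// radius in points; a narrow nib tends to report a smaller, more uniform radius
// than a fingertip. The threshold and target sizes are invented for illustration.
final class DenseTargetView: UIView {
    var targetCenters: [CGPoint] = []   // centers of small on-screen controls

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        let location = touch.location(in: self)

        // A fingertip needs roughly a 44pt target; a narrow nib can hit
        // something much smaller.
        let looksLikeStylus = touch.majorRadius < 20.0
        let targetSize: CGFloat = looksLikeStylus ? 16.0 : 44.0

        if let hit = targetCenters.first(where: { center in
            let dx = center.x - location.x
            let dy = center.y - location.y
            return (dx * dx + dy * dy).squareRoot() <= targetSize / 2
        }) {
            activate(targetAt: hit)
        }
    }

    private func activate(targetAt center: CGPoint) {
        // Hypothetical: respond to the activated control.
    }
}
```

An app built on that kind of precision could pack in the density Dan describes for math drills, or the small, precise controls Joe imagines.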

Creating an application that requires a stylus is almost unheard of in the iOS world, where most apps are unfortunately aimed at the lowest common denominator of the mass market. Instead, I see the stylus as an opportunity for a new breed of specialized, powerful applications. As it stands today, I see the stylus as almost entirely overlooked.

Yuck!
