


On Touch Support for Drag and Drop Interfaces

To continue from my previous post on building a drag and drop UI, this article studies the differences in experience between touch and mouse input, especially (but not exclusively) for drag and drop interactions. As well as the theory, it covers how I've actually implemented things.

The range of different devices used to access the web means we can never control how people will experience what we create – not only in terms of screen size, but also in how they interact: touch or click? As well as creating fluid layouts that adapt to smaller screen sizes, touch interactions can't be an afterthought either – they're as essential as mouse and keyboard input:

Device types, browsers and input devices vary (presented in Polypane)

Web developers and designers have smartly decided to simply embrace all forms of input: touch, mouse, and keyboard for starters. While this approach certainly acknowledges the uncertainty of the Web, I wonder how sustainable it is when voice, 3D gestures, biometrics, device motion, and more are factored in.

Mouse pointers are most widely used on laptops and desktop devices, whereas touch interfaces are common on both phones and tablets. However, touch isn't limited to smaller screens – tablets can scale up to and beyond the size of desktop monitors, as highlighted by Josh Clark: when any desktop machine could have a touch interface, we have to proceed as if they all do. Therefore, even if we're designing for desktop, it's still important to consider touch interactions.

Here's a summary of the main differences between mouse and finger, adapted from Jakob Nielsen. The table shows many differences, but depending on the context you're designing for, some will have more impact than others.
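One way to put "embrace all forms of input" into practice is to feature-detect the primary pointer and handle mouse, touch, and pen through a single code path. The sketch below is illustrative, not from the article: the helper names are my own, and the input classifier is written as a pure function (taking a matchMedia-like callback) so the decision logic is testable outside a browser. Pointer Events then cover all pointer types with one set of handlers:

```javascript
// Classify the primary input given a matchMedia-like function.
// In a browser you would pass window.matchMedia; taking it as a
// parameter keeps the logic testable without a DOM.
function classifyPrimaryInput(matchMediaFn) {
  if (matchMediaFn('(pointer: coarse)').matches) return 'touch';
  if (matchMediaFn('(pointer: fine)').matches) return 'mouse';
  return 'unknown'; // e.g. keyboard-only, or no pointing device
}

// Pointer Events unify mouse, touch, and pen, so a drag interaction
// only needs one set of listeners. Browser-only; shown as a sketch.
function attachDragHandlers(el, { onStart, onMove, onEnd }) {
  el.addEventListener('pointerdown', (e) => {
    el.setPointerCapture(e.pointerId); // keep receiving events during fast drags
    onStart(e);
  });
  el.addEventListener('pointermove', onMove);
  el.addEventListener('pointerup', (e) => {
    el.releasePointerCapture(e.pointerId);
    onEnd(e);
  });
}
```

On a touch-first device, `classifyPrimaryInput(window.matchMedia)` returns `'touch'`, which could drive decisions like enlarging drop targets – though per the point above, a `'mouse'` result doesn't mean the screen can't also be touched.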
