Design edge cases and where to find them
Edge cases can be difficult—even impossible—to define. Thankfully there are strategies to spotlight where edge cases are likely to hide, and you can use these strategies without changing much of your existing design process.
Considering edge cases matters because when they're ignored, something somewhere in what's built inevitably gets overlooked. And when a product is built without consideration for how it might stretch, break, or be abused, the people using the product are the ones who suffer most.
A headline in another language is translated into a much longer string and breaks the layout of the information on the page. A button label written by a content writer is too long to fit in the designed space, so it gets truncated and fails to effectively communicate the button's purpose.
Or a bad actor — hoping to do harm to users of the product — takes advantage of a profile picture feature by uploading lewd photos for their avatar or stealing others' photos for illicit purposes. Or the product is used for something it was never intended for, psychologically or physically hurting someone in the process. A protected class of people goes unrepresented and can't use the product at all, or those who need accessible functionality — screen readers or dynamic type sizing, for example — don't get what they need in a crucial part of using the product.
We call these types of occurrences “edge cases” because they typically fall outside the defined operating parameters of the project scope.
The task of ensuring a product is flexible enough to withstand edge cases is everyone's job. However, designers have a unique responsibility to ensure that everything someone sees, interacts with, or gains context from within the product is capable of withstanding as many edge cases as possible.
However, we can't—despite our best intentions—uncover every possible edge case for what it is we’re designing. The world is just too big and too complex to cover everything every possible user may encounter.
Instead, we should strive to be aware of the far edges of possibilities, quite literally the edges, in order to make more informed design decisions.
1. Think at extreme scales
When we design, we should treat anything that can be big as colossal, and anything that can be small as microscopic.
Thinking in such extremes ensures our work will scale, even for situations the team may not be readily accounting for.
A text input field will need to work for one input character or 10,000. A button will need to work for a twelve-character label and the 80-character equivalent in another language. A nice 14-point paragraph on our screen might be viewed at a resolution so high it appears to be just 11-point type, or by someone with their browser's base font size set to 120 points. A tab structure will need to support a single tab and maybe three dozen in the future. A drop-down menu will need to account for a single item or maybe 25,000.
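Extremes like these can be exercised with simple boundary-value test data. Here's a minimal sketch in Python; the particular lengths are illustrative assumptions drawn from the examples above, not product requirements:

```python
# Generate label strings at the extremes a layout should survive.
# The lengths are illustrative: single char, short label, long
# translation, and a pathological 10,000-character input.
def boundary_strings(char="W"):
    """Return a dict mapping length -> test string of that length."""
    lengths = [1, 12, 80, 10_000]
    return {n: char * n for n in lengths}

samples = boundary_strings()
for length, text in samples.items():
    # In a real UI test you would render `text` into the component
    # and assert it truncates or wraps without breaking the layout.
    print(length, len(text))
```

Feeding strings like these into every text-bearing component is a cheap way to find the layouts that break long before a translated headline does it in production.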
We should consider scale outside of visual decisions as well: one user may navigate a single screen in our product while another buries themselves 100 screens deep. In either case, how do we help those users navigate to where they need to go? How do we communicate where they are in the experience? Or consider how a user may want to upload a single photo or 40,500. The product may generate one message or error as part of a workflow, or possibly 200.
How do you design for each of these extremes? What happens when the design is carefully crafted for one extreme but not the other? There is no right or necessarily "best" way to answer these questions, as each will depend on the platform being designed for, business objectives, design systems, etc. Some extremes may never even occur.
By accounting for each of these scales we ensure the product can deal with them as well as possible if they ever do take place.
2. Consider accessibility as sense difference
When designing for a digital app that relies on touch input, some users won't have the ability to use touch. Others may have extreme touch sensitivity.
The same goes for being able to see and read the screen: according to some reports, more than 10% of the global population has some variation of colorblindness. The World Health Organization estimates 1.3 billion people around the globe have some form of visual impairment, ranging from distance-related problems to total blindness.
One in five people in the United States has some type of disability that hinders their ability to use digital products in some form. And then there's a whole slew of other conditions that impact people who may encounter your designs at one point or another: mental illness, cognitive impairments, emotional states and abilities, and more.
While product teams tend to talk about accessibility in terms of the most common physical disabilities—typically that of visual or motor impairment—there are many, many forms of accessibility issues which reside at the edges of research or trends.
To cover accessibility edge cases we should consider each of the five senses—touch, sight, hearing, taste, and even smell—as they relate to the product experience, and how those senses can be taken to extremes.
It's good to design for people who may have poor vision or issues with color differentiation, but what about those with overly sensitive vision? How do you ensure a high visual contrast ratio for a screen isn't also going to hinder someone experiencing visual fatigue, for example? What do you do to ensure audible screen reader cues in an app don't also clutter up the interface for style-free visual readers—that is: users who have deliberately disabled style sheets on web products?
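For the contrast question specifically, WCAG 2.x defines a concrete, computable ratio (1:1 up to 21:1) that teams can check programmatically rather than eyeball. A minimal sketch of that standard formula in Python:

```python
# WCAG 2.x contrast ratio between two sRGB colors, per the spec's
# relative-luminance definition.
def relative_luminance(rgb):
    """Relative luminance for an (R, G, B) tuple with 0-255 channels."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Ratio from 1.0 (identical) to 21.0 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A check like this catches the low end (text that fails the 4.5:1 minimum), though the spec sets no upper bound; deciding when maximum contrast becomes fatiguing for sensitive viewers is still a judgment call for the team.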
If we think of accessibility as differences of senses a whole world of considerations opens up to us as digital creators.
How do we ensure our products are easily digestible for the person who experiences debilitating migraines and can only consume digital experiences in daily micro doses? What can our designs do to help those who may rely on their eyes to navigate and interact with technology rather than their hands? How do we ensure things like auto-playing videos work effectively for those with hearing impairment or whose hearing is overly sensitive?
Here again there is no one right answer. It's up to you and your team to determine what's right for your product and which edge cases matter most and which can simply inform how design decisions are made.
In my meditation app, Center, I designed the experience to not only play an audible chime when a session starts and ends, but also vibrate the device and change the color and language of the primary button, to accommodate different types of accessibility needs.
3. Design for the lowest denominator in technology
In 2015 there were 24,000 unique devices—phones, tablets, etc.—running the Android operating system. Today that number is much, much larger. And that's just one operating system. Roughly 27 different devices today can run Apple's iOS operating system: iPhones, iPod touch devices, and iPads. Then there are hundreds of thousands of other devices that can access the web and run software on similar operating systems.
A product team may be building for one or two OSes, but how many unique device types are they building for? If you're building for Android—as an example—the number is more than 24,000. For iOS, it's more than 20. Yet how often do product teams test and ensure a quality experience on more than one or two devices? What about internet connection speeds? Or the age of the device?
A modern app may run smoothly on the latest Google Pixel 3a or the iPhone XS, but it might also run painfully slow on a Sony phone from eight years ago (the typical life span of a smartphone in India).
As product designers we must consider not merely human edge cases but technological ones too.
Elaborate animations to communicate information on a screen might feel advantageous to users, but what if the device the person is viewing the animation on takes more than a few seconds to render each frame? Or consider how a complex page may look great on a brand new, 5120x2880 iMac screen, only to look like a jumbled mess on a 480x800 Samsung Galaxy Star Advance phone. What about the person browsing our website or app on a gigabit internet connection as opposed to the person on a 23.43 Mbps connection? How does what we're building work effectively and efficiently for each?
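The connection-speed gap is easy to make concrete with back-of-the-envelope math. This sketch assumes a hypothetical 3 MB page weight and compares a roughly gigabit connection to the 23.43 Mbps one mentioned above, ignoring latency and protocol overhead:

```python
# Rough transfer time for a page payload at a given connection speed.
def seconds_to_download(payload_mb, speed_mbps):
    """Transfer time in seconds; converts megabytes to megabits."""
    return payload_mb * 8 / speed_mbps

payload = 3.0  # MB, a hypothetical page weight
print(round(seconds_to_download(payload, 1000), 3))   # fast connection
print(round(seconds_to_download(payload, 23.43), 3))  # slower connection
```

The same payload that feels instantaneous on the fast connection takes about a full second on the slower one, and that's before render time on an underpowered device, which is the argument for budgeting payload size against the slowest realistic connection.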
How do you design a smooth navigation experience for someone who has a brand new, fresh-out-of-the-box iPhone versus the person who is using an older phone with a completely cracked glass screen? Having to scroll endlessly on one of those devices is a physically painful experience.
When building digital products we should consider not only the limitations of those who encounter our work, but also the technologies they’ll be using.
Here is one powerful way to accommodate different technologies: focus on the lowest denominator. The smallest screen size and the most constrained tech. People with the biggest screens, fastest connections, and most powerful devices will still get the benefit of the minimal experience regardless, but if you design for the "best in tech" experience, those at the lowest end suffer.
Again, however, the solution will vary depending on who you’re optimizing the experience for.
At Facebook we had to design for a global audience, so ensuring our experiences worked across different types of technologies meant crafting a reliable and enjoyable experience without making things overly complex. Facebook may not be the most beautiful or ornately designed app, but that’s by design.
4. Remember that intentions scale too
The last strategy is to scale to extreme intentions. Most people will want to use an experience as it was designed, but there are those who will fall at extremes.
People looking to cause harm—known as “bad actors”—will seek to take advantage of the product to inflict damage, while others may want to do “good” at scale as well.
Bad actors will take advantage of anything in the product to harm others or the business. They will flood input forms with spam, utilize image socialization to spread vulgar or misleading photos, or poke around for holes in data validation to steal information or crash servers. The bad actor’s intentions tell us as product builders that anything which can be used for harm will be.
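The input-form example above is where "anything which can be used for harm will be" becomes a design constraint. A minimal sketch of defensive validation for a hypothetical comment form: the limit and helper name are illustrative assumptions, not a prescription:

```python
# Defensive validation for a hypothetical comment field: cap length,
# reject empty input, and strip control characters before the text
# reaches storage or other users.
MAX_COMMENT_CHARS = 10_000  # assumed product limit

def validate_comment(text):
    """Return a cleaned comment string, or raise ValueError."""
    if not isinstance(text, str) or not text.strip():
        raise ValueError("empty input")
    if len(text) > MAX_COMMENT_CHARS:
        raise ValueError("input too long")
    # Drop non-printable control characters that can break rendering
    # or pollute logs, while keeping newlines and tabs.
    cleaned = "".join(ch for ch in text if ch.isprintable() or ch in "\n\t")
    return cleaned.strip()

print(validate_comment("Hello \x00world"))  # control byte removed
```

The design question is what the interface says when validation fails: a hard limit with no explanation frustrates good actors at scale, while a clear message costs the bad actor nothing but keeps honest users informed.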
On the other end of the scale are good actors who want to utilize the product as intended, but at extremes. They’re the ones who use Twitter or Facebook at every minute of every hour of every day. They capitalize on free bandwidth offered by products like Dropbox or Google Photos to upload and share every possible photo for friends and family.
To design for intentions we should consider each extreme.
How might information in the product we design be used against other users or our business? How will we communicate effective and healthy behaviors to those who wish to cause harm, or who might be spending a bit too much time in our products? What can be done to keep the private data of good actors from becoming accessible to bad actors through complex privacy settings?
In each case our job as product designers is to consider these questions and keep them in mind as we work. We may not be able to catch every situation that will occur, but we can be diligent about closing gaps and resolving problems whenever they make themselves known.
Working closely with product managers and engineers—as well as researchers, content strategists, and business leaders—to keep these extreme intentions in mind can set us up for success.
We may never be able to identify and plan for every possible use case, but we can look for the areas where something may go wrong and make the experience flexible enough to meet it.
To do so we should think at extreme scales, consider accessibility as sense difference in the people using what we design, design for the lowest denominator in technology, and remember that intentions scale too.