The Black Sheep of App Dev

Summary: If you’re a UI design professional, you can probably relate to the way UI designers are often treated as the outcasts of application development projects.


Where it all began

Since the beginning of software development, creating a usable interface has always been a challenge. Before the Internet boom and the wave of web-based applications that followed, technology teams developed green screen and client/server applications with limited UI capabilities and design freedoms. These inherent limitations caused many development teams to overlook the importance of including UI designers in the development process, because the perception was that there was only so much “designing” you could do. Unfortunately, it appears we are still struggling to find a new solution to an old problem. Today, the lack of focus on UI design has been further compounded by the dramatic increase in demand for web and mobile applications.

In addition, during the Internet boom, developers became a very hot commodity because they were the only ones who could get companies up on the Web. During the late 1990s, developers were demanding $200 to $250 hourly rates, and some were even getting professional athlete-type signing bonuses. This explosion of demand put IT departments in the driver’s seat and enabled them to control all web-based projects. That ownership is exactly why most applications are still developed using a system-centered approach rather than a user-centered design.

If it’s not “usable” then what’s the point?

Sounds simple, yet making an interface usable continues to fall to the bottom of the priority list in most application development projects today. Many projects start out with good intentions and agree that the application should be easy to use. In fact, oftentimes making an easy-to-use application was the entire business driver behind starting the project in the first place. The irony is that the tasks involved in making a usable interface are usually the first to be scaled back or cut out entirely. (The next to go is QA/Testing, but we’ll save that for another article.)

Since most application development projects are managed and executed primarily by Information Technology (IT) teams, the project’s priorities are usually focused on developing functionality that “works” according to a set of Business Requirements and Use Cases. This can be characterized as a “system-centered approach.” Its primary flaw is that the User Requirements are often left out and never defined. User Requirements help the entire project team understand what the users’ needs really are and how users will perform the tasks driven by the Business Requirements. For example, a Business Requirement may define the need to capture product data into a database that can be accessed through a web browser. The User Requirements may define how a user will enter the data into the system based on past experience and current needs.

Articles are regularly published about companies that have lost millions of dollars developing and deploying failed applications and websites – mainly because their end-users struggle to use them. The unfortunate part about these colossal failures is that many of them could have been easily avoided if the teams had utilized a more “user-centered design” approach.

The bottom line is that making something “work” is never good enough. Making it work and making it usable should always be the goal. Over the years there have been countless examples of products that failed not because they didn’t work, but because people could not figure out how to use them. For example, how many VCR clocks needed to blink before manufacturers finally redesigned the interface to make it easier to set the time? Of course, if you couldn’t set the time, you couldn’t record a show – so one poorly designed feature often impacts another. The same holds true in designing an application. If an application is difficult to navigate, chances are many of its features will go unused because the users can’t find them.


The value of using UI design specialists

A great deal of a UI design team’s time is spent educating project teams and clients on the value of involving UI design specialists from the beginning and throughout the entire lifecycle of an application development project. Oftentimes the education process involves a presentation, which may include:

  • A walkthrough of the user-centered design methodology.
  • Sample deliverables so clients know what they are paying for and technology teams understand what will be delivered to them and how they will need to be involved in the process.
  • Samples of applications that did not utilize a UI design specialist versus ones that did.
  • Research that demonstrates compelling facts about the importance of usability and UI design in the words of industry experts, instead of your own.


Let the developers develop and the designers design

The most successful application development projects are those that utilize resources for their strengths and do not stretch them into something they are not. Obviously, some projects with smaller budgets cannot avoid using resources that need to wear many hats, but making the database engineer design the UI should be avoided at all costs. In my experience, even a little help from a UI design specialist can have a tremendous positive impact on a project’s success. On one project, the budget did not account for a UI designer at all, but halfway through, the Project Manager could see that the interface was not coming together. The team found a way to involve a UI designer for two weeks so the developers could at least be given templates to apply to all the screens of the application. In that case, the project went from a path of certain disaster to one that the customer praised and, more importantly, the users could easily use.
