EXECUTIVE SUMMARY

The Tristream study team researched, evaluated, and defined the best practices of development teams that produce highly usable and highly productive user strategies, user experiences, and user interfaces for web-delivered applications. Applications developed using a significant number of these best practices showed higher user satisfaction, ease of use, productivity, accuracy, efficiency, and customer-service satisfaction, along with reduced staffing needs, training costs, and help desk time, and better bottom-line ROI.

Differentiating Practices

Our goal was to identify the practices used by teams that "stood out" from the rest. We did not try to catalogue all the practices in use by development teams, even those essential to development, such as many project management practices. Instead, we looked for practices that clearly resulted in superior application development and that were absent from, or poorly executed by, other teams.

The best practices we identified range from team structure to team member skill sets, from user strategy development techniques to wire-framing, from internal documentation to establishing ongoing user groups, from prioritizing business requirements to working with offshore teams. In all cases, the best practices we identified, like cream, rose to the top.

“Core” Teams

In this, the first version of our best practices study, we concentrated on "core team" practices. Core teams vary in size from 5 to 15 people. The core team is hands-on: it works directly with a specific application, or a portion of an application, and is responsible for developing all the required screens, down to the last pixel. Frequently the core team hands off the programming to a larger group, often offshore.

Large application development projects frequently have multiple core teams working on the same application. In the next version of this report, we will address the best practices for managing and supporting multiple teams working on the same application; in this report we do not have sufficient information to address the coordination of multiple teams beyond an occasional observation.

However, we will say without reservation that it is the core team that really ensures great application development. All the coordination in the world cannot offset poor work in the core team; conversely, a great core team can even rise above some of the problems created by poor coordination.

So whether your company's IT department numbers under a hundred people or in the thousands, you will find the best practices identified and described in this report highly applicable to your core teams.

Offshoring

Many of the best practice descriptions address specific ways to work well with offshore elements of the development process, particularly the programming team. Offshoring appears to be here to stay, and the best teams have adapted to the opportunities and problems it presents.

Challenges and solutions center on communication and cultural understanding, which are sometimes intermixed. Offshore programming teams in India, for example, are not comfortable with the directness of the typical American conversational style, and will often take a passive role in the process as a result. The best teams make an effort to draw out their counterparts' good solutions, often sending team members to India on a regular basis.

On the other hand, offshore teams need to send key team leaders to their client facilities on a regular or long-term basis. Only then can the offshore team gain a real feel for the users and business needs around which the application is being developed.

Thorough communication is essential. Offshore teams generally need a more specific design specification than an internal programming team sharing the same facilities. This report describes many best practices that alleviate the problems associated with offshoring, including a special section, "Tips on Outsourcing."

New Applications and New Releases

Most of the best practices identified in this report apply most directly to the creation of new applications and to the development of major new releases of existing applications. We focused on how the best teams applied their methods to applications with as few pre-existing limitations as possible. A clean-slate start, or a major new release (new version) of an application, affords the maximum opportunity to implement greater usability and productivity.

However, the best practices we identified are also applicable to making incremental changes between major releases, by recognizing pre-existing limitations and applying these best practices with common sense. The same principles apply, but a team typically cannot go as deeply into changing the user strategy or the user experience as it might like when making incremental changes. In such cases, teams typically need to preserve the user experience conventions already in use throughout the rest of the application; otherwise the user will have an inconsistent and confusing experience.

Differentiating Between User Strategy, User Experience, and User Interface

The application development industry tends to treat user experience (UE) development and user interface (UI) development as the same process, sometimes using the combined acronym UE/UI. We have found that to be emblematic of many of the problems we routinely encounter: teams do not recognize or make distinctions between what should be separate phases of the development process. As a result, teams often develop user strategy, user experience, and user interface elements out of sequence, or, in the case of user strategy, not at all.

There is an old story (told with many variations) of a science professor filling up a glass jar in front of his students without any explanation. First he places two or three large rocks in the jar. Then he adds as many medium-sized rocks as he can fit around the big rocks; then he fills all the remaining space he can with pebbles and sand; finally, he adds water right up to the brim. Then he asks his students, "What did you learn from this experiment?"

Many students offer answers such as, “You can always get more in the jar,” or more metaphorically, “You can always do more with your time.”

Finally the professor, acknowledging the truth of what his students offered, says, “All that may be true, but the most important thing you learned was that you have to get the big rocks in the jar first.”

Using this analogy, user strategy corresponds to the big rocks. User strategy typically addresses big-picture considerations: Where does this application fit in with the rest of the company's information systems? Is it part of a portal? Does it serve vendors and customers as well as internal employees? What is the basic "flow" of the application? Does it mirror internal divisions, such as sales, accounting, and so on? Or does it mirror workflow, such as initial call, proposal creation, order confirmation, fulfillment, follow-up? Is the application primarily for new users? Occasional users? Power users? Will the application have more than one "track," or significant user-preference settings?

If these overarching user strategy decisions are not consciously made at the beginning of the development process, user strategy will be determined almost randomly by a series of other decisions—and the result is rarely clear, usable, or productive. As the old saying goes, "If you don't know where you're going, any road will get you there."

The best teams address user strategy first and make it a distinct step in the process. Your user strategy will probably need to be adjusted as the development process unfolds, and as new needs or limitations emerge, but the core strategy should be strong enough to survive such adjustments without losing clarity.

User experience development flows from user strategy and corresponds to the medium-sized rocks. Once the overall user strategy has been decided upon, the best teams address themselves to understanding their users' tasks and workflows. The best applications make users feel as if the designers knew exactly what they would want to do next. In addition, the best applications recognize distinct user types and allow each to perform its tasks without confusion. New or occasional users will not have an agreeable user experience in an application designed for everyday power users, and vice versa.

There are no hard-and-fast "best user experience" conventions. If a team were to put all the best conventions together in one application, it would be a confusing mess. The "best user experience" is one that is clearly consistent and well matched to the specific needs of each user type.

Great user experience design is born of an intimate knowledge of business needs, the users' needs, and a robust, iterative wireframe design process. We did not find a single excellent, or even good, application that was not first designed visually. A user's experience is primarily visual. What he or she sees on the screen is the basis of his or her experience.

User interface design flows from user experience design and corresponds to the sand and water that go into the jar last. The user interface is made up of the visual elements seen on the screen: colors, shapes, banners, headers, modules, typography, icons, buttons, and controls. The user interface "clothes" the user experience in an easy-to-see, easy-on-the-eyes, ergonomically developed toolkit of interface elements.
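
To make the idea of an interface "toolkit" concrete, the sketch below shows one way a team might codify such elements as a single shared set of design tokens. This is purely a hypothetical illustration on our part, not a practice or artifact drawn from any studied team; all names and values are invented, and TypeScript is used only for readability.

    // Hypothetical design tokens: one shared definition of the interface
    // toolkit, so every screen draws on the same colors, type, and controls.
    export const tokens = {
      color: {
        banner: "#1a3c6e",  // banner and header background
        text:   "#222222",  // body text
        accent: "#c8102e",  // primary action buttons
      },
      type: {
        body:   "12px Verdana, sans-serif",
        header: "bold 16px Verdana, sans-serif",
      },
      control: {
        buttonHeight: 22,   // pixels, uniform across the application
        inputWidth:  180,
      },
    } as const;

    // A screen asks the toolkit for its styling rather than inventing its
    // own, which keeps the interface consistent from screen to screen.
    function styleButton(button: HTMLElement): void {
      button.style.background = tokens.color.accent;
      button.style.height = `${tokens.control.buttonHeight}px`;
      button.style.font = tokens.type.body;
    }

The point of such a toolkit is exactly the "clothing" role described above: user experience decisions stay where they belong, and every screen dresses them in the same interface elements.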

The basic development flow—user strategy, to user experience, to user interface—ensures that the most important decisions are made first and then cascade down to the smallest decisions.

Early in the Development Lifecycle

This study focused on the elements of the development lifecycle that had the most influence on creating highly usable and highly productive applications. We found, not surprisingly, that the initial development stages have the most to do with creating great applications: determining business requirements, gathering user requirements, designing the user strategy, then the user experience, then the user interface elements, and, throughout the process, maintaining good communication and documentation.

We did not spend significant time studying acceptance testing, IT implementation, QA, user training, initial launch, and the other very important steps in getting an application developed, because we found that these activities, falling late in the process as they do, had a diminishing impact on the final quality of the application.

The most important work—as far as usability and productivity gains are concerned—happens in the early stages of the development process. If you don't get things right early, it is very hard to do much about them later in the process.

Across All Types of Applications

The best practices we identified and describe in this report work for any web-delivered application. Our study included publicly accessible Web sites, employee-only intranet applications, and business-to-business applications. The best practices would, however, be applicable to any process involving a screen, whether a full-sized computer monitor or a small handheld screen.

The applications we studied are all on the critical path of each company's service delivery or internal operational processes or, where directly accessible to consumers, are central to the company's revenue generation. The majority of the applications under study are internal and/or B2B applications, and our study results are therefore strongly applicable to teams developing these types of applications.

However, the teams that developed B2C applications using these best practices also achieved high user satisfaction and productivity.

Company investment in the applications ranged from significant to extremely significant. The companies that participated in the study ranged in size from small ($20 million in annual revenues) to global (Nokia, with nearly $40 billion in annual revenues). The applications under study ranged from small, single-purpose applications to enterprise-wide applications with dozens of user types and thousands of users. Development team size ranged from 5 to 100, depending on the size of the programming team. Development budgets ranged from hundreds of thousands to tens of millions of dollars. Development time frames ranged from a few months to as many as five years.

Despite this wide range of company types, application types and sizes, budgets, development team sizes, and development time frames, very clear and consistent best practices emerged that cut across all these factors. Tristream found clear and universally effective best practices for gathering user input and business requirements, for developing the user strategy, user experience, and user interface, for team dynamics and composition, and for documentation, all of which any development team can take advantage of regardless of team size or project scope.

Across All Types of Tools

Our evaluation does not address the strengths or weaknesses of any particular development process (such as RUP, GPLC, or XP), support application (such as Visio or Rational Rose), or modeling notation (such as UML). The decisions that drive the selection or deployment of processes, support applications, and tools relate more to the budgets, established culture, and preferences of the individual companies. By themselves, these choices neither ensure nor work against the creation of a great application.

The best practices Tristream identified cut across all processes, applications, and tools.

Best Practices Contrasted with "Normal" Practices

In the detailed descriptions of the best practices, you will find that we also include the "Normal Practices" we encountered. When we shared our findings with the teams as the study concluded, many considered themselves to be following the best practices when in fact we knew they were not fully following them. Perhaps they were short-cutting some key aspects of the practices, or perhaps they had been doing things one way for so long that they could no longer truly see the fine points of what they were doing.

We therefore added descriptions of "Normal Practices," which allow development teams to contrast their own methods with the actual best practices. The differences are sometimes subtle, so we have tried to provide as many quotes and anecdotes as possible to illustrate both the best practices and the normal practices.

Bottom Line

Where the best practices identified were used, application usability and productivity were clearly superior; where they were omitted, application usability and productivity were clearly inferior.

Enterprise computing is in transition from the client/server model to a web-based delivery model. Many companies have taken significant steps toward web-delivered applications; thousands more are in process or are just beginning. The transition appears to be inevitable: surveys indicate that almost all IT departments in major corporations intend to make the transition in order to take advantage of the flexibility and freedom of web-delivered applications, whether now or at some time in the future.

While underlying technologies, such as databases and programming environments (such as .NET), offer solutions to many of the technical problems encountered in the transition to web-delivered applications, the processes for developing user strategies, user experiences (UE), and user interfaces (UI) have not kept pace. Because these practices lag behind the technology's potential, most applications neither take full advantage of the new technology nor provide high user satisfaction. Both shortfalls prevent companies from realizing the full ROI of their technology investment; most applications, as a result, are neither user-friendly nor highly productive.

The transition from the client/server model to a web-delivered model for applications has brought together two independently evolved methodologies for developing applications. The client/server model, the backbone of corporate computing for decades, is supported by rich and stable development methodologies that have had decades to mature. The methodologies that have evolved with the Web are newer and less well defined. Developing web-delivered applications requires that development teams merge these two processes.

While the two development processes (for client/server and Web development) do not differ greatly, there are important differences, particularly where user strategy and user interface development are concerned. Traditional client/server development projects had the benefit (or, at the same time, the obstacle) of an already developed user interface, which in turn shaped the possible user strategies. Most programming environments designed for developing client/server applications came with well-developed UI components.

In the web-delivered application arena, the user interface cannot be treated as a given, as it often was in client/server application development. The wide variety of browsers that can be (or need to be) supported, and the continuing evolution of display code and browser capabilities, make user interface development for web-delivered applications more challenging, and at the same time create a greater opportunity to deliver highly usable interfaces.
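
As a small, hedged illustration of this point, the sketch below shows the kind of runtime capability check a team might use so that an application degrades gracefully across browsers rather than assuming a fixed set of UI components. The function names are our own invention, not taken from any application in the study.

    // Hypothetical sketch: detect what the user's browser actually supports
    // before committing to a richer interface behavior.
    function supportsInPlaceUpdates(): boolean {
      // In-place data refresh requires an XMLHttpRequest object (or the
      // older ActiveX equivalent found in early Internet Explorer).
      return typeof XMLHttpRequest !== "undefined" ||
             typeof (window as any).ActiveXObject !== "undefined";
    }

    function chooseRefreshStrategy(): "in-place" | "full-page" {
      // Use richer, in-place updates where supported, and fall back to a
      // conventional full-page reload everywhere else.
      return supportsInPlaceUpdates() ? "in-place" : "full-page";
    }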

On the other hand, user strategy and user experience development, while partly driven by the capabilities of the user interface (UI), remain the core challenge of either development methodology, and neither methodology ensures good results. It is here that we found the greatest weakness among the teams evaluated, regardless of the overall methodology each team employed for the development process.

Our thanks to the participating companies listed below. They provided generous and unfettered access to their applications, their users, and their development teams. Without their cooperation this study would not have been possible.

The Tristream study team interviewed six to eight users from each primary user group of each application in order to evaluate the application's usability and productivity. Tristream then ranked each application according to user perceptions of usability and productivity. Additionally, all members of the Tristream team, after familiarizing themselves with the applications under study, ranked the applications by the same criteria to add a greater measure of objectivity to the results.

The user interviews and ranking process established the comparative success of each application in terms of satisfying user needs and meeting work goals. The ranking then provided a baseline of success for evaluating the practices of the teams that developed the applications.

We then evaluated the development practices, team dynamics, and team composition of each development team. The teams interviewed were internal, though in some cases partial outsourcing was used to help with the user experience strategy, the user interface design, or code development.