Out-of-the-Box Instructional Design: Moving from Assembly-Line Models to Non-linear Performance Models

Diane M. Gayeski, Ph.D.  OmniCom Associates

A similar version was published as "Out-of-the-Box Instructional Design" in the April issue of ASTD's magazine, Training and Development.

I'll get right to the point: traditional step-wise, linear models for instructional design no longer fit today's learning and performance improvement environments. Few people ever really followed them, even if they tried. Moreover, in today's environment, these models are not merely ineffectual -- they are actually detrimental to our developing more strategic and effective roles within organizations.

The old, assembly-line "ADDIE" (analyze, design, develop, implement, evaluate) instructional design model just doesn't make it anymore. Over the past decade, my colleagues and I have developed some alternative models for the practice of designing and managing training and performance improvement systems that I'd like to share with you here.


When I work with training and HR managers, they often are looking for solutions to problems like these:

* Training developers seldom have the time or resources to do needs analyses or evaluation.

* Clients and sponsors generally come to training departments assuming that they know the solution to performance problems, and demand training solutions -- but often those solutions are not really effective in improving the situation. This cycle causes training and HR professionals to lose credibility and influence within the organization.

* Organizations need to develop training faster -- by the time training departments get courses or materials together, the content is often out of date.

* As hard as designers try to incorporate subject matter experts' input, it seems harder and harder to get content that everybody can agree on. After programs are developed, subject matter experts or managers seem to suddenly appear and either openly or privately dismiss the credibility of the courses or trainers.

* Many enterprises don't have enough trained instructional designers and performance consultants, and the ones they do have don't use consistent methods. This leads to inconsistency both in the style and methods of the interventions themselves and in the process of working with clients and subject matter experts.

* When training and performance improvement projects are handed off to a new person or when a program needs to be updated, the new developers generally have very little documentation to go on from the original designer. This causes departments to continually re-invent the wheel in terms of individual projects as well as in terms of general knowledge about project management.

* Our profession's language and methods don't seem to be really embraced or understood by our executives and sponsors -- we are not as successful as we'd like in getting our models as widely accepted as other business improvement notions, for example "quality" or "reengineering".

Sound familiar? OmniCom Associates' benchmarking research and my experience with many hundreds of client engagements reveal that these situations are common, even among the leading HR and training groups internationally.

My critique of current instructional design models is that:

* They assume a "top-down", behavioristic, and subject matter expert-driven approach to training rather than a more collaborative and learner-based approach. Somehow the assumption is that there's some body of "correct" knowledge out there and we just need to find a subject-matter expert, suck the knowledge out of his or her head, package it in the right container of media and messages, and transfer it to the heads of our learners. The underlying mindset is that the SME and the designer can somehow decide on a complete set of objectives and content in isolation from the dynamic exchange among learners and teachers in a real-time communication environment, that they know what's best for the learner, and that there is little debate or diversity about the topic. These models really have no mechanism for explaining how to deal with content that is controversial, or with a diversity of approaches or procedures. They are designed to avoid unplanned discussion of topics, "getting off the track", or incorporating new ideas or methods during the actual learning process. Lastly, they don't incorporate the learner into the design and knowledge-exchange process -- except as the passive "receiver" at the end of the chain.

* Discrete step-by-step procedures are too linear and time-consuming to be practical in the "real world" of training on fast-changing topics. The cycle time to development is too long, and too much time is spent in detailed analysis and design before getting feedback on drafts or before any sort of training can be "rolled out". They force developers into thinking that they can't offer anything until long processes of analysis and design are completed, and often this means that critical problems aren't addressed at all for many months. In fact these linear models don't reflect the ways that experts actually work when designing learning materials and performance improvement projects. It's been documented that experts in learning and performance improvement systems apply a much more iterative approach -- challenging assumptions, brainstorming possible solutions, and rapidly generating a rich set of possibilities before gradually narrowing them down by considering a wide set of factors.

* These models seem to assume that one particular kind of intervention -- instruction -- will solve a performance problem. They are oriented to creating courses and training media programs, not "information bites" or performance support initiatives. ISD models don't help designers choose when NOT to develop instruction, and when to use or combine other types of performance improvement interventions such as new selection procedures, feedback systems, or job redesign.

Perhaps by reflecting on these critiques, you can see why the "boxes" models for instructional design may be doing more harm than good. We need to step out of the boxes and look at approaches that help us get the kinds of results and buy-in that are necessary for success. Instead of struggling against your clients and sponsors who don't see the need to go through the ADDIE steps, let's step back and see if they don't have a point.

We've moved away from assembly lines, even in factories, and the management cultures that most executives are trying to foster have to deal with speed, performance, collaboration, flexibility, continuous improvement, and diversity. Our step-wise models are out of step with these new initiatives, and it's no wonder that we as a profession are getting further out of the mainstream of strategic decision-making in many businesses. The ADDIE model's language and philosophy only serve to further isolate us from contemporary management practices, getting our profession even further "out of the loop".

Unfortunately, many HR and training managers are just trying to make the same old ISD processes faster or to somehow convince trainers and clients to use these old models by putting them through expensive instructional design certification courses and "installing" complex procedures developed by consultants. Many organizations are rushing to software systems that "automate" instructional design through some sort of expert system or templates -- trying to see if they can't speed up the process or make it more consistent. My comment on this is that we're speeding up and institutionalizing what's often a bad process and inappropriate product.


So let's think about a new set of assumptions for our work in performance improvement and learning:

* Design is a collaborative process, and is an important end-product in and of itself. In my work designing educational programs on ethnic studies and in developing scores of interactive media programs, I've found the need to support the efficient solicitation and analysis of input from a variety of constituencies -- including prospective learners -- within the design phase. It's no longer adequate to get the opinion of one subject-matter expert. Rather, the design process, when well done, is an organizational development project in itself. In many projects, the uncovering of hidden differences in approaches and the facilitation of a group of content experts, managers, and performers to come to some mutual understandings was in itself a major accomplishment. My colleagues and I often commented that we could throw away the "end product" of the videotape or CBT program -- the entire collaborative design process had itself accomplished important objectives of shared understandings and new relationships in an organization. This goes way beyond "analysis" and "design" -- and involves important dimensions of facilitation, teamwork, and the modeling of the organization's commitment to diversity and novelty in approaches.

* Training is only one possible solution to performance problems. We shouldn't assume that all problems coming from clients need to be solved with a training package. Rather, the solution might be an electronic performance support system, recommendations for incentives or the removal of barriers to effective performance, a knowledge capture and dissemination system, or a better mechanism for communication and feedback. Many organizations are trying to make the leap from traditional training to performance improvement but most have yet to adopt some type of model that helps analysts decide which set of interventions will fit the bill. Unfortunately, learning a complex set of ISD procedures only biases everyone towards using them and relying too heavily on training interventions.

* Potential interventions must be evaluated by their potential for return on investment -- both of time and of money. Most instructional design processes are devoid of mechanisms to forecast and compare the likely impact of learning and performance systems upon the financial and human resources within an organization -- or even for an individual student. New systems must help designers calculate potential and actual ROI and must advise designers when not to produce any intervention!
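The kind of forecast this implies need not be elaborate -- basic arithmetic comparing candidate interventions is often enough to steer a decision. Here is a minimal sketch in Python; all dollar figures are hypothetical and for illustration only, since the article gives no actual numbers:

```python
def roi_percent(total_benefit, total_cost):
    """Return on investment as a percentage: (benefit - cost) / cost * 100."""
    return (total_benefit - total_cost) / total_cost * 100

# Hypothetical comparison of two interventions for the same performance gap:
course = roi_percent(total_benefit=120_000, total_cost=80_000)   # full training course
job_aid = roi_percent(total_benefit=90_000, total_cost=10_000)   # simple job aid

print(f"Course ROI:  {course:.0f}%")   # 50%
print(f"Job aid ROI: {job_aid:.0f}%")  # 800%
```

Even a rough comparison like this makes it visible when the cheaper, less "glitzy" intervention is the better investment -- or when neither one clears the bar and no intervention should be produced.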

* Designs are only good if you can get potential sponsors' and colleagues' buy-in. New models need to not only help designers make decisions, but also to help them collaboratively develop and communicate their visions to sponsors and others on the development team. Digital systems, for example, allow designers to easily show clips of other similar projects and to create customized presentations for different audiences to enhance understanding through the various phases of development. They can also support online and collaborative development processes that allow more participation from a design team throughout the entire process.

* Design is an iterative process, not a linear one; projects are never "done". In today's environment, content changes too quickly to allow for long, drawn-out development schedules. Updates are a reality, not a problem. Therefore, new systems should be more like a "digital workbench" that supports prototyping through built-in audio and video capture and mock-up tools that designers can use to rough out ideas. Prototypes can be polished and evaluated iteratively and even used by learners before they're complete. Most interventions should be seen positively as works-in-progress, rather than "events" that are "completed". We need to put more emphasis on content and the ability of some intervention to improve performance rather than on the "glitz" of an end-product. A simple text check-list may be more effective than a dramatic videotape and might be able to be developed and productively used within several hours.

* Projects and development teams will span greater distances of time and space. New software tools for analysts and developers will need to help them communicate via e-mail and perhaps teleconferencing, assist in translation, and will need to be portable so that systems can easily be carried to different locations. The system of the future will make it easy for members of a team to "zap" information back and forth over phone lines, the Internet, or even wireless communication.

* Everyone is a researcher and a learner. We need to move away from thinking of ourselves as "producers of training" towards a professional identity as "managers of the knowledge and communication infrastructure". Rather than thinking about developing static courses and materials, we should consider how we can facilitate ongoing sharing of experiences, coaching, and on-the-job performance support systems that can incorporate the contributions from a wide range of people.

* One size does not fit all. Many organizations are looking for an off-the-shelf system to "automate" or "standardize" the design and development of learning materials. Although such software and hardware certainly can be and have been developed, they may wind up producing "cookie-cutter" outputs. One of the most important tasks of a performance improvement professional is to create and nurture a unique corporate culture. This needs to be done by carefully developing styles of communication, instruction, and collaboration that are consistent with the unique "voice" of the organization. In our interactions with successful companies such as Southwest Airlines, Ben and Jerry's Homemade, and Cracker Barrel Old Country Store that are built on strong, innovative cultures, we have seen that training systems actually teach more lasting and important lessons by how they teach than what they teach. No expert system or set of procedures would advise a trainer to run "culture days" like Southwest Airlines does, or to teach new employees about courtesy and service by having them sit on the restaurant porch and listen to customer conversations as they leave, as do some trainers at Cracker Barrel Old Country Store. Rather, some off-the-shelf set of rules or templates would probably have recommended a CD-ROM tutorial as more "efficient". However, it's easy to see that unique expressions of culture are much more important to impart than a set of "learning objectives" that will never really capture what makes a company and its employees truly successful.


Our profession needs to develop systems that are more in sync with contemporary management approaches. That's exactly what I've been working on for the past decade -- both as tools and mindsets for OmniCom's practice, and as methods that we can pass on to our clients who look to us for new approaches. The practice model we developed, IcoM (Integrated Communication Model), is a combination of analysis and design procedures with a "digital workbench" set of hardware and software tools that support rapid collaboration and prototyping. These are selected and deployed in different ways in different organizations, rather than being an "off the shelf" product. (It's more like a salad bar than a frozen dinner.)

Let's look at what "out of the box" design in action looks like. I'll give you an example that's an amalgam of some of our actual projects:

It's 8 AM and the office phone is already ringing. It's Susan, our client at Cayuga EnergeX, who says that one of their product managers has requested a CD-ROM on selling skills and product knowledge for a new line of energy-efficient motors. The new product line will be launched within eight weeks, and the sales reps to be trained are located all over North and South America. A CD-ROM in eight weeks! Impossible! And how can she ever break away from doing only training and get into other types of performance improvement projects when her product managers keep asking for rush jobs like these?

Taking another sip of coffee, I assure her that some intervention to help the sales force can be offered in short order. We need to analyze what the sales reps will need to be able to do that they can't already do in order to effectively sell this new product. To get the information we need, it's necessary to get input from many different constituencies -- the engineers who developed the product, the marketing and advertising folks who are creating sales aids and strategies, the sales managers who will be responsible for setting goals and coaching the reps ... and let's not forget the reps themselves and their customers. This participatory approach to design can sound pretty formidable, but Susan knows that it can happen quickly and that it's essential to get this level of contribution early on.

I tell her to get a list of representatives from each of these constituencies and ask them to contribute about half an hour to 1) participate in one of three kick-off Internet-based conferences that we'll hold at different times to accommodate their schedules; and 2) work with our online program, CEI: The Content Expert Interviewer, that we can quickly customize and post on our website to gain their initial input on performance gaps, audience characteristics, major information points, and evaluation measures.

Within three days, we've had short audio or video Internet conferences with different subsets of our 18-person design team, and thanks to the computer microphone/speaker systems (and in some locations, little video cameras that cost under $250.00 that attach to individuals' computers), we have successfully kicked off the project with worldwide collaboration without even making a long distance call. Within another three days, each of the design team members has filled out the information requested by CEI; even those members who are traveling can use their laptops to sign onto our website at night from their hotel rooms. We quickly analyze their input for areas of commonality and areas of divergence, summarize these, and open up the document file for the communication and training standards we developed for the company. We send out e-mail to each person on the design team asking them to again check our website and, using a groupware package, work through prioritizing and discussing the areas of agreement and divergence in their initial recommendations. Since we post these discussion points without associating them with individuals, the design team feels free to candidly offer their opinions without the usual complications of speaking in a group meeting where participants are of different levels of status or influence.

These design team members are busy folks -- we never could have corralled them into a one-day meeting or flown around to meet all of them. Fortunately, the kick-off teleconferences and the solicitation of their input took each of them only about an hour, and we've gotten some fantastic input. It turns out that most people agree that the sales reps already have good sales skills and a general awareness of the new technology that will be employed in the energy-efficient motors. The sales and marketing department told us that they have some extensive brochures designed for customers in the works, and that a lot of the major features and benefits of this line are already well articulated in these pieces. The problem is that they won't actually be printed for another month since they are still waiting for pictures to be taken.

The group is able to focus on three major performance gaps: 1) the sales reps are not yet able to do a good payback analysis for these motors, which is essential to show customers why and how their investment in these expensive new motors will actually save them money in the long run; 2) the sales reps don't know what these motors will look or sound like, and they often need to be able to spec motors that have very specific criteria for use within existing customer applications; and 3) the sales reps aren't sure which customers they should target first for the greatest payoff.

Based on this input, we collectively decide that a full-blown CD-ROM tutorial is not only impractical and expensive, it's not needed. I advise Susan that we'll work simultaneously on four interrelated projects:

1) The marketing department is going to ship us the desktop publishing files for the brochures on the motors. We create a simple digital movie that steps the sales reps through the major sections of the brochure along with a simple audio narration track that's done by the product manager. This will explain to reps how to use the brochure with customers. The product manager records the audio track right on his computer and sends us the file as an e-mail attachment. It's not slick -- but it's very credible and effective. We put the sound track together with screen shots of the desktop publishing program for the brochures using an inexpensive program that runs right on my laptop. Only the actual photos for the motors are missing. We place this digital movie on the EnergeX Intranet and sales reps can watch it on their laptops, so they can immediately become comfortable with the primary sales tool that they'll be using. Susan also asks the marketing department to send the actual desktop publishing files to each of the sales reps via an e-mail attachment that they can open on their own computers. This way they can read through it carefully, and even make some recommendations for changes to the marketing department before it goes to press.

2) The payback analysis for this series of motors is cumbersome and hard to memorize. Since sales reps will likely only have the opportunity to use this procedure about once a month, it doesn't seem to make sense to teach them to memorize all these formulas and specs. Rather, we work with the engineering department via phone and fax to come up with payback formulas that we provide in two ways: one is a simple fill-in-the-blank form with the formulas written in that reps can use in sales calls, and one is a computer-generated calculation template that works within the spreadsheet program that the sales reps use. We again send all the reps these two files via e-mail. The job aid will be much more effective than trying for memorization. Also, the reps can actually give their customers the spreadsheet model or printed form for them to play with themselves.
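The logic behind that spreadsheet template is a simple-payback calculation: the extra purchase price of the efficient motor divided by the yearly energy-cost savings it produces. Here is a rough sketch of the idea in Python; the specific figures and the one-formula model are hypothetical, since the actual formulas and specs came from EnergeX's engineering department:

```python
def simple_payback_years(price_premium, kw_saved, hours_per_year, dollars_per_kwh):
    """Years until an efficient motor's energy savings repay its extra cost."""
    annual_savings = kw_saved * hours_per_year * dollars_per_kwh  # $ saved per year
    return price_premium / annual_savings

# Hypothetical example: the efficient motor costs $600 more, saves 1.5 kW,
# runs 4,000 hours a year, with electricity priced at $0.08 per kWh.
years = simple_payback_years(price_premium=600, kw_saved=1.5,
                             hours_per_year=4000, dollars_per_kwh=0.08)
print(f"Payback: {years:.2f} years")  # 1.25 years
```

Because the formula is just a few multiplications and a division, it drops naturally into either a printed fill-in-the-blank form or a spreadsheet template -- exactly the two job-aid formats described above.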

3) We ask the engineering department to take their camcorder and capture a couple of minutes of video of the three new motors in action back at their test labs, as well as some shots of the motors placed next to a conventional model so that one can see the differences in sizes and shapes. We also request the CAD/CAM illustration files for the motors. Our plan is to edit the video and drawings and post short movie clips on the Cayuga EnergeX Intranet. As it turns out, all of us at OmniCom are on the road for two weeks at client sites or speaking at conferences, but that doesn't hold us back. I have the engineers overnight me the videotape to my hotel and send the illustrations via e-mail attachments; I edit these together using my laptop that has video and audio digitizing capability, and post the final short digital movies right to the Cayuga EnergeX website that night from my hotel room.

4) We help Susan and their Intranet webmaster set up an online "chat room" for engineers, the product manager, and the sales force. Using this simple functionality in the web page authoring program we use, we make it possible for this "community of practice" to exchange questions, answers, and hints -- especially regarding which potential customers and approaches seem most successful. This will continue to be "live" after the product is launched so that the learning is ongoing within the context of actual sales challenges and can also serve as input for new product designs.

It's only four weeks from the first call, and everybody feels ready for the launch of this important new product. It took a team effort, but in fact not as much time as reviewing detailed scripts, storyboards, and then trying to deliver a formal training "event". And nobody had to leave their offices or spend more than about a half an hour at a time on this project; we live in a multi-tasking world. Most people can't even get out an RFP for a CD-ROM course in that time frame, and look at how much we accomplished with very low-cost but performance-focused interventions.

Luckily, input by the design and management team was readily offered. You see, a year before we had worked with the CEO and Board of Directors of EnergeX to help them consider and adopt some of the notions of the "learning organization", "performance technology", and mentoring. Training and coaching are now a part of everybody's job and performance evaluation, and those who make significant contributions in this area are honored at an annual "Star Search" celebration held at a different resort each year. The company now not only honors top sales reps with bonuses and exotic trips, but it also honors those who made it possible for those sales reps to succeed. It's become an integral part of their culture, and they enjoy using new high-tech tools to contribute in these ways.

Sound like science fiction? Hardly. We've been using many of these tools and mindsets since 1985, and with today's Internet technologies and inexpensive laptops, all of the infrastructure that's mentioned here is available for the price of just one traditional course. More importantly, this new practice model will assist trainers in becoming more closely coupled with contemporary management and organizational development initiatives and in providing quick performance solutions that build strong cultures.


Diane M. Gayeski, Ph.D. is a designer and advocate of new models for learning and communication management in organizations. As a Partner in Ithaca, NY-based OmniCom Associates, she's led over 200 client projects; she also maintains an academic affiliation as a professor in the Roy H. Park School of Communications at Ithaca College.


copyright 1997 OmniCom Associates  all rights reserved   updated April 11, 1998