Introduction to Analysis and Design of Information Systems
What Is, Is
Over forty years of developing requirements for systems have taught us that the only successful approach to analysis is to accept what exists in the user’s environment, however far from ideal those conditions may be, and work within those limitations. It may be very tempting to use analysis time to try to refocus how the user does business. Yet efforts to redesign or reengineer, unless specifically requested by the user, will typically be wasted. Although your assessment may be correct and your suggestions potentially useful, being correct is less important in this situation than being wise and understanding the ability of your users to successfully implement and utilize what they need. Analysts tend to ignore this simple wisdom, much to their own distress and that of their clients.
Looking at a typical example of an analysis situation will help to illustrate this point. Let us assume that an enterprise needs a relational database model to gather information about a subject area of the business. There are 200 offices that will need to connect into a nationally provided service. Users disagree on the mission of the applications and cannot determine what reports or query information they want. Some offices are automated, but they do not have the same software and hardware. There is little expertise in the user community to determine the data requirements and file layouts (identifying elements in each file). Management has requested that the analyst establish a specification which identifies the requirements of the system as well as the necessary hardware and software.
Faced with so unwieldy a task, many analysts will adopt the following approach in an attempt to impose order on a disorderly situation:
1. Force users to define all requirements. Since they are unable to do so, this insistence will probably result in their guessing or providing incomplete information.
2. Determine the hardware and software configuration, despite having inaccurate or incomplete requirements.
3. Ignore the political environment.
4. Establish a project plan that everyone knows will fail, but push ahead with it anyway.
It should be clear that this approach is the wrong one, on a number of different counts. Yet such an approach is all too typical for analysts confronted with a less-than-ideal working environment. Happily, there is a better approach for all concerned, one that recognizes and responds to the conditions actually present at the users’ site. In this case it is evident that the users are not positioned to provide the requirements for a system, largely because they do not fully understand their own needs and because they do not agree on what those needs are. What the analyst must understand in such a situation is that because of this lack of knowledge and organization, user needs will tend to change during the process of product analysis and design. Such changes are to be expected; they are simply part of the life cycle for this particular implementation. To ignore the situation and try to implement a system is to invite failure. Put simply, then: what is, is. The task of the analyst is to work with what is rather than trying to change it or, even worse, simply denying it. Once you as an analyst understand that reality, you understand that your solution must accommodate what will inevitably occur.
Here is a more sensible approach to the situation described above:
1. Focus on designing a model that can provide the users with the capability they want. Create a project plan that assumes that the database will be incomplete during phase I because of the users’ inability to define the correct information. The process will therefore be iterative, with the design finalized during the later parts of the development life cycle.
2. Do not try to identify hardware before it is clear what the usage requirements are, such as peak-time processing, number of users, and so on. It will be more beneficial to establish the operating system or architectural environment that you want to support, pending the results of the analysis.
3. Utilize a software system or CASE tool that will allow users to generate new scenarios such that they can see how these scenarios relate to the entire system.
4. Set up a pilot program. This will require that certain offices agree to be test sites for the early versions of the software. The function of the pilot is to provide feedback on the effectiveness and shortfalls of the product. It is important to state clearly the objectives of the pilot and the format of the feedback in order to ensure the success of the exercise (one possible feedback format is sketched after this list).
5. Formulate a plan that depicts a schedule for getting the entire enterprise implemented and live on the new system. Be sensitive to the politics of the situation, and use a realistic approach that will not require a cultural change in order to implement software in the existing environment.
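As an illustration of item 4, here is a minimal sketch, in Python, of one possible structure for pilot-site feedback. The field names and values are hypothetical; in practice the format would be agreed upon with the pilot offices rather than dictated by any particular methodology.

```python
# Hypothetical pilot-feedback record; field names and values are illustrative only.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class PilotFeedback:
    office_id: str                      # which pilot office submitted the report
    report_date: date
    objective: str                      # the stated pilot objective being assessed
    worked_well: List[str] = field(default_factory=list)
    shortfalls: List[str] = field(default_factory=list)
    severity: str = "minor"             # e.g., "minor", "major", "blocking"


# Example report from a hypothetical pilot office.
feedback = PilotFeedback(
    office_id="NE-014",
    report_date=date(2024, 3, 1),
    objective="Verify that the local file layouts load into the shared database",
    worked_well=["Nightly upload completed without manual intervention"],
    shortfalls=["Duplicate customer records appeared after the merge step"],
    severity="major",
)
```

Collecting feedback in a consistent, structured form such as this makes it far easier to compare results across pilot offices and to decide what must change before the wider rollout.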
The essence of this approach is to develop a strategy that fits the reality of the environment rather than force the environment to change. Throughout this book, we will explore this simple but crucial concept. No two system development projects are identical, and the more familiar the analyst is with the environment, the more successful the project will be. This book will also argue against the conventional wisdom that suggests using an approach based on only a single methodology (e.g., Yourdon, Martin, Booch, etc.). The mixing of methodologies allows the analyst a wider range of tools. Hands-on experience shows that this kind of mixing of methodologies can be done quite successfully and that it is appropriate in a large number of analysis situations.
Just What Is a Complex Project?
Most analysts, project team members, and users worry about the complexity of their projects. Their requirements seem entirely unique to them, and therefore a very special approach seems to be required. How many times have you heard: “The tools and approaches used elsewhere just won’t work in this environment”?
The truth, however, is very different: the only truly complex projects are those that people make so! It is important for the analyst to recognize that the procedures utilized, regardless of the size of the project, should remain fundamentally the same. As we have discussed above, the analyst’s approach to the implementation of each project should be tailored individually; however, the procedures for this implementation should remain constant. Very often the organization of interviews, the utilization of techniques such as Joint Application Development (or JAD, discussed later in this chapter), or the simple addition of more analysts to the project can solve what appear to be insurmountable problems.
In fact, most of the myriad problems that arise in product development can be traced to two fundamental issues:
1. People are trying to solve the wrong problem, i.e., the identified problem is not really what is wrong.
2. The solution to the real problem is often much simpler than it first appears to be.
Because we have failed to recognize these issues, the industry’s frustration with developing appropriate software solutions has been chronic, and this situation has not really improved over the last twenty-five years! The question is why?
To put it bluntly, analysts often fail to do their jobs properly! We tend to put together plans and schedules that are doomed from the start to fail, an issue treated in more detail later. The ultimate goal of the analyst must take into account the reality of the environment in which the work is occurring. Remember, work within the environment. Let users decide what degree of change is appropriate for their own operation; do not take it upon yourself to demand that they change.
For example, how many times have you seen a Gantt chart for a project schedule that resembles Figure 1.1 below?
Figure 1.1 Sample Gantt Chart.
It looks nice, but in reality the plan it depicts could never happen. Focus in particular on the intersection of Development and Quality Assurance (QA) activities. The plan shows that once Development is finished, the materials are forwarded to QA for testing. The sequence assumes, however, that QA will never find an error and that therefore the materials will never be returned to Development! Any analyst knows that this scenario is very unlikely to occur. Such poor planning results in deficient allocation of resources to the project. Should the development schedule be met, programming resources most probably will be allocated to other projects. Thus, if QA finds errors (which they undoubtedly will), reallocating these programming resources becomes difficult and problematic. And remember: programmers do not like returning to an “old” program to do maintenance or error fixing.
Figure 1.2 reflects a more realistic view of the life cycle of the project:
Figure 1.2 Modified Gantt chart reflecting realistic project activity behavior.

The difference in approach is striking. The question is, as sensible as this plan appears to be, why don’t we always do it this way? Quite frankly, this plan does not look as neat and tidy as the previous one. But of course simply denying the pain of reality, with its inevitable inconveniences and delays, does not make that reality go away. In defense of the previous configuration, some developers might suggest that the iterations of effort between testing and fixing the software are assumed to be included in the QA time. Maybe, but don’t count on it! Just look at the second schedule and you will see how this proper allocation adds to the delivery time of the project. It is clear that the original plan was simply incorrect.
There is absolutely no reason that a schedule should not reflect the reality of what will most probably occur. The results are clear: Realistic planning provides a more reliable schedule. Among the many benefits of such a schedule are the confidence and respect gained by both the users and the development staff. There is nothing like producing a schedule that reflects what everyone is confident will occur.
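To make the cost of the missing rework loop concrete, here is a minimal sketch in Python. The durations, number of rework cycles, and rework fraction are hypothetical and are not taken from Figure 1.1 or Figure 1.2.

```python
# Illustrative schedule arithmetic only; durations and rework assumptions
# are hypothetical, not taken from Figure 1.1 or Figure 1.2.

def naive_schedule(dev_weeks: float, qa_weeks: float) -> float:
    """Optimistic plan: Development hands off to QA exactly once."""
    return dev_weeks + qa_weeks


def realistic_schedule(dev_weeks: float, qa_weeks: float,
                       rework_cycles: int, rework_fraction: float) -> float:
    """Plan that budgets for QA returning defects to Development for a few
    shortened fix-and-retest cycles before the product is accepted."""
    base = dev_weeks + qa_weeks
    return base + rework_cycles * rework_fraction * base


if __name__ == "__main__":
    dev, qa = 12.0, 4.0  # hypothetical durations in weeks
    print(naive_schedule(dev, qa))                # 16.0 weeks: the "nice" plan
    print(realistic_schedule(dev, qa, 2, 0.25))   # 24.0 weeks: closer to reality
```

The particular numbers do not matter; what matters is that the fix-and-retest loop appears in the plan at all, so that programming resources remain allocated for it.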
At this point, experienced analysts are no doubt wondering what happens when management dictates how much time we have and shows no flexibility about running behind schedule. This problem is unfortunately not uncommon, and typically fits into one of three scenarios:
1. Management is ignorant of the analysis and construction of systems and simply has no idea how much time is required to complete the project. In this case the analyst will need to develop a convincing presentation for management about how systems are designed and developed. The presentation should be carefully documented and should refer to industry statistics for similar projects in similar companies. This kind of documentation adds much credibility to the discussion. You can also consider having an independent source, such as a respected consulting firm, support your position.
2. Management has little confidence in Development. They feel that picking a date and sticking to it is the best method of getting the project finished. Yes, this is the bully technique! It usually results from bad experiences, probably from looking at those unrealistic Gantt charts. In this situation, the analyst must take steps to gain management’s confidence. Using the suggestions above would be a good start. In addition, you will need to research and understand the history of what your predecessors did to encourage this type of distrustful and dictatorial attitude from management, and you will need to find a tactful way to address those issues.
3. Unfortunately, bad management does exist. If you cannot win any concessions or understanding from management, you may have reached what is known as the “no-win scenario.” Management is simply unwilling either to allot adequate time for the completion of the project or to be persuaded otherwise. When this situation exists in the workplace, the advice is straightforward: you can leave, or you can find some way to deal with this constraint. In either case, be aware that under the no-win scenario there is little hope that the project will result in the development of quality software. This perspective is not cynical but realistic: some projects are doomed to fail before they begin. What is important is that the analyst recognize as early in the life cycle as possible that the project cannot be successful.