Annotated Bibliography - Requirements Engineering
[AAB99] T.A. Alspaugh, A.I. Antón, T. Barnes and B.W. Mott. "An Integrated Scenario Management Strategy." IEEE Fourth International Symposium on Requirements Engineering (RE '99), University of Limerick, Ireland, pp. 142-149, 7-11 June 1999.
Abstract: The scenario management strategy has five main components: tool support, glossaries, episode management, similarity measures, and coverage estimation. The scenario management tool likewise comprises a scenario database, a glossary for each scenario attribute, a configuration management system for time-stamping and documenting changes to the scenario and glossary databases, episode identification and management, similarity measures, and system coverage analysis. Project glossaries help provide a common understanding of scenario terms and can also help analysts search for and prevent redundancies. The system manages episodes by distinguishing events and recognizing identical events (those that share the same actor and action). A similarity measure is a function that produces a number expressing the degree of similarity between two scenarios, a percentage of overlap; it can quickly identify duplication in large sets of scenarios and can also be used to estimate the coverage of a group of scenarios. The strategy was applied to the meeting scheduler problem, which demonstrates its effectiveness. The strategy allows analysts to vary the level of redundancy and consistency checking required for scenario evolution.
This is a new tool.
Scenario - a linear sequence of events, with associated attributes
Event - consists of an actor and an action
Subsequence - sequence of one or more events that forms all or part of a scenario's sequence
Episode - named subsequence that is usually shared among several scenarios
Attributes - ex. system goals, viewpoint, pre- and post-conditions, purpose, concreteness level, author, requirements, events, actors and actions, and also episodes
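The definitions above can be sketched as simple data structures, with the similarity measure from [AAB99] modeled as percentage overlap of shared events. All class and field names here are illustrative, and the overlap formula is an assumption, since the paper defines the measure only abstractly as a percentage:

```python
# Sketch of the scenario model from [AAB99]; names are illustrative.
# The overlap formula is an assumption, not the paper's exact measure.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Event:
    actor: str   # who performs the action
    action: str  # what is done

@dataclass
class Scenario:
    name: str
    events: list                                    # linear sequence of Events
    attributes: dict = field(default_factory=dict)  # e.g. goals, viewpoint

def similarity(a: Scenario, b: Scenario) -> float:
    """Percentage of overlap: shared events over the union of events."""
    ea, eb = set(a.events), set(b.events)
    if not (ea | eb):
        return 0.0
    return 100.0 * len(ea & eb) / len(ea | eb)
```

Because identical events share the same actor and action, value equality on the frozen `Event` dataclass matches the paper's notion of identical events.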
[ADS01] A.I. Antón, J.H. Dempster and D.F. Siege. "Deriving Goals from a Use-Case Based Requirements Specification for an Electronic Commerce System." Requirements Engineering Journal, Springer-Verlag, May 2001.
Abstract: Scenarios describe concrete system behaviors by summarizing behavior traces of existing systems. Use cases describe the possible interactions that external agents may have with a system. In this goal analysis, goals are derived from use cases and refined using the heuristics of the GBRAM. Five specific challenges were found. The first is that the use cases are not clear; one solution is to attach a goal to each use case in the form of its title. Second, the authors of the SRS did not consider the tasks the system was to accomplish; a solution is more end-user involvement. Third, tracing goals to their source and origin is difficult; tool support can make this easier. Fourth, the use cases were based on GUI design and implementation; a solution is to base use cases on user goals and objectives. The final challenge is missing or inconsistent naming of use cases; a solution is an "Includes Tree," which helps track the use cases. Many of these challenges were also mentioned in the risk analysis. Lessons learned include further categorizing goals by using the verb "achieve" only for user goals and "make" only for system goals. There are also domain-specific classifications: process support, electronic commerce, information display and organization, and security and access control goals. Also, using constraints in goals maintains context.
HCI - human computer interaction
[ALR96] A.I. Antón, E. Liang and R.A. Rodenstein. "A Web-Based Requirements Analysis Tool." IEEE Fifth Workshop on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET-ICE '96), Stanford University, California, USA, pp. 238-243, 19-21 June 1996.
Abstract: The Goal-Based Requirements Analysis Tool (GBRAT) supports web-based, goal-based requirements analysis. It serves as a way for team members working from different locations to participate in decision-making processes. The structure starts with a project repository, which is given the name of the project, a description, and the names of the analysts. Within the repository, goals are created using three different filters: the maintenance and achievement goal filter, the agent filter, and the total order filter. Goals must identify a source (document) for traceability reasons. Goals can be viewed by name, responsible agent, precedence relation, or properties.
[Ant96] A.I. Antón. "Goal-Based Requirements Analysis." 2nd IEEE International Conference on Requirements Engineering (ICRE '96), Colorado Springs, Colorado, pp. 136-144, 15-18 April 1996.
Abstract: The Goal-Based Requirements Analysis Method (GBRAM) is applied to a case study. Goal analysis is going through documentation and forming goals; goal evolution is the changing of goals from creation to operationalization in a system specification. In goal analysis, goals are identified through review of documentation and also through scenarios. Goals are classified into two categories: maintenance and achievement. In goal evolution, there are three methods for improving a goal set: eliminate duplicate goals, refine goals based on system entities, and combine nearly identical goals. Identifying obstacles forces specific failure cases to be examined: an obstacle is a reason why a goal may fail, and a scenario is a circumstance under which a goal may fail, an instance of an obstacle. Several lessons were learned from the CTTS case study. The more information sources there are, the more complete the goal set will be. It is helpful to categorize goals. Constraints provide additional information about requirements and identify new goals. Scenarios play a major role in uncovering issues.
This is a mini validation of the GBRAM.
Operationalization - the process of defining a goal with enough detail so that its subgoals have an operational definition; translating goals into requirements specifications.
Precedence relationship - goal 1 has to be completed before goal 2.
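A precedence relation like the one above defines a partial order over goals, and GBRAT's total order filter derives an ordering from such relations. A minimal sketch, using made-up goal names and Python's standard topological sort:

```python
# Hypothetical goals ordered by precedence relations; a standard
# topological sort produces an order in which each goal's
# predecessors come first. Goal names are made up for illustration.
from graphlib import TopologicalSorter

# mapping: goal -> set of goals that must be completed before it
precedes = {
    "schedule meeting": {"collect availability", "pick date"},
    "pick date": {"collect availability"},
    "collect availability": set(),
}
order = list(TopologicalSorter(precedes).static_order())
# "collect availability" appears before "pick date" and "schedule meeting"
```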
Achievement goals are usually found in process descriptions.
[AP98] A.I. Antón and C. Potts. "The Use of Goals to Surface Requirements for Evolving Systems." IEEE International Conference on Software Engineering (ICSE '98), pp. 157-166, 19-25 April 1998.
Abstract: Goal analysis consists of exploring, identifying, and organizing goals; goal refinement consists of refining and elaborating them. There are four principles for turning goals into requirements: validate requirements through scenarios, categorize/classify goals, use obstacles, and refine goals through the Inquiry Cycle model and scenarios. The GBRAM is used to analyze the CommerceNet web server, and the four principles are applied to it. Requirements are seen as functions the system has to perform; goals are seen as functions the system must support. The system's responsibilities are described as what should happen automatically. Goals are classified by two independent schemes: achievement versus maintenance goals, and subject matter. Classifying helps organize goals into different functional requirements. Questions help develop a deeper understanding of requirements. Scenarios help identify new goals, elaborate requirements, and offer alternative implementation options. Obstacles are helpful in anticipating exception cases the system might have to handle.
Verbs for goals: avoid, ensure, improve, increase, keep, maintain, make and reduce/speedup.
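As a rough illustration of classifying goals as achievement or maintenance by their leading verb, here the assignment of each verb to a category is my assumption for demonstration, not a mapping given in the paper:

```python
# Illustrative classifier: the verb-to-category assignment below is
# an assumption for demonstration, not taken from [AP98].
ACHIEVEMENT_VERBS = {"achieve", "make", "improve", "increase", "reduce", "speedup"}
MAINTENANCE_VERBS = {"maintain", "keep", "ensure", "avoid"}

def classify_goal(goal: str) -> str:
    """Classify a goal phrase by its leading verb."""
    words = goal.split()
    verb = words[0].lower() if words else ""
    if verb in ACHIEVEMENT_VERBS:
        return "achievement"
    if verb in MAINTENANCE_VERBS:
        return "maintenance"
    return "unknown"
```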
[CAD01] R.A. Carter, A.I. Antón, A. Dagnino and L. Williams. "Evolving Beyond Requirements Creep: A Risk-Based Evolutionary Prototyping Model." IEEE 5th International Symposium on Requirements Engineering (RE'01), Toronto, Canada, August 2001.
Abstract: Requirements creep is very frequent in e-commerce applications. The EPRAM incorporates risk analysis to check requirements during the entire process, and it defends against requirements creep through risk mitigation, which helps identify requirements changes that create significant risk. CMM level 2 was customized to be consistent with small e-commerce projects. All of the level 2 key process areas were kept except Subcontract Management, because most small organizations do not deal with subcontracting. The elements of the key process areas were tailored to fit small e-commerce development. The EPRAM is evolutionary prototyping with risk analysis and mitigation on a solid CMM foundation. The authors are looking into lightening the documentation requirements.
This is a new model/process. It is in the process of being validated.
Requirements creep - significant additions or modifications to the requirements of a software system throughout the lifecycle, resulting in extensions to and alterations of the software's functionality and scope.
?s: heuristics? - principles
Generically, what is prototyping? - a sample of a completed system
What is evolutionary prototyping? - prototypes that build off of each other, put together, get closer and closer to a complete system
[JB00] D.L. Johnson and J.G. Brodman. "Applying CMM Project Planning Practices to Diverse Environments." IEEE Software, pp. 40-47, July/August 2000.
Abstract: A project is the development of a new product or of major product enhancements that significantly change the scope of the product; everything else is a subproject. One major focus for diverse organizations is documentation, which should be useful and adequate for the resources available. Supplemental documentation is good for subprojects. The CMM requires that practices and procedures be documented and that organizations follow them. Diverse organizations should focus on the CMM goals, not the practices. The goals are broad, and organizations should tailor the practices to fit them; if this is done, most organizations' practices will be consistent with the CMM. The CMM should not be interpreted literally.
This article is sort of a tool: the authors are trying to help small organizations satisfy the CMM and interpret it correctly (pp. 43 and on). They seem to confuse matters by stating the CMM practices and showing how small organizations cannot do things that way. Then they note, for example, that the CMM does not give a method for making estimates; it just requires that estimates be made and documented. Apparently there is a difference between CMM practices and goals: it is more important to follow the goals and generate tailored practices for each company. The "CMM practices" the authors refer to seem to be geared toward large organizations.
?s: anomalies - irregularities
[KC99] D.P. Kelly and B. Culleton. "Process Improvement for Small Organizations." IEEE Software, pp. 41-47, October 1999.
Abstract: Smaller organizations must try to minimize their limitations and maximize the benefits of their environment. Silicon & Software Systems started out at level 1 according to the CMM but had partially implemented some key process areas (KPAs) from levels 2 and 3. They implemented a software process improvement (SPI) initiative to try to reach a higher level of maturity, focusing on requirements management, software project planning, software project tracking and oversight, software quality assurance, and software configuration management. They created a task force to develop custom key process areas and to determine which of the CMM's KPAs were relevant to their company. They implemented the changes to the key process areas and tested them in a small pilot project; through this they found problems and could make corrections before rolling the changes out to the entire company.
This is the experience of Silicon & Software Systems. Shows what they learned from their experience in developing and implementing their SPI.
[OC99] S. Otoya and N. Cerpa. "An Experience: A Small Software Company Attempting to Improve its Process." Proc. Software Technology and Engineering Practice STEP '99, pp. 153-160, 1999.
Abstract: Winapp used a version of the CMM tailored for small businesses by LOGOS. The improvements were developed according to the company's needs and problems, then compared to the CMM. Although Winapp is still at maturity level 1, the improvements have greatly benefited the company, and it is much closer to reaching level 2.
This is an experience of one company. Winapp is very similar to the company I work for. We have a program similar to WinTimesheet, use Source Safe, etc. I think every small or large company should have a program like WinTimesheet that keeps track of projects, requirements and time spent. One downfall is that if the company develops this program itself, it is an ongoing project that requires continual maintenance.
[Pau98] M.C. Paulk. "Using the Software CMM in Small Organizations." Carnegie Mellon University, pp. 1-13, 1998.
Abstract: The CMM should be used as guidance rather than as requirements. Both small and large organizations have problems with the software process, and the CMM requires intelligence and common sense to be used effectively. The guidance given for small organizations can also apply to large ones. A main recommendation is to customize the language used in the CMM to fit the individual organization. Both small and large organizations need certain things, such as documentation, planning, communication, and commitments, but their implementations differ greatly. There are many areas of focus for small organizations, yet when others have tailored the CMM for small organizations, the changes have not been radical. One mistake is using the maturity levels as goals of improvement instead of as measures of improvement. It is recommended that small organizations use the Team Software Process and the Personal Software Process.
This is more of a position paper, although some cited works validate what he is saying.
Paulk recommends the PSP which is for individuals. You would think it would be better to work on this sort of thing as a team rather than a bunch of individuals.
DOD - department of defense
SEPG - software engineering process group - coordinate process definition, improvement, and deployment activities
SCM - software configuration management
[PCC93] M.C. Paulk, B. Curtis and M.B. Chrissis. "Capability Maturity Model, Version 1.1." IEEE Software, pp. 18-27, July 1993.
Abstract: The CMM is a guide for how companies can become well managed across the whole software process. Companies need to become more mature in their software process. There are five maturity levels: Level 1: Initial, Level 2: Repeatable, Level 3: Defined, Level 4: Managed, and Level 5: Optimizing. Performance can be predicted from the maturity level: the higher the maturity, the better the productivity and quality. It is important to increase one level at a time and not to skip levels, which would be counterproductive. Each level has key process areas, which are a focus for improvement, and each key process area has common features, which include goals, commitments, abilities, activities, measurements, and verifying implementations.
This is a somewhat new but improved process, all about improving the software process for companies.
"A software process is a set of activities, methods, practices, and transformations that people use to develop and maintain software and associated products."
[PTA94] C. Potts, K. Takahashi and A.I. Antón. "Inquiry-Based Requirements Analysis." IEEE Software, 11(2), pp. 21-32, March 1994.
Abstract: The Inquiry Cycle model has three phases: requirements documentation, requirements discussion, and requirements evolution; shortcuts between phases are always possible. A case study used the meeting scheduler, a good example because it relates to real systems, most people know about scheduling meetings so no extra domain knowledge is required, and a requirements document had already been written. Creating scenarios proved at least as effective at prompting questions about the requirements as analyzing the requirements document itself. Requirements changes were discovered through six different types of questions: what-is, how-to, who, what-kind-of, when, and relationship. All assumptions should be recorded and, where possible, justified; sometimes assumptions must be made to move forward. Scenarios proved useful in the case study: more than half the improvements to requirements came from analyzing scenarios.
This is a new/refined model.
Parenthetical insights - questions about one topic that arise while analyzing another.
Stakeholder - anyone who can share information about the system, its implementation constraints, or the problem domain.
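The six Inquiry Cycle question types from [PTA94] could be tagged mechanically. A minimal sketch, where the prefix-keyword matching is my assumption for illustration, not the paper's mechanism:

```python
# Keyword-based tagger for the six question types from [PTA94]; the
# prefix matching is an assumption for illustration only.
QUESTION_TYPES = ["what-kind-of", "what-is", "how-to", "who", "when", "relationship"]

def question_type(question: str) -> str:
    """Return the first matching question type, or 'other'."""
    q = question.lower()
    for qtype in QUESTION_TYPES:
        if q.startswith(qtype.replace("-", " ")):
            return qtype
    return "other"
```

Note that "what-kind-of" must be checked before "what-is" so the more specific prefix wins.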