Day 14: Quantitative methods

  • Kitchenham, B. A., S. L. Pfleeger, et al. (2002). “Preliminary guidelines for empirical research in software engineering.” IEEE Transactions on Software Engineering 28(8): 721-734.
    • propose a preliminary set of research guidelines aimed at stimulating discussion among software researchers, based on guidelines developed for medical researchers and on the authors' own experience in doing and reviewing software engineering research
    • guidelines are intended to assist researchers, reviewers, and meta-analysts in designing, conducting, and evaluating empirical studies
    • In the authors' view, the standard of empirical software engineering research is poor

    Context

    • Be sure to specify as much of the industrial context as possible. In particular, clearly define the entities, attributes, and measures that capture the contextual information
      • there are multiple standards for counting function-points, and confusion exists about what is meant by counting a line of code
    • If a specific hypothesis is being tested, state it clearly prior to performing the study and discuss the theory from which it is derived, so that its implications are apparent.
      • empirical studies of software engineering phenomena are often contradictory
    • If the research is exploratory, state clearly and, prior to data analysis, what questions the investigation is intended to address and how it will address them.
      • exploratory studies are an important mechanism for generating hypotheses and guiding further research
    • Describe research that is similar to, or has a bearing on, the current research and how current work relates to it.

    Experiments

    • Define the process by which the subjects and objects were selected.
    • Define the process by which subjects and objects are assigned to treatments.
    • Restrict yourself to simple study designs or, at least, to designs that are fully analyzed in the statistical literature. If you are not using a well-documented design and analysis method, you should consult a statistician to see whether yours is the most effective design for what you want to accomplish
    • Define the experimental unit.
    • For formal experiments, perform a pre-experiment or precalculation to identify or estimate the minimum required sample size (see the sketch after this list).
    • Use appropriate levels of blinding
    • If you cannot avoid evaluating your own work, then make explicit any vested interests (including your sources of support) and report what you have done to minimize bias.
    • Avoid the use of controls unless you are sure the control situation can be unambiguously defined.
    • Fully define all treatments (interventions).
    • Justify the choice of outcome measures in terms of their relevance to the objectives of the empirical study.
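
    A minimal sketch of the sample-size precalculation mentioned above, assuming a two-sided two-sample t-test and the standard normal-approximation formula; the effect-size, alpha, and power values are illustrative rather than taken from the paper:

      from math import ceil
      from statistics import NormalDist

      def min_sample_size_per_group(effect_size, alpha=0.05, power=0.80):
          """Approximate minimum n per group for a two-sided two-sample t-test.

          Uses the normal approximation n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2,
          where d is the standardized effect size (Cohen's d).
          """
          z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
          z_beta = NormalDist().inv_cdf(power)
          return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

      # Example: detecting a medium effect (d = 0.5) with 80% power at alpha = 0.05
      print(min_sample_size_per_group(0.5))  # 63 subjects per group (approximation)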

    Data collection

    • Define all software measures fully, including the entity, attribute, unit and counting rules
    • For subjective measures, present a measure of interrater agreement, such as the kappa statistic or the intraclass correlation coefficient for continuous measures (see the sketch after this list).
    • Describe any quality control method used to ensure completeness and accuracy of data collection
    • For surveys, monitor and report the response rate and discuss the representativeness of the responses and the impact of nonresponse
    • For observational studies and experiments, record data about subjects who drop out from the studies.
    • For observational studies and experiments, record data about other performance measures that may be affected by the treatment, even if they are not the main focus of the study.
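
    A minimal sketch of the interrater-agreement guideline above: Cohen's kappa for two raters assigning categorical labels; the rating data are invented for illustration:

      from collections import Counter

      def cohens_kappa(rater_a, rater_b):
          """Cohen's kappa for two raters labelling the same items."""
          n = len(rater_a)
          observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
          freq_a, freq_b = Counter(rater_a), Counter(rater_b)
          # Chance agreement: both raters independently pick the same category
          expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))
          return (observed - expected) / (1 - expected)

      # Two raters classifying ten requirements as functional (F) or non-functional (N)
      a = ["F", "F", "N", "F", "N", "N", "F", "F", "N", "F"]
      b = ["F", "N", "N", "F", "N", "F", "F", "F", "N", "F"]
      print(round(cohens_kappa(a, b), 2))  # 0.58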

    Analysis

    • Specify any procedures used to control for multiple testing (see the sketch after this list).
    • Consider using blind analysis.
    • Perform sensitivity analyses.
    • Ensure that the data do not violate the assumptions of the tests used on them.
    • Apply appropriate quality control procedures to verify your results
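
    A minimal sketch of the multiple-testing guideline above, using the Holm-Bonferroni step-down procedure; the p-values are invented for illustration:

      def holm_bonferroni(p_values, alpha=0.05):
          """Reject/retain decision per hypothesis under Holm-Bonferroni.

          The i-th smallest p-value is compared against alpha / (m - i); once a
          comparison fails, every remaining hypothesis is retained.
          """
          m = len(p_values)
          order = sorted(range(m), key=lambda i: p_values[i])
          reject = [False] * m
          for rank, idx in enumerate(order):
              if p_values[idx] <= alpha / (m - rank):
                  reject[idx] = True
              else:
                  break  # step-down: stop at the first non-significant test
          return reject

      # Five tests run against the same data set
      print(holm_bonferroni([0.004, 0.030, 0.012, 0.210, 0.001]))
      # [True, False, True, False, True] at alpha = 0.05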

    Presentation

    • Describe or cite a reference for all statistical procedures used.
    • Report the statistical package used.
    • Present quantitative results as well as significance levels. Quantitative results should show the magnitude of effects and the confidence limits
    • Present the raw data whenever possible. Otherwise, confirm that they are available for confidential review by the reviewers and independent auditors
    • Provide appropriate descriptive statistics.
    • Make appropriate use of graphics.

    Interpretation

    • Define the population to which inferential statistics and predictive models apply
    • Differentiate between statistical significance and practical importance.
    • Define the type of study.
    • Specify any limitations of the study.

  • Perry, D. E., A. A. Porter, et al. (2000). Empirical studies of software engineering: a roadmap. Proceedings of the Conference on The Future of Software Engineering. Limerick, Ireland, ACM: 345-355.
    • present a roadmap for improvement, which includes a general structure for software empirical studies and concrete steps for achieving these goals: designing better studies, collecting data more effectively, and involving others in our empirical enterprises.
    • the most important part of doing an empirical study is drawing conclusions. Many papers fail to do anything with their results. We need to learn something from every study and relate these things to theory and practice.
    • Creating Better Empirical Studies
      • we must be clear about the goals of our studies, design them more effectively, and maximize the information we get out of them.
    • Credible interpretations
      • Validity is a characteristic of an empirical study and is the basis of establishing credible conclusions.
        • Construct validity means that the independent and dependent variables accurately model the abstract hypotheses.
        • Internal validity means that changes in the dependent variables can be safely attributed to changes in the independent variables.
        • External validity means that the study’s results generalize to settings outside the study.
    • Structure of an Empirical study
      • research context,
      • hypotheses,
      • experimental design,
      • threats to validity,
      • data analysis and presentation
        • Quantitative
          • Hypothesis testing (see the sketch after this list)
          • Power Analysis
        • Qualitative
      • results and conclusions.
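
    A minimal sketch of the quantitative hypothesis-testing step listed above, assuming two independent treatment groups and SciPy; it reports the effect magnitude (Cohen's d) and a 95% confidence interval alongside the p-value. The measurements are invented for illustration:

      from statistics import mean, stdev
      from scipy import stats

      # Defect-detection scores under two inspection treatments (illustrative data)
      control = [12, 15, 14, 10, 13, 11, 16, 12]
      treatment = [17, 19, 14, 18, 16, 20, 15, 18]

      # H0: no difference in mean scores between the treatments
      t_stat, p_value = stats.ttest_ind(treatment, control)

      # Effect magnitude: Cohen's d with a pooled standard deviation
      n1, n2 = len(treatment), len(control)
      pooled_var = ((n1 - 1) * stdev(treatment) ** 2 + (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)
      cohens_d = (mean(treatment) - mean(control)) / pooled_var ** 0.5

      # 95% confidence interval for the mean difference (pooled-variance t interval)
      diff = mean(treatment) - mean(control)
      se = (pooled_var * (1 / n1 + 1 / n2)) ** 0.5
      t_crit = stats.t.ppf(0.975, df=n1 + n2 - 2)
      print(f"t={t_stat:.2f} p={p_value:.4f} d={cohens_d:.2f} "
            f"CI=({diff - t_crit * se:.2f}, {diff + t_crit * se:.2f})")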

  • Tamai, T. and M. I. Kamata (2009). Impact of Requirements Quality on Project Success or Failure
    • Exploring the relationship between the quality of the requirement specifications for software projects and the subsequent outcome of the projects.
    • 32 projects carried out by the software development division of a large company in Tokyo (2003-05)
    • Motivation: there has been little hard evidence that requirements quality affects the outcomes of software projects
    • Finding: there is a relationship between SRS quality and project outcomes

  • Gorschek, T., P. Garre, et al. (2007). “Industry evaluation of the Requirements Abstraction Model.” Requirements Engineering 12(3): 163-190.
    • Many requirements engineering best practices, frameworks, and tools are adapted to suit a bespoke environment with traditional, project focused, customer–developer relationships.
    • Requirements Abstraction Model (RAM) is designed toward a product perspective, supporting a continuous requirement engineering effort. It can handle large quantities of requirements of varying degrees of detail and offers a structure and process for the work-up of these requirements.


    • based on the concept that requirements come on several levels of abstraction


    • when requirements arrive they are placed and specified on an appropriate level by using good-example requirements suitable for the organization and product in question
    • Work-up entails abstracting low-level requirements up to product level and also breaking down high-level requirements to Function Level
    • Work-up rules:
      • No requirement may exist without having a connection to the product level
      • All requirements have to be broken down to function level

  • Sadraei, E., A. Aurum, et al. (2007). “A field study of the requirements engineering practice in Australian software industry.” Requirements Engineering 12(3): 145-162.
    • reports an exploratory study which provides insight into industrial practices with respect to requirements engineering (RE)
    • both qualitative and quantitative data were collected, using semi-structured interviews and a detailed questionnaire, from 28 software projects in 16 Australian companies
    • Due to the uncertain and multidisciplinary nature of the RE process, a process model is a prerequisite for applying any structure or discipline to RE and managing its complexity
    • A survey of practitioners indicates the popularity of the waterfall model in RE, but other studies indicate that RE process models used in practice differ from RE process models represented in literature.
    • There is no complete agreement amongst researchers on how this effort should be distributed to each activity
    • Boehm states that a mere 6% of the total cost of software development is devoted to RE.
    • Questionnaire was divided into three sections:
      • The background details of the project, the company, the product and the market,
      • The RE process of the project in detail, including the tasks, products and roles identified in the requirement engineering and
      • The practices and methods used as RE techniques.

    • RE process model
      • Project creation
      • Elicitation
      • Analysis
      • Negotiation
      • Verification and validation
      • Change management
      • Requirement tracing
      • Documentation


    • findings highlight two important issues:
      • the nature of the projects, whether they are internal or external, played an important role in the effort distribution,
      • there is a trade-off between RE activities for both internal and external projects

  • Weidenhaupt, K., K. Pohl, et al. (1998). “Scenarios in system development: current practice.” Software, IEEE 15(2): 34-45.
    • we developed a scenario classification framework based on a comprehensive survey of scenario literature
    • used the framework to classify 11 prominent scenario-based approaches
    • 15 projects in four European countries for site visits


    • The form view pertains to a scenario’s expression mode
    • The contents view concerns the kind of knowledge a scenario expresses
    • The purpose view captures the role a scenario aims to play in software development, such as describing system functionality
    • The life cycle view considers scenarios as artifacts existing and evolving in time and pertains to technical handling, evolution, and project management

  • Cao, L. and B. Ramesh (2008). “Agile Requirements Engineering Practices: An Empirical Study.” Software, IEEE 25(1): 60-67.
    • Software development organizations often must deal with requirements that tend to evolve quickly and become obsolete even before project completion
    • Agile methods advocate the development of code without waiting for formal requirements analysis and design phases – iterative discovery approach
    • Requirements emerge throughout the development process

    Agile RE practices

    • Face-to-face communication over written specifications
    • Iterative requirements engineering
    • Requirement prioritization goes extreme
    • Managing requirements change through constant planning
    • Prototyping
    • Test-driven development
    • Use review meetings and acceptance tests
    • most common challenges are the inability to gain access to the customer and obtaining consensus among various customer groups

Day 13 : Requirements Engineering and Strategy

  • Donzelli, P. and P. Bresciani (2003). Goal-Oriented Requirements Engineering: A Case Study in E-government
    • presents a requirements engineering framework based on the notions of Actor, Goal, and Intentional Dependency, and applies it to a case study in the field of Information Systems for e-Government.
    • capturing high-level organizational needs and transforming them into system requirements in a smooth and controlled manner, while at the same time redesigning the organizational structure to better exploit the new system
    • very diverse kinds of actors (e.g., citizens, employees, administrators, politicians and decision-makers in general —both at central and local level) are involved
    • REF supports the analysts in dealing with complex system/organizational design issues, such as shared and clashing stakeholder needs, by introducing specific analysis-oriented notations that allow early marking and detection of such situations

  • Rolland, C. (2009). Exploring the Fitness Relationship between System Functionality and Business Needs. Design Requirements Engineering: A Ten-Year Perspective. K. Lyytinen, P. Loucopoulos, J. Mylopoulos and B. Robinson (eds.), Springer Berlin Heidelberg. 14: 305-326.
    • strategic alignment between IS and business objectives is a top priority for CIOs and IT executives.
    • an understanding of the fitness relationship requires a position to be adopted on the following issues
      • Issue 1: Conceptual mismatch;
      • Issue 2: Modelling and measuring the relationship;
      • Issue 3: Dealing with the complexity of the relationship.

  • Loucopoulos, P. and J. Garfield (2009). The Intertwining of Enterprise Strategy and Requirements
    • RE, acting as a conduit between business-oriented and system-oriented concerns, is ideally placed to examine systemically the feedback mechanisms involved in complex systems, thus yielding benefits both to the activity of designing the enterprise and to that of designing the system

  • Bleistein, S. J., K. Cox, et al. (2006). “B-SCP: A requirements analysis framework for validating strategic alignment of organizational IT based on strategy, context, and process.” Information and Software Technology 48(9): 846-868.
    • CIOs and IT executives consistently rank alignment of IT with business strategy as a top priority
    • a requirements engineering framework for organizational IT that directly addresses an organization’s business strategy and the alignment of IT requirements with that strategy. B-SCP integrates the three themes of strategy, context, and process using a requirements engineering notation for each theme.

    • UML goal modeling extension lacks the richness of other established goal modeling notations and frameworks such as KAOS, i*, GBRAM, and the BRG-Model, which B-SCP leverages in its strategy theme

  • Kartseva, V., J. Hulstijn, et al. (2006). Modelling Value-based Inter-Organizational Controls in Healthcare Regulations
    • whether controls in highly regulated environments should be modelled as value-based controls or not. To investigate this question we have carried out a case study in the healthcare sector, specifically on recent changes to the governance and control of the Dutch public insurance system for exceptional healthcare (AWBZ).

Day 12 : Goal-Oriented Requirements Engineering

  • van Lamsweerde, A. (2001). Goal-oriented requirements engineering: a guided tour. Requirements Engineering, 2001. Proceedings. Fifth IEEE International Symposium on.
    • concerned with the use of goals for eliciting, elaborating, structuring, specifying, analyzing, negotiating, documenting, and modifying requirements.
    • other forms of goal-based analysis, called context analysis, definition study, participative analysis
    • A goal is an objective the system under consideration should achieve. Goal formulations thus refer to intended properties to be ensured; they are optative statements as opposed to indicative ones, and bounded by the subject matter
    • Goals may be formulated at different levels of abstraction, ranging from high-level, strategic concerns (such as “serve more passengers” for a train transportation system or “provide ubiquitous cash service” for an ATM network system) to low-level, technical concerns (such as “acceleration command delivered on time” for a train transportation system or “card kept after 3 wrong password entries” for an ATM system
    • Different type of concerns: functional concerns associated with the services to be provided, and nonfunctional concerns associated with quality of service –such as safety, security, accuracy, performance, and so forth
    • Why are goals needed?
      • Achieving requirements completeness
      • Avoiding irrelevant requirements
      • Explaining requirements to stakeholders
      • Goal refinement provides a natural mechanism for structuring complex requirements documents for increased readability
      • To provide the roots for detecting conflicts among requirements and for resolving them eventually
      • Separating stable from more volatile information
      • goals drive the identification of requirements to support them; they have been shown to be among the basic driving forces, together with scenarios, for a systematic requirements elaboration process
    • Goal identification is not necessarily an easy task
    • A common misunderstanding about goal-oriented approaches is that they are inherently top-down ; this is by no means the case
    • the sooner a goal is identified and validated, the better.
    • Goals are generally modelled by intrinsic features such as their type and attributes and their links to other goals and to other elements of a requirements model.

  • van Lamsweerde, A. (2004). Goal-oriented requirements engineering: a roundtrip from research to practice. Requirements Engineering Conference, 2004. Proceedings. 12th IEEE International.
    • GORE processes are in general a mix of bottom-up and top-down sub-processes as goal models are built by asking WHY and HOW questions about source material obtained from interviews, available documents, etc

    • The size of the requirements documents typically ranged from 100 to 300 pages.

    • well-structured goal model provides an ideal communication interface between business managers and software engineers
    • appreciated feature of GORE models was their built-in vertical traceability – from strategic business objectives to technical requirements to precise specifications to architectural design choices

  • Dardenne, A., A. van Lamsweerde, et al. (1993). “Goal-directed requirements acquisition.” Science of Computer Programming 20(1-2): 3-50.
    • Formal methods, supported by automated tools, enable engineers to capture and specify the software requirements carefully and precisely.
    • Most existing specification languages focus on functional requirements – that is, requirements about what the software system is expected to do.
    • Requirements acquisition and formal specification are not necessarily sequential tasks; from a process programming perspective, one could see them as coroutines.
    • KAOS has three components:
  1. a conceptual model for acquiring and structuring requirements models, with an associated acquisition language,
  2. a set of strategies (acquisition strategies) for elaborating requirements models in this framework, and
  3. an automated assistant to provide guidance in the acquisition process according to such strategies.

  • Yu, E. S. K. (1997). Towards modelling and reasoning support for early-phase requirements engineering. Requirements Engineering, 1997., Proceedings of the Third IEEE International Symposium on.
    • most existing requirements techniques are intended more for the later phase of requirements engineering, which focuses on completeness, consistency, and automated verification of requirements
    • a different kind of modelling and reasoning support is needed for the early phase which aims to model and analyze stakeholder interests and how they might be addressed, or compromised, by various system-and-environment alternatives
    • The i* framework consists of two main modeling components:
      • The Strategic Dependency (SD) model is used to describe the dependency relationships among various actors in an organizational context.
      • The Strategic Rationale (SR) model is used to describe stakeholder interests and concerns, and how they might be addressed by various configurations of systems and environments

  • Castro, J., M. Kolp, et al. (2002). “Towards requirements-driven information systems engineering: the Tropos project.” Information Systems 27(6): 365-389.
    • Requirements analysis is arguably the most important stage of information system development – phase where technical considerations have to be balanced against social and organizational ones and where the operational environment of the system is modeled
    • Existing software development methodologies (object-oriented, structured or otherwise) have traditionally been inspired by programming concepts, not organizational ones, leading to a semantic gap between the software system and its operational environment
    • Tropos which is founded on concepts used to model early requirements.
    • proposal adopts the i* organizational modeling framework, which offers the notions of actor, goal and (actor) dependency, and uses these as a foundation to model early and late requirements, architectural and detailed design
    • 4 phases
      • Early requirements, concerned with the understanding of a problem by studying an organizational setting; the output of this phase is an organizational model which includes relevant actors, their respective goals and their interdependencies.
      • Late requirements, where the system-to-be is described within its operational environment, along with relevant functions and qualities.
      • Architectural design, where the system’s global architecture is defined in terms of sub-systems, interconnected through data, control and other dependencies.
      • Detailed design, where behavior of each architectural component is defined in further detail

  • Rolland, C., C. Souveyet, et al. (1998). “Guiding goal modeling using scenarios.” Software Engineering, IEEE Transactions on 24(12): 1055-1071.
    • Goal modeling is an effective approach to requirements engineering, but it is known to present a number of difficulties in practice
    • paper discusses these difficulties and proposes to couple goal modeling and scenario authoring to overcome them.
    • Scenarios are used to discover goals rather than concretize goals (as is done in other existing techniques)

  • Kavakli, E. (2002). “Goal-Oriented Requirements Engineering: A Unifying Framework.” Requirements Engineering 6(4): 237-251.
    • Since different goal-oriented approaches are appropriate in different RE stages, putting the various goal-oriented approaches together can yield a stronger RE framework that takes advantage of the contributions from the many streams of goal-oriented research; this is the subject matter of this paper.
    • Despite the fact that there is no common definition of the RE process, three tasks to be performed have been identified:
      • requirements elicitation – focuses on understanding the organizational situation that the system under consideration aims to improve and on describing the needs and constraints concerning the system under development
      • requirements specification – maps real-world needs onto a requirements model
      • requirements validation – task intends to ensure that the derived specification corresponds to the original stakeholder needs and conforms to the internal and/or external constraints set by the enterprise and its environment.

  • Horkoff, J. and E. Yu (2011). Analyzing goal models: different approaches and how to choose among them. Proceedings of the 2011 ACM Symposium on Applied Computing. TaiChung, Taiwan, ACM: 675-682.
    • Survey of available approaches for goal model analysis
      • The KAOS Methodology introduced a formal goal framework applying AND and OR decompositions between goals describing desired states over entities, achieved by actions assigned to agents
      • The GBRAM technique guides the elicitation of goals from system activities and artifacts, classifying goals, and associating them with constraints and scenarios. Goals in GBRAM are refined using questions and scenarios, and are represented in tabular form.
      • The Annotated Goal-Oriented Requirements Analysis (AGORA) approach attempts to address missing capabilities of existing goal-oriented approaches by including goal priorities and methods for solving goal conflicts, selecting alternatives, and measuring the quality of models.
      • The NFR (Non-Functional Requirements) framework aims to represent human intentions in technical systems. The framework uses soft goals (goals that are not satisfied via clear-cut criteria), AND and OR decompositions amongst goals, and contribution links representing potentially partial negative and positive contributions to and from such goals.

      • The i* (distributed intentionality) framework made use of notations from the NFR Framework, including soft goals, AND/OR decompositions, and contribution links; to this it added tasks, (hard) goals, resources, and dependencies between actors (agents) (see the sketch after this list).
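
      As a concrete illustration of the kind of reasoning these approaches automate, a minimal sketch of bottom-up satisfaction propagation over AND/OR goal decompositions; the goal graph and labels are invented and do not reproduce any specific tool:

        # Each decomposed goal maps to ("AND" | "OR", children); leaves carry
        # elicited labels (True = satisfied, False = denied).
        decomposition = {
            "Schedule meeting": ("AND", ["Collect availabilities", "Choose slot"]),
            "Collect availabilities": ("OR", ["Email participants", "Poll via web form"]),
        }
        leaf_labels = {"Email participants": False, "Poll via web form": True, "Choose slot": True}

        def satisfied(goal):
            """Propagate labels bottom-up: AND needs all children, OR needs at least one."""
            if goal in leaf_labels:
                return leaf_labels[goal]
            operator, children = decomposition[goal]
            results = [satisfied(child) for child in children]
            return all(results) if operator == "AND" else any(results)

        print(satisfied("Schedule meeting"))  # True: the OR node is satisfied via the web form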

  • Ghanavati, S., D. Amyot, et al. (2011). A systematic review of goal-oriented requirements management frameworks for business process compliance. Requirements Engineering and Law (RELAW), 2011 Fourth International Workshop on.
    • systematic literature review of 88 papers focusing on goal-oriented legal compliance of business processes.
    • eight categories based on a set of criteria and then highlight their main contributions
    • legal requirements, modeling them with goal modeling languages, and integrating them with business processes
    • What goal-oriented frameworks are there that help organizations establish their legal compliance and manage the evolution of their compliance?
      • Are there any goal modeling notations that support modeling legal aspects and support compliance?
      • Are there any methods or frameworks that can integrate legal requirements with business processes to provide law-compliant business processes?
      • Are there any guidelines for extracting legal requirements and mapping them to goal models?
      • Are there methods that provide templates for modeling compliant business processes?
      • Are there methods that help organizations prioritize instances of non-compliance?
      • Is there any tool support for managing compliance?

Day 11 : Design Science

  • Österle, H., J. Becker, et al. (2011). “Memorandum on design-oriented information systems research.” European Journal of Information Systems 20: 7–10.
    • 10 authors propose principles of design-oriented information systems research. Moreover, the memorandum is supported by 111 full professors from the German-speaking scientific community, who with their signature advocate the principles specified therein
    • research in the field has lacked relevance for the practitioners’ community, which could be surmised from the fact that very few Ph.D.s from the IS discipline have ended up working in business.
    • To get work published in the leading IS journals, researchers need to have articles providing statistical evidence of empirically identified characteristics of existing IS (which are favored over publications presenting innovative solutions that are considered highly beneficial for business) – European IS research is in danger of shifting from a design-oriented discipline into a descriptive one.
    • IS researchers in Europe have often preferred publishing books to publishing papers in journals, they have largely neglected publishing in English, which is required to be visible on a global level, and they have shown little commitment to the international scientific community
    • A newly emerging branch of Anglo-Saxon IS research known as design science strives for the same objective as European IS research does in order to meet the demand for more practical relevance of scientific results.
    • Design-oriented IS research is not a non-judgmental scientific discipline, rather it is normative, in a sense that the construction of artifacts is guided by the desire to yield a specific benefit and to satisfy certain objectives
    • Design-oriented IS research must comply with four basic principles:
      • Abstraction: Each artifact must be applicable to a class of problems.
      • Originality: Each artifact must substantially contribute to the advancement of the body of knowledge.
      • Justification: Each artifact must be justified in a comprehensible manner and must allow for its validation.
      • Benefit: Each artifact must yield benefit – either immediately or in the future – for the respective stakeholder groups.

  • Chen, H. (2011). “Editorial: Design science, grand challenges, and societal impacts.” ACM Transactions on Management Information Systems (TMIS) 2(1).
    • our discipline will be judged based on the tangible societal impacts that we have made, not the number of citations generated or the number of MIS faculty tenured
    • The impact of IS research has arguably been relatively small, especially compared with research in fields like finance with its capital assets pricing model, efficient market hypothesis, and options pricing model. Our fear is that the information systems discipline may not survive in academia [Agarwal and Lucas (2005)]
    • I am afraid that MIS research has contributed mostly to the successful P&T of professors, instead of making a substantive and measurable impact on the society as a whole.
    • Design-oriented MIS research has much to contribute to society through our current and future research and teaching

  • Baskerville, R. and J. Pries-Heje (2010). “Explanatory Design Theory.” Business & Information Systems Engineering 2(5): 271-282.
    • “Truth informs design and utility informs theory” (Hevner et al. 2004, p. 80)
    • There is not complete agreement about the characteristics and components of design theories, and of course there is no proof or evidence.
    • Design theory is a prescriptive theory based on theoretical underpinnings which says how a design process can be carried out in a way which is both effective and feasible
    • Information Systems Design Theory can be divided into two classes: “Design Product” and “Design Process”
    • Natural science includes traditional research in physical, biological, social, and behavioral domains. . . Such research is aimed at understanding reality. . . .
    • Design science attempts to create things that serve human purposes. It is technology-oriented. . . . Rather than producing general theoretical knowledge, design scientists produce and apply knowledge of tasks or situations in order to create effective artifacts”
    • Design science products are of four types: constructs, models, methods, and implementations
    • “IT research should be concerned both with utility, as a design science, and with theory, as a natural science”
    • One form of theory can be an explanatory account of reality. In the philosophy of science, we can find four types or patterns of explanations: deductive, probabilistic, functional, and genetic
    • The idea of capturing architectural design ideas for reuse in an archetypical form was pioneered under the name patterns.

  •  J. Pries-Heje, R. Baskerville, and J. Venable, “Strategies for design science research evaluation,” 2008, pp. 255-266.
    • Design science research (DSR) in IS emphasizes the importance of evaluation – discussion of evaluation activities and methods is limited and typically assumes an ex post perspective, in which evaluation occurs after the construction of an IS artifact.
    • This paper analyzes a broader range of evaluation strategies, which includes ex ante (prior to artifact construction) evaluation
    • Ex Ante Evaluation
      • simplest form, ex ante evaluation operates as a cost benefit analysis
      • ex ante evaluation provides models for theoretically evaluating a design without actually implementing the material system or technology. In other words, the artefact is evaluated on the basis of its design specifications alone.
      • 1st dimension
        • Fundamental measures are metrics that capture characteristics of the technology investment as a single measure – capital budgeting, return on investment, user satisfaction ratings, etc. (see the sketch after this list)
        • Composite approaches that combine several fundamental measures and produce a more complex or comprehensive representation of the value of the investment – information economics, portfolios, balanced scorecard, etc
        • Meta-approaches which use the context as the basis for selecting the optimum set of measures.
      • 2nd dimension
        • Positivist/reductionist, in which the metrics ultimately determine the decision.
        • Hermeneutic, in which decision-makers operate according to their understanding and interpretation of the metrics. Hermeneutics applications recognize instinct and intuition, among other influences, that affect the perception of value held by the decision-maker.
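
      A minimal sketch of the fundamental-measures idea above (ex ante evaluation via capital budgeting), computing a simple return on investment and net present value from estimated cash flows; the figures and discount rate are invented for illustration:

        def roi(total_benefit, total_cost):
            """Simple return on investment: net benefit relative to cost."""
            return (total_benefit - total_cost) / total_cost

        def npv(rate, cash_flows):
            """Net present value of yearly cash flows, cash_flows[0] falling in year 0."""
            return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

        # Estimated figures for a candidate design, before any implementation work
        investment = 120_000                 # year-0 outlay
        benefits = [50_000] * 4              # expected benefit per year over four years
        print(round(roi(sum(benefits), investment), 2))    # 0.67
        print(round(npv(0.10, [-investment] + benefits)))  # 38493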

    • Ex-Post Evaluation


    • Among their seven guidelines, Hevner et al. (2004) require researchers to rigorously evaluate design artefacts. They summarize five kinds of evaluation methods (observational, analytical, experimental, testing, and descriptive).

  • Baskerville, R. (2008). “What design science is not.” European Journal of Information Systems 17: 441–443.
    • Design science is not design
    • Design science is not design theory
    • Design science is not an IT artifact
    • Design science is not methodology
    • Design science is not action research
    • Design science is not computer science
    • Design science is not a separate academic discipline
    • Design science is not new

  • Baskerville, R., K. Lyytinen, et al. (2010). “A response to the design-oriented information systems research memorandum.” European Journal of Information Systems 20: 11-15.
    • Disputes and places into a larger context some of the key points communicated in the memorandum; this is crucial because many of these points are the premises used to justify the memorandum's main argument.
    • Most of the IS research carried out not just in North America, but also in most other parts of the world encompasses at least:
  1. design science research on the design of IT-based artifacts;
  2. behavioral research on understanding issues like user acceptance or other individual or team level impacts of IT;
  3. economic research on the value of IS;
  4. strategic and organizational research on the management and impacts of IT in organizations
  • Theory is not only the product of ‘descriptive’ explanatory or predictive research. In design research, design theories can provide teleological, or functional explanations that may constitute an important scholarly product of such research (Nagel, 1961; Simon, 1996).
  • all of these authors agree with the EJIS co-editors that researcher behaviors are different across the Atlantic (and in other parts of the world) because the incentive systems are different

Day 10 : RE & Law

Anton, A. I., H. Qingfeng, et al. (2004). “Inside JetBlue’s privacy policy violations.” Security & Privacy, IEEE 2(6): 12-18.

  • JetBlue provided travel records of five million customers to Torch Concepts, a private US Department of Defense (DoD) contractor, for an antiterrorism study to track high-risk passengers or suspected terrorists
  • Torch then purchased additional customer demographic information on those passengers from Acxiom
  • data transfer directly violated JetBlue’s privacy policy (“JetBlue’s Online Privacy Policy,” 24 Sept. 2003; http://www.jetblue.com/privacy.html) and might have violated federal privacy laws
  • JetBlue was reckless in creating contractual relationships with these organizations before examining privacy policies to ensure that customer records would be treated consistently with JetBlue’s privacy policies
  • JetBlue shared personal information with third parties.
  • JetBlue committed a violation of omission by failing to express its use of third-party cookies
  • JetBlue’s policy reveals a contractual vulnerability with Open-Skies
  • Ambiguities in JetBlue’s privacy policy point to potential vulnerabilities and raise a virtual red flag.
  • The JetBlue privacy policy overemphasizes the Children’s Online Privacy Protection Act (COPPA)-related issues.

Ghanavati, S., D. Amyot, et al. (2011). A systematic review of goal-oriented requirements management frameworks for business process compliance. Requirements Engineering and Law (RELAW), 2011 Fourth International Workshop on.

  • systematic literature review of 88 papers focusing on goal-oriented legal compliance of business processes.
  • eight categories based on a set of criteria and then highlight their main contributions
  • legal requirements, modeling them with goal modeling languages, and integrating them with business processes
  • What goal-oriented frameworks are there that help organizations establish their legal compliance and manage the evolution of their compliance?
    • Are there any goal modeling notations that support modeling legal aspects and support compliance?
    • Are there any methods or frameworks that can integrate legal requirements with business processes to provide law-compliant business processes?
    • Are there any guidelines for extracting legal requirements and mapping them to goal models?
    • Are there methods that provide templates for modeling compliant business processes?
    • Are there methods that help organizations prioritize instances of non-compliance?
    • Is there any tool support for managing compliance?

Breaux, T. and A. Anton (2008). “Analyzing Regulatory Rules for Privacy and Security Requirements.” IEEE Transactions on Software Engineering 34(1): 5-20.

  • Regulations describe stakeholder rules, called rights and obligations, in complex and sometimes ambiguous legal language.
  • “Rules” are often precursors to software requirements that must undergo considerable refinement and analysis before they become implementable.
  • To support the software engineering effort to derive security requirements from regulations, we present a methodology for directly extracting access rights and obligations from regulation texts.
  • Health Insurance Portability and Accountability Act (HIPAA) require members of the healthcare industry who use electronic information systems to protect the privacy of medical information
  • requirements engineering methodology for encoding rules from regulations discussed in this paper was developed using Grounded Theory
  • Methodology was applied to the HIPAA Privacy Rule, yielding 300 stakeholder access rules. The analysis encompassed four passes through all 55 pages of the Rule, with two people working in tandem.
  • Among the total 1,894 constraints acquired from the Privacy Rule, 1,033 of these were parameterized constraints,

Otto, P. N. and A. I. Antón (2009). Managing Legal Texts in Requirements Engineering

  • legal compliance is critical in systems development, especially given that non-compliance can result in both financial and criminal penalties
  • legal texts can be very challenging, because they contain numerous ambiguities, cross-references, domain-specific definitions, and acronyms, and are frequently amended via new statutes, regulations, and case law.
  • research efforts over the past 50 years in handling legal texts for systems development.
  • Characteristics of legal text and challenges
    • Legal texts tend to be very structured and hierarchical documents.
    • legal texts may complement, overlap, or even contradict one another due to differing objectives and changes over time
    • amendments and revisions to the same provision of a legal text can lead to internal contradictions
    • the frequent references to other sections within a given legal text and even to other legal texts – cross-references force requirements engineers to spend additional time reading and understanding legal texts before they can even begin to extract key concepts or apply the legal texts to system design, and requirements engineers are prone to make interpretations and inferences that are inconsistent with the law
    • extensive definitions necessitate a significant amount of domain knowledge before the legal texts are comprehensible and usable.
    • requirements engineers may need to consult with their organization’s legal counsel in order to establish interpretations of a given ruling, due to complexities such as the authority of a given ruling (e.g., binding, persuasive) and whether all or part of the ruling is still good law.
  • Information security and data privacy law, are still emerging fields and are therefore subject to greater fluctuation in the requirements stemming from laws and regulations;
  • Prolog logic programming language targeted for knowledge representation and expert systems

Otto, P. N. and A. I. Anton (2007). Addressing Legal Requirements in Requirements Engineering. Requirements Engineering Conference, 2007. RE ’07. 15th IEEE International.

  • Working with legal texts can be very challenging, however, because they contain numerous ambiguities, cross-references, domain specific definitions, and acronyms, and are frequently amended via new regulations and case law.

Antón, A. I. and J. B. Earp (2004). “A requirements taxonomy for reducing Web site privacy vulnerabilities.” Requirements Engineering 9(3): 169-185.

  • The identified taxonomy categories are useful for analysing implicit internal conflicts within privacy policies, the corresponding Web sites, and their manner of operation
  • categories can be used by Web site designers to reduce Web site privacy vulnerabilities and ensure that their stated and actual policies are consistent with each other
  • Used content analysis technique, goal-mining (the extraction of pre-requirements goals from post-requirements text artifacts), to derive the privacy-related goals of various Internet health care Web sites.
  • Goal-mining efforts were conducted in the spirit of grounded theory
  • goals derived from 25 Internet e-commerce privacy policies were categorised according to common characteristics that emerged and coded into the following categories: notice/awareness, choice/consent, access/participation, integrity/security, enforcement/redress, monitoring, aggregation, storage, transfer of information, collection, personalisation and solicitation
  • The World Wide Web (WWW) Consortium is establishing the Platform for Privacy Preferences Project (P3P)4 as an industry standard to provide an automated way for users to gain control of and manage the use of their personal information on Web sites they visit.
    • a TRUSTe seal simply ensures that TRUSTe has reviewed the licensee’s privacy policy for disclosure of the following uses of information by a Web site: what personal information is being gathered; how the information will be used; who the information will be shared with; the choices available regarding how collected information is used; safeguards in place to protect personal information from loss, misuse, or alteration; and how individuals can update or correct inaccuracies in information collected about them.

Kiyavitskaya, N., N. Zeni, et al. (2007). Extracting rights and obligations from regulations: toward a tool-supported process. Proceedings of the twenty-second IEEE/ACM international conference on Automated software engineering. Atlanta, Georgia, USA, ACM: 429-432.

  • “regulation compliance” problem, whereby companies and developers are required to ensure that their software complies with relevant regulations, either through design or reengineering.
  • examine the challenges of developing tool support for this process
    • first step of the process: the annotation of regulatory text to identify basic concepts such as rights and obligations is achieved through the Cerno framework for semantic annotation (see the sketch after this list)
    • Cerno uses context-free grammars, generates a parse tree, and applies transformation rules to generate output in a target format
    • traceability challenges – Rights and obligations do not always appear in separate statements; they may be intermixed, distributed or refined across different statements.
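
    A minimal sketch in the spirit of this annotation step, not the actual Cerno implementation: a coarse pattern-based pass that tags statements containing modal verbs as candidate rights or obligations; the patterns and sentences are invented for illustration:

      import re

      # Rough markers only: "may" suggests a right, "shall"/"must" an obligation.
      RIGHT = re.compile(r"\b(may|is permitted to)\b", re.IGNORECASE)
      OBLIGATION = re.compile(r"\b(shall|must|is required to)\b", re.IGNORECASE)

      def annotate(sentences):
          """Yield (sentence, label) pairs as a first, coarse annotation pass."""
          for s in sentences:
              if OBLIGATION.search(s):
                  yield s, "obligation"
              elif RIGHT.search(s):
                  yield s, "right"
              else:
                  yield s, "unclassified"

      text = [
          "A covered entity must obtain the individual's authorization before disclosure.",
          "An individual may request an accounting of disclosures.",
      ]
      for sentence, label in annotate(text):
          print(label, "->", sentence)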

Day 9 – Event Mining – Privacy

  • Jensen, C. and C. Potts (2005). “Privacy practices of Internet users: self-reports versus observed behavior.” International Journal of Human-Computer Studies 63: 203-227.
    • user privacy concerns were analyzed in depth, and what users said was contrasted with what they did in an experimental e-commerce scenario.
    • 175 subjects – approximately two thirds were currently involved in education (students, faculty and researchers). Subjects were asked a series of multiple choice demographic questions
      • Privacy-oriented answers to all three questions are classified as “Privacy fundamentalists”,
      • No privacy-oriented answers are classified as “Privacy unconcerned”,
      • Those in-between are classified as “Privacy pragmatists”.

The study demonstrates that users do not do what they say, and they do not know what they claim to know.

Day 9 – Roadmap for Comprehensive Requirements Modeling

  • Robinson, W. N. (2010). “A Roadmap for Comprehensive Requirements Modeling.” Computer 43(5): 64-72.
    • large software systems result from weaving together many independently developed systems
    • Developers typically generate a bug dashboard of severity, component, and bug counts. After reviewing the dashboard, they plan a new update and push it out to users periodically;
    • Adaptive software enterprises continually monitor their stakeholders through focus groups, beta software, and so on, evolving their software to meet customer needs
    • This article provides a comprehensive, four-layer service framework for understanding the architectural and research issues of current and future monitoring practices.
    • Requirements monitoring addresses five problems:
  1. Distributed debugging. Determines what’s wrong with software.
  2. Runtime verification. Determines whether the software works as specified (see the sketch after this list).
  3. Runtime validation. Determines whether the software satisfies the user goals, especially in the face of an evolving software system.
  4. Business activity monitoring. Determines whether an enterprise system satisfies the business goals.
  5. Evolution and codesign. Informs users and developers as they jointly evolve the system to meet their needs.
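
    A minimal sketch of the runtime-verification idea above: a monitor that watches an event stream and flags violations of a simple response property (“every request is answered within a deadline”); the event format, deadline, and trace are invented for illustration:

      DEADLINE = 5.0  # seconds allowed between a request and its response

      def monitor(events):
          """Return ids of requests whose responses were missing or too late."""
          pending, violations, last_time = {}, set(), 0.0
          for time, kind, rid in events:       # events: (timestamp, kind, request_id)
              last_time = time
              if kind == "request":
                  pending[rid] = time
              elif kind == "response" and rid in pending:
                  if time - pending.pop(rid) > DEADLINE:
                      violations.add(rid)
          # Requests still unanswered past the deadline when the trace ends
          violations.update(r for r, t in pending.items() if last_time - t > DEADLINE)
          return violations

      trace = [(0.0, "request", "r1"), (1.0, "request", "r2"),
               (2.0, "response", "r1"), (9.0, "response", "r2")]
      print(monitor(trace))  # {'r2'}: answered after the 5-second deadline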