Enterprise Architecture Quarter

Enterprise Architecture Quarter (hereinafter, the Quarter) is the third of four lectures of Project Quadrivium (hereinafter, the Quadrivium):

  • The Quarter is designed to introduce its learners to enterprise design, or, in other words, to concepts related to creating architecture for achieving enterprise goals; and
  • The Quadrivium examines, as a whole, the concepts of administering various types of enterprises, collectively known as enterprise administration.

The Quadrivium is the first of seven modules of Septem Artes Administrativi, which is a course designed to introduce its learners to general concepts in business administration, management, and organizational behavior.


Lecture outline

The predecessor lecture is Business Analysis Quarter.

Concepts

  1. Product vision statement. A brief statement or paragraph that describes the why, what, and who of the desired software product from a business point of view.
    • Product vision statement. A high-level description of a product that includes who it is for, why it is necessary, and what differentiates it from similar products.
    • Product. A solution or component of a solution that is the result of a project.
    • Feature. A cohesive bundle of externally visible functionality that should align with business goals and objectives. Each feature is a logically related grouping of functional requirements or non-functional requirements described in broad strokes.
    • Defect. A deficiency in a product or service that reduces its quality or varies from a desired attribute, state, or functionality. See also requirements defect.
  2. Product backlog. A set of user stories, requirements or features that have been identified as candidates for potential implementation, prioritized, and estimated.
    • Backlog. A changing list of product requirements based on the customer’s needs. The backlog is not a to-do list; rather, it is a list of all the desired features for the product. The Agile team uses the backlog to prioritize features and understand which features to implement first.
    • Backlog grooming. The process that occurs at the end of a sprint, when the team meets to make sure the backlog is ready for the next sprint. The team may remove user stories that aren’t relevant, create new stories, reassess priority, or split user stories into smaller tasks. Backlog grooming is both an ongoing process and the name for the meeting where this action occurs (a backlog grooming meeting).
    • Product backlog item (PBI). A single element of work that exists in the product backlog. PBIs can include user stories, epics, specifications, bugs, or change requirements. The product owner of an Agile team compiles and prioritizes the product backlog, putting the most urgent or important PBIs at the top. PBIs comprise tasks that need to be completed during a Scrum sprint—a PBI must be a small enough increment of work to be completed during a single sprint. As PBIs move up to a higher priority in the product backlog, they are broken down into user stories.
    • Product backlog. The list of requirements requested by the customer. The product backlog is not a to-do list; rather, it is a list of all the features the customer has requested be included in the project. The requirements include both functional and non-functional customer requirements, as well as technical team-generated requirements. While there are multiple inputs to the product backlog, it is the sole responsibility of the product owner to prioritize the product backlog. During a Sprint planning meeting, backlog items are moved from the product backlog into a sprint, based on the product owner's priorities.
    • Sprint backlog. A segment of Product Backlog Items (PBIs) that the team selects to complete during a Scrum sprint. These PBIs are typically user stories taken from the product backlog.
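  The backlog entries above can be made concrete with a minimal Python sketch. The names used here (BacklogItem, ProductBacklog, plan_sprint, the priority and estimate fields) are illustrative assumptions rather than part of any Scrum standard; the sketch only shows a prioritized product backlog from which a sprint backlog is drawn up to a capacity limit.

      from dataclasses import dataclass, field

      @dataclass
      class BacklogItem:
          """A product backlog item (PBI): a user story, bug, or change request."""
          title: str
          priority: int   # lower number = more urgent or important
          estimate: int   # size in story points

      @dataclass
      class ProductBacklog:
          """The product owner's single, prioritized list of requested work."""
          items: list[BacklogItem] = field(default_factory=list)

          def add(self, item: BacklogItem) -> None:
              self.items.append(item)
              # The product owner keeps the most important PBIs at the top.
              self.items.sort(key=lambda i: i.priority)

          def plan_sprint(self, capacity: int) -> list[BacklogItem]:
              """Pull top-priority PBIs into a sprint backlog until capacity is used."""
              sprint_backlog, used = [], 0
              for item in self.items:
                  if used + item.estimate <= capacity:
                      sprint_backlog.append(item)
                      used += item.estimate
              return sprint_backlog

      backlog = ProductBacklog()
      backlog.add(BacklogItem("User login", priority=1, estimate=5))
      backlog.add(BacklogItem("Password reset", priority=2, estimate=3))
      backlog.add(BacklogItem("Audit log export", priority=3, estimate=8))
      print([i.title for i in backlog.plan_sprint(capacity=10)])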
  3. System. A set of interrelated and interdependent parts arranged in a manner that produces a unified whole.
  4. Systems engineering.
  5. Information architecture (IA). The art and science of organising and labeling websites, intranets, online communities and software to support usability.
    • Usability. The ease of use and learnability of an object, such as a book, software application, website, machine, tool or any object that a human interacts with.
    • Interaction design (IxD). Sometimes referred to as IxD, interaction design strives to create meaningful relationships between people and the products and services that they use.
  6. UX design.
    • Adaptive design. Like Responsive web design, it is an approach to web design aimed at crafting sites that provide an optimal viewing and interaction experience on different screens and devices. The difference is that adaptive design is less fluid than RWD and ‘serves’ a few fixed-width versions of the design depending on the viewport size. It can utilise server-side techniques to ‘detect’ the viewport size prior to rendering HTML. The advantage for the designer is that it gives more control over images and typography, and hence is an easier approach for ‘retrofitting’ fixed-width websites to work on mobile devices.
    • Responsive design. A design approach that responds to the user’s behavior and environment based on screen size, platform and orientation. The practice consists of a mix of flexible grids and layouts, images and an intelligent use of CSS media queries.
  7. Acceptance criteria. Specification for a set of conditions that the product must meet in order to satisfy the customer. In Agile development, the product owner writes statements from the customer’s point of view that explain how a user story or feature should work. In order for the story or feature to be accepted it needs to pass the acceptance criteria; otherwise, it fails.
  • Interaction model. A design model that binds an application together in a way that supports the conceptual models of its target users. It defines how all of the objects and actions that are part of an application interrelate, in ways that mirror and support real-life user interactions.
  • Service design. The practice of designing a product according to the needs of users, so that the service is user-friendly, competitive and relevant to the users.
  • Service. Work carried out by or on behalf of others.
  • Visual design. Also called communication design. A discipline which combines design and information development in order to develop and communicate a media message to a target audience.
  • Wireframe. A rough guide for the layout of a website or app, either done with pen and paper or with wireframing software.
  • Action design. A change process based on systematic collection of data and then selection of a change action based on what the analyzed data indicate.
  • Commitment concept. Plans should extend far enough to meet the commitments made when the plans were developed.
  • Load chart. A modified Gantt chart that schedules capacity by entire departments or specific resources.
  • Organizational development. A collection of planned change interventions, built on humanistic-democratic values, that seeks to improve organizational effectiveness and employee well-being.
  • Organizational development. Change methods that focus on people and the nature and quality of interpersonal work relationships.
  • Statement of work (SOW). A narrative description of products or services to be supplied under contract.

Roles

  1. Architect. There is no architect role in Agile development; instead, all Agile team members are responsible for the architecture as it emerges.
  2. Agile team. A work team that, in Agile development, is responsible for committing to work and for delivering and driving the product forward from a tactical perspective. Usually, an Agile team is a small, high-functioning group of five to nine people who work together collaboratively to complete an iteration or project and who have the necessary skills and competencies for the work. Scrum teams are cross-functional; Kanban teams can be either cross-functional or specialist. Scrum teams do not assign individual roles within the team; Kanban teams usually have team leads.
  3. Scrum role. One of the following: product owner, Scrum master, Agile team member.
    • Scrum master. A facilitator for the team and product owner. Rather than manage the team, the Scrum master works to assist both the team and product owner in the following ways: (1) Remove the barriers between the development and the product owner so that the product owner directly drives development. (2) Teach the product owner how to maximize return on investment (ROI), and meet his/her objectives through Scrum. (3) Improve the lives of the development team by facilitating creativity and empowerment. (4) Improve the productivity of the development team in any way possible. (5) Improve the engineering practices and tools so that each increment of functionality is potentially shippable. (6) Keep information about the team's progress up to date and visible to all parties. Scrum master is often viewed as the coach for the team.
    • Product owner. A person who holds the vision for the product and is responsible for maintaining, prioritizing and updating the product backlog. In Agile development, the product owner has final authority representing the customer's interest in backlog prioritization and requirements questions. This person must be available to the team at any time, but especially during the Sprint planning meeting and the Sprint review meeting. Challenges of being a product owner: (1) Resisting the temptation to "manage" the team. The team may not self-organize in the way you would expect it to. This is especially challenging if some team members request your intervention with issues the team should sort out for itself. (2) Resisting the temptation to add more important work after a Sprint is already in progress. (3) Being willing to make hard choices during the sprint planning meeting. (4) Balancing the interests of competing stakeholders.
  4. Customer. The organization or individual that has requested (and will pay for) a product or service.
  5. Engineer.
  6. Project sponsor.
  • Regulator. A stakeholder with legal or governance authority over the solution or the process used to develop it.
  • Stakeholder. An individual or group affected in some way by the undertaking. Stakeholders are valuable sources for requirements.
  • Stakeholder. Anyone outside the Scrum team who has an interest in the product that the team is producing. Stakeholders can include but are not limited to direct managers, subject matter experts, account managers, salespeople, and legal officers.

Methods

  1. Agile development. The project management approach of developing increments of software in frequent iterations based on evolving requirements. The Agile Manifesto was the initial public declaration for Agile development related to software. Its authors believed that they found "better ways of developing software by doing it and helping others do it."
    • Agile software development methodology. A methodology fundamentally incorporating iteration and continuous feedback to refine and deliver a software system. It involves continuous planning, testing, integration, and other forms of continuous evolution of both the project and the software.
    • Lean Agile development. An example of lightweight Agile methodology applied to project development. Lean Software Development combines the Lean manufacturing approach pioneered by Toyota in the 1950s (also known as just-in-time production) and Lean IT principles, and applies them to software. LSD places a strong emphasis on people and effective communication. LSD is defined by seven principles: (1) Eliminate waste, (2) Create knowledge, (3) Build quality in, (4) Defer commitment, (5) Optimize the whole, (6) Deliver fast, (7) Respect people
    • Lean UX. Inspired by Lean and Agile development theories, Lean UX speeds up the UX process by putting less emphasis on deliverables and greater focus on the actual experience being designed.
    • Test-driven development (TDD). The practice of designing and building tests for functional, working code, and then building code that will pass those tests.
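  A minimal Python sketch of the test-first cycle described above: the test is written first and fails, then just enough code is written to make it pass. The slugify function and its expected behaviour are hypothetical examples, not taken from any particular project.

      # Step 1 (red): write the test for behavior that does not exist yet.
      def test_slugify():
          assert slugify("Hello, World!") == "hello-world"
          assert slugify("  Agile  Teams ") == "agile-teams"

      # Step 2 (green): write just enough code to make the test pass.
      import re

      def slugify(text: str) -> str:
          """Lower-case a title and join its words with hyphens."""
          words = re.findall(r"[a-z0-9]+", text.lower())
          return "-".join(words)

      # Step 3 (refactor): clean up while keeping the test green.
      test_slugify()
      print("all tests pass")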
  2. Kanban. A highly visual framework that falls under the Agile umbrella. The Kanban process uses continuous work flow rather than fixed iterations to produce shippable deliverables. When applied over an existing process, Kanban encourages small, incremental changes to the current process and does not require a specific set up or procedure. Kanban focuses on completing entire projects rather than sprints.
  3. Iterative design. A methodology based on a cyclic process of prototyping, testing, analysing, and refining a product or process. Based on the results of testing the most recent iteration of a design, changes are made. This process is intended to ultimately improve the quality and functionality of a design.
    • Iterate. The act of repeating a process with the aim of approaching a desired goal, target or result. Each repetition of the process is also called an iteration.
    • Iteration. A fixed or timeboxed period of time, generally spanning two to four weeks, during which an Agile team develops a deliverable, potentially shippable product. A typical Agile project consists of a series of iterations, along with a Sprint planning meeting prior to development and a Sprint retrospective at the end of the iteration. Iterations are referred to as sprints in Scrum.
    • Iterative development. The process of breaking down projects into more manageable components known as iterations. Iterations are essential in Agile methodologies for producing a potentially shippable deliverable or product.
  4. Waterfall model. A sequential design process where progress is seen as flowing steadily downwards through the phases of Conception > Initiation > Analysis > Design > Construction > Testing > Implementation > Maintenance.

Instruments

    • Axure. A wireframing and interactive prototyping tool, available for both Windows and Mac.
    • Balsamiq Mockups. A wireframing and interactive prototyping tool, available for both Windows and Mac.
  1. Scrum meeting. One of the following: story time, Sprint planning meeting, Sprint review meeting, Sprint retrospective, daily standup.
    • Sprint planning meeting. A working session held before the start of each sprint to reach a mutual consensus between the product owner's acceptance criteria and the amount of work the development team can realistically accomplish by the end of the sprint. The length of the sprint determines the length of the Sprint planning meeting, with two hours being equivalent to one week of the sprint. Using this formula, the Sprint planning meeting for a two-week sprint would last about four hours, although this can vary.
    • Daily standup. A brief communication and status-check session facilitated by the Scrum Master where Scrum teams share progress, report impediments, and make commitments for the current iteration or sprint. The Daily Scrum consists of a tightly focused conversation kept to a strict timeframe; the meeting is held at the same time, every day (ideally, in the morning), and in the same location. The Scrum task board serves as the focal point of the meeting. During the Daily scrum each team member answers three questions: (1) "What have I done since the last Scrum meeting? (i.e. yesterday)" (2) "What will I do before the next Scrum meeting? (i.e. today)" (3) "What prevents me from performing my work as efficiently as possible?"
    • Story time. A regular work session where items on the backlog are discussed, refined and estimated and the backlog is trimmed and prioritized.
    • Scrum of scrums. A meeting that is a scaling mechanism used to manage large projects involving multiple Scrum teams. A Scrum of Scrums is held to facilitate communication between teams that may have dependencies on one another. One member from each team attends the Scrum of Scrums to speak for the team—this could be the Scrum Master but may be any team member who can effectively relay information and handle questions or concerns for the team.
    • Sprint review meeting. A meeting that a Scrum team holds immediately following the completion of a sprint to review and demonstrate what the team has accomplished during the sprint. This meeting is attended by the product owner or customer, Scrum Master, Scrum team, and stakeholders. The Sprint review meeting is an informal meeting (no PowerPoint slides allowed). The length of the sprint determines the length of the Sprint review meeting, with one hour being equivalent to one week of the sprint. Using this formula, the Sprint review meeting for a two-week sprint would last two hours, although this can vary.
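  The timebox arithmetic quoted above (roughly two hours of Sprint planning and one hour of Sprint review per week of sprint) reduces to a small helper; this is an illustrative rule of thumb sketched in Python, not a prescription from the Scrum Guide.

      def meeting_hours(sprint_weeks: int) -> dict:
          """Rule-of-thumb timeboxes: 2 h planning and 1 h review per sprint week."""
          return {
              "sprint_planning": 2 * sprint_weeks,
              "sprint_review": 1 * sprint_weeks,
          }

      print(meeting_hours(2))   # {'sprint_planning': 4, 'sprint_review': 2}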

Results

  1. Product scope. The features and functions that characterize a product, service or result.
    • Branding. The process of creating and marketing a consistent idea or image of a product, so that it is recognizable by the public.

Practices

  • Abstraction. The ability of engineers to think of design concepts that are not dependent on specific solutions.
  • Analysis stage. The stage of the UX process where insights are drawn from data collected during the earlier Research stage. Capturing, organising and making inferences from the “what” can help UX designers begin to understand the “why”.
  • Analytics. A broad term that encompasses a variety of tools, techniques and processes used for extracting useful information or meaningful patterns from data.
  • Association. A link between two elements or objects in a diagram.
  • Assumption. Assumptions are influencing factors that are believed to be true but have not been confirmed to be accurate.
  • Attribute. A data element with a specified data type that describes information associated with a concept or entity.
  • Boundary. A separation between the interior of a system and what lies outside.
  • Business analysis approach. The set of processes, templates, and activities that will be used to perform business analysis in a specific context.
  • Business analysis communication plan. A description of the types of communication the business analyst will perform during business analysis, the recipients of those communications, and the form in which communication should occur.
  • Business analysis plan. A description of the planned activities that the business analyst will execute in order to perform the business analysis work involved in a specific initiative.
  • Business architecture. A subset of the enterprise architecture that defines an organization's current and future state, including its strategy, its goals and objectives, the internal environment through a process or functional view, the external environment in which the business operates, and the stakeholders affected by the organization's activities.
  • Business constraint(s). Business constraints are limitations placed on the solution design by the organization that needs the solution. Business constraints describe limitations on available solutions, or an aspect of the current state that cannot be changed by the deployment of the new solution. See also technical constraint.
  • Business domain model. A conceptual view of all or part of an enterprise focusing on products, deliverables and events that are important to the mission of the organization. The domain model is useful to validate the solution scope with the business and technical stakeholders. See also model.
  • Business event. A system trigger that is initiated by humans.
  • Business goal. A state or condition the business must satisfy to reach its vision.
  • Business policy. A business policy is a non-actionable directive that supports a business goal.
  • Business rule(s). A business rule is a specific, actionable, testable directive that is under the control of the business and supports a business policy.
  • Capability. A function of an organization that enables it to achieve a business goal or objective.
  • Capacity. The amount of work that can be completed within a certain time frame and is based on the number of hours that an individual or team will be available to complete the work.
  • Card sorting. A technique using either actual cards or software, whereby users generate an information hierarchy that can then form the basis of an information architecture or navigation menu.
  • Cardinality. The number of occurrences of one entity in a data model that are linked to a second entity. Cardinality is shown on a data model with a special notation, number (e.g., 1), or letter (e.g., M for many).
  • Change control board (CCB). A small group of stakeholders who will make decisions regarding the disposition and treatment of changing requirements.
  • Change-driven methodology. A methodology that focuses on rapid delivery of solution capabilities in an incremental fashion and direct involvement of stakeholders to gather feedback on the solution's performance.
  • Class model. A type of data model that depicts information groups as classes.
  • Class. A descriptor for a set of system objects that share the same attributes, operations, relationships, and behavior. A class represents a concept in the system under design. When used as an analysis model, a class will generally also correspond to a real-world entity.
  • Collaborative design. Inviting input from users, stakeholders and other project members.
  • Commercial-off-the-shelf software (COTS). Software developed and sold for a particular market.
  • Comparative analysis. Performing an item by item comparison of two or more websites or apps to determine trends or patterns.
  • Competitive analysis. A structured process which captures the key characteristics of an industry to predict the long-term profitability prospects and to determine the practices of the most significant competitors.
  • Competitor analysis. Performing an audit or conducting user testing of competing websites and apps; writing a report that summarises the competitive landscape.
  • Constraint. A constraint describes any limitations imposed on the solution that do not support the business or stakeholder needs.
  • Content audit. Reviewing and cataloguing a client’s existing repository of content.
  • Content Management System (CMS). Software that allows publishing, editing and maintaining content from a central interface. See also: Content management
  • Content management. The suite of processes and technologies that support the collection, management, and publication of information in any medium.
  • Context diagram. An analysis model that illustrates product scope by showing the system in its environment, together with the external entities (people and systems) that provide input to and receive output from the system.
  • Context. The users, other systems and other features of the environment of the system that the system will interact with.
  • Continuous improvement. A process of improving quality and efficiency by making small, incremental changes over time. In Kanban, continuous improvement refers specifically to the process of optimizing workflow and reducing cycle time, resulting in increased productivity.
  • Cost-benefit analysis. Analysis done to compare and quantify the financial and non-financial costs of making a change or implementing a solution compared to the benefits gained.
  • Customer Journey Map. A holistic, visual representation of your users’ interactions with your organisation when zoomed right out (usually captured on a large canvas). See also: Experience Map
  • Data dictionary. An analysis model describing the data structures and attributes needed by the system.
  • Data entity. A group of related information to be stored by the system. Entities can be people, roles, places, things, organizations, occurrences in time, concepts, or documents.
  • Data flow diagram (DFD). An analysis model that illustrates processes that occur, along with the flows of data to and from those processes.
  • Data model. An analysis model that depicts the logical structure of data, independent of the data design or data storage mechanisms.
  • Decision analysis. An approach to decision-making that examines and models the possible consequences of different decisions. Decision analysis assists in making an optimal decision under conditions of uncertainty.
  • Decision table. An analysis model that specifies complex business rules or logic concisely in an easy-to-read tabular format, specifying all of the possible conditions and actions that need to be accounted for in business rules.
  • Decision tree. An analysis model that provides a graphical alternative to decision tables by illustrating conditions and actions in sequence.
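  The decision table and decision tree entries above can be illustrated with a short Python sketch; the discount rules below are invented purely for illustration. The table maps every combination of conditions to an action, while the nested conditionals at the end express the same rules in tree form.

      # Decision table: every combination of conditions maps to one action.
      # Conditions: (is_member, order_over_100)
      DISCOUNT_TABLE = {
          (True,  True):  "20% discount",
          (True,  False): "10% discount",
          (False, True):  "5% discount",
          (False, False): "no discount",
      }

      def discount_from_table(is_member: bool, order_over_100: bool) -> str:
          return DISCOUNT_TABLE[(is_member, order_over_100)]

      # The same rules as a decision tree: conditions checked in sequence.
      def discount_from_tree(is_member: bool, order_over_100: bool) -> str:
          if is_member:
              return "20% discount" if order_over_100 else "10% discount"
          return "5% discount" if order_over_100 else "no discount"

      assert discount_from_table(True, False) == discount_from_tree(True, False)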
  • Decomposition. A technique that subdivides a problem into its component parts in order to facilitate analysis and understanding of those components.
  • Design stage. The stage in a user-centred design process where ideas for potential solutions are captured and refined visually, based on the analysis and research performed in earlier stages.
  • Desired outcome. The business benefits that will result from meeting the business need and the end state desired by stakeholders.
  • Dialog hierarchy. An analysis model that shows user interface dialogs arranged as hierarchies.
  • Dialog map. An analysis model that illustrates the architecture of the system's user interface.
  • Diary Study. Asking users to record their experiences and thoughts about a product or task in a journal over a set period of time.
  • Domain. The problem area undergoing analysis.
  • Done done. A product increment that is considered potentially releasable; it means that all design, coding, testing and documentation have been completed and the increment is fully integrated into the system.
  • Emergence. The principle that the best designs, and the best ways of working come about over time through doing the work, rather than being defined in advance, cf. empiricism, self organization.
  • Empiricism. The principle of "inspect and adapt" which allows teams or individuals to try something out and learn from the experience by conscious reflection and change, cf. emergence, self organization.
  • Engineering. The application of scientific principles to practical ends.
  • Evaluation. The systematic and objective assessment of a solution to determine its status and efficacy in meeting objectives over time, and to identify ways to improve the solution to better meet objectives. See also metric, indicator and monitoring.
  • Event response table. An analysis model in table format that defines the events (i.e., the input stimuli that trigger the system to carry out some function) and their responses.
  • Event. An event is something that occurs to which an organizational unit, system, or process must respond.
  • Evolutionary prototype. A prototype that is continuously modified and updated in response to feedback from users.
  • Experience Map. An experience map is a holistic, visual representation of your users’ interactions with your organisation when zoomed right out (usually captured on a large canvas). See also: Customer Journey Map
  • External interface. An interface with other systems (hardware, software, and human) that a proposed system will interact with.
  • Feasibility study. An evaluation of proposed alternatives to determine if they are technically possible within the constraints of the organization and whether they will deliver the desired benefits to the organization.
  • Feature creep. The tendency to add additional requirements or features to a project after development is already underway. Feature creep can occur on either a project or sprint level.
  • Feedback. Information about the output of a system that can be used to adjust it.
  • Fibonacci sequence. Described in the 13th century by Leonardo Pisano, the Fibonacci sequence is a mathematical sequence in which each subsequent number is the sum of the two previous numbers, that is: 1, 2, 3, 5, 8, 13, 21… Each interval becomes larger as the numbers increase. The sequence is often used for Story Points, simply because estimates are always less accurate when dealing with epic stories.
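  Assuming the estimation scale quoted in the Fibonacci sequence entry above (1, 2, 3, 5, 8, 13, 21), a few lines of Python can generate it; the cut-off at 21 points is an arbitrary choice for illustration.

      def story_point_scale(limit: int = 21) -> list[int]:
          """Fibonacci-style scale used for story-point estimates: 1, 2, 3, 5, 8, 13, 21."""
          scale, a, b = [], 1, 2
          while a <= limit:
              scale.append(a)
              a, b = b, a + b
          return scale

      print(story_point_scale())   # [1, 2, 3, 5, 8, 13, 21]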
  • Fishbone diagram. A diagramming technique used in root cause analysis to identify underlying causes of an observed problem, and the relationships that exist between those causes.
  • Force field analysis. A graphical method for depicting the forces that support and oppose a change. Involves identifying the forces, depicting them on opposite sides of a line (supporting and opposing forces) and then estimating the strength of each set of forces.
  • Gantt Chart. A project management tool in the form of a bar chart showing the start and finish dates of activities.
  • Glossary. A list and definition of the business terms and concepts relevant to the solution being built or enhanced.
  • Heuristic review. Evaluating a website or app and documenting usability flaws and other areas for improvement.
  • Human factor. Also called ergonomics. The scientific discipline of studying interactions between humans and external systems, including human-computer interaction. When applied to design, the study of human factors seeks to optimise both human well-being and system performance.
  • Impact analysis. An impact analysis assesses the effects that a proposed change will have on a stakeholder or stakeholder group, project, or system.
  • Indicator. An indicator identifies a specific numerical measurement that indicates progress toward achieving an impact, output, activity or input. See also metric.
  • Information scent. An important concept in information foraging theory referring to the extent to which users can predict what they will find if they pursue a certain path through a website. As animals rely on scents to indicate the chances of finding food, so do humans rely on various cues in the information environment to achieve their goals.
  • Initiative. Any effort undertaken with a defined goal or objective.
  • Interdisciplinarity. People from different disciplines working together to design systems.
  • Interface. A shared boundary between any two persons and/or systems through which information is communicated.
  • Interoperability. Ability of systems to communicate by exchanging data or services.
  • Interview. A systematic approach to elicit information from a person or group of people in an informal or formal setting by asking relevant questions and documenting the responses.
  • Knowledge area. A group of related tasks that support a key function of business analysis.
  • Lessons learned process. A process improvement technique used to learn about and improve on a process or project. A lessons learned session involves a special meeting in which the team explores what worked, what didn't work, what could be learned from the just-completed iteration, and how to adapt processes and techniques before continuing or starting anew.
  • Metadata. Metadata is information that is used to understand the context and validity of information recorded in a system.
  • Methodology. A set of processes, rules, templates, and working methods that prescribe how business analysis, solution development and implementation is performed in a particular context.
  • Metric. A metric is a quantifiable level of an indicator that an organization wants to accomplish at a specific point in time.
  • Mission. An undertaking that the system to be designed must support in order to be successful (e.g. a space mission).
  • Model(s). A representation and simplification of reality developed to convey information to a specific audience to support analysis, communication and understanding.
  • Monitoring. Monitoring is a continuous process of collecting data to determine how well a solution is implemented compared to expected results. See also metric and indicator.
  • Mood Board. A collage, either physical or digital, which is intended to communicate the visual style a design direction is heading in.
  • Objective. A target or metric that a person or organization seeks to meet in order to progress towards a goal.
  • Object-oriented modeling. An approach to software engineering where software is composed of components that are encapsulated groups of data and functions, which can inherit behavior and attributes from other components and which communicate with one another via messages. In some organizations, the same approach is used for business engineering to describe and package the logical components of the business.
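  A compact Python sketch of the object-oriented ideas in the entry above: encapsulated data and functions, inheritance of attributes and behavior, and components communicating by calling (messaging) one another. The Account and SavingsAccount classes are hypothetical domain examples.

      class Account:
          """Encapsulates its balance; callers interact only through methods."""
          def __init__(self, owner: str, balance: float = 0.0):
              self.owner = owner
              self._balance = balance          # internal state, not touched directly

          def deposit(self, amount: float) -> None:
              self._balance += amount

          def statement(self) -> str:
              return f"{self.owner}: {self._balance:.2f}"

      class SavingsAccount(Account):
          """Inherits attributes and behavior, then specializes them."""
          def __init__(self, owner: str, rate: float):
              super().__init__(owner)
              self.rate = rate

          def add_interest(self) -> None:
              self.deposit(self._balance * self.rate)   # message to inherited behavior

      acct = SavingsAccount("Ada", rate=0.05)
      acct.deposit(100.0)
      acct.add_interest()
      print(acct.statement())   # Ada: 105.00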
  • Operational support. A stakeholder who helps to keep the solution functioning, either by providing support to end users (trainers, help desk) or by keeping the solution operational on a day-to-day basis (network and other tech support).
  • Operative rule(s). The business rules an organization chooses to enforce as a matter of policy. They are intended to guide the actions of people working within the business. They may oblige people to take certain actions, prevent people from taking actions, or prescribe the conditions under which an action may be taken.
  • Opportunity analysis. The process of examining new business opportunities to improve organizational performance.
  • Optimization. The process of choosing the best alternative that will satisfy the needs of the stakeholders under the constraints given (e.g. cost, schedule and available technology).
  • Optionality. Defining whether or not a relationship between entities in a data model is mandatory. Optionality is shown on a data model with a special notation.
  • Organization modeling. The analysis technique used to describe roles, responsibilities and reporting structures that exist within an organization.
  • Organization. An autonomous unit within an enterprise under the management of a single individual or board, with a clearly defined boundary that works towards common goals and objectives. Organizations operate on a continuous basis, as opposed to an organizational unit or project team, which may be disbanded once its objectives are achieved.
  • Organizational process asset. All materials used by groups within an organization to define, tailor, implement, and maintain their processes.
  • Organizational readiness assessment. An assessment that describes whether stakeholders are prepared to accept the change associated with a solution and are able to use it effectively.
  • Organizational unit. Any recognized association of people in the context of an organization or enterprise.
  • Output. What is produced by a system.
  • Pair programming. A scenario where two programmers share a single workstation and work together to develop a single feature.
  • Paper prototype. A rough, often hand-sketched, drawing of a user interface, used in a usability test to gather feedback. Participants point to locations on the page that they would click, and screens are manually presented to the user based on the interactions they indicate.
  • User persona. A fictitious identity that reflects one of the user groups for whom you are designing.
  • Plan-driven methodology. Any methodology that emphasizes planning and formal documentation of the processes used to accomplish a project and of the results of the project. Plan-driven methodologies emphasize the reduction of risk and control over outcomes over the rapid delivery of a solution.
  • Planning poker. A team building exercise or game used to arrive at a group consensus for estimating workload based on the Delphi method.
  • Prioritization. The process of determining the relative importance of a set of items in order to determine the order in which they will be addressed.
  • Problem statement. A brief statement or paragraph that describes the problems in the current state and clarifies what a successful solution will look like.
  • Production stage. The stage at which the high-fidelity design is fleshed out, content and digital assets are created, and a high-fidelity version of the product is validated with stakeholders and end-users through user testing sessions. The role of the UX Designer shifts from creating and validating ideas to collaborating with developers to guide and champion the vision.
  • Progressive disclosure. An interactive design technique that helps maintain the focus of a user’s attention by reducing clutter, confusion, and cognitive workload. It improves usability by presenting only the minimum data required for the task at hand. The principle is also used in journalism’s ‘inverted pyramid’ style, learning’s ‘spiral approach’, and the game ‘twenty questions’.
  • Project charter. A document issued by the project initiator or sponsor that formally authorizes the existence of a project, and provides the project manager with the authority to apply organizational resources to project activities.
  • Project kick-off. The formally recognised start of a project.
  • Project scope. The work that must be performed to deliver a product, service, or result with the specified features and functions. See also scope.
  • Project. A temporary endeavor undertaken to create a unique product, service or result.
  • Project. An activity having goals, objectives, a beginning and an end.
  • Release plan. The plan that outlines the features to be included in an upcoming release and provides an estimated date for the release. The plan should include responsibilities, resources, and activities required to complete the release.
  • Release. The transition of an increment of potentially shippable product or deliverable from the development team into routine use by customers. Releases typically happen when one or more sprints have resulted in the product having enough value to outweigh the cost to deploy it. A release can be either the initial build of a product or the addition of one or more features to an existing product. A release should take less than a year to complete, and in some cases, may only take three months.
  • Repository. A real or virtual facility where all information on a specific topic is stored and is available for retrieval.
  • Requirement. A statement of required behavior, performance and other characteristics of the system to be developed.
  • Research stage. Often referred to as the Discovery stage. Complex projects will comprise significant user and competitor research activities, while small projects may require nothing more than some informal interviews and a survey.
  • Return on investment. A measure of the profitability of a project or investment.
  • Risk Management. A process of identifying what can go wrong and making plans that will enable a system to achieve its goals.
  • Risk. An uncertain event or condition that, if it occurs, will affect the goals or objectives of a proposed change.
  • Root cause analysis. Root cause analysis is a structured examination of an identified problem to understand the underlying causes.
  • Scenario. A narrative describing “a day in the life of” one of your personas, and probably includes how your website or app fits into their lives.
  • Scenario. An analysis model that describes a series of actions or tasks that respond to an event. Each scenario is an instance of a use case.
  • Scope model. A model that defines the boundaries of a business domain or solution.
  • Scope. The area covered by a particular activity or topic of interest. See also project scope and solution scope.
  • Scrum. The most widely used framework under the Agile umbrella. Scrum is an iterative software model that follows a set of predefined roles, responsibilities, and meetings. In Scrum, iterations are called sprints and are assigned a fixed length—sprints typically last one to two weeks, but can last as long as a month.
  • Self organization. The principle that those closest to the work best know how to do the work, so set clear goals and boundaries and let them make all tactical and implementation decisions, cf. emergence, empiricism.
  • Sitemap. A complete list of all the pages available on a website.
  • Span of control. Span of control is the number of employees a manager is directly (or indirectly) responsible for.
  • Specifications. The technical requirements for systems design.
  • Spike. A short, time-boxed piece of research, usually technical, on a single story that is intended to provide just enough information that the team can estimate the size of the story.
  • Sprint goal (aka Sprint theme). The key focus of the work for a single sprint.
  • Sprint. A fixed-length iteration during which one user story or product backlog item (PBI) is transformed into a potentially shippable deliverable. Each sprint is assigned a set amount of time to be accomplished (sometimes referred to as Timeboxing), which could be anywhere from one week to one month, but typically lasts two weeks.
  • Stakeholder analysis. The work to identify the stakeholders who may be impacted by a proposed initiative and assess their interests and likely participation.
  • Stakeholder Interviews. Conversations with the key contacts in the client organization who are funding, selling, or driving the product.
  • Stakeholder list, roles, and responsibility designation. A listing of the stakeholders affected by a business need or proposed solution and a description of their participation in a project or other initiative.
  • State diagram. An analysis model showing the life cycle of a data entity or class.
  • Strategy stage. The stage during which the brand, guiding principles, and long-term vision of an organisation are articulated. The strategy underpinning a UX project will shape the goals of the project—what the organisation is hoping to achieve with the project, how its success should be measured, and what priority it should have in the grand scheme of things.
  • Structural rule. Structural rules determine when something is or is not true or when things fall into a certain category. They describe categorizations that may change over time.
  • Survey. An online form designed to solicit feedback from current or potential users.
  • Sustainable pace. The pace that an Agile team can work at indefinitely without resulting in developer burnout (ideally 40 hours per week).
  • Swarming. Team members with the appropriate skills working together to complete a task that one team member is having trouble completing on his or her own.
  • Swimlane. The horizontal or vertical section of a process model that shows which activities are performed by a particular actor or role.
  • System Design. The identification of all the necessary components, their role, and how they have to interact for the system to fulfill its purpose.
  • System Integration. The activity of integrating all the components of a system to make sure they work together as intended.
  • System. A collection of interrelated elements that interact to achieve an objective. System elements can include hardware, software, and people. One system can be a sub-element (or subsystem) of another system.
  • System. A set of interrelated components working together to produce a desired result.
  • Systems Approach. The application of a systematic, disciplined engineering approach that considers the system as a whole and its impact on its environment, and that continues throughout the lifecycle of a project.
  • Systems Engineering. The orderly process of bringing a system into being using a systems approach.
  • Technical communication. The practice of creating easily accessible information for a specific audience.
  • Technical constraint(s). Technical constraints are limitations on the design of a solution that derive from the technology used in its implementation. See also business constraint.
  • Technical debt. The obligation a development team incurs when it uses a short-term, expedient approach to developing a software package without considering the long-term consequences. Technical debt increases project cost and complexity due to inefficiencies, inaccuracies, and other issues introduced into the software package. Poor management, incompetency, timeline pressure, or inadvertent mistakes can all contribute to technical debt.
  • Technique. Techniques alter the way a business analysis task is performed or describe a specific form the output of a task may take.
  • Temporal event. A system trigger that is initiated by time.
  • The How. A term used to describe the domain of the team, as distinct from the product owner, cf. The What. The How can also be described as tactics (i.e. how to win the battle).
  • The What. A term used to describe the domain of the product owner, as distinct from the team, cf. The How. The What can also be described as strategy (i.e. what's the best order for battles).
  • Thumb vote. A quick pulse check to get a sense of where the team stands in terms of commitment or agreement on a decision. A thumb up generally means agree, yes, or good; a thumb down means disagree, no, or bad. The analog version allows the thumb to be anywhere on the half circle to indicate differing degrees of agreement.
  • Timebox. A fixed period of time to accomplish a desired outcome.
  • Timebox. An assigned period of time during which an individual or team works toward an established goal. The team stops work when the time period concludes, rather than when work is completed. The team then assesses how much work was accomplished toward the specified goal.
  • Timeboxing. Setting a duration for every activity and having it last exactly that long (i.e. neither meetings nor sprints are ever lengthened - ever).
  • Trade-off. Losing one quality or aspect of something in return for gaining another quality or aspect.
  • Unified modeling language (UML). A non-proprietary modeling and specification language used to specify, visualize, and document deliverables for object-oriented software-intensive systems.
  • Unit testing. A short program fragment written for testing and verifying a piece of code once it is completed. A piece of code either passes or fails the unit test. The unit test (or a group of tests, known as a test suite) is the first level of testing a software development product.
  • User feedback loop. Ideas are put in front of users, who provide their feedback, which is used to refine the design, and then the process repeats.
  • User interview. Used for understanding the tasks and motivations of the user group for whom you are designing, user interviews may be formally scheduled, or just informal chats.
  • User journey. The step by step journey that a user takes to reach their goal.
  • User research. Observation techniques, task analysis, and other feedback methodologies which are used to focus on understanding user behaviors, needs, and motivations.
  • User-centred design (UCD). A design process during which the needs of the user are considered at all times. Designers consider how a user is likely to use the product, and they then test the validity of their assumptions in real-world tests with actual users.
  • Validation. Testing to ensure that the created system actually provides the value intended to its stakeholders. (Did we build the right system?).
  • Value. The benefit enjoyed by the stakeholders of the system when the system is in operation.
  • Velocity. A metric that specifies how much work a team is able to complete within a single, fixed-length iteration or sprint.
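  Velocity, as defined above, is commonly computed as the average number of story points completed per sprint and then used to forecast how many sprints remain; the figures in this Python sketch are invented for illustration.

      import math

      def velocity(points_per_sprint: list[int]) -> float:
          """Average story points completed per sprint."""
          return sum(points_per_sprint) / len(points_per_sprint)

      def sprints_remaining(backlog_points: int, points_per_sprint: list[int]) -> int:
          return math.ceil(backlog_points / velocity(points_per_sprint))

      completed = [18, 22, 20]                  # points completed in the last three sprints
      print(velocity(completed))                # 20.0
      print(sprints_remaining(90, completed))   # 5 sprints left at this pace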
  • Verification. The process of proving that a finished product meets specifications and requirements. (Did we build the system right?)

The successor lecture is Project Management Quarter.