| Term | Definition |
| --- | --- |
| Acceptance Test | Final functional testing used to evaluate the state of a product and determine its readiness for the end-user. A ‘gateway’ or ‘milestone’ which must be passed. |
| Acceptance Criteria | The criteria by which a product or system is judged at Acceptance Test. Usually derived from commercial or other requirements. |
| Alpha | The first version of a product in which all of the intended functionality has been implemented, but the interface has not been completed and bugs have not been fixed. |
| ATP | Approval to Proceed; management approval that a project may go on to the next phase |
| Audit | Review of a project to assess compliance with requirements, specifications, baselines, standards, procedures, instructions, codes, contract requirements, and/or license requirements |
| Baseline | A snapshot of part of a project plan at a particular point in time. A “schedule baseline”, for example, is a snapshot of the schedule at that point in time; later progress can be compared against it. |
| Beta | The first version of a product where all of the functionality has been implemented and the interface is complete, but the product still has problems or defects. |
| Big-Bang | The implementation of a new system “all at once”; differs from incremental development in that the transition from old to new is (effectively) instantaneous |
| Black Box Testing | Testing a product without knowledge of its internal workings; outputs are compared to expected results to verify the operation of the product. Contrast with White/Glass Box Testing (see the testing sketch after this table). |
| Bottom Up | Building or designing a product from elementary building blocks, starting with the smaller elements and evolving into a larger structure. See “Top Down” for contrast. |
| Change (as in Change Control, below) | Any alteration of the functional or physical characteristics of a project work product. This includes both defect repairs and enhancements |
| Change Control | Process by which a change is proposed, evaluated, approved or rejected, scheduled, and tracked to completion |
| CM | Configuration Management (also known as SCM, Software Configuration Management) |
| Commitment | Pact between two or more people who trust each other to perform; commitments are freely assumed, explicitly defined, and visible |
| Configuration | Functional and physical characteristics of hardware or software as set forth in technical documentation or archived in a product; requirements, design, and implementation that define a particular version of a system or system component |
| Critical Path | The sequence of dependent tasks that determines the shortest possible duration of a phase or a project; any delay to a task on the critical path delays the whole project. See the scheduling sketch after this table. |
| Deliverable | A tangible, physical thing which must be “delivered” or completed at a milestone. The term is used to imply a tactile end-product amongst all the smoke and noise. |
| End-user | The poor sap that gets your product when you’re finished with it! The people who will actually use your product once it has been developed and implemented. |
| Feature/Scope creep | The relentless tendency of a project to self-inflate and take on more features or functionality than was originally intended. Also known as ‘scope creep’. |
| Impact | The relative harm or damage to a project if a risk becomes a problem, usually expressed either as a dollar amount or on a scale from 1 to 10 |
| Incremental development | The development of a product in a piece-by-piece fashion, allowing a gradual implementation of functionality without having the whole thing finished. |
| Independent Audit | Independent review of a project by an outside agency or team separate from the organization responsible for the project, to assess compliance with product requirements, specifications, baselines, standards, procedures, instructions, codes, contractual requirements, and/or licensing requirements |
| Independent Verification and Validation (IV&V) | Verification and validation (see entries elsewhere in this Glossary) performed by an organization that is technically, managerially, and financially independent of the development organization |
| Issue | Any area of concern that presents an obstacle to achieving project objectives |
| Lessons Learned Session | Same as Post Project Review |
| Major Information Resources Project | Defined in the General Appropriations Act as any information resources technology project identified in an agency operating plan whose development costs exceed $1,000,000 and that meets one or more additional criteria set out in the Act |
| Milestone | Scheduled event used to measure progress in a project; a significant point in a project schedule which marks the delivery of a substantial portion of the project. Normally associated with a particular “deliverable”. |
| Milestone Review | Formal review of management and technical progress of a project |
| Not Invented Here (NIH) | The attitude of resisting anything that was not invented or derived by the using organization or person |
| Process Assets Database | An organization’s collection of defined policies, processes, procedures, and templates. This may include structured collections of lessons learned on projects. |
| Project | A temporary activity characterized by having a start date, specific objectives and constraints, established responsibilities, a budget, a schedule, and a completion date |
| Project Completion Review | Same as Post Project Review |
| Project Development Plan | Document describing the approach that will be taken for a project; typically describes the work to be done, resources required, methods to be used, configuration management and quality assurance procedures to be followed, schedules to be met, and the project organization. The plan is required for all projects, but is only submitted to the Quality Assurance Team when requested. The plan will be used by the Team to analyze the status of the project. Amendments to the plan may trigger a reassessment of risk and monitoring levels |
| Project History Database | An organization’s collection of reusable data about individual projects; generally information about plans and the actual results at project completion |
| Project Management | System of procedures, practices, technologies, and know-how that provides the planning, organizing, staffing, directing, and controlling necessary to successfully manage a project |
| Project Postmortem | Same as Post Project Review |
| Prototype | A simple model of a product which is used to resolve a design decision in the project. |
| Quality Assurance (QA) | The process of preventing defects from entering a product through best practice. Not to be confused with testing, which is done to remove defects already present. |
| Quality Assurance Team (QAT) | The QAT is composed of representatives from the Department of Information Resources and the State Auditor’s Office. The Team is responsible for reviewing, approving, and overseeing major information resources projects. |
| Requirement | A statement of need from a project stakeholder which identifies a required attribute of the system or product to be delivered |
| Risk | The possibility of an act or event occurring that would have an adverse effect on the state, an organization, or an information system. Risk involves both the probability of failure and the possible consequences of a failure |
| Risk Exposure | The level of loss presented to an organization by a risk; the product of the likelihood that the risk will occur and the magnitude of the consequences of its occurrence. See the risk exposure sketch after this table. |
| Risk Factor | An element of project development and management that is used to evaluate a project. It is an element that has the potential to affect the success or failure of the project. Risk factors can be both internal and external to the agency. Each risk factor should be addressed and controlled as much as feasible by the project management team |
| Risk Management | A process used to identify potential problems before they occur, so that actions can be taken to reduce or eliminate the likelihood or impact of these problems should they occur |
| Risk Mitigation | Actions taken to reduce the likelihood of a risk becoming a problem, or to reduce the impact if it does occur |
| ROI | “Return On Investment” – a ratio which compares the monetary outlay for a project to the monetary benefit. Used to show the success of a project (see the ROI sketch after this table). |
| Scheduling | Determining the start and stop time of each activity and task in the project, taking into account the precedence relations among tasks, the dependencies of tasks on external events, the required milestone dates, and the resources available |
| Show Stopper | A defect that is so serious it literally stops everything. Normally given priority attention until it is resolved. Also known as a “critical” issue. |
| Software Acquisition Management (SAM) | The actions taken by management with a supplier or subcontractor in the process of acquiring software |
| Software Configuration Management (SCM) | A discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of configuration items, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements |
| Software Quality Assurance (SQA) | A process by which an organization determines that the software it produces and/or acquires satisfies the organization’s technical and administrative performance requirements, is relatively free from discrepancies, and meets user needs. SQA must be part of an organization’s culture to ensure all of its products and services are of the highest quality |
| Stakeholder | Any individual or group who is affected by, or can affect, the project or its outcome |
| Standard | Approved, documented, and available set of criteria used to determine the adequacy of an action or object |
| Testing | The process of critically evaluating a product to find flaws and to determine its current state of readiness for release |
| Top Down | Building or designing a product by constructing a high level structure and then filling in gaps in that structure. See “Bottom Up” for contrast. |
| Usability | The intrinsic quality of a product which makes it simple to use and easy to understand and operate. Often described as the ability of a product to not annoy the user. |
| Validation | Determining the correctness of a work product, with respect to the user’s needs and requirements (Is this the right product?) |
| Verification | Determining whether the products of a given phase of the life cycle meet the requirements established during the previous phase (Are we building the product right?) |
| White/Glass Box Testing | Testing a product with an understanding of how it works. See Black Box Testing. |
| Work Breakdown Structure (WBS) | The complete list of activities that need to be done for a project, used for estimation and scheduling the work |
| Work Product | Any tangible item that results from working on a project function, activity, or task |
Source: DIR, Nick Jenkins
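
The Critical Path and Scheduling entries describe a forward-pass calculation over task precedence. The sketch below is a minimal illustration, not a full critical path method implementation; the task names, durations, and dependencies are hypothetical and chosen only to make the output readable.

```python
# Minimal forward/backward pass over a hypothetical task graph.
# Durations are in days; "predecessors" encode the precedence relations.
tasks = {
    "design":  {"duration": 5,  "predecessors": []},
    "build":   {"duration": 10, "predecessors": ["design"]},
    "test":    {"duration": 4,  "predecessors": ["build"]},
    "docs":    {"duration": 3,  "predecessors": ["design"]},
    "release": {"duration": 1,  "predecessors": ["test", "docs"]},
}

# Forward pass (scheduling): earliest start/finish for each task.
earliest = {}
for name in tasks:  # dict order here already respects precedence
    start = max((earliest[p][1] for p in tasks[name]["predecessors"]), default=0)
    earliest[name] = (start, start + tasks[name]["duration"])

project_length = max(finish for _, finish in earliest.values())

# Backward pass: latest start/finish without delaying the project.
latest = {}
for name in reversed(list(tasks)):
    successors = [s for s, t in tasks.items() if name in t["predecessors"]]
    finish = min((latest[s][0] for s in successors), default=project_length)
    latest[name] = (finish - tasks[name]["duration"], finish)

# Tasks with zero slack (latest start == earliest start) form the critical path.
critical_path = [n for n in tasks if latest[n][0] == earliest[n][0]]
print(f"Project length: {project_length} days")
print("Critical path:", " -> ".join(critical_path))
```

With these made-up numbers the critical path is design -> build -> test -> release; the "docs" task has slack, so it can slip without moving the end date.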
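
The Impact and Risk Exposure entries reduce to a simple product: exposure = probability of occurrence x magnitude of consequence. A minimal sketch follows; the risks, probabilities, and dollar impacts are invented purely for illustration.

```python
# Risk exposure = probability of occurrence x magnitude of consequence (here, dollars).
# All figures below are hypothetical.
risks = [
    {"name": "key developer leaves", "probability": 0.10, "impact": 80_000},
    {"name": "requirements change",  "probability": 0.50, "impact": 30_000},
    {"name": "vendor delivers late", "probability": 0.30, "impact": 20_000},
]

for risk in risks:
    risk["exposure"] = risk["probability"] * risk["impact"]

# Rank by exposure so mitigation effort goes to the biggest threats first.
for risk in sorted(risks, key=lambda r: r["exposure"], reverse=True):
    print(f'{risk["name"]:<22} exposure = ${risk["exposure"]:,.0f}')
```

Expressed this way, a low-probability, high-impact risk and a high-probability, low-impact risk can be compared on the same scale.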
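
The ROI entry is likewise a single ratio; one common formulation is net benefit divided by cost. A worked sketch with hypothetical figures:

```python
# ROI compares what the project cost to what it returned. Figures are hypothetical.
project_cost = 250_000       # total monetary outlay
monetary_benefit = 340_000   # benefit attributed to the project

roi = (monetary_benefit - project_cost) / project_cost
print(f"ROI = {roi:.0%}")    # 36%: the project returned 36 cents per dollar invested
```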
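
Finally, the Black Box and White/Glass Box Testing entries describe the same activity performed without and with knowledge of the internals. The sketch below uses a small, hypothetical discount function to show the difference: the black-box test checks inputs against the specified outputs only, while the white-box test deliberately targets the boundary the implementation branches on.

```python
# A small, hypothetical function under test.
def discount(order_total: float) -> float:
    """Return the discounted total: 10% off orders of 100 or more."""
    if order_total >= 100:
        return order_total * 0.9
    return order_total

# Black-box tests: driven purely by the specification (inputs vs. expected outputs),
# with no reference to how discount() is written.
assert discount(50) == 50
assert discount(200) == 180

# White/glass-box tests: written with the implementation open in front of you,
# deliberately exercising the >= 100 boundary that the code branches on.
assert discount(99.99) == 99.99
assert discount(100) == 90

print("all tests passed")
```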






