Tuesday, 17 December 2013

EVALUATION (iii) [3 marks]

(iii) Desirable extensions [3 marks] 

Having completed the system, the candidate should identify the good and bad points of the final system, highlighting any limitations and necessary extensions, and indicating how those extensions could be carried out. 

3 marks The candidate clearly portrays the good and bad points of the system indicating the limitations, possible extensions and how to carry out the extensions. 

2 marks The candidate clearly identifies good and bad points and any limitations. 

1 mark The candidate identifies the obvious good points of the system and possibly some bad points or limitations. 

EVALUATION (ii) [3 marks]

(ii) Evaluate the user’s response to the system [3 marks] 
The user should not be assumed to be an expert in computer jargon, so some effort must be made to ensure that the system is user-friendly. It can, however, be assumed that the user has considerable knowledge of the underlying theory of the business or area being computerised. 

Clarity of menus, clear on-screen help and easy methods of inputting data are all examples of how the system can be made user-friendly. Here marks are awarded for the degree of satisfaction that the user indicates in the acceptance procedure. 
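
By way of illustration only (the menu options, prompts and function name below are invented, not part of the mark scheme), a minimal Python sketch of the kind of clear menu, on-screen help and validated input being described might look like this:

```python
def choose_option():
    """Display a numbered menu and keep prompting until a valid choice is made."""
    options = ["Add record", "Search records", "Print report", "Quit"]
    while True:
        print("\nMain menu")
        for number, text in enumerate(options, start=1):
            print(f"  {number}. {text}")
        choice = input("Enter a number (or ? for help): ").strip()
        if choice == "?":
            # On-screen help, so the user never has to reach for a manual.
            print("Type the number shown next to the action you want, then press Enter.")
            continue
        if choice.isdigit() and 1 <= int(choice) <= len(options):
            return int(choice)
        # Reject invalid input politely rather than crashing or using jargon.
        print(f"'{choice}' is not a valid choice - please enter 1 to {len(options)}.")
```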

Could the system or its results be used? Was the system specification achieved? Do any system faults still exist? The candidate should evaluate the user’s response to the final version of the system. 

3 marks The user indicates that the system could be used but there are some faults which need to be rectified. The candidate provides a detailed discussion of how these inadequacies may be dealt with. 
OR 
A fully functional user-friendly system has been produced. The user indicates that the system fully meets the specification given in section (a), and there are no known faults in the system. 

2 marks The system is, in the main, user-friendly, but there is room for improvement (eg no on-screen help has been provided). The user indicates that the system could be used but there are some faults which need to be rectified. The candidate has made some limited attempt to discuss how these inadequacies may be dealt with. 

1 mark The system does not meet the design specification and the end user is not able to make use of the system.

EVALUATION (i) [4 marks]

(i) Discussion of the degree of success in meeting the original objectives [4 marks] 

This discussion should demonstrate the candidate’s ability to evaluate the effectiveness of the completed system. The original objectives stated in the requirements specification should be matched against what was actually achieved, taking into account the limitations. User evaluation is also essential and should be based on direct feedback from the end user. 

3–4 marks A full discussion, taking each objective mentioned in (b) (i) in turn and explaining the degree of success in meeting it (indicating where in the project evidence can be found to support this), or the reasons why it was not met. 

1–2 marks Some discussion about a number of objectives, but some omissions or inadequate explanation of success or failure. 

0 marks No discussion present.

Thursday, 12 December 2013

DOCUMENTATION [10 marks]

Candidates should be able to: 
a. develop detailed and appropriate user documentation. 

Quality of Written Communication is assessed in this documentation. Much of the technical documentation will have been produced as a by-product of design and development work, and also as part of writing up the report to date. The software solution should also include sufficient on-screen help to enable the end user to make use of the system. Some additional supporting documents will be necessary, including initial set-up, getting-started and troubleshooting guides, to ensure the end user can implement the solution. 

8–10 marks Candidates will provide detailed and accurate documentation. The documentation will be well presented, in a structured and coherent format. The documentation will cover all aspects of the system, with no omissions, including installation, typical use, troubleshooting and backup. The on-screen help and supplementary documentation make a complete guide to the solution and are well presented and easy to follow. Subject-specific terminology will be used accurately and appropriately. There will be few, if any, errors of spelling, grammar and punctuation. 

4–7 marks Candidates will provide clear documentation. The documentation will be well presented. There is clear on-screen support to enable the end user to use the system. The supporting documentation and on-screen help are well presented and cover most aspects of the system operation with only one or two omissions, eg troubleshooting or backup. Some subject-specific terminology will be used. There may be occasional errors of spelling, grammar and punctuation. 

1–3 marks Candidates will provide superficial documentation, with weak supplementary user documentation covering few aspects of the system. The information will be poorly expressed and limited technical terms will be used. Errors of grammar, punctuation and spelling may be intrusive. 

Tuesday, 10 December 2013

TESTING [14 marks]

Develop a software solution and, using the test plan developed in 3.4.2: Design, show that the system works with valid, invalid and borderline data (or, if it does not, under which circumstances it fails). 

Test plan clearly cross-referenced to evidence that the system has been tested during development and implementation. 

Evidence of user testing. 
• Software development 
• Alpha testing 
• Response to the results of testing 
• Beta testing 
• Response to the results of beta testing 
• Modularisation of code 
• Code documentation 
• Use of modules, data structures and objects 
• In-code documentation 
• Code structure 
• Applying the test plan and data 

An attempt should be made to show that all parts of the system have been tested, including those sections dealing with unexpected or invalid data as well as extreme cases. Showing that many other cases of test data are likely to work – by including the outputs that they produce – is another important feature. Evidence of testing is essential. 
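
As a sketch of what such evidence might look like when generated programmatically (the validate_mark function and its 0–100 range are hypothetical examples, not taken from the mark scheme), each test case pairs data from one category with its expected result, and the actual output is recorded alongside it:

```python
def validate_mark(mark):
    """Return True if mark is a whole number between 0 and 100 inclusive."""
    return isinstance(mark, int) and 0 <= mark <= 100

# Test cases paired with expected results: valid, extreme (boundary) and invalid data.
test_cases = [
    (50, True),     # valid: typical value
    (0, True),      # extreme: lower boundary
    (100, True),    # extreme: upper boundary
    (-1, False),    # invalid: just below the range
    (101, False),   # invalid: just above the range
    ("50", False),  # invalid: wrong type (unexpected data)
]

for value, expected in test_cases:
    actual = validate_mark(value)
    status = "PASS" if actual == expected else "FAIL"
    print(f"{status}: validate_mark({value!r}) -> {actual}, expected {expected}")
```

The printed PASS/FAIL lines are exactly the kind of output that can be captured as hard-copy evidence of testing.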

The beta testing should cover all aspects of the test plan produced in the design section, which should in turn cover all aspects of the design specification. The examiner must be left in no doubt that the system actually works in the target environment. 

This evidence may be in the form of hard copy output (possibly including screen dumps), photographs or any format that does not require access to any specific hardware or software. The end user(s) must be involved in this process and evidence of end-user testing is required. 

11–14 marks The testing covers as many different paths through the system as is feasible, including valid, invalid and extreme cases. The testing covers all aspects of the design specification and the test plan from the design section. There is clear evidence of end-user testing. 

8–10 marks There is evidence of testing covering most aspects of the design specification but with omissions, eg test data does not include erroneous data for all tests, or there is limited evidence of end-user testing. 

Friday, 15 November 2013

Development [16 marks]

Development must include:
a. develop the rules/methods/algorithms of a design using a programming language; 
b. develop the data structures of the design using the appropriate features of a programming language; 
c. develop inputs/outputs using the features of a programming language; 
f. produce suitable modular code with full annotation and a description of how the modules combine to create the solution; 

Mark bands:
A technical description of how the solution relates to the design specification produced and agreed with the user should be included. It is the responsibility of the candidate to produce evidence of their development work. This section must show how the candidate tested each section during development and the responses to this alpha testing. The code must be documented adequately to explain its function, and there must be clear evidence of how modular code has been used to develop the final solution.
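
A minimal sketch of what annotated modular code with in-development (alpha) testing might look like; the module names and the stock-total example are assumptions for illustration, not part of the specification:

```python
def read_quantities(path):
    """Input module: read one integer quantity per line from a text file."""
    with open(path) as source:
        return [int(line) for line in source if line.strip()]

def total_stock(quantities):
    """Processing module: combine the individual quantities into one total."""
    return sum(quantities)

def report(total):
    """Output module: present the result to the end user in plain language."""
    print(f"Total items in stock: {total}")

if __name__ == "__main__":
    # Alpha test carried out during development: known input, known expected output.
    assert total_stock([3, 5, 2]) == 10, "total_stock failed its alpha test"
    report(total_stock([3, 5, 2]))
```

Keeping input, processing and output in separately annotated modules makes it straightforward to describe how the modules combine, and an inline assertion like the one above is one simple way of recording that a module passed its alpha test.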

13–16 marks 
There is complete evidence showing how the solution was developed, using suitable alpha testing at each stage to inform the process. The modular code is fully annotated, indicating clearly the purpose of each section and the interrelationship between the sections. The developed solution fulfils all of the design specification. 

9–12 marks 
Program listings are provided in the form of printouts. Data structures are illustrated as part of the listings where appropriate, detailing their purpose. There is sufficient annotation evident to illustrate how the solution was developed for a particular purpose and indicate the purpose of sections of code. The code will be modular and there will be good evidence to show how testing was used during the development process to inform each stage. The developed solution fulfils the design specification but there are some minor flaws in the solution. 

5–8 marks 
Program listings are provided in the form of printouts. Data structures are illustrated as part of the listings where appropriate, detailing their purpose. There is some annotation evident to illustrate how the solution was developed and some limited evidence that some testing took place during development. The developed solution has significant flaws and only partially fulfils the design specification. The code may be linear, but with some annotation indicating how the code relates to the problem and some limited evidence of alpha testing. 

1–4 marks 
Program listings are provided in the form of printouts but with no annotation or evidence of alpha testing. The developed solution does not fulfil the design specification. There is some evidence of system development. 

Tuesday, 15 October 2013

Test strategy [5 marks]

A detailed test strategy and plan, together with appropriate test data, should be developed and documented. It is vital to produce test cases and to show that they work. To do this, it is necessary not only to have test data, but to know what the expected results are with that data. 
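
One hedged illustration of such a plan, held alongside the code so that each test documents its purpose, category and expected result; the password rule and all field names here are invented for the example:

```python
# A minimal sketch of a documented test plan held as data; the field names and the
# example system (a password checker) are assumptions, not part of the mark scheme.
test_plan = [
    {"test_no": 1, "purpose": "accept a typical password",   "category": "normal",
     "data": "Sunshine9", "expected": True},
    {"test_no": 2, "purpose": "accept the minimum length",   "category": "extreme",
     "data": "Abcdef12", "expected": True},
    {"test_no": 3, "purpose": "reject a too-short password", "category": "abnormal",
     "data": "Ab1", "expected": False},
]

def password_ok(password):
    """Hypothetical rule under test: at least 8 characters including a digit."""
    return len(password) >= 8 and any(ch.isdigit() for ch in password)

for test in test_plan:
    actual = password_ok(test["data"])
    outcome = "as expected" if actual == test["expected"] else "UNEXPECTED"
    print(f"Test {test['test_no']} ({test['category']}): {test['purpose']} -> {outcome}")
```

Recording the expected result next to each test case is what makes it possible to say, once the plan is applied, whether each test actually passed.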

5 marks A detailed test strategy and plan covering all aspects of the system with data to test under normal, extreme and abnormal circumstances. 
3–4 marks A detailed test strategy and a plan covering several aspects of the system but with inadequate data to effectively test the system, eg data covers only normal circumstances or covers only a limited part of the design specification. 
1–2 marks A vague discussion of how the system might be tested.

Must include: 
d. identify, develop and document a test strategy for the design; 
e. select suitable test data for the design.