• Personnel responsible for processing data should be independent of those responsible for input and output, so as to maintain data integrity.
• Processed data should be written only to the transaction or master files, so that its destination can be verified from those files.
• Transactions, once processed, cannot be processed again, duplicated, or improperly changed.
• Processing errors should be identified and corrected on a timely basis, to avoid the unnecessary future cost of rectifying problems that have been allowed to grow.
• To ensure programs are not altered in any way, personnel in the processing department should have no access to the program code.
• There should be recovery procedures for use in the event of power failure, so that the processing function is not left 'hanging', as this creates room for manipulation.
• There should be provision for offsite processing in the event of a disaster.
– Programmed sequence checking ensures the completeness of input in a timely fashion: each item of input is verified as having been entered into the system.
– Programmed matching of input to a control file containing details of expected input.
– For accuracy, control techniques such as batch totals are used; batch totals also help to ensure completeness.
– Programmed check digit verification, wherein a check digit included in a reference number is arithmetically checked to ensure it bears the required relationship to the rest of the number.
– Summary processing ensures completeness and accuracy: for example, in a depreciation computation, the charge calculated on the total asset value is compared with the sum of the charges calculated on the individual asset values.
– Record counts and hash totals are techniques used to ensure the continued correctness of master files and the standing data they contain.
– Programmed reasonableness checks (for input), including checking the logical relationship between two or more files.
– Programmed existence checks against valid codes.
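The check digit verification described above can be sketched as follows. This is an illustrative modulus-11 scheme (one common choice; the weighting and the function names `mod11_check_digit` and `verify_reference` are assumptions, not taken from the text): the check digit is derived arithmetically from the other digits, so a transposed or mistyped digit breaks the required relationship and the reference is rejected.

```python
def mod11_check_digit(base: str) -> str:
    """Compute a modulus-11 check digit for a numeric reference (illustrative)."""
    # Weight digits from the rightmost position, starting at weight 2.
    total = sum(int(d) * w for d, w in zip(reversed(base), range(2, 2 + len(base))))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)  # 10 is conventionally written as X

def verify_reference(ref: str) -> bool:
    """Check that the last character of ref is the correct check digit."""
    base, digit = ref[:-1], ref[-1]
    return mod11_check_digit(base) == digit
```

For example, `verify_reference("123455")` passes, while the transcription error `"123454"` fails.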
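The batch total, record count and hash total controls above can be sketched together. This is a minimal illustration with invented data and names (`batch_controls`, the `account`/`amount` fields): the same three figures are established when the batch is input and recomputed after processing, and any difference signals lost, duplicated or altered records. The hash total (here, the sum of account numbers) has no business meaning in itself; it exists only as a control figure.

```python
transactions = [
    {"account": 1001, "amount": 250.00},
    {"account": 1002, "amount": 99.50},
    {"account": 1003, "amount": 410.25},
]

def batch_controls(records):
    """Compute the three control figures for a batch of records."""
    return {
        "record_count": len(records),                                # completeness
        "value_total": round(sum(r["amount"] for r in records), 2),  # accuracy
        "hash_total": sum(r["account"] for r in records),            # meaningless sum, control only
    }

expected = batch_controls(transactions)  # established when the batch is input
actual = batch_controls(transactions)    # recomputed after processing
assert expected == actual, "Batch out of balance: records lost, duplicated or altered"
```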
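The existence and reasonableness checks above might look like the following sketch. The code set, field names and limits are hypothetical examples, not from the text: an existence check rejects any input whose code is not on the file of valid codes, and a reasonableness check flags values outside a plausible range.

```python
# Hypothetical file of valid department codes against which input is checked.
VALID_DEPT_CODES = {"FIN", "OPS", "HR"}

def existence_check(code: str) -> bool:
    """Accept input only if its code exists on the valid-codes file."""
    return code in VALID_DEPT_CODES

def reasonableness_check(hours_worked: float) -> bool:
    """Flag values outside a plausible range (limits are illustrative)."""
    return 0 <= hours_worked <= 80
```

So `existence_check("FIN")` passes while an unknown code such as `"ZZZ"` is rejected, and `reasonableness_check(400)` would flag a likely keying error in an hours field.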