This section of the guidelines provides guidance on the benchmarks and quality assurance measures that should be in place for digitisation projects.


What are benchmarks?

Benchmarks outline the level of quality that a back-capture digitisation project is expected to attain. Benchmarks should define image quality as well as other quality measures that need to be in place. These should be defined during the planning stages of a project and agreed to by all relevant stakeholders.

Benchmarks can differ from project to project. Their rigour can depend on the purpose and business drivers for a project and the records that are being digitised.

For example:

It is important to consider more rigorous benchmarks if the original paper records are going to be destroyed. The digital image will need to be relied upon as evidence of the business that occurred. If the records are high risk (e.g. regularly required in court) or need to be kept long term or as archives, the best achievable benchmarks should be set so the images can withstand time and migrations.

By defining and documenting your benchmarks, your organisation can have clear and defensible proof that it considered ways to protect the records.
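Recording agreed benchmarks in a structured, machine-readable form can also make later automated checks easier. The sketch below is a minimal illustration in Python; the class, field names and default values are assumptions, not prescribed benchmarks.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ImageBenchmark:
    """Agreed image quality benchmarks for a digitisation project.

    All field names and values are illustrative; substitute the
    benchmarks agreed with your stakeholders during planning.
    """
    file_format: str = "TIFF"
    bit_depth: int = 24          # bits per pixel for colour masters
    resolution_dpi: int = 300    # minimum capture resolution
    compression: str = "none"    # lossless or none for masters
    colour_managed: bool = True  # capture devices calibrated to a profile

# A more rigorous benchmark might be set for records kept as archives:
archival_benchmark = ImageBenchmark(resolution_dpi=400)
```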


What is quality assurance?

Quality assurance involves checking digitisation processes and digital images against benchmarks to ensure that the benchmarks:

  • are correct or suitably applied
  • are being met in practice.

The degree and nature of quality assurance should be defined early in a project. Benchmarks and quality assurance checks should be tested and refined in pilot projects to ensure they produce acceptable results. Quality assurance checks should be carried out periodically during a project in accordance with defined procedures.

Quality checking must be completed before the images are used in business processes and before the original paper records are destroyed.


Why are benchmarks and quality assurance needed?

Robust benchmarks and quality assurance measures can help you to ensure that:

  • original paper records are handled correctly and are undamaged during digitisation processes
  • original paper records are lawfully destroyed (when appropriate)
  • digital images are accurate, complete and legible and of sufficient quality to be ‘fit for purpose’ with their essential characteristics preserved
  • metadata to control and manage digital images is adequate
  • your organisation is able to prove that robust and trusted systems and processes were used to produce digital images if their authenticity as evidence is questioned.

Note: The extent to which benchmarks and quality assurance are defined and documented will depend on a project's aims and the risks associated with the records being digitised.


Assign responsibility and train staff

It is essential that all staff members are aware of their roles and responsibilities for benchmarking and quality assurance. Roles and responsibilities may include:

  • Setting benchmarks, testing them, and defining and testing the processes that will enable benchmarks to be met: usually the project manager, in liaison with relevant stakeholders, service providers etc.
  • Undertaking defined processes so that benchmarks are met: staff members involved in digitisation.
  • Undertaking quality assurance checking of digital images: staff members (these may be different to the staff members who undertook the digitisation).
  • Undertaking additional quality assurance checking of digital images to promote consistency: supervisory staff (this may be required where a number of staff members are involved in quality assurance).
  • Checking that other program benchmarks are met: the project manager.

Staff members need to be adequately trained and supported to meet their roles and responsibilities. Much quality assurance work is subjective, so training is needed to achieve consistent results. See Staffing digitisation projects for more information.

Note: If your organisation outsources digitisation, defining your benchmarks clearly and including these in tender and contract documentation can help to ensure that service providers deliver the required quality.

Some quality assurance may be undertaken by service providers in accordance with agreed parameters. It is essential that some quality checking is also undertaken by your organisation at regular intervals to ensure quality standards are maintained.


Process benchmarks and checks

Your organisation should establish procedures for the digitisation process and make them available to relevant staff. These should outline exactly what steps need to be taken to meet agreed quality benchmarks and any acceptable variations from normal processes.

It is likely that amendments will be made to processes during implementation. These should result in changes to procedures that are made known to staff.

Digitisation procedures (or separate quality assurance procedures) should outline the nature, degree and regularity of quality assurance measures. The procedures should be approved by senior management. For more information see Policies, procedures, standards and documentation.

Your organisation may also choose to make its processes and procedures subject to a quality assurance review at a later date in the project.

If outsourcing, relevant process benchmarks should be communicated to service providers.


Equipment benchmarks and checks

Your organisation will need to make some decisions regarding the image quality and technical standards required before purchasing digitisation equipment. These requirements can also serve as useful criteria for assessing which hardware and software to purchase. See Benchmarks for digital image quality.

Once purchased, equipment needs to be checked periodically, cleaned and tested to ensure that hardware and software are well maintained and operating effectively. Otherwise, digital image quality may suffer.

For example:

Scanners become dirty over time with continual use. If they are not kept clean, they can introduce lines into digital images.

Scanners should be correctly calibrated to ensure that the digital images can meet quality benchmarks and existing standards.[1]  They will need to be re-calibrated at various intervals.

The frequency of checks will depend on the volume of use. Checks should be documented.

If the digitisation is outsourced, the service provider will determine what equipment is needed to meet your image quality benchmarks.


Benchmarks for digital image quality

The complexity and degree of rigour required for benchmarks for digital image quality should be based on an assessment of:

  • what will be fit for purpose, i.e. what benchmarks will ensure identified business and/or stakeholder needs are met
  • the particular characteristics of the records in question and which of these are essential to reproduce.

Images are fit for purpose

Your organisation should consider the reasons for a digitisation project and how business, stakeholder and accountability needs may be met. This will assist you in determining the degree of quality required.

Sometimes you will be able to meet these needs using lower quality images.

For example:

An organisation undertook a back-capture digitisation project simply to satisfy external stakeholder needs for access. The digital images were not going to be used in business processes or replace the original paper records as evidence of past business. Access was to be provided via the Internet. As a result the organisation chose low quality benchmarks so the images could be delivered easily and quickly over the Internet to stakeholders.

At other times optimal quality is required.

For example:

One program involved the digitisation of maps and plans containing very fine detail. The business identified that meaningful colour coding and detailed notations (including corrections and references), made in both pen and pencil on the maps and plans themselves, had to be identifiable and clearly legible if the digitised images were to replace the physical maps and plans for the purposes of access and use. Access was to be provided via the internet to staff and external users of the records. To ensure that the digitisation of these maps and plans captured all the required details (both colour and notation), the optimal quality of digitisation was necessary.

If the digital images are required to satisfy many purposes, it is important that the benchmarks set are able to meet them all. In this case it may be advisable to create a master with more rigorous benchmarks, and produce derivatives from it that are of lower quality to meet other needs.

For example:

If your organisation does not intend initially to destroy original paper records but may do so in the future, it should opt for higher quality masters. If image quality is low, re-digitisation may be required before originals can be destroyed.

Where records are required as State archives, your organisation must contact Museums of History NSW to discuss suitable benchmarks for image quality for masters.

A note on enhancements

Image enhancement techniques (e.g. sharpening, clipping of highlights or shadows, blurring to eliminate scratches, and spotting or de-speckling) may be employed to make an image more exactly resemble the original. However, benchmarks should document acceptable changes, and these must be applied consistently. If they are not, your organisation may be open to challenges that the digital images are not authentic representations of the original paper records.[2]

Images reproduce the essential characteristics of the records

You should also examine the particular original records that are to be digitised and determine the essential characteristics that need to be maintained and present in the digital image.

Essential characteristics are the elements of a record that are vital to reproduce in order for the record to retain its meaning and/or evidential value.

For example:

An organisation had a particular group of records, often required in court, where colours were vital to understanding annotations on the records, e.g. green pen meant something different from red pen. In this case colour was an important essential characteristic of the records that needed to be reproduced. This affected decisions regarding hardware and software and technical (colour management) requirements.

The following benchmarks for image quality should be defined, with some questions to consider for each:

Technical specifications

See Technical specifications for more information.

  • What file formats, bit depth, resolution, compression, colour management etc. are required?
  • Can these be applied to capture the essential characteristics of all records? If variations are required for some groups of records, what are they?
  • Can hardware and software ensure these can be met?
  • How will these be checked for quality assurance?

Metadata requirements

See Metadata requirements for more information.

  • What metadata needs to be collected for all uses of the digital images?
  • Can one metadata set be applied to all records? If variations are required for some groups of records, what are they?
  • Can metadata be captured automatically?
  • How will the content and structure of metadata properties be checked for quality assurance?

Some technical aspects of the quality of images can be evaluated using software.

For example:

Noise in digital images is caused by random pixel fluctuations, and may make images appear grainy. Software can be used to measure the level of noise in images, to check that it is minimised to an acceptable level.[3]
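A minimal sketch of such a measurement using Pillow and NumPy: it estimates noise as the standard deviation of grey levels in a patch that should be uniform (e.g. the blank border of a scanned test target). The file name, patch coordinates and threshold in the comment are illustrative assumptions.

```python
import numpy as np
from PIL import Image

def patch_noise(image_path, box):
    """Estimate noise as the standard deviation of grey levels in a
    nominally uniform patch, given as a (left, top, right, bottom) box."""
    grey = Image.open(image_path).convert("L")   # convert to greyscale
    patch = np.asarray(grey.crop(box), dtype=float)
    return float(patch.std())

# e.g. flag a scan if noise in a blank patch exceeds an agreed level:
# if patch_noise("target_scan.tif", (10, 10, 110, 110)) > 2.5: ...
```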


Checking for digital image quality

Checking of the quality of digital images should form part of workflows.

If large volumes of images are involved it is acceptable to check random samples of imaged records. See Sampling for more information.

Checking should examine the:

  • smallest detail legibly captured (e.g. smallest type size for text, clarity of punctuation marks, including decimal points etc.)
  • completeness of detail (e.g. acceptability of broken characters, missing segments of lines or pixels etc.)
  • dimensional accuracy compared with the original
  • scanner-generated speckle (i.e. speckle not present on the original)
  • completeness of the overall image area (i.e. missing information at the edges of the image area)
  • colours or tones in comparison with the original (e.g. density of solid black areas, colour captured in colour, colour fidelity, correctness of tonal values and colour balances, correctness of brightness and contrast)
  • sharpness of the image compared to the original (e.g. lack of sharpness, too much sharpening, unnatural appearance, halos around dark edges etc.)
  • accuracy of captured text where Optical Character Recognition (OCR) is used. [4]
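For the OCR check in the final point, accuracy can be estimated by comparing OCR output for sampled pages against a manually verified transcription. A minimal sketch using Python's standard library; the 99% figure in the comment is illustrative, not a prescribed benchmark.

```python
import difflib

def character_accuracy(ocr_text, reference_text):
    """Approximate character-level accuracy (as a percentage) of OCR
    output against a verified transcription of the same page."""
    ratio = difflib.SequenceMatcher(None, ocr_text, reference_text).ratio()
    return ratio * 100.0

# e.g. accept a sampled page only if it meets the agreed benchmark:
# assert character_accuracy(ocr_page, checked_page) >= 99.0
```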

Where equipment is being readjusted, it may also be relevant to check that:

  • images are in the correct file formats
  • compression ratios (when appropriate) are correct
  • the correct bit-depths and resolutions have been used.
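Checks like these lend themselves to scripting. A minimal sketch using Pillow; the expected format, colour mode and resolution are placeholder benchmarks to be replaced with your project's agreed values.

```python
from PIL import Image

def check_technical_parameters(path):
    """Compare an image's technical parameters against placeholder
    benchmarks, returning a list of problems found (empty if none)."""
    problems = []
    with Image.open(path) as img:
        if img.format != "TIFF":
            problems.append(f"format is {img.format}, expected TIFF")
        if img.mode != "RGB":  # "RGB" corresponds to 24-bit colour
            problems.append(f"mode is {img.mode}, expected RGB")
        dpi = img.info.get("dpi", (0, 0))
        if min(dpi) < 300:
            problems.append(f"resolution {dpi} dpi is below 300 dpi")
    return problems
```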

The quality check should ascertain if all the essential characteristics from the original paper records (identified at the benchmarking stage) have been fully represented.

Completeness of digitisation

To ensure that all of the required original paper records are digitised, checks should be conducted on the completeness of the work. This may include validating the number of pages in paper records against the number of digital images created. For multi-page items, the number of pages within an image should accurately reflect the original paper records, and the pages should be structured and arranged in the correct order.[5]
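Page-count validation can be scripted where naming conventions allow. A minimal sketch; the one-image-per-page assumption and the '<record_id>_<page>.tif' file naming pattern are hypothetical.

```python
from pathlib import Path

def check_completeness(expected_pages, image_dir):
    """Compare the page count noted for each paper record during
    preparation with the number of digital images created for it.

    expected_pages maps record identifiers to page counts, e.g.
    {"A0042": 12}; images are assumed to be named '<record_id>_<page>.tif'.
    """
    problems = []
    for record_id, pages in expected_pages.items():
        found = len(list(Path(image_dir).glob(f"{record_id}_*.tif")))
        if found != pages:
            problems.append(
                f"{record_id}: expected {pages} pages, found {found} images")
    return problems
```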

Sampling

Where there are limits on time and finances, some degree of sampling may be adopted for undertaking quality checking of images. Each option has pros and cons:

  • Checking all digital images and their metadata. Pros: all will meet minimum required quality baselines. Cons: time and resource intensive.
  • Checking only random samples of digital images and their metadata. Pros: less time and resources required. Cons: lower degree of certainty that all images have met quality baselines.

If sampling is adopted, benchmarks for the frequency of sampling should be determined according to system usage and expected or anticipated deterioration periods. Sampling should include assessments of both digital images and their metadata. Advice from system vendors may assist in determining the frequency period.[6]  The extent and frequency of sampling should be documented.

Initially, it may be appropriate to sample frequently. However, once benchmarks, equipment and processes have been stabilised, this may be reduced to a random sampling of between 5 and 10%.[7]
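Drawing the random sample itself is straightforward to script; the harder part, as noted below, is ensuring the sample is representative. A minimal sketch, with the 5% default mirroring the lower end of the range above.

```python
import random

def select_sample(image_ids, rate=0.05, seed=None):
    """Select a simple random sample of image identifiers for quality
    checking. Always samples at least one item from a non-empty batch."""
    if not image_ids:
        return []
    rng = random.Random(seed)  # a seed makes the selection reproducible
    size = max(1, round(len(image_ids) * rate))
    return rng.sample(image_ids, size)
```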

Care must be taken to ensure that samples are representative of the range of records digitised and include examples where the quality of the original paper records is poor compared to the majority of sampled original records. In some cases, e.g. following equipment repairs or if using new staff or service providers, each image may be checked until there is confidence that the standards are being met.

Environment for quality checking

A controlled environment is required to consistently apply quality assurance checks. In an uncontrolled environment, e.g. with excessive glare or reflections, or using an improperly set up computer system, a high quality image may be incorrectly deemed to have not met quality benchmarks.

The output device that the digital image is intended for should be used for quality assurance checks.

Example:

If a digital image is intended for printing, then the image should be printed and checked against the quality baselines for printed images.

If a digital image is intended for display on a computer monitor, quality baselines should be verified on a computer monitor.

Checking may need to be conducted on a variety of printers and monitors to detect variations.

The extent of an image that can be seen on a monitor depends on the image pixel dimensions and the desktop resolution. The area of an image displayed can be increased by increasing the screen resolution or by decreasing the image resolution. Multiple images may be viewed on a screen at one time; however, to ensure that details have been captured appropriately, a number of the images should be viewed at 100% or greater magnification.[8]
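As a worked illustration of why this matters: at 100% magnification, the fraction of an image visible at once is just the ratio of screen pixels to image pixels in each dimension. The dimensions below are illustrative.

```python
def visible_fraction(image_px, screen_px):
    """Fraction of an image's area visible at 100% magnification,
    given (width, height) pixel dimensions for the image and screen."""
    iw, ih = image_px
    sw, sh = screen_px
    return min(1.0, sw / iw) * min(1.0, sh / ih)

# An A4 page scanned at 300 dpi is roughly 2480 x 3508 pixels; on a
# 1920 x 1080 monitor only about 24% of it is visible at actual size:
# visible_fraction((2480, 3508), (1920, 1080))  # -> ~0.24
```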

User fault reporting

Users should report any errors back to the digitisation team so that these can be rectified through re-imaging. Reported faults should also be included in quality assurance reports, as they may help to identify and rectify common faults.[9]

Re-digitising

If digital images do not meet documented benchmarks, your organisation (or, if outsourced, the service provider) will need to re-digitise them. Where quality standards are not met during random sampling, your organisation may need to re-inspect the remaining output.

Examples of approaches:

If more than 1% of the total number of images and associated metadata examined in a randomly selected sampling are found to be defective, the entire output since the last quality check is re-inspected. Any specific errors found in the random sampling and any additional errors found in the re-inspection are corrected.

If less than 1% of the batch is found to be defective, then only those defective images and metadata that are found are redone.[10]
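Expressed as a simple decision rule (a sketch of the example approach above; the 1% threshold comes from that example and is not a mandated figure):

```python
def reinspection_required(defects_found, sample_size, threshold=0.01):
    """Return True if the defect rate in a random sample exceeds the
    threshold, meaning the whole batch since the last quality check
    should be re-inspected rather than just the defects corrected."""
    return (defects_found / sample_size) > threshold

# e.g. 3 defects in a sample of 200 (1.5%) -> True: re-inspect the batch
```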

Note: Your organisation may consider setting an ‘acceptable margin of error’ with digitisation. However, if you have set appropriate benchmarks that are not optimal but rather are fit for purpose and ensure essential characteristics are protected, you should aim for 100% accuracy. If the essential characteristics of the records are compromised, or the images are no longer fit for purpose, there is little value in retaining the images as they are.


Benchmarks and checks for metadata

Your organisation should determine the metadata required for a back-capture digitisation project during planning. This will provide the benchmark for quality assurance. See Metadata requirements for more information.

Note: It is very important to prevent or remove errors in metadata. Inaccuracies in metadata can have significant consequences for retrieval and, depending on the records, may have wider risks for the organisation.

Where possible, metadata should be automatically collected.

For example:

The Department of Education and Communities created forms and templates for their human resource management records and wrote document definition scripts. These enabled their Optical Character Recognition (OCR) software to automatically harvest and populate the metadata fields. About 99% of metadata was automatically captured, significantly reducing the chance of errors. See the Case Study: Department of Education and Communities pilot digitisation of HR records.

Quality assurance checks of technical and recordkeeping metadata need to be determined, documented in procedures (or contract documentation) and implemented. These may need to be more rigorous for manually entered metadata. However, even automatic metadata will require some degree of checking.

Examples of what checks should determine include:

  • whether all minimum required metadata has been collected
  • whether additional metadata standards established by your organisation for the particular project have been met
  • whether file naming conventions have been adhered to
  • the relevance of metadata collected
  • the accuracy of grammar, spelling and punctuation, especially for manually-keyed data
  • whether there is consistency in the creation and interpretation of metadata
  • whether there is synchronisation of metadata stored in more than one location – e.g. information related to an image might be stored in the TIFF header, the management system and other databases and this should always be consistent
  • completeness of metadata – i.e. that all mandatory fields are complete.[11]
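Several of these checks, such as the completeness of mandatory fields, can be automated. A minimal sketch; the field names are hypothetical and should be replaced with your project's metadata schema.

```python
# Hypothetical mandatory fields, for illustration only:
MANDATORY_FIELDS = ["title", "record_id", "date_captured", "scanner_operator"]

def missing_mandatory_fields(record):
    """Return the mandatory fields that are absent or empty in one
    metadata record (a dict of field name to value)."""
    return [f for f in MANDATORY_FIELDS if not record.get(f, "").strip()]

# e.g. missing_mandatory_fields({"title": "Minute book", "record_id": "A123"})
# -> ["date_captured", "scanner_operator"]
```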

Organisational procedures should address what to do if poor metadata capture has been revealed through quality assurance checks.

It may be useful to evaluate, over time, the usefulness of the metadata being collected and, if appropriate, make amendments to processes and systems to ensure required metadata is being captured.

Of particular importance are:

Verifying the accuracy of the file identifier. File names should consistently and uniquely identify both the digital image and the metadata record (if it exists independently of the image). File identifiers will likely exist for the metadata record itself, in addition to identifiers for the digital image, which may embed information such as page or piece number, date and project or institution identifier, among others. Information embedded in file identifiers for the image should parallel metadata stored in a database record or header. Identifiers often serve as the link from the image to information stored in other databases and must be accurate to bring together distributed metadata about an image. Identifiers should therefore be verified across metadata held in disparate locations.

Verifying the correct sequence and completeness of multi-page items. Pages should be in the correct order with no missing pages. If significant components of the record are recorded in the metadata (such as the presence of attachments, identifiable chapters or multi-page structures), a consistent convention for describing them should be followed, and the descriptions should match the actual images.[12]
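Both checks can be partially automated. A minimal sketch that parses file identifiers against a naming convention and confirms each record's page sequence is complete; the convention used here is hypothetical.

```python
import re

# Hypothetical convention: <project>_<record>_<4-digit page>.tif,
# e.g. HR2011_A0042_0003.tif, with pages starting at 0001.
NAME_PATTERN = re.compile(
    r"^(?P<project>[A-Za-z0-9]+)_(?P<record>[A-Za-z0-9]+)_(?P<page>\d{4})\.tif$")

def check_identifiers_and_sequence(filenames):
    """Report file names that break the convention and page numbers
    missing from each record's sequence."""
    problems, pages = [], {}
    for name in filenames:
        m = NAME_PATTERN.match(name)
        if not m:
            problems.append(f"{name}: does not match the naming convention")
            continue
        pages.setdefault(m["record"], set()).add(int(m["page"]))
    for record, nums in sorted(pages.items()):
        for missing in sorted(set(range(1, max(nums) + 1)) - nums):
            problems.append(f"{record}: page {missing:04d} is missing")
    return problems
```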


Benchmarks and checks for original paper records

Preparation of original paper records

Digitisation procedures should document how to prepare original paper records for digitisation. See Managing original paper records for more information.

With outsourcing arrangements your organisation will need to determine whether preparation of the records will be undertaken by staff members or contractors. If contractors are to undertake the work, benchmarks need to be communicated to them.

Quality assurance checks should ensure that original paper records are retrieved and handled in an appropriate manner. If originals are to be retained after digitisation they should suffer no damage and be returned to storage in their original order.

Handling of State archives

If your organisation wishes to digitise records that are required as State archives you must contact Museums of History NSW to discuss the project. Quality assurance checks should ensure that contact has been made, the project has been discussed and any conservation advice provided regarding the handling of original paper records has been followed.

Retention or destruction of original paper records

If your organisation intends to destroy original paper records after digitisation it is essential that this is considered as part of planning and relevant approvals for disposal are in place. Disposal processes should then be outlined in procedures for staff.

If the destruction of original paper records is authorised, they should be retained for a period of time after digitisation to allow for the quality of the digital image to be verified. The period of time they need to be retained should be assessed during the planning phase of a back-capture digitisation project and articulated in an internal policy statement and built into procedures.

Part of quality checking should involve determining whether appropriate approvals are in place and that staff are following procedures for disposal correctly, including waiting for the determined period before destroying original records. Any inappropriate destruction should be halted immediately and steps taken to ensure it does not resume.

Checks should also ensure that the minimum required disposal metadata is retained about original paper records that are destroyed.

See Disposal of original paper records after digitisation for more information.

If outsourcing disposal, your organisation is still responsible for ensuring that procedures for disposal are in line with its responsibilities under the State Records Act, and relevant general retention and disposal authorities, including the General retention and disposal authority: original or source records that have been copied.


Benchmarks and checks of storage and controls for digital images

Benchmarks also need to be set by your organisation regarding where digital images are stored, how they are captured and the reliability of storage. Decisions should be made early in a digitisation project regarding the best methods of capturing and storing images. This will be dependent on their purpose and the type of records being digitised.

For example:

If records are highly sensitive they will need to be stored with strict security and access controls.

If records are required to be retained as evidence of business they should be captured into recordkeeping systems or stored in other suitable environments with audit trails and suitable access and security controls.

Procedures for staff should describe how to store images correctly and apply suitable controls.

Quality checks should determine if digital images are being captured appropriately into chosen systems. Any deviation from standard procedures should be remedied. Tests should also be conducted on storage methods to ensure that they are reliable and the images are well protected.

Note: Digital images should be monitored and managed as part of wider organisational records management strategies for the retention of digital records, including migration planning. For more information, see Managing digital images as records.


Documentation

Benchmarks and quality assurance measures for a back-capture digitisation project should be carefully defined and agreed to by all stakeholders. Procedures should document roles and responsibilities, how to ensure benchmarks are met, the method, degree and frequency of quality assurance checks, and when re-digitisation should occur. Procedures should be approved by senior management and communicated to relevant staff and stakeholders. If outsourcing digitisation, relevant documentation about benchmarks should be communicated to service providers.

Quality assurance data (such as logs, reports, decisions) should be captured in your organisation’s recordkeeping system. This data becomes an integral part of an image's metadata and may be used to demonstrate the authenticity of a digital image and inform future preservation decisions. [13]


Review of benchmarks and quality assurance measures

It is important to review benchmarks and quality assurance measures periodically so that they remain relevant to the intended purpose of the records and reflect emerging technology, legislation and industry trends.


Common questions

See relevant parts of this section of the guidelines or Frequently asked questions for answers to the following questions:

  • Should I use watermarking or fingerprinting for records?
  • If I use techniques such as sharpening, blurring or de-speckling to make the digital images more accurately resemble the original, do I need to check them?
  • If I use Optical Character Recognition (OCR) do I need to check for the accuracy of text?
  • What are some common quality faults in digitisation we should plan to prevent?

Checklist

Note: The following is a very comprehensive checklist for quality assurance based on Queensland State Archives’ former guidelines on digitisation, which are no longer available online. This checklist can be used as a guide when determining organisational requirements for quality assurance. If outsourcing digitisation to service providers, the checklist will need to be modified for use.

The most important points from this checklist are for organisations to define what benchmarks are required, ensure there are procedures and training for staff who are required to meet these benchmarks, and ensure that there is quality checking to determine how well benchmarks are being met and where remedial action is required.

Benchmarks and quality assurance (answer Yes or No for each item)
Establishing benchmarks and quality assurance standards

Have benchmarks been developed in relation to:

  • digital image quality, including technical specifications, metadata requirements and requirements for equipment (e.g. calibration, output viewing device etc.)
  • storage and controls for digital images
  • management of original paper records including when they can be destroyed (if relevant)?
Have benchmarks been developed in liaison with stakeholders, documented, approved by senior management and communicated to relevant stakeholders?    
Do the processes determined and documented in digitisation procedures enable benchmarks to be met?     
Have benchmarks and quality assurance measures been communicated to relevant staff through procedures?    

Do procedures address:

  • roles and responsibilities including sign-off at an appropriate level
  • the method and frequency of calibration testing of equipment
  • the quantity of images and metadata to be checked and how frequently checks should occur
  • how quality checking is to be carried out
  • what to do if checks reveal poor image or metadata capture, including when re-imaging is required and how it is to be conducted
  • when image enhancement techniques can be used
  • whether images should be enlarged for quality checking?
Have benchmarks and quality assurance procedures been tested before digitisation commences, to ensure they can be implemented and produce acceptable results?
Has advice been sought from Museums of History NSW regarding the digitisation of original paper records required as State archives?      
Staffing    
Are staff and managers with responsibility for benchmarking and quality assurance sufficiently trained and supported to meet their responsibilities in a consistent way?       
Are changes to procedures documented and communicated to staff?    
Equipment    
Is equipment regularly cleaned, serviced and calibrated?
Checking of digital images    
Are digital images quality checked as regularly as required by organisational procedures?         

Are a sample of digital images checked for aspects such as:

  • legibility (e.g. smallest type size for text, clarity of punctuation marks, including decimal points etc.)
  • completeness of detail (e.g. acceptability of broken characters, missing segments of lines or pixels, missing information at the edges of the image area, images cropped or incomplete etc.)
  • whether dimensions accurately compare with the original
  • whether scanner-generated speckle has been removed (i.e. speckle not present on the original)
  • whether colours or tones accurately compare with the original (e.g. density of solid black areas: too light? too dark? colour captured in colour? colour fidelity? correct tonal values and colour balances? correct brightness and contrast?)
  • the sharpness of the image compared to the original (e.g. lack of sharpness or too much sharpening, unnatural appearance and halos around dark edges etc.)
  • the accuracy of the text captured by Optical Character Recognition (OCR) software (where relevant)
  • whether images are in the correct file formats
  • whether compression ratios (where appropriate) are correct
  • whether the correct bit-depths and resolutions are used?
Have all the essential characteristics defined in the planning stage of a project been reproduced successfully?    
Have all of the records identified for digitisation, and all of the pages in multi-page items, been digitised?
Have benchmarks for the output viewing device been met?    
Metadata    
Has all required metadata been collected?            
Is the captured metadata relevant, accurate and linked to the correct records, e.g. appropriate levels of security applied, accurate creation date captured, correct document author and scanner operator identified etc.?         
Can metadata be interpreted consistently?    
Is metadata that is stored in more than one location synchronised?    
Are all mandatory metadata fields complete?         
Has the usefulness of the metadata been assessed over time?       
Are digital records being re-imaged and metadata re-captured in line with procedures when they do not meet quality standards?     
Random sampling    
Is any sampling conducted in line with documented procedures, e.g. in line with the frequency specified?           
Do the agreed samples for quality checking represent the range and quality of records digitised?      
Image enhancement    
Are the scope and extent of use of any image enhancement techniques documented?    
Have these been checked so they do not result in the loss of information?    
Quality failure and re-imaging    
Are failures in the reliability of storage logged, analysed and addressed?           
Are quality failures logged, analysed and addressed?    
Have records been re-imaged and/or metadata reassigned according to procedures when initial digitisation does not meet quality standards?    
Management of original paper records    
Are all original paper records being carefully handled during retrieval, preparation for digitisation and the digitisation process?        
Where original paper records are not destroyed, are they returned to their original order and storage after digitisation?           
If original paper records are destroyed, are they kept for the predetermined retention period after digitisation (for quality assurance purposes) before being destroyed?            
If original paper records are destroyed, is this in accordance with the General retention and disposal authority: original or source records that have been copied?
Management of digital images where original paper records are destroyed    
Are digital images stored in accordance with the organisation’s documented requirements, e.g. in a recordkeeping system?     
Is the storage system tested to ensure digital images remain protected?       
Documentation    
Is quality control data (such as logs, reports, decisions) documented and captured as records and managed as part of the digital images’ metadata?     
Review    
Are benchmarks and quality assurance procedures implemented and regularly reviewed?        

Footnotes

[1] ISO 12653-1:2000, Electronic imaging – Test targets for the black and white scanning of office documents, Part 1: Characteristics, and Part 2: Method of use. ISO/TR 15801:2004, Electronic imaging – Information stored electronically – Recommendations for trustworthiness and reliability.

[2] Archives New Zealand, Digitisation standard, 2007, Appendix 7.

[3] Queensland State Archives, Digitisation Disposal Policy Toolkit, Quality Assurance Guideline, May 2010, section 2.2.1.

[4] Archives New Zealand, ibid., Appendix 7; Stuart D. Lee, Digital imaging: a practical handbook, Library Association Publishing, London, 2001, p. 84; Queensland State Archives, op. cit., section 2.2.1.

[5] Queensland State Archives, op. cit., section 2.2.1.

[6] Archives New Zealand, op. cit.

[7] Loc. cit.

[8] Queensland State Archives, op. cit., section 2.2.3.

[9] JISC Digital Media, Quality assurance and digitisation projects, 14 November 2008.

[10] Archives New Zealand, op. cit., Appendix 7; Queensland State Archives, op. cit., section 2.2.4.

[11] Archives New Zealand, loc. cit.

[12] Loc. cit.

[13] Loc. cit.
