Saturday, 19 July 2008

Notes on Knowledge Management.

The business of universities is all about knowledge. The following references are some links for notes I am working on for a paper.

1. Accetta, R. (2001) “E-Learning Crossfire” (Saved: 22 Oct 2002; no current link)
2. Adler, A. (1959). “Understanding Human Nature”. New York: Premier Books.
3. Alexander, S. (1995) “Teaching and Learning on the World Wide Web” Proceedings of AusWeb ’95
4. Ansoff, I. (1965), “Corporate Strategy”, McGraw-Hill, New York.
5. Azadegan, S., Lavine, M., O'Leary, M., Wijesinha, A. & Zimand, M. (2003) “An Undergraduate Track in Computer Security”. ACM SIGCSE Bulletin, Proceedings of the annual conference on Innovation and technology in computer science education, Volume 35 Issue 3, June 2003. (Viewed 14 March 2006)
6. Beirne, T. Brecht, H.D. & Sauls, E. (2002) “Using the web to serve students as information clients” Proceedings of Informing Sciences and IT Education Conference, University College Cork, Cork IRE (June 19-21, 2002)
7. Beldore, T & Brecht, H.D. & Sauls, E. (2002) “Online Education: The future is now” Socrates Distance Learning Technologies Group, Academic Research and Technologies
8. Berge, Z (1993) “Computer Conferencing and Online Education” The Point Electronic Journal on Virtual Culture, (1) 3.
9. Beyer, H. and Holtzblatt, K. (1998) “Contextual Design Defining Customer-Centered Systems”, Morgan Kaufman Publishers Inc
10. Bogolea, Bradley & Wijekumar, Kay (2004) “Information Security Curriculum Creation: A Case Study” ACM InfoSecCD Conference’04, October 8, 2004, Kennesaw, GA, USA.
11. Bolman, Lee & Deal, Terrence (2003) “Reframing Organizations: Artistry, Choice, and Leadership”, Jossey-Bass, 3rd edition, USA
12. Border, Charles & Holden, Ed (2003) “Security Education within the IT Curriculum”. Proceedings of the 4th conference on information technology curriculum on Information technology education, Oct 2003. (Viewed 14 March 2006)
13. Boud, D., Dunn, J. and Hegarty-Hazel, E. (1986) “Teaching in Laboratories”. SRHE & NFER-Nelson, Surrey, UK
14. Boyd, C. and Mathuria, A. (2003) “Protocols for Authentication and Key Establishment”. Springer-Verlag, Berlin, Germany
15. Burgess, S. & Darbyshire, P. (2003) “A Comparison between the use of IT in Business and Education: Applications of the Internet to Tertiary Education” Victoria University, Au
16. Cox, R. and Light, G. (2001) “Learning & Teaching in Higher Education: The reflective professional”. Sage Publications, London, UK
17. Crowley, Ed (2003) “Information System Security Curricula Development”. Proceedings of the 4th conference on information technology curriculum on Information technology education, Oct 2003. (Viewed 11 March 2006)
18. Darbyshire, P. (1999) “Distributed Web Based assignment submission and access” Proceedings – International Resource Management Assoc. IRMA 1999. Hershey PA: Idea Publishing Group
19. Dark, Melissa J. (2004.a) “Assessing Student Performance Outcomes in an Information Security Risk Assessment, Service Learning Course” Purdue University
20. Dark, Melissa. J. (2004.b). “Civic Responsibility and Information Security: An Information Security Management, Service Learning Course”. Proceedings of the Information Security Curriculum Development Conference, 2004.
21. Davies, I.K. 1994, “Process re-design for enhanced human performance”. Performance Improvement Quarterly, 7 (3): 103-113
22. Earl, M.J. (1989) “Management Strategies for Information Technology” Prentice Hall NY
23. Ford, W. and Baum, M. S. (1997) “Secure Electronic Commerce”. Prentice Hall
24. Garfinkel, S. and Spafford, G. (2001) “Web Security, Privacy & Commerce”. 2nd Edition. Cambridge, Mass: O'Reilly
25. Ghosh, A. K. (1998) “E-Commerce Security”. Wiley
26. Ghoshal, S. & Bartlett, C.A. (1995) “Changing the role of top management: beyond structure to processes”. Harvard Business Review, January-February: 86-96
27. Habermas, J. (1984) “The Theory of Communicative Action: reasons and Rationalisation of Society” Beacon Press, Boston MA
28. Hay/McBer (2000). “Research into teacher effectiveness: A model of teacher effectiveness report by Hay McBer to the Department for Education and Employment”. Report prepared by Hay/McBer for the government of the United Kingdom
29. Infosec Graduate Program, Purdue University. (First viewed 12 March 2006)
30. Isaac, Henri (2003) “On-Line Case Discussion: A Methodology” Paris Dauphine University, France
31. Jackson, Nancy. (1992); Chapter 7 “Training Needs: An Objective Science?” In Training for What? Labour Perspectives on Job Training, ed. Nancy Jackson. Toronto: Our Schools/Our Selves Education Foundation.
32. Jacques, D., Gibbs, G. and Rust, C. (1991) “Designing and Evaluating Courses”. Oxford Brookes University, Oxford, UK
33. Kalakota, R. and Whinston, A. B. (1996) “Frontiers of Electronic Commerce”. Addison-Wesley
34. Kollock, P. (1997) “The design principles for Online Communities” The Internet and Society: Harvard Conference Proceedings, Cambridge, MA: O’Reilly & Assoc.
35. Kotter, John P. & Cohen, Dan S. (2002). “The Heart of Change”. Harvard Business School Press, Boston, Massachusetts
36. Kumar, A., Kumar, P., & Basu, S. C. (2002) “Student Perceptions of virtual Education; An exploratory study” In M. Khosrow-Pour (Ed.) Web Based Instructional Learning (pp 132-141) Hershey, PA IRM Press.
37. Lane, David, 2004, “Foundations of HRM, Performance and Compensation Management”, Course Notes, University of SA
38. Longworth, N (1999) “Making Lifelong Learning Work” Kogan Page, London UK
39. Master of Science degree program in Information Security and Assurance, George Mason University. (Viewed 12 March 2006)
40. Master of Science in Security Informatics, Johns Hopkins University. (Viewed 12 March 2006)
41. McGill, Tanya, Ed. (2002) “Current issues in IT education”, Murdoch University, IRM Press, Melbourne, Australia.
42. McLelland, Ross (2004), “Emotional intelligence in the Australian context”, Pacific Consulting
43. Mintzberg, H. (1994) “The rise and fall of strategic planning”. New York: Free Press
44. Peltz, P (2000) “Do virtual classrooms make the grade” (Saved article from Apr 2001, no current link)
45. Porter, M.E. & Millar, V.E. (1985) “How Information gives you competitive advantage” Harvard Business Review, 63, 4 July/August pp 149-160.
46. Rice, R (1993) “Media appropriateness using social presence theory to compare traditional and new organisational media” Human Communications Research, 19, (pp 451 - 484)
47. Robey, D. (1994). “Designing Organizations”. (4th Ed.). Irwin. Homewood, Illinois
48. Romm, C & Taylor W. (2000) “Online Education – Can we combine efficiency with quality?” Proceedings ACIS, Brisbane 6-8, 2000
49. Romm, C. & Taylor, W. (2001) “Teaching Online is about Psychology – not technology” In M. Khosrow-Pour (Ed.) Proceedings of IRMA Conference, Hershey PA, Idea Group Publishing
50. Rummler, G.A. and Brache, A.P. 1995 “Improving performance”. 2nd edition, San Francisco: Jossey Bass
51. Salmon, G (2000) “E-Moderating: The Key to Teaching and Learning Online” Kogan Page, London UK
52. Schooley, C (2001) “Online universities introduce alternatives for higher education” Planning Assumption. GIGA Group.
53. Senge, P. M. (1994). “The Fifth Discipline: The Art & Practice of the Learning Organization”. New York: Currency-Doubleday. Stace, D. and Dunphy, D. (2001), “Beyond the Boundaries”, 2nd ed. McGraw-Hill Australia: Roseville.
54. Shaikh, Siraj A. (2004) “Information Security Education in the UK: a proposed course in Secure E-Commerce Systems” ACM InfoSecCD Conference’04, October 8, 2004, Kennesaw, GA, USA.
55. Sherif, M. H. (2000) “Protocols for Secure Electronic Commerce”. CRC Press
56. Simon, Judith C., Brooks, Lloyd D. & Wilkes, Ronald B. (2003) “Empirical Study of Students’ Perceptions of Online Classes” The University of Memphis, USA
57. Slusky, Ludwig & Partow-Navid Parviz (2003) “Training in Remote Database Server Administration” California State University US
58. Stiller, A.D. (2003) “Designing e-Business and e-Commerce Courses to Meet Industry Needs”, University of the Sunshine Coast, Au
59. Taylor, Wal; Dekkers, John; & Marshall, Stewart (2003) “Community Informatics – Enabling Emancipatory Learning” Central Qld University Au
60. Tsang, P. & Fong, T. L. (1998) “Learning support via the web: How do I know I make a difference?” Proceedings of the 12th Annual Conference of the Asian Association of Open Universities, New Delhi, 4-6 Nov 1998
61. Vaughn Jr. Rayford B., Dampier, David A. & Warkentin, Merrill B (2004) “Building an Information Security Education Program” ACM InfoSecCD Conference’04, October 8, 2004, Kennesaw, US
62. Vroom, V.H. 1964, “Work and Motivation”. New York: John Wiley & Sons.
63. Weade, R., & Gritzmacher, J. 1987. “Personality characteristics and curriculum design preferences of vocational home economics educators”. Journal of Vocational Education Research, 12(2), 1-18.
64. Weil, N. (2001) “University net courses help pros make the grade” (Viewed and saved 16 May 2003; no current link)
65. Wellman, B, Salaff, J., Dimitrova, D., Garton, L., Gulia, M., and Haythornthwaite, C. (1996) “Computer networks as social networks: Collaborative work, telework, and virtual community” Annual review of Sociology, 22 (p 213 – 238)
66. Wellman, B., Boase, J. & Chen, W. (2002) “The Networked Nature of Community: Online and Offline”. IT & Society, 1(1), 151--165.
67. Winfield Treese, G. and Stewart, L. C. (2002) “Designing Systems for Internet Commerce”. 2nd Edition, Addison-Wesley
68. Yang, Andrew (2001) “Computer Security and Impact on Computer Science Education”. The Journal of Computing in Small Colleges, Proceedings of the sixth annual CCSC north-eastern conference, Volume 16 Issue 4, April 2001. (Viewed 21 March 2006)

Thursday, 17 July 2008

Talk about an old Bug...

Otto Moerbeek, a developer from OpenBSD, recently discovered a bug in yacc. “Funny thing is that I traced this back to Sixth Edition Unix, released in 1975.”

This bug follows the May discovery of a 25-year-old BSD flaw.

So how many code errors have occurred as a result of this parser error over the last 33 years?

Tuesday, 15 July 2008


The following is a guideline to concerns associated with IT audits of financial systems. It is focused on the Australian Auditing Standards, but it can be adapted to other standards and approaches. What needs to be remembered is the scope of the audit. Security is a component of a financial systems review, but it is not the goal. For instance, spending time on the privacy concerns of a system is unlikely to bring benefits when the scope of the IT audit is to support a financial audit looking at the accuracy of the financial statements. Yes, there may be issues, but not everything can be solved in one go.

Information Systems audit is based on the generic concept of auditing. Simply put, an audit, any audit, is the comparison of actual conditions to expected conditions, and a determination of conformance or non-conformance. This is the same philosophy used to perform financial, quality, regulatory compliance, and systems audits. It is prudent to first review the common elements in order to better understand how the various audit types differ.
There are several definitions of audit components that are common to any type of audit. ISO 27001 and COBiT each define these terms for Information Systems audits, but they apply in other cases also. The concepts and processes are generic enough to be applied “as is” to other types of audits. What is generally missing is an idea of how the standards actually apply and an interpretation of the standards in direct relation to the systems being tested.

An audit is fundamentally a comparison of audit evidence to audit criteria to determine findings. The evidence is the objective information collected through interviews, visual reconnaissance, and documentation review. The audit criteria are the expectations or “rules” of how conditions should be. It is the criteria that distinguish one audit from the next. For example, in compliance auditing, the criteria are the regulations. With an Information Systems audit, the criteria would be the description of the expected system elements. In this case, the Information Systems criteria would be those described in ISO 27001/2, and these can be directly aligned to the Australian Auditing (AUS) Standards and to CLERP 9 as it applies to Government audits.
When evidence is compared to criteria, one can determine whether the audited entity does or does not conform. This determination is a finding, and a finding can either be one of conformance, or non-conformance. Therefore, an audit will always produce findings, even if what is being audited is in full conformance with criteria.
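As a rough sketch, the evidence-to-criteria comparison can be modelled in a few lines of code. The class and function names below are purely illustrative; they are not drawn from any standard.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """A single 'rule' drawn from the audit criteria (e.g. an ISO 27001 control)."""
    ref: str          # e.g. a clause or control reference
    expectation: str  # the expected condition

@dataclass
class Finding:
    criterion: Criterion
    evidence: str     # objective information: interviews, observation, documents
    conforms: bool    # a finding is recorded either way

def compare(criterion: Criterion, evidence: str, meets_expectation: bool) -> Finding:
    # The audit always produces a finding, whether of conformance or not.
    return Finding(criterion, evidence, conforms=meets_expectation)

c = Criterion("A.9.2.1", "User registration and de-registration is formally controlled")
f = compare(c, "Interview notes: joiner/leaver process documented and followed", True)
print("conformance" if f.conforms else "non-conformance")  # prints "conformance"
```

The point of the sketch is simply that the finding is a product of evidence and criteria, and that a conformance result is recorded just as a non-conformance is.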

Other key definitions to be aware of with auditing are: objectives, scope, auditee, client, and auditor. The audit objective(s) is simply why you are conducting an audit; usually the reason is to demonstrate conformance to stated criteria. The audit scope is what entity is being audited, and can be a department, a site, or unit within a site or department.

In both the COBiT and ISO 2700x realms, there is a clear distinction between the auditee and the client. The auditee is the entity being audited. The client is the party commissioning the audit. For example, a client can be the customer, and the auditee a supplier to that customer. In ISO 2700x, this distinction is important because the client sets the scope, objectives, and plan for an audit, not the auditee, although it is expected that the auditee will be involved and cooperate. This reflects the situation described within the various Australian Auditing Standards (AUS).
The auditor is the one actually collecting evidence and determining findings. The auditor role may be filled by a team of several individuals. ISO 2700x requires that those performing functions within the Information Systems, such as the auditors, be qualified in their tasks. This means the auditors must have received training in Information Systems auditing. However, there may be audit team members who do not have this training, but are on the team because of some unique expertise, such as process, language, or regulatory knowledge.

1.1. The Difference Between Compliance Auditing and Systems Auditing
Above, we discussed what is fundamentally the same among all audit types as well as what makes them different. Often, however, there is confusion between regulatory compliance auditing and Information Systems auditing. This is because many elements of regulatory compliance overlap with Information Systems auditing. In fact, the Information Systems audit is becoming a more critical aspect of financial audits over time.

The distinction is that this is not a general security audit; the scope and goals of an Information Systems audit conducted as a component of a financial audit differ from those that many Information Systems auditors would generally select.

The criteria in a financial compliance audit are the applicable regulations, whereas the criteria in an Information Systems audit would be the complete set of controls from a standard such as COBiT or ISO 27001. But do these not address financial systems risk and compliance? The answer is yes, but from a system standpoint, not from the view of security generally. The scope and goals of the audit are different.

In other words, the standards require that certain procedures exist regarding identification of legal and other requirements, that periodic compliance assessments be performed, that legal requirements be considered in setting objectives and targets, and that there be a commitment to compliance.

In summary, the goal of the financial systems audit is to verify compliance with regulations and to measure financial risk, whereas the Information Systems audit’s goals are commonly aligned to the overall security of a system. The aim of the proposed training is to ensure that the Information Systems auditor learns the correct processes and controls to test in order to give the financial systems auditor a level of comfort around the systems. The format of this training will align information systems processes (including ISO 27001 and COBiT) with the needs of a financial audit.

1.2. Essential Features of an Audit
The Information Systems audit incorporates in a condensed form the following general features that are essential elements of any audit, i.e.:
- They are pre-planned and methodical in nature rather than haphazard
- They should be free from bias or prejudice
- They encompass some form of inquiry and critical consideration of the resultant findings
- They are concerned with all activities that affect financial issues and with results reflecting financial performance
- They should ensure that such activities are carried out in an effective and consistent manner in accordance with planned arrangements

1.3. Why Perform Information Systems audits?
In order to confirm that the defined financial systems are operating effectively, it is essential to carry out some form of monitoring activity in addition to ongoing monitoring and measurement. This includes testing the information systems related to the financial risk areas being tested. Listed below are some of the potential benefits of adopting Information Systems audits as the basis of any such additional monitoring:
- They provide a means of confirming that the Information Systems policy is understood and is being implemented.
- They give management confidence that the system is being implemented in the manner prescribed.
- They provide a structured means of identifying deficiencies in the system, agreeing on corrective action, and following up to confirm effectiveness.
- They enable system weaknesses in the financial controls to be highlighted before the related potential problems are reflected in the financial performance.
- They provide a convenient framework for investigating operations in a particular area, e.g., in response to financial problems.
- Again, if they involve personnel from other areas, the opportunity is created for interchange of ideas so that successful features of an area’s system can be applied elsewhere if appropriate.
- They can, by involving personnel more widely in the operations of the business, lead to increased commitment and motivation.

2. The Audit Process
The entire audit process can be described as planning, executing, and reporting. ISO 27001 on Information Systems audit procedures was created to describe this process, and provide suggestions on setting up audit programs. Recall that both COBiT and ISO 27001 require that the organization establish auditing programs and procedures. In this section, we will examine the three major steps of auditing in detail, providing examples and suggestions towards establishing an audit program. Once the audit program is put together to test a site’s Information Systems, it should not have to be changed appreciably.
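The three phases can be outlined as a simple pipeline. This is only an illustrative sketch; the function names and data shapes are invented for the example and do not come from ISO 27001 or COBiT.

```python
def plan_audit(scope, criteria):
    """Planning: fix the scope, criteria, schedule, and team assignments."""
    return {"scope": scope, "criteria": criteria, "evidence": []}

def execute_audit(audit, collect):
    """Executing: gather evidence (interviews, observation, document review)."""
    audit["evidence"] = [collect(c) for c in audit["criteria"]]
    return audit

def report_audit(audit):
    """Reporting: compare evidence to criteria and state findings either way."""
    return [
        (c, "conformance" if e else "non-conformance")
        for c, e in zip(audit["criteria"], audit["evidence"])
    ]

audit = plan_audit("payroll system", ["access control", "change management"])
audit = execute_audit(audit, lambda c: c == "access control")  # stubbed evidence
print(report_audit(audit))
# [('access control', 'conformance'), ('change management', 'non-conformance')]
```

The `collect` callable stands in for the real evidence-gathering work; in practice that step dominates the effort, and the plan decides what it looks for.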

2.1. Planning the Audit
With Information Systems auditing, as with any type of auditing, a very important step is planning the audit. This involves preparing the specific audit plan, making team assignments, deciding on working documents, and addressing any unique extenuating circumstances. To understand the importance of planning, imagine going on vacation without planning; in other words, not knowing where you are going, what you will do, or how long you will be gone.

The Audit Plan
The audit plan is the document that establishes the scope, objectives and criteria, and schedule of the audit. It also goes into specific details on what areas will be audited, when, and by whom. Other details such as which checklists may be used, how the report is to be formatted and distributed, and how meetings will be conducted can also be included in the plan. In essence, the audit plan reflects the programs, procedures, and methodologies of the Information Systems audit process, in accordance with the Australian Audit Standards. It can be determined for instance that the entire Information Systems will be audited once per year, but in four partial events. This schedule then becomes part of the procedure.

The audit scope defines what part of the organization will be audited. Obviously, this should coincide with the scope of the Information Systems itself, and is usually the site in question. If the full Information Systems audit is divided into smaller segments conducted throughout the year, then the scope of any given segment is the portion of the organization to be audited at that time. Typically, an organization will create a chart or matrix showing the various divisions of the site or activity and when each will be audited. A typical entry may show a particular department being audited in the first quarter and production in the fourth quarter, for example.
Also noted in the audit plan is the audit objective(s). The audit objective describes why an audit is being conducted. Typically the reason is to conform to AUS 315, 330 and 505, which require that the Information Systems be periodically evaluated. Another reason is to demonstrate conformance to others.

Although Information Systems audits may appear in their own right to be “good practice”, it is essential that auditors have a clear concept of what the general objectives of such audits are.
The definition of Information Systems audits highlights the need to confirm conformance with planned arrangements and to ensure that these arrangements are effective and suitable to achieve objectives. ISO 27001 and COBiT each expand this to form a number of general objectives for any type of Information Systems audit. Audits should be carried out to:
- determine conformance of an auditee’s Information Systems with the Information Systems audit criteria
- determine whether the auditee’s Information Systems has been properly implemented and maintained
- identify areas of potential improvement in the auditee’s Information Systems
- assess the ability of the internal management review process to ensure the continuing suitability and effectiveness of the Information Systems
- evaluate the Information Systems of an organization where there is a desire to establish a contractual relationship, such as with a potential supplier or a joint-venture partner.
Using this definition and sources such as ISO 27001 and COBiT, the following statement of the specific objectives of an Information Systems audit has been developed. The results from a department’s internal audits should be tested to ensure that:
- The Information Systems continues to meet the needs of the business
- The necessary documented procedures that exist are practical and satisfy any specified requirements
- The necessary documented procedures are understood and followed by appropriately trained personnel
- Areas of conformity and nonconformity with respect to implementation of the Information Systems are identified and corrective action implemented
- The effectiveness of the system in meeting the Information Systems objectives is determined and a basis is created for identifying opportunities and initiating actions to improve the Information Systems.

The audit criteria define what the “rules” are. The criteria consist of the elements of the various applicable sections of the Australian Auditing Standards and how these relate to ISO 27001 and COBiT. A subtle point to note, however, is that the site’s Information Systems requirements are also part of the criteria. This means that in addition to responding to the requirements of ISO 27001, the Information Systems must also respond to “planned arrangements”, or what the organization said it was going to do. In audits, a common response is “the standard does not require such and such detail”. However, if the site’s procedure does require some specific response, then it becomes part of the criteria. In essence, the auditors are verifying the system not only against the various AUS clauses and CLERP 9, but also against what the Information Systems documentation states.

How the audit is divided and scheduled throughout the time interval is up to the organization and will be a function of minimizing disruption to site operations and resource needs. The only requirement is that the full audit be completed within the frequency established in the procedures under the Australian Audit Standards. One of the requirements regarding frequency is that how often an area is audited be in part a function of prior audit results. This means that the planned frequency may change with time based on what auditors are finding.
How long each audit takes is again a function of resource needs and operations. It is recommended, however, that any individual audit event not be protracted over a long period. The longer a task takes, the easier it is to get distracted and lose focus.
When developing an audit plan, it is wise to consider the three C’s of the Australian Audit Standards in regards to Information Systems auditing: Conformance, Consistency, and Continual Improvement. Conformance relates to addressing each of the requirements of the standard, i.e., the “shalls”. Consistency relates to how well each procedure or process of the Information Systems relates to the others. In other words, do objectives and targets reflect the policy commitments? Are personnel trained on the correct legal and other requirements? Finally, Continual Improvement requires that the system lead to improvements in the system itself as well as with financial performance. A system that has all the prerequisite procedures, but remains static, is not in conformance.

With the three C’s in mind, one now sees why it is best to audit all applicable elements of the standard in a given area at one time, rather than tracing any one standard element throughout various areas.

The training will teach information systems auditors how to create an audit plan with elements that are in compliance with the aims of the Australian Audit Standards:
· the audit objectives and scope;
· the audit criteria;
· identification of the auditee’s organizational and functional units to be audited;
· identification of the functions and/or individuals within the auditee’s organization having significant direct responsibilities regarding the auditee’s Information Systems;
· identification of those elements of the auditee’s Information Systems that are of high audit priority;
· the procedures for auditing the auditee’s Information Systems elements as appropriate for the auditee’s organization;
· the working and reporting languages of the audit;
· identification of reference documents;
· the expected time and duration for major audit activities;
· the dates and places where the audit is to be conducted;
· identification of audit team members;
· the schedule of meetings to be held with the auditee’s management;
· confidentiality requirements;
· report content and format, expected date of issue and distribution of the audit report;
· document retention requirements.
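As a sketch only, the plan elements listed above could be captured in a simple structure. Every field name and value here is illustrative rather than mandated by any standard.

```python
from dataclasses import dataclass

@dataclass
class AuditPlan:
    objectives: str         # why the audit is being conducted
    scope: str              # what part of the organization is audited
    criteria: list          # e.g. AUS clauses, ISO 27001 controls
    units_audited: list     # organizational/functional units
    key_contacts: list      # individuals with direct IS responsibilities
    priority_elements: list # high audit priority elements
    reference_documents: list
    team: list              # audit team members
    schedule: dict          # dates, places, durations of major activities
    meetings: list          # scheduled meetings with auditee management
    confidentiality: str
    report_format: str      # content, format, issue date, distribution
    retention: str          # document retention requirements

plan = AuditPlan(
    objectives="Demonstrate conformance with AUS 315/330 requirements",
    scope="Finance department, head office site",
    criteria=["ISO 27001 A.9", "COBiT DS5"],
    units_audited=["Accounts payable", "Payroll"],
    key_contacts=["IS manager"],
    priority_elements=["Access control"],
    reference_documents=["IS manual v2"],
    team=["Lead auditor", "Regulatory specialist"],
    schedule={"start": "2008-08-01", "duration_days": 3},
    meetings=["Opening meeting", "Close-out meeting"],
    confidentiality="Internal use only",
    report_format="Written report, issued within 14 days",
    retention="7 years",
)
print(plan.scope)
```

Keeping the plan in one structured record makes it easy to circulate to the auditee and client before the audit begins.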

If the audit is to proceed smoothly, it is helpful for the auditor to establish a dialogue prior to the actual audit with the person responsible for the area being audited as well as the financial auditor. This dialogue may be conducted by memo, telephone, or during a formal or informal meeting. The main factor that should influence the auditor’s choice of method for setting up this dialogue should be the department’s normal style or culture. Irrespective of the method of communication the auditor adopts, the following points should be established:
· The overall duration of the proposed audit
· The starting location and time
· The proposed scope and areas to be covered by the audit
· A timetable for approximate progress of the audit where applicable, e.g., if a number of different departments or geographical areas are to be included in the scope of the audit
· The arrangements for any close out meeting where the findings of the audit can be agreed and corrective action requirements discussed
· The personnel liable to be involved at each stage of the audit

If an auditor does not give sufficient attention to ensuring that clear agreement is reached with respect to the above points, the potential for misunderstandings that can affect the conduct of the audit is greatly increased. However, these initial communications with the personnel of the area being audited not only affect the “tone” of the forthcoming audit, but they can significantly influence the commitment and level of cooperation shown by that area throughout the audit process and for many subsequent audits.

Once the plan is prepared but prior to commencing the audit, the audit team assignments are made and working documents are defined. Working documents are those documents, such as observation logs and checklists, that are used during the audit to collect evidence but are not necessarily retained as records. In other words, they may be discarded after the audit is complete and the report prepared.

Of these, only the checklist should require an input at this stage from the auditor. However, before compiling a checklist, the auditor must determine if the function and format of the checklist are prescribed by the audit procedure or whether personal preference can be exercised.

The format of the checklist may vary considerably, depending on whether it is intended to act as an aide or as a part of audit records showing the scope and conduct of the audit. The former may consist only of general topics to be covered during the audit, whereas the latter may be an extensive and detailed questionnaire on which details of sampling and answers to the questions are to be recorded.

The need for checklists, and the type appropriate, will vary according to the experience of the auditors and the culture of the department. It is recommended that, for the purposes of the audit, checklists, even if limited, should always be developed. However, standard questionnaire-type checklists not prepared by the auditor, which must be slavishly followed and completed, should be avoided. This latter type is likely to result in an unnecessary restriction in the scope of the audit and a stifling of auditor initiative.

Although an auditor should always work within the scope defined for the audit, the working documents must not be designed so that they restrict additional audit activities or investigations that may become necessary as a result of information gained during the audit. There are differences of opinion over whether it is preferable to create the checklist anew or whether a previously developed checklist can be used. Although the former is desirable in principle, it is not always practical in terms of the best use of the resources available. The best compromise is to utilize whatever available checklists are already in existence, but to review these critically against the relevant documents previously identified. In this way, time can be saved in using them as a foundation without detracting from effectiveness.

Collecting Evidence
Having established with the auditee and client the scope of the audit, now is the time to undertake an initial review of the related documentation, which will normally consist of:
· The Information Systems manual and the procedures applicable to the area being audited and any related documentation
· Regulatory documents and specifications that typically apply in the area being audited
· The findings of the last audit of the area and any available audit checklists relating to that area
· Any records of corrective action analysis relating to that area

The examination of the Information Systems manual and procedures undertaken at this early stage is a general review rather than an in-depth study essential for checklist compilation. At this stage the auditor should confirm the adequacy of the proposed scope, e.g., if a manager has provided him with an audit schedule which references various procedures but does not include one which the auditor considers essential to the operations in the area being audited. The auditor, in undertaking this general review, should also consider how much time is necessary to prepare the required checklists and to perform the audit, and confirm that this is compatible with the actual time available. Lastly, the auditor must satisfy himself that the business systems and/or technology involved in the area being audited are not so unfamiliar to him that they undermine his ability to conduct the audit.

The foundation of a good audit is effective evidence gathering. The ultimate interpretation of the data to develop findings will only be as good as the raw data. The auditing planning process described above was in part intended to identify the criteria and decide what information must be collected to verify conformance. This leads to the conclusion that the auditor must be aware of not only what the requirement is, but what type of information will be appropriate to verify conformance.

Orienting Oneself to Audit
To be most effective, the auditor should be somewhat familiar with the specific area they will be auditing. This familiarization goes into more depth than the audit plan. For example, proper preparation will include knowing an area’s significant aspects, objectives and targets, monitoring and measurement needs, and supporting documentation. Documentation can include reference documents, work instructions, procedures, records, and calibration procedures. This needs to be aligned to the objectives and focused on the requirements in the Australian Audit Standards.

As this was quickly thrown together, there are a few lines paraphrased from the Australian Auditing Standards, COBIT (ISACA) and ISO 27001 that have not been cited line by line.

Sunday, 13 July 2008

Errors in Evidence

There are important reasons for using a good forensic methodology and keeping an open mind. As forensic practitioners, it is not our job to judge another. Rather, we provide evidence. This should occur in a fair and unbiased manner.

Saks & Koehler (2005) found that the following factors played a significant role in the wrongful conviction of 86 people who were later exonerated through DNA evidence:

  1. 71% Eyewitness error
  2. 63% Forensic science testing errors
  3. 44% Police misconduct
  4. 28% Prosecutorial misconduct
  5. 27% False/misleading testimony by forensic scientists
  6. 19% Dishonest informants
  7. 19% Incompetent defense representation
  8. 17% False testimony by lay witness
  9. 17% False confessions
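Note that these percentages sum to well over 100% because several factors typically co-occur in a single wrongful conviction. A quick calculation (figures taken directly from the list above) makes this clear:

```python
# Percentage of the 86 exonerated people affected by each factor (Saks & Koehler 2005).
factors = {
    "Eyewitness error": 71,
    "Forensic science testing errors": 63,
    "Police misconduct": 44,
    "Prosecutorial misconduct": 28,
    "False/misleading testimony by forensic scientists": 27,
    "Dishonest informants": 19,
    "Incompetent defense representation": 19,
    "False testimony by lay witness": 17,
    "False confessions": 17,
}

total = sum(factors.values())
print(f"Sum of percentages: {total}%")  # prints "Sum of percentages: 305%"

# Approximate headcounts out of the 86 cases:
for name, pct in factors.items():
    print(f"{name}: ~{round(pct / 100 * 86)} people")
```

So forensic science testing errors alone correspond to roughly 54 of the 86 cases, and eyewitness error to roughly 61; most cases involved several failures at once.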

It is important to note that forensic science testing errors and false or misleading evidence from forensic experts played a large part in the wrongful convictions of over 60 of the 86 wrongly convicted people.

Forensic science is not there to prove a point, it is there to scientifically examine evidence and provide the evidence available in a completely unbiased manner. This means reporting all of the facts. What many people in the industry forget is that you are working for the court and justice system, not the people who pay you.

[1] Saks, M. Koehler, J. (2005) "The coming paradigm shift in forensic identification science". Science Magazine, 309, Pp. 892-895