Friday, 12 December 2008

Password Changing

I have been quantifying IS risk for some time now. On that note, I have a paper on password security that I am writing for next year. Contrary to the common myth, changing passwords turns out not to be a good idea.

Forcing password changes is a cost. The only events where a password change should be mandated are:

  1. Known or “reasonably certain” compromise
  2. Shared devices

Adding effective password controls is more effective than forcing password changes. In fact, the benefit of a password-changing regime is negligible, and such a regime even reduces the complexity of the average password.

There are many reasons for this, and the value of the cost function changes with the technical capability and awareness levels of the user (interestingly, with a risk in the end tail for highly aware users). I will be detailing these with a quantitative model next year. The effects are something like the poor ASCII art demonstrated below. In this graph, the x-axis represents technical capability generally (not just security capability); the y-axis is the mean cost associated with a password change process.

*
***
****
******* *
************ *****
*************** *********
*************************
*************************
*************************
-------------------------


Cost functions are measured in both pure economic terms and survival times (using a lambda function). What can be demonstrated is that there is an inflection point of knowledge where experienced and knowledgeable users start to bypass the controls and erode the benefit of the control.
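
As a rough sketch of the survival-time side only (the exponential assumption and the symbols below are illustrative placeholders for the fuller model to come): if compromise of a given password is treated as arriving at a constant rate \lambda, then

S(t) = e^{-\lambda t}, \qquad E[T] = 1/\lambda

and the expected cost of a regime with n forced changes over a horizon T can be compared along the lines of

E[C] \approx n \, c_{change} + (1 - e^{-\lambda T}) \, c_{breach}

where c_{change} captures the helpdesk resets, lockouts and lost time of each change, and c_{breach} the expected loss from a compromise.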

The benefit of not changing passwords decreases after 12 months to a near-zero difference (that is, if password changing is to be mandated at all, the optimal interval is 12 months).

The benefits also increase as password length increases; this starts to level off beyond 12-character passwords.

One thing noted is that users form password patterns that can be guessed and even used to determine other passwords (including from encrypted data). Even with complexity requirements, password changing at 30-day intervals has demonstrated that over 65.4% (SE +/- 4.56, Australian companies only tested) of users maintain what is in effect the same password with cosmetic changes based on some formula (e.g., passw0rd1!, passw0rd2!, passw0rd3!).
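
To see why such formulaic changes add so little, here is a minimal shell sketch (using the example pattern above; a real attacker would of course try a far wider set of mutations) of how one recovered password lets the rest of the sequence be enumerated:

#!/bin/sh
# Sketch only: given one recovered password following the base+counter+symbol
# formula above, generate the obvious candidates for the "changed" passwords.
BASE="passw0rd"              # recovered base word (from the example above)
i=1
while [ $i -le 12 ]
do
    echo "${BASE}${i}!"
    i=$(( i + 1 ))
done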

These passwords are also generally reused on multiple systems, and a common component of them can remain on systems where no change occurs, aiding in cracking the others.

The data is based on an analysis of the practices of over 300 companies (from SMEs to listed companies). The total user base tested is over 115,700 people, and the data has been collected over 4 years.

What wine?

Well, to add to the qualification list, I have completed my wine course. I do not think I will be giving up the current job to become a sommelier, however.



Of course, it took a good deal of study and trials of many wines not on the course list. I also had to do additional study, trialing wines from many locations and of different types of grape.



Being a good student is simple when it involves drinking good wine...

Wednesday, 10 December 2008

Adding verification to a script

In the dd example the other day, the output file was written without checking whether it already existed. The following is an example of how you can add a small amount of script to verify that you are not overwriting an existing file:

# Check whether the target file exists before writing to it
if [ -f "$FILE" ]
then
    echo "The file [$FILE] that you are seeking to write already exists"
    # printf is used so the prompt stays on one line (echo "\c" is not portable)
    printf "Do you want to overwrite the existing file? (y/n) : "
    read RESPONSE
    if [ "$RESPONSE" = "n" ] || [ "$RESPONSE" = "N" ]
    then
        echo "The file will not be overwritten and the process will abort!"
        exit 1
    fi
fi
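
For instance, in a script along the lines of the earlier dd example, the check sits above the write step once the destination has been set. The names and dd parameters below are placeholders rather than the original example:

#!/bin/sh
SOURCE=/tmp/input.img        # placeholder input file
FILE=/tmp/output.img         # placeholder output file; this is what the check protects

# ... the existence check above goes here ...

# Only reached if $FILE did not exist or the user chose to overwrite it
dd if="$SOURCE" of="$FILE" bs=512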

It is also a good idea to use full paths in a script. Users can change the path variables they are exposed to, and unless you set these (either explicitly or by adding a profile for the script to use), an attacker could get a system script to run their own binary.
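
As a minimal sketch of setting the path explicitly (the directories and the final command are just illustrative defaults, not a recommendation for every system):

#!/bin/sh
# Set a known-good PATH rather than trusting whatever the caller exported
PATH=/bin:/usr/bin:/sbin:/usr/sbin
export PATH

# Alternatively (or as well), call system binaries by their full path
/bin/ls -l /var/log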

More on this another time.

Tuesday, 9 December 2008

DATs and the future of Audit and Assurance

Say's law of economics shows us that gains in productivity offset any economic equilibrium, leaving the general state of the economy one of flux or change. In this environment, the undertakings that survive are those that embrace change. This requires entrepreneurial thought and constant innovation.

In contradiction to the common belief that entrepreneurs necessarily start new businesses, Say's definition of an entrepreneur was one who shifts the means of production from less productive to more productive enterprises. In this sense, the entrepreneur is anyone who increases an undertaking's productivity.

Change does not happen as quickly as people believe. Through the nature of compound interest, small incremental changes produce large subsequent results. Currently, and for a number of years, the big four international accounting firms have been engaging in technology research and productivity innovations that have delivered between 4 and 6% each year for the last decade. This may seem small, but a 5% yearly compounding rate over the last 10 years amounts to an incremental 50%+ increase in productivity from just a decade ago.
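
As a back-of-the-envelope check, taking 5% as the representative annual rate:

(1 + 0.05)^{10} \approx 1.63

that is, roughly a 63% cumulative gain over the decade, comfortably more than 50%.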

From my observations, the accounting and audit industry more broadly would seem to be increasing its productivity at a rate of between 1 and 3% per annum. At this rate, not only will organizations that are not growing fail to maintain equilibrium in the long run (the big four currently exceed this rate), but within a decade, small to medium firms will likely lose up to 50% of their business to them.

Lemar Swinney of KPMG and several groups within PWC are actively researching “the future of the financial audit”. John Fogarty directs Deloitte's "third generation audit", which is focused on a similar line. He has been quoted as saying: “Web-based audits. In the future, a company's financial accounts and data will be completely digitized. The Web will act as host. That will allow auditors to sit in one location and access all necessary corporate information and transactions.” While the technology for this exists and while there are small-scale experiments under way, Swinney believes widespread Web-based audits are "realistically six, seven years down the road".

Existing research has resulted in advanced CAAT technologies now known as DATs (digital audit techniques). DATs (research available on request) are consistently detecting over 90% of all financial statement frauds. The big four firms are starting to implement these technologies. Some of the Mid-Tiers are following suit (and I am trying to lead this change). But what of the others?

These technologies will be in wide-scale commercial use within the next decade.

DATs have also shown an accuracy of over 96% in the analysis of non-fraud financial statements. Where teams are developed that combine traditional audit techniques with these advanced technologies and mathematical formulations, the accuracy has exceeded 99.8%.

Current figures put traditional audit techniques at a level of 8% accuracy in the determination of financial statement fraud.

There has been a lot of discussion concerning productivity of late. Most audit firms operate as isolated pockets of technical skill. We hold our skills close to ourselves and do not share them. We do not seek ways to work together.

Not only are DAT-based audits more accurate, but they are faster and more productive. This is not an incremental improvement; studies have shown that they are capable of being up to 90% more productive than existing audit techniques.

To make these types of productivity gains, we don’t need to work harder; we need to follow the oft-stated idiom and work smarter. We need to look at working with each other and think about how we can better implement technology.

These techniques are not going to go away. Change is pervasive; either we embrace it in an entrepreneurial manner or it will steamroll us.

This is a call for change and progress in the industry as a whole!

Monday, 8 December 2008

Why care about the ability to reverse a file?

In my prior post I looked at using "dd" to reverse a file (bit by bit or by sectors). This is of value for a number of reasons:

  1. Attackers could do this to bypass filters, controls and other protections
  2. Anti-forensics: finding the needle in a haystack is difficult, especially when the tools do not help
  3. Pen testing: just as in point 1 for attackers, the tester can use this to load tools without being detected by filters or by malware detection engines

Once a file has bypassed the perimeter controls, getting it to work inside an organization is simple. Hence a means to bypass controls is of interest to those on the attack side of the equation (both validly and less so).

Next, it is a concern to the forensic professional. Hiding files by reversing them makes the process of discovery a proverbial search for the needle in a haystack.

An interesting effect to try is to maintain the header on a bitmap file (i.e., skip the first portion of the file and reverse the later parts). What ends up occurring is that the image can be recreated upside down. All sorts of interesting effects can be found.
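
As a minimal sketch of that header-preserving trick (the file names and the 512-byte block size are assumptions for illustration, and the first block is simply copied unchanged rather than parsing the bitmap header):

#!/bin/sh
IN=image.bmp                 # input file (illustrative name)
OUT=reversed.bmp             # output file (illustrative name)
BS=512                       # block ("sector") size used for the reversal

SIZE=$(wc -c < "$IN")
BLOCKS=$(( (SIZE + BS - 1) / BS ))

# Copy the first block unchanged so the file header stays intact
dd if="$IN" of="$OUT" bs=$BS count=1 2>/dev/null

# Append the remaining blocks in reverse order (last block first)
i=$(( BLOCKS - 1 ))
while [ $i -ge 1 ]
do
    dd if="$IN" bs=$BS skip=$i count=1 2>/dev/null >> "$OUT"
    i=$(( i - 1 ))
done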

As always, the cards are stacked in favor of the attacker. When a contest pits rules against open morality, rules lose more often than not. This does not mean that we give up, only that we have to understand the odds stacked against us, and that people naturally err. This is when we (the "good" guys) win.