[fc-discuss] Financial Cryptography Update: Open Source Insurance, Dumb Things, Shuttle Reliability

iang@iang.org
Sat, 10 Sep 2005 16:25:41 +0100 (BST)


 Financial Cryptography Update: Open Source Insurance, Dumb Things, Shuttle Reliability 

                           September 10, 2005


------------------------------------------------------------------------

https://www.financialcryptography.com/mt/archives/000549.html



------------------------------------------------------------------------

(Perilocity reports that) Lloyd's and OSRM are to issue open source
insurance, covering (among other things) being attacked by commercial
vendors over IP claims.

http://riskman.typepad.com/perilocity/2005/08/open_source_llo.html
http://www.channelregister.co.uk/2005/08/12/opensource_indemnification/

(Adam -> Bruce -> ) an article, "The Six Dumbest Ideas in Computer
Security," by Marcus Ranum.  I'm not sure who Marcus is, but his name
keeps cropping up - and his list is pretty good:

#1) Default Permit (open by default)
#2) Enumerating Badness (cover all those we know about)
#3) Penetrate and Patch
#4) Hacking is Cool
#5) Educating Users
#6) Action is Better Than Inaction

http://www.ranum.com/security/computer_security/editorials/dumb/
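
To make #1 and #2 concrete, here is a minimal sketch of the difference
between a blacklist that enumerates badness and an allowlist that
denies by default.  The service names and lists are illustrative only,
not from Ranum's article:

    # Two ways to decide whether to let a connection through.
    KNOWN_BAD = {"telnet", "rsh", "finger"}   # enumerating badness (#2)
    KNOWN_GOOD = {"https", "ssh", "smtp"}     # enumerating goodness

    def default_permit(service):
        # Dumb idea #1: anything not on the bad list gets through,
        # including every attack invented after the list was written.
        return service not in KNOWN_BAD

    def default_deny(service):
        # The alternative: only what we explicitly need gets through.
        return service in KNOWN_GOOD

    print(default_permit("shiny-new-worm"))   # True  - oops
    print(default_deny("shiny-new-worm"))     # False - denied by default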

I even agree with them, although I have my qualms about these two
"minor dumbs:"

 * "Let's go production with it now and we can secure it later" - no,
you won't. A better question to ask yourself is "If we don't have time
to do it correctly now, will we have time to do it over once it's
broken?" Sometimes, building a system that is in constant need of
repair means you will spend years investing in turd polish because you
were unwilling to spend days getting the job done right in the first
place. *

The reason this doesn't work is basic economics.  You can't generate
revenues until the business model is proven and working, and you can't
secure things properly until you've got a) the revenues to do so and b)
the proven business model to protect!  The security field is littered
with businesses that secured themselves properly from the start, but
very few of them were successful, and often that was reason enough for
their failure.

Which is not to dispute the basic logic that most production systems
defer security until later, and later never comes ... but there is an
economic incentive at work here that more or less explains why: only
valuable things get secured, and a system that is not in production is
not valuable.

 * "We can't stop the occasional problem" - yes, you can. Would you
travel on commercial airliners if you thought that the aviation
industry took this approach with your life? I didn't think so. *

There are several errors here, starting with a badly formed premise
leading to can/can't arguments.  Secondly, we aren't in general risking
our lives, just our computers (and identities, of late...).  Thirdly,
it's a risk-based thing - there is no Axiomatic Right From On High that
you *have* to have secure computing, nor be able to drive safely to
work, nor fly.

Indeed, no less than Richard Feynman is quoted (to support #3), in a
piece about how to deal - and misdeal - with the occasional problem.

http://www.ranum.com/security/computer_security/editorials/dumb/feynman.html

"Richard Fenyman's [sic] "Personal Observations on the Reliability of
the Space Shuttle" used to be required reading for the software
engineers that I hired. It contains some profound thoughts on
expectation of reliability and how it is achieved in complex systems.
In a nutshell its meaning to programmers is: "Unless your system was
supposed to be hackable  then it shouldn't be hackable."

Feynman found that the engineering approach to Shuttle problems was
(often or sometimes) to rewrite the procedures.  Instead of fixing the
problems, the engineers would move them into the safety zone
conveniently created by the design tolerances.  Insert here the normal
management pressures, including the temptation to call the reliability
1 in 100,000 when 1 in 100 is more likely!  (And even that seems too
low to me.)
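
The gap between those two estimates is easy to see with a little
back-of-envelope arithmetic (a Python sketch, assuming independent
flights, which is itself a simplification):

    # Chance of at least one loss over N flights, for two per-flight
    # failure estimates.  Independence between flights is assumed.
    def p_any_loss(per_flight, flights):
        return 1 - (1 - per_flight) ** flights

    for estimate in (1.0 / 100000, 1.0 / 100):
        print(estimate, p_any_loss(estimate, 100))
    # 1 in 100,000 per flight -> ~0.1% chance of a loss in 100 flights
    # 1 in 100     per flight -> ~63% chance of a loss in 100 flights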

Predictably, Feynman suggests not doing that, and finishes with this
quote:

"For a successful technology, reality must take precedence over public
relations, for nature cannot be fooled."

A true engineer :-)

-- 
Powered by Movable Type
Version 2.64
http://www.movabletype.org/