SANS Security Trend Line

A Conversation Around Supply Chain Integrity - Is There Any Real Way to Trust Products?

Bill Murray and I recently had a fun interchange on the topic of supply chain security, and he's agreed to let me reproduce it here.

The starting point was a comment I made in SANS Newsbites on this news item:

China Vetting Networking Gear

(May 22, 2014)

After the US Justice Department indicted five members of China's People's Liberation Army (PLA) for espionage, China has begun imposing inspection requirements for networking gear sold there. The US imposed similar restrictions on Chinese-made gear in 2012, essentially removing Chinese network equipment suppliers Huawei and ZTE from the US market.

 

Pescatore - I said last week "It's official, we are now in a Cyber Cold War" with China. During the Cold War with the USSR, the US was the dominant supplier of computing, networking and telecoms gear and had little to fear from the "Trade Wars" that come along with a Cold War. That is not true anymore - the US is just as dependent on global suppliers as everyone else. Supply chain integrity is important, but the UK approach of testing telecoms gear is a much more meaningful approach than trying to ban products from certain countries.

 

The "UK approach" I mentioned was the establishment (funded by Huawei) in the UK in 2010 of the Huawei Cyber Security Evaluation Centre to allow the UK CESG to examine and test all software to be used in Huawei equipment going into British national telecommunications networks. Contrast this to the current US approach: attempts by Congress to introduce legislation to ban use of telecomms equipment from China.

That led to this interchange between Bill Murray and myself:

From Bill: ... as much as I abhor any deliberately disorderly behavior in our space, I prefer "cyber war" to thermonuclear war. I think that those who equate them have not given either enough thought.

From John: I tend to agree that, if given the choice of war type, I would choose cyber war over what they now call "ballistic warfare."

That said, it used to be pretty easy to be sure that a TV made in China did not have an explosive device built in; it is not so easy these days to tell whether a logic bomb is built into that Samsung TV - or that Cisco switch.

From Bill: I have given a lot of thought to the Trojan Horse problem and I find the analogy both illustrative and instructive.

The defenses that would have protected Troy included looking inside the horse and quarantining the horse. Kudos to the Brits for trying, but the modern problem does not yield easily to either of these solutions. There are too many horses, the warriors are tiny, not to mention cloaked, and it is very dark inside.

Thirty years ago, when I first started to write on this subject (see pages 16-19 of the attached 1981 paper), I was hopeful about the small glass horse strategy, i.e., accept as gifts only those horses too small and transparent to conceal armed warriors. Again the analogy fails when the objects pass by the millions per day, and the hostile entity is very similar in appearance to friendly ones. A malicious "button" is likely to appear as innocent and attractive as any other.

That leaves us with the tried and true but expensive strategies of layered defenses, compartmentation and isolation, selective inspection at borders (ingress and egress), mutually suspicious processes, and some tolerance for error.

Incidentally, these measures are as ancient as Troy. They are still the best we have. On the other hand, if we do not use them any better than the Trojans did, we can expect similar results.

I have no silver bullet to offer; I have been looking long enough that I suspect that there is not one. In a world where "leadership" promises, indeed must promise, zero risk, one expects to be about as popular as the skunk at the garden party.

"That is why we are called professionals and are paid the big bucks."

From John: At the root of the problem, of course, is that Software Engineering is still an oxymoron. (Bill interjects: the two words should not be used in the same paragraph, much less right next to one another.) Since there is no "handbook of material strengths" for software, there is no way to know if the software is "strong" enough for its intended use.

Bill sputters: Nonsense. That is just the excuse we make for continuing to reuse code that we KNOW to be weak. Take Flash, Reader, IE, and huge chunks of Windows. We do it because it saves us work and because we like the features and properties, i.e., generality, flexibility, and backward compatibility. The only executive in the history of our industry who ever said we will not do that is Steve Jobs. It shows in the difference in the quality of his products. Not perfect, but an order of magnitude better than his competition.

John drones on: So, testing can't be perfect, but just as the car industry learned when it started to focus on supplier quality, I think you can raise the strength of inbound product inspection and/or supplier qualification high enough that security incidents become an acceptable cost of doing business - not zero, but an acceptable level.

Bill notes: Engineers do not "accept" weakness; they compensate for it. If they use materials of unknown strength, they do so in a manner that will not cause "the wings to come off."

John points out: In business mergers and acquisitions, a 60% failure rate seems to be acceptable - that is about the average outcome of the inexact "science" of pre-acquisition due diligence. Heck, HP lost more money on the failed acquisition of Autonomy than probably the past five years of credit card breaches combined!

Bill opines: Most of this is merely inspection, not testing. It is the equivalent of inspecting the welds in the wings. It may detect weakness but it does little to speak to whether or not the plane will fly as intended.

"Testing" began with every component of the plane. It continued through every assembly step. Yes, it included test flights but it certainly did not begin there.

Check this out: http://youtu.be/Ai2HmvAXcU0 - or this: http://youtu.be/f4LFErD-yls. "Engineers" do not whine. They do not blame their materials or their suppliers. Engineers design for component failure such that even multiple simultaneous component failures do not cause the plane to fall out of the sky.

John, running out of synonyms for "says," says: I think we are in violent agreement about software not being an engineering discipline yet. Great old (1990) paper by Mary Shaw here.

In retail, shrinkage (loss due to shoplifting, hijacking and employee theft) has averaged 1.5% of revenue for decades, and retailers spend another 1.5% of revenue to keep it at that level - a total of 3% of revenue going to shrinkage plus loss prevention is treated as an acceptable cost. They could spend 5% of revenue to drive shrinkage to zero - but that would mean they were actually losing more money.
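For readers who want the arithmetic worked out, here is a minimal sketch; the revenue figure is made up, and the percentages are the ones quoted above:

```python
# Back-of-the-envelope version of the retail shrinkage trade-off.
# The $10B revenue figure is hypothetical; the 1.5%/1.5%/5% rates are
# the ones cited in the paragraph above.
revenue = 10_000_000_000  # hypothetical annual revenue, in dollars

# Status quo: tolerate 1.5% shrinkage and spend 1.5% on loss prevention.
shrinkage_loss = 0.015 * revenue
prevention_spend = 0.015 * revenue
accept_some_loss = shrinkage_loss + prevention_spend   # 3% of revenue

# Alternative: spend 5% of revenue to drive shrinkage to zero.
drive_to_zero = 0.05 * revenue                          # 5% of revenue

print(f"Tolerate some shrinkage: ${accept_some_loss:,.0f} (3% of revenue)")
print(f"Drive shrinkage to zero: ${drive_to_zero:,.0f} (5% of revenue)")
# Eliminating the loss costs more than living with it - the same
# trade-off John argues applies to software inspection and testing.
```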

Software testing with today's automated tools (augmented with human expertise) is probably already better than M&A's 60% failure rate, though probably not as good as retail's 1.5%! But the companies that have pushed it onto their external software developers seem to have seen success, and the UK Huawei testing program seems to work. I spoke with its Director; he knows the approach doesn't scale to every piece of software the UK buys, but for the high-risk stuff it does.
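As a thought experiment (not anything John or Bill describes), here is a toy sketch of what an "inbound product inspection" gate on third-party software might look like; the scanner command and the acceptance thresholds are entirely hypothetical:

```python
# Toy sketch of an "inbound inspection" gate for third-party software.
# The scanner command and the thresholds are purely hypothetical - the
# point is the policy: inspect at ingress, accept a non-zero but bounded
# level of risk, and reject anything above it.
import json
import subprocess
import sys

MAX_HIGH_SEVERITY_FINDINGS = 0    # hypothetical acceptance thresholds
MAX_MEDIUM_SEVERITY_FINDINGS = 5

def inspect_component(path: str) -> bool:
    """Run a (hypothetical) static analysis scanner and apply the policy."""
    result = subprocess.run(
        ["example-scanner", "--format", "json", path],  # placeholder tool
        capture_output=True, text=True, check=True,
    )
    findings = json.loads(result.stdout)
    high = sum(1 for f in findings if f["severity"] == "high")
    medium = sum(1 for f in findings if f["severity"] == "medium")
    return high <= MAX_HIGH_SEVERITY_FINDINGS and medium <= MAX_MEDIUM_SEVERITY_FINDINGS

if __name__ == "__main__":
    component = sys.argv[1]
    if not inspect_component(component):
        sys.exit(f"{component} exceeds the acceptable-risk threshold; rejecting")
    print(f"{component} accepted")
```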

(certainly to be continued, the Sun may go out before this discussion ever ends...)
