Archive for December, 2007


More on Mac Security


So it appears Gartner has something to say about Mac security too.  Here is an interesting article building on the Mac security issue.  It's just a matter of time before a major attack hits the Mac platform.  Another interesting tidbit is that the article points out that "Mac's generally have to be patched one at a time".  Don't get me wrong, using both Macs and PCs can be good if the overall strategy supports security, but the key here is not to have a false sense of security.

 

 http://news.yahoo.com/s/infoworld/20071228/tc_infoworld/94177;_ylt=AmF8ijFNlThIuDkLJJ6MHJEE1vAI


An article was recently published about the Army adding Macs to improve security.  Although diversifying vendors will usually make you more secure if used to support a defense-in-depth strategy, the context of the article suggests a lack of knowledge or evidence to support the statements made on the Army's part. 

Article in Full:

http://news.yahoo.com/s/nf/20071224/bs_nf/57382;_ylt=AtIAHN4BI3dTDzpNM.n7xA8E1vAI

 

There is one particular statement that is worrisome: the Army security spokesperson is quoted as saying "Apple's version of Unix is inherently more secure than Windows".  Now I don't claim to know all the facts, but if you look at the links provided below, Mac OS X falls behind in 2007, and in 2004 has fewer advisories but remains roughly equal percentage-wise in the number of critical vulnerabilities.

 

2007 Stats:

http://blogs.zdnet.com/security/?p=758

2004 Stats:

http://www.techworld.com/security/news/index.cfm?newsid=1798

 

Fortunately the article has a counter argument by Charlie Miller at the end, arguing that the Army needs to step it up with more than Macs when it comes to security strategy.  He comments about Mac being "behind the curve in security".  He also has a great reference stating "In the story of the three little pigs, did diversifying their defenses help? Not for the pig in the straw house."  On the other hand, diversifying is good if you use one product to back up the function of another product in the event one fails.  So even though the pig's straw house was destroyed, if that third pig could get to the brick house it would still survive.


There is an article that came out earlier from DRJ (Thomas L. Weems) based on a study that provides guidelines on the required geographical distance for alternate site locations.  This is good news for those performing risk assessments where this is considered a vulnerability, because as far as I know FEMA has provided no specific guidelines. 

http://www.drj.com/articles/spr03/1602-02.html (registration required to view)

Ideally 105 miles point to point is the key number covering all the threats listed below.  For those who don't have access to the article, below is a breakdown of the recommended geographical distances by threat.

NOTE: The article provides a graph, so the numbers below are based on my interpretation of the graph.

Alternate Site Distance Recommendations (miles)

Hurricane:  105
Volcano:   75
Snow/Sleet/Ice:  70
Earthquake:  60
Tsunami:  52
Flood:   48
Military Installation: 45
Forest Fire:  42
Power Grid:  36
Tornado:  35
Central Office:  29
Civilian Airport: 28
None of the Above: 21

Off Site Storage Facility Distance Recommendations (miles)

Hurricane:  85
Volcano:  64
Snow/Sleet/Ice:  56
Tsunami:  45
Earthquake:  43
Flood:   43
Military Installation: 41
Forest Fire:  38
Power Grid:  36
Central Office:  25
Tornado:  24
None of the Above: 24
Civilian Airport: 22
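For anyone who wants to apply these figures programmatically, here is a minimal sketch in Python (the tables encode my interpretation of the article's graph, as noted above, and the function names are my own). The binding recommendation for a site is simply the largest distance among the threats that apply to its location:

```python
# Recommended minimum distances (miles) from the primary site, per threat,
# as read from the DRJ article's graph (my interpretation of the figures).
ALTERNATE_SITE_MILES = {
    "hurricane": 105, "volcano": 75, "snow_sleet_ice": 70, "earthquake": 60,
    "tsunami": 52, "flood": 48, "military_installation": 45,
    "forest_fire": 42, "power_grid": 36, "tornado": 35, "central_office": 29,
    "civilian_airport": 28, "none_of_the_above": 21,
}

OFFSITE_STORAGE_MILES = {
    "hurricane": 85, "volcano": 64, "snow_sleet_ice": 56, "tsunami": 45,
    "earthquake": 43, "flood": 43, "military_installation": 41,
    "forest_fire": 38, "power_grid": 36, "central_office": 25, "tornado": 24,
    "none_of_the_above": 24, "civilian_airport": 22,
}

def required_distance(threats, table):
    """The binding recommendation is the largest recommended distance
    among the threats that apply to the primary site's location."""
    return max(table[t] for t in threats)

def site_ok(distance_miles, threats, table):
    """True if the candidate site meets or exceeds the recommendation."""
    return distance_miles >= required_distance(threats, table)
```

For example, a primary site exposed to both hurricanes and floods needs an alternate site at least 105 miles away, so `site_ok(90, ["hurricane", "flood"], ALTERNATE_SITE_MILES)` comes back false.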

Also, the key here is to remember that the off site storage facility should be accessible from the alternate site facility, which is a mistake many organizations make.

Problems and Revisions

Based on some quick research there are a few problems with the current distances above.  I took three common disasters, did a quick analysis, and here are the results along with some suggested changes.

Hurricane – Katrina spanned a much larger distance than 105 miles, proving that this distance is not adequate in a very large hurricane.  The article below explains that Katrina expanded over 780 miles, although the outer regions were probably only affected by rain.  However, from my research severe damage covered about a 200-mile radius.  Therefore, I would suggest doubling the current metric to 210 miles.

http://earthobservatory.nasa.gov/NaturalHazards/shownh.php3?img_id=13083

Volcanoes – Although the current figure will probably be fine in most cases, there is information to support that volcanoes can spread ash up to 100 miles, as shown in the article below.  Therefore, this number should be revised to 105 miles based on the type of volcano.

http://pubs.usgs.gov/gip/volc/types.html

Earthquake – Similar to the volcano, this distance will probably be sufficient, but why take the chance when there is evidence that a 7.8 earthquake ruptured 220 miles of a fault?  Therefore, this number and the definition should be clarified to be at least 60 miles from a major fault line.

http://www.earthquakecountry.info/roots/shaking.html
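The three suggested revisions can be expressed as an overlay on the original figures. A sketch (the dictionary names are mine, and note that the revised earthquake figure is measured from the nearest major fault line rather than simply point to point):

```python
# Original alternate-site recommendations (miles), from the DRJ graph.
ALTERNATE_SITE_MILES = {"hurricane": 105, "volcano": 75, "earthquake": 60}

# Suggested revisions from the quick analysis above.
REVISED_MILES = {
    "hurricane": 210,  # Katrina: severe damage over roughly a 200-mile radius
    "volcano": 105,    # ash can travel up to ~100 miles, per the USGS article
    # earthquake: keep 60, but measure from the nearest major fault line
}

def revised_distance(threat):
    """Return the revised recommendation, falling back to the original."""
    return REVISED_MILES.get(threat, ALTERNATE_SITE_MILES[threat])
```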


On strategic risk assessments, not testing anti-virus signatures before they are deployed should be considered a vulnerability.  Many of my customers believe this is ridiculous and not practical; however, I report it anyway.   Whatever the case, the organization can decide to accept the risk, as I am only there to point it out.  There is a great example published where a routine update caused serious problems, forcing customers to re-install the operating system.

 http://news.yahoo.com/s/zd/20071206/tc_zd/221141;_ylt=AhIN_X.SMrgYGlzdK7zmNe8E1vAI
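A minimal sketch of what pre-deployment signature testing could look like (all names here are hypothetical, not any vendor's API): scan a corpus of known-clean files, such as core operating system files like the ones the update in the article broke, with the new signature set in an isolated staging environment, and block the rollout on any false positive.

```python
def false_positives(scan_results, known_clean):
    """scan_results maps file path -> True if the new signature set flagged
    it as malicious; known_clean is the set of files that must never be
    flagged (e.g. core operating system files)."""
    return sorted(f for f in known_clean if scan_results.get(f, False))

def approve_rollout(scan_results, known_clean):
    """Gate the signature update: deploy only if nothing clean was flagged."""
    return len(false_positives(scan_results, known_clean)) == 0

# Example: an update that flags a core OS file fails the gate, while still
# flagging the actual test malware sample.
results = {"C:/Windows/System32/kernel32.dll": True, "C:/temp/sample.com": True}
clean = {"C:/Windows/System32/kernel32.dll"}
assert approve_rollout(results, clean) is False
```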

So you decide.  Should anti-virus software be tested before deployment?
