Posted by: David Harley | September 16, 2011

Before you get to the blogs further down…

[Updated on 12th April 2014.]

…welcome! Here’s a quick overview of the geography of a Small Blue-Green World. This is the Small Blue-Green Blog, but it’s more accurate to think of it as the gateway to the various blogs and bits and bobs that constitute the SBGW presence on the web. Essentially, this is a consultancy offering services to the security industry, launched by David Harley in 2006 and with one main customer (ESET). This particular page isn’t maintained very regularly: it currently has no commercial/advertising function, but it includes some papers/resources that may not be available elsewhere. The blogs linked here, however, especially those to which I contribute on ESET’s behalf, are maintained regularly.

My work for ESET

The services I provide to ESET are quite wide-ranging, but they include blogging on the ESET blog page. I stopped contributing to SC Magazine’s Cybercrime Corner some time ago, and that page seems to have been removed. I’ll be looking back over my articles for that venue to see which might usefully be republished. I do write fairly regularly for Infosecurity Magazine, primarily on Mac issues, but that may not continue. Other authoring and editing includes conference papers, white papers and so on. The ESET Threat Center and We Live Security pages include links to a range of resources.

More specifically, the ESET resources page includes white papers written specifically for ESET, papers submitted to external conferences and workshops on ESET’s behalf, links to articles written for outside publications and sites (again on ESET’s behalf), and ESET’s monthly threat reports, for which I often provide articles and editing. Some of my conference presentations are also available there as slide decks.

Some articles and conference papers can’t be posted on a commercial site for copyright-related reasons, so I tend to post them on this site instead. When I remember. Actually, most of that stuff is now posted to Geek Peninsula.

Other outlets

AVIEN (formerly the Anti-Virus Information Exchange Network), which was run as an independent organization by myself and Andrew Lee (and before that by Robert Vibert), is still hosted on its own web site and has its own blog page hosted there, but I’m no longer heavily associated with the organization except as an occasional blogger there. I do maintain (intermittently) a phone scam resources page there. I also run several other specialist security blogs completely independently of ESET. These include a blog focused on hoaxes, spam, scams and similar nuisances (thanks to ESET N. America CEO and long-time friend and colleague Andrew Lee, you can also access this as http://www.virushoax.co.uk), and another that focuses (mostly) on Apple malware: essentially, it’s the current incarnation of the old Mac Virus web site originally founded by Susan Lesch, and it includes contributions from Old Mac Bloggit, the well-known pseudonym.

We no longer host the AMTSO blog, and I don’t do any administration on the main AMTSO site any more. I do, however, maintain an independent AV-testing blog/resource called, imaginatively, Anti-Malware Testing, which archives most of the articles I originally posted on the old AMTSO blog – of course, they do not represent AMTSO’s official views. I also blog occasionally at other sites, including Infosecurity Magazine, (ISC)2 and Securiteam.

I used to flag current articles, papers, blogs and media coverage at The Geek Peninsula (some of this is also tweeted via http://twitter.com/DavidHarleyBlog/) but I was having trouble remembering to update it. I’m now using it as a repository for (most of) my papers, some of my articles, pointers to my current and past blogs, and so on. There are some other blogs associated with this site that aren’t security-oriented. Words and Music is used for – well, words and music: songs, verse, miscellaneous extra-curricular prose by David Harley. Out and out parodies are posted at Parodies Regained. There are also some photographs here and there: Postcards from is a selection from our huge collection of travel photographs, Flower Portraits is about astronomy (I’m kidding! It’s floral photographs!) and Words is other (mostly travel) writing with more photos, all maintained by Jude Harley. Much more to come. :)

David Harley
Small Blue-Green World
ESET Senior Research Fellow

Posted by: David Harley | March 1, 2013

Hoaxes and social media paper

Welcome to the Web 2.0 incarnation of the Misinformation Superhighway. Did you really think that hoaxing had died out?

Origin of the Specious: the Evolution of Misinformation is a white paper for ESET that’s just come out.

David Harley CITP FBCS CISSP
Small Blue-Green World
ESET Senior Research Fellow

Posted by: David Harley | February 28, 2013

Spamfighting and Hamfighting

This is an article from Virus Bulletin on Hamfighting, July 2006, made available here by kind permission of Virus Bulletin, which holds the copyright. (You can also read it in HTML on the Virus Bulletin site, but for that you need to be a subscriber – registration is free, though.)

It addresses the problem of legitimate mail (‘ham’) misdiagnosed as spam, with particular reference to aggressive filtering by Verizon. I’m putting it up here now because it has particular relevance to a post I’m putting together on Mac Virus. Brief extract from the introduction to the paper:

Complaints in various forums of poor email delivery service from the ISP seemed to be confirmed by claims from Verizon ‘insiders’ that a policy of rejecting mail by IP block resulted in the loss of all mail from large portions of Europe and Asia. This led to a much publicized class action, resulting in a settlement offer from Verizon to compensate customers who lost legitimate mail between October 2004 and May 2005.

In the near future I’ll probably be putting up some more papers and articles that aren’t available on my own sites, or external links where appropriate.

David Harley CITP FBCS CISSP
ESET Senior Research Fellow

Posted by: David Harley | May 21, 2012

PIN Holes: Passcode Selection Strategies

This is the second of the two papers I presented at the EICAR 2012 conference in Lisbon recently. As before, it’s posted here rather than on the ESET resources page for conference papers, in accordance with EICAR’s copyright stipulation that EICAR conference papers be posted on personal web sites.

PIN Holes: Numeric Passcodes and Mnemonic Strategies 

And here’s the abstract:

Recommendations on how to select and/or memorize a four-digit PIN (Personal Identification Number) can be found all over the Internet, but while we have learned a great deal from analyses of mixed-character passwords and passphrases revealed by high-profile breaches like the highly publicized Gawker and Rockyou.com attacks, there are no exactly equivalent attack-derived data on PIN usage. However, a sample of 204,508 anonymized passcodes for a smartphone application, ranked by popularity, gives us a starting point for mapping that ranking to known selection and mnemonic strategies.

Memorization strategies summarized by Rasmussen and Rudmin include rote learning; memorization according to keypad pattern; passcode re-use from other security contexts; code with personal meaning; code written down or stored electronically (as on mobile phone) – possibly using various concealment and transformation strategies.

The data provided by Amitay allows us not only to assess the degree to which memorization strategies are used in relation to a standard smartphone numeric keypad, but also to engage in some informed speculation on the extent to which they might be modified on other keypads, including QWERTY phone keypads, ATM keypads, security tokens requiring initial PIN entry, and hardware using an inverted (calculator-type) numeric layout. The ranking allows evaluation of the entropic efficacy of these strategies: the more popular the sequence, the likelier it is to be guessed.

These considerations are used to assess the validity of commonly recommended strategies in a diversity of contexts and generate a set of recommendations based on the findings of this analysis. These recommendations are placed into the context of more general mixed-character passwords and passphrases. They will provide a starting point for security managers and administrators responsible for the education and protection by policy of end users and customers using the kinds of device and application that require numeric passcodes for authentication.
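The popularity-ranking approach the abstract describes can be sketched in a few lines. This is my illustration rather than anything from the paper itself, and the sample below is a made-up toy set, not the Amitay data:

```python
from collections import Counter

def rank_passcodes(pins):
    """Rank 4-digit passcodes by popularity, most common first."""
    counts = Counter(pins)
    total = len(pins)
    # Each entry: (pin, frequency, share of all observed passcodes)
    return [(pin, n, n / total) for pin, n in counts.most_common()]

def top_k_guess_probability(ranking, k):
    """Chance of guessing a user's PIN within k attempts, assuming
    the attacker tries the k most popular codes first."""
    return sum(share for _, _, share in ranking[:k])

# Toy sample: a few popular choices dominate, as in real data sets.
sample = ["1234"] * 50 + ["0000"] * 20 + ["2580"] * 10 + ["7461", "9305"]
ranking = rank_passcodes(sample)
print(ranking[0][0])                        # "1234", the most popular code
print(top_k_guess_probability(ranking, 2))  # ~0.85 for this toy sample
```

With real data, the top-k figure shows how quickly an attacker who works down the popularity ranking gains coverage – the essence of the entropic argument above.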

David Harley CITP FBCS CISSP
ESET Senior Research Fellow

Posted by: David Harley | May 10, 2012

After AMTSO: a paper for EICAR 2012.

This is a paper presented at the 2012 EICAR conference in Lisbon. (I actually presented two: the other one will be posted here in a day or two, or maybe a little longer as I’m travelling right now.) It’s posted here rather than on the ESET resources page for conference papers in accordance with EICAR’s copyright stipulation that EICAR conference papers be posted on personal web sites.

After AMTSO: a funny thing happened on the way to the forum

Here’s the abstract: 

Imagine a world where security product testing is really, really useful.

  • Testers have to prove that they know what they’re doing before anyone is allowed to draw conclusions on their results in a published review.
  • Vendors are not able to game the system by submitting samples that their competitors are unlikely to have seen, or to buy their way to the top of the rankings by heavy investment in advertising with the reviewing publication, or by engaging the testing organization for consultancy.
  • Publishers acknowledge that their responsibility to their readers means that the claims they make for tests they sponsor should be realistic, relative to the resources they are able to put into them.
  • Vendors don’t try to pressure testers into improving their results by threatening to report them to AMTSO.
  • Testers have found a balance between avoiding being unduly influenced by vendors on one hand and ignoring informed and informative input from vendors on the other.
  • Vendors don’t waste time tweaking their engines to perform optimally in unrealistic tests when they could be spending it on enhancing their functionality.
  • Reviewers don’t magnify insignificant differences in test performance between products by camouflaging a tiny sample set with percentages, suggesting that a product that detects ten out of ten samples is 10% better than a product that only detects nine.
  • Vendors don’t use tests they know to be unsound to market their products just because they happened to score highly.
  • Testers don’t encourage their audiences to think that they know more about validating and classifying malware than vendors.
  • Vendors and testers actually respect each other’s work.

When I snap my fingers, you will wake out of your trance, and we will consider how we could actually bring about this happy state of affairs.

For a while, it looked as if AMTSO, the Anti-Malware Testing Standards Organization, might be the key (or at any rate one of the keys), and we will summarize the not inconsiderable difference that AMTSO has made to the testing landscape. However, it’s clear that the organization has no magic wand and a serious credibility problem, so it isn’t going to save the world (or the internet) all on its own. So where do we (the testing and anti-malware communities) go from here? Can we identify the other players in this arena and engage with them usefully and appropriately?
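On the wish-list point about percentages camouflaging tiny sample sets: a quick sketch (my illustration, not part of the paper) using the standard Wilson score interval shows that headline scores of 100% and 90% on ten samples are statistically all but indistinguishable:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a detection rate of successes/n."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return max(0.0, centre - half), min(1.0, centre + half)

# 10/10 vs 9/10: a "10%" headline difference, but the confidence
# intervals overlap heavily on such a tiny sample set.
print(wilson_interval(10, 10))  # roughly (0.72, 1.00)
print(wilson_interval(9, 10))   # roughly (0.60, 0.98)
```

The overlap means the test simply cannot support a claim that one product outperformed the other – exactly the magnification effect the list describes.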

David Harley CITP FBCS CISSP
Small Blue-Green World
ESET Senior Research Fellow

Posted by: David Harley | December 12, 2011

Malicious Android: why the Birds are Angry

It’s no secret that trojans that misuse premium SMS services are among the most prevalent problems in the mobile malware arena. The recent flood of “Lagostrod” and “Miriada” so-called free knock-offs of real games is peppered with code that sends text messages to premium services. Mikko Hypponen retweeted an estimate, based on comments on Reddit, that the attackers could have made around $12,000,000.

According to Sophos’ Vanja Svajcer:

After more than a day on the market, the applications were pulled off by the Android Market security team. Google’s reaction has been quick, but not quick enough – at least ten thousand users downloaded one of the malicious apps from the list.

Much more information on the event in Vanja’s blog and in Sean’s blog for F-Secure.

I hope to see an apology from Chris DiBona for suggesting that anyone working for an AV company should be ashamed of themselves if they have a product for Android, BlackBerry or iOS, but I won’t be holding my breath.

(Yes, this is the sort of stuff I usually post to Mac Virus, but it’s not really Apple-related, I guess, so I think I’ll probably do more of it here.)

David Harley CITP FBCS CISSP
Small Blue-Green World/AVIEN/Mac Virus
ESET Senior Research Fellow

Posted by: David Harley | May 15, 2011

EICAR 2011 Paper

And a big hand, please, for my EICAR 2011 paper!

This is a paper I presented last week at the EICAR conference in Krems, Austria, on “Security Software & Rogue Economics: New Technology or New Marketing?” Here’s the abstract:

A highlight of the 2009 Virus Bulletin Conference was a panel session on “Free AV vs paid-for AV; Rogue AVs”, chaired by Paul Ducklin. As the title indicates, the discussion was clearly divided into two loosely related topics, but it was perhaps the first indication of a dawning awareness that the security industry has a problem that is only now being acknowledged.

Why is it so hard for the general public to distinguish between the legitimate AV marketing model and the rogue marketing approach used by rogue (fake) security software? Is it because the purveyors of rogue services are so fiendishly clever? Is it simply because the public is dumb? Is it, as many journalists would claim, the difficulty of discriminating between “legitimate” and criminal flavours of FUD (Fear, Uncertainty, Doubt)? Is the AV marketing model fundamentally flawed? In any case, the security industry needs to do a better job of explaining its business models in a way that clarifies the differences between real and fake anti-malware, and the way in which marketing models follow product architecture.

This doesn’t just mean declining to mimic rogue AV marketing techniques, bad though they are for the industry and for the consumer: it’s an educational initiative, and it involves educating the business user, the end-user, and the people who market and sell products. A security solution is far more than a scanner: it’s a whole process that ranges from technical research and development, through marketing and sales, to post-sales support. But so is a security threat, and rogue applications involve a wide range of skills: not just the technical range associated with a Stuxnet-like, multi-disciplinary tiger team, but the broad skills ranging from development to search engine optimization, to the psychologies of evaluation and ergonomics, to identity and brand theft, to call centre operations that are hard to tell apart from legitimate support schemes, for the technically unsophisticated customer. A complex problem requires a complex and comprehensive solution, incorporating techniques and technologies that take into account the vulnerabilities inherent in the behaviour of criminals, end-users and even prospective customers, rather than focusing entirely on technologies for the detection of malicious binaries.

This paper contrasts existing malicious and legitimate technology and marketing, but also looks at ways in which holistic integration of multi-layered security packages might truly reduce the impact of the current wave of fake applications and services.

David Harley CITP FBCS CISSP
ESET Senior Research Fellow

Posted by: David Harley | May 13, 2010

EICAR Performance Testing Paper

Back to ESET White Papers

This is a paper called “Real Performance?” written by Ján Vrabec and myself and presented at the 2010 EICAR Conference in Paris in May, available by kind permission of EICAR.

Abstract:

The methodology and categories used in performance testing of anti-malware products and their impact on the computer remain a contentious area. While there’s plenty of information, some of it actually useful, on detection testing, there is very little on performance testing. Yet, while the issues are different, sound performance testing is at least as challenging, in its own way, as detection testing. Performance testing based on assumptions that ‘one size [or methodology] fits all’, or that reflects an incomplete understanding of the technicalities of performance evaluation, can be as misleading as a badly-implemented detection test. There are now several sources of guidelines on how to test detection, but no authoritative information on how to test performance in the context of anti-malware evaluation. Independent bodies are working on these right now, but the current absence of such standards often results in the publication of inaccurate comparative test results, because they do not accurately reflect the real needs of the end-user and dwell on irrelevant indicators, resulting in potentially skewed product rankings and conclusions. Thus, the “winner” of these tests is not always the best choice for the user. For example, a testing scenario created to evaluate the performance of a consumer product should not be used for benchmarking server products.

There are, of course, examples of questionable published results where the testing body or tester seems to be unduly influenced by the functionality of a particular vendor. However, there is also scope, as with other forms of testing, to introduce inadvertent bias into a product performance test. There are several benchmarking tools intended to evaluate the performance of hardware, but for testing software as complex as antivirus solutions and their impact on the usability of a system, these simply aren’t precise enough. This is especially likely to cause problems when a single benchmark is used in isolation and looks at aspects of performance that may cause unfair advantage or disadvantage to specific products.

This paper aims to objectively evaluate the most common performance testing models used in anti-malware testing, such as scanning speed, memory consumption and boot speed, and to help highlight the main potential pitfalls of these testing procedures. We present recommendations on how to test objectively and how to spot a potential bias. In addition, we propose some “best-fit” testing scenarios for determining the most suitable anti-malware product according to the specific type of end user and target audience.
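As an illustration of why a single benchmark run in isolation misleads – this sketch is mine, not from the paper – one simple safeguard is to time a workload repeatedly and report the median and spread rather than one figure:

```python
import statistics
import time

def benchmark(func, runs=10):
    """Time func over several runs and report median and spread;
    a single timing on a noisy system can be wildly unrepresentative."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        func()
        timings.append(time.perf_counter() - start)
    return {
        "median": statistics.median(timings),
        "min": min(timings),
        "max": max(timings),
        "stdev": statistics.stdev(timings),
    }

# Hypothetical CPU-bound workload standing in for, say, an on-demand scan.
result = benchmark(lambda: sum(i * i for i in range(100_000)))
print(f"median {result['median']:.4f}s, spread {result['max'] - result['min']:.4f}s")
```

Even this only addresses run-to-run noise; it says nothing about whether the workload itself resembles what the end-user actually does, which is the larger point the paper makes.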

Download – EICAR: Real Performance paper

David Harley FBCS CITP CISSP
Security Author/Consultant at Small Blue-Green World
ESET Research Fellow & Director of Malware Intelligence
Chief Operations Officer, AVIEN
Mac Virus Administrator

Also blogging at:
http://avien.net/blog
http://www.eset.com/blog
http://smallbluegreenblog.wordpress.com/
http://blogs.securiteam.com
http://blog.isc2.org/
http://dharley.wordpress.com
http://macvirus.com

Posted by: David Harley | April 16, 2010

Re-Floating the Titanic: Social Engineering Paper

Back to ESET White Papers

Re-Floating the Titanic: Dealing with Social Engineering Attacks is a paper I presented at EICAR in 1998. It hasn’t been available on the web for a while, but as I’ve had occasion to refer to it several times (for instance, http://www.eset.com/blog/2010/04/16/good-password-practice-not-the-golden-globe-award and http://avien.net/blog/?p=484) in the last few days, I figured it was time it went up again. To my surprise, it doesn’t seem to have dated particularly (less so than the abstract suggests).

Abstract follows: 

“Social Engineering” as a concept has moved from the social sciences and into the armouries of cyber-vandals, who have pretty much set the agenda and, arguably, the definitions. Most of the available literature focuses on social engineering in the limited context of password stealing by psychological subversion. This paper re-examines some common assumptions about what constitutes social engineering, widening the definition of the problem to include other forms of applied psychological manipulation, so as to work towards a holistic solution to a problem that is not generally explicitly recognised as a problem. Classic social engineering techniques and countermeasures are considered, but where previous literature offers piecemeal solutions to a limited range of problems, this paper attempts to extrapolate general principles from particular examples.

It does this by attempting a comprehensive definition of what constitutes social engineering as a security threat, including taxonomies of social engineering techniques and user vulnerabilities. Having formalized the problem, it then moves on to consider how to work towards an effective solution, making use of realistic, pragmatic policies, and examines ways of implementing them effectively through education and management buy-in.

The inclusion of spam, hoaxes (especially hoax virus alerts) and distribution of some real viruses and Trojan Horses in the context of social engineering is somewhat innovative, and derives from the recognition among some security practitioners of an increase in the range of threats based on psychological manipulation. What’s important here is that educational solutions to these problems not only have a bearing on solutions to other social engineering issues, but also equip computer users to make better and more appropriate use of their systems in terms of general security and safety.

David Harley FBCS CITP CISSP
Security Author/Consultant at Small Blue-Green World
Chief Operations Officer, AVIEN
Chief Cook & Bottle Washer, Mac Virus
ESET Research Fellow & Director of Malware Intelligence


Posted by: David Harley | March 20, 2010

Anti-Phishing Working Group: CeCOS IV

The Anti-Phishing Working Group has asked its members to publicize the forthcoming Counter eCrime Operations Summit in Brazil, which I’m pleased to do. Apologies to those who will have come across this elsewhere, including some of my other blogs.

This year the APWG is hosting its fourth annual Counter eCrime Operations Summit (CeCOS IV) on May 11, 12 & 13 in São Paulo, Brazil. The discounted Early Bird registration rate will end on April 9th. Do not miss this opportunity to join our host CERT.br and APWG members from around the globe at this one-of-a-kind event. Counter-eCrime professionals will meet for sessions and discussion panels that look into case studies of organizations under attack and deliver narratives of successful trans-national forensic cooperation.

This is APWG’s first visit to South America and will help build our network of trusted friends worldwide. The discounted registration rate of $250 USD covers all three days of content, lunch, breaks and the Wednesday night reception. (NOTE: APWG members will receive an additional discount during registration.) This “Early Bird” rate will end on April 9th; after that, registration is $325 USD through the beginning of the event on 11 May.

A partial agenda is posted at the link below. Translation services for English, Spanish and Portuguese will be available for all sessions.

http://www.apwg.org/events/2010_opSummit.html#agenda

Register Here:

http://secure.lenos.com/lenos/antiphishing/cecos2010/

David Harley FBCS CITP CISSP
Security Author/Consultant at Small Blue-Green World
Chief Operations Officer, AVIEN
ESET Research Fellow & Director of Malware Intelligence

