Lochgoilhead Gala Day

Since last year we have been based in Lochgoilhead which is a small village in rural Argyll. Today is gala day here, and ScotSTS have been happy to sponsor the event. I will be drawing the prizes down by the Loch this afternoon.

We consider ourselves privileged to live and work in such a beautiful area. We also think it greatly benefits National Parks (we are in the Argyll Forest Park) when small consultancies like ours can take advantage of improved technology and shifting work patterns to base themselves in areas where traditional economic activity tends to cause environmental damage. We can definitely see this becoming a theme over the next five years, as improved internet connectivity moves testers and developers out of cities and into the countryside.

The benefits run both ways, so we are happy to help the community here. Good luck with the gala day!



OWASP AppSec EU

Rory and I have just returned from OWASP AppSec EU, where (for once) I was presenting but Rory wasn't (he was on the selection panel – though barred from reviewing my presentation!).

The quality of the talks was very high – though in my opinion there was rather too much of an emphasis on mobile this year. I know it is the exciting new area at the moment – and call me old-fashioned, but I would personally have liked to see more on traditional web security. Perhaps (as per a talk Rory did a year or so back) there is just no point in saying any more about it, because no one is fixing the existing problems. I particularly enjoyed Maty Simon on HTML5, James Kettle on the ActiveScan++ module for Burp, and Jerry Hoff's talk on mobile security.

We also heard Dr Richard Stallman talking about his views on ‘free’ software. I actually share more of his opinions than I would have thought – particularly around data privacy, although of course I do fundamentally disagree with him about proprietary software. I made a donation to his foundation and for a while I guess I may have been the only person in the world wearing a Microsoft T-Shirt with a GNU/Linux badge pinned on it. Anyway – I greatly respect him and his right to hold his views and I think he has stuck to his guns in a way that must cause him great personal inconvenience in the modern world.

Rory and I also attended the ‘Mobile Boot Camp Training’ as we were keen to expand Rory’s iOS and my Windows Phone knowledge into Android. It was a good course and we learnt a lot, but I have to say that the more I see of Android as a platform, the less I would be inclined to use it myself or to recommend it to others – particularly in an Enterprise environment. Be that as it may – we are now in the pretty rare position for a small consultancy of covering all the major mobile platforms.

My talk was an updated version of the Windows Store App presentation I did for Securi-Tay back in January. There is quite a lot of new material, and I seem to have managed to remove some of the annoying mannerisms from my delivery: https://www.youtube.com/watch?feature=player_detailpage&v=szKZG12XgIE#t=12509. The main new feature is 'Store Sheep', which I have just launched as an OWASP project. This is going to be a training app along the lines of WebGoat, which introduces testers and developers to Windows Store Apps and shows how to find and fix security issues in them. It is very much in alpha at the moment (code for 'I haven't anything like finished writing it yet') – but I will be posting about it here as I make progress.

Rory at AppSec EU

The picture is of Rory looking pensively at some Ruby code while we were enjoying an excellent breakfast at 'Patisserie Valerie'.

Just one other quick plug for an attraction any geek would love: the Centre for Computing History in Cambridge (http://www.computinghistory.org.uk/index.htm). They have hundreds of old computers in working order – check out Atic Atac on the Spectrum and 'Flappy Bird' for the ZX80. There is also an Altair 8800, an Apple II and much more. One of the most fun mornings I have had in a long time.


Web Application Testing Workshop

We did our workshop on testing web applications at Scottish Ruby Conf today. It took place at Crieff Hydro and was targeted at Ruby developers and other people who are keen on the language. This is the conference's fifth year, and Rory has taken part in all of them.

The workshop was a bit like the one we did at BSides London last year – only where that one dealt with a sample infrastructure, this one covered how we go about testing a web app, including an introduction to Burp and some sample exercises from OWASP 'RailsGoat' (a deliberately vulnerable web app built on Ruby on Rails). We spent all day yesterday setting it up and cloning and testing 40 VMs. The VMs went on our new mini-server 'Rhododendron' and two laptops – we also had three Wi-Fi routers and sundry cables – so not as much kit as last year, but still a fair amount (the nice thing this year being that we could stick it all in the car boot).

We had done a fair bit of work to make sure that the whole workshop would run offline, because we have been to enough conference hotels to know that the wireless connection to the internet would really suck. This proved a challenge, because late yesterday afternoon we noticed that the app uses the Google Charts API, which doesn't work offline. Luckily Rory managed to hack that part of it out and we were good to go. We got up at six to arrive at Crieff nice and early – had a good breakfast and got set up in plenty of time. We had been warned plenty of people would turn up – and sure enough, we had nearly 40.

We were bang on the money about the hotel Wi-Fi and very glad we didn't rely on it, because it was very slow and caused issues in the other workshops we attended. Then our audience arrived (95% male and 95% Mac users – not necessarily the same people though!). Everything seemed to go very well and the majority of our demos worked. One thing I would take away is that if we do it again we need to go over the tool setup more slowly and allow more time for showing people stuff – we tend to forget how complex these things are because we do them every day.

But we got about three-quarters of the examples done, and we were able to show SQLi, XSS and command injection actually working – I think the real and immediate impact these have surprised some of the audience. Our presentation is attached to this post and includes some more general notes on web application testing. Some of the things we mentioned today:

– The Web Application Hacker's Handbook, which we recommended – http://www.amazon.co.uk/Web-Application-Hackers-Handbook-Exploiting-ebook/dp/B005LVQA9S/ref=sr_1_1?s=books&ie=UTF8&qid=1399909420&sr=1-1&keywords=web+application+hackers+handbook
– The OWASP Top 10 – https://www.owasp.org/index.php/Category:OWASP_Top_Ten_Project
– RailsGoat (the sample app we used) – https://github.com/OWASP/railsgoat
– Sample XSS vectors – http://html5sec.org/
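The SQL injection exercise boils down to one bad habit: building queries with string interpolation. Here is a minimal sketch of the idea in Ruby – the query, table name and method names are illustrative, not code from RailsGoat:

```ruby
# Illustrative only - the query and table names are made up, not RailsGoat code.

# Vulnerable: attacker-controlled input becomes part of the SQL grammar.
def find_user_unsafe(name)
  "SELECT * FROM users WHERE name = '#{name}'"
end

# Safer: a parameterised query keeps the input as data, never as syntax.
def find_user_safe(name)
  ["SELECT * FROM users WHERE name = ?", name]
end

payload = "' OR '1'='1"
puts find_user_unsafe(payload)
# The WHERE clause has become a tautology, so it matches every row.
```

The same underlying pattern – untrusted input crossing into a parser's grammar – is what drives the XSS and command injection demos too.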
We are happy to answer any questions from attendees at the workshop – our addresses are on this site. Great conference, nice venue (shame about the Wi-Fi). Hope to be back next year.


Presentation from Conference

Open Source Responsibility

Unless you've been living under a rock for the last couple of days, you will have noticed a bit of a kerfuffle about a vulnerability in OpenSSL ('Heartbleed'). One of the more notable parts of this story has been the wide variety of large companies who have been seriously affected by the problem.

This led me to thinking about the fact that a lot of very large profitable corporations are essentially relying on software that they haven’t purchased and which, I doubt, many of them have good security assurance over.

* First question: how many billion-dollar companies rely on OpenSSL for secure communications?

* Second question: how many of those same companies have sponsored a security review of OpenSSL over the last two years?

Now I don’t know the exact answer to either of these questions, but I’m willing to wager that the first is a lot higher than the second.

The real question then becomes, should corporations who rely on open source software be taking an active part in ensuring the security of that software?

Well, I'm a security guy, so obviously I say yes :-) It seems obvious that if you rely on a piece of software, you have an interest in its quality…

House of Cards

I was reading this post, and it struck me as another good example of a general theme in a lot of modern business and security.

People often neglect some of the "plumbing" of their website and don't realise quite how important it is to their site's security. In the linked example it was DNS: an attacker was able to take control of the site's domain name and from there essentially controlled the site. That's one way of pulling it off, but there are others.

Good examples of services which are often overlooked but are critical:

– Hosting services. If you use a VPS or the like and the hosting service is compromised, then the attackers can likely get access to your servers too. A good example of this was the Linode hack in 2013, where the attackers didn't even have Linode as their primary target – they were after one specific customer.
– DNS providers. If an attacker can control your DNS, they can redirect mail, carry out MITM attacks on web sites – basically make a right mess of your systems. Hacks on DNS providers (whether by social engineering or direct attack) are a common theme in stories of compromise.
– E-mail providers. These might not seem as critical, but how are most password resets done? By e-mail. If an attacker owns your e-mail service, they can usually trigger password resets for other things like DNS or hosting.
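A cheap defensive habit is simply watching this plumbing for unexpected change. As a sketch in Ruby (the function name is mine and the provider hostnames you would pass in are whatever your registrar actually uses), the standard-library resolver makes a crude nameserver tripwire straightforward:

```ruby
# Crude tripwire for registrar/DNS-provider compromise: compare the
# nameservers a domain currently serves against the set you expect.
require 'resolv'

def unexpected_nameservers(domain, expected, resolver: Resolv::DNS.new)
  actual = resolver.getresources(domain, Resolv::DNS::Resource::IN::NS)
                   .map { |record| record.name.to_s.downcase }
  # Anything serving your domain that you didn't choose is suspicious.
  actual - expected.map(&:downcase)
end
```

Run on a schedule, a non-empty result is worth an immediate look – a DNS-provider compromise often shows up first as nameservers you never chose.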

So what makes me say these things are "neglected"? Well, look at the market and it's pretty obvious: in a lot of cases, the successful providers in these areas are the cheapest or easiest to use, not the most secure. Of course there's the usual security problem of a "market for lemons", in that all providers will say they're secure. But I'd still recommend that if you have a system that's important to you (and that's true for an increasing number of companies who do business primarily online), then spending some time finding high-quality "plumbing" will pay off in the long run.

Why security is getting worse

I was doing a talk for the OWASP meeting in Glasgow the other day, which covered the OWASP Top 10. I had made the point that the Top 10 is largely the same now (in its 2013 iteration) as it was in its original iteration in 2003. Someone asked me a question based on that, which (roughly) was: "Why isn't security getting better?"

Good question, really, and obviously one there's not a simple answer to. At base I do believe that the state of defensive security is going to get worse before it gets better, and it comes down to a number of factors.

I’ve got a list of them below, but ultimately I think it comes down to incentives.  Good software security and good enterprise security are difficult and expensive things.  If the economic incentives aren’t there people just won’t do it.

Increasing spend/focus on offensive security

Offensive security is becoming a larger and larger industry driven by demand from governments and to a lesser extent corporates for “cyber attack tools”.  Essentially to me, this boils down to people finding exploits and creating malware to deliver them.  That focus has a couple of effects which are likely to be bad for overall security.

* As these tools are developed they will "escape" into the wild and be re-used by criminal elements.
* Governments have an active incentive not to rush software providers into fixing critical issues, as doing so would destroy some of their expensive cyber weapons.
* More security people spending their time on offence means fewer spending their time on defence.

Breach Fatigue

There have been so many breaches that they've stopped being news. I read a piece recently on Ars Technica where the university the journalist had attended suffered a breach in which a good number of records (310,000) were compromised. When he went to report it, his editor essentially said it wasn't a big enough story to warrant covering.

This is an example of breach fatigue, where breaches become so common that they're not noteworthy any more. The problem is that this removes one plank that security people use to get companies to spend on defensive security: reputational damage. Essentially now, unless your breach is really bad or you handle it really badly, there is no reputational damage from a breach.

Another example of this is Linode. They've been breached a couple of times, and in discussions I see about using their service, the security provided does not seem to be a factor.

Vulnerability Fatigue

The cousin of breach fatigue is vulnerability fatigue.  These days every large software company has had security issues and has had to fix them. Some companies handle them better than others, but again I don’t see that being a big factor in companies choosing what software to buy…

Many people in the security industry would point to Oracle's poor handling of security issues at various points (slow fixes, lack of communication, etc.), but I don't see that having hurt their sales figures at all.

And on the flip side, you see companies who are generally considered to do security right still having breaches (e.g. all the major browsers falling at this year's Pwn2Own).

So there has to be an element of some companies wondering to themselves whether it’s worth the effort to have a truly great software security programme.

No Legal Requirements For Secure Software

Realistically, in most jurisdictions there's no legal requirement to produce secure software. There may be regulations touching on it (e.g. PCI), but governments seem to be steering well clear of legislating that software companies have any obligations in that regard. Personally I think this is where we'll end up, but it will be a hugely uphill struggle, as every software company out there will fight it to the end.

FWIW I think that government legislation of this type will be a disaster for the IT industry but if nothing else works, it’s where I think we’ll end up.

Lack of developer training

Ultimately all IT security bugs come down to software or users.  We can’t fix human nature, but theoretically we could fix software security bugs. Unfortunately I don’t think this is going well.  Most people would agree that producing secure software requires that developers receive good in-depth and repeated training, but where will they get that from?

From universities? Nope – I’d be surprised if more than 10% of programming or computer science degrees have good secure coding modules throughout the degree.

From employers? Nope – a lot of companies have constrained budgets already, and spending good money on proper face-to-face training for all their developers isn't likely to happen, especially if they can't see a direct correlation between that training and their bottom-line profits.

For a certain definition of Secure….

Rory recently spoke at a conference about 'cargo cults' in security. To summarize, these are 'security best practices' which people follow as a kind of religious belief, without ever really thinking about whether they are valid in the context of today's threat landscape. We don't just see these implemented in infosec policies – they are actually included as part of commercial products.

I came across a good example recently. I won't mention the name of the product, but we were asked to review it as part of an external infrastructure test, and it made me wince – not technically (it did, after all, run over SSL, aka 'military-grade encryption'), but from the perspective of user account security.

So, to start on a reasonable footing, it required a strong password with at least 8 characters, a special character and a numeral. It wouldn't take my 20-character passphrase (which frankly will be brute-forced the day hell freezes over) because of these rules – and that started me getting annoyed with it. Then, just in case you forget your strong password, it also has a secondary secret which is used in the password reset process. The questions are not stellar: one of them is 'name of first pet' and another is 'favourite food'. The account locks after four incorrect attempts at the password, which in my opinion is low for email – but again, OK so far.

So what is wrong with a system where a user has to have a good password and there is a reasonable lockout policy? Well, in this case – the password reset process. Having forgotten the strong password and locked the account, the user clicks on the 'forgotten password' link. This takes them straight to a page where they are asked for the secondary secret. Entering a correct secondary secret allows them to set a new password, after which they are logged in.

So the secondary secret is exactly equivalent to the password – but instead of having a complexity requirement, it has no restrictions at all: it will in fact accept 'p' or '1' as a valid entry. And there is no lockout on it. So instead of attacking the strong password with its account lockout, an attacker can just go for the one-character secret with no lockout. Or better still, he can just try a few guesses at favourite foods (chocolate, anyone?). And because it is an email system, the username is the email address, which is trivially easy to discover. There is no attempt at any out-of-band step once the secondary secret has been entered (sending a password reset link to a backup account, for example) – you just enter the secondary secret and a password of your choice and you are straight in.
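To put rough numbers on that asymmetry, here is a back-of-the-envelope sketch in Ruby. The alphabet and dictionary sizes are my own illustrative assumptions, not measurements of the product:

```ruby
# Back-of-the-envelope comparison of the two attack paths described above.
# The sizes below are illustrative assumptions, not product measurements.

PRINTABLE = 95 # printable ASCII characters

# Front door: an 8-character password drawn from the full printable set
# (and protected by account lockout).
password_space = PRINTABLE**8

# Side door: a secondary secret that will accept a single character
# (and never locks out).
secret_space = PRINTABLE**1

# Or skip brute force entirely: a generous dictionary of favourite foods.
food_dictionary = 500

puts password_space / secret_space
# => 69833729609375 (the side door is about 7 * 10**13 times smaller)
```

In other words, the unthrottled one-character secret makes the entire password policy irrelevant – and the favourite-food dictionary is smaller still.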

But the thing that annoyed me most about this system was that, having used this extremely insecure mechanism to let me log in using my favourite food as a password, it then had the unmitigated gall to refuse to let me reuse my previous password. I'd love someone to explain to me where the danger of password reuse stands, on a scale of 1 to 100, compared to allowing a one-character account secret which never locks out.

This was a cargo cult if ever I saw one and the perpetrators should have their souls devoured by the Great Old Ones….

Surface Pro as Server

Having bought the Surface Pro 2, I was at a bit of a loss to know what to use my original Pro for. It is basically a lovely device – but with a couple of 'if at first you don't succeed – call it version 1.0' flaws. The worst of these is that the battery life (about five hours at reasonable utilization) is just a bit too short for travelling anywhere you are not likely to be able to plug in. So last year, if I was flying anywhere, I would take the Pro and the RT, use the RT as a tablet and save the battery on the Pro for when I really needed to do some work. The Pro 2 has fixed that – with more than seven hours of battery it lasts for any journey I am likely to make with no available power.

So I had a few ideas for what to use the original Pro for. Firstly, I thought it might be useful for wireless testing – but for that we would be better off with Linux, partly because the kit for doing this in Windows is very expensive, but also because the access to low-level networking libraries is better. So we put Ubuntu on it – but it proved quite unsatisfactory. The basic OS was there and worked after a fashion, but it was unstable and the touch screen was hardly usable. I am not a Linux fan at the best of times – but on the Surface it really turned a lovely tablet into a downright unpleasant user experience.

Then I had another idea. I was much in need of a development server which could live in my office but also be accessible when I was out and about (and even on site). So I put Windows Server 2012 R2 on the Surface. I wasn't sure how well this would work – but as it turns out, it was surprisingly easy to install, and it runs smoothly now it is on.

For anyone interested – these were the stages.

a) Make a bootable USB stick with the server OS on it.
b) From Update and Recovery -> Recovery -> Advanced Startup, boot the Surface from the USB stick.
c) Install the OS as usual – there were no problems or glitches with this.
d) Once the base OS is installed, go to Add Roles and Features and, under Features, add the Wireless LAN Service, then start it. (This shouldn't really have thrown me, but I was so used to wireless just working in Windows 8 that I didn't ask myself when I last saw a server running on Wi-Fi.)
e) Also add the Desktop Experience feature – this enables various 'non-server' bits, such as access to the Store, which are handy for a tablet.
f) Set the power management to your liking – obviously, while it is just working away in the background it makes no sense to have the screen fired up.

I then put Hyper-V and IIS on it. With all this installed (but no VMs running), it sits at about 30% memory utilization and the CPU is not straining at all. I think it should have no problems running one or two smallish VMs and acting as a development web server. The other good thing is that it is still a nice tablet, and without close inspection you would never know it from a consumer device. All the drivers work perfectly, the screen is just as good as in Windows 8.1, and you can even install apps from the Store, look at your photos and play Mahjong. But behind all this is the power of a full-on server OS.

Let’s see anyone do this on an iPad….

Security Testing Windows Store Apps

Rory and I recently presented at Securi-Tay again. This was the third conference organized and led by the students on the Ethical Hacking course at Abertay University in Dundee. As usual it was well set up and well attended, and it is good to see that the professional Scottish testers of the future can arrange a conference which is as good as (if not better than) many of the professional ones we have attended. We had an enjoyable day – even though it was a very long drive there and back.

I spoke about Windows Store Apps and how to test them. We often find ourselves asked to test things we are not particularly familiar with – and it is very useful to be able to find some material on the Internet that gives us somewhere to start. So I am going to try to write a few posts on things we have come across which may be unusual or difficult to test in some way – as usual, from the perspective of a professional tester in the UK trying to achieve good coverage for a customer within the timescales of a typical test, rather than as a hypothetical exercise in hacking.

Metro (down the Tube) – Security Testing Windows Store Apps

So my presentation covers what the purpose of these apps is, and how they are architected, developed and certified for the Windows Store. I then talk about where to find them, what software you need to test them, and how to install and configure it. I outline how you would typically go about testing them and how they tie in with the OWASP Top Ten and Mobile Top Ten. Finally, I consider whether Microsoft have managed to achieve one of their goals with these apps: improving security and confidence for the average non-technical Windows user.

Of Human Stupidity

For a number of years I have felt that tech companies must be seriously lacking in acumen to adopt the policies they do with regard to their customers. Yesterday I noticed, however, that this is not restricted to tech companies, and it makes an interesting study in human stupidity to see it in operation.

So for example, a search for property management companies in the UK came up with this


I have no personal knowledge of the company involved, or whether the reviews in question are accurate, but I find it inconceivable that any commercial entity would allow its customer service (or its marketing department) to be so egregiously bad that that kind of review shows up in search engines. I mean, surely they must realize that in the modern world potential customers are going to look for reviews – and that it is not going to be a positive thing if you have a consistent one-star rating.

But then I got to comparing it to the behaviour of tech companies – and I am afraid that three immediately jump to mind. I used to administer Lotus (heard of them?) technologies. At one point they had a massive chunk of the office suite, email, collaborative website and instant messaging markets. Then they were bought by IBM, and everything went downhill from there. There is no doubt in my mind that Domino was a superior product to early versions of Exchange. Equally, QuickPlace was there before SharePoint was really more than a twinkle in Microsoft's eye. I see that support was recently finally dropped for Lotus SmartSuite – but in its day it was way ahead of MS Office.

The same can be said of Novell NetWare. I loved that product back in the day, and I would have defended NDS as a better directory service than AD until about ten years ago. Today, who under the age of 30 has even heard of Novell? And AD runs most of the world's internal networks.

One final example…. Sadly, I think BlackBerry is going the same way. Considering that they have been relegated to fourth place in the mobile market, I think it unlikely that they will still be in the race (as an independent company) in a year's time. And yet not so long ago they had the vast majority of the smartphone market.

And the common factor in all of this…. Intellectual arrogance, and a complete inability or unwillingness to listen to their customer base, or in any way to acknowledge that their product was not automatically superior just because it was the current market leader.

So, two things I would take from this….. Firstly, if I were a property management company I would seriously do something about my customer service before I allowed a simple Internet search to make me look that bad. Secondly, I would bet a lot of money against the tech press who are writing Microsoft off as a major force compared to Apple and Google. Consistently over the last 20 years, they appear to have been the only company which genuinely cares about customer service, and also one of the few which try to adapt to changing times. They have never made any claims about 'doing no evil' – but it consistently appears that caring for your customers on a day-to-day basis, and being seen to do so, makes good commercial sense.