The Unsecured State Part 3 - 2,000+ NHS Security Vulnerabilities (Disclosed)

This is part 3 of a series of blog posts looking at the security of the UK Government's web infrastructure.


Britain's National Health Service is riddled with old and insecure WordPress-based websites. Many of these sites have severe flaws including being vulnerable to XSS attacks.

[Image: XSS spammers]

There is absolutely no suggestion that patient data or confidentiality has been put at risk.

These flaws were discovered passively using the information which was returned by the web server following a normal request. I have not exploited any of the holes found.

All these flaws were responsibly disclosed to Department of Health Officials in January 2014. Throughout February I was repeatedly in contact with various NHS officials trying to get them to do something about these problems.

This is a technical look at how I found these flaws. Please buy the latest edition of Computer Active to read the full story.

Step 0 - Was This A Problem In The Past?

In 2009, a security researcher discovered a severe security flaw in one of the NHS's websites. I wondered if the NHS had improved its web security practices in the last 5 years.

Step 1 - Get All NHS Domain Names

I initially thought there would be a public list of all the NHS's websites. There isn't.
Thankfully, Rob Aley had made a Freedom of Information request a year ago which I was able to use.

The list dates from January 2013 - so it doesn't contain any of the more recent domains. However, as any WordPress site created in the last 12 months is (hopefully) free of vulnerabilities, that's not too big an issue.

So, with 5,000 domains in hand, it's on to....

Step 2 - Look for Vulnerabilities

There were five main classes of vulnerabilities I was looking for.

  1. Old WordPress versions.
  2. Server Information.
  3. Directory listings.
  4. Unsecured login pages.
  5. XSS Flaws.

Finding the WordPress version is simple enough. Most sites include a meta tag in the HTML which says:

<meta name="generator" content="WordPress 3.5.2" />

If the administrator has been sensible enough to hide that tag, we can still infer which WordPress release is running by looking at which JavaScript libraries are bundled with the site.
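As a rough sketch of that first check, a short shell function can pull the version out of a saved page. The helper name and the sample page below are invented for illustration - in a real scan the HTML would come from fetching the site with curl.

```shell
#!/bin/sh
# Minimal sketch: extract the WordPress version advertised in the
# generator meta tag of a saved HTML page. In a real scan the page
# would come from: curl -sL "$url" > page.html

wp_version_from_html() {
  grep -oE '<meta name="generator" content="WordPress [0-9][0-9.]*' "$1" |
    grep -oE '[0-9][0-9.]*'
}

# Fabricated sample page, purely for illustration.
printf '%s\n' '<meta name="generator" content="WordPress 3.5.2" />' > page.html
wp_version_from_html page.html   # prints 3.5.2
```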

Exposed server information lets us see whether the website is running on old, unpatched software. Directory listings let us see all the files on the server. Better hope there's nothing confidential on there!

Unsecured login pages mean that anyone can find the address of the login page. Without suitable protection, repeated login attempts can be made until a password is guessed. Unless the site is running SSL (and most aren't), the username and password are sent unencrypted. Better hope no one is logging on over public WiFi!
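The server-information and directory-listing checks can be sketched in the same passive style. Everything below runs against fabricated responses - in a real scan the two files would come from curl's -I (headers only) output and from fetching a directory like /wp-content/uploads/.

```shell
#!/bin/sh
# Sketch of two passive checks against saved responses. Both the
# header and the body below are fabricated for illustration.

# 1. Does the server advertise exact software versions? A verbose
# banner like this tells an attacker exactly what to look up.
printf 'Server: Apache/2.2.3 (CentOS) PHP/5.1.6\r\n' > headers.txt
grep -i '^Server:' headers.txt

# 2. Is directory listing enabled? Apache's auto-index pages have a
# telltale title.
printf '<title>Index of /wp-content/uploads</title>\n' > body.html
grep -qi '<title>Index of' body.html && echo 'Directory listing enabled!'
```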

Finally, on to XSS. The easiest way to test is to search for an HTML string and see if it is returned.
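That reflection test can be sketched as follows. The marker string and file names are invented for illustration; a real probe would pass the marker through the site's search form (e.g. /?s=MARKER) with curl and inspect the response.

```shell
#!/bin/sh
# Sketch of the reflection test: send a harmless HTML marker through
# a search form and see whether it comes back unescaped. The two
# possible responses are fabricated here.

marker='<i>xsstest</i>'

# Vulnerable: the search term is echoed back verbatim.
printf 'Results for %s\n' "$marker" > vulnerable.html
# Safe: the search term has been HTML-escaped.
printf 'Results for %s\n' '&lt;i&gt;xsstest&lt;/i&gt;' > safe.html

reflects_html() {
  grep -qF "$marker" "$1"
}

reflects_html vulnerable.html && echo 'vulnerable.html: potential XSS'
reflects_html safe.html || echo 'safe.html: marker was escaped'
```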

Testing each one of these manually is possible - although a right pain in the arse - so I turned to...

Step 3 - wpscan

The Open Source software wpscan is a simple tool - you give it a URL and it checks the site against a database of known WordPress vulnerabilities. It tells you the version, what bugs are present, whether the site is likely vulnerable to XSS, and all sorts of other interesting details.

[Image: wpscan NHS]
Right, time to get started!

Step 4 - Scanning

Sadly, wpscan doesn't have a batch mode. Nor does it play well with parallel processing. That means running it in serial.

Taking a list of NHS domains in a .txt file, it's relatively easy to extract each one, scan it, then dump the result to a text file with the same name as the domain.

cat nhs.txt | 
xargs -iURL sh -c './wpscan.rb --follow-redirection --url URL > URL.txt'

In order to stop wpscan asking me every time it couldn't find the plugins directory, I patched the wpscan.rb file.

unless wp_target.wp_plugins_dir_exists?
  puts "The plugins directory '#{wp_target.wp_plugins_dir}' does not exist."
  puts 'You can specify one per command line option (don\'t forget to include the wp-content directory if needed)'
  print 'Continue? [y/n] '
  #unless Readline.readline =~ /^y/i
  #  exit(0)
  #end
end

With 5,000 records to check, it was bound to take some time. Thankfully, not all the sites run WordPress and wpscan only takes a second to ignore a site it can't scan.

The scan ran at about 7 URLs per minute - meaning the whole thing was done in less than half a day.
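That timing claim is easy to sanity-check with shell arithmetic:

```shell
#!/bin/sh
# 5,000 domains at ~7 URLs per minute.
minutes=$(( 5000 / 7 ))      # 714 minutes
hours=$(( minutes / 60 ))    # 11 full hours, so just under 12
echo "$minutes minutes, roughly $hours-12 hours"
```

Just under twelve hours - less than half a day, as claimed.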

Step 5 - Parsing The Results

Out of the 5,000 domains, 358 were identified as running WordPress.

5 were identified as running the extremely old WordPress 2.X!

[Image: WordPress versions found]

How many potential XSS vulnerabilities were found? 597. Several of the sites were identified with multiple potential exploits (I say "potential" because they were not all manually checked).

After running the reports, parsing the data, and summing the number of XSS, privilege escalation, open redirect, and other miscellaneous bugs - I came up with a conservative total of over 2,000 identified security bugs.
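The tallying itself is a one-liner once the per-domain reports are on disk. The "[!]" report format below is invented for illustration - real wpscan output differs and would need its own pattern.

```shell
#!/bin/sh
# Sketch of the tallying step: count flagged issues across per-domain
# report files. The file contents are fabricated; the "[!]" prefix is
# an assumed marker, not wpscan's actual format.

mkdir -p reports
printf '[!] XSS vulnerability\n[!] Privilege Escalation\n' > reports/a.nhs.uk.txt
printf '[!] XSS vulnerability\n' > reports/b.nhs.uk.txt

# Total flagged issues across every report (-h suppresses filenames).
grep -h '^\[!\]' reports/*.txt | wc -l
```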

Step 6 - Calm Down

It's important to note that these are suspected vulnerabilities. The wpscan software isn't perfect - some of the flaws it detects may be mitigated by other measures.

Many of the problems are "Privilege Escalation" vulnerabilities. This means that the secretary who updates the opening times may be able to assume the role of an administrator and do some serious damage. Because exploiting them requires an existing account, it is unlikely that an external malicious user could abuse these flaws.

Ok, so what about the ~4,500 sites which aren't running WordPress? Are they secure? No!

Step 7 - Look for Non-WordPress Vulnerabilities

There are a number of sites which don't run WordPress which are still vulnerable to XSS attacks.

Nearly every single site built by a particular Norfolk-based company had a confirmed XSS vulnerability.

These were found by manually searching for an HTML string and seeing if it was returned unescaped.

After repeated contact, and some hand-holding, they were able to fix dozens of vulnerable sites.

Step 8 - Responsibly Disclose the Problems

It's really hard to contact the Department of Health to report these issues to them. I'm lucky enough to have some friends in the Civil Service who were able to escalate my concerns - but even then I seemed to hit a brick wall.

I tried contacting individual website owners - who mostly forwarded me on to other people who then ignored me.

I contacted the Department of Health directly and provided screenshots of the problems - no reply was forthcoming.

Finally, I contacted James Temperton, the award-winning journalist from Computer Active. James was the only journalist who responded to my request for a PGP key in order to communicate securely. In the age of Snowden, it seems bizarre that computing journalists don't take the minimum amount of effort to provide a secure contact channel.

With James' help, I was able to craft this story and he was able to contact the PR people at the Department of Health. You can read James' story in the latest issue of Computer Active.

What I Learned

Many Doctors' Surgeries in an area will all use the same cheap, private sector contractors to build their sites. If there's a bug in one - that bug is present in hundreds of other sites.

[Image: Our Practice]

On 12th February, I finally heard back from someone senior within the NHS. They explained that the Department of Health has no central control over NHS websites. As a result, sites fall through the cracks as local teams change. Consequently, in many cases there is simply no way to contact the website owners.

Abandoned Sites

I've tried to disclose the flaws to the site owners and directly to the Department of Health. In some cases - such as the following - no one is responsible for the site!

[Image: Breast Milk Video XSS]

I contacted the designer - he passed me on to the agency commissioned to design the website. The agency passed me on to the NHS group they did the work for - which has since been re-organised. They passed me on to the local government contact who is meant to be responsible. She cannot find out who currently controls the site.

The Department of Health, HSCIC, local government, and NHS Care Commissioning Groups are all abdicating responsibility.

So now we have a situation where the NHS has lost control of its websites. They can be used to host spam and malware, to hijack users' credentials, or to scam patients into giving up confidential information.

Recommendations

I love WordPress - this blog runs it, as do many more sites I administer. Like any software, it needs to be kept updated and maintained.

It's clear that many NHS websites are not being actively maintained. That's a serious failing. I don't think it's an exaggeration to say that looking after a website is as important as cleaning a hospital.

Ok, maybe a bit of an exaggeration. But XSS flaws are especially pernicious when they're on a trusted domain like nhs.uk.

The NHS is being privatised by a corrupt Tory Government. It's clear that the fractured nature of the NHS means that private companies are free to exploit small NHS practices. Many of these vulnerable sites have been delivered by private companies with no thought of the public harm they are doing.

Earlier this year, Sam Smith asked a very important question:

It's clear that neither tiny NHS practices nor megalithic Trusts have the experience to commission and run simple websites. The ideological desire for "competition" has led to a waste of millions of pounds of taxpayers' money and resulted in horrendous security flaws throughout the NHS.

Public health is too important to leave to the "invisible hand" of capitalism's free market. We need a strong, centralised management which can produce and enforce best-practice across the NHS's web portfolio.

It's time that the Secretary of State for Health, Jeremy Hunt, stopped trying to undermine the public sector ethos of the NHS and, instead, concentrated on making it stronger. Rather than setting the NHS up to fail via phoney "competition", he should be ensuring it works together as a community to ensure the security of the NHS's digital portfolio.

The Official Response

After raising this through multiple channels - including directly to some of the sites involved and to GovCertUK - this is the official reply we got from HSCIC on 18th February.

In relation to nhs.uk sites, the HSCIC's role is to process applications to use the domain name from NHS organisations and provide permission for its use, where appropriate. However, responsibility for the maintenance and security of sites using the nhs.uk domain sits with the organisation running each website or service.

The HSCIC is currently drafting some additional guidance, in support of our existing technical guidance, to be issued to all applicants receiving permission to use the nhs.uk domain. We are grateful to the individuals who have alerted us to these issues so that we can take them into account when drawing up this document.

A Special Message To Tim Kelsey about care.data

If the NHS can't be trusted to secure their websites - why should I trust them to secure my confidential medical details?

That's why I've opted-out of care.data and you should too.


8 Responses to “The Unsecured State Part 3 - 2,000+ NHS Security Vulnerabilities (Disclosed)”

  1. Martin Hall

    Great work.
    I did a similar thing back in Feb 2010.
    See http://jeremiahgrossman.blogspot.co.uk/2010/02/best-of-application-security-friday-feb_12.html

    I looked at SQL Injection and had a 100% hit rate. I would hope that things have improved in regard to SQL injection.

    I also contacted the NHS trusts and UK CERT but did not get any replies except the standard canned "we will look into your request".

    I just randomly checked 20 sites from the Freedom of Information list you mentioned above and again had a 100% hit rate, this time on XSS.

    I'm guessing that if the checks were done manually rather than using an automated tool, the hit rate for XSS would be nearer 80-90% of the 5,000.

  2. anon

    Private sector is not the problem in itself. Lack of competition is - that lack of competition is dictated by the bureaucratic procurement process which permits only a select subset of players to sell ICT solutions to the government. Most of the time - the same, rotten codebase is sold numerous times to the same government body. Lowering that barrier of entry (which G-Cloud and other frameworks are doing) - so that more qualified players can bid for government ICT solutions is what will improve this state of affairs. I assure you - the government has as much money as it wants, and cutting costs was never a priority.

    Another huge problem is the lack of qualified technology leadership. Most ICT directors within the government are converted from other posts, which had nothing to do with technology.

  3. David Coveney

    Yes, this is fair. We ourselves have built some sites for NHS clients, but all through Capita. The problem is that in most cases you're there to build it, hand it over, and then somebody else runs the server install. And they don't bother to update because that was never put in their supply agreement - they just do servers. We send the emails, nobody answers. Now, I'm not a believer in the 'upgrade now!' approach to WP site management, so long as you know the likely risks and what your server config is like you don't always need to, but in some cases the updates are indeed sensible.

    But I'd get out of the political statements thing in what is otherwise a serious report that you've made. I don't feel our government is any more or less corrupt than others that have come before, and is certainly better than many others. Believe it or not, tendering is supposed to reduce corruption. It may be no fun. It may mean prices go up a lot, but it creates an open procurement process. If you've ever worked with public sector you'll know that it's far harder to corrupt than most other business - I was knocked back when I couldn't pay for a civil servant's meal simply because he'd have to declare it and "that would be more hassle than it's worth."

  4. trapped

    "Another huge problem is the lack of qualified technology leadership. Most ICT directors within the government are converted from other posts, which had nothing to do with technology."

    Nail meet head.

    The levels of mis-management and lack of IT literacy are truly shocking... and I say that whilst typing on my NHS keyboard at my NHS workstation... :(

