The concept behind our robot is that hackers build 'mesh networks' interlinking the sites they hack. Our robot finds hacked websites and follows links from one site to another wherever a link looks 'malicious', usually because it contains words such as 'viagra' while the destination site does not appear to be pharmaceutical-related.
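The link heuristic described above might be sketched as follows. This is an illustrative approximation only, with hypothetical keyword lists; it is not the robot's actual code.

```python
# Illustrative sketch: flag a link as suspicious when its anchor text
# contains spam keywords but the destination page does not appear to be
# pharmaceutical in nature. Keyword lists are hypothetical examples.
SPAM_KEYWORDS = {"viagra", "cialis", "casino", "payday"}
PHARMA_TOPIC_WORDS = {"pharmacy", "prescription", "medicine", "dosage"}

def link_looks_malicious(anchor_text: str, target_page_text: str) -> bool:
    anchor_words = set(anchor_text.lower().split())
    page_words = set(target_page_text.lower().split())
    has_spam_anchor = bool(anchor_words & SPAM_KEYWORDS)
    # Require a couple of on-topic words before treating the site as
    # genuinely pharmaceutical-related.
    site_is_pharma = len(page_words & PHARMA_TOPIC_WORDS) >= 2
    return has_spam_anchor and not site_is_pharma

print(link_looks_malicious("cheap viagra here", "welcome to our plumbing blog"))  # True
print(link_looks_malicious("buy viagra", "online pharmacy prescription medicine dosage"))  # False
```

A real crawler would of course use far richer signals (link context, page structure, language models), but the core idea is this mismatch between a spammy link and an off-topic destination.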
We found your site linked to from a site we deemed to have been hacked.
Does this mean your site is hacked?
It certainly raises the possibility.
But my site doesn't look hacked - it looks normal to me.
Indeed. Unfortunately hackers are clever. The current generation of hacks remains invisible to site owners and operators for as long as possible, typically for months. The hackers show different content to Google's robot (crawler), inserting words and links that promote the entirety of their network.
How can I tell for sure whether my site is actually hacked?
A good place to start is to run a scan using 'Hacked Or Not'. If your site has already been scanned recently, we can show you the results without re-scanning it. If the site looks hacked to our robot, we will show you what looks suspicious to us and offer some guidance on how to go about cleaning the site.
If my site is hacked - how can I be sure that you aren't the ones who did it?
We are legitimate website security specialists: our clients include Government departments, local and regional councils, Europe wide organisations and United Nations bodies.
Dean Marshall Consultancy Ltd is a legitimate company, operating entirely above board. We are registered in England and Wales as company number 6615299. We operate from CityLab, Dalton Square, Lancaster, Lancashire, LA1 1PP.
Our main website is www.deanmarshall.co.uk, registered on 13 April 2002.
Our mission is to reduce the value of these hacker networks: taking the fight to the hackers where we can and diminishing the worth of their networks wherever possible. We hope to shorten the time any site remains under the hackers' influence, and to help site owners repel the invaders and shore up their security so that no repeat incident occurs.
I've had a problem with your bot - can I block it?
We are happy to block any site from our scanner upon receipt of a complaint or request from a duly authorised individual or company.
Does the robot obey robots.txt directives?
We made a deliberate decision not to follow robots.txt directives, and we believe with good reason.
If we honoured a directive that stopped our robot requesting your pages, the hackers would simply change your robots.txt file to prevent us from identifying their compromised sites. That would defeat the purpose of the project entirely.
Is this some form of penetration testing, and isn't it wrong or illegal to scan or test a site without the owner's permission?
Good question. Short answer: No.
- We do no form of penetration testing.
- We do not try to hack your site - no SQL Injection attempts - no attempts at CSRF attacks - no attempts at XSS attacks.
- We don't try to determine whether your site is vulnerable to being hacked via SQL Injection, CSRF or XSS.
- We simply request your web pages and some other resources, just as a browser, search engine or hacker would; the only difference is that we identify ourselves as Google or another search engine while we do it (because these hacks show different content to Google than to normal user-agents). If the returned page contains suspicious content we analyse it and react accordingly.
If we carried out penetration testing without your consent that would be wrong - not necessarily illegal, but in our opinion certainly wrong. As that is not what we do, the point is moot.
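The user-agent technique described above can be sketched in a few lines. This is an assumed approach, not necessarily the service's actual code; the user-agent strings and similarity threshold are illustrative.

```python
# Sketch of a cloaking check: fetch the same URL twice, once with a
# Googlebot-style User-Agent and once with a browser-style one, then
# compare the two responses. Very different pages suggest the site is
# serving different content to search engines (a cloaking signal).
import difflib
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url: str, user_agent: str) -> str:
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked_content(google_html: str, browser_html: str,
                          threshold: float = 0.5) -> bool:
    # Low textual similarity between the two fetches is a cloaking signal.
    ratio = difflib.SequenceMatcher(None, google_html, browser_html).ratio()
    return ratio < threshold

def looks_cloaked(url: str) -> bool:
    return looks_cloaked_content(fetch(url, GOOGLEBOT_UA),
                                 fetch(url, BROWSER_UA))
```

In practice a scanner would compare extracted text and links rather than raw HTML, since dynamic pages differ slightly between any two fetches.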
There are a number of other online scanners where a site visitor can enter a URL and trigger a scan. These take no steps to verify that the visitor is authorised; one example is sucuri.net.
Where a visitor to our site wishes to trigger a scan, we require proof of their authority: either an email address at the domain in question or the placement of a text file in the root of the website.
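The text-file check could work along these lines. The file name 'hackedornot-verify.txt' and the token format are illustrative assumptions, not the service's actual scheme.

```python
# Hypothetical sketch of text-file domain verification: issue a random
# token to the requester, have them place it in a file at the site
# root, then fetch that file to confirm they control the domain.
import secrets
import urllib.request

def issue_token() -> str:
    # 32 hex characters of cryptographically strong randomness.
    return secrets.token_hex(16)

def token_matches(file_body: str, token: str) -> bool:
    return file_body.strip() == token

def verify_domain(domain: str, token: str) -> bool:
    url = f"http://{domain}/hackedornot-verify.txt"  # assumed location
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return token_matches(resp.read().decode("utf-8"), token)
    except OSError:
        return False
```

This mirrors the approach used by services such as Google Search Console: only someone with write access to the web root can place the token, so a successful fetch demonstrates authority over the site.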
Further notes and information:
Currently, approximately 50% of the sites we analyse after following links from other hacked sites show clear signs of being hacked. A significant proportion of the remainder show signs of having been hacked previously and cleaned recently.
Our project is still in beta, and we are working to iron out some remaining issues with false positives.
Our aims are:
- to do no harm
- to help cleanse the internet - through changing the economics of exploiting hacked sites.
By alerting site owners and operators sooner than is currently the case (Google can take months before it tags a site as 'possibly' compromised), we give them the opportunity to clean their site(s) much earlier. In this way we (the collective 'we', legitimate web users) hope to reduce the value of hacked sites and the incentive to hack more of them.
To this end we are currently undertaking an extensive research and development programme, and seeking funding to further our efforts.