I work at Google and was the one who posted on our forums about this.
What our systems found was definitely a compromised JS file, and others on this thread have posted something similar to what we saw. This is not a false positive.
We have detailed help for webmasters in this kind of situation:
http://www.google.com/webmasters/hacked/
One thing that I strongly suggest to any webmaster in this situation is to look for any server vulnerability that allowed this file to get compromised in the first place. We sometimes see webmasters simply fix the affected files without digging into the security hole that allowed the hack, which leaves the server vulnerable to repeat attacks.
Happy to answer questions.
php.net allows users to post answers and examples of code throughout the website. Likely, one of the submission forms has/had a hole that allowed someone to submit or alter actual JS code.
Thanks for the information. Can you confirm how long the malware was on the site, and whether people may have been drive-by'd before you flagged the domain? Also, what systems did the malware target?
It's a pity you don't disclose which malware specifically the sites were distributing. As a user who may have been affected prior to Google flagging it, it's frustrating to have no information on what to look for.
When I go to that page I see: "Of the 1838 pages we tested on the site over the past 90 days, 4 page(s) resulted in malicious software being downloaded and installed without user consent. The last time Google visited this site was on 2013-10-24, and the last time suspicious content was found on this site was on 2013-10-23."
As a student interested in security: will the JS file be made available so those of us without privileged access can examine it and learn how this was done?
EDIT
Okay, looks like Safe Browsing now says it's no longer suspicious. And I think someone already provided the JS file below.
Google's Safe Browsing looks pretty cool! Really powerful infrastructure. I wonder if they did this with VirusTotal?
Can VirusTotal recognize this?
Another thing is that other search engines don't seem to have this built in. I wonder whether people using DDG will ever think about querying Google Safe Browsing.
Does Google provide an API (besides just querying Safe Browsing directly)?
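It does. Google publishes a Safe Browsing Lookup API for exactly this. A minimal sketch against the current v4 threatMatches:find endpoint (the API key, client ID, and test URL below are placeholders, not anything from this thread):

    # Query the Safe Browsing v4 Lookup API for a single URL.
    # API_KEY and the client fields are placeholders.
    import json
    import urllib.request

    API_KEY = "YOUR_API_KEY"
    ENDPOINT = ("https://safebrowsing.googleapis.com/v4/"
                "threatMatches:find?key=" + API_KEY)

    body = {
        "client": {"clientId": "example-client", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": "http://php.net/"}],
        },
    }

    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # An empty {} means no match; otherwise you get a "matches" list.
        print(json.loads(resp.read().decode()))

Anyone (DDG included) can hit that endpoint, though heavy users are expected to use the Update API instead, which syncs hashed URL prefixes locally rather than sending every URL to Google.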
I don't understand: you guys CLEARLY have the information on what, specifically, is causing the problem (which JS file, in this case). Yet for the site owner, you don't make it available? Or if it is, it's not at all easy to find the information. It's just this black box "go figure it out yourself" thing. Meanwhile you've essentially killed access to their site. Or is there something I'm missing?
But wasn't that only after rasmus posted about it on Twitter? And how would that have worked out for someone not running such a high profile site as him?
Once you're a confirmed webmaster for a given site, Google provides you with access to exactly which files on your site are infected. Providing that information publicly would open the site up to further exploitation.
There are huge repercussions for any website that gets blacklisted with the Stop Badware clearinghouse, not the least of which is the inability to figure out exactly where the problem actually is, because the information the company you work for gives a webmaster to resolve the problem is ridiculously minimal. There are no notifications (unless you are signed up for Google Webmaster Tools), and restoring a website to normal globally can take anywhere from 48 hours to two weeks. There are millions of developers who rely on the PHP website daily to perform their day jobs, and you've now made it that much harder for us to do them.
Stop Badware needs a serious overhaul. At the very least, they should contact the addresses listed in the WHOIS record for the domain BEFORE doing anything. Give the website owners 24 hours to resolve the issue before blacklisting the site. And give them a heck of a lot more information to go on than some vague text.
Also, there are several anti-virus vendors out there who use the clearinghouse database for their products... 6 to 12 months after the original blacklisting. So this will happen all over again 6 to 12 months from now. Getting anti-virus vendors to remove domain blocks is a lot harder than getting removed from the blacklist on the Stop Badware site.
The CORRECT solution for this situation was to find a contact at PHP who could resolve the issue quickly and amicably. Seriously, how hard is it to locate Rasmus' e-mail address? Always try to find a human contact before using Stop Badware. You can chalk up using Stop Badware on the PHP website as the dumbest decision you've made this year. Hopefully this decision of yours will raise the ire of the Internet just enough to force the company you work for to revamp Stop Badware so it doesn't suck, Google Webmaster Tools so it doesn't suck, and the reporting tools for sending information to Stop Badware so they also don't suck.
> Stop Badware needs a serious overhaul. At the very least, they should contact the addresses listed in the WHOIS record for the domain BEFORE doing anything.
Why? This isn't a responsible disclosure, "we found a potential vulnerability but we don't know if it's being exploited yet" kind of situation. This is a "there's a real threat to anyone visiting that site via your search engine right now" kind of situation.
As a user, I'd be much happier if the search engine flagged this immediately.
As a site owner, if someone found malware on my site I'd want to know ASAP too. Obviously it would be helpful if they sent me a notification and made the specific details of the identified threat available. However, I could hardly criticise them for blacklisting my site while it should be blacklisted, or for claiming that we were dangerous while we were actually serving malware.
Not clearing up the blacklists promptly after the threat is identified and removed is an entirely different question. If you're going to go around blacklisting sites then I think you also have a responsibility (and, for that matter, you should also have a legal obligation) to remove them from the blacklist with similar efficiency if you're notified that the threat has been removed. Claiming that someone's site is dangerous when it isn't is defamatory, and should be treated as such.
Google is not doing this as a service to the website, they are doing it as a service to the user. Giving the owner some kind of grace period to fix the issue could mean hundreds of thousands of people could get hit with a malicious script in the meantime.
If Google waited 24 hours before blacklisting a site, how many people would be infected during the grace period?
It's incorrect to say that Google doesn't attempt to contact the site owner. According to the Webmaster Tools support site [0], Google will send notices to several common email addresses when it blacklists a site.
> Seriously, how hard is it to locate Rasmus' e-mail address?
Why should Google do that? Because it's Rasmus? They don't have to do that, period.
The CORRECT SOLUTION is to protect users FIRST and not allow the site to infect more computers.
IT IS NOT Google's responsibility to warn webmasters that their site is infected (though they can be warned by email automatically).
IT IS the webmaster's responsibility to audit his website's security, which obviously did not happen with php.net. If they get punished for that, that's FAIR, because it will force them to take security more seriously.
[edit]
It's high time people moved from httpd to something else like nginx. httpd is insecure by default; this is not how you deal with security. As for PHP, since it doesn't promote any good security practices by default, it should be avoided.
Are you seriously suggesting that Google should not immediately block a website when they detect malware on it... because millions of people are using that site?
I don't understand your suggestion that Google should have waited to protect their users from a site that is serving malware. Why would that be a good idea?
Everybody seems to laugh and rage about this, but could somebody tell me whether this was correctly detected or not? I would not be surprised at all if somebody had breached php.net. Did they properly check for intrusions?
In my experience, this content is only served once per IP, and after that you're filtered so you don't get the payload again, to prevent 'easy' detection.
You simply get blacklisted after the first serving.
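In rough terms, the behavior being described amounts to something like this (a hypothetical sketch; the names and payload are illustrative, not taken from the actual malware):

    # Serve the exploit to each client IP exactly once, then go quiet,
    # so investigators reloading the page see only clean content.
    served_ips = set()

    def response_for(ip, clean_page, payload):
        if ip in served_ips:
            return clean_page          # repeat visitor: stay invisible
        served_ips.add(ip)
        return clean_page + payload    # first visit: inject the payload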
Yeah, I ran across malware once that only injected JS for visitors from certain referrers, such as Google search. I believe the intention was so that when someone would tell me, "Hey, you have a bunch of weird links on your site" I would go to it directly and not see a problem. IIRC the .htaccess had been modified.
I've seen this sort of thing from the Darkleech Apache module[1]. It also won't show the malicious JavaScript to any IP that appears in the `last` log. It looks like php.net uses Apache too[2]. The easiest way I've seen to find the module (they come with a variety of names) is to do something like the sketch below.
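(The poster's actual command didn't make it into the thread. As a rough stand-in for the idea, here is a sketch that assumes an RPM-based system where every legitimate Apache module is owned by a package; the modules path varies by distro, and on Debian you'd query dpkg instead.)

    # Flag any Apache module .so that no installed package claims to own;
    # Darkleech-style modules are dropped in as extra, unowned files.
    import glob
    import subprocess

    MODULES_DIR = "/usr/lib64/httpd/modules"   # adjust for your distro

    for path in sorted(glob.glob(MODULES_DIR + "/*.so")):
        result = subprocess.run(["rpm", "-qf", path],
                                capture_output=True, text=True)
        if result.returncode != 0:
            print("unowned (suspicious):", path)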
What a mess. I hope running Chrome via EMET is enough to keep my machine safe.
I've noticed that hacks have gone up recently in my little part of cyberspace. Things like CryptoLocker are so profitable that it's motivating a lot of talented guys to get into malware and hack servers. Usually it's servers running some unpatched CMS or module.
Then include the functions JS and you will see an autocomplete list of functions when you type into the pattern box. The lists of function names are stored in a compressed string at the top, so it's not really obfuscated, just minified. They shouldn't store it minified, though.
Both files appear identical to me. Which is odd, since there is only 1 CNAME to an address with 1 A record. Perhaps your version of static.php.net is cached by your ISP?
Even with only one public-facing address there could be more than one server handling the content. It could be that only one had a bad file, or they all did but that one is yet to be cleaned. Or, as you say, the bad file could be cached at the ISP level (if this only affected one ISP, which obviously it didn't, it could even have been injected at that point rather than at php.net's servers).
It was definitely hacked. The log shows that the size of userprefs.js changed multiple times in the past 25 hours: http://lerdorf.com/static.log.gz
The site that is linked to in the obfuscated code is http://lnkhere.reviewhdtv.co.uk/stat.htm and it is that site which Google has marked as unsafe. Php.net has received the malware warning as a result.
Notably, the WHOIS on that domain includes the registrant's full name and address. Nominet allows personal registrants an opt-out on the full details in WHOIS, so you'd think anyone trying to hack php.net would not forget to use a privacy service, leaving a domain name this traceable.
The domain record for that site shows:

    Domain name:         reviewhdtv.co.uk
    Registrant:          Oli Bachini
    Registrant type:     UK Individual
    Registrant's address: Rainbow Cottage, West Perry, Huntingdon, Cambs, PE28 0BX, United Kingdom
    Registrar:           Webfusion Ltd t/a 123-reg [Tag = 123-REG] (http://www.123-reg.co.uk)
    Registered on:       13-Oct-2010
    Expiry date:         13-Oct-2014
    Last updated:        06-Oct-2012
    Registration status: Registered until expiry date.
    Name servers:        ns.123-reg.co.uk, ns2.123-reg.co.uk

    WHOIS lookup made at 11:44:39 24-Oct-2013
@icebraining, you're right! The file has indeed been changed a lot lately. As can be seen here: http://lerdorf.com/static.log.gz, that file changed in size from 2602 bytes to 5821 to 1279, all in the space of 25 hours. That is really suspicious.
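If you want to check the log yourself, here's a quick sketch (assuming the gzipped file is a standard Apache combined access log, where the response size is the 10th whitespace-separated field; adjust the index if the format differs):

    # Collect the distinct response sizes served for userprefs.js.
    import gzip

    sizes = set()
    with gzip.open("static.log.gz", "rt", errors="replace") as log:
        for line in log:
            parts = line.split()
            if "userprefs.js" in line and len(parts) > 9:
                sizes.add(parts[9])

    # More than one size for a static file is a red flag.
    print(sizes)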
Err, often they do. Or more correctly, they often don't show something you think they would if it happened.
Logs show a subset of what has happened. There's no way to prove they are showing everything, so there's no way to use them to prove what did not happen.
> For php.net, it reports only mere 4 trojans. So php.net is almost 100 times safer that google.com, according to this tool. That sounds pretty good :)
Compare how many google.com pages have been tested with how many php.net pages have been tested, and stop with that nonsense.
There seems to be some controversy here, and one of our research systems found the same problem. So here's a quick post and a link to the full pcap so you can see for yourselves.
I think our social media coordinator got a little happy with the options. For now, disable JS to get a nice read; I'll see about getting that fixed.
EDIT: Fixed. Looks like the last update of WP-Socializer introduced the bug. Disabled for the time being. Thank you and sibling poster for pointing it out.
This is what happens when you give too much power to one company. And what is the appeal process? Asking for help on Twitter as the founder of a huge project like PHP? https://twitter.com/rasmus/status/393258264034422785
It's very heavy handed. It has not been 100% verified that the site was compromised, and a lot of very technically smart PHP community members are looking hard at this. It may prove to be a false positive or otherwise, but in the meantime:
1. Google is blocking access to the site in Chrome.
2. Firefox is warning users that php.net is not to be trusted (it uses the same list of infected sites provided by Google).
3. Google is warning users on Google Search that "This site may harm your computer.".
4. Google's appeals process is slow and cumbersome.
So yeah, that is a lot of power for one company.
If this happened to your website due to, for example, a false positive, you would be pretty unhappy. Only a high-profile project like PHP gets this kind of attention, but I'd happily wager that many smaller websites suffer the same fate every day.
A site we once had under development was incorrectly flagged. I reported the error via the webmaster tools and after less than 20 minutes, the warning went away.
Google doesn't notify anybody; you have to find out for yourself the hard way.
And after that, it forces the owners of the site to register with Google and use Google services just to even figure out why, and to get their sites unflagged. And that is after the owner even figured out how and where to contact Google.
Yes, they do. If you've signed up for Webmaster Tools, you'll get notified by email.
They don't force anyone to sign up. If you do nothing other than fix your website, eventually Google will check it again and remove it from the blacklist.
Seriously, what's your complaint? If you don't want to get blacklisted, don't let your site be hacked. If your site is hacked, and you're complaining that Google blacklisted it and notified you about it, you're dumb.
And guess what -- they provide this service (and also pay the real person to review your re-listing request) for FREE.
Google sends out emails to a bunch of different addresses like webmaster@domain.com, abuse@domain.com, etc., and notifies anyone signed up through Google Webmaster Tools. The only improvement I can think of would be notifying whoever is listed in a WHOIS lookup of the domain, but that's a little hard to automate.
>And after that, it forces the owners of the site to register with Google and use Google services just to even figure out why, and to get their sites unflagged.
Google forces you to prove that you own the domain before they give you any information that they don't release publicly. How else do you suggest they go about not releasing everything publicly? Also, all you have to do as a site owner is click on the safe browsing diagnostic link and go from there.
In our case the email alerts went out 12 hours after they identified our site and started giving the warning to users. We got several calls from customers before being notified by Google.
Webmaster Tools really needs some improvements: there is no way to quickly re-scan a suspected page and get more info about the issue. If even Rasmus was unable to get this resolved quickly, imagine regular webmasters in the same situation.
A site I visit frequently was once identified as containing malware. I overrode it and went there anyway. (In firefox.)
And now forevermore the icon for that site in the url-bar dropdown is the warning icon, and I have not been able to find out how to change it back to the normal one.
Favicon caching is extremely aggressive in Firefox. In the past, visiting the URL of the favicon and pressing Ctrl+F5 was enough. Nowadays, you have to clear your cache [1] and then restart your browser.
[1] Tools -> Options -> Advanced -> Network -> Cached Web Content -> Clear Now
The icon is correct in tabs (and by correct I mean not there - the site has no favicon), it's only incorrect in the url-drop down (the arrow in the url box which shows you the most visited pages).
7 months in "web years" is pretty old, but as you know PHP has been around a long time, so there's still a lot of relevant information for those who depend on the site.
Well, somebody screwed up here. Maybe PHP core developers should concentrate on the security of their own website; it's more than embarrassing. There is no reason why php.net should use anything more than a static site generator.
From the headers, php.net seems to be Apache/PHP on BSD. This might be an example of a widespread ongoing attack pattern which is a bit of a mystery.
For the past year or more there have been compromises in this pattern - Linux/unix platform, Apache webserver; foreign Javascript or PHP gets inserted somehow; and/or in some cases the server binary is replaced. Sample article: http://arstechnica.com/security/2013/04/exclusive-ongoing-ma... - you can find more on this.
The big question is how the original exploit happens. It may be a long-out-there 0-day, or some admin tool that the sites have in common, or credentials taken from compromised boxes of developers, or something else.
Serious question: this "arbitrary flag" would possibly be in reference to this widely-reported watering hole attack[1] (or was that attack misreported?), or are you referring to some other issue with that web site?
False positives are the life of security. Microsoft Updates (update.microsoft.com) was just blacklisted by malwaredomains this week. It happens. Algorithms are not humans.
According to a Twitter post by Rasmus (https://twitter.com/rasmus/status/393258264034422785), this has been like this for at least a day and still has not been fixed. Something tells me that Google has way too much power, and the fact that they don't sort out false positives in a timely fashion is really bad.
According to one of the people responsible for a software project that has been plagued by security holes for ~15 years, and whose website was hacked, and who hasn't fixed it...
Yeah, that's definitely a problem with Google, alright. Just because the entire PHP team disregards security completely doesn't mean the consequences of that are Google's problem. The fact that they just assume it's a false positive and don't even bother to verify their hacked site is incredible.
Not sure if you're joking or lacking knowledge. Just because it's the official PHP site absolutely does NOT mean it cannot contain malware. Legitimate sites are compromised and used to spread malware all the damn time.
But in this case it looks like Google's tool found a legitimate but obfuscated file, which was loaded in the sort of tricky way that bad sites usually use, and decided it was malware.
mysql.com was hacked via SQL injection [0]. microsoft.com had XSS vulnerabilities a while back that allowed auth token harvesting via overly generous cookie paths [1].
Any website in the world has the potential to be flagged as serving malware.
One reason I migrated away from PHP is the fact that there are simply way too many attack vectors. Using frameworks helps quite a bit, but it is too easy to misconfigure a stock PHP install. Not saying that is the case here, though.
That's the issue: PHP should be secure (i.e. restrictive) by default, Linux style... it is not. PHP + Apache is a recipe for disaster. PHP is a templating language, yet it doesn't do HTML sanitizing by default!
95% of compromised websites are PHP ones.
That's the reason why PHP will eventually die: when businesses understand that while it's cheap to go online with a PHP CMS, once you get hacked it will cost you your business.
That's speculation. I've seen servers get compromised due to FTP problems, SSH misconfiguration, unpatched Apache vulnerabilities, third-party stats monitoring software with 0-days and even SQL injection.
Defacement (I consider malware injection a form of defacement) isn't unique to PHP by a long shot.
This is ridiculous speculation on your part; you can't speculate with security. For all you know, the webmaster's ex-girlfriend could have inserted the malware.
Currently using Django. Once I started playing around with it, I haven't looked back. Although I am told CakePHP and a few other frameworks really do improve PHP.
They don't improve PHP; you still have to deal with PHP's shortcomings even with a framework. But since you don't deal with low-level stuff, your code might be more secure, yeah.
PHP has too many insecure APIs accessible to beginners.
With Django, for instance, you have a view layer with auto-escaping by default, and you don't write insecure SQL queries by hand. That makes a huge difference (see the sketch below).
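For what it's worth, here is a minimal standalone illustration of that default auto-escaping (the settings.configure() boilerplate is only needed because this runs outside a Django project):

    # Django escapes template variables by default, so injected markup
    # comes out inert unless you explicitly mark it safe.
    import django
    from django.conf import settings

    settings.configure(TEMPLATES=[{
        "BACKEND": "django.template.backends.django.DjangoTemplates",
    }])
    django.setup()

    from django.template import Context, Template

    t = Template("Hello, {{ name }}!")
    print(t.render(Context({"name": "<script>alert(1)</script>"})))
    # -> Hello, &lt;script&gt;alert(1)&lt;/script&gt;!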