2015-09-20

Recently, Wang Baiyuan's blog has been rather quiet. Yesterday, for the third time, Alibaba Cloud sent a notice that the host's resource consumption was excessive and the site had been forcibly shut down:

We regret to inform you that, due to excessive consumption of system resources, the free virtual host product qxu10******3 you purchased at WAN has been shut down.

I could not help restarting the host manually again and again, but Alibaba Cloud's free hosting allows only three manual restarts per month, which left me quite puzzled. Since I am using Alibaba Cloud's free virtual hosting, its configuration is strictly limited and naturally falls short of a dedicated server; still, it is more than sufficient for a small blog, yet for two months running it has been shut down for excessive resource consumption. Moreover, traffic to my site over these two months was not much higher than in previous months, and I had already blocked some robots such as Baidu Cloud Watch and Shenma Search, so there should have been no traffic surge capable of taking the host down. The "excessive resource consumption" therefore genuinely puzzled me.

First, analyze the website log

A website log is a record of visits to the site: for each request, the site automatically logs the access time, visitor IP, User-Agent, HTTP request method, requested URL, referrer URL, and the status code returned. Using the dates and times in the notification emails Alibaba Cloud sent me, I located the website logs from the moments just before each of the three shutdowns. Apart from the first shutdown, whose log showed nothing unusual, the logs for the second and third shutdowns were strikingly similar: a thrilling encounter with a malicious scan-and-download zip attack.

Website logs on September 19, 2015 at around 14:10

Website logs on September 17, 2015 at around 18:10

The following can be found through the website log:

  • The request method is HTTP HEAD;
  • The requests are extremely dense (523 requests in a minute and a half on September 17; 424 requests in one minute on September 19);
  • At the moment the host went down, the returned status code was 500 (internal server error);
  • The requested files are zip archives, with names like www.zip, web.zip, wwwroot.zip, etc.

Second, my analysis

HEAD is an HTTP request method like GET or POST. Unlike GET, when a client sends a HEAD request, the server returns only the header portion of the response, which is much faster than fetching the full page body. The IP in the logs above was clearly using HEAD to scan for zip files that might exist in my website's root directory: from the HTTP status code, the attacker can tell whether a guessed zip file exists, and download it if it does. Fortunately, the 404 (requested resource does not exist) codes in the log show that the attacker never managed to get the zip file "he wanted" from my site, so this turned out to be a false alarm; but the dense requests were exactly what caused my host's "excessive resource consumption".

You might think that even downloading a few zip files would hardly matter, but although the zip file names the attacker tried look random, there is real method in them. web, wwwroot, and www are common names for a website's root directory, and webmasters, myself included, regularly back up their site files by selecting everything and compressing it. Just as when you compress folders on your own computer, the resulting archive is usually named after the root directory. So the attacker traverses likely root-directory names with .zip appended and tries to download your website backup file. Take a WordPress site as an example: some files in your backup (such as wp-config.php) record your FTP and database passwords. Once an attacker knows your FTP and database passwords, your website has, in effect, fallen completely.
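To make the HEAD-versus-GET difference concrete, here is a small self-contained Python sketch (my own illustration, not from the original post) that starts a throwaway local HTTP server and probes it the way such a scanner would:

```python
# Illustration only: start a throwaway local HTTP server, then probe it
# the way a zip-scanning bot would.
import http.client
import http.server
import threading

# Serve the current directory on an ephemeral local port.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# HEAD: the server answers with status and headers only, never a body,
# so a scanner can cheaply test many candidate names (www.zip, wwwroot.zip, ...).
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("HEAD", "/wwwroot.zip")
head_resp = conn.getresponse()
head_body = head_resp.read()
conn.close()

# GET: the same server returns a full body for an existing resource.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/")
get_resp = conn.getresponse()
get_body = get_resp.read()
conn.close()

print(head_resp.status, len(head_body))    # 404 and an empty body: file absent
print(get_resp.status, len(get_body) > 0)  # 200 and a non-empty body

server.shutdown()
```

The status code alone tells the scanner everything it needs: a 404 means keep guessing, a 200 means the backup exists and is worth a follow-up GET.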

Third, take preventive measures

1. Forbid requests for zip and other compressed files

Given the limited permissions on a virtual host, the most effective way to control visitor behavior is to add rules to .htaccess. Rules forbidding requests for compressed files such as zip look like the following:
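The rule snippet itself did not survive in this copy of the post; a minimal sketch of such .htaccess rules, assuming Apache with mod_rewrite enabled, might look like:

```apache
# Sketch (reconstructed, not the author's original snippet):
# return 403 Forbidden for any request ending in .zip, .rar or .gz.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule \.(zip|rar|gz)$ - [F,NC,L]
</IfModule>
```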

The rules above forbid requests for compressed files in the zip, rar, and gz formats.

2. Handle HEAD requests

HEAD requests are uncommon in ordinary HTTP traffic, so we can simply block this request method altogether:
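This rule block is also missing from this copy of the post; based on the behavior the surrounding text describes (every HEAD request gets redirected to the Baidu home page), a sketch might be:

```apache
# Sketch (reconstructed, not the author's original snippet):
# redirect every HEAD request to the Baidu home page.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_METHOD} ^HEAD$ [NC]
RewriteRule .* https://www.baidu.com/ [R=301,L]
</IfModule>
```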

With the rules above, any request made with the HEAD method is redirected to the Baidu home page: whatever the attacker tries to download, all he gets is Baidu's home page.


This article is the author's original work and is protected by copyright law; it must not be reproduced without permission. To reprint it, please contact the author for authorization. If you found this article useful, you can click "Sponsor the Author" below to support the author!

Reprint source: Baiyuan's Blog >> https://wangbaiyuan.cn/en/malware-scan-download-zip-attack-journey-2.html
