This article discusses the hidden pitfalls of hosting multiple websites on one hosting account, and how you can remediate the consequences of website cross-contamination.
The structure of virtual hosting (also known as shared hosting) can be illustrated by a beehive: each website (a bee) has its own folder (a cell), while all the bees share the hive's common resources (disk space, databases, RAM, CPU, and so on).
In most cases, hosting companies do not provide resource isolation on shared hosting accounts (plans that let you host multiple websites on one account). In practice, this means that all website files are owned by the same system user, and server-side scripts (PHP, Python, Perl, etc.) on every website on the account run with equal access rights. As a result, the scripts of one website may create, remove, or modify any file on any other website hosted on the same shared hosting account.
The same applies to databases: a person or script on one website can access the database of any other website on the same hosting account. In such a setup, attackers only need to break into a single website (e.g. a small, outdated, forgotten blog in a far corner of the hosting account) to gain control over every website on the account.
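The risk described above can be sketched with a short simulation. The folder names below (site-a, site-b) are hypothetical stand-ins for two websites sharing one hosting account; because both folders belong to the same system user, a script running in one can freely write into the other:

```python
import os
import tempfile

# Simulate a shared hosting account: two site folders, one system user.
account = tempfile.mkdtemp(prefix="shared-account-")
site_a = os.path.join(account, "site-a")
site_b = os.path.join(account, "site-b")
os.makedirs(site_a)
os.makedirs(site_b)

# Both folders are owned by the same uid, so a script running in
# site-a has full access to site-b's files.
assert os.stat(site_a).st_uid == os.stat(site_b).st_uid

# A compromised "script" in site-a plants a backdoor in site-b.
backdoor = os.path.join(site_b, "backdoor.php")
with open(backdoor, "w") as f:
    f.write("<?php /* malicious payload */ ?>")

print(os.path.exists(backdoor))  # True: the neighbor is now infected
```

On a real server the same thing happens with the hosting account's actual site folders: the operating system checks only the owning user, and the owning user is the same for every site.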
Are there any real-life examples?
A shared-hosting customer sent a request to clean an infected website on their account: the site was being blocked by desktop antivirus software, and the host claimed it was sending out spam emails.
After a brief check, I found that all the other websites on the account contained hacker scripts and injections as well. So instead of cleaning one website, I needed to clean 5 (or 10, 20, 50...). The customer was upset, naturally.
Very often in such situations, to save money on cleaning, the client asks to clean only the infected website and leave the others alone. This is bad practice: within a few days, and in some cases within a few minutes, the cleaned website is reinfected by malicious scripts from its infected neighbors.
To sum up: if even one website on a multi-site shared hosting account becomes infected, all the websites on that account should be scanned (and cleaned if required). Otherwise, the cleaning process may turn into an infinite loop of constant reinfections.
The owner of an infected account has a few options to effectively solve this issue:
- Isolate each website on a separate hosting account e.g. purchase separate, single-site hosting plans, then migrate websites and clean them one by one.
- Remove or lock all unused websites, and then clean and harden the remaining websites within a short period of time (less than 24 hours for all websites) to prevent further reinfections.
- Clean and harden all the websites within a short period of time (less than 24 hours for all websites) to prevent further reinfections.
It is important to understand that:
- Websites should be truly isolated, i.e. placed on separate hosting accounts under separate system users, or hosted with a provider whose platform supports per-website isolation.
- Scanning, cleanup and hardening must be done for all active websites on a virtual hosting environment within a short period (i.e., 24 hours or less).
Many shared hosting account owners misunderstand the term 'website isolation'. This, combined with a lack of understanding of how malware spreads between websites, creates the following misconceptions.
Creating a separate FTP account for each website will isolate them.
Unfortunately, it won't. In this case, websites will be isolated only on the FTP level. It does not have any effect on the web scripts, which will still have the same access level to any file or database within the same shared hosting account.
True isolation is possible only when you host your websites on different accounts, or when your host supports jailed environments (via virtualization, OS settings, kernel tuning, etc.). Note that the PHP option 'open_basedir' cannot fix the issue: it restricts only PHP's own file operations and does not constrain shell commands or scripts in other languages.
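One way to check whether your sites are actually isolated at the script level is to compare the owning system user of each document root. The sketch below assumes you can list your sites' folder paths (the example paths are hypothetical); any two document roots owned by the same uid can read and write each other's files regardless of how many FTP accounts exist:

```python
import os
from collections import defaultdict

def find_shared_owners(docroots):
    """Group document roots by owning uid. Any uid that owns more than
    one docroot means those sites are NOT isolated from each other."""
    by_uid = defaultdict(list)
    for path in docroots:
        by_uid[os.stat(path).st_uid].append(path)
    return {uid: paths for uid, paths in by_uid.items() if len(paths) > 1}

# Example usage on hypothetical paths:
# shared = find_shared_owners(["/var/www/blog", "/var/www/shop"])
# for uid, paths in shared.items():
#     print(f"uid {uid} owns multiple sites: {paths}")
```

If this check reports shared owners, separate FTP logins make no difference: the web server still executes every site's scripts as that one user.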
I can clean one website now and leave the rest for later.
Again, unfortunately, this doesn't work. By the time the last website is cleaned, the first one will already be reinfected by its 'sick' neighbors, and you will end up in an infinite cleanup loop.
What if websites are hosted on a VPS/VDS/Dedicated Server?
For a dedicated server, the same rules apply. It is too risky to place multiple websites under one user account. Create a separate system user for each website, and place each website in a separate jailed environment owned by its user.
If you already have a VPS/VDS with several websites hosted in one folder, it is recommended to spend some time splitting the websites and placing them under different jailed locations owned by separate system users. It might save you a lot of time and money in the future.
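After splitting the sites under separate system users, each document root should also be closed to 'other' users (e.g. mode 750 or 700), so that even a compromised neighbor cannot traverse into it. Below is a minimal sketch of such a permission check; the paths in the usage comment are hypothetical examples:

```python
import os
import stat

def world_accessible(path):
    """Return True if 'other' users have any read/write/execute access
    to the directory, meaning neighboring accounts could reach into it."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IROTH | stat.S_IWOTH | stat.S_IXOTH))

# Example usage on hypothetical docroots:
# for docroot in ["/var/www/blog", "/var/www/shop"]:
#     if world_accessible(docroot):
#         print(f"{docroot}: permissions too open, tighten to 750 or 700")
```

This is only one piece of the setup: the ownership split itself (useradd, chown) requires root access, and the web server must be configured to run each site's scripts as its own user (for example, via PHP-FPM pools or a similar per-site mechanism).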
What are the other benefits of website isolation?
- Each website owner can connect over the secure SFTP protocol instead of insecure FTP.
- You can set up separate performance and resource settings for each website without worrying that it will interfere with the functionality of other websites.
- Individual access can be granted to each website for SEO specialists, digital editors or web developers, without worrying if they might break neighboring websites (in error or deliberately).
- If any major security events occur, you can fix or cure the websites individually, one by one, with no stress or hassle.
I hope this information helps you to avoid mass infection or hacking issues with your websites, or, if this has already happened, effectively resolve the incident. Along with professional security advice, a comprehensive security solution such as Imunify360 is essential to keep such incidents from happening in the first place.