Google malicious content – hints on how to remove it

Why was my site blacklisted by Google?

Unfortunately, the majority of blacklisted websites are in reality legitimate websites into which hackers have inserted malicious content. Here are some common causes:

  • your HTML or PHP code was hacked
  • redirects to harmful and infected sites were inserted through iframes or JavaScript
  • flash .swf files were added to your website

What is Malware?

It is malicious software specifically created to gain access to or damage a computer without the knowledge of the owner. Viruses, worms, and Trojan horses are examples of malicious software that are often grouped together and referred to as malware. Malware can open access to your personal information, such as passwords and credit card numbers, or alter your search results without your knowledge.

How to fix the issue

Below are some suggestions on how to resolve this issue if it is happening to your site:

1. Scan and remove the malicious content from your site.
2. Scan your PC.
3. Register and verify your site in Google’s free Webmaster Tools.
4. Sign into Webmaster Tools and check the Security Issues section to see details of sample URLs that may be infected.
5. Find the cause of the security issue and fix it. Otherwise, your site is likely to be reinfected.
6. Request a review in the Security Issues section in Webmaster Tools when your entire site is clean and secure. Once Google determines your site is fixed, they will remove the malware label.
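As a rough illustration of step 1, here is a minimal Python sketch that flags files containing patterns often seen in injected code (obfuscated PHP payloads, hidden iframes, obfuscated JavaScript redirects). The patterns are illustrative assumptions only – legitimate code can match them, and real scanners check far more – so treat any hit as a starting point for manual review, not a verdict:

```python
import re
from pathlib import Path

# Hypothetical patterns commonly associated with injected malicious code.
SUSPICIOUS = [
    re.compile(r"eval\s*\(\s*base64_decode", re.I),       # obfuscated PHP payload
    re.compile(r"<iframe[^>]+(?:display\s*:\s*none|width=[\"']?0)", re.I),  # hidden iframe
    re.compile(r"document\.write\s*\(\s*unescape", re.I), # obfuscated JS redirect
]

def scan_file(text):
    """Return the suspicious patterns found in one file's text."""
    return [p.pattern for p in SUSPICIOUS if p.search(text)]

def scan_site(root):
    """Walk a document root and report files matching any pattern."""
    hits = {}
    for path in Path(root).rglob("*"):
        if path.suffix.lower() in {".php", ".html", ".js"}:
            found = scan_file(path.read_text(errors="ignore"))
            if found:
                hits[str(path)] = found
    return hits
```

Running `scan_site("/home/user/public_html")` would list every matching file together with the patterns it triggered.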

How to – Sitemap

What is a sitemap and why do you need it?

A sitemap displays the structure of your site, its sections and the links between them. It helps users navigate your site easily, and it also contributes to better SEO rankings. Sitemaps improve the SEO of a site by making sure that all the pages can be found – this is particularly important if a site uses Adobe Flash or JavaScript menus that do not include HTML links.

Moreover, whenever your site gets updated, your sitemap notifies search engines about it. There are two types of audiences a sitemap is useful to – site visitors and web spiders, and there are also two types of sitemaps: HTML and XML sitemaps.

An HTML sitemap is for visitors – it helps them find information on the page, and it’s usually located in the website’s footer.
An XML sitemap is for web crawlers – it tells them which parts of the site should be indexed, as well as the hierarchy and priority of the site content. The instructions pointing crawlers to the sitemap are provided in the robots.txt file.

Creating a sitemap and submitting it to Google

Getting a sitemap for your site is simple. You can do it in three steps: generate a sitemap, upload it to your site and notify Google about it. There are two ways to generate a sitemap – either download and install a sitemap generator or use an online sitemap generation tool. There are a lot of options available.
Some of them are free, but they often have a crawl cap on the number of site URLs, so it’s up to you to decide which one to use.

When choosing an XML sitemap generator, pick one that allows reviewing the crawls of URLs and deleting any duplicated URLs, excluded URLs, etc. – you only want to include the pages on the site that you want a search engine to index.

Once you have created a sitemap, you need to upload it to your site’s document root and let Google know about it – this means adding the site to your Google Sitemaps account.
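If you would rather script it than use a generator, the XML format is simple enough to produce by hand. A minimal Python sketch (the URLs are placeholders, and a real sitemap would also carry per-page `lastmod`, `changefreq` and `priority` values):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    today = date.today().isoformat()
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(u)}</loc>\n"
        f"    <lastmod>{today}</lastmod>\n"
        "  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

The resulting file is what you would upload to the document root as sitemap.xml.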

NOTE: In order to be able to add the sitemap to your account, you need to be the legitimate owner of the site.

Sitemaps: tips and tricks

  • When creating a sitemap, go back and make sure all of your links are correct.
  • All the pages on your sitemap should contain a link back to the sitemap.
  • Most SEO experts say your sitemap should contain 25 to 40 links. Keeping it around this size also makes your site more user-friendly and readable.
  • Your sitemap should be linked from your home page. If linked from other pages, the web crawler might find a dead end and leave.
  • The anchor text (words that are clickable) of each link should contain a keyword whenever possible and should link to the appropriate page.
  • Small sites can place every page on their site map, but this is not a good idea for larger sites. You just don’t want search engines to see an endless list of links and assume you are a link farm.

As you can see, we’ve provided instructions only for the Google search engine – we chose it because it’s the most important one.

There is also Bing, and the procedure is similar for it. At the moment, Yahoo! and MSN do not support sitemaps, or at least not in the XML format used by Google. Yahoo! allows submitting a text file with a list of URLs, and MSN does not offer even that.

What is a robots.txt file and how to use it

Robots.txt – General information

Robots.txt is a text file located in the site’s root directory that specifies which website pages and files you do or don’t want search engines’ crawlers and spiders to visit. Usually, site owners strive to be noticed by search engines, but there are cases when it’s not needed: for instance, if you store sensitive data, or if you want to save bandwidth by not indexing heavy pages with images.

When a crawler accesses a site, it requests a file named ‘/robots.txt’ in the first place. If such a file is found, the crawler checks it for the website indexation instructions.

NOTE: There can be only one robots.txt file per website. A robots.txt file for an addon domain needs to be placed in the corresponding document root.

Google’s official stance on the robots.txt file

A robots.txt file consists of lines containing two fields: a line with a user-agent name (the search engine crawler the rules apply to) and one or several lines starting with the Disallow: directive.

Robots.txt has to be created in the UNIX text format.

Basics of robots.txt syntax

Usually, a robots.txt file contains something like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /~different/

In this example, three directories – ‘/cgi-bin/’, ‘/tmp/’ and ‘/~different/’ – are excluded from indexation.

NOTE: Every directory is written on a separate line. You can’t write ‘Disallow: /cgi-bin/ /tmp/’ in one line, nor can you break up one directive Disallow or User-agent into several lines – use a new line to separate directives from each other.

‘Star’ (*) in the User-agent field means ‘any web crawler’. Consequently, directives of the type ‘Disallow: *.gif’ or ‘User-agent: Mozilla*’ are not supported – please pay attention to such logical mistakes, as they are the most common ones.

Other common mistakes are typos – misspelled directories or user-agents, missing colons after User-agent and Disallow, etc. As your robots.txt file gets more and more complicated, it’s easy for an error to slip in, so online validation tools come in handy.
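One quick way to sanity-check a robots.txt file yourself is Python’s standard urllib.robotparser module: feed it the rules and ask whether specific URLs would be allowed. A small sketch (the rules and URLs are made up for the demo):

```python
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)  # for a live site you would use rp.set_url(...) and rp.read()

print(rp.can_fetch("*", "https://example.com/cgi-bin/script"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))       # True
```

If a URL you expected to be blocked comes back as fetchable (or vice versa), the rules need another look.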

Examples of usage

Here are some useful examples of robots.txt usage:

Prevent the whole site from indexation by all web crawlers:

 User-agent: *
Disallow: / 

Allow all web crawlers to index the whole site:

User-agent: *
Disallow:

Prevent only several directories from indexation:

User-agent: *
Disallow: /cgi-bin/ 

Prevent the site’s indexation by a specific web crawler:

 User-agent: Bot1
Disallow: / 

Lists of user-agent names, split into categories, can be found online.

Allow indexation to a specific web crawler and prevent indexation from others:

User-agent: Opera 9
Disallow:

User-agent: *
Disallow: /

Prevent all the files from indexation except a single one.

This is quite difficult, as the original robots.txt standard does not define an ‘Allow’ directive (although many modern crawlers, including Googlebot, support one). Instead, you can move all the files you don’t want indexed into a certain subdirectory and prevent its indexation, leaving the one file you want indexed outside of it:

 User-agent: *
Disallow: /docs/ 

You can also use one of the online robots.txt file generators.

Robots.txt and SEO

Removing exclusion of images

The default robots.txt file in some CMS versions is set up to exclude your images folder. This issue doesn’t occur in the newest CMS versions, but older versions need to be checked.

This exclusion means your images will not be indexed and included in Google Image Search – and being listed there is something you do want, as it increases your SEO rankings.

Should you want to change this, open your robots.txt file and remove the line that says:

   Disallow: /images/ 

Adding reference to your sitemap.xml file 

If you have a sitemap.xml file (and you should, as it increases your SEO rankings), it is good to include the following line in your robots.txt file:

Sitemap: http://www.yourdomain.com/sitemap.xml

(This line needs to be updated with your own domain name and sitemap file name.)

Miscellaneous remarks

  • Don’t block CSS, JavaScript and other resource files by default, as blocking them prevents Googlebot from properly rendering the page and understanding that your site is mobile-optimized.
  • You can also use the file to prevent specific pages, like login or 404 pages, from being indexed, but this is better done using the robots meta tag.
  • Adding disallow statements to a robots.txt file does not remove content; it simply blocks access for spiders. If there is content that you want to remove, it’s better to use a meta noindex tag.
  • As a rule, the robots.txt file should never be used to handle duplicate content. There are better ways, like a rel=canonical tag, which is part of the HTML head of a webpage.
  • Always keep in mind that robots.txt is not subtle. There are often other tools at your disposal that can do a better job, like the parameter handling tools within Google and Bing Webmaster Tools, the x-robots-tag, and the meta robots tag.
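For reference, the robots meta tag mentioned above is a single line placed in a page’s <head>; `noindex` keeps the page out of search results, and `nofollow` can be added to stop link following as well:

```html
<meta name="robots" content="noindex">
```

Unlike a robots.txt disallow, this lets the crawler fetch the page but tells it not to index it.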

Robots.txt for WordPress

WordPress creates a virtual robots.txt file once you publish your first post. However, if you already have a real robots.txt file on your server, WordPress won’t add a virtual one.

A virtual robots.txt doesn’t exist on the server, and you can only access it at yourdomain.com/robots.txt.

By default, it will have Google’s Mediabot allowed, a bunch of spambots disallowed and some standard WordPress folders and files disallowed.

So in case you didn’t create a real robots.txt yet, create one with any text editor and upload it to the root directory of your server via FTP.

Blocking main WordPress directories

There are 3 standard directories in every WordPress installation – wp-content, wp-admin and wp-includes – that don’t need to be indexed.

Don’t disallow the whole wp-content folder, though, as it contains an ‘uploads’ subfolder with your site’s media files that you don’t want blocked. That’s why you need to proceed as follows:

Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/ 
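Putting these directives together with the sitemap reference described earlier, a typical WordPress robots.txt might look like this (example.com is a placeholder for your own domain):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/

Sitemap: https://example.com/sitemap.xml
```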

Blocking on the basis of your site structure

Every blog can be structured in various ways:

a) On the basis of categories
b) On the basis of tags
c) On the basis of both, or neither of those
d) On the basis of date-based archives

a) If your site is category-structured, you don’t need to have the Tag archives indexed. Find your tag base in the Permalinks options page under the Settings menu. If the field is left blank, the tag base is simply ‘tag’:

   Disallow: /tag/ 

b) If your site is tag-structured, you need to block the category archives. Find your category base and use the following directive:

Disallow: /category/ 

c) If you use both categories and tags, you don’t need any directives. If you use neither of them, you need to block both:

 Disallow: /tag/
Disallow: /category/ 

d) If your site is structured on the basis of date-based archives, you can block those in the following ways:

 Disallow: /2010/
Disallow: /2011/
Disallow: /2012/
Disallow: /2013/ 

NOTE: You can’t use Disallow: /20*/ here as such a directive will block every single blog post or page that starts with the number ’20’.

Duplicate content issues in WordPress

By default, WordPress produces duplicate pages, which do no good to your SEO rankings. To fix this, we would advise you not to use robots.txt, but instead go with a subtler way: the ‘rel=canonical’ tag, which you use to place the single correct canonical URL in the <head> section of each page. This way, web crawlers will only crawl the canonical version of a page.
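For example, a canonical tag placed in a page’s <head> looks like this (the URL is a placeholder; SEO plugins typically generate it for you):

```html
<link rel="canonical" href="https://example.com/sample-post/">
```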

Does dedicated IP affect SEO?

Sometimes SEO consultants recommend that their clients get a dedicated IP for better SEO. However, it is not entirely true that having a dedicated IP improves your position in search results.

Let’s take a closer look at this question.

Performance and Speed

The speed of your website significantly impacts your search results, so you should always strive to make your website as fast as possible.

If your account is located on a shared hosting server, your website most likely shares the same IP with other websites, some of which may handle high traffic or heavy content, and that can slow your website down a little. And speed affects SEO, as it influences your search engine ranking position with Google. However, it doesn’t follow that getting a dedicated IP will speed up your website.

Luckily, there are numerous optimization methods to speed up website loading time significantly. If you have a WordPress website, you can refer to a guide on optimizing and speeding up WordPress. You can check your website with online tools such as Google PageSpeed to track the speed results after each step and optimize the recommended items. You may also consider our Business SSD plans with pure SSD disks, which are faster and more reliable. SSD provides improved performance, particularly for larger, heavier websites such as popular Magento stores or very busy WordPress blogs.

A server with SSD disks reads data faster than a standard HDD or a hybrid system in which several servers share one external SSD-powered storage. SSD disks are more resistant to drops, bumps and g-forces, run quietly, and have lower access times and less latency compared to HDD.

Improved performance is achieved by higher read/write speeds on SSD disks, which allow websites to load faster. With an SSD, a server can handle more read/write requests before becoming unstable, which improves the stability and uptime of the server: even during peak activity, backed-up I/O requests can be served.


Malware and shared IP addresses

On a shared server, your website can share an IP address with another website that has been flagged as malware. In such cases, some network security programs can ‘null route’ requests to that IP, making your website unreachable. Also, some anti-virus software may mark all websites on the same IP as malware, causing false alerts against your website; such alerts can be displayed in most modern browsers (for example, ‘This site may harm your computer’ or ‘The website ahead contains malware’). So having a dedicated IP can be a good way out.


SSL Certificates

SSL certificates encrypt traffic to your site. They are usually required by e-commerce websites and websites that store personal information.

With cPanel version 11.38 and higher, SNI technology allows installing multiple SSL certificates on a shared IP address, so a dedicated IP is not a must for installing an SSL certificate. However, having several SSL certificates on a shared IP address may cause issues with some rare and/or old browsers: visitors using them will receive a message about an untrusted connection, even though the website is still available via the HTTPS protocol. A dedicated IP address helps to avoid such issues. Here is a list of browsers supporting SNI:

Desktop Browsers:

  • Internet Explorer 7 and later on Windows Vista and later
  • Internet Explorer (any version) on Windows XP does not support SNI
  • Mozilla Firefox 2.0 and later
  • Opera 8.0 (2005) and later (TLS 1.1 protocol must be enabled)


  • Google Chrome:
    • Supported on Windows Vista and later
    • Supported on Windows XP on Chrome 6 and later
    • Supported on OS X 10.5.7 on Chrome v5.0.342.1 and later


  • Safari 2.1 and later:
    • Supported on OS X 10.5.6 and later
    • Supported on Windows Vista and later

Mobile Browsers:

  • Mobile Safari for iOS 4 and later
  • Android default browser on Honeycomb (v3.x) and later
  • Windows Phone 7

This doesn’t affect SEO directly, but it should be mentioned as well.

SEO and Shared IP addresses

The real thing you should take into account is Google penalties due to links or malware.

It may happen that your website is hosted on a shared server alongside a website considered ‘spammy’, and your website could be negatively impacted in this case. However, we are constantly working to prevent such occurrences on our servers.

Google understands that a website can be on shared hosting and that you can’t control the websites which share the same IP or IP subnet. However, if almost all websites on the same IP are ‘spammy’ and only one is ‘normal’, that looks rather bad and is worth worrying about. On the other hand, when there is a mix of ‘spammy’ and ‘normal’ websites on the same IP (as on most shared hosting), there is nothing to worry about.

For example, suppose your website shares a host and IP with a big WordPress Multisite that is used as a ‘link farm’, generating links to ‘spammy’ websites to try to gain search engine ranking positions. In this case, your website risks being penalized for sharing the same IP as this ‘link farm’, because of violations involving the thousands of other sites on the network.

Google’s Penguin update improved their search ranking algorithm to take into account websites participating in a link network.

Monitoring Your Website for Issues

1. Utilize a Reverse IP Lookup

You can always look up who is sharing your hosting IP address. There are many free tools on the Internet for doing a reverse IP lookup.
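If you prefer to check programmatically, a small Python sketch can group a list of domains by the IP address they resolve to. The resolver is injectable, so the demo below runs offline with made-up domains and addresses; with the default resolver it performs real DNS lookups:

```python
import socket
from collections import defaultdict

def group_by_ip(domains, resolve=socket.gethostbyname):
    """Group domain names by the IP address they resolve to.

    `resolve` is injectable so the logic can be demonstrated without
    network access; by default it performs a real DNS lookup.
    """
    groups = defaultdict(list)
    for domain in domains:
        try:
            groups[resolve(domain)].append(domain)
        except OSError:
            groups["unresolved"].append(domain)
    return dict(groups)

# Offline demo with a fake resolver (hypothetical addresses):
fake_dns = {"a.example": "203.0.113.5", "b.example": "203.0.113.5",
            "c.example": "203.0.113.9"}
print(group_by_ip(fake_dns, resolve=fake_dns.__getitem__))
```

Any group with more than one entry represents domains sharing an IP, which is exactly what a reverse IP lookup service reports.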

[Screenshot: example of a reverse IP lookup result]

2. Monitor Google Webmaster Tools

You can add your website to Google Webmaster Tools and enable email alerts to be sure you will get a notification if Google sends you a warning.

Here’s an example:

[Screenshot: example of a Google Webmaster Tools warning email]

3. Resolving Link Warnings

If you discover a problem on the website, or are notified by Google in Google Webmaster Tools, you can take the following steps:

  • Contact your hosting provider and ask them to provide you with another IP.
  • Submit a Reconsideration Request in Google Webmaster Tools.


There can be exceptions to the rule, but in general a dedicated IP address is not a must for improving SEO. Always check your website with Google Webmaster Tools, monitor your co-hosted sites, and pay attention to the security and optimization of your own website.

There are many other SEO-related things you should focus on before considering a dedicated IP address. For example, you can check Google’s Webmaster Guidelines for more information.

If you still think you need a dedicated IP address, make sure you understand ‘WHY’ so that you and your hosting provider can find a solution that works for you.

That’s it!


5 Quick Ways to Speed Up Your WordPress Site

No one likes waiting around for a site to load, so much so that 40 per cent of people abandon a site that takes more than three seconds to load.

Slow page load speeds are especially crippling for eCommerce sites. Almost 80 per cent of shoppers who are dissatisfied with a site’s performance are less likely to buy from the same site again, while a one second delay decreases customer satisfaction by 16 per cent.

Even Google factors site speed into their algorithm when ranking websites. So if your site loads too slowly you can expect your Google rankings to fall, and in turn attract less traffic to your site.

So what can you do to make your site faster? Here are a few simple ways to keep your page load times low and your visitors happy.

Use a Great Caching Plugin

If you’ve got static images, CSS and Javascript on your website that rarely change, browser side caching can help make your site snappier.

Caching involves storing parts of your site so they only need to be loaded once instead of every time a user visits your site. Caching is especially helpful for your return visitors, as well as others who visit several pages of your site.

W3 Total Cache is a popular caching plugin for WordPress used by many high-traffic sites.

WP Super Cache is a more user-friendly alternative and you don’t need to be a server expert to set it up.
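Under the hood, browser-side caching boils down to HTTP headers that tell the browser how long it may reuse a file. If you prefer to set this up by hand on an Apache server instead of (or alongside) a plugin, a minimal .htaccess sketch might look like the following – the lifetimes are illustrative assumptions, so adjust them to how often your content actually changes:

```apacheconf
# Hypothetical mod_expires rules for an Apache .htaccess file.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png              "access plus 1 month"
  ExpiresByType image/jpeg             "access plus 1 month"
  ExpiresByType text/css               "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Wrapping the rules in `<IfModule>` keeps the site working even if mod_expires isn’t enabled on the server.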

Compress Images

Images are usually the largest files on a site so if they aren’t compressed they can take ages to load.

Luckily there are some great tools out there to help you compress your files.

WP Smush, now managed and supported by WPMU DEV, automatically strips meta data from JPEGs and removes unused colors from indexed images.

TinyPNG is another great tool, which also strips unused colors by applying lossy compression.

If you use a lot of images on your site, you might want to implement lazy loading. The Lazy Load plugin loads only the images above the fold when a new visitor arrives at your site; the remaining images load once the user starts scrolling down the page. This technique not only speeds up page load times, but also saves bandwidth for users who don’t scroll all the way to the bottom of your pages.

Minify HTML, CSS and Javascript

In other words, remove all white space from code where possible.

While spaces and tabs make code more readable for humans, servers and browsers couldn’t care less as long as it’s valid and executes without error.

Rather than manually sifting through your code with a fine-tooth comb, plugins like WP Minify and W3 Total Cache can handle this at runtime.
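To make the idea concrete, here is a naive CSS minifier sketch in Python. It only strips comments and collapses whitespace – real minifiers (and the plugins above) handle many more edge cases, such as strings and calc() expressions, so this is an illustration rather than a production tool:

```python
import re

def minify_css(css):
    """Naive CSS minifier: strip comments and collapse whitespace."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # trim space around punctuation
    return css.strip()

print(minify_css("body {\n  color: red;  /* brand color */\n}"))
```

The same spirit applies to HTML and JavaScript: the browser only needs the tokens, not the formatting.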

Cut Down on HTTP Requests

Every time someone visits a page on your site, the corresponding files must be sent to that person’s browser, including images, CSS files and Javascript library references. So if you have an HTML file, two CSS files, five Javascript files and eight images, that’s a total of 16 files that need to be loaded.

By reducing the number of objects in your site’s pages, you can minimize the number of HTTP requests that are required to render a page, speeding up load times.

One way to do this is by simplifying the design of your site, and combining files such as scripts and CSS. The minify section in W3 Total Cache allows you to add your CSS and Javascript files so you can easily combine them into one file.

Optimize Database Tables

Optimizing your database tables is like defragging your computer or changing the oil in your car – it will help free up space and keep your database running smoothly.

You can optimize your database tables manually using phpMyAdmin or with a plugin.
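If you take the manual route, the underlying operation is a single SQL statement run from phpMyAdmin’s SQL tab or the MySQL command line. The table names below are the defaults from a standard WordPress install (no table prefix customization assumed):

```sql
-- Reclaim unused space and defragment the main WordPress tables.
OPTIMIZE TABLE wp_posts, wp_postmeta, wp_comments, wp_options;
```

The plugins mentioned below essentially run the same statement on a schedule for you.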

WP-DBManager allows you to optimize, repair, backup and restore your database.

There’s also WP-Optimize, another database cleanup and optimization tool. This plugin also lets you remove post revisions, comments in the spam queue, un-approved comments and items in trash.


This is a quick round-up of simple measures you can put in place to speed up your pages.

Optimizing your site can make a big difference in site speed, encouraging visitors to stick around and engage with your content.

These tips shouldn’t take very long to put in place and for the effort you put in you’ll get a speedier site and happier visitors.

Set up an email account that uses your domain name

The steps have been broken in two parts. First we’ll see how to create the domain email address. Second, we’ll integrate that domain email with your Gmail account.

1. Create the domain name email address

1. Log into your blog hosting control panel (cPanel).

2. Click on Email Accounts in the Email section.

3. Enter the details for your new account, and click Create Account, as shown here.

[Screenshot: creating an email account in cPanel]

4. You will see a notification that reads something like this: “Success! Account Created.” The account will be shown on the same page.

5. Now go back to your cPanel and click on Forwarders in the Mail section. Then click Add Forwarder.

6. Fill in all the details as shown below. Then click Add Forwarder and you’re done.

[Screenshot: adding a forwarder in cPanel]

Now all the emails sent to your new domain address will be forwarded to your personal email address.


2. Integrate your new domain email with Gmail

1. Sign in to your Gmail account.

2. Go to Options, then to Mail Settings, then click Accounts and Import.

3. Under Send Mail As, click on Add Another Email Address You Own.

4. In the popup that appears, fill in your details, add the new domain email address you just created, then click Next.

5. Click on Send Verification, and a verification email will be delivered to your inbox. Simply click on the link to verify it, and you are done.

6. Now, click on Compose Email, and see the changes you’ve made in action.

We hope these steps are clear enough for you to set up your own domain email address.

How To Ensure Your Website Content Is Secure

Securing your entire website with SSL is an integral part of website security and is a great way to build up customer trust. If you are thinking of adding, or have already added, full SSL encryption to your website pages, this guide offers tips on ensuring your website content is secure.

Having full SSL encryption means your web address will start with https:// instead of http:// – therefore it’s important that all your website content is accessed via https://, too.

Upon adding SSL to your website you can preview the secure https:// version of your site before anyone else – we recommend you browse your website pages and check for any certificate or browser warnings that could crop up due to “mixed” or “insecure” content.

Things to look out for

Pay particular attention to any content you have added such as:

  • HTML Fragments
  • Externally hosted images
  • A wallpaper that is hosted elsewhere
  • Any third-party widgets or tools

It’s likely that the provider of any third-party content will have an https:// version available on their website or upon request. If they do, you can add this to your website and improve the experience for your visitors.

Preparing your content for SSL

There are some best practices for ensuring your content is ready for SSL – the following tips will help to ensure the best experience for your visitors, but are also beneficial for your SEO ranking:

  • Check that all internal URLs are “relative” where possible – for example, a relative URL points directly to the page (“/home.html”), while an “absolute” URL includes the domain name (e.g. “https://www.example.com/home.html”)
  • Add the HTTPS version of your domain name to both Google and Bing Webmaster Tools

Implementing the tips above will help search engines like Google index your pages with ease, and will also help to avoid those potential “mixed content” or browser warnings.
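As a quick self-check before switching a page over to HTTPS, you could scan its markup for hard-coded http:// references in src and href attributes – those are exactly what triggers mixed-content warnings. A small Python sketch (the sample HTML below is made up; a real audit would also cover CSS url() values and inline scripts):

```python
import re

def find_insecure_refs(html):
    """Return http:// URLs referenced by src/href attributes -- these
    would trigger 'mixed content' warnings on an https:// page."""
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)["\']', html, flags=re.I)

page = '''<img src="http://cdn.example.com/logo.png">
<link rel="stylesheet" href="https://example.com/style.css">'''
print(find_insecure_refs(page))
```

Anything the function reports should be switched to https:// or made protocol-relative.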


How can I stop search engine indexing certain pages on my website?

Sometimes you may wish to disallow search engines from indexing certain pages on your site – maybe they’re under construction, or only for certain customers. You can easily stop search engines from indexing a certain page of your website.

To do this please follow the instructions below:

  1. Go to the ‘Content’ screen.
  2. Click on the ‘Page options’ icon for that page.
  3. Click the tab titled ‘Meta Info’.
  4. Select the option “No” from the drop-down for “Include In Sitemap”.
  5. Click “Save Changes”.

What is SMTP?

The Mailman Inside Our Computers. Or:
What Is Simple Mail Transfer Protocol?

Almost all of your online activity is made possible through the help of protocols—the special networking-software rules and guidelines that allow your computer to link up to networks everywhere so you can shop, read news, send email and more. (Internet Protocol, as in your IP address, is just one of many.)

The protocols are vital to your networking activity and, fortunately for you, you don’t need to manage, install or even think about them. They’re built in to the networking software on your computers. Thank goodness for advanced technology and IT geniuses!

Still, every once in a while, you may find yourself having to learn about a protocol—such as your IP address. That’s the case with a term that affects every email you’ve ever sent out in your entire life—Simple Mail Transfer Protocol, or SMTP. Without it, your emails would go nowhere.

What is SMTP?

SMTP is part of the application layer of the TCP/IP protocol suite. Using a process called “store and forward,” SMTP moves your email on and across networks. It works closely with something called the Mail Transfer Agent (MTA) to send your communication to the right computer and email inbox.

SMTP spells out and directs how your email moves from your computer’s MTA to an MTA on another computer, and even several computers. Using that “store and forward” feature mentioned before, the message can move in steps from your computer to its destination. At each step, Simple Mail Transfer Protocol is doing its job. Lucky for us, this all takes place behind the scenes, and we don’t need to understand or operate SMTP.

SMTP at work.

SMTP provides a set of codes that simplify the communication of email messages between email servers (the network computer that handles email coming to you and going out). It’s a kind of shorthand that allows a server to break up different parts of a message into categories the other server can understand. When you send a message out, it’s turned into strings of text that are separated by the code words (or numbers) that identify the purpose of each section.

SMTP provides those codes, and email server software is designed to understand what they mean. As each message travels towards its destination, it sometimes passes through a number of computers as well as their individual MTAs. As it does, it’s briefly stored before it moves on to the next computer in the path. Think of it as a letter going through different hands as it winds its way to the right mailbox.
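Concretely, a minimal SMTP exchange between a client (C) and a server (S) looks roughly like this – the host names and addresses are placeholders, and real sessions typically start with EHLO and add authentication and encryption:

```
S: 220 mail.example.com ESMTP ready
C: HELO client.example.org
S: 250 mail.example.com
C: MAIL FROM:<alice@example.org>
S: 250 OK
C: RCPT TO:<bob@example.com>
S: 250 OK
C: DATA
S: 354 End data with <CR><LF>.<CR><LF>
C: Subject: Hello
C:
C: Just testing SMTP.
C: .
S: 250 OK: queued
C: QUIT
S: 221 Bye
```

The three-digit numbers are the “codes” described above: 2xx means success, 3xx means the server expects more input, and 4xx/5xx signal temporary or permanent failures.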

Nothing fancy about it.

SMTP is able to transfer only text—it isn’t able to handle fonts, graphics, attachments, etc.—maybe that’s why it’s called simple. Fortunately, Multipurpose Internet Mail Extensions were created to lend a hand. MIME encodes all the non-text content into plain text. In that transformed format, SMTP is coaxed into transferring the data.
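Python’s standard email library shows this in action: once a non-text attachment is added, the message is automatically wrapped in a MIME multipart container that SMTP can carry as plain text (the addresses and file contents below are placeholders):

```python
from email.message import EmailMessage

# Sketch: build a MIME message with a binary attachment.
msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.com"
msg["Subject"] = "Report attached"
msg.set_content("Plain-text body; MIME carries the attachment below.")
msg.add_attachment(b"fake,csv,data", maintype="application",
                   subtype="octet-stream", filename="report.csv")

# The top-level content type is now a multipart MIME container.
print(msg.get_content_type())
```

Handing `msg` to `smtplib.SMTP.send_message()` would then deliver it over an ordinary SMTP session.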

SMTP sometimes stands for “stop.”

Most of us don’t know this, but our Internet Service Providers typically have a limit to the number of emails we can send out over a certain amount of time. Most of the time, it’s limited to a set number per hour or per day.

Each ISP relies on its SMTP to determine (and govern) the amount of email that can be sent out over one connection. (It is a protocol, after all.) For some people who work at home or manage large mailing lists, that could be a problem. After they hit their limit, the ISP will simply stop sending emails. If they think you’re a spammer, they might even shut down your account.

That email limit varies by ISP. For example, the typical Comcast Cable Internet customer is limited to 1,000 emails per day. (Their business customers have a limit of 24,000 emails daily.)

What is: cPanel

cPanel is a web-based hosting control panel provided by many hosting providers, allowing website owners to manage their websites from a web-based interface. It gives users a graphical interface from which they can control their portion of the Unix server, with tools designed to simplify running and controlling a website. cPanel uses a tiered structure that allows different levels of access: administrators and end users can control the different aspects of the server and the website directly through their browser. cPanel is generally accessed via https on port 2083, or simply by adding “/cpanel” to the end of the host name. Depending on the hosting provider, cPanel will generally include some sort of auto-installer or package dedicated to content management systems like WordPress.

[Screenshot: the cPanel interface]

With WordPress installed, a user can use cPanel to manage the features offered by their WordPress hosting plan – popular features include managing databases, domain names, mail accounts, and backups. Software like cPanel makes it extremely easy for users to manage their hosting on their own, with little or no technical knowledge of web hosting and without breaking anything.