WP SEO: What you need to know

Are You Still Doing SEO like It’s 2004?

Search engine optimization (SEO) is an important investment for website owners. With the recent Google algorithm update — in fact with each algorithm update — web operators are asking themselves again:

  • Should my company be worried?
  • How would we know if something is going to lower our rankings on the search engine results page?
  • How often is our ranking affected by negative factors?
  • Could my website be punished because of someone else’s actions?
  • What does a decline in search rankings cost my company’s brand in dollars and cents?

How Does SEO Work?

SEO uses a combination of methods to increase the number of visitors to a website. Properly done, it produces a high-ranking placement in search engine results pages (SERP). It is multidisciplinary and benefits from good website design, appropriate content, proper coding and audience outreach.

There are two ways to influence search engine algorithms, and the relevant factors fall into two basic areas:

  • On-page SEO refers to the elements that make up a webpage, such as HTML/CSS/JavaScript, text content (including images), page title, meta description, heading tags and URL/permalink. It’s also helpful to embed structural elements such as H1, H2 and H3 tags in your text; doing so helps search engines understand how your content is structured and identify what’s most important. On-page SEO also includes technical factors such as how fast your page loads, how long your visitors stay on your site and engage with your content, and how your site appears on mobile devices. (A minimal audit sketch follows this list.)
  • Off-page SEO refers predominantly to external signals such as backlinks. Backlinks can come from many sources, including news articles, blog posts, blog comments and article submissions.
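To make the on-page items above concrete, here is a minimal audit sketch, not an Incapsula tool, that fetches a page with the Python standard library and prints its title, meta description and heading structure. The URL is a placeholder; swap in your own page.

```python
# Minimal on-page SEO audit sketch (standard library only).
from html.parser import HTMLParser
from urllib.request import urlopen


class OnPageAudit(HTMLParser):
    """Collects the title, meta description, and heading tags from a page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.headings = []      # e.g. [("h1", "Welcome"), ("h2", "Features")]
        self._current = None    # tag whose text we are currently capturing

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")
        elif tag in ("title", "h1", "h2", "h3"):
            self._current = tag

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current in ("h1", "h2", "h3") and data.strip():
            self.headings.append((self._current, data.strip()))

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None


if __name__ == "__main__":
    url = "https://www.example.com/"   # placeholder URL
    audit = OnPageAudit()
    audit.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    print("Title:           ", audit.title.strip())
    print("Meta description:", audit.meta_description)
    for level, text in audit.headings:
        print(f"{level.upper()}: {text}")
```

Running a quick check like this against your own pages shows at a glance whether each page has a unique title, a description, and a sensible H1/H2/H3 hierarchy.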

What Is Black Hat (Negative) SEO?

Originally, one of the best strategies to achieve high SEO ranking was to accumulate as many backlinks as possible. The quantity and sources of web page backlinks are among the factors that Google’s algorithm evaluates to determine how important any page is.

At the time, a backlink was treated as a vote of confidence from one site to another. For example, a site with 500 backlinks would likely outrank one with only 100.

Then site owners began to game the system by creating blogs and sites across several domains. They would buy links and insert spam comments in forums and blogs, all with very little content and even less value. All they had to do was point links from those sites back to their primary site, instantly promoting it once the search engine bots crawled the links. That worked for a very short time.

It took a few years for search engines to realize what was going on. Google changed the way links were processed with the first Panda release in 2011. It continued to count links, but it put more weight on the authority of the linking site as well.

Websites that were gaming the system also felt the wrath of Google. The iconic search engine began downgrading the trustworthiness and authority of the target site. In this way, it penalized sites that created a web of linked sites and looked to buy low-quality links instead of earning relevant links.

Traffic, as a result, plummeted. To keep everyone in line, Google fastidiously kept records of sites known to abuse these link networks. Sites with bad backlinks were effectively demolished in the rankings.

No longer able to generate the same revenue, blog networks adapted to Google’s approach. Understanding that they would now suffer SERP penalties if they promoted their own sites with the old tricks, they turned the tables on business rivals.

Webmasters would buy links from blog networks. But they now directed those links to target their competitors’ sites, driving their ranking down with thousands of backlinks red-flagged by the search engines.

Caveat: If you’re writing relevant content for your audience and building useful, relevant links to back up your story, you are following SEO best practices. But you still need to be aware of what your competitors might be doing.

How to Improve Your SEO

SEO is primarily about writing relevant content: by orienting your reader to what the story is about and helping your target audience find it, you are practicing SEO. Here are some best practices we use at Incapsula.

  • Publish relevant content: Write for your audience. Find out what they are reading and write what you know. Your audience will return for more.
  • Update your blog and other site content: Readers will return if content is fresh. The secondary benefit is that search engines like updated content, too, as it reflects a site’s relevance to its audience.
  • Use metadata and structured data, such as microdata: These give you an opportunity to add context to your content and more information for the search engines. Areas to address are title metadata and description metadata. While the description isn’t a ranking factor itself, it helps drive clicks and engagement from the search results to your site – which is a ranking factor.
    In addition, structured data is code that helps Google understand what your content is, how it is structured, and how to better display it to relevant readers, including in Google’s “featured snippets” (see the sketch after this list).
  • Link to relevant information: Let your reader know where you are leading them by using anchor text that tells them what the link is about.
  • Use alt tags: Tagging your images gives you an added opportunity to let search engines find your content, and the alt text shows up when images can’t be displayed.
  • Use an SSL/TLS certificate: In 2014 Google announced that sites using HTTPS would get a slight ranking boost in its search results. As part of its search approach, Google considers whether sites use secure, encrypted connections over HTTPS.
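As an illustration of the structured data point above, here is a small sketch that uses Python’s json module to generate a schema.org Article block in JSON-LD form, which would then be pasted into the page’s head. All field values are placeholders, not data taken from this article.

```python
# Sketch: generating a schema.org Article JSON-LD block to embed in a page.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Are You Still Doing SEO like It's 2004?",
    "description": "Why link-buying tactics no longer work and what to do instead.",
    "datePublished": "2018-01-01",   # placeholder date
    "author": {"@type": "Organization", "name": "Incapsula"},
}

# The <script type="application/ld+json"> tag is what crawlers read.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(article, indent=2)
    + "\n</script>"
)
print(snippet)
```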

Note: SSL, or secure sockets layer (now succeeded by TLS), encrypts data. It’s the standard security technology for establishing an encrypted link between a web server and a browser. An SSL connection ensures that the data exchanged between the web server and the user remains private and intact within their session. SSL can add latency because of the SSL handshake, which can be mitigated by accelerating content delivery through a CDN, as in the case of Incapsula.
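To see the handshake latency mentioned in the note, here is a rough sketch using Python’s socket and ssl modules that times the TCP connection and the TLS handshake separately. The hostname is a placeholder, and real numbers depend on your network, server and CDN.

```python
# Rough sketch: compare plain TCP connect time with the extra time spent on
# the TLS handshake. Results vary per network; hostname is a placeholder.
import socket
import ssl
import time

host, port = "www.example.com", 443   # placeholder host

# Time the TCP connection on its own.
start = time.perf_counter()
sock = socket.create_connection((host, port), timeout=10)
tcp_time = time.perf_counter() - start

# Time the TLS handshake layered on top of that connection.
context = ssl.create_default_context()
start = time.perf_counter()
tls_sock = context.wrap_socket(sock, server_hostname=host)
handshake_time = time.perf_counter() - start

print(f"TCP connect:   {tcp_time * 1000:.1f} ms")
print(f"TLS handshake: {handshake_time * 1000:.1f} ms")
print("Negotiated protocol:", tls_sock.version())
tls_sock.close()
```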

How Incapsula Dedicated IP Can Help Boost SEO for Your Site

Imperva Incapsula IP Protection can also help. The service ensures that customers have a unique static IP address for their website. Once an IP address is allocated, it’s never shared with other Incapsula clients. This gives customers additional control over their TLS certificate.

Here’s how using an Incapsula dedicated IP helps solve problems in situations such as the following:

  • A site is flagged as risky because its IP address is shared with bad-neighbor sites.
  • HTTPS traffic needs to bypass WAF inspection and tunnel directly to a specific origin server (which affects all domains sharing the IP).
  • Non-SNI clients (e.g., APIs) need to be served with a custom SAN certificate covering multiple customer domains.
  • Non-SNI clients need to be served with a custom cipher list or TLS version.

Since only customer domains can appear on the Incapsula-generated SAN certificate list, no other brands or competitors will share the same certificate.
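You can verify for yourself which domain names appear on the certificate a site actually serves. The following sketch (placeholder hostname, Python standard library only) connects over TLS and prints the certificate’s Subject Alternative Name entries.

```python
# Sketch: list the Subject Alternative Name (SAN) entries on the certificate
# a site presents, to confirm which domains share it.
import socket
import ssl

host = "www.example.com"   # placeholder host
context = ssl.create_default_context()

with socket.create_connection((host, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()   # parsed certificate as a dict

print("Certificate subject:", dict(x[0] for x in cert["subject"]))
for kind, value in cert.get("subjectAltName", ()):
    print(f"SAN {kind}: {value}")
```

If competitor or unrelated brand names show up in that SAN list, the certificate is being shared; with a dedicated IP and certificate, only your own domains should appear.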

The benefits of using Incapsula IP Protection include:

  • Immediate flagging of your site as safe
  • A significantly reduced risk of your site being flagged negatively
  • IPv6 capabilities if your network requires them

What’s your SEO story? We look forward to your comments.