An Exploit refers to an entry point or ‘door way’ that hackers can use to break into or otherwise compromise your website. Exploits can be software bugs, or even correctly operating software that can be used to access, vandalise or compromise your site.
Redirects are an important SEO tactic, used to patch up holes in the fabric of your website.
Protecting Landing Pages
Say Google puts one of your web pages into search results (ie it becomes a landing page). It is then likely that searchers will see and visit that page. All is good.
You then unceremoniously remove that page, or change its URL address. Google will still display the original address in search results (sometimes for quite a while). Visitors who click the search result will see a ‘page not found’ error because that page no longer exists.
So, you are likely to lose that client, but wait there’s more!
Google is likely to:
- Remove that page from search results
You lose a valuable organic landing page and all the subsequent traffic it may have delivered to your site into the future.
- De-rate your site because it now has errors
Google doesn’t like poor quality content (which includes errors)
Woe is you! Not only did you lose a lead and probably all future leads to that page, but now Google has punished your site and you’re likely to lose even more leads.
Here’s another scenario where 301 redirects can save your site. Let’s assume that someone had kindly put a backlink to the landing page mentioned above. This is gold because Google likes them, and frankly it’s probably one of the reasons the page is a landing page anyway.
People visiting the remote site might click the backlink to your site and discover your products and services. Instant sales leads from that backlink. Wow!
Then you decide to remove the page or change its URL as per our scenario above. The backlink doesn’t connect to a page in your site anymore. It goes nowhere. Google’s love of your site is reduced because you lost that backlink, and maybe others. Even would-be sales prospects from the remote site can’t click through to your site anymore, so they go and buy elsewhere. Dark days Trev.
This is where 301 redirects can save your website’s SEO credibility. Patching previous landing pages or backlink inbound pages through to real pages will preserve their status. Importantly, using a 301 (permanent) redirect instead of a temporary (302) or refresh redirect ensures search engines remain happy with your patching.
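For example, on an Apache web server (the most common host for WordPress sites) a 301 redirect can be patched in via the .htaccess file — the paths below are placeholders for your own old and new page addresses:

```apache
# Permanently (301) redirect the removed landing page to its replacement,
# preserving search rankings and backlink value
Redirect 301 /old-landing-page/ https://www.example.com/new-landing-page/
```

Other servers and most CMS redirect plugins offer the same thing; the key is that the redirect is declared as 301 (permanent), not 302 (temporary).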
301 redirects are the SEO marketer’s friend
Rank is the position on the search results page where your website is listed for a given search term.
This is the page where visitors arrive into your site; it’s their entry point, and typically, but not always, it’s your home page.
With organic search it’s Google that decides which pages are entry points, not you; although with careful design you can exert influence.
In Pay Per Click (PPC) advertising systems like AdWords, the landing page can make a significant difference to conversions, so there is usually careful consideration of landing page design, call to action, etc.
In 2007, however, robots.txt was elevated to contemporary relevance through the introduction of the Sitemaps Protocol.
Backlinks are (usually) SEO goodness for your site. These are connections from another site to yours and represent an acknowledgment of your website by the remote site. Google and other search engines examine the backlinks to your site to determine its ‘authority’.
Note that Google polices backlink quality carefully, using its Penguin spam detection sub-system to sort good backlinks from bad. If your site has ‘bad’ backlinks it may incur Google’s wrath in the form of a ranking penalty.
A nofollow tag can be added to a backlink to remove the backlink’s ‘SEO goodness’. This is typically done to avoid incurring Penguin penalties for contravening Google’s Quality Guidelines, eg links on paid advertising sites.
This simple piece of HTML code does the work: rel="nofollow"
Note that the variation rel="nofollow external" also indicates that the link is both nofollow and external to the site.
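In context, a nofollow backlink looks like this (the URL and anchor text here are placeholders):

```html
<!-- The rel="nofollow" attribute tells search engines not to pass
     any ranking credit to the linked site -->
<a href="https://www.example.com/" rel="nofollow">Example Site</a>
```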
These terms label techniques for improving your site’s search engine exposure in two categories:
Black Hat SEO Techniques are those that Search Engines don’t approve of.
These are typically schemes or strategies that trick the search engine into thinking your website warrants more exposure than it otherwise should.
White Hat refers to SEO techniques that search engines sanction.
The labels relate to old Western movies where the bad guys wore black hats, and of course the good guys wore white hats.
Keyword match describes the relationship between a target or seed term and those terms that Google associates (or matches) with it.
Matches are used in AdWords and the Google Keyword Tool to define the target term(s).
Match types include:
Broad match is a loose match. For example, it could include a match with:
- individual terms
- terms re-ordered
- other terms similar to the original terms
e.g. these terms could potentially be broad matches for the term pink elephant:
pink, elephant, elephant pink, pink elephants, pink pachyderms, pink elephant trunk
Syntactically, broad match is shown without markup.
Phrase match is a ‘string’ match – a match of the target terms in sequence.
eg These terms could potentially be phrase matches for the term pink elephant: pink elephant, pink elephants, bright pink elephant
These terms are not a phrase match: pink pachyderm, pink large elephant
Syntactically, phrase match is shown with quotation marks, ie "pink elephant"
Exact match is just that: an exact match of the target term(s).
eg The exact match for the term pink elephant is: pink elephant
These terms are not an exact match: pink pachyderm, pink large elephant, pink elephants
Syntactically, exact match is shown with square brackets, ie [pink elephant]
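The three match types above can be sketched in code. This is only a toy illustration, not Google’s actual matching algorithm (which also handles plurals, synonyms and misspellings); the function names are our own:

```python
def exact_match(target: str, query: str) -> bool:
    # Exact match: the query equals the target term exactly
    return query.lower() == target.lower()

def phrase_match(target: str, query: str) -> bool:
    # Phrase match: the target terms appear in the query, in sequence
    return target.lower() in query.lower()

def broad_match(target: str, query: str) -> bool:
    # Broad match (very loosely): any individual target word
    # appears somewhere in the query, in any order
    target_words = set(target.lower().split())
    query_words = set(query.lower().split())
    return bool(target_words & query_words)

# The pink elephant examples from above:
print(exact_match("pink elephant", "pink elephant"))         # True
print(phrase_match("pink elephant", "bright pink elephant")) # True
print(broad_match("pink elephant", "elephant pink"))         # True
```

Note how each match type accepts a superset of the one below it: every exact match is also a phrase match, and every phrase match is also a broad match.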
This is a road map of your website especially designed for search engine spiders and generally not seen by human visitors.
The XML Sitemap tells search engines which pages to ‘index’, or record in their database, along with optional information such as priorities, date last updated, etc.
The XML sitemap is typically stored at <your web address>/sitemap.xml but this file name is not mandated.
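A minimal sitemap.xml looks like the sketch below, following the sitemaps.org protocol — the URLs and dates are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-06-01</lastmod>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/services/</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only the loc element is required per page; lastmod and priority are the optional hints mentioned above.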
Robots.txt can optionally direct the search engine spider to the sitemap.
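That direction is a single Sitemap line in robots.txt (the address shown is a placeholder):

```
User-agent: *
Sitemap: https://www.example.com/sitemap.xml
```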
Ideally the XML Sitemap is created by your website’s Content Management System (CMS) so you don’t have to remember to update it when you edit and add content.
There are plugins to do this for WordPress (XML Google Site Map Plugin by Arne Brachhold is a fav) as well as for Joomla, etc.
If you have a flat HTML site, or your CMS doesn’t provide an XML Sitemap plugin, there are some great tools that you can use to manually build the sitemap, eg:
- Websites including www.xml-sitemaps.com
- Software like Screaming Frog SEO Spider
- Notepad – if you are a masochist 🙂
Website content that, despite being published on the internet, will not be featured in search results.
This can occur when:
- the website has not been registered with a search engine
ie the entire site is invisible to search
- the search engine crawler cannot reach pages within the website
ie one or more page(s) are invisible to search
If your website is invisible to search then it’s not going to attract sales leads from people searching online for your products or services. They’ll find and go to your competitors instead!
We are a small business based in suburban Adelaide, South Australia.