Claudio's Blog Just another MA Web Design & Content Planning site


Google Webmaster Tools (extended article)

Analytics and competitors research 

According to Wikipedia, Search Engine Optimization (SEO) is the practice of improving the visibility of a website or web page in search engine results; the higher a site or page ranks, the more visitors it will receive.

It is for this reason that SEO is considered an important internet marketing strategy, indispensable at the launch of a new website and even more so during its ongoing management and administration, to increase and drive more traffic to the site. This factor becomes even more important if the website is an online business.

That being said, different techniques exist for on-site SEO* and off-site SEO*, and several tools are available to help webmasters with the ongoing optimization of a site; among these, the most relevant and widely used are provided by Google (Google Webmaster Tools), Bing (Bing Webmaster Tools) and Yahoo! (Yahoo! Site Explorer, through Bing).


*On-site SEO: optimization of on-page elements - meta data, page title, page content, authoritative content.
*Off-site SEO: optimization of off-page elements - inbound links, social media signals, etc.

Google Webmaster Tools

Google Webmaster Tools is a free service that allows webmasters to find out how Google indexes their websites, providing reports about the visibility of pages in Google Search so that webmasters can act to improve their search ranking.

Generally, it offers tools that let webmasters:

-      Submit and check a site map.

-      Check and set the crawl rate.

-      View stats about how Googlebots access a website.

-      Generate and check robots.txt files.

-      See internal and external pages linking to a site.

-      View search queries and click through rates.

-      View stats on how Google indexes the site and any errors found while doing it.

-      Set a canonical (preferred) Domain.

However, before looking at any data, the first thing to do is to register a site with Google Webmaster Tools and verify ownership of it. Site verification can be done in several different ways and is essential in order to use the tools.


Site Verification


Meta Tag Verification:   According to Google, this method is the best option if the server cannot be accessed directly; it is done by adding an HTML or XHTML meta tag to the home page of the uploaded site.

HTML File Verification:   This is done by uploading a file to the web server at the location specified by Google.

Other Verification methods are the DNS TXT record and the Google Analytics tracking code (through Google Analytics).

Once site verification is done, webmasters have access to Google Webmaster Tools and may begin configuring their sites; though before getting too deep into how Google Webmaster Tools works, it is a good idea to look at how Google Search finds web pages matching a query and returns search results.


Crawling, Indexing, Serving

The delivery of search results involves three processes: crawling, indexing and serving. The first two are tasks performed by search engine spiders*, while the third is performed by the search engine's algorithms*.

Crawling:   Simply, this is the travelling from site to site, or page to page, to discover content and links. It is the process by which Googlebot finds new and updated pages to be added to the Google index. Googlebot, also known as a robot, bot or spider, is the program that crawls pages and detects new sites, changes to existing sites and dead links; this data is used to update the Google index.



Indexing:   This is the process of building an index, or database, of keywords and phrases, the sites and pages relevant to them, and the links within those sites. Keywords may include key content tags (title tag, etc.) and attributes (ALT attribute, etc.). However, the content of rich media files or dynamic pages cannot be processed.

Serving:  The method by which Google serves results to users depends on over 200 factors, one of which is PageRank. PageRank is a measure of the importance of a page, based on the incoming links from other pages. Not all links are equal, as some are identified as spam links by Google. Hence, the best types of links are those earned through the quality of content. Good quality content is key to getting people to link to your site.



*Search engine spider: (also known as a robot or a crawler) a program that follows, or "crawls", links throughout the internet, grabbing content from sites and adding it to search engine indexes.
*Algorithm: a step-by-step procedure for calculations.

Google Webmaster Tools: Configuring Your Site

Once a site has been registered with Google Webmaster Tools and verified for ownership, webmasters have the option to provide Google with information about their websites to help Google crawl and index them better. Such information tells Google about the pages and content of a site, how webmasters wish these to be crawled, and which domain they prefer to be shown in search results.


 Configuring Your Site: Sitemaps

Sitemaps are a way to tell Google about your pages. Simply, an XML Sitemap is a list of the pages on a website. Sitemaps allow webmasters to directly submit their pages to Google to make sure that Google knows where they are but also which of these are the most important ones. In addition, Sitemaps can be used to provide Google with metadata about specific content on a site, including video, images, news, mobile and software source code.


Sitemaps are helpful if:

  • A site has dynamic content (always changing)


  • Pages are difficult for Googlebot to crawl (rich in Ajax or images)


  • A site is new and has few links to it


  • A site is very large and pages are not well linked

Google can accept sitemaps in several formats, but the recommended one is XML. Webmasters can create XML sitemaps through Google Webmaster Tools or other tools available on the internet. Alternatively, sitemaps can be created manually.
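As a rough illustration of how simple the XML format is, the sketch below builds a minimal sitemap from a list of page URLs. The URLs and function name are placeholders, and the real Sitemap protocol supports further optional tags such as <lastmod> and <priority> that are omitted here:

```python
# Minimal sketch: build a basic XML Sitemap from a list of page URLs.
# The URLs below are placeholders -- substitute your own site's pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # The xmlns attribute is required by the Sitemap protocol.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["http://www.example.com/", "http://www.example.com/about.html"]))
```

The resulting file would then be uploaded to the web server and submitted through Google Webmaster Tools.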

RSS, mRSS (media RSS) and Atom 1.0 are formats also accepted by Google. These formats are useful for submitting RSS or Atom 1.0 feeds from blogs, or for telling Google about media content on a site (mRSS). Images, video and other specific content types can also be submitted with dedicated sitemap extensions, such as those for general URLs and Code Search.


Google recommends, however, creating separate Sitemaps for news content; these Sitemaps will be crawled very frequently to check for new articles.

Basic web sitemaps (web page URLs only; not images, video, etc.) can be created with a simple text file that contains one URL per line.


Note that XML Sitemaps are not a replacement for HTML Sitemaps:

  1. XML Sitemaps do not pass PageRank and cannot be used for navigation.
  2. HTML Sitemaps are the ones used to navigate a site.

The information that can be viewed on the Sitemap Details Page in Google Webmaster Tools is:

  • Sitemap format: e.g. standard Sitemap, RSS feed, Atom feed, URL list, or Sitemap index


  • URLs submitted and indexed


  • Type of content (images, video, and News content) submitted and indexed.


  • The date the Sitemap was submitted and processed.


If you have multiple websites, you can create one or more Sitemaps that include URLs for all your sites, and save the Sitemaps to a single location. More on Sitemaps can be found in Google's Webmaster Tools documentation.
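For instance, a Sitemap index file pointing at per-site Sitemaps might look like this (example.com and the file names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-site1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-site2.xml</loc>
  </sitemap>
</sitemapindex>
```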


 Configuring Your Site: Crawler Access

To prevent content or pages of a site from being accessed by Google or other search engines, a robots.txt file is needed to specify how search engines should crawl the site.




Robots automatically check whether a robots.txt file exists when crawling a site. Pages disallowed by a robots.txt file will not be shown in search results. However, if Google finds the same pages through incoming links on other sites, their URLs may still be listed. Also, some robots may not respect the robots.txt file, so robots.txt is not recommended for highly private content.


Also, robots.txt needs to be placed in the top-level directory of a web server in order to be useful, e.g. http://www.example.com/robots.txt.


robots.txt file syntax examples:


Block all web crawlers from all content:
User-agent: *
Disallow: /


Block a specific web crawler from a specific folder
User-agent: Googlebot
Disallow: /no-google/


Block a specific web crawler from a specific web page
User-agent: Googlebot
Disallow: /no-google/blocked-page.html


Allow a specific web crawler to visit a specific web page
User-agent: *
Disallow: /no-bots/block-all-bots-except-rogerbot-page.html
User-agent: rogerbot
Allow: /no-bots/block-all-bots-except-rogerbot-page.html


Tell all web crawlers where to find the Sitemap (example.com is a placeholder)
User-agent: *
Sitemap: http://www.example.com/sitemap.xml


Alternatives to the robots.txt file are X-Robots-Tag directives (noindex, nofollow, etc.) used as an element of the HTTP header, and noindex meta tags placed in the <head> section of a page.


e.g.   “X-Robots Tag Syntax”
HTTP/1.1 200 OK
Date: Tue, 25 May 2010 21:42:43 GMT
X-Robots-Tag: noindex


e.g. “noindex Meta Tag”
To prevent all robots from indexing a page on your site:
<meta name="robots" content="noindex">
To allow other robots to index the page on your site, preventing only Google's robots from indexing the page:
<meta name="googlebot" content="noindex">


"noindex" meta tags prevent a page from being indexed even if other sites link to it. However, there's still a small chance that Googlebot or other search engines won't respect the noindex meta tag.

Thus, the most secure method to keep confidential content on pages completely private is to keep it in a password-protected directory on the server.

E.g. if you're using the Apache web server, you can edit your .htaccess file to password-protect a directory on your server.
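A minimal sketch of such an .htaccess file (the AuthName text and the .htpasswd path are placeholders; the .htpasswd file itself is created separately with Apache's htpasswd utility):

```apache
# Require a username and password for everything in this directory
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /full/path/to/.htpasswd
Require valid-user
```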

Configuring Your Site: Sitelinks

Sitelinks are links automatically generated by Google to help users navigate a site.



Google has complete control over which sitelinks are displayed, though best practices to improve the quality of sitelinks can be followed. Also, sitelinks can be blocked if the owner of the site considers them incorrect or inappropriate.


Configuring Your Site: Change of Address 

It is important to tell Google if a site has been moved to a different domain.



When moving a site to a new domain:

  • Manage the amount of change (first move your site, then launch a redesign if needed)


  • Check all your links (internal, external, incoming, etc.)


  • Tell Google through the Change of Address tool in Google Webmaster Tools and add the new domain to your account.


  • Create and submit a new sitemap.

The best way to move a site to a different domain is by 301 Redirect.

301 Redirects are useful if:

  • A site has been moved to a new domain.


  • A site can be accessed through different domains (in this case redirect to a preferred domain)


  • Two sites need merging, so all their links must be redirected

A 301 redirect on an Apache server can be done within the .htaccess file.
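For example, assuming Apache with mod_alias and the placeholder domain example.com, the .htaccess file on the old server could contain:

```apache
# Permanently redirect the whole old site to the new domain
Redirect 301 / http://www.example.com/

# Or redirect just one moved page (use one approach or the other):
# Redirect 301 /old-page.html http://www.example.com/new-page.html
```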



Configuring Your Site: Settings

This section lets webmasters tell Google the geographic location they are targeting and their preferred domain, and also change Google's crawl rate.



Settings: Geo-targeting     

Geo-targeting is the process of associating a URL with a geographic location, so that Google can determine how a site should appear in search results in response to a user's query from a particular geographic location.

Sites with country-coded top-level domains (such as .uk) are already associated with a geographic region, in this case the United Kingdom.

Sites with an international domain (.com, .org, etc.) are assigned a geographic location by Google through the IP address, location information on the page, links to the page, etc.


Settings: Preferred Domain

The preferred domain, or canonical domain, is the URL that webmasters choose to be shown in search results (e.g. the www version, http://www.example.com, or the non-www version, http://example.com).

By telling Google the preferred domain, Google will crawl and index only the specified domain. After specifying a preferred domain, a 301 redirect may be needed.


Settings: Crawl Rate

Crawl rate is the speed at which Googlebot crawls a site. This can be changed through Google Webmaster Tools. Note that the crawl rate does not determine how often Google crawls a site.


Configuring Your Site: URL Parameters

When a website serves the same content under URLs that differ only in their parameters, the result is duplicate content. This limits the number of pages crawled by Google and affects search results.

Duplicate content is normally grouped, and pages with duplicate content are consolidated by Google, so telling Google how to handle these pages will help it make the most of its crawling.

To avoid duplicate content, it is good practice to set a preferred (or canonical) URL.



Setting a canonical domain gives more control over how URLs appear in search results and also consolidates properties such as link popularity.

Canonical URLs can be set in two ways:

  1. Add a rel="canonical" link to the <head> section of the non-canonical version of each HTML page.
  2. Indicate the canonical version of a URL by responding with the Link rel="canonical" HTTP header.
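Sketched with the placeholder domain example.com, the two approaches look like this:

```html
<!-- 1. In the <head> of each non-canonical version of the page -->
<link rel="canonical" href="http://www.example.com/page.html">

<!-- 2. Or sent as an HTTP response header (shown here as a comment):
     Link: <http://www.example.com/page.html>; rel="canonical"   -->
```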

Also, 301 redirects can be used to send traffic from non-canonical pages to the preferred URLs. Alternatively, you can tell Google about the preferred domain by including it in the submitted Sitemap or through parameter handling in Google Webmaster Tools.

Once Google has all this information about a site, it will show more accurate data. Such data is reported on the dashboard page of Google Webmaster Tools, grouped under the category "Your Site on the Web".


Your Site on the Web: Data

All the data that Google provides to help webmasters optimise their sites refers to search queries, inbound links, keywords, internal links and subscriber stats.


Data: Search Queries

The Search Queries page shows the most popular search terms that return your site in search results, as well as the most clicked keyword phrases based on your ranking. In other words, it lets you find out how visitors are finding your site: see which queries most often return your site and which queries users click to access your site.

 Such data includes:

Search Queries:     the total number and the top search queries through which your site appears in search results.

Impressions:    the number of times pages from your site appeared in search results.

Clicks:    the number of times pages from your site were clicked for a particular query.

Click-through rate (CTR):    the percentage of impressions that resulted in a click through to your site.

Average Position:    the average position of your site in the search result page for a particular query.
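As a quick worked example of how these figures relate (the numbers are made up):

```python
# If a query produced 1,000 impressions and 50 of them were clicked:
impressions = 1000
clicks = 50
ctr = clicks / impressions * 100  # CTR as a percentage
print(f"CTR: {ctr:.1f}%")  # prints "CTR: 5.0%"
```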


By default these stats are grouped together, but they can be filtered so that you see only specific information, for example stats from mobile phones, the web, etc.

In addition to providing a good barometer of how you rank in Google for your keywords, this data provides a road map for internal linking opportunities.

It is important to know which phrases are clicked the most and which terms users use to find your site, so that you can increase the number of keywords or create internal links with the same words to improve PageRank. You should always have useful content relevant to your keywords and also try to improve your content according to the stats.

Data is likely to differ from that provided by other tools such as Google Analytics. A reason for this is that, although both tools have something in common, they operate differently when processing data.


Data: Links to your Site

The Links to Your Site page shows a list of all the links to your site (inbound links, anchor text, etc.) and also the pages on your site with the most links. If users reach a page on your site as a result of clicking a link with a redirect, that intermediate link will also be listed.


Links to your pages determine the importance of those pages, and so their PageRank. They also tell you where your traffic is coming from and which other sites are interested in yours. This will help you build up your traffic.

Links that are not going to be listed on your webmaster reports are:

  • Links from pages blocked by robots.txt files.


  • Broken links


  • Links from non-canonical URLs

However, any errors that Google encounters during crawling, including these, will be listed on the Crawl Errors page.

Also, to find out about your inbound links you can perform a Google search using the "link:" operator (e.g. link:example.com).


Data: Keywords

The Keywords page lists the most significant keywords, and their variants, that Google finds when crawling a site. The more frequently a keyword appears on your pages, the more significant it will be to Google.


If unexpected keywords, such as "Viagra", appear on the list, this could be a sign that your site has been hacked.

A list of keywords helps you judge whether those top 20 keywords are relevant to your site's niche, and whether these are the keywords you want to rank for in search engines.

Google judges your content based on the top 20 significant keywords that appear on your site. Terms related to those keywords will rank you higher.

When you find the keywords you want to focus on, increase their occurrences in the content of your site by writing more content based on those keywords, rather than stuffing the existing content or your meta descriptions. Also try to build long-tail phrases and write content around those long-tail phrases.


Data: Internal links

The Internal links page shows a list of all the pages of your site which have incoming links from other pages on your site. The number of internal links pointing to a page is a signal to search engines about the relative importance of that page.


Data: Subscriber Stats

Feeds are used to keep track of a large number of sites or blogs without having to check each site individually.


Subscriber Stats shows the number of Google users who have subscribed to your feeds using any Google product (such as Reader, iGoogle, or Orkut). However, users can subscribe to feeds in many different ways, so the actual number of subscribers to your site may be higher than the number shown. Also, only feeds with actual subscribers are listed.



All the errors and malware issues that Google finds while crawling your pages are listed in the Diagnostics category, where stats on crawling and HTML suggestions can also be found.

Diagnostic: Malware

If Google detects that your site has been hacked, it will tell you about it in Webmaster Tools. To clean up your site, follow Google's advice.



Diagnostic: Crawl errors

The Crawl errors page provides details about the URLs in your site that Google could not crawl.


Possible errors:

  • 404 or Not Found (an HTTP response code indicating that the client was able to communicate with the server, but the server could not find what was requested).


  • URLs not followed (URLs that spiders were unable to completely follow)


  • URLs restricted by robot.txt files


  • URLs timed out (DNS lookup timeout, URL timeout, robots.txt timeout)


  • HTTP errors (various Hypertext Transfer Protocol errors)


  • URL unreachable (errors in the server, server is busy, lack of communication with the server, etc) 


  • Soft 404s (some websites report a "not found" error by returning a standard web page with a "200 OK" response code; this is known as a soft 404. Soft 404s are problematic for automated methods of discovering whether a link is broken, and can occur as a result of configuration errors when using certain HTTP server software).
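A toy sketch of the kind of heuristic an automated link checker might use to flag a soft 404 (the function name and phrases are illustrative, not any real tool's API):

```python
def looks_like_soft_404(status_code, body_text):
    """Flag a response that says 200 OK but whose body reads like an error page."""
    error_phrases = ("not found", "page does not exist", "no longer available")
    body = body_text.lower()
    return status_code == 200 and any(p in body for p in error_phrases)

print(looks_like_soft_404(200, "Sorry, page not found"))  # True: a soft 404
print(looks_like_soft_404(404, "Not found"))              # False: a real 404
```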

In addition, specific crawl error reports exist for Google News publishers.

Possible errors:

  • Article fragmented (The article appears to consist of isolated sentences not grouped together into paragraphs)


  • Article too long (The article appears to be too long to be a news article)


  • Article too short (The article appears to be too short to be a news article)


  • Date not found (publication date of the article is missing)


  • Date too old


  • Empty article


  • Etc...


Diagnostic: Crawl Stats

This page shows Googlebot activity over the last 90 days. The information is presented in charts and consists of:

  • Pages crawled per day


  • Kilobytes downloaded per day


  • Time spent downloading a page


Diagnostic: Fetch as Googlebot

This tool lets you see a page as Googlebot sees it, which is useful for troubleshooting a page's poor performance in search results (e.g. pages rich in media file content).



Information provided by this tool includes:

  • The HTTP response returned by your server


  • The date and time of your crawl request


  • HTML code


  • Indexable Content



Diagnostic: HTML Suggestions

The HTML Suggestions page shows details of problems that Google finds with your site's HTML, CSS, etc. during crawling and indexing.


Data that may be included on this page are:

  • Title problems:   Potential problems with the title tag on your pages, such as missing or repeated page titles.


  • Meta description problems:   Potential problems with duplicate or otherwise problematic Meta descriptions.


  • Non-indexable content:   Pages containing non-indexable content, such as some rich media files, video, or images.

In addition, Google Webmaster Tools lets you check Activity, Audience and Search Impact data under +1 Metrics, which generally refers to Google+ and any +1 buttons appearing on your site.

Other useful tools are available through Webmaster Labs, which lets you try out experimental new features of Google Webmaster Tools; these are not 100% polished but are still very useful.

Alternative products offering similar tools to Google Webmaster Tools are Bing Webmaster Tools, which has now incorporated Yahoo! Site Explorer, and SEOmoz PRO, which provides a complete set of software and tools to maximise search engine optimization, similar in scope to SEO Administrator.


Bing Webmaster Tools

Bing Webmaster Tools is very similar to Google Webmaster Tools in that it provides webmasters with similar types of data. It has a home page for all user tasks, such as settings, messages and resources, and a dashboard page organized around three key areas: Crawl, Index and Traffic.

Once a site has been registered and verified for ownership, webmasters have access to data such as impressions, clicks, indexed pages, crawled pages and pages with crawl errors.

More detailed information on crawl, index and traffic is organised under each of those categories, with sub-categories for crawl summaries, crawl settings, crawl details, sitemaps, markup validation, etc. Under the Index category, URLs can be submitted or blocked, stats on inbound and other links can be viewed, and parameters can be specified through URL Normalization. Compared with Google Webmaster Tools, extra tools are included, such as Index Explorer, which lets webmasters learn how thoroughly their sites have been crawled and indexed by Bing. Additionally, data on ranking and traffic, including average impression position and average click position, are shown combined from Bing and Yahoo! results.

Query volumes for phrases and keywords can also be found through the organic keyword research tool. These can be filtered by geographic location and language.



SEOmoz PRO is not a free product like Google Webmaster Tools or Bing Webmaster Tools, but it offers equally excellent tools.


The first thing you see after signing up is the welcome screen, which lets you visit your dashboard or start your first campaign.


Starting a campaign lets you set up your site, keywords and competitors. On the "Campaign Overview" page there are three main sections: Crawl Diagnostics, Keyword Rankings and Competitive Link Analysis. Each of these sections provides summaries and detailed information on the most common errors and warnings flagged during the last crawl.

In addition, SEOmoz PRO offers an overall keyword performance view and a summary of how well your site compares against up to three competitors. Data on keywords is shown under "Rankings". Other sections, such as "Crawl Diagnostics", report all the issues found by crawling, such as duplicate page content, server and client errors, and the SEOmoz crawler being blocked by robots.txt files.

Reports on how specific URLs perform for chosen keywords are also available, together with a side-by-side overview of a site against its competitors; this includes Total Links, External Followed Links, Linking Root Domains, etc. This data is accessible from "Link Analysis".

Traffic data shows information on traffic, and can also be integrated with Google Analytics accounts.

Moreover, through the PRO dashboard you have access to "Tools Reports", to see reports about the tools you have used, and to Q&A, to get answers to your questions. Additionally, "SEO Resources" can help with learning how to use these tools.


Other tools that can be used alongside these for further analytics and competitor research are Google Analytics, Alexa and Compete, among others.




Thank You...














Filed under: Uncategorized No Comments

Website for typography

The IAAH’s website is an experimental and unconventional site where typography is used as a means of visual communication, in addition to being meaningful text that tells the user how to navigate the site. Big headings in different font sizes prompt the user to perform an action and indicate where to click. These headings are surrounded by a generous amount of white space that helps maintain visual balance. The composition of the text is well organized through the use of a grid system; it is readable both for its contrast against the background and for the linearity of each line of text. Readability is also enhanced by the use of line-height and paragraphs, and by the fact that each line of text is made up of 7 to 10 words distributed within columns. The typeface used is not sharp and works well even though letter spacing is very limited on some text. The text is presented in different formats and with different formatting depending on the page, and in most cases takes the form of an image that creates the graphic. Such prominent text is distributed sparingly across pages and flows within the composition, also through the use of different colors that enhance the sense of playfulness throughout the site.


What I learned this week (18-01-2012)

Web Technology

Websites are stored on a server. Servers are provided by hosting companies such as Echo Web Hosting.

Web Server Hardware = Computer  (Dell, HP)

A server can hold up to around 1,000 websites. This is called a shared server.

A large website needs its own server. This is called a dedicated server.

Web Server Operating System: Linux (most popular) or Windows based

Web Server Software (Apache, IIS): serves the pages requested by the public

Web Server Scripting Language (PHP, ASP): a computer language. PHP is free; ASP is similar to PHP but you pay for support.

Database: where the content is stored. PHP asks for content from the database and sends it to the person requesting it.

Disk Space: Available storage space for a website

Bandwidth: the amount of data that can be transferred from the web server to users. It is usually measured over a period of a month.

Cloud: like an external hard drive for a server, used to store websites with a huge amount of videos, music or galleries. It is not necessarily located in the same place as the server. Types of websites that use it are YouTube, Vimeo, etc.

Client-Side Languages: HTML, CSS, JavaScript (processed on the computer requesting the website).

Ajax: a technique for querying a database without having to press submit.

Backups: backups should be taken regularly when using a CMS, since most of the work you do is in the database on the server. Downloading the website plus the database to your computer prevents issues if something goes wrong with the server. Hosting companies do take backups regularly, though.

Content Management System

A CMS is web-based software that allows people with no technical knowledge to manage the content of their sites through user-friendly interfaces. It also makes development of the site faster.

A CMS is built using a scripting language such as PHP, and the content is stored in a database such as MySQL. The content can be updated from any computer if the CMS is on the web.

Personal Templates can be uploaded.

Open Source: free of charge, and the code is not encrypted, so it can be edited.

Each CMS has its own plug-ins to extend its functionality.

Blogging platforms are ideal for websites that are very user-based (e.g. WordPress).

Image gallery CMSs are ideal for uploading images, creating galleries, etc., e.g. Pixelpost.

eCommerce CMSs are dedicated to selling things online. They handle payment gateways, taxes and postage.

Specialist CMSs (Moodle, MediaWiki)


A disadvantage of using a general CMS is that the only support you get is through the forum, whereas with a paid CMS you get personal support if something goes wrong.

Enterprise-level CMS: big companies custom-build their own CMSs.

As standard, a CMS allows you to design your own templates and upload them so that you use your own themes.

Plug-ins may already be built by the support company, or you can build your own or get someone to build one for you.

Perch and WordPress have an "undo" facility, so that old versions of pages can be restored.

Use a CMS that is easy for a client to use.

Hybrid CMS: two CMSs may be combined if one doesn't offer some of the functionality needed.

Most CMSs create search-engine-friendly URLs through the Apache web server.

When installing a CMS, the web host has to support the CMS used, since each CMS has its own requirements.






Grids in Web design.

“A Grid System is a rigid framework that is supposed to help graphic designers in the meaningful, logical and consistent organization of information on a page. Rudimentary versions of grid systems have existed since medieval times, but a group of graphic designers, mostly inspired by ideas from typographical literature, started building a more rigid and coherent system for page layout.”


1 "Villiard Diagram"


The grid system is therefore an organizing principle with roots in the “Villard Diagram” but also in other theories of aesthetic measurement, such as the golden ratio and the rule of thirds; principles that had been used for a very long time before the modern typographic grid was implemented. It was in fact in the 1950s that Emil Ruder and other representatives of the Swiss Style, such as Armin Hofmann and Josef Müller-Brockmann, started to devise a flexible system to help designers achieve coherency in organizing the page.

2 “Golden Spiral Ratio”

Hence, the grid system came to be associated with the Swiss Style, also known as the International Typographic Style because it had become adopted worldwide by the 1970s. It is a graphic style that emphasizes cleanness, readability and objectivity through the use of grids, sans-serif typefaces, ragged-right text and a preference for photography in place of illustrations and drawings; it is also a style that shows influences from constructivism and minimalism, where the unnecessary is removed and emphasis is given to the necessary.

3 “Swiss graphic design - Wohnbedarf”


4 20 Carto- J. M. Brockmann

Nevertheless, it is with the advent of 1980s that some designers reacted against the constructive nature of the grid, also rejecting it for its association with corporate culture and its domestic use; and this was done in favor of a more organic structure and a major creative freedom. It is therefore for this reason that, today grids are mostly seen as a useful tool than a requirement or starting point for all page design. In spite of this, grids are becoming more and more employed in web design after their significant use in print and designers now adopt them on their pages for this new media. However, grids on the web are not very flexible as for not having a fixed height as in print.  Also, grids are intended to guide the graphic design process, not to dictate it and so can be broken according to the design’s requirements. But why are they so important?
“The use of the grid as an ordering system is the expression of a certain mental attitude inasmuch as it shows that the designer conceives his work in terms that are constructive and oriented to the future” - Josef Müller-Brockmann -

5 Grid Systems in Graphic Design – Josef Müller-Brockmann


“The use of the grid system implies the will to systematize, to clarify, the will to penetrate to the essential, to concentrate, to cultivate objectivity instead of subjectivity…” - Josef Müller-Brockmann -

Grids help distribute content more efficiently, in a way that is both logical and beautiful, as when a system divides space according to the golden ratio. Online, content assimilation and retention increase when grids are used, enhancing the user experience. Additionally, compositions and designs become more linear and aesthetically pleasing, with a certain order and organization constantly maintained across pages.

Grids also determine the dimension of space in addition to making the design process quicker by creating fields or compartments for the content to be directly placed in. This creates a compact system of arrangement that has its own rhythm, but also that fosters analytical thinking.

“By arranging the surface and spaces in the form of a grid the designer is favorably placed to dispose the content in conformity with objective and functional criteria” – Josef Müller-Brockmann –

Thus, grids solve usability and accessibility issues, but also visual problems, by providing a visual framework that allows the designer to create more balanced and harmonious compositions.

Additionally, reducing the number of elements and incorporating them in a grid system creates a sense of compact planning, together with a manifestation of clarity, and this suggests order. If the information is presented in a logical layout, i.e. title, subtitle, text, etc., it will be quicker to scan and easier to assimilate, provided these elements are also formatted at the right size, with the correct line-height and margins that create white space and let each line of text breathe. This effect is enhanced when a baseline grid alignment is adopted too.

A baseline grid is a horizontal grid system that exactly aligns the baselines of all the text on a page, regardless of size or style. Baseline grids create a smooth rhythm in the typography within a design.
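As a rough sketch, a baseline grid can be approximated in CSS by making every line-height and vertical margin a multiple of one base value; the 24px rhythm below is an assumed example, not taken from any of the sites discussed:

```css
/* Hypothetical 24px baseline rhythm: every line-height and
   vertical margin is a multiple of the base value, so baselines
   of headings and body text fall on the same horizontal lines. */
body { font-size: 16px; line-height: 24px; }  /* 1 x 24px */
h1   { font-size: 32px; line-height: 48px;    /* 2 x 24px */
       margin: 24px 0; }
h2   { font-size: 24px; line-height: 24px;
       margin: 24px 0; }
p    { margin: 0 0 24px; }                    /* keeps text on the grid */
```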

6 Example of Baseline Grid

Now, to understand how a grid system constructs the framework of a page, it is a good idea to look at the anatomy of a grid and identify each of the parts that build it up. Generally, a grid is made of columns, rows, modules, flowlines, gutters, margins and fields, both in print and on the web. In order to construct an effective grid system, ratios are very important.

Ratios are the backbone of a well-designed grid system. Either the rule of thirds or the golden ratio can be adopted to create a grid system, with emphasis on the latter for more complex grid structures.
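As a worked example (assuming a 960px-wide page, as in the sites analysed later), the two ratios produce these column splits; the class names are hypothetical:

```css
/* Assumed 960px page split by the golden ratio (phi = 1.618):
   960 / 1.618 = 593px main column, 960 - 593 = 367px sidebar.
   A rule-of-thirds split would instead give 640px / 320px. */
.page    { width: 960px; margin: 0 auto; }
.main    { float: left; width: 593px; }  /* larger golden-ratio section  */
.sidebar { float: left; width: 367px; }  /* smaller golden-ratio section */
```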

Units (or modules) are just as important, as they constitute the base piece from which the rest of the grid is developed. They can be derived from constraints such as the content elements that are going into a design, e.g. images, adverts or typeface sizes, or alternatively by subdividing the maximum screen resolution.

By doing so, a relationship is created between the layout and the element from which the unit is derived. All elements subsequently placed within the composition will be harmoniously connected.


Moving forward…
•    Columns are vertical bands of modules that could be equal in width or vary across the grid.
•    Rows are the horizontal equivalent of columns. Online it is hard to plan for rows as the height of the format is often inconsistent and dynamic.
•    Gutters are the spaces between columns and rows that separate modules either vertically or horizontally.
•    Flowlines or Hanglines are horizontal lines that break the space into horizontal bands. These are used to help guide the eye across the page and above all to align elements.
•    Fields are groups of adjacent modules used to place large images or long blocks of text.
•    And finally…
•    Margins are the space between the edge of the page and the outer edge of the content. These help establish the overall tension in a composition. The smaller the margin the more tension is created.
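The parts above can be sketched in CSS; this is a minimal example assuming a 960px canvas with twelve 60px columns and 20px gutters (the classic 12 × 80 = 960 arithmetic), with hypothetical class names:

```css
/* Hypothetical 12-column grid on a 960px canvas:
   each column is 60px wide with a 20px gutter (12 x 80 = 960). */
.container { width: 960px; margin: 0 auto; }  /* outer margins via centering */
.col       { float: left;
             width: 60px;
             margin: 0 10px; }                /* 10px each side = 20px gutter */
/* A field spanning four modules: 4 x 80 - 20 = 300px of content width */
.span-4    { float: left; width: 300px; margin: 0 10px; }
```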

Designers use this subdivision of space to create layouts that reflect a visual balance within elements of content and build a more organized composition which appeals to the eye of the user, enhancing user experience either in print design or in web design.

Grids used for the web need to be flexible in order to accommodate varying monitor widths and resolutions. When a fixed-width grid is used, the layout within the grid is not altered even if the browser window expands. In a variable-width grid system, each column expands proportionally with the width of the browser frame, causing the layout within the grid to adapt to the width of the user’s monitor.

A combination of variable-width and fixed-width columns within a grid system causes the variable-width columns to expand, pushing the fixed-width columns along when there is not enough space.
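The fixed versus variable (fluid) approaches can be contrasted in CSS; the percentages below are simply the assumed 960px pixel values divided by 960:

```css
/* Fixed-width grid: the layout ignores the browser width. */
.fixed .container { width: 960px; }
.fixed .col       { float: left; width: 300px; margin: 0 10px; }

/* Variable-width (fluid) grid: each column scales with the window. */
.fluid .container { width: 100%; }
.fluid .col       { float: left;
                    width: 31.25%;     /* 300 / 960 */
                    margin: 0 1.0417%; /* 10  / 960 */ }
```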

A number of grids exist today, whether for print design or for the web; however, they can be grouped into four main types:

•    Manuscript grid is the simplest grid structure. It’s mainly a large rectangular area made of a single column used for extensive and continuous blocks of text.


•    Column grid is made up by placing multiple columns on the page. This is good when discontinuous information and content needs to be presented.


•    Modular grid is the same as the column grid with the addition of rows and gutters. It is used for complex projects.


•    Hierarchical grid is commonly found on the web. This is based more on an intuitive placement of elements, which still conforms to the needs of the information.


Now, how are these applied to Websites?

University College Falmouth’s website presents a grid-based layout which uses multiples of 4 to subdivide the page into columns. In fact, deconstructing the grid, either 4 or 8, and so on to 12 and 16, columns seem to guide the alignment of the content. However, 12 columns can be identified as the best structural framework for this layout. These columns divide the total width of the page, which is 960 pixels, and control the natural flow of the content, maintaining order and organization on the page.

In fact, looking at the homepage it is evident that the majority of the content is aligned to an invisible structural framework which also allows a flexible organization of each element. Take the arrow that characterizes the links on the left-hand side of the page: it aligns completely to the edge of the column; whether the columns are 4 or a multiple of this makes no difference to the alignment, which is maintained on all pages. The same proportions and column width are reflected on the left, where the navigation menu and adverts are placed. These fit completely within the column and are positioned in the center, sitting directly on fields and separated from each other by apparently uneven spaces.

It is difficult to tell whether a baseline grid has been used; however, the arrangement of the text seems very linear and in line with the rest of the content. Also, each line of text sits within large columns, probably too large, as the margins are probably too small; but line-height and paragraphs are used, and this makes the text easy to read, also creating a balance that, in my opinion, allows easy understanding of the information. Since the margins are not too large, the page may look a little overfull with content; however, as the background extends widely in the window, this is balanced out.

Also, the grid does not seem to be a constraint for all the content. In fact, it is broken on a number of occasions. Firstly, the element at the bottom of the page, where the search bars and the “Quick Links” button are, extends wider than the total width of the page, breaking the grid. This is consistent on all pages, since this element is fixed to the window and so is always visible even when scrolling up and down the page. Secondly, the logo reproduced in the top right corner as part of the background image extends out of the grid, as does the top navigation bar, and although it is not much emphasized and almost invisible, it contributes to brand identity by breaking the linearity of the grid system.

The logo itself is in line with the grid system and respects the same margin alignment. Although it does not break the system, it still reinforces brand identity through the use of typography.

The sections of the page created for this layout are recognizable as header, footer, sidebar and main content area, and this arrangement is consistent on all pages, as are the top navigation bar and the side navigation menu on the left. This consistent arrangement improves navigation, and so the usability and accessibility of the site. However, such organization of elements and space is not followed by the “Innovation & Research” category, where the content takes a different arrangement and extends beyond the margins to the very edge of the page.

Emphasis is primarily achieved through color, but also by the use of bigger typefaces for headings and for the labels of each category in the top navigation bar. These follow the grid structure in that they are equally divided within columns, and this division is made evident by the use of three different colors. Also, emphasis on content is obtained through arrows that point in specific directions to guide the user through the information or to suggest an action to be performed, such as opening a navigation category or accessing information.

To conclude, some sections of the site seem not to have been included within the organization of the grid, looking almost independent. These expand horizontally to the size of the window and are, respectively, the footer and the bar at the very top of the page. However, links and elements of these sections are still positioned within the composition.

The King’s College London website gives a feeling of alignment and division of space. However, this is not necessarily true. Dividing the total width of 960 pixels into equal columns, it can be seen that not all the content aligns to the same grid. Some proportions are respected by the top navigation bars, in that the space occupied by each category is the same for all of them within a grid of seven columns; in other words, they all have the same width. However, looking down at the list of links within the main content area on the homepage, it is clearly evident that these do not align at all to the same grid, and neither do the elements in the section within the main content and the banner. Adding or removing columns does not make much of a difference either; hence there is some alignment, in that some elements align to other adjacent elements, but they do not necessarily follow the flow of a grid unless a hierarchical grid system has been adopted. This applies to other internal pages as well.

Although these present a similar layout to the homepage, in that all the content is mainly links, images and captions with an additional side navigation menu, when it comes to presenting long lines of text the layout changes the disposition of elements to accommodate the information. However, the typeface used is too small and there is not enough space between the lines, so legibility is an issue even though paragraphs have been adopted. Furthermore, on some pages the text is set in very narrow columns, and this makes reading tiresome.

Although some of the content is not in line with a possible grid framework, the composition of the overall layout presents some kind of alignment, as mentioned earlier; hence usability and accessibility are not much affected by this, nor is the credibility of the layout, since columns are still used and the content is therefore still organized in a compact arrangement.

Emphasis on the structural layout is given by the alignment of the banner and images, together with the top navigation bar, which features a widget that hides and reveals sub-categories listed and organized within aligned columns.

The logo is placed within the same alignment as the top navigation bar. It seems that it, too, has been designed with a grid system, as its typographic elements are well aligned, respect proportions and are compactly contained within the same portion of space. In addition, although it is a little small and not very prominent on the page, in my opinion it still communicates brand identity through typography and its structural look.

The Glasgow School of Art’s website shows clearly and succinctly that a grid system has been used for the structural planning of the layout and the organization of the content within it. In fact, the grid has been left as part of the background, visible and available to the user as a guide to flow along the content presented on the page.

Alignment and organization are the main points that make the composition work harmoniously and the delivery of the information efficient and effective on every page. This consistency allows users to navigate the site with ease and to assimilate the information through, in some way, “visual memory”, which makes it easier for them to remember.

The content is presented aligned within a 12-column grid system that creates visual balance within a harmonious composition. Such balance influences the user experience, allowing better navigation and use of the site, which enhances users’ confidence and establishes the credibility of the site and, consequently, of the University.

As far as the homepage is concerned, the information is easy to find and to interpret. It is distributed through modules that make the visible content easier to maintain when it comes to being updated, as these can be re-sized; moreover, the white used for the “boxes”, together with the pink hover (these are clickable images), gives them emphasis and, in my opinion, also gives emphasis to the structural framework in which they are contained, i.e. the grid. The same arrangement of the information is adopted on other pages too. However, when it comes to lengthy text, spaces between lines and paragraphs are used for a better interpretation of the information, and this improves legibility and accessibility. Also, the columns of text are neither too wide nor too narrow, and are perfectly in line with the grid.

Alignment and organization are also visible in the top navigation menu, where categories follow the flow of the grid, as do their subcategories, which hide behind the banner in a similar way to the previous example.

Images are also arranged within the grid, as are headings and captions; in general they occupy more than one module of space and extend across the columns, but still without breaking the grid. Some of these images feature a swap behavior that emphasizes the use of the grid system by moving the pictures from the columns they are originally confined to into others, while maintaining alignment with the grid.

Finally, the logo does not seem to bring particular emphasis to brand identity. Although it may be recognizable, the font size used is, in my opinion, too small to enhance the brand image of the University, and it too is contained within the system of the grid.

In conclusion, it is as a consequence of the growing use of grids on the web that several sources and tools are now available on the Internet to help designers build their grid systems. Some of these are CSS frameworks such as Blueprint, as suggested by Mark Boulton, while others, such as the one I have used to show the examples in this essay, are just browser add-ons. I now understand how important grids are for page layouts, and I am convinced I will start using them in the near future.









Filed under: Uncategorized 2 Comments

Warm coffee Bean Shop

Mockup 1



Coursework 6

Three Examples of Good Typography Websites:

This is the website of a creative consultancy. Here typography plays a fundamental role in the whole structure of the site. A number of different fonts and typefaces are used to emphasize headings, subheadings, etc., and also to distinguish each part of the text, dividing it into sections. In addition, emphasis is also given by the use of bold, italic and color to stress letters, lines of text and quotations. What I like most is the fact that consistency is achieved through typography. This can be seen from the Twitter and Facebook links, which are replaced by text in order to match the whole layout.

This is a portfolio website with a fairly good approach to typography. Generally speaking “Contrast” is a predominant feature overall and it is rendered in the first place by the combination of different type-faces and the use of color (or non-color). For example, the elegant type-font used on the top left corner of the page is in contrast with the rest of the text but it still does not break the layout as it is supported by the italic text just below.

The Literacy 2030 website presents a very linear use of typography. Although the body text is very concise on some pages, with just small variations in size and style to differentiate headings from subheadings and body text, what I found interesting is the company’s logo, where typography is used in an attempt to represent the company and give the business its own identity.


Three Examples of Poor Typography Websites:

Psardo’s website is quite uninviting for an interior design company. Looking at the typography, it is clear that not much attention has been paid to how the text should look and, therefore, to the information delivered. There is no consistency in the way the text is arranged; in fact, it aligns partly to the left and partly to the center, or wraps around the images. The blue color of the font is another thing to point out: it distinguishes the body text from other text, but it does not go well with the whole.

Here I think there is too much contrast between the black background and the white font. Also, typography is poorly considered, as not much effort has been made to add any style to the typeface used for the main content or for the links.

Again, this is another website with very poor care for typography. In fact, body text doesn’t exist at all; the information is scattered throughout the site, together with links and butterflies that fly behind the content one is trying to read. The background image itself makes the content difficult to read, regardless of the butterflies, and many different fonts are used randomly in a non-harmonious composition.


Coursework 5

Three Websites with Good Colour Schemes

Victoria’s website is a good example of a monochromatic color scheme. Here I think that the use of one brown color in its different shades gives the site certain qualities, such as simplicity and harmony, which make everything look neat and tidy and in the right place. This, I believe, reflects the purpose of a portfolio website.



Here I believe a complementary color scheme based on blue and orange is used, even though some extra color, like pink, has been added throughout the site. Everything seems to work well together and blends harmoniously with the background image and the graphics, suggesting a certain atmosphere of fun.


This is another portfolio website where a nice color scheme has been used. The colors present no gradient variation; in fact they are very flat, yet they are well combined. There is a lot of high contrast between colors and text, and this makes the site very accessible. What I liked most is the minimalistic structure obtained by the use of just a few colors.







Three Websites with Poor Colour Schemes

This one I think is really shocking! The rainbow pattern is unnerving, not just for the highly saturated colors but also for the fact that it is animated. In other words, looking at it will drive you crazy! This is just the intro page, yet the inside doesn’t seem to improve. In fact, there is no consistency in the use of color on each single page, and the extensive use of animation makes the information difficult to read and, therefore, to deliver what the website is “preaching”.



This seems to be the website of a German arty hotel, where the abuse of color makes the whole visual appearance fall apart. It doesn’t seem that a concrete color scheme has been adopted, since both the background colors and the font colors seem randomly picked and do not work harmoniously together, even though some high contrast is provided. There is no consistency throughout the site, in that each page presents a different combination of colors; also, having links that disappear into the background seems to me quite pointless.




Although a monochromatic color scheme is adopted and works well throughout the site, I think the highly saturated color catches the eye and makes the text hard to read, even though the contrast is very high. So I would suggest not using white as the font color but a different shade, which would make the text easier to follow without the background color glaring into your eyes.



Coursework CSS

Hey this is my html + CSS...


Five things I learnt this week

  1. How to create web pages using XHTML 1.0 strict.
  2. How to use an FTP Client
  3. How to Set Up a Hosting Server
  4. How to use correct markup (I Hope)
  5. That a <br /> tag needs to be nested.

A link to my site


Validation Passed
