Is the web strategy working? Does the navigation get people to the information they need? Is the server reliable? Without measures such as audience satisfaction, feedback, and access statistics, you will not be able to demonstrate value for money, or that you are meeting the needs of users and the aims of management. Regular, formal evaluation of both the content and the technology is therefore strongly recommended; quarterly exercises will be sufficient.
Evaluation of website design and content can be carried out by drawing on:
- Website access statistics provided by the ISP/hosting service provider (the provider may supply either the raw web server logs or the results of processing them with analysis software);
- Responses via feedback tools (forms, databases, email addresses);
- Feedback from contributors to the website;
- Conventional audience research, for example focus groups and professionally authored online questionnaires.
A recruitment website, for example, might additionally be evaluated on:
- the number of recruits that applied via the website;
- the performance of web recruits measured against that of staff recruited by other means; and
- the cost per recruit measured against the cost per recruit of publicity in other media.
Server logs can provide the key traffic measures:
- number of unique users (visitors);
- number of visits; and
- page impressions (page views).
They can also yield:
- error message counts (indicating that pages and other content were not served successfully); and
- traffic analysis focusing on peak times (to assess bandwidth requirements) and 'dead' times (should it be necessary to switch the site off while maintenance is carried out).
Log analysis software will typically report:
- successful requests;
- unsuccessful requests;
- most frequently visited pages;
- least frequently visited pages;
- top entry pages; and
- top referring websites.
Analysis of these statistics enables the web management team to:
- identify the most popular content;
- review the navigation system, for example by identifying orphaned pages;
- identify referring websites (the sites from which users arrive at your website);
- audit the level of response to electronic forms;
- assess the effectiveness of marketing/PR campaigns in bringing traffic to the website;
- obtain information on users' platforms and browsers; and
- identify users' DNS domains, and thus visits from abroad or from within government.
In interpreting the statistics, web managers should:
- give more importance to visitors, unique visits and page impressions than to hits;
- take as much notice of error logs as of any other statistics;
- determine who is using the website the most;
- monitor current bandwidth use, and attempt to project future requirements; and
- archive server logs for monitoring trends over time.
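As an illustration, several of the measures above can be computed directly from raw server logs. The sketch below assumes the widely used Combined Log Format (your ISP/hosting provider's format may differ) and approximates unique visitors by client host, which undercounts users behind shared proxies; treat it as a minimal example, not a substitute for proper analysis software.

```python
import re
from collections import Counter

# Combined Log Format (an assumption: check your provider's actual format).
# Example line:
# 192.0.2.1 - - [10/Oct/2000:13:55:36 +0000] "GET /index.html HTTP/1.0" 200 2326 "http://other.example/" "Mozilla"
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+'
    r'(?: "(?P<referrer>[^"]*)")?'
)

def summarise(log_lines):
    """Derive basic usage measures from raw server log lines."""
    stats = {
        "successful": 0, "unsuccessful": 0,
        "pages": Counter(), "referrers": Counter(), "hosts": set(),
    }
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue  # unparseable line: skip rather than guess
        if int(m.group("status")) < 400:
            stats["successful"] += 1
        else:
            stats["unsuccessful"] += 1  # content not served successfully
        stats["pages"][m.group("path")] += 1
        stats["hosts"].add(m.group("host"))  # unique visitors, approximated by host
        ref = m.group("referrer")
        if ref and ref != "-":
            stats["referrers"][ref] += 1  # top referring websites
    return stats
```

From the returned counters, `stats["pages"].most_common()` gives the most (and, read from the end, least) frequently visited pages, and `stats["unsuccessful"]` gives the error count recommended above.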
The web strategy and management team should ensure, at the procurement stage, that ISPs/hosting services are offering to provide a full range of server log information.
It is acceptable to use HTTP cookies or session identities to track visitors' paths through the website (and this will be essential in e-transactional sites). The website should contain a clear statement of policy on the use of cookies.
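Path tracking of this kind usually rests on a session identifier issued in a cookie. The following sketch uses Python's standard library only; the cookie name, token length and in-memory path store are illustrative assumptions, not a prescribed implementation, and the cookie should be covered by the policy statement mentioned above.

```python
import secrets
from http.cookies import SimpleCookie

SESSION_COOKIE = "session_id"  # hypothetical cookie name for illustration

def get_or_create_session(cookie_header):
    """Return (session_id, set_cookie_header_or_None).

    If the request already carries our session cookie, reuse its value;
    otherwise mint a fresh identifier and return the Set-Cookie header
    the response should send.
    """
    cookie = SimpleCookie(cookie_header or "")
    if SESSION_COOKIE in cookie:
        return cookie[SESSION_COOKIE].value, None
    session_id = secrets.token_hex(16)
    out = SimpleCookie()
    out[SESSION_COOKIE] = session_id
    out[SESSION_COOKIE]["path"] = "/"
    out[SESSION_COOKIE]["httponly"] = True  # keep the id away from page scripts
    return session_id, out[SESSION_COOKIE].OutputString()

# Paths visited under each session, from which navigation routes through
# the site can be reconstructed (a simple in-memory store for illustration):
paths_by_session = {}

def record_visit(session_id, path):
    paths_by_session.setdefault(session_id, []).append(path)
```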
Good practice dictates that attention to the accuracy and timeliness of information should increase as a site's level of activity increases.
In the interests of open government, web managers should consider publishing a summary of usage statistics on their websites.
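A published summary might, for instance, report page impressions per month, built from the archived logs recommended earlier. A minimal sketch, assuming timestamps in the server log's `[day/month/year:time zone]` form:

```python
from collections import Counter
from datetime import datetime

def monthly_page_impressions(timestamps):
    """Aggregate log timestamps (e.g. '10/Oct/2000:13:55:36 +0000') into
    per-month totals suitable for a published usage summary."""
    totals = Counter()
    for ts in timestamps:
        dt = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z")
        totals[dt.strftime("%Y-%m")] += 1  # key months as YYYY-MM
    return dict(sorted(totals.items()))
```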