
Trilo - Downtown - Languages


No systematic reflection on HTML here, just some loose diary entries on specific points.

The following line prevents the page from being cached by the browser. This is useful during development, since each code change then takes effect immediately, e.g. in the JavaScript debugger. 20111121.1051

    &lt;meta http-equiv="Expires" content="0" /&gt;
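Browsers are often given a small set of related no-cache hints together; a sketch of the usual trio (the Expires line is from above, the other two headers are standard additions, not from the original note):

```html
<!-- Sketch: the usual no-cache hints for development -->
<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="Expires" content="0" />
```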


Using .htaccess


Redirects e.g.:

   Redirect /temp http://www.kirupa.com/developer
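A redirect status code can also be given explicitly; a sketch with hypothetical paths (the 301 marks the redirect as permanent for search engines):

```apache
# Sketch: permanent (301) redirect; /oldpage and the target URL are hypothetical
Redirect 301 /oldpage http://www.example.com/newpage
```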

Custom error pages. Note the nice feature at http://www.kirupa.com/404.htm (20111004.1102). Sample lines:

   ErrorDocument 401 /errors/auth.html
   ErrorDocument 403 /errors/forbidden.html
   ErrorDocument 404 http://www.kirupa.com/404.htm

Mimetypes, especially to make Apache execute PHP code inside HTML files. Sample lines:
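A sketch of the usual directive for this purpose (the exact handler name depends on the Apache/PHP installation, so treat it as an assumption):

```apache
# Sketch: treat .html/.htm files as PHP (handler name varies by installation)
AddType application/x-httpd-php .html .htm
```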


Define index file:

   DirectoryIndex otherindex.html
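Several candidates may be listed; Apache serves the first one that exists in the directory (the file names here are typical examples, not from the original note):

```apache
# Sketch: the first existing file in this list is served as the directory index
DirectoryIndex index.php index.html otherindex.html
```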


Some of the above examples come from www.kirupa.com/html5/htaccess_tricks.htm (20111004°1101).

Using robots.txt

A file named robots.txt, placed in the root of a website, tells search engines how to crawl the site. Most search engines respect the rules expressed in robots.txt, but probably not all do. The file must follow a specified syntax.

To allow all folders to be searched, just leave the Disallow: directive empty, leave the robots.txt empty, or place no robots.txt at all. Example of a robots.txt allowing the complete site to be searched (20110926°2111):

# file 20110926.2111
User-agent: *
Disallow:

Example of a robots.txt completely disallowing searching of the site (20110926°2112):

# file 20110926.2112
User-agent: *
Disallow: /
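Between these two extremes, individual folders can be excluded; a sketch with hypothetical folder names:

```text
# Sketch: hypothetical folders excluded from crawling
User-agent: *
Disallow: /temp/
Disallow: /private/
```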


www.robotstxt.org (20110926°2121) is a page dedicated to the robots.txt topic.
The Google Webmaster Tools Help article 'Block or remove pages using a robots.txt file' (20110926°2122) explains how Google uses robots.txt and how to configure it correctly. If you are registered, you can use a robots.txt checker there.
The Google Webmaster Tools Help article 'Using meta tags to block access to your site' (20110926°2123) explains how Google uses the <META NAME="ROBOTS"> tag.

Using target="_blank"

On the pages here, I systematically use the attribute target="_blank" in links that go to other sites. But I don't like it, and I am looking for a better solution.

Why I want it:

What's wrong with it:


Here is how the arrow icons are introduced by CSS, plus how the link color changes when the mouse hovers over it (the image path and the hover color below are illustrative placeholders):

   a.extern {
      background-image:url('arrow-extern.png');  /* placeholder icon path */
      background-position:left center;
      background-repeat:no-repeat;
      padding-left:14px;                         /* room for the icon */
   }
   a.extern:hover {
      color:#CC0000;                             /* placeholder hover color */
   }
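The corresponding markup then looks roughly like this (class name from the CSS above; the href is a placeholder):

```html
<!-- Sketch: external link using the a.extern styling -->
<a class="extern" href="http://www.example.com/" target="_blank">Example site</a>
```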

Links on the topic:

www.webmasterworld.com/forum21/11165.htm   (20110924°1822) Discussion plus some JavaScript code proposals as alternatives to '_blank'.
www.peterkroener.de/warum-target_blank-nervt-und-verboten-gehoert/   (20110924°1821) Very long discussion of the '_blank' issue; some people also defend the use of '_blank'.
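The JavaScript alternatives discussed in those threads typically detect external links at runtime instead of hard-coding target="_blank" in the markup. A minimal sketch of the detection step, assuming the site's own hostname is passed in (the function name and behaviour are my own, not taken from the linked articles):

```javascript
// Sketch: decide whether a link leaves the current site.
// ownHost is the hostname of the site itself, e.g. "downtown.trilo.de".
function isExternalLink(href, ownHost) {
  var match = /^https?:\/\/([^\/:]+)/i.exec(href);
  if (!match) {
    return false;            // relative URLs stay on the own site
  }
  return match[1].toLowerCase() !== ownHost.toLowerCase();
}
```

In a real page one would then loop over document.links and set link.target = "_blank" only when isExternalLink(...) returns true, keeping the markup itself free of the attribute.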


SEO (Search Engine Optimization)

Just some links:

The T3N Magazine lists some SEO tools in the article 'SEO: Hässliche, aber nützliche Tools' ('SEO: ugly but useful tools'), some of which are inspected below (20110914°1421).

The Wayback Machine lets you inspect former versions of pages (20110914°1422).

Xenu's Link Sleuth finds broken links on sites. It's a free tool to be downloaded and run locally (20110914°1423).

SeoBook Rank Checker is a 'Free Search Engine Ranking Checking Tool for Firefox' ... (20110914°1424)

SeoBrowser is ..., slogan 'see your website like a search engine sees it'. (20110914°1425)

Validate robots.txt with the robots.txt checker (20110914°1426).

urivalet.com validates pages and lists errors and statistics (20110914°14xx).


http://downtown.trilo.de/demos-html.html © 2011 - 2018 Trilo Software e.K.   (20110717°1727)