No systematic reflection on HTML, just some loose specific diary entries.
This line prevents the page from being cached by the browser. This is useful during development, as each code change then takes effect immediately, e.g. in the JavaScript debugger.
<meta http-equiv="Expires" content="0" />
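During development it can help to combine this with further cache-related headers; a minimal sketch (which of these headers a given browser actually honors varies):

```html
<!-- sketch: additional meta headers commonly used to discourage caching during development -->
<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="Expires" content="0" />
```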
Some .htaccess samples.
Redirects, e.g.:
Redirect /temp http://www.kirupa.com/developer
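A redirect can also carry an explicit HTTP status code; a sketch with hypothetical old and new paths:

```apache
# sketch: permanent (301) redirect from a hypothetical old path to a new URL
Redirect 301 /oldpage.html http://www.example.com/newpage.html
```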
Custom error pages. Note the nice feature at http://www.kirupa.com/404.htm Sample lines:
ErrorDocument 401 /errors/auth.html
ErrorDocument 403 /errors/forbidden.html
ErrorDocument 404 http://www.kirupa.com/404.htm
MIME types. Especially, to make Apache execute PHP code inside HTML files. Sample lines:
...
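The '...' above stands for the author's original sample lines. As a hedged sketch, a commonly used directive for this purpose looks like the following (the exact handler name depends on how PHP is installed):

```apache
# sketch: let Apache treat .html/.htm files as PHP (handler name depends on the setup)
AddType application/x-httpd-php .php .html .htm
```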
Define index file:
DirectoryIndex otherindex.html
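DirectoryIndex also accepts several candidates, which Apache tries in order; a sketch:

```apache
# sketch: Apache serves the first file from this list that exists in the directory
DirectoryIndex otherindex.html index.php index.html
```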
Links:
Some of the above examples come from www.kirupa.com/html5/htaccess_tricks.htm (article).
A file named robots.txt, placed at the top level of a website, tells search engines how they should crawl the site. Most search engines respect the rules expressed in robots.txt, but probably not all do so. The file must follow a specified syntax.
To allow all folders to be searched, just leave the Disallow: directive empty, leave the robots.txt empty, or don't place a robots.txt at all. Example of a robots.txt that allows searching the complete site:
# file 20110926.2111
User-agent: *
Disallow:
Example of a robots.txt that completely disallows searching the site:
# file 20110926.2112
User-agent: *
Disallow: /
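Between those two extremes, individual folders can be excluded; a sketch with hypothetical folder names:

```
# sketch: block only some (hypothetical) folders, allow everything else
User-agent: *
Disallow: /temp/
Disallow: /private/
```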
Links:
www.robotstxt.org is a page dedicated to the robots.txt topic. (Basics)
Google Webmaster Tools Help article 'Block or remove pages using a robots.txt file' explains how Google uses robots.txt, and how to configure it correctly. If you are registered, you can use a robots.txt checker there. (More)
Google Webmaster Tools Help article 'Using meta tags to block access to your site' explains how Google uses the <META NAME="ROBOTS"> tag. (More)
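For comparison with robots.txt, the meta tag variant sits in the page's head section; a minimal sketch:

```html
<!-- sketch: ask robots not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow" />
```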
On these pages here, I systematically use the attribute target="_blank" in links that go to other sites. But I don't like it, and I am looking for a better solution.
Why I want it:
What's wrong about it:
Alternatives:
Here is how the arrow icons are introduced by CSS, plus how the link color changes when the mouse hovers over it:
a.extern {
  padding-left: 11px;
  background-image: url(img/_20091221-152103_link-extern_.png);
  background-position: left center;
  background-repeat: no-repeat;
}
a.extern:hover {
  background-color: #ffd0d0;
}
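For reference, a link using this class might look like the following (URL is a placeholder):

```html
<!-- sketch: an external link picking up the arrow icon and hover color from a.extern -->
<a class="extern" href="http://www.example.com/" target="_blank">example.com</a>
```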
Links on the topic:
www.webmasterworld.com/forum21/11165.htm Discussion plus some JavaScript code proposals as alternatives to '_blank'. (Discussion)
www.peterkroener.de/warum-target_blank-nervt-und-verboten-gehoert/ Very long discussion (in German) about the '_blank' issue; some people also defend the use of '_blank'. (Discussion)
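One kind of JavaScript alternative discussed in such threads keeps the target attribute out of the HTML and attaches the new-window behavior in a script instead; a sketch, assuming external links are marked with class "extern":

```html
<!-- sketch: open links marked class="extern" in a new window, without target="_blank" -->
<script>
  document.addEventListener('DOMContentLoaded', function () {
    document.querySelectorAll('a.extern').forEach(function (link) {
      link.addEventListener('click', function (event) {
        event.preventDefault();
        window.open(link.href);
      });
    });
  });
</script>
```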
Just some links:
The T3N Magazine lists some SEO tools in the article SEO: Hässliche, aber nützliche Tools ('SEO: ugly but useful tools'), some of which are inspected below.
The Wayback Machine lets you inspect former versions of pages.
Xenu's Link Sleuth finds broken links on sites. It's a free tool to be downloaded and run locally.
SeoBook Rank Checker is a 'Free Search Engine Ranking Checking Tool for Firefox' ...
SeoBrowser is ..., slogan 'see your website like a search engine sees it'.
Validate robots.txt with the robots.txt checker.
urivalet.com validates pages and lists errors and statistics.