Is it okay to have a very long .htaccess file?
I am handing out URLs on a website that actually point to something ugly (on the same website).
I'm planning to achieve this by modifying the .htaccess file whenever needed: a PHP script will append a new rewrite rule to .htaccess each time a URL is handed out (from an admin area a non-programmer can control; the admin specifies the title for the new entry, and the URL is assigned automagically).
Is this going to be a problem, especially after 1000 or so such URLs? What is an acceptable number? Because I can picture this: the server receives a request for a URL, then it searches the .htaccess file for the right page for this URL, and finally sends the user to the right page. If this is anything like database searching, it might take a long time for the user to actually get to the right page...
Any pointers on this, please?
No, it's not OK and will hamper your page-load speed.
.htaccess files are evaluated on EVERY server request, even for static images, CSS, and JS files. So you're asking the webserver to parse 1000+ lines, possibly containing regexes, while executing each request.
Also, a parent directory's .htaccess file is processed for files residing in subdirectories too. So if your large .htaccess is in the root directory of the website, it will be processed for all requests for files in the subdirectories as well (along with any .htaccess files in the subdirectories themselves).
That, my friend, is a lot of processing. If you have a page with 10 images, for example, the file gets processed 11 times. And the more processing in the file, the more cycles it takes. So yes, anything in an .htaccess file has an impact. Now, is that impact noticeable? It is hard to say when it becomes an issue, but the file would have to be pretty big, since the processing is relatively simple (which, in your case, it is).
The key with an .htaccess file is to make it smart. You would not want to list 200 entries; you can do it smartly with just a few lines (if you want to use .htaccess at all).
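One way to keep it to a few lines is to catch all pretty URLs with a single rule and let a script do the lookup. This is only a sketch; the `/go/` prefix and the `redirect.php` script name are hypothetical:

```apache
# One catch-all rule instead of hundreds of entries: anything under /go/
# is handed to a lookup script (redirect.php is a hypothetical name),
# which finds the real target and issues the redirect itself.
RewriteEngine On
RewriteRule "^go/(.+)$" "/redirect.php?key=$1" [L,QSA]
```

That way .htaccess stays constant-sized no matter how many URLs the admin area hands out.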
I would perform a simple test: generate a large .htaccess file with random URLs, and measure the resulting performance yourself.
```python
import random
import string

def rand_string():
    """Return a random 4-10 character string of ASCII letters."""
    length = random.randint(4, 10)
    res = []
    for i in range(length):
        res.append(random.choice(string.ascii_letters))
    return ''.join(res)

for i in range(1000):
    print("RewriteRule %s http://www.mydomain.com/boring.php?%s [R]"
          % (rand_string(), rand_string()))
```
Having that many rules is really inadvisable. It’s not that they are in a .htaccess file or that they are tested for each request. (Depending on the configuration, that might also happen if you put them into your server’s or virtual host configuration.)
It’s the mere fact that this huge number of rules is tested and, depending on the rules, every rule has to be tested until a match is found.
Well, you could counteract that by arranging the rules in order of matching probability, so that a match is likely to be found early. But the complexity is still O(n) in the worst case.
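The O(n)-versus-O(1) difference is easy to see in a quick simulation. This is just a sketch in Python with a randomly generated rule set, not real mod_rewrite processing:

```python
import random
import re
import string
import timeit

# Build n hypothetical rewrite rules as compiled regexes (assumption:
# each rule matches one fixed path), plus a dict for the O(1) lookup.
n = 1000
keys = ["".join(random.choices(string.ascii_letters, k=8)) for _ in range(n)]
rules = [(re.compile("^%s$" % re.escape(k)), "/boring.php?%s" % k) for k in keys]
table = {k: "/boring.php?%s" % k for k in keys}

def scan(path):
    """What testing rules in order effectively does: O(n) linear scan."""
    for pattern, target in rules:
        if pattern.match(path):
            return target
    return None

worst = keys[-1]  # worst case: the match is the very last rule
print("linear scan:", timeit.timeit(lambda: scan(worst), number=1000))
print("hash lookup:", timeit.timeit(lambda: table.get(worst), number=1000))
```

On any machine the hash lookup will be orders of magnitude faster for the worst-case key, and the gap grows linearly with n.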
If you really need that many mappings and the mappings are fixed, you can use a RewriteMap hash file, which has a lookup complexity of O(1) instead of the O(n) of separate rules. Or you shift the mapping into your PHP application and do it there.
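For reference, a RewriteMap setup might look like this. Note that the RewriteMap directive must live in the server or virtual-host configuration, not in .htaccess; the map name and file paths below are hypothetical:

```apache
# In the server or virtual-host configuration (RewriteMap is not allowed
# in .htaccess). The map name "shortlinks" and the paths are examples.
RewriteEngine On

# dbm: gives the O(1) hash lookup; a plain txt: map is scanned linearly.
# Build the dbm file from a "key value" text file with Apache's httxt2dbm:
#   httxt2dbm -i shortlinks.txt -o shortlinks.map
RewriteMap shortlinks "dbm:/etc/apache2/shortlinks.map"

# Redirect /go/<key> via the map, falling back to a 404 page on a miss.
RewriteRule "^/go/(.+)$" "${shortlinks:$1|/notfound.php}" [R,L]
```

The PHP admin script would then append to the text map file and regenerate the dbm file, instead of touching .htaccess at all.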
It should be fine; you just might experience server lag when it gets really long. I would look into the mod_rewrite specs: you might be able to automate the forwards with a few lines and a regex. I don't know enough about the URL variables that will be passed to give you an example.
It's definitely not OK, as a few people here have mentioned.
Do you know all the URLs you wish to rewrite beforehand? If so, you can store the rules in a database, iterate through them to pre-generate the actual URLs, and store them in memcache, with the key being the good-looking URL and the value being the actual URL of the content.
Then, when a request comes in, look up the key in memcache and redirect the user to the real URL. I don't think you even need .htaccess for this.
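A minimal sketch of that lookup-and-redirect idea, with a plain dict standing in for memcache (with a real client such as pymemcache the get/set calls look much the same; all the names below are hypothetical):

```python
# A plain dict stands in for memcache in this sketch.
cache = {}

def publish(pretty, ugly):
    """Called from the admin area when a new pretty URL is handed out."""
    cache[pretty] = ugly

def handle_request(path):
    """Return the redirect target for a request path, or None for a 404."""
    return cache.get(path.lstrip("/"))

publish("summer-sale", "/boring.php?id=42")
print(handle_request("/summer-sale"))  # -> /boring.php?id=42
print(handle_request("/unknown"))      # -> None
```

The web framework in front of this would turn a non-None result into a 301/302 redirect and a None into a 404.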