====== Using Apache Memory Cache ======

Apache has its own memory cache module ''mod_mem_cache'' which can be used to cache content similarly to ''memcached''. The following directives configure Apache to use ''1GB'' of memory for the cache and to hold up to ''1000'' objects of at most ''1'' megabyte each. Note that ''MCacheSize'' is expressed in kilobytes while ''MCacheMaxObjectSize'' is expressed in bytes:

<code apache>
<IfModule mod_mem_cache.c>
  CacheEnable mem /
  MCacheSize 1048576
  MCacheMaxObjectCount 1000
  MCacheMaxObjectSize 1048576
</IfModule>
</code>
 +
====== Usual Required Modules ======

<code>
authz_host_module
log_config_module
headers_module
setenvif_module
# Proxy needed by default on OSX
proxy_module
mime_module
# DAV needed by default on OSX
dav_module
dav_fs_module
# Status needed by default on OSX
status_module
negotiation_module
dir_module
userdir_module
apple_userdir_module
alias_module
rewrite_module
php5_module
</code>
 +
====== Remove Directory Listing ======

Find:
<code apache>
Options Indexes FollowSymLinks MultiViews
</code>
and remove ''Indexes'' from the options list.
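Alternatively, ''Indexes'' can be turned off without touching the rest of the options by prefixing it with a minus sign, which removes just that option relative to whatever was inherited:

<code apache>
# disable directory listings while keeping the inherited options
Options -Indexes
</code>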
 +
====== Allow and Deny Network ======

Every directory can be protected with ''Allow'' and ''Deny''. The following example denies access to the root path ''/'' to anybody except the local network ''192.168.1.0/24'':

<code apache>
<Directory />
       Order deny,allow
       Allow from 192.168.1.0/24
       Deny from all
</Directory>
</code>

The ''Order'' directive is important: with ''Order deny,allow'', the ''Deny'' directives are evaluated first (here, ''Deny from all''), then the ''Allow'' directives (here, ''Allow from 192.168.1.0/24''), and a request matching both is allowed. The order in which the directives appear inside the block does not matter.
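On Apache 2.4 and later, ''Order''/''Allow''/''Deny'' are only available through the compatibility module ''mod_access_compat''; the native replacement is ''mod_authz_core'''s ''Require'' directive. An equivalent sketch:

<code apache>
<Directory />
    # allow only the local network; everything else is denied
    Require ip 192.168.1.0/24
</Directory>
</code>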
 +
====== Reverse Proxy with Virtual Host ======

Apache can be configured as a reverse proxy in order to pass traffic to a backend. For example, the following directives:
<code apache>
<VirtualHost *:80>
       ProxyRequests Off
       ProxyPreserveHost On
       ProxyPass / http://192.168.0.1:80/ connectiontimeout=300 timeout=300
       ProxyPassReverse / http://192.168.0.1:80/
       ServerName website.com
       ServerAlias *.website.com
       ErrorLog ${APACHE_LOG_DIR}/error.log
       LogLevel warn
       CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
</code>
will forward all traffic for ''website.com'' and any sub-domain, such as ''site.website.com'', to the host ''192.168.0.1''. Additionally, it specifies that connections will time out after ''300'' seconds. Since the virtual host is declared with:

<code apache>
<VirtualHost *:80>
</code>

name-based virtual hosting is in effect, so any number of such virtual hosts can be added, each proxying its own website to its respective backend.
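For instance, a second name-based virtual host on the same port can forward a different site to its own backend (the host name and backend address below are illustrative):

<code apache>
<VirtualHost *:80>
       ServerName othersite.com
       ProxyRequests Off
       ProxyPreserveHost On
       # a different backend for this site
       ProxyPass / http://192.168.0.2:80/
       ProxyPassReverse / http://192.168.0.2:80/
</VirtualHost>
</code>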
 +
====== Block Bad Bots ======

On Debian-like systems, a global rule can be set to block bad bots. In order to do so, create a file ''/etc/apache2/conf.d/blockbots.conf'' containing the [[apache/templates/security/crawlers/bad_bots|bad_bots configuration]].

You must also enable the ''setenvif'' module with:
<code bash>
a2enmod setenvif
</code>
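A minimal sketch of what such a configuration looks like (the linked template is more complete; the user-agent name below is illustrative):

<code apache>
# flag requests whose user-agent matches a known bad bot
SetEnvIfNoCase User-Agent "BadBot" bad_bot
<Location />
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Location>
</code>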
====== Create WebDAV Share ======

The following directives will share ''/var/www/webserver/share'' through ''WebDAV'':

<code apache>
        <Directory /var/www/webserver/share>
                DAV On
                AuthType Digest
                AuthName "share"
                AuthDigestDomain /
                AuthDigestProvider file
                AuthUserFile /etc/apache2/.htpasswd
                <LimitExcept PROPFIND>
                    Require valid-user
                </LimitExcept>
        </Directory>
</code>
It seems that ''Directory'' is preferred over ''Location''; the latter sometimes leads to ''System Error 67'' on Windows clients, meaning that the network path could not be found.

The authentication mechanism to use for maximum compatibility is ''Digest''; username and password pairs can be generated using the ''htdigest'' command-line utility. For this example (the realm argument must match the ''AuthName'', here ''share''):

<code bash>
htdigest -c /etc/apache2/.htpasswd share username
</code>

which will prompt for a password.
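An entry can also be generated without ''htdigest'', since the digest file format is simply ''user:realm:MD5(user:realm:password)''. A sketch assuming GNU coreutils ''md5sum''; the user, realm and password values are illustrative:

<code bash>
# build an .htpasswd digest entry by hand; the file format is
# user:realm:MD5(user:realm:password)
user=username; realm=share; pass=secret
hash=$(printf '%s:%s:%s' "$user" "$realm" "$pass" | md5sum | cut -d' ' -f1)
echo "$user:$realm:$hash"
</code>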
 +
If you have been trying for a while to access the ''DAV'' share from Windows and are prompted for the username and password over and over again, even though you entered the correct information, you may want to run the following in a Windows command prompt:
<code dos>
net stop WebClient
net start WebClient
</code>

in order to restart the web client, thereby resetting the ''WebDAV'' caches and fixing the username and password prompt loop.
 +
====== Set Expiration for Resources ======

First enable ''mod_expires'' with:
<code bash>
a2enmod expires
</code>

after which [[apache/templates/optimizations/expiration/expiration.conf|an optimized configuration]] can be added as a file under ''/etc/apache2/conf.d/'' in order to set the expiration time for resources.
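A minimal sketch of such a configuration (the content types and lifetimes below are illustrative, not the linked template):

<code apache>
<IfModule mod_expires.c>
    ExpiresActive On
    # cache static assets for a month, HTML for an hour
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css   "access plus 1 month"
    ExpiresByType text/html  "access plus 1 hour"
</IfModule>
</code>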
====== Apache Log Statistics Using the Shell ======

We use ''zcat'' with the ''-f'' flag so that both compressed and uncompressed files can be read. The commands assume that your current working directory is the Apache log directory. They also assume that the log files are named ''access.log*'', which may differ on other systems (e.g. ''access_log''). The first field (''$1'') of the combined log format is the client IP address.

===== Get Number of Unique Visitors =====

<code bash>
zcat -f access.log* | awk '{print $1}' | sort -u | wc -l
</code>

===== Get Number of Unique Returning Visitors =====

<code bash>
zcat -f access.log* | awk '{print $1}' | sort | uniq -c -d | sort -n -k 1
</code>
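The same approach extends to other fields of the combined log format; for example, field ''$7'' is the request path, so the most requested paths can be listed with:

<code bash>
# list the ten most requested paths; $7 is the request path in the
# common/combined log format
zcat -f access.log* | awk '{print $7}' | sort | uniq -c | sort -rn | head
</code>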
 +
====== Apache Image Hot-Link Protection ======

The idea of preventing people from hot-linking images on your website is somewhat self-contradictory: if you are serving content, why prevent people from using the content you provide? You could instead shut your website down and keep the content to yourself.

Nevertheless, some people choose to do this, and one way to prevent hot-linking is to branch on the ''Referer'' header (the misspelling is part of the HTTP specification) and deny access to images when the referrer is not your website. In the example, we assume that ''www.example.com'' and ''example.com'' are the accepted referrer domains for your images; any other referrer, except a blank referrer (last rule), is denied the connection:

<code apache>
SetEnvIfNoCase Referer "^http://www\.example\.com/" locally_linked=1
SetEnvIfNoCase Referer "^http://www\.example\.com$" locally_linked=1
SetEnvIfNoCase Referer "^http://example\.com/" locally_linked=1
SetEnvIfNoCase Referer "^http://example\.com$" locally_linked=1
SetEnvIfNoCase Referer "^$" locally_linked=1
<FilesMatch "\.(gif|png|jpe?g)$">
Order Allow,Deny
Allow from env=locally_linked
</FilesMatch>
</code>

Note that branching on referrers is a very bad idea, given upstream caches that may manipulate headers, as well as how easily the protection can be circumvented by sending your domain as the referrer along with the request. Bad, bad idea.
 +
====== Permanently Redirecting Pages ======

The best way to permanently redirect visitors to a different website is ''RedirectMatch'' with the ''permanent'' option. This ensures that a ''301'' (moved permanently) status is sent before redirecting.

Suppose you have a virtual host configuration for the domain ''servername.com'':
<code apache>
<VirtualHost *:80>
  ServerName    servername.com
  DocumentRoot  /var/www/servername.com
  <Directory /var/www/servername.com>
    Order allow,deny
    Allow from all
  </Directory>
</VirtualHost>
</code>

and that you want to redirect ''www.servername.com'' to ''servername.com''. You would then add another virtual host that performs the redirection:

<code apache>
<VirtualHost *:80>
  ServerName  www.servername.com
  RedirectMatch permanent ^/(.*) http://servername.com/$1
</VirtualHost>
</code>
 +
====== Migrating Website Domain ======

In this example, we want to move ''somesite.net'' to ''newsite.org''. We already have a virtual host in place to handle ''newsite.org'' and we need to redirect every page from ''somesite.net'', as well as any sub-domains, to ''newsite.org''. In order to do that, we modify the old virtual host for ''somesite.net'':

<code apache>
<VirtualHost *:80>
        ServerName    somesite.net
        ServerAlias   *.somesite.net
        RewriteEngine on
        RewriteCond   %{HTTPS}s    ^..(s?)
        RewriteRule   ^            -                                 [E=PROTO:http%1]
        RewriteCond   %{HTTP_HOST} !^\w+\.newsite\.org$              [NC]
        RewriteCond   %{HTTP_HOST} ^(\w+)\.\w+\..+                   [NC]
        RewriteRule   ^/?(.*)      %{ENV:PROTO}://%1.newsite.org/$1  [R=301,L]
        RewriteCond   %{HTTP_HOST} somesite\.net
        RewriteRule   ^/?(.*)      %{ENV:PROTO}://newsite.org/$1     [R=301,L]
</VirtualHost>
</code>

The first condition and rule capture the protocol: ''%{HTTPS}s'' expands to ''ons'' or ''offs'', so the trailing ''s'' is captured only for HTTPS, and ''PROTO'' becomes ''http'' or ''https'' accordingly. The remaining rules match any sub-domain and redirect everything, sub-domains included, to ''newsite.org''.
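When the sub-domain part does not need to be preserved, a simpler sketch using ''mod_alias'' achieves the same permanent redirect for the bare domain:

<code apache>
<VirtualHost *:80>
        ServerName   somesite.net
        # send a 301 for every path, dropping any sub-domain information
        Redirect permanent / http://newsite.org/
</VirtualHost>
</code>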
 +
====== Disabling Timeouts for Long-Running Services ======

When using a web server for live streams or software repositories (git, svn, etc.) it is sometimes useful to disable timeouts. This can be done in Apache using the following directives:

<code apache>
<IfModule reqtimeout_module>
    RequestReadTimeout header=0 body=0
</IfModule>
LimitRequestBody 0
</code>
 +
====== Make Apache Case-Insensitive ======

When migrating documents from case-insensitive filesystems to a server with Apache and a case-sensitive filesystem, some links may appear broken and inaccessible. In order to resolve this, enable the ''speling'' (note the lack of a double ''l'') module with:
<code bash>
a2enmod speling
</code>

and then, either in an ''.htaccess'' file, in a ''Directory'' section, or globally, add the setting:
<code apache>
CheckSpelling on
</code>
 +
====== Configure Zend OpCache ======

To increase the amount of memory allotted to the Zend OpCache found in newer PHP variants, edit the file ''/etc/php5/apache2/conf.d/05-opcache.ini''. It should contain something like the following:
<code>
; configuration for php ZendOpcache module
; priority=05
zend_extension=opcache.so
</code>

and then, after that line, add the following options:

<code>
opcache.memory_consumption=256
opcache.interned_strings_buffer=8
opcache.max_accelerated_files=8000
opcache.revalidate_freq=60
opcache.fast_shutdown=1
opcache.enable_cli=1
</code>
 +
====== Redirect to HTTPS and then Authenticate ======

Suppose you have a ''/support'' alias where clients should be able to access some protected data after authenticating with basic authentication. Over plain HTTP the password would travel over the wire effectively unencrypted, so we need to first redirect to HTTPS and only then ask for authentication.

In the example below a ''/support'' alias is created that is protected by [[/apache/pam_authentication|basic authentication through PAM]]. By using the 403 error document we can redirect to the HTTPS version of the website ''site.tld'':

<code apache>
Alias /support /usr/share/support

<Directory /usr/share/support>
  Options FollowSymLinks MultiViews
  AllowOverride All

  SSLOptions +StrictRequire
  SSLRequireSSL
  ErrorDocument 403 https://site.tld/support

  AuthType Basic
  AuthName "Restricted Area"
  AuthBasicProvider external
  AuthExternal pwauth
  GroupExternal unixgroup
  Require valid-user
</Directory>
</code>

The reason for this hack is that authentication takes precedence, such that using ''mod_rewrite'' to create a redirect would not work. Instead, the setup generates a 403 error when the HTTP version of the website is requested, which in turn redirects to the HTTPS version, where the authentication takes place over an encrypted connection.
 +
====== Enable HSTS ======

If a website accepts a connection through HTTP and then redirects to HTTPS, there is a small window of opportunity during which the client talks to the server without encryption, which can be abused in a man-in-the-middle (MITM) attack. HTTP Strict Transport Security (HSTS) tells browsers to use the HTTPS version of the website directly, without loading the HTTP version first.

First enable the headers module:
<code bash>
a2enmod headers
</code>

To enable HSTS, edit the SSL virtual host of your website and add the following stanza:
<code>
Header always set Strict-Transport-Security "max-age=31556926; includeSubDomains"
</code>

The age in the example is set to one year and the ''includeSubDomains'' option enables HSTS for all subdomains as well.
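Putting the pieces together, the header belongs inside the SSL virtual host only (''site.tld'' below is a placeholder), since browsers ignore the ''Strict-Transport-Security'' header when it is received over plain HTTP:

<code apache>
<VirtualHost *:443>
        ServerName site.tld
        # HSTS is only honored when delivered over a valid HTTPS connection
        Header always set Strict-Transport-Security "max-age=31556926; includeSubDomains"
</VirtualHost>
</code>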
 +
====== Enforce Strong Cryptography ======

When using Apache SSL, the following directives should be added in order to use strong cryptography:
<code>
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLCompression off
SSLSessionTickets off
</code>

These settings disable SSLv2 and SSLv3 and prefer the elliptic-curve (ECDHE) key-exchange suites.

Optionally, given a certificate issued by a certificate authority (stapling does not work with self-signed certificates), OCSP stapling can be enabled for Apache 2.3.3 and above:
<code>
SSLUseStapling on
SSLStaplingResponderTimeout 5
SSLStaplingReturnResponderErrors off
SSLStaplingCache shmcb:/var/run/ocsp(128000)
</code>
 +
====== Return 404 for Default Virtual Host ======

Create a default virtual host with the following configuration:
<code apache>
<VirtualHost *:80>
	Redirect 404 /
</VirtualHost>
</code>

The effect is that a visitor attempting to access the server IP directly, rather than one of the named virtual hosts, will receive a 404 error.
 +
====== Calculate the Optimal Maximum Number of Concurrent Connections ======

To calculate the ''MaxClients'' value for Apache, you first need to determine the Resident Set Size (RSS) of an Apache process. Under Linux, you can issue:
<code bash>
ps -ylC apache2 --sort=rss
</code>

otherwise, the ''top'' utility works as well by watching the ''RES'' (or ''RSS'') column.

The formula to calculate the value that should be passed to ''MaxClients'' in the Apache configuration is:
\begin{eqnarray*}
\text{MaxClients} &=& \frac{\text{Total Memory (GB)} \times 1024 - \text{Memory for Other Processes (MB)}}{\text{Resident Set Size (RSS) (MB)}}
\end{eqnarray*}

For instance, assuming a server with a total of 8GB of RAM, leaving 1GB (1024MB) to other processes, and with an Apache RSS of 100MB, the calculation becomes:
\begin{eqnarray*}
\text{MaxClients} &=& \frac{8 \times 1024 - 1024}{100} \\
&\approx& 71
\end{eqnarray*}
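The arithmetic above can be checked directly in the shell, using the memory figures from the example:

<code bash>
total_gb=8        # total RAM in GB
other_mb=1024     # memory reserved for other processes, in MB
rss_mb=100        # RSS of one Apache process, in MB
echo $(( (total_gb * 1024 - other_mb) / rss_mb ))
# prints 71
</code>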

fuss/apache.txt · Last modified: 2017/02/22 18:30 (external edit)
