This is the chapter of the MemberManual that describes how to serve your website(s).
Static Web Sites
If you're going to use a domain, please read the next section. If you plan on having static websites without any CGI scripts such as PHP or Perl, then read on. In your home directory, there is a directory named public_html. By default, you can access its contents at http://www.hcoop.net/~USER. You will never be able to execute server-side scripts when accessing webpages in that manner.
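For example, publishing a static page is just a matter of dropping files into that directory (the page contents below are illustrative):

```shell
# Create public_html if it does not already exist, then add a page
mkdir -p ~/public_html
echo '<h1>Hello from HCoop</h1>' > ~/public_html/index.html
# The page is then visible at http://www.hcoop.net/~USER/index.html
```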
Dynamic Web Sites
If you plan on having a website that uses CGI scripts such as PHP or Perl, then you must have either a domain or an hcoop.net subdomain (i.e., USER.hcoop.net).
When you have chosen a domain to be hosted by HCoop, you then simply request control of that domain at the portal. Once it is authorized by an administrator, you will be able to utilize DomTool. DomTool will let Apache and other services know about your domain. Please take a look at using DomTool, DomTool user guide, and DomTool examples to learn how to do this. Our nameservers are ns1.hcoop.net and ns2.hcoop.net.
As a hint, DomTool configurations are stored in ~/.domtool/. Some users have made their production configurations readable and so you may be able to learn from them. See the bottom of DomTool examples to find out who is showing off their DomTool configurations.
If your web application needs write access to a data directory, give USER.daemon write permission to it and all of its subdirectories. In this example, be sure to replace USER with your username (lowercase):
fsr sa ./webdata USER.daemon write
Alternatively, use plain fs (rather than fsr) if you need to set the ACL for just one directory.
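For example, to grant write access to a single directory only (non-recursively), using the same hypothetical webdata directory:

```
fs sa ./webdata USER.daemon write
```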
For database help, take a look at this manual's Databases chapter.
To see how you can transfer files to HCoop, see the Transferring Files chapter.
Note that .htaccess files are not processed on our servers. See DomTool Examples to learn how to use rewrite rules and other features normally provided by .htaccess.
We use PHP 5 by default to serve .php, .phtml, and .php5 files. Other PHP versions may also be available; to set the PHP version explicitly, use the phpVersion action as follows.
To use PHP 5 in a directory or virtual host:
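A DomTool stanza along these lines should work (a sketch based on the phpVersion action named above; check the DomTool user guide for the exact syntax):

```
web "www" with
  phpVersion php5;
end;
```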
Common Web Applications
It is likely that another member has configured one of many common applications and documented it on the ../WebApplications page.
Running your own web server
Many popular Apache modules for "fast web serving," like mod_python and mod_perl, are incompatible with our security requirements; they force all Python, Perl, etc., scripts run through them to run as a single UNIX user. Thus, to use these modules, you will need to run your own separate web server. See RunningYourOwnApache. You will probably want to run lighttpd instead of your own Apache, since configuring it is much simpler.
It also couldn't hurt to petition the authors of these modules to fix this problem.
Examining your logs
The error and access logs are stored in ~/.logs/apache. They are separated by machine and domain. Your ~/.logs/apache directory is updated once every 20 minutes from the "real" logs in /var/log/apache2 on the machine that serves your virtual host. This is almost certainly navajos.
You will not have permission to read your logs in /var/log/apache2 directly. Instead, use the handy domtool-tail program to view the logs in real time. For instance, the following command line dumps the last entries in the access log for www.domain.com, in the style of the UNIX tail program. We assume that you have DomTool permissions on domain.com.
domtool-tail [-n LINES] www.domain.com access
The optional -n LINES argument will fetch that many lines from the log.
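For example, this fetches the last 50 lines of the error log (assuming the error log goes by the name "error", analogous to "access" above; www.domain.com is a placeholder for your own virtual host):

```
domtool-tail -n 50 www.domain.com error
```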
You can view a graphical representation of your access logs by browsing our webalizer interface at https://members.hcoop.net/webalizer/. Its statistics are updated once per day.
Permissions Issues (403 Access Denied)
When you publish web content, it will probably live in your home directory. The web server will need permission to read your files, or it will return "403 Access Denied" errors. Since your home directory is in AFS, normal UNIX permissions are irrelevant.
For instance, if you get a 403 error serving ~/public_html/otherdir/page.html, you might run this to see what's up:
$ fs listacl ~/public_html/otherdir
Access list for /afs/hcoop.net/user/y/yo/you/public_html/otherdir is
Normal rights:
  system:administrators rlidwka
  system:anyuser l
  you rlidwka
Oops! Apache only matches the "system:anyuser" principal, so it only gets the "l" (= "list") permission and can only list your directory contents. Try this to fix it:
$ fs setacl ~/public_html/otherdir system:anyuser read
$ fs setacl ~/public_html system:anyuser read
$ fs setacl ~ system:anyuser l
The first two give full read permission on the mentioned directories. "l" permission is needed in every parent directory of a file to be able to access it, so the last line makes sure "l" is granted to system:anyuser on your home directory.
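If system:anyuser needs read access throughout a directory tree, the recursive fsr helper mentioned earlier can apply the ACL in one pass (a sketch, assuming fsr accepts the same arguments as fs setacl):

```
fsr sa ~/public_html system:anyuser read
```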
When your web content is accessed through your own virtual host, you can also grant read access to $USER.daemon instead of the broader system:anyuser, where $USER is your username. This is your bizarro-world twin, which Apache runs as when serving your content.
Note that your CGI directories and executables should be in the group nogroup; if this is not the case, you may see cryptic warnings in your error.log along the lines of "suexec policy violation: see suexec log for more details".
See the Getting Started chapter of the Member Manual, in particular the AFS section, for information on how to work with AFS's separate notion of permissions.
Permissions Issues (500 Server Error for CGIs)
If you get a 500 server error when running a CGI script, one likely cause is directory permissions. suexec will refuse to run a CGI script if its parent directory is writable by others, so make sure permissions are set to 755 and not 775. Note that the UNIX directory permissions do not actually affect anything (since we're using AndrewFileSystem), but modifying the suexec code to skip the checks is considered too risky.
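As a sketch, assuming a cgi-bin directory under public_html (the path and script name are illustrative):

```
chmod 755 ~/public_html/cgi-bin              # not 775: no group write bit
chmod 755 ~/public_html/cgi-bin/script.cgi
chgrp nogroup ~/public_html/cgi-bin/script.cgi   # see the nogroup note above
```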
Getting HTTPS access working
In order to serve websites over HTTPS, you will need to generate an SSL certificate and optionally request an IP address from us.
Either generate an SSL certificate yourself, or buy one from somewhere (search for "ssl certificate" using your search engine of choice for a list of popular vendors).
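For example, a self-signed certificate can be generated with OpenSSL. The filenames and domain below are illustrative, and some configurations (such as the DAV example later in this chapter) expect the key and certificate concatenated into a single .pem file:

```shell
# Generate a 2048-bit key and a self-signed certificate valid for one year
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout yourdomain.key -out yourdomain.crt \
  -days 365 -subj "/CN=yourdomain.com"
# Concatenate key and certificate into a single .pem file
cat yourdomain.key yourdomain.crt > yourdomain.pem
```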
This will work for most people, but relies upon TLS SNI. This means that Windows XP users with Internet Explorer will get the generic *.hcoop.net certificate instead of your certificate. Supposedly, they account for less of the browsing public than GNU/Linux users do, so you should almost never need a dedicated IP address just for SSL. However, if you do need one and can justify it, we are happy to provide it (we just don't have very many).
WebDAV
WebDAV is a set of extensions to the HTTP protocol that allows users to collaboratively edit and manage files on remote web servers. WebDAV is useful when working on a website from systems that cannot mount an AFS share. General information about WebDAV can be found at http://research.cs.berkeley.edu/doc/dav/.
If you want to be able to use DAV services with your own domain name, you will need to set up a host which is served via HTTPS. The Getting HTTPS access working section above should be of help. Then, you will want to add a stanza to your DomTool configuration to serve DAV. An example follows.
(* Redirect HTTP to HTTPS *)
web "dav" with
  rewriteRule "^(.*)$" "https://dav.yourdomain$1" [redirect];
end;

(* Serving DAV over HTTPS *)
web "dav" where
  DocumentRoot = home "dav";
  SSL = use_cert "/etc/apache2/ssl/user/yourdomain.pem";
with
  addDefaultCharset "utf-8";
  location "/" with
    davFilesystem;
  end;
end;
You will almost certainly want to require authorization to access your davFilesystem, since it runs with your $USER.daemon tokens and can therefore read and write anything that user can access.