The easiest way to back up Google Mail accounts is to enable IMAP on the account and download the messages using getmail.
Once getmail is installed, you need to create a file called getmailrc. If you plan to download multiple Gmail accounts, you might want to create a directory for each account and point getmail at that directory. Here is an example getmailrc for Google Mail:
[retriever]
type = SimpleIMAPSSLRetriever
server = imap.gmail.com
username = username@example.com
password = examplepassword
mailboxes = ("[Gmail]/All Mail",)
port = 993
[destination]
type = Maildir
path = ~/username@example.com/
[options]
received = false
delivered_to = false
read_all = false
verbose = 1
After this file is saved you can proceed to run getmail. I needed getmail to run all night in the background, so I redirected stdout (and stderr) to a logfile with the following command:
getmail --getmaildir . > output.txt 2>&1 &
Don't forget to create the directories cur, new, and tmp inside the destination path, as these are required by the Maildir format.
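With the path from the getmailrc above, one way to create them:
mkdir -p ~/username@example.com/{cur,new,tmp}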
Now that you have your mail in Maildir format, what do you do with it? In my case I wanted to delete the account off Google Apps but still be able to search the mail if I needed it at a later date.
The strategy I came up with was to bring up a copy of Courier and serve the Maildir through a webmail client (in this case Roundcube).
I first set up PHP behind Nginx. The easiest way to get a PHP environment up and running with Nginx is to use the Ubuntu packages:
php5-cgi
php5-common
For additional functionality such as PostgreSQL support you can install the package:
php5-pgsql
I then configured Nginx with the following server block:
server {
    listen 80;
    server_name webmail.example.com;
    access_log /var/log/nginx/access.log;
    log_subrequest off;

    location / {
        root /www/webmail.example.com;
        index index.php;

        location ~ \.php$ {
            include fastcgi_params;
            fastcgi_pass localhost:9000;
            fastcgi_param SCRIPT_FILENAME /www/webmail.example.com/$fastcgi_script_name;
        }
    }
}
I then created the script: /etc/init.d/php-fcgi
#!/bin/sh

BIND=127.0.0.1:9000
USER=www-data
PHP_FCGI_CHILDREN=15
PHP_FCGI_MAX_REQUESTS=1000

PHP_CGI=/usr/bin/php-cgi
PHP_CGI_NAME=`basename $PHP_CGI`
# Run php-cgi under a clean environment via /usr/bin/env (the "-" clears it)
PHP_CGI_ARGS="- USER=$USER PATH=/usr/bin PHP_FCGI_CHILDREN=$PHP_FCGI_CHILDREN PHP_FCGI_MAX_REQUESTS=$PHP_FCGI_MAX_REQUESTS $PHP_CGI -b $BIND"
RETVAL=0

start() {
    echo -n "Starting PHP FastCGI: "
    start-stop-daemon --quiet --start --background --chuid "$USER" --exec /usr/bin/env -- $PHP_CGI_ARGS
    RETVAL=$?
    echo "$PHP_CGI_NAME."
}

stop() {
    echo -n "Stopping PHP FastCGI: "
    killall -q -w -u $USER $PHP_CGI
    RETVAL=$?
    echo "$PHP_CGI_NAME."
}

case "$1" in
    start)
        start
        ;;
    stop)
        stop
        ;;
    restart)
        stop
        start
        ;;
    *)
        echo "Usage: php-fcgi {start|stop|restart}"
        exit 1
        ;;
esac
exit $RETVAL
After PHP was up and running I installed Courier:
apt-get install courier-imap courier-imap-ssl
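Note that Debian's courier-imap serves each account's mail from ~/Maildir by default (the MAILDIRPATH setting in /etc/courier/imapd), so if your downloaded Maildir lives somewhere else you can symlink it into place:
ln -s ~/username@example.com ~/Maildir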
I then downloaded Roundcube and configured it as necessary.
Wednesday, October 19, 2011
Google Apps Split Delivery for Email - Have your cake and eat it too
Split delivery for e-mail is when a single incoming message gets a copy delivered to multiple destinations. There are a few reasons you might want to do this:
- You are getting ready to migrate to Google Apps but aren't ready to go all in and drop your current e-mail server. This is understandable, since you want to test Google Apps first to see if it will work.
- You like Google Apps but don't want to pay the yearly fee. You'd rather stay under the free-account limit but still have e-mail accounts on your domain (i.e. @example.com).
- You have other special circumstances where you want a copy of all incoming mail delivered to some other server.
This post is aimed mostly at the second point. 90% of my accounts are on Google Apps, but there is a remaining 10% that I'd rather not give Google Apps accounts. However, I still want them to have e-mail via some webmail client.
To get this working I am using Ubuntu with Postfix as the primary mail server for example.com. I set the MX records for example.com to:
10 mail.example.com
20 ALT1.ASPMX.L.GOOGLE.COM
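You can check that the records are in place with dig:
dig +short MX example.com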
Pretty straightforward so far. The trick is that you need to get Postfix to forward a copy of all Google Apps mail to Google's servers. To do this I use Postfix's before-queue content filter: http://www.postfix.org/SMTPD_PROXY_README.html.
This lets me run an SMTP server that Postfix hands each incoming message to, and that server relays a copy on to Google Apps. In the file /etc/postfix/master.cf I put the following at the end:
# =============================================================
# service type private unpriv chroot wakeup maxproc command
# (yes) (yes) (yes) (never) (100)
# =============================================================
#
# Before-filter SMTP server. Receive mail from the network and
# pass it to the content filter on localhost port 10025.
#
smtp      inet  n       -       n       -       20      smtpd
        -o smtpd_proxy_filter=127.0.0.1:10025
        -o smtpd_client_connection_count_limit=10
This makes Postfix deliver a copy of the incoming mail to the SMTP server at 127.0.0.1:10025.
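Reload Postfix to pick up the master.cf change:
postfix reload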
Okay, but now you are asking: where do I get an SMTP server to do the processing? It so happens Python comes with an SMTP server library (smtpd). I wrote a script that inherits from SMTPServer (called CustomSMTPServer). It also implements the one command that is expected by Postfix (EHLO), because Postfix actually speaks ESMTP; I did this by subclassing the smtpd.SMTPChannel class. One caveat is that I had to use the _SMTPChannel__variablename syntax, because some variables like fqdn and greeting were made private by the SMTPChannel class, and in Python you access such name-mangled attributes by prepending the class name. This is generally bad practice, but in this case it was all I had.
You can download the script here:
http://dl.dropbox.com/u/2177278/pymailforwarder.py
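In case the link goes stale, here is a minimal sketch of the approach (assuming the Python 2-era smtpd module; the relay hostname is taken from the MX records above, and all error handling is omitted):

import asyncore
import smtpd
import smtplib

class CustomSMTPChannel(smtpd.SMTPChannel):
    # Postfix greets with EHLO, which the stock Python 2 smtpd module
    # does not implement, so mirror its smtp_HELO handler. greeting and
    # fqdn are name-mangled private attributes of SMTPChannel, hence
    # the _SMTPChannel__ prefix.
    def smtp_EHLO(self, arg):
        if not arg:
            self.push('501 Syntax: EHLO hostname')
        elif self._SMTPChannel__greeting:
            self.push('503 Duplicate HELO/EHLO')
        else:
            self._SMTPChannel__greeting = arg
            self.push('250 %s' % self._SMTPChannel__fqdn)

class CustomSMTPServer(smtpd.SMTPServer):
    def handle_accept(self):
        # Hand each new connection to the ESMTP-aware channel above.
        conn, addr = self.accept()
        CustomSMTPChannel(self, conn, addr)

    def process_message(self, peer, mailfrom, rcpttos, data):
        # Relay every accepted message on to Google's MX. The hostname
        # is an assumption based on the MX records earlier in the post.
        relay = smtplib.SMTP('aspmx.l.google.com', 25)
        try:
            relay.sendmail(mailfrom, rcpttos, data)
        finally:
            relay.quit()

CustomSMTPServer(('127.0.0.1', 10025), None)
asyncore.loop()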
Simply run the script to start a basic SMTP server listening on localhost port 10025. The script simply accepts each piece of mail and forwards it on to Google Apps' mail server.
So what does this let you do? In my case this lets me run Roundcube, Horde, or SquirrelMail for users that don't need a Google Apps e-mail account. For those that do I simply create that user on Google Apps.
Things to watch out for with this type of deployment:
- You have to edit /etc/postfix/main.cf and add the value
local_recipient_maps =
(Yes, that is set equal to nothing.) This makes Postfix accept mail even when the recipient is not in its local recipient table. This can be bad. The Postfix documentation says:
With this setting, the Postfix SMTP server will not reject mail with "User unknown in local recipient table". Don't do this on systems that receive mail directly from the Internet. With today's worms and viruses, Postfix will become a backscatter source: it accepts mail for non-existent recipients and then tries to return that mail as "undeliverable" to the often forged sender address.
To get around this you should really have an alias map file. For temporary testing, though, the empty setting works wonders.
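For example, one way to do it (a sketch of the general idea, not the exact files from this setup): list the deliverable addresses in a lookup table, and alias the users that only exist on Google Apps to /dev/null, since their real copy already reaches Google through the proxy filter:

# /etc/postfix/main.cf
local_recipient_maps = hash:/etc/postfix/recipients $alias_maps

# /etc/postfix/recipients -- one deliverable address per line (values ignored)
webmailuser@example.com   OK

# /etc/aliases -- discard mail for addresses that only exist on Google Apps
googleuser: /dev/null

Then rebuild the maps with postmap /etc/postfix/recipients and newaliases, and run postfix reload.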
- The Python SMTP relay should not be exposed to the internet. It listens on localhost, but ideally the script would be modified to require some form of authentication.
Sunday, October 16, 2011
Locales in Ubuntu
The list of locales installed on a system is in the directory:
/usr/lib/locale
To generate a locale you can run:
locale-gen en_US.UTF-8
This is very useful when you need to generate a UTF-8 locale.
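To see which locales are already available, run:
locale -a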
Thursday, October 13, 2011
Microsoft Office Communicator - Problem verifying certificate from the server.
When using Microsoft Office Communicator with a server that has TLS enabled, you might get the error message "Problem verifying certificate from the server."
This message means that the computer you are on does not trust the certificate that is being presented to it.
The first way to troubleshoot this is to figure out what certificate it is receiving. The easiest way I've found to do this is to use openssl's s_client:
openssl s_client -connect lcs.example.com:5061
By doing this you will see the entire certificate chain. You then need to go into the Windows certificate management tools and make sure that chain is valid.
Generally this will involve running mmc.exe, then adding the "Certificates" snap-in for the computer account.
Another option is to cut and paste the BEGIN and END certificate lines into a text file. Save it with a .cer extension (the pasted block is Base64/PEM-encoded, which Windows accepts in a .cer file) and install the certificate. Then browse to the certificate in MMC and see if anything is wrong. Things that might be wrong include the validity dates or being unable to trust the certificate chain (most likely from missing intermediate certificates).
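Note that s_client prints only the server's own certificate in PEM form by default; if you need the intermediates too, add -showcerts to dump every certificate in the chain:
openssl s_client -connect lcs.example.com:5061 -showcerts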
If you are missing certificates, you need to track them down and install them. After this is done you should be able to connect to the Communicator server.
Wednesday, October 12, 2011
Getting Around the YouTube Duplicate Content Filter
I was recently trying to upload a video to YouTube but kept getting the Rejected (duplicate upload) message.
I found the easiest way to get around this is to change the metadata on the video you are trying to upload. In my case it was an mp4 video.
To change the metadata I used a program called AtomicParsley:
http://atomicparsley.sourceforge.net/
After downloading it simply run
AtomicParsley.exe "example.mp4" --artist "Me"
Or whatever artist you want. The new file will be written out with the new metadata. This should then pass any YouTube duplicate content check.
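To double-check that the tag took, AtomicParsley can print a file's metadata back out (the -t switch, if I'm remembering its flags right):
AtomicParsley.exe "example.mp4" -t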
Wednesday, October 5, 2011
WordPress .htaccess
One of my friends recently wanted to rewrite some URLs on a WordPress installation. The theme they were using, called Solid-WP, supported a concept called "Projects". When you create a new project it gets the URL
http://www.example.com/project/
However there was a requirement that the URL should be renamed to http://www.example.com/apps/.
To do this I broke out mod_rewrite. WordPress stores some default rewrites inside .htaccess. Here is what it looks like:
# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
Rewrite rules can be hard to understand, so I wanted to break down exactly what WordPress is doing. The first line turns on the rewrite engine and the second line defines a base URL for rewrites.
RewriteRule ^index\.php$ - [L]
If there is a request for /index.php then don't do any rewriting and just end processing right here (the L flag).
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
These rewrite conditions test for real files and directories. Basically, we don't want to rewrite a URL if it points to an actual file or directory; we just want to serve it up. !-f matches when no such file exists and !-d matches when no such directory exists.
RewriteRule . /index.php [L]
This last line, as I understand it, rewrites all remaining requests to index.php (WordPress's front controller).
What I ended up adding was this:
RewriteEngine On
RewriteRule project/(.+) /apps/$1 [L,R]
RewriteRule apps/(.+) /example-wp/index.php/project/$1/ [L]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
A user first requests /project/example: it matches the first rule, they get redirected (the R flag) to /apps/example, and processing stops.
The redirected request for /apps/example then triggers the second rule and gets rewritten internally to the long form of WordPress's controller path.
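A quick way to watch both hops (with a made-up project slug):
curl -sI http://www.example.com/project/example | grep -i '^Location'
The first response is a redirect whose Location header points at /apps/example; fetching that URL gets rewritten internally, so the client sees no second redirect.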
Thursday, January 20, 2011
Designing Good Web APIs
I am not going to claim eons of experience in designing good APIs; I am approaching this from the background of a developer who has to use and integrate with them.
What makes a good API? Here are some of my tenets:
1. Make your API calls as RESTful as possible.
Because this term has evolved quite a bit, I am going to stick with the primary principles: make API calls nouns describing "what", not verbs describing "do something".
2. API calls should have defaults.
Don't give me 100 required parameters just to see some action. Try as hard as you can to give me a basic call so I know I am doing something right. Along with this, keep the number of API calls to a minimum: the more calls you have documented, the more I have to figure out which one I am supposed to be calling to get the right information. This is an area where I actually do want to be spoon-fed.
3. Use HTTP basic over SSL for authentication.
If you have the time and energy, also support OAuth. The reason I hesitate to go straight to OAuth is that if you are just building an API, your resources are probably limited and you are going to make mistakes. Getting it up and running with the lowest (yet still secure) common denominator is key.
Also use dedicated API keys that consist of an ID and a shared secret. Make sure these can be rolled.
4. Make your return format JSON.
For the love of god please don't use XML.
5. Version your API calls in the endpoint:
/api/1.0/user/3
/api/2.0/user/3
6. Make the calls easily testable.
To me this means I can plug it into a web browser, type in some credentials, and get a response back (see the cURL example after this list). As a developer this makes me feel good early on.
7. Document your API calls with examples.
Especially the ones where I can click on a link and it does an API call for me. This is also great for testing your call.
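Tying a few of these together, a versioned, HTTP-basic, JSON endpoint is one line to exercise (the host and credentials here are invented for illustration):
curl -u mykeyid:mysecret https://api.example.com/api/1.0/user/3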
A number of these guidelines are geared heavily to web development. If you are designing an API for a message passing system with latency and size requirements a good deal of this would change.