Problem: You want backups of a huge website that capture only new and changed files, yet still let you restore any file as it existed at any point in time. You don't want to keep multiple complete copies of the site, and your only access to the server is FTP.
Solution: So, here's what you can do (the git init is one-time setup in the download directory; the other three commands are your daily backup):
git init
wget -r -N ftp://username:password@yoursite.com
git add .
git commit -m "daily backup"
Basically, mirror the site with timestamping (-N re-downloads a file only when the remote copy is newer than your local one), then add and commit the result to the git repository.
wget will only transfer new and updated files, git add will stage exactly those changes, and git commit records a snapshot of the whole folder structure including them. (One caveat: wget -N won't delete local copies of files removed from the server, so deletions never show up in the backup history.)
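The incremental effect is easy to verify locally without touching FTP at all; this sketch simulates two backup runs in a throwaway repository (the file names are made up for illustration):

```shell
#!/bin/sh
set -e

# Throwaway repo standing in for the downloaded site mirror.
dir=$(mktemp -d)
cd "$dir"
git init -q .
git config user.email backup@example.com
git config user.name backup

# First "backup run": two files appear.
echo "v1" > index.html
echo "v1" > about.html
git add .
git commit -q -m "daily backup"

# Second run: wget would have refreshed only index.html,
# so git sees just that one change.
echo "v2" > index.html
git add .
git commit -q -m "daily backup"

# The second commit contains only the changed file.
git show --stat --oneline HEAD
```

Each commit stores a full snapshot logically, but git deduplicates unchanged content internally, so disk usage grows only with the changes.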
git log will show you your backup history, and you can git checkout any previous commit (or a single file from one) to restore that point in time.
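Restoring a single file works the same way as restoring the whole tree; a sketch in a throwaway repo (file name is illustrative, and HEAD~1 means "one commit before the latest"):

```shell
#!/bin/sh
set -e

dir=$(mktemp -d)
cd "$dir"
git init -q .
git config user.email backup@example.com
git config user.name backup

# Two backup runs with different contents.
echo "old content" > index.html
git add . && git commit -q -m "daily backup"
echo "new content" > index.html
git add . && git commit -q -m "daily backup"

# Pull yesterday's version of just this file back into the working tree.
git checkout HEAD~1 -- index.html
cat index.html    # old content
```

You can substitute any commit hash from git log for HEAD~1 to reach further back in time.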