SpiderOak - empty the upload queue from the command line

Submitted by Janak on Mon, 07/11/2011 - 00:29

Being a fan of all things security, I recently started using SpiderOak for my secure cloud storage. The service is brilliant, but I came across a few issues, one of which is quite a big one for me.

Suppose, in the heat of the moment, you decide to upload/back up a large folder or file, only to change your mind. There isn't a neat GUI option that will cancel the upload process. Hopefully SpiderOak will fix this in the next release; in the meantime, here is how I managed to purge the upload queue on my Mac:

Pause all uploads

Well, hit the 'Pause all uploads' button in the Status tab.
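With uploads paused, quit SpiderOak completely and clear the queued transactions from the command line. A rough sketch of what worked for me - the queue location under Application Support is an assumption and varies per account/device, so verify the path on your own install before deleting anything:

# quit SpiderOak first, then look inside its application-support folder
cd ~/Library/Application\ Support/SpiderOak
# pending uploads live in a per-device queue directory (path is an assumption - check yours)
rm -rf */queue
# relaunch SpiderOak; the upload queue should come back empty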

Dropbox Encryption - Install EncFS on Linux to encrypt-decrypt Dropbox content realtime

Submitted by Janak on Wed, 06/22/2011 - 17:24

If you are using Dropbox, it might be worthwhile installing some kind of encryption to secure and protect your data. It is fairly easy if you are using Linux.

Install EncFS

sudo yum install fuse-encfs

Setup the encryption and decryption paths

encfs ~/Dropbox/encfs ~/Documents/encfs
The directory "/home/janaksingh/Dropbox/encfs/" does not exist. Should it be created? (y,n) y
The directory "/home/janaksingh/Documents/encfs" does not exist. Should it be created? (y,n) y
Creating new encrypted volume.
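After answering the prompts, the volume is mounted: anything you save into ~/Documents/encfs shows up encrypted in ~/Dropbox/encfs, and only the encrypted copies are synced. To unmount and remount later:

# unmount the cleartext view (FUSE)
fusermount -u ~/Documents/encfs
# remount - you will be prompted for the EncFS password
encfs ~/Dropbox/encfs ~/Documents/encfs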

Dropbox Encryption - Install EncFS on OSX to encrypt-decrypt Dropbox content realtime

Submitted by Janak on Mon, 05/09/2011 - 01:16

If you have been following the recent Dropbox news, you might have some serious concerns about the privacy and security of your data. If, like me, you do not "trust" anyone with your data, here is a quick guide to getting started with EncFS for real-time - on the fly - encryption of your Dropbox contents.

wait... BUT..... what is EncFS??
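In short: EncFS is a FUSE-based encrypted filesystem. It overlays a cleartext view (say ~/Documents/encfs) on top of a directory of encrypted files (say ~/Dropbox/encfs), encrypting each file individually - which suits Dropbox nicely, since only the changed encrypted files get re-synced. On OSX you need a FUSE layer (MacFUSE at the time of writing) plus EncFS itself; as a sketch, assuming MacPorts is installed (the full post walks through the details):

sudo port install encfs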

Rsync to OSX with foreign letters in filenames - delete copy loop

Submitted by Janak on Sun, 05/08/2011 - 14:12

So, you have a few files with foreign letters in their filenames and you back them up using rsync. All is well unless you rsync from a Linux machine to an OSX machine. It seems OSX handles UTF-8 characters in its own way: HFS+ stores filenames in decomposed form, so an accented letter like 'é' becomes a base letter plus a combining accent - 3 bytes instead of 2 - and no longer matches the name on the Linux side.. hmm.. so:

Linux -rsync-> OSX

  • First pass: rsync from linux to OSX and all files appear to be copied
  • Second pass: rsync deletes the files on remote OSX server and copies them again
  • Third pass: same as above
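The fix is rsync 3.x's --iconv option, which converts filenames between character sets on the fly. Since OSX's iconv understands the decomposed 'utf-8-mac' encoding, the safest place to run the command is on the Mac, pulling from the Linux box - a sketch with hypothetical host and paths, assuming rsync 3.x on both ends:

# run on the Mac: local names are utf-8-mac, remote (Linux) names are utf-8
rsync -av --iconv=utf-8-mac,utf-8 user@linuxbox:/data/photos/ /Volumes/backup/photos/

With the filenames normalised on the way in, the second pass becomes a no-op instead of a delete-and-copy loop.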

Rsync on OSX - Upgrade to version 3x

Submitted by Janak on Sun, 05/08/2011 - 13:32

OK, OSX is a "good" operating system and I use it alongside Win7 and CentOS, BUT there are a number of things that annoy me from a geek perspective. One of them is the outdated versions of various bundled packages - in this case, rsync.

By default, OSX 10.6 ships with version 2.6.9, which is "fine" for most basic tasks, but if you are rsyncing files with international characters in their filenames from a Linux machine, you need rsync's iconv option - more about that in this post.
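The quickest route I found was building 3.x from source (3.0.8 at the time of writing) - a sketch assuming Xcode's command-line tools are installed; note it lands in /usr/local/bin, which needs to come before /usr/bin in your PATH:

curl -O https://rsync.samba.org/ftp/rsync/src/rsync-3.0.8.tar.gz
tar xzf rsync-3.0.8.tar.gz
cd rsync-3.0.8
./configure && make
sudo make install   # installs to /usr/local/bin by default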

Complete workflow, storage and backup solution for images and video - on a budget

Submitted by Janak on Thu, 05/05/2011 - 18:51

Around a year ago the unthinkable happened: my HDDs failed on me, taking over 6 years' worth of photography and graphic design work with them. Nope, I only had one "backup", and that was corrupted and out of date. Lessons learnt..

Having searched the web and seeking advice from @chromasia and others, I wanted to setup a bulletproof workflow and backup solution for my digital assets - mainly RAW pics and 1080p video footage from my Canon 7D.

Fedora 14 - Enable Apple File Sharing for OSX clients

Submitted by Janak on Sun, 01/23/2011 - 22:40

So, I had a spare NC10 netbook, which made for a perfect candidate to get Fedora 14 installed and turned into a headless NAS + torrent box. All perfect, but I wanted to share the files from Fedora with my MacBook. There are loads of lengthy and complex tutorials out there, so here is my compact set of notes from my own install.

Install Netatalk

su
yum install netatalk
yum install avahi

Install AFPD avahi service

nano /etc/avahi/services/afpd.service
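A minimal service definition advertises AFP on the standard port (548), picking up the machine's hostname via the %h wildcard:

<?xml version="1.0" standalone='no'?>
<!DOCTYPE service-group SYSTEM "avahi-service.dtd">
<service-group>
  <name replace-wildcards="yes">%h</name>
  <service>
    <type>_afpovertcp._tcp</type>
    <port>548</port>
  </service>
</service-group>

Then start both daemons and set them to come up on boot:

service avahi-daemon start
service netatalk start
chkconfig avahi-daemon on
chkconfig netatalk on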


Varnish Cache - Memory, Hitrate & Bandwidth Optimisation for Drupal Pressflow

Submitted by Janak on Tue, 01/11/2011 - 16:39

Recently I started to notice a very high number of LRU nuked objects on my websites, which was essentially wiping the entire Varnish cache. I run Varnish with a 4GB file cache, and site content is mostly served by an external "Poor Man's CDN". So, in theory, my site content should be nowhere near 4GB; however, Varnish was running out of memory and "nuking" cached objects.

Sizing your cache

Here is what the Varnish Cache man pages have to say:

Watch the n_lru_nuked counter with varnishstat or some other tool. If you have a lot of LRU activity then your cache is evicting objects due to space constraints and you should consider increasing the size of the cache.
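In practice that means keeping an eye on the counter and growing the storage when it starts climbing - a sketch using the stock RHEL/CentOS init config (file path and sizes are from my setup, adjust to taste):

# check the eviction counter
varnishstat -1 | grep n_lru_nuked

# /etc/sysconfig/varnish - grow the file-backed cache from 4GB to 8GB
VARNISH_STORAGE="file,/var/lib/varnish/varnish_storage.bin,8G"

Bear in mind that Varnish adds per-object overhead on top of the raw content size, which is how a "should fit in 4GB" site ends up nuking objects.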