Monday, October 15, 2012

Compiling RetroShare for the Raspberry Pi

The question is not whether you are paranoid,
it is whether you are paranoid enough.
I'm a fan of darknets. I've always been. I don't want this to turn into a political discussion - it is a "tech" blog - so it should be sufficient to say that I like and use RetroShare.

RetroShare is a decentralized sharing application/network that is, in my opinion, the best darknet software available right now. I've already tried WASTE, GnuNet and AllianceP2P, and they couldn't convince me for long.
Maybe I'll write an article about the pros and cons of RetroShare some day, but here I want to describe my attempts to get it running on my Raspberry Pi, so it can serve as a 24/7 node in the RetroShare network, forwarding connections without consuming much power.

When I heard about the Raspberry Pi, the geek inside me wanted one. Fabian is still pissed that my Raspi arrived sooner than his (that is, that it arrived at all), although I ordered it much later. (I think he wants to go for Parallella now ...)

I've already managed to compile RetroShare 0.5.3 on and for my Raspi, but it segfaulted whenever I wanted to add a friend or open the settings menu. With the recent release of 0.5.4 I thought I could give it a try again.

Be warned: The compiling alone takes a few hours. I didn't write down exact times, but it's probably best if you have some other tasks to attend to for 2 hours while one subproject is compiling.

What we need to get started

Raspberry Pi and Raspbian Wheezy
I'm starting with my Raspberry Pi (Model B), a sufficiently large SD card (4 GB should be enough) and a freshly installed Raspbian Wheezy (build 2012-09-18), which I updated today (15th October) with apt-get update && apt-get upgrade and haven't modified so far, besides putting my dotfiles under revision control using git.

In case that information changes on the Raspi website, I'll duplicate the links and information here, so we're all on the same page:
Default login: username pi, password raspberry

RetroShare sources
The RetroShare version we need is 0.5.4b.

Create a directory development in your home directory, cd into it, download the source tarball there and unpack it:
mkdir -p ~/development/RetroShare-v0.5.4b
cd ~/development
# download RetroShare-v0.5.4b.tar.gz into this directory, then:
cd RetroShare-v0.5.4b
tar -xvf ../RetroShare-v0.5.4b.tar.gz

Required packages

My primary source for instructions is the RetroShare wiki page UnixCompile (as of today's version). It is a bit outdated: they've removed gnupg and introduced openpgp, but didn't update the instructions. Install the following packages:
sudo apt-get install libqt4-dev g++ libupnp-dev libssl-dev libgnome-keyring-dev libbz2-dev libxss-dev

Then you need to compile the projects in the subdirectories, but there are still some modifications necessary to make it work under Debian. The first two subprojects should work fine:
cd ~/development/RetroShare-v0.5.4b/trunk/
cd libbitdht/src && qmake && make clean && make -j2
cd ../../openpgpsdk/src && qmake && make clean && make -j2
The problems start with libretroshare; there are two changes you need to make to it:

Firstly, even though the preprocessor commands are there, you still need to tell the make process that you are on Debian and that your libupnp is of a different version. You do this by editing the qmake project file in the libretroshare/src/ directory: after line 221 you add (not replace) the lines that set the installed libupnp version, because we do in fact have version 1.6.17 of libupnp installed. (See this thread in the RetroShare forum.)

Secondly, someone hardcoded the location of glib-2.0 into that project file in a very system-dependent way. You need to change this line (it should be directly below the previous change, line 224 now)
INCLUDEPATH += /usr/include/glib-2.0/ /usr/lib/glib-2.0/include
to this
INCLUDEPATH += $$system(pkg-config --cflags glib-2.0 | sed -e "s/-I//g")
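To see what this substitution does, you can run the pipeline by hand; a sketch, where the echoed flags stand in for whatever pkg-config actually reports on your system:

```shell
# Simulate the qmake substitution: pkg-config prints -I flags and the sed
# strips the "-I" prefixes, leaving bare include paths for INCLUDEPATH.
flags="-I/usr/include/glib-2.0 -I/usr/lib/glib-2.0/include"   # example pkg-config output
echo "$flags" | sed -e "s/-I//g"
# On the Raspi itself you would run:
#   pkg-config --cflags glib-2.0 | sed -e "s/-I//g"
```

This way the project file works on any system where pkg-config knows about glib-2.0, instead of only on the one machine with those hardcoded paths.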

cd ../../libretroshare/src && qmake && make clean && make -j2

And now the real fun starts. The Raspberry Pi has 256 MB of RAM, of which at least 16 MB need to be reserved for the video core, so only 240 MB are left. Unfortunately, this is not enough to compile retroshare-gui; it will quit with something like
virtual memory exhausted: Cannot allocate memory
g++: internal compiler error: Killed (program cc1plus)
when it tries to compile qrc_images.cpp.

Super hot update
The irony of the situation is that just today an enhanced version of the Raspberry Pi with 512 MB of RAM was announced, which should make the following swap file part obsolete.

Now, I've heard that very, very bad things happen if you create a swap file on a flash storage device like the SD card your Raspi uses or a USB stick. But as a matter of fact, Raspbian Wheezy already uses a swap file by default, so it can't be that bad. All you need to do is create another swap file of sufficient size, say 256 MB, which you delete afterwards just to be on the safe side:
sudo dd if=/dev/zero of=swapfile bs=1M count=256
sudo chmod 600 swapfile
sudo mkswap swapfile
sudo swapon swapfile
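Before kicking off the build you can check that the extra swap is actually active; a quick sanity check (the totals will of course differ per system):

```shell
# Total swap known to the kernel; with the 256 MB file active the number
# should have grown accordingly.
grep SwapTotal /proc/meminfo
# swapon -s additionally lists the individual swap files/partitions.
```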
Then you can compile retroshare-gui
cd ../../retroshare-gui/src && qmake && make clean && make -j2
and deactivate the swap file again:
sudo swapoff swapfile
sudo rm swapfile
This is also a good opportunity to turn off the default swap file, as it really isn't that healthy for your SD card. (Hoping that actually running RetroShare is possible with only 256 MB of RAM.)


And that's it, you've just compiled RetroShare. This article is already way too long, so I'll stop here and post about configuring and running it in a later post. Hopefully soon.

Sunday, October 14, 2012

Step-wise howto for dotfiles with git

After implementing and using the approach to back up and manage Linux configuration files for multiple machines, I wanted to give a short reference and summary, both for you and for me, to be able to easily look it up.

Step 1: Create a repository at github

Log into your account at github (if you don't have one, create one), and click on "Create a new repo". There you should enter a good name for your repository (I suggest "dotfiles" or similar) and - if you want - a description. Then click on "create repository". Done ;)

In case you haven't done so already, you should add the public SSH keys of your machines to the list of authorized keys in your github account settings, otherwise you won't be able to push to your git repositories using SSH.
(In short, you have to copy the contents of your public key file in ~/.ssh/ into the box on the github site. If you have no idea what I'm talking about, better read up on SSH public key authentication.)
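If a machine doesn't have a key pair yet, generating one is a one-liner. A sketch: the target path here is a throwaway example for the demo; normally you'd accept ssh-keygen's default location in ~/.ssh/:

```shell
# Generate an RSA key pair without a passphrase; the .pub file is what
# gets pasted into the box on the github site.
keyfile=$(mktemp -d)/demo_key        # throwaway path for this demo
ssh-keygen -q -t rsa -b 2048 -N "" -f "$keyfile"
cat "$keyfile.pub"                   # the public key to paste at github
```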

Step 2: Initialize the dotfiles directory and repository

At first, you need to create the .dotfiles directory and move the dotfiles that you want to put under revision control there.
mkdir ~/.dotfiles
mv ~/.bashrc ~/.dotfiles/bashrc
mv ~/.bash_aliases ~/.dotfiles/bash_aliases
mv ~/.screenrc ~/.dotfiles/screenrc
mv ~/.vimrc ~/.dotfiles/vimrc
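If you track more files, the moves can be scripted with a loop. Here is a sketch on a throwaway directory so nothing real is touched; the real run would operate on your actual home directory:

```shell
# Demo of the move step: each listed dotfile is moved from the "home"
# directory into .dotfiles, dropping the leading dot.
home=$(mktemp -d)                      # stand-in for ~
dir="$home/.dotfiles"
mkdir -p "$dir"
touch "$home/.bashrc" "$home/.vimrc"   # pretend these dotfiles exist
files="bashrc bash_aliases screenrc vimrc"
for f in $files; do
  # move only real files, not symlinks that may already point into $dir
  if [ -e "$home/.$f" ] && [ ! -h "$home/.$f" ]; then
    mv "$home/.$f" "$dir/$f"
  fi
done
ls "$dir"                              # bashrc and vimrc now live here
```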
Then you need to place the symlinker script there (shown in my previous post; I'll call it makesymlinks.sh here, use whatever name you gave it), make it executable and execute it.
cd ~/.dotfiles
chmod u+x makesymlinks.sh
./makesymlinks.sh

Then initialize your repository and link it to github. This only needs to be done once.
git init
git config --global user.name "YOUR NAME"
git config --global user.email YOUR_EMAIL
git remote add origin git@github.com:GITHUB_USERNAME/dotfiles.git
where "YOUR NAME" and YOUR_EMAIL are the name and mail address with which your commits will be signed. You don't have to do this, and you don't have to provide your real name or address there. Decide for yourself.
But GITHUB_USERNAME has to be your github username (who would have guessed ...), and dotfiles the repository name you chose in step 1.

Step 3: Track changes with git

Now and whenever you have changed something in your .dotfiles folder, add those changes or files to the repository and push it to github.
git add bashrc
git add bash_aliases
git add screenrc
git add vimrc
Also git add the symlinker script under whatever name you gave it, then
git commit -m "First commit with some rc-files and the symlinker script"
git push origin master

Step 4: Add additional machines to your dotfiles management

Whenever you want to place another machine under your git dotfile management, you need to check out that repository on that machine.
git clone git@github.com:GITHUB_USERNAME/dotfiles.git ~/.dotfiles
git config --global user.name "YOUR NAME"
git config --global user.email YOUR_EMAIL

Step 5: Pull changes from your github repository to your local machine

I haven't thought about whether I want to automate this step and how. For the moment, manually pulling changes from github seems the best solution to me, since I'm not changing my dotfiles on a daily basis.
git pull origin master
and sometimes you have to run the symlinker script again, if you just added the machine to the management system or if new dotfiles have been put under revision control.

Additional remarks

I've found it more comfortable to set github as the upstream repository with
git push -u origin master
because after that I only have to use
git push
git pull
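What -u buys you can be seen in a throwaway setup; the bare repository here stands in for github, and all paths and names are invented for the demo:

```shell
# After "push -u", plain "git push"/"git pull" work because the branch's
# upstream is recorded in the repository configuration.
set -e
work=$(mktemp -d)
git init -q --bare "$work/origin.git"          # stand-in for the github repo
git init -q "$work/clone" && cd "$work/clone"
git config user.email demo@example.com
git config user.name Demo
echo test > file && git add file && git commit -qm "first"
git remote add origin "$work/origin.git"
branch=$(git symbolic-ref --short HEAD)        # default branch name
git push -qu origin "$branch"                  # -u records the upstream
git push -q                                    # no arguments needed any more
git pull -q                                    # likewise
```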

Also, I wrote this article based on the bash histories of my machines, so in case you encounter an error, please tell me in the comments so that I can correct this article. Thanks ;)

Monday, October 1, 2012

Backup and manage linux configuration files for multiple machines

If you have multiple computers running linux, you've probably faced the same problem I do right now:
Dealing with your own personally customized configuration files like .vimrc, .bashrc, .screenrc, etc., which are called dotfiles. This includes situations like:
  • You setup a new machine and want to configure it the way you want
  • You upgrade your linux distro and now you have to merge your changes into the new config files
  • You already have multiple machines with different configurations but forgot what is where
I discussed this with a good friend (who also has a blog and you should totally check it out) and he suggested using git to manage them.

Now, I'm certainly not the first one to think about this, and the problem has been solved often enough. I also don't want to copy or repeat what others have already written elsewhere, firstly because I probably can't do it much better, and secondly because I'm lazy. So I'll just explain the basic steps and link to the sources where I got them from.

Git and Github

The first thing I found was the post Using git and github to manage your dotfiles by Michael Smalley, and I liked it very much. It is a good starting point that explains how to set up git and also shows a small script that does some managing of the dotfiles.

This is the point from which I'll proceed, because Fabian (the friend mentioned above) wouldn't stop annoying me until I signed up at github.

But you shouldn't stop there, because there are some issues that can be solved even better.

Different machines need different dotfiles

I wasn't quite sure about the symlinking thing, so I continued searching. I found the post Why I use git and puppet to manage my dotfiles, and what immediately convinced me was this statement:
Of course if I only used the default master branch of git I may as well be storing all of my dotfiles in Dropbox. Since many of my machines need to have their own various customizations I use a branch for each machine. Then I periodically rebase each individual branch on the latest master and use git cherry-pick to move changes from the custom branch to the master branch.
If your machines are on different upgrade levels or even on different distributions, dotfiles that might work on one machine can easily be invalid on another, even if you want to have the same configuration on all of them.
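The quoted branch-per-machine workflow can be sketched with plain git in a throwaway repository; the branch name, file contents and aliases here are invented for the demo:

```shell
# One branch per machine; generally useful changes are cherry-picked back
# to the base branch, and the machine branch is rebased on it afterwards.
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name Demo
base=$(git symbolic-ref --short HEAD)      # default branch (master or main)
echo "alias ll='ls -l'" > bashrc
git add bashrc && git commit -qm "base bashrc"
git checkout -qb laptop                    # machine-specific branch
echo "alias gs='git status'" >> bashrc     # change that is useful everywhere
git commit -qam "generic alias"
generic=$(git rev-parse HEAD)
echo "alias batt='acpi'" >> bashrc         # laptop-only tweak
git commit -qam "laptop tweak"
git checkout -q "$base"
git cherry-pick "$generic"                 # move only the generic change over
git checkout -q laptop
git rebase -q "$base"                      # replay laptop tweaks on new base
```

After this, the base branch carries the generic alias but not the laptop tweak, while the laptop branch carries both.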

Symlinks with Puppet

I'd never heard of Puppet before, but the gist seems to be that you don't specify steps, only goals, and Puppet ensures that your goals are fulfilled.
file { "/home/${id}/.bashrc":
 ensure => link,
 target => "/home/${id}/config/my.bashrc",
}
With this configuration, Puppet will create the symlink, but only if it doesn't exist already. The command for this is
puppet apply symlinks.pp
assuming of course your file is called symlinks.pp

Edit:
Actually, I have to revoke what I said there. I've tried it just now and the problem is that if the files already exist, they will be deleted. At least, I couldn't find the original .bashrc anymore; there was only the symlink to the one in my .dotfiles directory.
I will use a modified version of the bash script from the first link above and post it here later.

Edit 2:
Here it is. My script is based on the one from Michael Smalley, but I didn't like the idea of a separate folder for the old configuration files. Instead, I move old configuration files that are not symlinks into the same .dotfiles folder, but add a suffix that indicates when they were moved there and from which machine. That way, you can put them under revision control and merge them later into your current configuration file.

Edit 3: small correction to the script, it didn't link when there was no previous dotfile to be moved to .dotfiles.


#!/bin/bash
dir=~/.dotfiles                                # dotfiles directory
oldsuffix=$(date +%Y%m%d_%H%M%S).$(hostname)   # when and where the old file was moved

# list of files/folders to symlink in homedir
files="bashrc bash_aliases screenrc vimrc"


# move any existing dotfiles in homedir to the dotfiles directory, then create symlinks
echo "Moving any existing dotfiles from ~ to $dir with suffix $oldsuffix"
for file in $files; do
  absfile=~/.$file
  # if the file exists and is no symlink, move it to .dotfiles
  if [[ -e $absfile ]] && ! [[ -h $absfile ]]; then
    mv "$absfile" "$dir/$file.$oldsuffix"
  fi
  # if the file doesn't exist (anymore), link it
  if ! [[ -e $absfile ]]; then
    echo "Creating symlink to $file in home directory."
    ln -s "$dir/$file" "$absfile"
  fi
done

By the way, you can find my github repository of my dotfiles here:

Reuse existing dotfiles

Because other people had the same problem, there are already lots of repositories for dotfiles available. You can browse them and pick some you like, e.g. at

And don't forget: Dotfiles Are Meant to Be Forked

That's it. I'm lazy and I want to actually implement this solution on my computers, because until now there is no backup or management of my dotfiles whatsoever. If you spot any errors here or have questions or problems that you are too lazy to solve yourself, comment below and I'll see what I can do ;)