Tag Archives: tricks

Navigating Bash history

Today I was doing some command line based work and I found myself scrolling too many times to reuse the long command I had written, followed by a lot of ls’s and similar. I thought: this is ridiculous, there has to be a better way!

So I searched and I quickly got to history. If you enter history in a prompt, you get… a list of your last entered commands! And you don’t need to scroll past irrelevant ones until you find the one you actually wanted to use. Beautiful! I was excited so I tweeted it.
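In case it's handy, each entry in the history output is numbered, and bash's history expansion lets you re-run one by number (the 123 below is just a placeholder):

history            # prints a numbered list of your most recent commands
!123               # re-runs command number 123 from that list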

Then people who knew more than me told me to use CTRL+R. I had accidentally landed in the CTRL+R weirdness before, when I wanted to reload the current page and thought the browser window had focus when it was actually the bash window, but since I could just ESCape out of it, I didn't pay too much attention.

The way CTRL+R works is similar to a "find in page" feature: you type some text and it shows a matching command as you type, with no need to press a 'find' button. To cycle through older matches, press CTRL+R again (CTRL+S searches forward again, if your terminal isn't using it for flow control).
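To give an idea, pressing CTRL+R and typing a fragment of an earlier command turns the prompt into something like this (the rsync command is just a made-up example):

(reverse-i-search)`rsy': rsync -avz ~/data/ backup@server:/backups/data/

Press Enter to run the match as-is, or an arrow key to drop out of the search and edit it first.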

Another alternative is to call history and then grep its output (grep is a filter: it will just show lines that match):

history | grep 'what you want to find'

So, for interactive searches: CTRL+R. To reconstruct what you’ve done (e.g. for a tutorial), history, or maybe combined with grep.

I don’t quite understand how I have survived without it for so long, but it also proves that you don’t need advanced command line superskills to survive in life 😛

With thanks to Jan Lehnardt and Reza Akhavan for the tips!

Fixing VirtualBox guests losing access to the network when the host has been suspended

I am running VirtualBox guest instances of Linux on a Mac OS X host. Sometimes when I open the laptop lid I find that the guest instances are unable to connect to the network: connections simply time out!

The best solution I've found so far is to disconnect and reconnect the network interface in the guest, by right-clicking the little network icon in its window and selecting "Connect Network Adapter…" twice (once to disconnect, once to reconnect). Then the network comes back to normal.
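If you'd rather do it from the host's terminal, VBoxManage can toggle the virtual cable too. A sketch, where "My Linux VM" stands for whatever the guest is called:

VBoxManage list runningvms                              # find the exact VM name
VBoxManage controlvm "My Linux VM" setlinkstate1 off    # unplug the first network adapter
VBoxManage controlvm "My Linux VM" setlinkstate1 on     # plug it back in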

I'm guessing that perhaps the VirtualBox daemon, or whatever it uses on Mac, gets killed when the host is suspended, but I've no idea how to restart it (and no will to dig it out). Other solutions could be closing and reopening VirtualBox entirely, or even restarting the laptop! But those are way more obtrusive, and this is faster. Like a gentle 'turn it off and on again' 😉


Migrating to a new laptop (or: Apple-inflicted misery, once again)

Yesterday I got my new laptop and the technician’s idea was to just migrate all my settings and stuff over from the old one for simplicity, using Mac OS X’s built-in migration assistant.

I actually didn’t want to do this because I liked the notion of a clean slate to get rid of old cruft that I didn’t want anymore, but I thought I would give the migration assistant the benefit of the doubt.

TL;DR: it doesn't seem to be ready for migrating a laptop that has seen intensive use and holds plenty of small files (think: the contents of node_modules) as well as big ones (think: screencasts).

The new laptop is one of those ultra light MacBooks with a USB-C connector, so it doesn’t have an Ethernet connector to start with unless you add one via the semidock.

The initial attempt was to migrate the data over the wireless network. After three hours with the progress barely moving from "29 hours" to "28 hours", I gave up and started reaching for the Thunderbolt to Ethernet adapters. We stopped the process and connected both computers to the same switch with Ethernet cables. The estimate was now 4 hours. MUCH BETTER.

I calculated that it would be done at about 20h… so I just kept working on my desktop computer. I had a ton of emails to reply to, so it was OK not to use my normal environment (you can't use the computer while it's being migrated).

A little bit before 20h I looked at the screen and saw “3 minutes to finish copying your documents” and I got all stupidly excited. Things were going according to plan! Yay! So I started to get ready to leave the office.

Next time I looked at the screen it said something way less encouraging: “Copying Applications… 2 hours, 30 minutes left”

I was definitely not going to wait until 22:30… or even later, because the estimate kept going up: 2 hours 40 minutes now, then 3 hours… I decided to go home, not without wondering whether the developer from this classic XKCD cartoon was working at Apple these days:

Remaining time: Fifteen minutes... Six days... thirty seconds


Today I accidentally slept in (thanks, jetlag), and when I arrived at the office, all full of hope and optimism, I found the screen stuck at "359 hours, 44 minutes left".

I turned around to Francisco and asked him: “hey, how many days is 359 hours?” He opened up the calculator and quickly found out.

Almost 15 days.

And 44 minutes, of course.

I gave the migration “assistant” some more benefit of the doubt and went for lunch. When I came back it was still stuck, so it was time to disregard this “assistance” and call rsync into action.

  • I enabled SSH on my old laptop (System Preferences – Sharing – Remote Login)
  • Created an SSH key on my new laptop so I didn't have to type the old one's password each time. I then added the new key to ssh-agent (otherwise ssh doesn't even try to use it when connecting to remote hosts) and appended the new public key to the authorized_keys file on the old computer; the GitHub instructions for generating keys explain this very well. There's a rough sketch of these commands right after this list.
  • rsync was already installed on both computers (it ships with Mac OS X), though you can also install a newer version with brew
  • then to copy entire directories I would use something like rsync -avz --exclude '.DS_Store' sole@oldcomputer.local:/Users/sole/folderIWantToCopy/ /path/to/folder/parent
  • except when the folder has a space in its name, in which case I had to escape it with a TRIPLE BACKSLASH (the remote path is expanded again by the shell on the other end, so the space still needs to arrive escaped there). For example, to copy the VirtualBox VMs folder: rsync -avz --exclude '.DS_Store' sole@oldcomputer.local:/Users/sole/VirtualBox\\\ VMs/ .
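For reference, the key setup from the second bullet looks roughly like this (a sketch: the comment string is just a label, and the key path is the default one ssh-keygen offers):

ssh-keygen -t rsa -b 4096 -C "new laptop"      # generate the key pair on the new laptop
eval "$(ssh-agent -s)"                         # make sure the agent is running
ssh-add ~/.ssh/id_rsa                          # hand the key to the agent
cat ~/.ssh/id_rsa.pub | ssh sole@oldcomputer.local 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'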

I have most of my stuff in a ~/data directory, so migrating between computers should be easy, by just copying that folder.

Whatever wasn’t, I copied manually. For example, the Google Chrome and Chrome Canary profiles, because I didn’t want to set them up from scratch–you can copy them and keep some of the history without having to sign into Google (some of my profiles just don’t have an associated Google ID, thank you very much). Unfortunately things such as cookies are not preserved, so I need to log into websites again. Urgh, passwords.

cd ~/Library/Application\ Support
# -p has to come before the directory names: the BSD mkdir on OS X stops reading options at the first path
mkdir -p Google/Chrome
mkdir -p Google/Chrome\ Canary
rsync -avz --exclude '.DS_Store' sole@oldcomputer.local:/Users/sole/Library/Application\\\ Support/Google/Chrome/ ./Google/Chrome/
rsync -avz --exclude '.DS_Store' sole@oldcomputer.local:/Users/sole/Library/Application\\\ Support/Google/Chrome\\\ Canary/ ./Google/Chrome\ Canary/

I also copied the Thunderbird profiles. They are in ~/Library/Thunderbird. That way I avoided setting up my email accounts, and also my custom local rules.
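Following the same rsync pattern as with Chrome, copying them over would look something like this (assuming it's run from inside ~/Library on the new machine):

cd ~/Library
rsync -avz --exclude '.DS_Store' sole@oldcomputer.local:/Users/sole/Library/Thunderbird/ ./Thunderbird/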

I logged into my Firefox account in Nightly and it just synced and picked up my history, bookmarks, saved passwords and stuff, so I didn’t even need to bother copying the Firefox profiles. It’s very smooth! You should try it too.

Note that I did all this copying before even downloading and running the apps, to avoid them creating a default profile on their first start.

While things were copying I had a look at the list of apps I had installed and carefully selected which ones I actually wanted to re-install. Some of them I installed using homebrew, others using the always-awkward, iTunesque in spirit and behaviour, App Store. Of note: Xcode has spent the whole afternoon installing.

I also took this chance to install nvm instead of just node stable, so I can experiment with various versions of node. Maybe. I guess. We’ll see if it’s more of a mess than not!
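Once nvm is in place, juggling node versions looks roughly like this (a sketch; 'stable' and 0.10 are just example version labels):

nvm install stable      # install the latest stable node
nvm install 0.10        # plus an older one to test against
nvm ls                  # list what's installed
nvm use stable          # switch the current shell to it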

In short, it’s now almost midnight but I’m done. I started copying things at 17h, and had a few breaks to do things such as laundry, dishwasher, tidying up my flat, grocery shopping, and preparing and eating dinner, so let’s imagine it just took me 4 hours to actually copy the data I was interested in.

Moral of the story: rsync all the things. Don’t trust Apple to know better than you.

npmoffline: installing npm packages from the cache

npm has a feature where you can ask it to install packages from the cache: cache-min is the number of seconds a cached package is considered fresh, so a very large value makes npm use what's already in the cache instead of hitting the registry:

npm --cache-min 9999999 install <package-name>

This works, but I’m never going to remember that syntax, so I added an alias to my .bashrc file:

alias npmoffline="npm --cache-min 9999999 "

So now when I’m offline on a plane and want to install a package that I’ve already installed in the past (and so I know is in the cache), I can write this:

npmoffline install <package-I-already-installed>

and it will pull the contents from my cache.

Yayyy!

If it doesn’t work you can also list the contents of the cache with

npm cache ls

and see what packages and versions have been cached. Perhaps you can also grep it, to discard the packages you’re not interested in, e.g. the following will only list entries related to node-firefox:

npm cache ls | grep node-firefox

Loading webcomponents-lite with require()

I just realised that the Web Components polyfills are not only on npm, so you can install them like this:

npm install --save webcomponents-lite

but they also have a well-formed package.json with a main entry.

So if you're writing your front-end code with Browserify and want to load the polyfill without adding an additional script tag, you can do this (the module name is the same as the npm package above):

require('webcomponents-lite');

and this pulls the polyfill into the scope.

NICE! Thanks, Addy 🙂

PS I guess this should also work with webpack, if you’re so inclined.