Saturday, November 7, 2009

A Few Tips for Running Ubuntu 9.10 Karmic Koala on the HP Mini 1000

I have now been running Ubuntu 9.10 on my HP Mini 1000 for more than a week and I'm pretty happy with it. In addition to my last post about getting the wireless to work, I thought I'd share a few more tricks I learnt in the past week.


Ethernet

Out of the box, the Ethernet port only seems to work when the cable is plugged in before the system boots, and unplugging the cable locks up the whole machine. To fix this, open /etc/default/grub (it used to be /boot/grub/menu.lst in GRUB 1, but Ubuntu 9.10 ships with GRUB 2) in your favourite text editor and locate the following line:

GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"

Append acpi_os_name=Linux to it, so it now looks like this:

GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi_os_name=Linux"

Save the file and run:

sudo update-grub

After rebooting the machine, the Ethernet should work properly.
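Once the machine is back up, you can sanity-check that the new kernel parameter actually took effect (the exact quiet/splash flags will vary by setup):

```shell
# the running kernel's boot command line should now include the new option
grep acpi_os_name /proc/cmdline
```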

External Monitor

To make an external monitor work properly, you have to disable visual effects by setting System -> Appearance -> Visual Effects to None. Otherwise the machine freezes every time I connect an external LCD monitor.

Firefox Cache Directory

My HP Mini 1000 has a 16GB SSD, which can wear out quite quickly if written to repeatedly, and web browsers such as Firefox write cache data to disk quite frequently. Therefore, I prefer pointing the Firefox cache folder at the RAM disk to make my SSD last longer.

To change the cache folder, enter about:config in the Firefox address bar and click the "I'll be careful, I promise" button. Create a new string value by right-clicking on the list and selecting "New -> String". Name the new value browser.cache.disk.parent_directory and set it to /dev/shm/&lt;your folder name&gt;. The folder name does not matter; /dev/shm is the RAM disk folder created by Ubuntu.
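Before changing the setting, it's worth confirming that /dev/shm really is a RAM-backed tmpfs mount and creating the folder by hand (the name "ffcache" below is just an example; use whatever folder name you entered in about:config):

```shell
# /dev/shm shows up as tmpfs on a stock Ubuntu install
mount | grep /dev/shm

# anything created here lives in RAM and disappears on reboot
mkdir -p /dev/shm/ffcache
```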

Update: Corrected the grub file name and location, thanks guys

Saturday, October 31, 2009

Getting HP Mini 1000 Wireless to Work Under Ubuntu 9.10 Karmic Koala Netbook Remix

I installed Ubuntu 9.10 Netbook Remix on my (actually my wife's) HP Mini 1000 this afternoon. To my surprise, the wireless card did not work. Also, when I looked at System -> Administration -> Hardware Drivers, the list was blank.

After a few hours of googling and reading through several not-too-helpful forum posts, I learned that this was caused by Ubuntu 9.10 shipping the "b43" driver out of the box, which does not work for the HP Mini 1000. The proprietary "wl" driver should be used instead. However, no one said exactly what I needed to do to fix the problem.

Eventually, I decided to just launch Synaptic and search for "broadcom". The first result in the filtered list was bcmwl-kernel-source, which looked promising, so I went ahead and installed it.

When I had a look at the /etc/modprobe.d folder after the installation finished, I noticed that the package had already created a blacklist file for the "b43" related modules. After a reboot, my wireless card just worked.
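If you prefer the command line, the same install can be done with apt-get, and you can see the blacklist entries the package drops into /etc/modprobe.d (the exact file name may vary between package versions):

```shell
# install the proprietary Broadcom "wl" driver package
sudo apt-get install bcmwl-kernel-source

# the package ships blacklist entries for the conflicting b43 modules
grep -r b43 /etc/modprobe.d/
```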

Tuesday, August 18, 2009

VirtualBox: Folder Sharing between Windows Host and Linux Guest

To start, you need to install the guest additions. I use Ubuntu Server 9.04, so I followed the steps outlined in this post.

  1. Install the necessary tools for building kernel modules:
    sudo apt-get install build-essential linux-headers-`uname -r`
  2. Next, click on the menu "Devices" -> "Install Guest Additions...", then mount the guest additions CD-ROM within the Linux virtual machine:
    mount /dev/cdrom /media/cdrom0
  3. Finally, run the installation script:
    sudo ./

If you do not have an X server installed, you may see a warning that the X driver will not be installed. Ignore this message; now we are ready to configure the shared folder.

  1. Create a folder for sharing on the host Windows machine (say C:\Shared).
  2. Click on the menu "Devices" -> "Shared Folders..."
  3. Add a new machine level shared folder by clicking on the + icon on the right
  4. Enter "C:\Shared" for Folder Path and "shared" for the Folder Name.
  5. In the Linux guest VM, create a folder /opt/shared and add the following line to the bottom of /etc/fstab:
    shared    /opt/shared    vboxsf    defaults    0   0
  6. Reboot the VM
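If you'd rather not reboot, the share can usually be mounted by hand using the vboxsf filesystem type (assuming the guest additions kernel module loaded correctly):

```shell
# mount the VirtualBox share named "shared" onto /opt/shared
sudo mkdir -p /opt/shared
sudo mount -t vboxsf shared /opt/shared
```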

Monday, August 17, 2009

Resolve Linux Hostname from Windows

The simplest solution is to install samba:

sudo apt-get install samba

According to this post, this works because Samba implements the NBT (NetBIOS over TCP/IP) protocol, which broadcasts the Linux machine's hostname on the network. Microsoft Windows understands the NetBIOS protocol, so it picks up the Linux machine's hostname.

Saturday, August 15, 2009

Fix Touchpad Scroll Area in Xubuntu 9.04

I have a six-year-old Compaq Presario X1000 that I use occasionally for web browsing and programming. I installed Xubuntu 9.04 on it the other day and found that the scroll area is too wide for the touchpad.

Following my usual troubleshooting procedure, I googled around for answers and found pieces of information spread across several websites.

Step 1 - Enable SHMConfig

First of all, we need to enable SHMConfig in order to determine the correct scroll area width with synclient. I am pretty sure I used to enable this from the X11 configuration file; however, in Ubuntu 9.04 based distributions you are supposed to create a HAL fdi file for this.

sudo vi /etc/hal/fdi/policy/touchpad.fdi

Put this into the file

<?xml version="1.0" encoding="UTF-8"?>
<deviceinfo version="0.2">
    <match key="input.x11_driver" string="synaptics">
      <merge key="input.x11_options.SHMConfig" type="string">True</merge>
    </match>
</deviceinfo>

Save and close the file then reboot the machine. You can read more about this topic on Ubuntu community wiki.

Step 2 - Determine the correct scroll area with synclient

synclient helps you determine the correct scroll area offset and is pre-installed on my Xubuntu. Run it in monitoring mode:


synclient -m 1

Now if you touch the touchpad, you will see the coordinates of your finger in the following format:

 time     x     y ...
0.000  5468  3475 ...

Once you have determined the correct boundary of the scroll area, you can test it with:

synclient RightEdge=<your value>

On my laptop, the x-coordinate is 5942 when I move my finger to the right edge of the mouse movement area, and it jumps to 8176 as soon as I touch the scroll area. So I tried:

synclient RightEdge=8175

but for some reason it disabled the scroll area altogether. After some head scratching, I found out that I had to use the x-coordinate right before the scroll area (i.e. 5942).

Final step

After figuring out your RightEdge value, you have to make it permanent by adding it to the touchpad.fdi file created earlier. In my case:

<merge key="input.x11_options.RightEdge" type="string">5942</merge>

So the touchpad.fdi file now looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<deviceinfo version="0.2">
    <match key="input.x11_driver" string="synaptics">
      <merge key="input.x11_options.SHMConfig" type="string">True</merge>
      <merge key="input.x11_options.RightEdge" type="string">5942</merge>
    </match>
</deviceinfo>

Save and close the file, then reboot; your touchpad scrolling should now work correctly.

Sunday, August 9, 2009

Installing Firefox 3.5 on Ubuntu 9.04 Jaunty Jackalope

Firefox is slow on Linux compared to XP or even Vista on the same hardware. Do a quick Google search for "firefox slow linux" and you'll see it's not just me saying this.

I tried a couple of tips from other users in discussion forums, such as disabling visual effects and IPv6, but they didn't really help. That's why I started thinking about installing Firefox 3.5 on my Ubuntu 9.04 64-bit machine to see if it makes a difference.

At first I thought I would have to download it directly from the Firefox website; however, to my surprise I found the package in Synaptic. I am not sure which repository it came from, but I can see the package "firefox-3.5" in the Synaptic package manager.
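For reference, the same install from the command line (assuming the package name "firefox-3.5" that appears in Synaptic):

```shell
sudo apt-get install firefox-3.5
```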

I went through the usual installation process and, interestingly, it did not prompt me to remove the existing firefox-3.0 packages. Instead, it installs Firefox 3.5 alongside Firefox 3.0 under the name Shiretoko Web Browser, in Applications -> Internet -> Shiretoko Web Browser.

It could just be wishful thinking, but Firefox 3.5 does seem to be a bit faster than 3.0.

Thursday, June 25, 2009

Load Testing ASP.NET Sites with JMeter

Following my previous post about using JMeter to test MOSS, I tried to figure out the bare minimum requirements for using JMeter against a plain ASP.NET website.

I wrote a very simple ASP.NET web application with just a button, a text field and a static label. The application displays the content of a text file in the static label when it loads, and writes the content of the text field back to the file when the button is clicked.


I found that all I needed to do to script this with JMeter was to extract the __VIEWSTATE and __EVENTVALIDATION fields and send them back in the update request. My JMeter test plan looks like this:





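In case it helps, the two Regular Expression Extractor patterns for these fields are the same ones I used in the MOSS post:

```
id="__VIEWSTATE" value="(.+?)"
id="__EVENTVALIDATION" value="(.+?)"
```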
Tuesday, June 23, 2009

Load Testing SharePoint (MOSS) Sites with JMeter

I have used JMeter to load test a few non-ASP.NET websites before; however, I could not get it to work with ASP.NET websites. This is mainly due to ASP.NET ViewState and event validation, which stop a recorded JMeter script from being played back.

Recently I worked on a MOSS project and we were looking for tools to perform load testing on the server. Many people say the load testing tool in Microsoft Team System for Testers works well with MOSS. However, it is quite expensive, so I decided to give JMeter another go. After several hours of hacking, I actually got it to work, and here's how I did it.

My test page is the pretty standard MOSS edit document property screen with a few extra text fields added, and the goal is to use a JMeter script to change the document properties. Once I have a working script, I can configure JMeter to fire hundreds of instances of it simultaneously to simulate user workload.


As shown in the screenshot below, the test plan contains two HTTP requests recorded using the JMeter HTTP Proxy component, with four JMeter Regular Expression Extractors in between:


The main trick here is to capture four key MOSS fields from the HTTP response of the "Load Edit Property Form" request and send them back to the server AS IS in the "Submit Edit Property Form" request, along with the new property values. These key fields are:

  1. __EVENTVALIDATION
  2. __VIEWSTATE
  3. __REQUESTDIGEST
  4. <control ID>_owshiddenversion

Load Edit Property Form

The Load Edit Property Form step is a simple JMeter HTTP Sampler generated by running the JMeter HTTP Proxy component and recording the HTTP request created by clicking the Edit Properties menu item in the SharePoint drop-down.


The response from the MOSS server contains the four key fields, which can be captured using JMeter Regular Expression Extractors.

Extract Event Validation

Regular Expression:

id="__EVENTVALIDATION" value="(.+?)"



Extract View State

Regular Expression:

id="__VIEWSTATE" value="(.+?)"



Extract Request Digest

Regular Expression:

id="__REQUESTDIGEST" value="(.+?)"



Extract Hidden Version

This one is not as straightforward as the other three, because the <control ID> changes from page to page depending on the layout. It can be found by searching for the string "owshiddenversion" in the HTML source of the edit property page. The HTML tag should look something like this:

<input id="ctl00_m_g_026c19e0_cd4b_48c9_a4b3_9e7409f252ac_ctl00_ctl02_ctl00_ctl05_ctl00_owshiddenversion" type="hidden" value="15" name="ctl00$m$g_026c19e0_cd4b_48c9_a4b3_9e7409f252ac$ctl00$ctl02$ctl00$ctl05$ctl00$owshiddenversion" />

Hence, the regular expression in this case is:

id="ctl00_m_g_026c19e0_cd4b_48c9_a4b3_9e7409f252ac_ctl00_ctl02_ctl00_ctl05_ctl00_owshiddenversion" value="(.+?)"

and the screenshot:


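Incidentally, if you'd rather not hard-code the control ID (since it changes from page to page), a looser pattern along these lines should also match the owshiddenversion field; this is an untested sketch:

```
id="[^"]*owshiddenversion" value="(.+?)"
```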
Submit Edit Property Form

The “Submit Edit Property Form” JMeter HTTP sampler is generated by recording the Submit button click using the JMeter HTTP Proxy component. JMeter displays all POST parameters contained in this request.


The Reference Names of the four key fields captured previously (e.g. ${viewState}, as shown in the screenshot above) need to be entered into their corresponding parameter value fields. My test script also updates other parameters, such as author and title, for testing purposes.

Monday, June 15, 2009

Installing VMware Server 2.0.1 on Ubuntu 9.04

I use Ubuntu on a daily basis and I can live without Windows most of the time. However, once in a while I need Windows for things like accessing the iTunes store and editing Microsoft Word documents (yes I know works but it messes up the styles from time to time). I created a Windows VM image under VirtualBox, which worked quite well except for the constant high CPU load mentioned by several other users. I tried their suggestions, such as using nohz=off and turning off ACPI, which did reduce the CPU load to 50% on one core but didn't get rid of the problem completely.

I eventually decided to give the free VMware Server a try. Setting it up on Windows was pretty simple: just double-click and click through the wizard. However, installing it on Ubuntu 9.04 is a non-trivial task and took me a little while to complete.
  1. Launch the Synaptic package manager and make sure you have linux-headers, linux-source, gcc and binutils installed.
  2. Download the .tar.gz package from the VMware web site (VMware-server-2.0.1-156745.x86_64.tar.gz in my case).
  3. Extract the files, then run the installation script as the super user.
  4. The installer will prompt you with a barrage of questions; I just used the default values for most of them.
  5. Eventually the installer will start compiling and loading kernel modules. Most of them will go through fine, but the VSOCK module installation will fail with the following error message:
    Unable to make a vsock module that can be loaded in the running kernel:
    insmod: error inserting '/tmp/vmware-config0/vsock.o': -1 Unknown symbol in module
    Ignore this insmod error for now and continue to finish the installation.
  6. Apply the following patch, which fixes the insmod error above so we can re-run the configuration script to complete the VSOCK module installation.
    +++ /usr/bin/ 2008-12-01 16:55:59.000000000 +0100
    @@ -4121,6 +4121,11 @@
         return 'no';
    +  if ($name eq 'vsock') {
    +    print wrap("VMWare config patch VSOCK!\n");
    +    system(shell_string($gHelper{'mv'}) . ' -vi ' . shell_string($build_dir . '/../Module.symvers') . ' ' . shell_string($build_dir . '/vsock-only/' ));
    +  }
       print wrap('Building the ' . $name . ' module.' . "\n\n", 0);
       if (system(shell_string($gHelper{'make'}) . ' -C '
                  . shell_string($build_dir . '/' . $name . '-only')
    @@ -4143,6 +4148,12 @@
         if (try_module($name, $build_dir . '/' . $name . '.o', 0, 1)) {
           print wrap('The ' . $name . ' module loads perfectly into the running kernel.'
                      . "\n\n", 0);
    +      if ($name eq 'vmci') {
    +        print wrap("VMWare config patch VMCI!\n");
    +        system(shell_string($gHelper{'cp'}) . ' -vi ' . shell_string($build_dir.'/vmci-only/Module.symvers') . ' ' . shell_string($build_dir . '/../'));
    +      } 
           return 'yes';
  7. Now re-run the configuration script as the super user, and the VSOCK module should compile and install fine.
  8. Now run vmware from the command prompt to launch the VMware UI in your browser. You will see an error message like the one shown below:
  9. Click on "Add Exception", which will pop up another dialog. Click on "Get Certificate" and then "Confirm Security Exception" to add the VMware URL to the exception list.

  10. Finally, you should see the login screen. You need to log in as the root user, but Ubuntu does not set a password for root by default. Therefore, you first need to set one by executing:
    sudo passwd root
    After setting the password, just log in as root and enjoy.

Wednesday, May 27, 2009

Ubuntu 9.04 and NVIDIA Driver

I have been using Ubuntu for quite a while and I love it for its simplicity. I downloaded and installed 9.04 on my machine last night and, to my surprise, I no longer get the nice pop-up prompting me to install the proprietary NVIDIA driver for my NVIDIA 8800 GTS graphics card.

The "nv" driver worked fine, but it doesn't allow me to enable desktop effects. Also, I really want full hardware acceleration so I can do some OpenGL programming. After searching around the Ubuntu package repository, I found several versions of the old familiar nvidia-glx, so I installed the latest one with:
sudo apt-get install nvidia-glx-180
Another surprise was that it no longer updates the xorg.conf file for me, so I ended up running nvidia-xconfig myself:
sudo nvidia-xconfig
This overwrites the xorg.conf file to make sure the X server loads the proprietary "nvidia" driver rather than the open source "nv" one. I wonder what the Ubuntu team was thinking; this definitely feels like a step backward. Were they trying to push people to use the open source "nv" driver?

Tuesday, May 26, 2009

East Coast Trip

I met a former colleague who was born in Hawke's Bay and we had a random chat about work and holidays. That was when I realised that even though I've been living in New Zealand for 12 years, I've never travelled to the east coast of the North Island. Therefore, when my wife said she felt like going on a holiday, I immediately suggested that we travel east.

Day 1

Napier is quite far from Auckland (420 km, ~6 hours' drive according to Google Maps), so we decided to stop by a few other cities along the way. Our first stop was Rotorua, which is famous for its geothermal sites and hot pools. After checking into our hotel, we wandered around the streets trying to find a place for dinner. Eventually, I pulled out my iPhone, browsed to, found a highly rated bar called Pig & Whistle, and located it using the iPhone's GPS and Google Maps. I just can't imagine living without the iPhone and the Internet these days. After dinner, we visited the Polynesian Spa, which is quite famous in New Zealand for its geothermal hot pools.

Day 2

Our plan for the day was to drive from Rotorua to Napier, stopping by the Huka Falls prawn farm and Lake Taupo along the way. We arrived at the prawn farm at around 11am and joined the hatchery tour to see how the prawns are fed. All I can say is that the life of a prawn is sad: you either fight and eat your fellow prawns, or be eaten. We actually saw that in action; a prawn killed another one and ate it.

After the tour we went prawn fishing. We spent an hour sitting in the cold winter air and just couldn't catch any, and gave up in the end because we were so cold and hungry. We had lunch in the prawn farm restaurant, then headed to Taupo. We didn't spend too much time in Taupo; we took a short walk around the lakeside, the museum and the city centre, then headed off to Napier.

The first half of the drive was quite enjoyable: the road was flat and straight, with nice countryside views, and we stopped at a nice little lookout along the way. The second half, however, was terrible. It was all mountain roads and unbelievably foggy. Eventually, we managed to arrive in Napier safely and checked into the hotel at around 5:30pm.

Day 3

The third day was quite a boring one. Firstly, the weather was crap: wet and cold. Secondly, I somehow had a horrible stomach ache and couldn't walk any distance or eat anything. We visited a few lookouts in the morning, but then I slept through the afternoon.

Day 4

My stomach finally recovered after a whole day of rest, and we decided to join the guided Art Deco walking tour in Napier. There was a massive 7.8 earthquake in Napier in the early 1930s, which destroyed most of its commercial district. The city was rebuilt in the most modern architectural style of the period: Art Deco. I really enjoyed the tour, and the guide told very nice stories about the city and its buildings. I would definitely recommend the guided tour to anyone who has a chance to visit Napier.

We had lunch at Cobb & Co in Napier, which had a really nice two-course lunch for just $10, visited the Mission Estate winery, and then we were on our way to Gisborne.

Day 5

According to Wikipedia, Gisborne is the first city to see the sun rise each day, so we decided to get up early to watch the sunrise. We woke up at 6am and I was a little worried that the sun would already be up by the time we reached the beach. Our sacrifice of sleep paid off: the sky was clear and the sunrise was really beautiful.

After the sunrise, we drove to Tolaga Bay, which has the longest concrete wharf in New Zealand; at 660m, it is said to be the longest in the southern hemisphere. That was pretty much the last stop of our east coast trip, and we began heading back home.

We headed off to Tauranga after lunch and it took us around 4 hours to get there. We took a walk on Mt Mauao and had dinner in a Turkish restaurant.

Day 6

It was time to go home; I had been computer-less for 5 days and I really missed it. We stopped by the Karangahake Gorge, which has a long, dark tunnel. We tried to walk through it, but it was just way too dark and scary, so we only spent around 5 minutes inside before heading back to the car.

We arrived in Auckland at around 5pm and, gosh, the weather was miserable.

Saturday, May 16, 2009

Referencing Local Variables in jQuery Callback Functions

One day I was working on a CRUD application containing a lot of form fields and controls, each of which requires a mouse-over event handler to display a tooltip message. After lots of copy'n'pasting, I decided to refactor the repetitive event registration code into a loop.

To test out my idea, I developed a very simple page with just three <div>s:

<div id="div0">Click me</div>
<div id="div1">Click me</div>
<div id="div2">Click me</div>

and a list of messages indexed by the <div>s’ ID:

var messages = new Object();
messages['div0'] = 'hello'; 
messages['div1'] = 'bonjour'; 
messages['div2'] = 'ciao';

When someone clicks on one of the <div>s, I would like to show a popup dialog displaying the message associated with its ID. Since my goal was to eliminate repetitive lines of code, I put the event registration code in a loop:

for(var i=0; i<3; ++i) {
  $("#div" + i).click(function() { 
    alert(messages['div' + i]); 
  });
}

When I tried this out, instead of showing the right message, the alert box always showed 'undefined':


After a lot of head scratching, I finally realised that the problem was with this line of code referencing the loop variable i:

alert(messages['div' + i]);

The computer scientists' way of describing this is that it forms a "closure" over the variable i. Since this line of code sits inside the callback function, the 'div' + i expression wasn't evaluated to 'div0', 'div1' and 'div2' on each loop iteration as I expected. Instead, because each callback function holds a reference to the same variable i, the 'div' + i expression uses the final value of i, which is 3 in this case.

This can be proven by changing the alert statement to show the value of i, which will always be 3 when any of the <div>s is clicked.


I googled around for solutions, but most of them were pretty complicated; eventually I found an answer in the jQuery reference documentation. All jQuery objects have a data() method, which allows you to bind arbitrary data to them. The bound data is instance specific, and the value is evaluated at binding time.

Therefore, to fix my code I just have to change the event registration code to:

for(var i=0; i<3; ++i) {
  $('#div' + i).data('divID', 'div' + i);
  $('#div' + i).click(function() {
    alert(messages[$(this).data('divID')]);
  });
}

So in each iteration of the loop, the values 'div0', 'div1' and 'div2' are stored in the 'divID' data slot of the respective <div> objects, and the callback function uses the stored value to look up which message to display when they are clicked.

The final solution looks like this:

$(function() {
  var messages = new Object();
  messages['div0'] = 'hello'; 
  messages['div1'] = 'bonjour'; 
  messages['div2'] = 'ciao';
  for(var i=0; i<3; ++i) {
    $('#div' + i).data('divID', 'div' + i);
    $('#div' + i).click(function() { 
      alert(messages[$(this).data('divID')]);
    });
  }
});

Tuesday, April 28, 2009

Building a Native 64-bit Boost Library

I started using 64-bit Vista recently, and I thought it made sense to make a native 64-bit build of my favourite C++ library too.

My first attempt was a complete failure, because I naively thought all I had to do was run bjam from the Visual Studio 64-bit Tools Command Prompt. It turned out this only builds the regular 32-bit Boost libraries.

After some googling and reading forum posts, I found some useful information in the Boost.Build documentation. Boost does support building 64-bit targets; what I ended up doing was specifying the architecture and address-model flags when running bjam.

c:\boost_1_38_0>bjam ^
More? --toolset=msvc ^
More? --build-type=complete ^
More? architecture=x86 address-model=64 ^
More? stage

then I installed the library with:

c:\boost_1_38_0>bjam ^
More? --toolset=msvc ^
More? --build-type=complete ^
More? architecture=x86 address-model=64 ^
More? install
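To double-check that the staged libraries really are 64-bit, you can inspect one with dumpbin from the same command prompt; the library file name below is just an example, yours will differ:

```shell
dumpbin /headers stage\lib\libboost_regex-vc90-mt-1_38.lib | findstr machine
```

Each object in a 64-bit library should report the x64 machine type.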

Wednesday, April 22, 2009

Installing the WTL Application Wizard in Visual C++ 2008 Express Edition

I love Google Chrome; it is fast, elegant and beautiful. After I realised it was written using WTL, I felt quite keen to learn about this library.

I downloaded WTL 8.0 from and Visual C++ 2008 Express Edition from only to realise that there is no WTL Wizard support for Visual C++ 2008 Express Edition.

WTL 8.0 ships with a WTL/ATL application wizard, but the setup script only supports Visual C++ 2005 Express Edition (setup80x.js). The good news is that you can make it work with Visual C++ 2008 Express Edition in a few simple steps:

  1. Make a copy of setup80x.js and rename it setup90x.js.
  2. Open setup90x.js and do a global search-and-replace from “8.0” to “9.0”.
  3. Save the file and execute it.

If you are as lucky as I am, you should see a dialog that tells you that the wizard has been successfully installed.


Now, when you run Visual C++ 2008 Express Edition and go to File –> New –> Project…, you should see the new WTL/ATL Application Wizard.


For some reason, when I tried to create a new solution from the wizard, the generated code didn't compile, giving the following error:

stdafx.h(33) : error C2065: '_stdcallthunk' : undeclared identifier

To fix this problem I had to manually add this include:

#include <atlstdthunk.h>

into the generated stdafx.h, right after this line

#include <atlbase.h>

This worked on my machine anyway :)

Monday, April 13, 2009

Updating .config Files from Visual Studio Setup Project

To open Web.config from within a web setup project:

string path = Context.Parameters["assemblypath"];
// strip the assembly file name, then point at Web.config in the same folder
path = path.Substring(0, path.LastIndexOf('\\'));
path = Path.Combine(path, "Web.config");
var map = new ExeConfigurationFileMap();
map.ExeConfigFilename = path;
var config = ConfigurationManager.OpenMappedExeConfiguration(
    map, ConfigurationUserLevel.None);

To open App.config from within a setup project:

var map = new ExeConfigurationFileMap();
map.ExeConfigFilename = 
    Context.Parameters["assemblypath"] + ".config";
var config = ConfigurationManager.OpenMappedExeConfiguration(
        map, ConfigurationUserLevel.None);

To update settings in .config files

// update connection strings
var cs = config.ConnectionStrings;
cs.ConnectionStrings["cs1"].ConnectionString = 
    BuildConnectionString(host, user, pass);

// update app settings
var appSettings = config.AppSettings;
appSettings.Settings["key"].Value = "new value";   

Finally, to save the .config file changes:

config.Save();
Tuesday, March 31, 2009

Skype for iPhone Mini-Review

Skype just released a native client for iPhone earlier this week and I downloaded it as soon as I saw the announcement.

I have been using Nimbuzz for Skype chat for a little while, but it has pretty high latency. I did a quick, non-scientific measurement, and the latency can be as high as 2-3 seconds.

Skype for iPhone, on the other hand, has very low latency and better voice quality. I was a little disappointed to find that I cannot make voice calls over the 3G network. However, with the crappy, slow Vodafone 3G speeds and the monthly 250MB data cap, I don't think I'm missing out on much.

Friday, March 27, 2009

2 Minutes Guide to Twitter

I started using Twitter recently and I really like it. Twitter is basically a website that allows you to micro-blog. I find the idea of micro-blogging fascinating, because a blog takes a lot of time and effort to maintain, whereas writing a micro-blog entry (called a tweet on Twitter) takes just a few seconds.

However, I was quite lost when I first started twittering. I saw many tweets containing strange symbols and had no idea what they meant. The following are the three most widely used Twitter conventions:

  1. @reply – Any tweet starting with @<user name> (e.g. “@oscarkuo is a reply”) will be placed in the reply tab of that user’s Twitter home page.
  2. #tags – Tagging helps organise tweets, just like tagging e-mails in your Gmail account or entries on your blog. The only difference is that #tags work across everyone’s tweets, not just yours. The idea is to put the hash symbol (#) in front of a keyword in a tweet; for example, the update “my #iphone rocks” effectively tags the tweet with the category #iphone. You can then use Twitter’s search or community websites such as Hashtags to track these tags.
  3. RT (ReTweet) – This is basically a way of forwarding a tweet to your Twitter followers. It is not an official Twitter feature, but people place “RT” in front of the message, followed by @<user name>, to indicate that they are forwarding someone else’s tweet (e.g. “RT @oscarkuo iphone rocks”).

Many people have told me that they prefer Facebook status updates over Twitter. To me, however, they serve two different audiences. I tweet about tasty food, funny jokes or good podcasts on Twitter; when I want to talk about more private things, such as work and family, I write Facebook status updates.

Thursday, March 26, 2009

ItemGroup Gotcha

At work, we had been manually deploying the latest build from the build server to the test server for a while. I know this is not smart, but it really doesn't take much time to copy files from the build server to the test server.

Nevertheless, one day I finally got sick of this dumb process and decided to roll up my sleeves and make the build server do the monkey job automatically. I wrote an MSBuild script that automatically copies files to the test server after a build finishes; the script below is a simplified version of it:


<ItemGroup>
  <DllFiles Include="$(OutputFolder)\*.dll" /> 
</ItemGroup>

<!-- App1Sources and App2Sources stand in for the real source file lists -->
<Target Name="BuildAll"> 
  <Csc Sources="@(App1Sources)" OutputAssembly="$(OutputFolder)\App1.dll"
    EmitDebugInformation="True" /> 
  <Csc Sources="@(App2Sources)" OutputAssembly="$(OutputFolder)\App2.dll"
    EmitDebugInformation="True" /> 
</Target>

<Target Name="DeployDlls" DependsOnTargets="BuildAll"> 
  <Copy SourceFiles="@(DllFiles)"
    DestinationFolder="$(DeployFolder)" /> 
</Target>

When the DeployDlls target is invoked with the following command:

msbuild /t:DeployDlls build.proj

It is supposed to call the BuildAll target first due to the dependency, which builds the .cs files, and then the DeployDlls target runs the Copy task to copy the output files to the deploy folder. This seems like a pretty straightforward script, but it has a major bug: it does not work for a clean build.

The problem is that the ItemGroup element is evaluated before any target is invoked. Therefore, when the script is executed against a clean build (i.e. no DLL files in the output folder yet), the DllFiles item group evaluates to an empty list, and nothing is copied after the build finishes.

To fix this problem, the DeployDlls target must be updated to:

<Target Name="DeployDlls" DependsOnTargets="BuildAll">
  <CreateItem Include="$(OutputFolder)\*.dll">
    <Output TaskParameter="Include" ItemName="DllFiles" />
  </CreateItem>
  <Copy SourceFiles="@(DllFiles)"
    DestinationFolder="$(DeployFolder)" />
</Target>

This way, the CreateItem task populates the DllFiles item group after the build has finished, right before the Copy task is executed.

Saturday, March 21, 2009

Using Windows Live Writer with

I just happened to notice today that Windows Live Writer (WLW) actually has built-in support for What’s even better is that, upon entering your blog’s URL and credentials, WLW retrieves other details such as styling and labels from your blog.


Also, when you try to publish a blog post with embedded images, WLW will automatically upload these images to your Picasa web albums. I never really liked’s web based blog editor, because it is a bit too primitive in my opinion.

My Second First Blog Post

After spending a few hours fighting with CSS and tweaking the layout, this blog is finally up. This is actually my second attempt at blogging; my last blog didn't last long, and we'll see how long this one lasts.

In case you're wondering, I based my blog on the 3-column Minima template, which can be found here. As far as I can see, it works fine in IE7, Firefox and Google Chrome.

I have to say I am very impressed with how flexible (read generous) is when it comes to customising the look and feel of the blog.