Wednesday, November 4, 2009

Nasty Oracle ADO bug

Oracle has a nasty ADO bug that affects 10.2.0.x Windows clients, and I believe 11g R1 clients as well.

When you run a query like select N'X' from dual through an ADO recordset, with AL16UTF16 as the national characterset, ADO reports a DefinedSize of 1 and an ActualSize of 2.

That's obviously wrong: DefinedSize is the maximum capacity of the field, and the column comes back as an adVarWChar type. Since the letter 'X' in AL16UTF16 occupies 2 bytes, and 2 bytes is all an adVarWChar needs to store the value, DefinedSize should also be 2.

Now if it were the supplementary Han character 𧉧 (U+27267), both ActualSize and DefinedSize would be 4. That's because in UTF-16 this character is encoded as the surrogate pair 0xD85C 0xDE67, which is 4 bytes long.
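If you want to sanity-check those byte counts outside of ADO, a quick Python sketch (my own, nothing to do with Oracle's client) shows the UTF-16 sizes:

```python
# UTF-16 is the encoding behind Oracle's AL16UTF16 national characterset.
def utf16_len(ch: str) -> int:
    """Number of bytes needed to store ch in UTF-16 (big-endian, no BOM)."""
    return len(ch.encode("utf-16-be"))

print(utf16_len("X"))           # 2 -> a BMP character needs one 16-bit unit
print(utf16_len("\U00027267"))  # 4 -> a supplementary character needs a surrogate pair

# The surrogate pair itself:
print("\U00027267".encode("utf-16-be").hex())  # d85cde67
```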

This particular issue has caused my company a great deal of grief, and it took Oracle a long time to acknowledge there was a problem. However, we have been told that this will be fixed in Windows patch bundle 26 for the Oracle 10g client (bug number is 8301952), and even better I believe that it was fixed in Oracle 11g R2.

Monday, October 19, 2009

Great story

Great story.

Norbert Wiener was perhaps the greatest U.S. mathematician in the first half of the twentieth century, revered among his colleagues for his brilliance. He was also famous for his absent-mindedness.

After a few years at MIT, Norbert Wiener moved to a larger house. His wife, knowing his nature, figured that he would forget his new address and be unable to find his way home after work. So she wrote the address of the new home on a piece of paper which she made him put in his shirt pocket. At lunchtime that day, the professor had an inspiring idea. He pulled the paper out of his pocket and used it to scribble down some calculations. Finding a flaw, he threw the paper away in disgust. At the end of the day he realized he had thrown away his address. He now had no idea where he lived.

Putting his mind to work, he came up with a plan. He would go to his old house and await rescue. His wife would surely realize that he was lost and go to his old house to pick him up. Unfortunately, when he arrived at his old house there was no sign of his wife, only a small girl standing in front of the house. "Excuse me little girl," he said, "but do you happen to know where the people who used to live here moved to?" "It's okay daddy," said the little girl. "Mommy sent me to get you."

P.S. Norbert Wiener's daughter was recently tracked down by a mathematics newsletter. She denies he forgot who she was, but admits he lost the house.


Monday, October 12, 2009

Planet Gnome

You know, I really thought that Planet Gnome was all about developing... Gnome applications, frameworks and infrastructure. Silly me. Instead, it's been taken over by a whole lot of PC crap - the latest of which is that Mark Shuttleworth said that he wants to make Ubuntu such that it's "easier to explain to girls". This really put the cat amongst the pigeons, and they are still commenting about it now.

Honestly, I can understand the controversy over RMS's presentation at the Gran Canaria Desktop Summit, which really was sexist, but it appears that it has sparked a wave of self-righteous indignation amongst Gnome people. Seems quite ridiculous.

Saturday, October 10, 2009

Evil... but totally cool. Debugging apps remotely via IRC

The following is so completely evil but completely cool that I just have to tell people about it.

Basically, you get the GNU Debugger (gdb) and hook it up to IRC.

Of course, you'd want to know who you are talking to. Very insecure.

Wednesday, August 5, 2009

Note to myself

Check the following post for responses:

Update: While somewhat helpful, the Microsoft rep didn't bother looking at my follow-up question and marked their own answer as the solution. Nice going.

New post to follow up on:

Well here's something new...

... a professional sound engineer has switched from his Mac to another platform. "Why would anyone use Windows for sound?" I hear you say. Well, he didn't. In fact, he's switched to... Ubuntu Linux.

Check it out here:

Speaking of sound on Linux, I found a good Intro to this topic here:

Monday, August 3, 2009

Guaranteeing order in views

In SQL Server 2005, using the TOP 100 PERCENT clause with an ORDER BY xxx doesn't guarantee that the results will be ordered by xxx. See this article for more info.

So what to do?

I believe that you can use something like the following syntax:

create view guaranteedOrderView as
select xxx, row_number() over (order by xxx) as OrderNo
from exampleTable

NOTE: I've not tested this assumption. One of the extremely smart developers at my work told me that the optimizer might well... optimize... out this order.

Wednesday, July 22, 2009


Discovered a new feature of Windows today.

If you have a service that seems to be playing up, note down its process ID in Task Manager and run the following to see which services are hosted by that process:

tasklist /svc

Unfortunately, you can have multiple services sharing the one svchost.exe process. If that's the case, then you need to split them off to their own process by doing the following:

sc config <servicename> type= own

You can later make them share the same svchost.exe process by running:

sc config <servicename> type= share

Monday, July 6, 2009

Bah, humbug

Is there anything more pointless than Twitter? Honestly, it's a technology looking for a solution to a problem that nobody experiences. I don't give a damn whether you just flushed the toilet! And I sure as heck don't care if you just connected your bing bong to your flirp flop through a dongle widget over a Facebook status update by using a new RSS twinkle you just wrote. I say summon the fail whale forever and put an end to this insanity!

You kids. Get off my lawn!

Tuesday, June 16, 2009


Can't resist posting another blog post. The following folklore is titled "Close encounters of the Steve kind."
Steve had managed to get Don Knuth, the legendary Stanford professor of computer science, to give a lunchtime lecture to the Mac team. Knuth is the author of at least a dozen books, including the massive and somewhat impenetrable trilogy "The Art of Computer Programming." (For an amusing look at Knuth's heady self-image, see his $2.56 reward program.)

I was sitting in Steve's office when Lynn Takahashi, Steve's assistant, announced Knuth's arrival. Steve bounced out of his chair, bounded over to the door and extended a welcoming hand.

"It's a pleasure to meet you, Professor Knuth," Steve said. "I've read all of your books."

"You're full of shit," Knuth responded.

Apple story

One of the best stories about American-Japanese relations I've heard. Better even than the Crazy People ad, which interestingly enough also involved Sony. Note that it's about how Apple initially needed to get a floppy drive working, and how the Apple engineers had to work around the complete looniness of Steve Jobs:

They hatched an alternative plan to continue to work with Sony surreptitiously, against Steve's wishes. Larry Kenyon was given a Sony drive to interface to the Mac, but he was told to keep it hidden, especially from Steve. Bob and George also arranged meetings with Sony, to discuss the customizations that Apple desired and to hammer out the beginnings of a business deal.

This dual strategy entailed frequent meetings with both Alps and Sony, with the added burden of keeping the Sony meetings secret from Steve. It wasn't that hard to do in Japan, since Steve didn't come along, but it got a little awkward when Sony employees had to visit Cupertino. Sony sent a young engineer named Hide Kamoto to work with Larry Kenyon to spec out the modifications that we required. He was sitting in Larry's cubicle with George Crow when we suddenly heard Steve Jobs's voice as he unexpectedly strode into the software area.

George knew that Steve would wonder who Kamoto-san was if he saw him. Thinking quickly, he immediately tapped Kamoto-san on his shoulder, and spoke hurriedly, pointing at the nearby janitorial closet. "Dozo, quick, hide in this closet. Please! Now!"

Kamoto-san looked confused but he got up from his seat and hurried into the dark janitorial closet. He had to stay there for five minutes or so until Steve departed and the coast was clear.

George and Larry apologized to Kamoto-san for their unusual request. "No problem.", he replied, "But American business practices, they are very strange. Very strange."

Ah Apple. When you released the puck mouse, I guess you hadn't learned anything.

Thursday, June 11, 2009

Hell just froze over

Microsoft and the Linux Foundation just presented a joint letter to the American Law Institute. I think you know you've screwed up when such diametrically opposed viewpoints team up together.

Though I think that CNet said it best - "Finally, Microsoft and the Linux Foundation agree on something. Neither wants to stand behind their products."

Tuesday, June 2, 2009

I think that the fact that my favourite track is the CD laser lens cleaner speaks volumes for the general quality of today's popular music.

Sunday, May 31, 2009

"UAC elevations are not 'security'"

UAC elevations in Windows Vista are not security.

What are you on about, I hear you say? Of course they are!

Actually, according to Mark Russinovich - of sysinternals fame - UAC elevations are not security.
Signature checks serve as proof-of-origin for trust decisions (e.g. installing an ActiveX control) and integrity checks, not as any indication that the software is non-malicious, free from exploitable defects, or not carrying a malicious data payload.

The only code in general checked for signature validity during loading are ActiveX controls, .NET assemblies and device drivers. OS components are not verified except on demand. UAC elevations are not 'security' and the signature verification performed by the consent prompt is intended primarily to encourage ISVs to sign their code.

Thursday, May 28, 2009

Excellent post on SSDs

I seem to be dealing with storage more and more these days. Who would have guessed? Oh well.

Anyway, I found the following excellent article on SSDs.

Tuesday, May 26, 2009

Method 2's a bit of a winner

We got an email today from our IT head, with the subject "Method 2's a bit of a winner."

In the email was a link to an old Microsoft Knowledge Base article.

Firstly, the symptom:

When you try to return data from Microsoft Query 97 to a Microsoft Excel 97 worksheet, the spinning globe icon (which signifies that a query is processing) may appear for a long time, and then the query returns no data to your worksheet.

And now, possible solution 2:

Method 2: Move Your Mouse Pointer

If you move your mouse pointer continuously while the data is being returned to Microsoft Excel, the query may not fail. Do not stop moving the mouse until all the data has been returned to Microsoft Excel.

NOTE: Depending on your query, it may take several minutes to return the results of your query to the worksheet.

Thursday, May 21, 2009

New storage book

I bought a new book entitled "Information Storage and Management", written by EMC experts. I'm up to the second chapter, and it's all very dry so far. However... I think it's going to start getting interesting, because it goes from the basics of how hard disk drives work and how their components fit together to... 2.4 Fundamental Laws Governing Disk Performance, which tells me the following about average queue size:

NQ = N - U
   = a × R - U                  (from eq. 1)
   = a × (RS / (1 - U)) - U     (from eq. 5)
   = (RS / Ra) / (1 - U) - U    (from eq. 3)
   = U / (1 - U) - U            (from eq. 4)
   = U × (1 / (1 - U) - 1)
   = U² / (1 - U)               (6)
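The punchline of equation (6) is that the queue blows up as utilization approaches 100%. A quick numeric sketch (mine, not the book's):

```python
# Average queue size NQ = U^2 / (1 - U), where U is disk utilization.
def avg_queue_size(u: float) -> float:
    return u * u / (1.0 - u)

for u in (0.5, 0.7, 0.9, 0.99):
    print(f"U = {u:.0%}: NQ = {avg_queue_size(u):.2f}")
# At 50% utilization the queue averages 0.5 requests;
# at 99% it averages around 98 - hence the usual advice to keep disks well below saturation.
```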

Whew! Glad I knew that :-)

Sounds like some interesting reading.

Wednesday, May 13, 2009

Oracle Dump() statement

Discovered an interesting Oracle function today. The function is dump, and the syntax is Dump(value [, format]).

This returns the datatype code, the length in bytes, and the raw internal representation of a row's column.

Here are the possible values for the format parameter:
  • 8 returns result in octal notation.

  • 10 returns result in decimal notation.

  • 16 returns result in hexadecimal notation.

  • 17 returns result as single characters.

If you add 1000 to the format value, the characterset name is also returned.

The following website gives some good examples:

dump('Tech') would return 'Typ=96 Len=4: 84,101,99,104'
dump('Tech', 10) would return 'Typ=96 Len=4: 84,101,99,104'
dump('Tech', 16) would return 'Typ=96 Len=4: 54,65,63,68'
dump('Tech', 1016) would return 'Typ=96 Len=4 CharacterSet=US7ASCII: 54,65,63,68'
dump('Tech', 1017) would return 'Typ=96 Len=4 CharacterSet=US7ASCII: T,e,c,h'
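For a plain ASCII string you can reproduce those byte lists yourself. Here's a rough Python imitation (my own mock-up - the "Typ=96" datatype code is Oracle's and isn't reproduced):

```python
def mock_dump(s: str, fmt: int = 10) -> str:
    """Crude imitation of Oracle's DUMP() for an ASCII string.
    fmt 10 = decimal, 16 = hexadecimal (a subset of Oracle's format codes)."""
    data = s.encode("ascii")
    if fmt == 16:
        body = ",".join(format(b, "x") for b in data)
    else:
        body = ",".join(str(b) for b in data)
    return f"Len={len(data)}: {body}"

print(mock_dump("Tech"))      # Len=4: 84,101,99,104
print(mock_dump("Tech", 16))  # Len=4: 54,65,63,68
```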

Tuesday, May 12, 2009

Unicode and Oracle

In my work for a software company, I get a lot of questions and problems relating to Oracle and Unicode. The following series of posts will be a summary of how these two not-particularly-easy-to-use-or-understand components fit together.

Before I get into how Oracle and Unicode fit together, I really feel that it's instructive to understand both what Unicode is and how it came to be. Therefore I'm starting with...

A potted history of the precursors to Unicode

Even before ASCII... or even proper computers

To really understand Unicode, you first need to understand a little about the background to what came before Unicode. The most detailed history I know actually gives the history of ASCII, and doesn't even touch on Unicode, but if you're into that sort of thing (I am!) then it should be interesting.

Basically, the first non-time-based encoding scheme (i.e. not Morse Code) was used in telegraphy and was called the Baudot Code. It was created by Émile Baudot in 1876, and was itself inspired by the Gauss-Weber telegraph alphabet. The code was initially 6 bits, but was later reduced to a 5-bit encoding scheme (32 characters). This patented encoding was the basis of a later code called the Murray Code, invented by Donald Murray, who used it to automatically print telegraphs on a punch-tape printer he had also invented. Murray's code was also a 5-bit encoding, but it introduced control characters we still see today - characters such as Carriage Return (CR), Line Feed (LF), NULL, BLANK and DEL.

Murray's patent was later sold to the Western Union Telegraph Company in 1912. Although it was patented, a number of incompatible variations of his code were in use in other telegraph systems by this time - an issue that Murray himself wasn't too worried about in the beginning. However, as reliable communications became more and more important, governments and businesses started to realise that they needed to standardize the code so that they could communicate more easily with each other.

The French actually work with (most of) the rest of the world

Thus in 1925 the French set up the Comité Consultatif International Télégraphique (CCIT, or in English the "International Telegraph Consultative Committee") for the purpose of creating an internationally recognized standardized encoding. This proved to be no mean feat - especially given that they only had 32 code points to work with! As it turns out, standardizing the coding scheme was particularly difficult because the Russians objected to what everyone else initially agreed on. Rather than bore you with what went on, just understand that eventually two encoding schemes emerged - the International Telegraph Alphabet 1 (ITA-1), which the Russians were particularly fond of, and the International Telegraph Alphabet 2 (ITA-2), which everyone else quite liked.

As you can imagine, with almost everyone but the Russians using ITA-2, use of ITA-1 soon fell by the wayside, and in 1948 ITA-2 was adopted (with reservations) by the United States as their standard for telegraphy. Of course, when that happened it was pretty much all over for ITA-1. What I find interesting is that evidently in those days they were more cautious about adopting standards, as it took 19 years to adopt the ITA-2 encoding - enough time for World War I and World War II to have been and gone for some time!

The Americans take over

Now while it was grand that the world had agreed that ITA-2 was the one everyone should use, 5 bits is really quite limiting, and so it was inevitable that someone would decide the encoding set should be expanded. Thus in the 1960s Captain William F. Luebbert of the U.S. Army Signal Research and Development Lab invented FIELDATA, a 6-bit encoding used extensively by the U.S. Army in its communications. I only really mention FIELDATA because it is historically important: it inspired the committee that invented ASCII. More on ASCII later, though.

In the meantime, another 800-pound gorilla was inventing its own encoding scheme. That gorilla was IBM, who in 1962 created the 6-bit Binary Coded Decimal Interchange Code (BCDIC) for use in their punched card machines. When punched cards gave way to IBM's System/360 mainframes, Charles E. McKenzie of IBM extended the code into what became known as the Extended Binary Coded Decimal Interchange Code (EBCDIC). Of course, it took IBM years to standardize it even across their own machines... but it is still in wide use to this day. In fact, EBCDIC was long the rival of ASCII, but the X3.2 subcommittee rejected it. Don't worry, we'll get to that very shortly!

The birth of ASCII

Given the number of different encodings that had started to proliferate, the American Standards Association (ASA, now known as ANSI) decided that it was time to form a new international standard to represent characters. Thus on June 17, 1963 the ASA's X3 Committee for Computer and Processing Standards formed the X3.2 Subcommittee for Coded Character Standards and Data Formats, within which was formed a core group of nerds known as the X3.2.4 taskgroup. To cut a long story short, our small band of heroes spent so long arguing over such things as how many characters they should represent, whether to use control characters, and in what positions the characters should sit in the code chart that they never met any girls and thus never produced any offspring. Whether that was because they were off the nerd Richter scale, or because ASCII ruined them, is a point hotly debated. Whatever it was, we shall never have a generation of men like them again. The important thing here is that after all that incredibly dull discussion, you would have thought that the resulting standard included such innovations as lowercase letters, umlauts and grave accents. Strangely, this was not to be, and the world was initially stuck with a bunch of uppercase characters, numbers, punctuation symbols and a lot of obscure control characters.

Incredibly, it took another four years for the X3.2.4 working group to decide on a final 7-bit encoding scheme. This time, however, someone must have told them that nobody wants to COMMUNICATE BY SHOUTING. It was only at this point that the members got a clue and decided that the world should actually be allowed to write electronically with proper punctuation and lowercase letters. Incidentally, it appears that they didn't keep their controversies in-house, and managed to anger another set of (albeit cooler) nerds. What happened was that they were just about to release the final standard that just about every standards body in the world had approved - including ECMA and ISO - when who should come along but the president of the IBM user group known as SHARE. For their troubles, the X3.2.4 working group was broadsided by a vitriolic letter in which the SHARE president threatened that the ASA could go to hell unless changes were made to the draft standard. If they were not, he warned, the programmers of the world would create their own competing standard and ASCII as we know it would be in peril due to a lack of adoption. Faced with this unpalatable situation, the X3.2.4 working group decided to pull the wool over the SHARE president's eyes by moving a few characters into different positions and changing the form of a few others - for instance, they added a break in the middle of the "|" character (have a look at your current keyboard to see what I'm talking about).

Thus in 1967, with SHARE suitably triumphant and mollified - not to mention feeling very smug about forcing changes to a standard that every country in the world had agreed to - the ASA officially released the American Standard Code for Information Interchange (ASCII), or more formally X3.4-1967. This was actually a joint release with the European Computer Manufacturers Association (ECMA) and the International Standards Organization (ISO). The ECMA released ASCII as ECMA-6, and ISO released a slightly modified version of ASCII as ISO-646. The ISO standard differs from ASCII in that it replaced the dollar sign with the international currency sign, ¤. Personally, I think that while this seems a good idea in theory, once you realise that currencies fluctuate against each other you start to understand that unless you show the actual symbol of the currency being used, you could either make quite a bit of money or lose quite a bit of money. Possibly this is why nobody has ever heard of the universal symbol of currency, which ironically is not used universally. What was the ISO thinking?

As an aside, you'd think that after all that careful consideration and endless argument amongst nations (and even within nations - IBM's EBCDIC was categorically rejected by the X3.2.4 committee) people wouldn't want to fiddle with the standardized characterset. But no, what we got was ATASCII, PETSCII, the ZX Spectrum characterset, Galaksija and YUSCII. Some people are never happier than when they are buggering up a perfectly good standard.

8-bits of confusion - Extended ASCII

It soon became apparent to almost everyone that while 7 bits certainly gave everyone a lot more characters to play with, it wasn't really enough to represent even a fraction of the world's characters. Computers by this stage all used the byte as the smallest addressable unit anyway, so there was really no need to represent characters with only 7 bits. Also, by this time the IBM PC had been unleashed on an unsuspecting public, and with it came an 8-bit extended ASCII variant called PC-US (sometimes also known as OEM-US or DOS-US). This extended ASCII characterset was burnt into the ROM of every IBM and IBM-compatible PC that was sold, which obviously made it very popular, especially as it included characters that let you do things like this:


(The box would be even sweeter if Blogger honoured non-breaking spaces).

That one extra bit allowed for a whopping 256 characters, 128 more than were available before. The rest of the world soon cottoned on, and a large number of extended ASCII variants appeared; these included two versions of the Greek alphabet, many variants for the Cyrillic languages, Arabic, Chinese, etc, etc, etc. Of course, they all used the same code points to represent their various languages, so eventually IBM organized these into what are now known as code pages - the original being code page 437. Soon thereafter, Microsoft made it big with Windows and decided to add to the confusion with their own set of code pages.
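You can still see this ambiguity today, because every byte above 0x7F means something different in each code page. A small Python illustration (my example, using Python's bundled codecs):

```python
# One byte, three code pages, three different characters.
raw = bytes([0xE9])

print(raw.decode("latin-1"))  # 'é' - ISO 8859-1, Western European
print(raw.decode("cp437"))    # 'Θ' - the original IBM PC code page
print(raw.decode("cp1251"))   # 'й' - Windows Cyrillic
```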

For a while, confusion reigned. In particular, BASIC programmers were annoyed because when they switched their graphics card to another mode that used a non-437 code page their sweet graphics would turn into a big mess of characters. And you never want to make a BASIC programmer angry...

ISO to the rescue!

Evidently realising that angry BASIC programmers aren't a good thing, ISO (now known as ISO/IEC) decided to remedy the situation. Unfortunately for the BASIC programmers, this is the same organization that nearly caused financial chaos by replacing the dollar sign with the universal currency symbol, so they didn't get any ANSI graphics symbols. And hence BASIC died, giving birth to the hell-spawn known as Visual BASIC... sorry, I digress.

Basically, ISO/IEC sat down and decided to use the 8th bit of extended-ASCII to form a proper universal standard. Thus was born the ISO 8859 charactersets, of which there were eventually 15 different versions.

  • ISO 8859-1; this is the most well known. It's also called Latin-1, and you'll often see reference to it in databases and in formats such as MIME encoded emails. It covers characters from most Western European languages.
  • ISO 8859-2; (Latin-2) covers characters from Central and Eastern Europe
  • ISO 8859-3; (Latin-3) covers Esperanto and Maltese
  • ISO 8859-4; (Latin-4) covers Baltic languages
  • ISO 8859-5; (Cyrillic) covers Bulgarian, Byelorussian, Macedonian, Russian, and Serbian languages
  • ISO 8859-6; (Arabic) lacks the extra letters needed for Persian and Urdu
  • ISO 8859-7; (Greek)
  • ISO 8859-8; (Hebrew)
  • ISO 8859-9; (Latin-5) has Turkish characters
  • ISO 8859-10; (Latin-6) covers Nordic languages
  • ISO 8859-11; (Thai)
  • ISO 8859-12; sorry, those who use Devanagari missed out - this never eventuated.
  • ISO 8859-13; (Latin-7) covers languages written in the Baltic Rim
  • ISO 8859-14; (Latin-8) covers Celtic languages
  • ISO 8859-15; (Latin-9) Latin-1 on steroids; includes the Euro symbol and a few other obscure characters
Now there are only really two places I know of where you can get a list of ISO 8859 character maps. The first is the following Debian page, and of course the second is Wikipedia which not only lets you copy and paste the characters but gives you an excruciating amount of info on the charactersets. Sort of like this blog post, only without the sarcasm.
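Since all the 8859 parts reuse the same upper 128 code points, a byte stream is meaningless unless you know which part it was written in. A quick Python illustration (my example, not from the standard):

```python
# The same high byte is a different letter in each ISO 8859 part.
b = bytes([0xE1])

print(b.decode("iso8859-1"))  # 'á' - Latin-1
print(b.decode("iso8859-7"))  # 'α' - Greek
print(b.decode("iso8859-5"))  # 'с' - Cyrillic
```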

Yet more charactersets

Of course, this now meant that there were more charactersets than you could poke a stick at. In actual fact there were even more than this, because ISO-646 had dozens of national variations. Not only that, but there appeared another class of characterset encodings called Double-Byte Character Sets (DBCS). DBCSes were the real precursor to Unicode in my mind, because they were the first to attempt to use more than one byte to represent characters. You can read more about DBCSes on Wikipedia. While you are there, if you are interested, it's worthwhile reading about another ISO/IEC standard - ISO/IEC 2022 - which uses variable-sized, escape-driven encoding to represent characters and is mainly used for East Asian languages.
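To see what that escape-driven encoding looks like in practice, here's a small Python sketch using the iso2022_jp codec (my example; ASCII passes through unchanged, while Japanese text is bracketed by set-switching escape sequences):

```python
# ISO/IEC 2022 switches charactersets mid-stream using escape sequences.
text = "abc \u65e5\u672c"          # "abc" followed by the kanji for "Japan"
encoded = text.encode("iso2022_jp")

print(encoded)
# The ASCII prefix is unchanged; the kanji are wrapped in ESC $ B ... ESC ( B,
# which switch into JIS X 0208 and back to ASCII.
```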

More to come later

Well, I'll write more later, as it's quite late here and writing about the precursors to Unicode is actually quite tiring.

Friday, May 8, 2009

Finding what font renders Unicode characters in the PUA

Unicode 5 allows for 137,468 private-use codepoints. These codepoints are reserved for private organizations to do with what they will - which means that no characters have been officially assigned to them.

In practice, this means that you need to have some way of knowing where the document with the embedded Unicode character came from. This is normally left up to a high-level protocol to negotiate how to render the character.

A great example is the codepoint U+F8FF, which is used by a number of fonts. According to Wikipedia, it renders as the Apple Computer logo when Apple fonts are installed, as the Microsoft logo when using Wingdings, and as a euro symbol in the Luxi fonts (designed for the X Window System). There are a heck of a lot of other fonts that also use this codepoint.
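If you just want to know whether a codepoint is private-use at all (before you go hunting for fonts), Python's unicodedata module can tell you; the Unicode general category for private-use characters is "Co":

```python
import unicodedata

def is_private_use(ch: str) -> bool:
    """True if the character's Unicode general category is Co (private use)."""
    return unicodedata.category(ch) == "Co"

print(is_private_use("\uF8FF"))      # True  - the BMP Private Use Area
print(is_private_use("\U00100084"))  # True  - plane 16 (PUA-B)
print(is_private_use("X"))           # False
```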

And that was my challenge the other day. I'm currently troubleshooting a particularly tricky Unicode issue, and I wanted to see what rendered U+100084 - but finding which fonts use this codepoint was a bit tricky. Then I discovered a site with a Unicode font list and a Unicode font search tool. I had to use the local file list, and for some reason my PC had Palatino Linotype installed on it - a very expensive font. However, it was the only font that rendered the character. Therefore, I came to the conclusion that the document came from a PDF, which had the font embedded.

Very useful!

P.S. Please, please, if you are thinking of using a font that requires a private use codepoint, do everyone a favour. Don't.

Friday, March 6, 2009

Pronouns - oh my!

Recently I've been editing a lot of technical articles. Poor grammar and the use of passive voice have apparently been a bit of an issue for me!

In the course of understanding the difference between active and passive voice, I stumbled over HyperGrammar, an electronic grammar course at the University of Ottawa's Writing Centre. It's an excellent website, one which I'm interested in reading - especially as the Australian school system never taught me grammar; it had fallen out of fashion with the teaching profession. A terrible mistake we are all now paying for - though I understand that grammar is back in fashion, so I suppose there is still hope.

I just started reading about what a pronoun is, but as it's almost 11PM, I decided to stop when I discovered you can use subjective personal pronouns, objective personal pronouns, possessive personal pronouns, demonstrative pronouns, interrogative pronouns, relative pronouns, indefinite pronouns, reflexive pronouns and, finally, intensive pronouns!

Thursday, February 26, 2009

Richard Simmons

I have a confession. I'm a bit of a fan of, of all people, Richard Simmons. Why? Well, he's a pretty positive guy, he doesn't hold grudges (go watch any of the thousands of clips of him appearing on Letterman), he's happy with himself, and he's confident enough that he doesn't mind being totally camp or making himself look a bit foolish in front of millions of people. Most of all, he seems like a fairly kind individual, and that is one of the most important qualities in anyone, really.

You see, I think that when most people watch Richard Simmons they don't see a guy with a very strong will, but I do. Richard Simmons is so happy in his own skin (literally - *shudder*) that it really doesn't matter what mockers like Letterman think. Simmons goes out and does what he wants to do, and to hell with what others think of him. And in the process, he's actually helped a fair number of people who do feel bad about themselves.

Therefore, Richard Simmons, I salute you. You may have shocking dress sense, you may act the fool, you might show your 60+ year old legs on Letterman far too often, and you may market ridiculous steamers, but you are still an unlikely role model for our generation.

Thursday, February 19, 2009

KFC nutrition


World, meet Emily

Warning: exceedingly cute baby pictures ahead.

How To Kill Redundancy With a Redundancy

Terry Childs was a network administrator of the San Francisco FibreWAN. When I say a network admin, I really mean the network admin - given that he was the only one who looked after it. So while I'm impressed he is one of the world's few CCIEs, I do think that this is really a bit too much for anyone to take on by themselves.

So when he went rogue and wouldn't disclose the passwords to the Cisco routers and switches he administered, I was somewhat gobsmacked. Not at Terry Childs, mind you, but at the dumb-arse morons who made a sole employee the administrator and contact for their entire critical networking infrastructure. Seriously, what would have happened if the man had expired? If he'd dropped off the mortal coil, they would not have been able to recover the passwords at all. Luckily Childs turned them over to Mayor Gavin Newsom, so a big problem was averted.

I'm sure that the San Fran network infrastructure had built-in redundancy. But it looks like they forgot the most important redundancy of all - the people to administer it.

Sunday, February 15, 2009

Python takes Youtube

I notice that there is a "report background image" link down the bottom of the Monty Python YouTube channel. All hail Monty Python!

Tuesday, February 3, 2009

Coffee printer

On slashdot I found this awesome printer that prints with coffee. Hmmmm... if I bought this, I think my caffeine addiction would only get worse :-)

This made me look at the website where it was submitted. It was part of the Greener Gadgets design competition, which in its own right is pretty interesting. Go check it out!

Sunday, February 1, 2009

On Not Getting It Redux

I thought John C. Dvorak could not have looked more ridiculous with his "my Windows XP idle process is killing my computer!" line, but I stumbled across the following article today. In it, he rails against CSS.

If your Internet connection happens to lose a bit of CSS data, you get a mess on your screen.
How on earth did this guy get to be a widely known and respected pundit on all things technology?

Tuesday, January 20, 2009

I pity the fool!

I pity the fool... who allows users to control text output via a URL.

Nice going TV Guide!

On Not Getting It

On Slashdot today I read a post entitled "Do Nice Engineers Finish Last?" which had the best first post ever. Which then got better.

First post:
Do Nice Engineers Finish Last In Tough Times?
Why, just the other day, a coworker was in contention for a promotion that was going to a younger engineer. My coworker found the specs to the younger engineer's car online and determined the precise rate it would have to leak coolant to completely drain the reserve tank precisely when he was leaving home to make an important customer meeting the next morning. I saw him on a crawl board attaching the regulator and a valve system in the parking lot and sure enough it overheated at precisely the right time so our customer just sat there waiting.

It's a calculate-or-be-calculated world out there!
Aside from the fact that your post is a load of horseshit, I suppose that you didn't step up to the plate by telling management what you witnessed.

And, incidentally, once the youngster took his car to the shop to be repaired, the tampering would have been discovered, and your fictional coworker would have been thrown in jail (hmm just where did this after market valve and regulator come from anyway?). In most states tampering with an automobile is a felony.
I think this might have pricked the parent poster's conscience, because he replied:
Alright alright, I need to come clean ... I embellished on this story a little bit. Here's the truth:

I was going to tell my boss but when I walked in, the coworker I was ratting out was on his knees with a mouthful of my boss and I think he said, "Oh hai!" I didn't stick around to clarify, I just left.

And it wasn't a car, it was a hovercraft. And it wasn't a regulator & valve, it was a detonator & C4. And he wasn't late for a meeting, he died. And don't worry about the law, Virginia isn't a state it's a commonwealth.

I feel almost relieved to get that off my chest and to come clean with you. I think I answered all your questions truthfully and fairly. Hopefully, together you and I can keep the internet a sound unbiased source of nothing but the unadulterated truth and historic account of everything.

You've helped me help myself. I love you.

Monday, January 19, 2009

Stupid bash tricks

The following is a very stupid bash command to run.

:(){ :|:& };:

This is a fork bomb, so don't run it without having set process limits.

I was trying to understand how this worked, and I found the following blog that set me straight. What was tricking me was the colon... it didn't look like it should be valid but it is. Who knew? Well, certainly not me.

The site I note above helpfully suggests changing the colon to "bomb", which gives us:

bomb() {
bomb | bomb &
}; bomb
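To see why this detonates so quickly, note that each invocation pipes one copy of itself into another, so the number of processes doubles every generation. Here's a harmless Python sketch of that doubling (bounded recursion standing in for real forks; the depth cap and leaf counting are my additions, not part of the bash one-liner):

```python
def bomb(depth=0, limit=10):
    # In the real fork bomb each call spawns two more via the pipe and
    # never returns; here we cap the depth and just count the "leaves".
    if depth >= limit:
        return 1
    return bomb(depth + 1, limit) + bomb(depth + 1, limit)

print(bomb())  # 2**10 = 1024 would-be processes after just 10 generations
```

Ten generations is 1,024 processes; thirty is over a billion, which is why an unconfigured box falls over almost instantly.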

Stupid slashdot tricks

Well, only one. It looks like the Slashdot crew are bigger fans of Futurama than I realised!

I got this from a slashdot sig.

chris@ubuntu:~$ echo -e "HEAD / HTTP/1.1\nHost: slashdot.org\n\n" | netcat slashdot.org 80
HTTP/1.1 200 OK
Date: Mon, 19 Jan 2009 12:23:53 GMT
Server: Apache/1.3.41 (Unix) mod_perl/1.31-rc4
X-Powered-By: Slash 2.005001237
X-Bender: Farewell, big blue ball of idiots!
Cache-Control: private
Pragma: private
Connection: close
Content-Type: text/html; charset=iso-8859-1

chris@ubuntu:~$ echo -e "HEAD / HTTP/1.1\nHost: slashdot.org\n\n" | netcat slashdot.org 80
HTTP/1.1 200 OK
Date: Mon, 19 Jan 2009 12:24:02 GMT
Server: Apache/1.3.41 (Unix) mod_perl/1.31-rc4
X-Powered-By: Slash 2.005001237
X-Fry: People said I was dumb but I proved them!
Cache-Control: private
Pragma: private
Connection: close
Content-Type: text/html; charset=iso-8859-1

chris@ubuntu:~$ echo -e "HEAD / HTTP/1.1\nHost: slashdot.org\n\n" | netcat slashdot.org 80
HTTP/1.1 200 OK
Date: Mon, 19 Jan 2009 12:24:07 GMT
Server: Apache/1.3.41 (Unix) mod_perl/1.31-rc4
X-Powered-By: Slash 2.005001237
X-Leela: There's a political debate on. Quick, change the channel!
Cache-Control: private
Pragma: private
Connection: close
Content-Type: text/html; charset=iso-8859-1

chris@ubuntu:~$ echo -e "HEAD / HTTP/1.1\nHost: slashdot.org\n\n" | netcat slashdot.org 80
HTTP/1.1 200 OK
Date: Mon, 19 Jan 2009 12:24:11 GMT
Server: Apache/1.3.41 (Unix) mod_perl/1.31-rc4
X-Powered-By: Slash 2.005001237
X-Fry: Stop abducting me!
Cache-Control: private
Pragma: private
Connection: close
Content-Type: text/html; charset=iso-8859-1

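The jokes ride along in nonstandard X- response headers (X-Bender, X-Fry, X-Leela), which Slash apparently rotates per request. If you wanted to pluck them out programmatically rather than eyeball the netcat output, a quick Python sketch over a captured response (the raw text here is the first transcript above):

```python
raw = """HTTP/1.1 200 OK
Date: Mon, 19 Jan 2009 12:23:53 GMT
Server: Apache/1.3.41 (Unix) mod_perl/1.31-rc4
X-Powered-By: Slash 2.005001237
X-Bender: Farewell, big blue ball of idiots!
Cache-Control: private
Pragma: private
Connection: close
Content-Type: text/html; charset=iso-8859-1"""

# Split off the status line, then parse "Name: value" pairs into a dict
status_line, *header_lines = raw.splitlines()
headers = dict(line.split(": ", 1) for line in header_lines)

# Keep whichever Futurama character header this response carried
quotes = {k: v for k, v in headers.items()
          if k in ("X-Bender", "X-Fry", "X-Leela")}
print(quotes)  # {'X-Bender': 'Farewell, big blue ball of idiots!'}
```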

Tuesday, January 13, 2009

Dvorak and the nefarious "idle process"

While most people know that John Charles Dvorak once famously said that "the Macintosh uses an experimental pointing device called a ‘mouse’. There is no evidence that people want to use these things. I don't want one of these new fangled devices", they might not also know that he said the following in PC World about the Windows XP idle process:

This week's column is about exploring the commonly observed problems that crop up with each new release [of Windows]. Maybe Microsoft should patch the patches once in a while.

Here are a few of my gripes – most of them a result of excessive patching

IDLE-TIME PROCESS. Once in a while the system will go into an idle mode, requiring from five minutes to half an hour to unwind. It's weird, and I almost always have to reboot. When I hit Ctrl-Alt-Delete, I see that the System Idle Process is hogging all the resources and chewing up 95 percent of the processor's cycles. Doing what? Doing nothing? Once in a while, after you've clicked all over the screen trying to get the system to do something other than idle, all your clicks suddenly ignite and the screen goes crazy with activity. This is not right.

Saturday, January 10, 2009

Reply-all storms

Ah yes, the old reply-all storm. Never good, but even worse when it takes out U.S. diplomatic mail servers!

According to this report on Associated Press:

Officials said the storm started when some diplomats used the 'reply all' function to respond to a blank e-mail sent recently to many people on the department's global address list.

Most demanded to be removed from the list while others used 'reply all' to tell their co-workers, in often less than diplomatic language, to stop responding to the entire group, the officials said.

Some then compounded the problem by trying to recall their initial replies, which generated another round of messages to the group, they said.

The best email storm I've heard of though, is the one involving journalists who were accidentally sent a mass email from the Casey Journalism Center at the University of Maryland inviting them to their "Casey Medals" Awards.

According to Editor and Publisher the email snafu caused some interesting effects:
The back-and-forth sparked a circle of never-ending responses that, in some cases, kept hundreds of e-mails filling electronic mailboxes over several hours on Tuesday and Wednesday morning. But, in an unexpected surprise, it also brought many journalists in touch with old colleagues, while forging a number of new industry connections through something of an online cocktail party.

"People started chit-chatting back and forth and inviting themselves to the awards," said Kim Platicha, editor and publisher of Parentwise Austin magazine in Austin, Texas. "It really evolved from there, it was hysterical. I have already started an e-mail conversation with a couple of folks."

Saturday, January 3, 2009

What killed the Zune30?

So we've been hearing a lot about Microsoft Zune 30s crashing. Microsoft have now said that it was a leap year bug.

And indeed it is! From Pastie (start at line 249):

// Function: ConvertDays
// Local helper function that split total days since Jan 1, ORIGINYEAR into
// year, month and day
// Parameters:
// Returns:
// Returns TRUE if successful, otherwise returns FALSE.
BOOL ConvertDays(UINT32 days, SYSTEMTIME* lpTime)
{
    int dayofweek, month, year;
    UINT8 *month_tab;

    //Calculate current day of the week
    dayofweek = GetDayOfWeek(days);

    year = ORIGINYEAR; /* = 1980 */

    while (days > 365)
    {
        if (IsLeapYear(year))
        {
            if (days > 366)
            {
                days -= 366;
                year += 1;
            }
        }
        else
        {
            days -= 365;
            year += 1;
        }
    }

    // Determine whether it is a leap year
    month_tab = (UINT8 *)((IsLeapYear(year))? monthtable_leap : monthtable);

    for (month=0; month<12; month++)
    {
        if (days <= month_tab[month])
            break;
        days -= month_tab[month];
    }

    month += 1;

    lpTime->wDay = days;
    lpTime->wDayOfWeek = dayofweek;
    lpTime->wMonth = month;
    lpTime->wYear = year;

    return TRUE;
}

Why is this bad? Well, 2008 was a leap year, which has 366 days. Let's step through the loop with the values it would hold on December 31, 2008 - day 366 of the year.

//Calculate current day of the week
dayofweek = GetDayOfWeek(366);

year = 2008;

while (366 > 365)
{
    if (IsLeapYear(2008))
    {
        if (366 > 366)
        {
            days -= 366;
            year += 1;
        }
    }
    else
    {
        days -= 365;
        year += 1;
    }
}

As you can see, the while condition is true - day 366 is indeed greater than 365 - and 2008 is a leap year, so we take the leap-year branch. But 366 will never be greater than... 366, so the inner if never fires, days is never decremented, and the else branch is never reached either.

Therefore, the loop condition never evaluates to false, hence an infinite loop. Thus your Zune will crash.

I guess Freescale, which made the Zune's MC13783 power management chip and supplied the driver code this routine comes from, had a programmer who didn't understand boundary conditions.

Update: Another blogger has now gone and suggested a few bug fixes for the Zune issue. Nice going :-)
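For what it's worth, here's a minimal Python port of the loop with one possible fix (the function and helper names are mine, and the real firmware is the C above, so treat this as a sketch rather than Microsoft's actual patch). The missing case is day 366 of a leap year, which should simply end the reduction:

```python
def is_leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def reduce_days(days, year):
    """Reduce a 1-based day count since Jan 1 of `year` to (year, day-of-year).

    The `else: break` is the fix: on day 366 of a leap year the original
    loop neither decremented days nor exited, so it spun forever."""
    while days > 365:
        if is_leap_year(year):
            if days > 366:
                days -= 366
                year += 1
            else:
                break  # day 366 of a leap year: Dec 31, we're done
        else:
            days -= 365
            year += 1
    return year, days

print(reduce_days(366, 2008))  # (2008, 366) -- Dec 31, 2008, no hang
```

Feeding it day 367 correctly rolls over to January 1, 2009, and the fatal day-366 input now terminates immediately.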