For a bit of light reading, I've been going through some of the documents published by Sir John Chilcot's Iraq Inquiry. There are thousands of Telegrams, eGrams, Teleletters, and other miscellaneous communications. For those unfamiliar with the jargon, the inquiry have helpfully published a guide to reading the evidence.
In it is this delightful titbit about the timestamps used on telegrams.
That's not a normal-looking timestamp. I've deliberately obscured the explanation because a good standard is like a joke - if you need to explain it, it's not that good!
Let's see if we can work it out from first principles - feel free to play along at home.
`170356Z JULY 03`
A naive reading would think, perhaps, 17:03:56 July 2003. But that doesn't tell us the day. And do timestamps on Telegrams really need to be precise to the second?
First off, the text `JULY` is the month. That's completely unambiguous. It isn't abbreviated, so it provides quite a lot of textual redundancy.
Ever since the dreaded Millennium Bug, we've been accustomed to writing years in full. None of the groups of four digits look like a year (unless this telegram was sent in 1703!), so I think it's safe to assume that the `03` at the end refers to 2003.
So far we're at `??????? MONTH YEAR`.
Next is the letter `Z`. This usually refers to "Zulu Time" which, for historic reasons, is GMT (Greenwich Mean Time).
OK, we've now got `??????TIMEZONE MONTH YEAR`.
Logically, we're left with the day of the month and the time. I would expect the timezone indicator to sit next to the time - so `0356` must be four minutes to four in the morning. That leaves the day as the first two characters.
We end up with `DAY HOUR MINUTE TIMEZONE MONTH YEAR`. That's pretty cumbersome, isn't it?
- It is an unnatural speaking order - "I sent it on the seventeenth at three fifty-six, in July oh-three."
- Messages are variable length - if you're using the full text of the month, then `DECEMBER` is a longer word than `MAY`.
- Months are in English - making the message potentially confusing for an international audience.
- It also relies on knowing Military time zone letters (like `Z`) rather than printing numeric offsets (like `+01:00`).
- No indication of the day of the week - is it helpful to know this was sent on a Thursday?
But is it right?
Confused? During British Summer Time (BST) the UK is at GMT+1 - so a `Z` timestamp is an hour behind UK local time.
The standard states: "MMM indicates the first three letters of the month of the year. In written messages, all letters are in upper case." So `JULY` should be `JUL`.
Now, in fairness, the telegrams use both standards interchangeably - sometimes within the same document!
Additionally, I can't find any UK-based public information describing the standard.
The eGram System used by the UK's Foreign and Commonwealth Office (FCO) uses a different timestamp.
Is this clearer?
- Any American reading that date would parse `01/09/` as "9th January" rather than "September the 1st".
- No timezone. It was sent from Baghdad - so is that local time, or London time?
- Is per-second accuracy useful?
- How easy is it to compare between the two standards? Is `170356Z JULY 03` before or after `16/07/2003 04:55:00 PST`?
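One way to see how awkward that comparison is: normalise both formats to timezone-aware values first. Here's a sketch in Python - `parse_dtg` is a toy parser of my own, which assumes a `Z` zone letter and a 21st-century two-digit year:

```python
from datetime import datetime, timedelta, timezone

def parse_dtg(dtg: str) -> datetime:
    """Parse a Date Time Group like "170356Z JULY 03" into a UTC datetime.

    A toy parser: it assumes the zone letter is Z (GMT) and that
    two-digit years belong to the 2000s.
    """
    digits, month, year = dtg.split()
    day, hour, minute = int(digits[:2]), int(digits[2:4]), int(digits[4:6])
    # %b matches abbreviated English month names, e.g. "Jul"
    month_number = datetime.strptime(month[:3].title(), "%b").month
    return datetime(2000 + int(year), month_number, day, hour, minute,
                    tzinfo=timezone.utc)

# The eGram-style timestamp, read as day/month/year in PST (UTC-8)
egram = datetime(2003, 7, 16, 4, 55, 0, tzinfo=timezone(timedelta(hours=-8)))
dtg = parse_dtg("170356Z JULY 03")

# Once both are timezone-aware, Python can compare them directly
print(dtg > egram)  # True - the Telegram is the later of the two
```

Only after both values are in a common representation does the question "which came first?" become trivial to answer.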
The origins of DTGs
`170356Z JUL 03` comes, as far as I can tell, from the 1960s standard "METAR" - a method for airports to report weather conditions to aviators. It was adopted by the military - and from there found its way into Government communications.
Of course, there is one-standard-to-rule-them-all, which is ISO 8601:
- Precision is from least to most: Year - Month - Day - Hour - Minute - Second.
- This means you can remove precision from the right and still have a valid date - for example 2003-07 represents July 2003.
- Easily parsed by computer systems.
- No English language names.
- Uses the Arabic numbering system - may be hard to understand for people from other cultures.
- Doesn't contain the name of the day - for example "Thursday".
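The second point above - removing precision from the right - is easy to demonstrate. Here's a sketch in Python; `parse_truncated_iso` is my own made-up helper, not part of any library, and it deliberately handles only the simple calendar forms:

```python
from datetime import datetime, timezone

def parse_truncated_iso(text: str) -> datetime:
    """Parse an ISO 8601 string with precision removed from the right.

    "2003", "2003-07", "2003-07-17", "2003-07-17T03:56" etc. all work;
    missing fields default to their earliest value. A sketch, not a
    full ISO 8601 parser (no week dates, no offsets, no fractions).
    """
    formats = ["%Y", "%Y-%m", "%Y-%m-%d", "%Y-%m-%dT%H",
               "%Y-%m-%dT%H:%M", "%Y-%m-%dT%H:%M:%S"]
    for fmt in formats:
        try:
            return datetime.strptime(text, fmt).replace(tzinfo=timezone.utc)
        except ValueError:
            pass
    raise ValueError(f"not a truncated ISO 8601 timestamp: {text!r}")

print(parse_truncated_iso("2003-07"))  # July 2003, as a datetime
print(parse_truncated_iso("2003-07-17T03:56"))
```

Because the fields run from most significant to least, every truncation is still a meaningful (if vaguer) moment in time.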
There's also RFC 3339 which is similar, but subtly different.
Why Do Old Standards Hang Around?
If you're a former flying ace who has started working for the Government, it probably makes a lot of sense to use the timestamp standard which is most familiar to you. And, after a few decades, that's just the standard we've always used. We're familiar with it, the machines are configured for it, and it only takes a few weeks of training for new joiners to understand why we use it.
Sometimes there is good reason for inertia. Can you imagine the cost and chaos if, for example, the UK decided to change the side of the road people drive on?
Of course, a number of countries have done exactly that. They calculate that the benefit of cheaper cars and improved compatibility with their neighbours is worth the cost of change and the risk of injury. I thoroughly recommend listening to 99% Invisible's podcast on the subject.
How To Change To A Better Standard
As with any change, we have to look at who benefits from the old system, what the risks are in changing, and the advantages (if any) of a newer system.
Timestamps hold a curious "dual-citizenship" of being both data and metadata. It is simple for a computer to look at a string, determine that it is a date, and then do something with it.
But humans also have to read and understand the data. We could just represent the date as `1058414160` - that's how most computers look at dates, as the number of seconds since 1970 - but that's inconvenient for our delicious meaty brains.
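If you don't believe me, Python will happily do the conversion - and that particular number of seconds turns out to be the very moment our Telegram was sent:

```python
from datetime import datetime, timezone

# Convert seconds-since-1970 into a human-readable UTC timestamp
stamp = datetime.fromtimestamp(1058414160, tz=timezone.utc)
print(stamp.isoformat())  # 2003-07-17T03:56:00+00:00
```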
For example, consider a local council sending you a message to say when your rubbish bin will be collected. What the user really wants is a message where the prominent data is "Thursday" - something covered by neither ISO 8601 nor RFC 3339, although RFC 850 does support full-length day names.
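To illustrate, here's Python pulling the day name out of the Telegram's timestamp, then rendering the whole thing in RFC 850's style (this assumes an English locale for the day and month names):

```python
from datetime import datetime, timezone

sent = datetime(2003, 7, 17, 3, 56, tzinfo=timezone.utc)

# The datum the user actually wants
print(sent.strftime("%A"))  # Thursday

# RFC 850 style: full day name, two-digit year
print(sent.strftime("%A, %d-%b-%y %H:%M:%S GMT"))
```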
It's all about the users
So, who are the users in this case? They can broadly be split into three groups:
- Computers.
- Humans.
- Things not yet invented.
Computers can process just about any string and (usually) understand it. But it makes life more convenient if there is a single standard. So let's stick to ISO 8601 for everything.
Humans are, thankfully, fairly easy to deal with. Computers are very good at manipulating data. If you really want to see dates as Military DTG - that's OK; your software can interpret the time and display it however you like. It could even alter depending on when the document was sent. All of these are viable time representations for humans:
- 6 minutes ago.
- Last Thursday.
- Summer 2003.
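Here's a quick sketch of how software might choose between those representations - the thresholds, wording, and season boundaries are my own arbitrary choices, not any standard:

```python
from datetime import datetime, timedelta, timezone

def humanise(moment: datetime, now: datetime) -> str:
    """Pick a human-friendly rendering based on how long ago it was."""
    delta = now - moment
    if delta < timedelta(hours=1):
        return f"{int(delta.total_seconds() // 60)} minutes ago"
    if delta < timedelta(days=7):
        return f"Last {moment.strftime('%A')}"
    # Northern-hemisphere meteorological seasons - an illustrative choice
    seasons = {12: "Winter", 1: "Winter", 2: "Winter",
               3: "Spring", 4: "Spring", 5: "Spring",
               6: "Summer", 7: "Summer", 8: "Summer",
               9: "Autumn", 10: "Autumn", 11: "Autumn"}
    return f"{seasons[moment.month]} {moment.year}"

sent = datetime(2003, 7, 17, 3, 56, tzinfo=timezone.utc)
print(humanise(sent, sent + timedelta(minutes=6)))   # 6 minutes ago
print(humanise(sent, sent + timedelta(days=3)))      # Last Thursday
print(humanise(sent, datetime(2020, 1, 1, tzinfo=timezone.utc)))  # Summer 2003
```

The stored value never changes; only the presentation does.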
An eye on The Future
It is said that the only reason NASA were able to land humans on the Moon was that they predicted which technology would be available in the future. They didn't just have to invent the future - they had to guess at what materials and components would be available in their timescales.
What is the future for time? We know from history that time is confusing.
- There are 11 missing days in 1752.
- Daylight Savings Time is confusing and can change with little notice.
- Some cultures' calendars may drift out of sync with each other.
- The Earth's rotation isn't constant - requiring Leap Seconds.
- The start of the year was not always January 1st.
- Some countries change their time zone.
What do timestamps look like when we need to send Telegrams into space? The International Space Station has settled on GMT (for now) - but will our colonies on the Moon or Mars be happy to keep in sync with a distant planet?
We know from Einstein that objects travelling at speed experience Time Dilation - if we wish to communicate with an interstellar probe like Voyager 1, we need to remember that it is experiencing time differently from us.
How do we choose a standard which will be understood by humans and computers in The Future? Will we have moved to Decimal Time? Will future historians understand that our months aren't the same as theirs?
In short, how do we unambiguously represent a moment in time so that it can easily be parsed by computers and comprehended by humans today - while ensuring future compatibility?
Answers on a postcard to the usual address.