DailyDirt: Time, Time, Time. See What's Become Of..

from the urls-we-dig-up dept

With the Apple Watch available now, maybe more people will be interested in wearing fancy watches again -- instead of just relying on their phones. Fancy watches once focused on telling time with extreme accuracy, but then digital watches made it really cheap to keep time that was more than "good enough" for most folks. It used to be annoying to adjust clocks for daylight saving time and after power outages, but as more and more clocks are connected to the internet (ovens and some cheap alarm clocks excepted), we barely need to think about how to set the time on a clock (who owns a VCR anymore?). Check out these links on accurate timekeeping. After you've finished checking them out, take a look at our Daily Deals for cool gadgets and other awesome stuff.

Filed Under: accuracy, atomic clock, clocks, john harrison, leap second, pendulum clock, time, watches


Reader Comments



  • Spaceman Spiff (profile), 30 Apr 2015 @ 6:47pm

    Leap seconds

    Writing software to deal with this stuff is not simple. Most developers simply gloss it over, and when everything crashes they simply shutdown and restart all their systems. FWIW, I had to write software that could deal with leap seconds back about 20 years ago, so I know something of what I speak here.
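A minimal Python sketch of the kind of bookkeeping involved (the table and the names `LEAP_SECONDS_AFTER` and `elapsed_seconds` are illustrative, not any real API): correct elapsed-time math has to consult a table of leap-second insertions, and that table is exactly the data that changes out from under long-running software.

```python
# Sketch of leap-second-aware elapsed time. The table below is a
# hand-maintained, PARTIAL list of leap-second insertions; real systems
# pull this from sources like IERS Bulletin C or tzdata.
from datetime import datetime, timezone

# Each entry: the UTC midnight immediately after a positive leap second.
LEAP_SECONDS_AFTER = [
    datetime(2012, 7, 1, tzinfo=timezone.utc),
    datetime(2015, 7, 1, tzinfo=timezone.utc),
]

def elapsed_seconds(start: datetime, end: datetime) -> int:
    """Naive difference, plus one second for every leap second crossed."""
    naive = int((end - start).total_seconds())
    crossed = sum(1 for ls in LEAP_SECONDS_AFTER if start < ls <= end)
    return naive + crossed

start = datetime(2015, 6, 30, 23, 59, 59, tzinfo=timezone.utc)
end = datetime(2015, 7, 1, tzinfo=timezone.utc)
print(elapsed_seconds(start, end))  # 2: the naive answer of 1 misses 23:59:60
```

The hard part in production isn't this arithmetic; it's keeping the table current for the lifetime of the system, which is why so much software simply ignores it.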


  • charliebrown (profile), 30 Apr 2015 @ 7:18pm

    @midnight

    Oh no! Their show will have to run extra long on June 30, since it always starts at 11:59:59! Actually, Comedy Central will probably just run an extra commercial.


  • charliebrown (profile), 30 Apr 2015 @ 7:20pm

    Using The Extra Second

    What are you going to do with the extra second?

    I'm going to turn my foreplay into fiveplay!


  • Stephen, 30 Apr 2015 @ 7:53pm

    Leap Seconds are Essential

    You could join the movement trying to abolish the leap second entirely, but a decision on that probably won't come until 2018.

    Leap seconds are needed for much the same reason leap days are added every fourth year: leap days keep the calendar in sync with the Earth's orbit around the Sun, while leap seconds keep our clocks and watches in sync with the Earth's gradually slowing rotation.

    By the way, if you think Linux (& UNIX) systems have a problem with leap seconds, that will be a minor hiccup compared to the train wreck looming on January 19, 2038 at 03:14:07 GMT.

    Perhaps those looking to abolish leap seconds should get their priorities in order. (Reportedly, some systems will start showing 2038-related issues as early as 2018.)
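Where that oddly specific date comes from can be checked with a few lines of Python (a quick sketch, not tied to any particular system): a signed 32-bit time_t counts seconds since the Unix epoch, so it tops out exactly 2**31 - 1 seconds after 1970-01-01 00:00:00 UTC.

```python
# The 2038 rollover: a signed 32-bit time_t counts seconds since
# 1970-01-01 00:00:00 UTC and maxes out at 2**31 - 1 = 2147483647.
from datetime import datetime, timedelta, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
rollover = epoch + timedelta(seconds=2**31 - 1)
print(rollover.isoformat())  # 2038-01-19T03:14:07+00:00
```

One second later the counter wraps to a large negative number, which naive code will interpret as a date back in December 1901.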


  • Super Tela, 30 Apr 2015 @ 11:36pm

    Great

    Great post! Thanks.


  • Anonymous Coward, 1 May 2015 @ 4:57am

    If the MBAs don't like the present timekeeping system, leap seconds included, then why don't they just make their own timekeeping system and leave the present one as is?

    There are benefits to timekeeping that stays in sync with the motions of the Earth and stars; take that away, and people will just have to create another system.


  • Rekrul, 1 May 2015 @ 6:43pm

    Can someone explain in plain English, the problem with leap seconds and why they will cause computers to crash?

    Maybe I'm just dumb, but I don't understand the problem. The computer itself doesn't know about the leap second. Disconnect it from the net and it will happily go on keeping time as if nothing had happened, except that it will be one second ahead of everyone else. Connect it back to the net, it will see that the time is off and reset itself to match everyone else.

    I can set my computer to an hour ahead and nothing bad happens other than the timestamps on files being wrong. Nothing crashes.

    I read one explanation which claimed that leap seconds were a problem because all computer time is calculated by counting the seconds from something like 1972. What kind of a nutcase would calculate time based on decades worth of seconds? That would be like calculating all monetary transactions in pennies.

    Is my digital clock going to explode because the world has added an extra second to the time? Putting aside the fact that my clock doesn't match up to the world time anyway, all I would need to do is set it back one second and it's back in sync.

    It's not like they're adding an extra second to every day. They're adding a single second at one specific time. Once that second has been added and computer clocks adjusted, time keeping goes back to normal.

    OK, calculating a time difference between dates before and after the leap second would carry a single second error, but does that really matter? How many applications in the world absolutely depend on computers being able to calculate the difference between two dates down to the second? And if it matters, the software can be patched to insert an extra second when performing the calculations.
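That last point is roughly what happens in practice. As an illustrative Python sketch (not any particular system's code): POSIX/Unix time has no representation for 23:59:60 at all, so a naive difference across a leap second comes out one second short unless software patches the extra second in afterwards.

```python
# POSIX time is leap-second-blind: calendar.timegm maps UTC fields onto
# a plain second count since 1970 with no slot for 23:59:60.
import calendar

t_before = calendar.timegm((2015, 6, 30, 23, 59, 59, 0, 0, 0))
t_after = calendar.timegm((2015, 7, 1, 0, 0, 0, 0, 0, 0))
print(t_after - t_before)  # 1, even though 2 real seconds elapsed (23:59:60 existed)
```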


    • MrTroy (profile), 3 May 2015 @ 11:01pm

      Re:

      See, e.g., http://www.theregister.co.uk/2012/07/02/leap_second_crashes_airlines/

      It's not specifically a problem with leap seconds; it's just that some versions of some software didn't cope well with NTP adjustments as large as a second, because apparently someone thought it was a good idea to use a non-monotonic clock to implement kernel timers. See http://serverfault.com/questions/403732/anyone-else-experiencing-high-rates-of-linux-server-crashes-during-a-leap-second/403767#403767 for good coverage of the details.
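To illustrate the monotonic-clock point with a small Python sketch (the actual bug was in C kernel code; this just shows the concept): a timer deadline built on a monotonic clock can't misfire when NTP steps the wall clock, because a monotonic clock is guaranteed never to run backwards.

```python
# A timer deadline built on time.monotonic() is immune to wall-clock
# steps (NTP adjustments, manual changes), because a monotonic clock
# never runs backwards. time.time() carries no such guarantee.
import time

deadline = time.monotonic() + 0.05  # fire in 50 ms

while time.monotonic() < deadline:
    time.sleep(0.01)

print("timer fired")
```

Had the deadline been computed from time.time() instead, a one-second backwards step at the wrong moment would delay the timer by a full second -- the same class of mistake as in the kernel timer code above.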


      • Rekrul, 4 May 2015 @ 3:59pm

        Re: Re:

        It's not specifically a problem with leap seconds, it's just that some versions of some software didn't cope well with NTP adjustments as large as a second, because apparently someone thought it was a good idea to use a non-monotonic clock to implement kernel timers.


        Thanks. The linked article didn't make it clear that this was due to a bug. It made it sound as if it were just a shortcoming in software that couldn't deal with the time being adjusted.


    • MrTroy (profile), 3 May 2015 @ 11:10pm

      Re:

      I read one explanation which claimed that leap seconds were a problem because all computer time is calculated by counting the seconds from something like 1972. What kind of a nutcase would calculate time based on decades worth of seconds? That would be like calculating all monetary transactions in pennies.

      That's a very real problem that has nothing to do with leap seconds. See http://en.wikipedia.org/wiki/Year_2038_problem

      Note that the problem is well on its way to being solved in general-purpose computers, and mostly remains for specific classes of embedded systems... so it's mostly an issue for devices that have already been built, will last another 20 years or more, and care about the date.


      • Rekrul, 4 May 2015 @ 4:11pm

        Re: Re:

        That's a very real problem that has nothing to do with leap seconds. See http://en.wikipedia.org/wiki/Year_2038_problem


        I still think it was a mistake to store the time as the number of seconds that have passed since some arbitrary date.

        Sure, storing the date as a 32-bit number might have been the most efficient way to store the date and time, saving a few bytes, but then they needed extra code to convert that 32-bit number back into a human-readable format. If they'd used separate bytes for the seconds, minutes, hours, day, month and year, they could have made up the difference with much simpler code for converting those bytes into human-readable date formats. At the very least, they should have used a single byte for the year and stored the rest of the time/date as an offset from that year. That would have given them more than 256 years for the cost of a single extra byte.
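For what it's worth, here is a small Python sketch of the trade-off (the field layout and the name `fields_to_counter` are purely illustrative): broken-down fields are trivial to display, but any arithmetic on them needs full calendar logic (month lengths, leap years), whereas a second counter needs that logic only once, at display time, and makes a difference a single subtraction.

```python
# Comparing the two representations: arithmetic on broken-down
# (year, month, day, ...) fields needs calendar logic, which is what
# calendar.timegm supplies; once times are plain counters, a difference
# is just a subtraction.
import calendar

def fields_to_counter(year, month, day, hour, minute, second):
    """Convert broken-down UTC fields to seconds since the 1970 epoch."""
    return calendar.timegm((year, month, day, hour, minute, second, 0, 0, 0))

a = fields_to_counter(2000, 2, 28, 23, 59, 59)
b = fields_to_counter(2000, 3, 1, 0, 0, 0)
print(b - a)  # 86401 -- correctly accounts for Feb 29, 2000 in between
```

Diffing the field tuples directly would have to reimplement exactly that leap-year logic, which is the main reason the counter representation won out.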


  • Anonymous Coward, 3 May 2015 @ 9:42am

    "Can someone explain in plain English, the problem with leap seconds and why they will cause computers to crash?"

    Poorly written code and an unwillingness to correct it.


