Forensic Analysis of Digital Time

April 17, 2015

Imagine you want to find out whether an employee copied files onto a USB drive before joining a competing firm. Alternatively, imagine you want to find out whether a person possessing a GPS-enabled cell phone and suspected of being involved in a crime was at the scene when the crime was committed.

The common technology underlying the potential solution to both these problems is the concept and implementation of digital time. Specifically, do computers (or mobile devices) believed to be present at these events have some record of digital time associated with these events? As an example, inserting a USB drive can leave indicia in the form of digital timestamps on the computer in which the USB device was inserted. Similarly, GPS information may have been recorded with digital timestamps on the mobile device.

In both these situations, an understanding of digital time and how it is created, stored, and updated is essential to reconstruct a sequence of events for forensic purposes. Digital timestamps are available in numerous scenarios (e.g., file metadata, image metadata, or log files, to name a few), but the layperson may not know of their existence or reliability, let alone how to obtain and use them. In this article, we present an overview of digital time and discuss various factors that can influence the veracity (i.e., introduce uncertainty) of digital time indicators, thus complicating the job of a forensic investigator in discerning a sequence of events in space and time. However, a skilled practitioner will understand the context and environment of digital time indicators and thus know how to interpret the information.

What is Digital Time?

The concept of time is intuitively familiar to us through our everyday acquaintance with clocks, calendars, time zones, etc., as well as our physiological sense of the passing of time. We use the term “digital time” to refer to time kept by computers. Digital time is a discrete approximation used by digital machines to keep track of physical time. In practice, digital times are typically represented as positive integers that are stored as 32-bit or 64-bit numbers. The physical time whose digital value is 0 is often called the “epoch.” Two successive integers representing digital time differ by a “unit” of digital time. The unit is the smallest time difference we can represent. Thus, digital time is defined by three quantities: the bit length (e.g., 32 bits), the unit (e.g., 1 millisecond), and the epoch (e.g., Jan 1, 1970).
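As a minimal sketch, the three quantities above fully determine the mapping from a stored integer to a physical time. The function and values below are illustrative, not drawn from any particular system:

```python
from datetime import datetime, timedelta, timezone

def digital_to_physical(count, unit_seconds, epoch):
    """Map an integer digital-time count to a physical datetime.

    A digital time is fully specified by its unit (the duration one
    count represents) and its epoch (the datetime whose count is 0).
    The bit length simply bounds how large 'count' may grow.
    """
    return epoch + timedelta(seconds=count * unit_seconds)

# Example: a count of 86,400 one-second units past the Unix epoch
# (Jan 1, 1970) lands exactly one day later.
unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(digital_to_physical(86400, 1.0, unix_epoch))  # 1970-01-02 00:00:00+00:00
```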

As an example, consider the Microsoft NTFS file system. This file system, used by the Windows 7 operating system, for example, uses a 64-bit bit length, a 100-nanosecond unit, and an epoch equal to midnight on the first day of the 17th century.1 The exact meaning of “midnight on Jan 1, 1601,” will be discussed in the next section, along with other practical issues. One consequence of setting an epoch value to some point of time in history is that, while a computer can represent a finite number of times later than the epoch, it cannot represent any time prior to the epoch.
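Using the NTFS parameters just described, a stored timestamp can be decoded as follows. This is a sketch of the arithmetic only; the constant 11,644,473,600 is the number of seconds between the 1601 epoch and the more familiar Unix epoch of Jan 1, 1970:

```python
from datetime import datetime, timedelta, timezone

NTFS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(filetime):
    """Convert a 64-bit NTFS timestamp (a count of 100-nanosecond
    units since midnight, Jan 1, 1601 UTC) to a Python datetime."""
    # 10 hundred-nanosecond units per microsecond
    return NTFS_EPOCH + timedelta(microseconds=filetime // 10)

# A stored value of 0 is the epoch itself; a value equivalent to
# 11,644,473,600 seconds lands on the Unix epoch.
print(filetime_to_datetime(0))                    # 1601-01-01 00:00:00+00:00
print(filetime_to_datetime(11644473600 * 10**7))  # 1970-01-01 00:00:00+00:00
```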

Setting the Clock

Now that we understand how digital time is represented, let us examine how our computers keep track of the time when turned off. In this section, we consider a “typical” modern computer. Such a computer uses a small lithium battery (typically, model CR2032) with a lifetime of about 10 years to power a small hardware device on the motherboard called the “real-time clock.” The real-time clock usually does not provide any time zone or daylight saving information; it only maintains a time value.

The real-time clock is a solid-state integrated circuit, powered by the CR2032 battery, containing a crystal oscillator and non-volatile random access memory. When a personal computer is first turned on, the operating system (via low-level modules called the BIOS) typically sets its own clock, the “system clock,” by reading the time from the real-time clock. Once the operating system has set the initial time of its system clock, it keeps track of time independently of the real-time clock using the central processing unit (CPU), which is powered by the main computer power source rather than the CR2032 battery.

While the computer is running, the system clock and the real-time clock may stay roughly in sync, but the two clocks can be considered independent: the purpose of the real-time clock is to maintain a rough track of the time while the power is off, whereas the system clock is the source of digital time for the operating system, e.g., for time-stamping of files (discussed below). When the computer is shut down, the real-time clock value is set equal to the system clock value, and in the powered-down state the real-time clock continues to keep time via power supplied by the CR2032 battery. When the computer is powered up again, the system clock is once more set by reading the real-time clock, allowing the two clocks to stay in sync.2 Note that neither the real-time clock nor the system clock is necessarily in sync with any universally agreed-upon standard such as International Atomic Time (discussed below).

Let us now examine how the real-time clock and the system clock are set correctly in the first place. The time will usually first be set by the user when installing the operating system. During installation, the user may also be prompted to specify the local time zone. Furthermore, once the operating system is installed, a user can access various software tools to change the value of the system time and the time zone. For example, in Windows 7, the application called “Date and Time,” shown in Figure 1 below, can be used to change the time and time zone to whatever value (greater than midnight Jan 1, 1601) is desired.

Figure 1. Screen shot from a Windows 7 operating system showing the "Date and Time" utility that can be used to set the date, time, and time-zone values.

The small blue and yellow image of a shield in the “Change date and time…” button shown in Figure 1 indicates that special administrative privileges may be required to change the date and time on this computer. However, it is still possible for any person with physical access to this computer to boot into a different operating system, set the system time arbitrarily, and then shut down the machine, thereby setting the real-time clock. The next time the computer boots normally, the system clock will be set to the changed time.3

Thus, human intervention is an important way in which uncertainty can be introduced into digital timekeeping, and it can be used to manipulate the time at which events appear to have occurred. For example, in Figure 2 below, the ostensible creation date of a file has been set to Jan 1, 1601, by manipulating the system clock on a Windows 7 operating system.

It is not a straightforward process to set the date of the system clock earlier than 1980 via the Windows “Date and Time” application in Windows 7; however, various other applications, such as Cygwin, can be installed that make it easy to set the date arbitrarily within Windows. 

Because the system time can be manipulated, it is important to have independent methods of confirming that the system time is consistent with the actual time. One useful method of corroborating system time with physical time is to compare the times written in log files generated by known software (such as operating system updates or antivirus software updates) against known physical events (e.g., when the updates were released). For example, if the log file for updating a local anti-virus database contains the logged date and time of the update (reflecting the local system time), this time can be compared with the known release date and time of the update as published on the anti-virus software vendor’s website.
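The core of such a corroboration check is a simple ordering constraint, sketched below with hypothetical times (the function name and values are illustrative, not from any real log):

```python
from datetime import datetime, timezone

def consistent(logged_time, known_event_time):
    """A locally logged record of an event (e.g., installing an
    antivirus update) can never legitimately precede the externally
    known time of that event (e.g., the vendor's published release
    time). If it does, the system clock was wrong, or manipulated,
    when the log entry was written."""
    return logged_time >= known_event_time

# Hypothetical values: an update logged locally a few minutes after
# the vendor's published release time is plausible.
logged = datetime(2014, 6, 1, 12, 7, 30, tzinfo=timezone.utc)
released = datetime(2014, 6, 1, 12, 0, 0, tzinfo=timezone.utc)
print(consistent(logged, released))  # True
```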

Figure 2. Properties dialog windows for a file named "NewFile.txt" showing an ostensible creation date of January 1, 1601.

Digital Time in File Metadata 

One of the most common uses for digital time is timestamping of files. A file’s timestamps are metadata associated with the file that record the digital times for certain events in the file’s lifetime. For example, most computers running a Microsoft Windows operating system today use the NTFS file system, which records the digital times of events related to each file. When one views the “Properties” of a file in Windows, the following are displayed: Modification Time, Access Time, and Creation Time, collectively referred to as MAC Times.4

The Modification Time generally refers to the most recent time the file’s content was changed, the Access Time generally refers to a time when the file was accessed, and the Creation Time generally refers to the time the file was created. Although these definitions appear straightforward, complex situations can produce timestamp values that are counter-intuitive, at least at first glance. For example, when a file is copied from one location to another on an NTFS-organized disk, the creation time of the new file reflects the time when the copy was created, but the modification time of the copy retains the old value, so the copy can appear to have been modified before it was created.
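MAC-style timestamps can be inspected programmatically. The sketch below creates a scratch file and reads its timestamps back with Python's `os.stat`; note that the meaning of `st_ctime` is itself OS-dependent, another example of the interpretation issues discussed here:

```python
import os
import tempfile
from datetime import datetime, timezone

# Create a scratch file and read back its MAC-style timestamps.
# os.stat exposes st_mtime (modification), st_atime (access), and
# st_ctime (creation time on Windows, but metadata-change time on
# Unix-like systems; an OS-dependent quirk).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"sample content")
    path = f.name

info = os.stat(path)
for label, value in (("Modified", info.st_mtime),
                     ("Accessed", info.st_atime),
                     ("Created/Changed", info.st_ctime)):
    print(label, datetime.fromtimestamp(value, timezone.utc))

os.remove(path)  # clean up the scratch file
```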

Interpretation of NTFS timestamps can be complicated further by variations in the behaviors of different operating systems. NTFS refers only to the organization of the data in a disk (and the low-level file system software). It does not refer to the operating system that enforces the policies for maintaining and updating digital timestamps. For example, Windows XP, Windows Vista, and Windows 7 are three different operating systems, each of which can read and write data to a disk organized using NTFS. However, the three operating systems can enforce different policies with respect to timestamps. In Windows XP, the operating system will attempt to update the access times for files, but in Windows Vista, the operating system—by default—does not update any access times.5 Another interesting quirk is that NTFS allows deferring the writing of Access Times to disk for up to an hour.6 

This does not mean that the timescale for distinguishing Access Time events is actually an hour instead of 100 nanoseconds, but rather that Access Time updates may be written to disk with a delay. This one-hour delay could, however, lead to inexact access times, for example, when very large numbers of files are copied simultaneously.

As a final note on NTFS access times, even though it would seem that the act of copying a file should be considered as accessing the source file, and that the access time should be updated when the file is copied, this may not always be the case. Thus, in a situation requiring forensic analysis of digital timestamps, it is important to understand the context and the software environment in which the timestamps were generated. 

Other file systems, such as ext2 (or ext3 or ext4) often associated with Linux, or HFS (or HFS+) often associated with Apple computers, similarly make use of MAC times as discussed above. Likewise, mobile operating systems such as Android or iOS also keep track of these timestamps. The specific details of use may vary from file system to file system, and with the operating system used to access the files. 

Clock Drift, Synchronization, and Atomic Time 

Even if the system time has been set initially to the exact proper time by a system administrator, it will not, in isolation, maintain the correct time, due to various system limitations. In other words, the system clock “drifts,” and over a period of time will deviate from the actual time. The “actual” or “correct” time is best represented by International Atomic Time (abbreviated “TAI,” from the French temps atomique international). Unfortunately, TAI is a quantity that is never actually defined in the present, but rather is defined only in hindsight, after averaging the values of hundreds of atomic clocks kept at different locations across the globe.7

A useful approximation to TAI can be obtained, for example, from a network time server physically located at the National Institute of Standards and Technology (NIST) in Boulder, CO. This server supplies NIST’s atomic clocks’ specific contribution to TAI, which is called UTC(NIST). The time UTC(NIST) is available in a number of different formats, each served on a differently numbered network port. For example, UTC(NIST) is available as a “Network Time Protocol” data packet on port 123 or in a human-readable plain-text format on port 13. A connection receiving UTC(NIST) on network port 13, and the output returned from the server, are shown in Figure 3.
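The plain-text string served on port 13 can be parsed offline. The sketch below assumes the commonly documented NIST daytime layout (Modified Julian Date, date, time, then status fields); the sample string is illustrative, not a real capture:

```python
from datetime import datetime, timezone

def parse_nist_daytime(line):
    """Parse the plain-text time string served on port 13.

    Assumed field layout (as commonly documented for NIST's
    daytime service):
        MJD  YY-MM-DD  HH:MM:SS  DST LS H  msADV  UTC(NIST) *
    Only the date and time fields are extracted here.
    """
    fields = line.split()
    date_part, time_part = fields[1], fields[2]
    return datetime.strptime(f"{date_part} {time_part}",
                             "%y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)

# Illustrative response string:
sample = "57129 15-04-17 18:15:05 50 0 0 643.5 UTC(NIST) *"
print(parse_nist_daytime(sample))  # 2015-04-17 18:15:05+00:00
```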

The term “UTC” stands for Coordinated Universal Time, which is the successor of Greenwich Mean Time (GMT) and the current world time standard based on TAI. The NTFS and most other file systems store time values as the system clock’s approximation of UTC. The operating system typically displays times (e.g., MAC times) in local time by applying the known time-zone offset to the UTC-valued MAC times stored as a file’s metadata. 
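The stored-as-UTC, displayed-as-local convention is easy to demonstrate. The sketch below shows the same stored instant rendered under a fixed UTC+8 offset (e.g., Hong Kong, which observes no daylight saving); the timestamp value is illustrative:

```python
from datetime import datetime, timezone, timedelta

# File systems such as NTFS store MAC times as the system clock's
# approximation of UTC; the operating system applies the configured
# time-zone offset only when displaying the value.
stored_utc = datetime(2013, 8, 14, 2, 13, 55, tzinfo=timezone.utc)

# The same instant displayed under a fixed UTC+8 offset:
local_zone = timezone(timedelta(hours=8))
print(stored_utc.astimezone(local_zone))  # 2013-08-14 10:13:55+08:00
```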

One way to keep the system clock in sync with actual time is to make use of tools that use Network Time Protocol to update the system clock based on queries to a network time server.8 Often, computers that are part of an administrative domain (e.g., work computers) use this feature to keep the computer time up to date. 
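At the wire level, an NTP response carries timestamps counted in seconds since a 1900 epoch. The sketch below parses the transmit timestamp from a synthetic 48-byte packet rather than querying a live server, so it runs offline; the packet contents are fabricated for illustration:

```python
import struct
from datetime import datetime, timezone

# Seconds between the NTP epoch (Jan 1, 1900) and the Unix epoch (Jan 1, 1970)
NTP_TO_UNIX = 2208988800

def parse_ntp_transmit_time(packet):
    """Extract the transmit timestamp (bytes 40-47) of a 48-byte NTP
    response: 32 bits of seconds since 1900, then a 32-bit fraction."""
    seconds, fraction = struct.unpack("!II", packet[40:48])
    return datetime.fromtimestamp(seconds - NTP_TO_UNIX + fraction / 2**32,
                                  timezone.utc)

# Build a synthetic response with a transmit timestamp of
# 2015-04-17 12:00:00 UTC and an all-zero header.
secs = int(datetime(2015, 4, 17, 12, 0, 0,
                    tzinfo=timezone.utc).timestamp()) + NTP_TO_UNIX
packet = bytearray(48)
packet[40:48] = struct.pack("!II", secs, 0)
print(parse_ntp_transmit_time(bytes(packet)))  # 2015-04-17 12:00:00+00:00
```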

Figure 3. A Cygwin session on Windows using the “date” command to display the system time, and the “nc” command to receive UTC(NIST) from a server in Boulder, CO.

Digital Time in Mobile Devices

Mobile operating systems such as Android and iOS also need to keep track of digital time by specifying an epoch, unit, and bit length. Because Android is essentially a Linux-based operating system, and Apple’s iOS is based on Mac OS, the digital time features available in those systems carry over to the respective mobile platforms.

For example, Android supplies a Timestamp object that takes a 64-bit integer as input specifying the number of milliseconds (unit) since Jan 1, 1970 (epoch).9 Apple’s iOS also technically measures time from the 1970 epoch, but some versions of iOS (or programs running on iOS) appear to add a fixed “epoch offset,” such that timestamps have a reference date of 2001.10 The system time on mobile phones is used to create MAC time metadata for files, similar to the non-mobile operating systems described above. MAC times, log files, and other well-known database files are useful repositories of system time data on mobile devices that can be used to piece together events. Moreover, location-based information is also often present on mobile devices.

For example, Figure 4 shows some of the information available in the file “consolidated.db” on an iPhone 4s. This database contains timestamps and cached longitude/latitude values that are typically used to help improve mapping/GPS performance.

This database, obtained from an iOS device, also illustrates the use of the 2001 reference date. For example, consider the timestamp 398139235.060831 shown in Figure 4. Adding 978307200 (the number of seconds between Jan 1, 1970 and Jan 1, 2001) to this value and using appropriate software tools, we can obtain a date for the event recorded in the “consolidated.db” database.*

For instance, we find that this timestamp refers to 2:13:55 AM Aug 14, 2013 UTC. Thus, this phone was ostensibly near the Hong Kong airport (latitude ~22.3, longitude ~113.9) at 2:13:55AM Aug 14, 2013 UTC, which is 10:13:55AM on Aug 14th, local Hong Kong time.** Similar time-stamped databases of GPS data also can be found in Android devices. 
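The conversion described above can be carried out directly with the article's own `time.gmtime()` approach, using the timestamp from Figure 4:

```python
import time

# Seconds from Jan 1, 1970 (Unix epoch) to Jan 1, 2001 (the iOS
# reference date), per the text above.
IOS_REFERENCE_OFFSET = 978307200

def ios_timestamp_to_utc(ts):
    """Convert a timestamp counted in seconds since the 2001
    reference date to a human-readable UTC struct_time."""
    return time.gmtime(ts + IOS_REFERENCE_OFFSET)

# The timestamp from the consolidated.db example:
t = ios_timestamp_to_utc(398139235.060831)
print(time.strftime("%Y-%m-%d %H:%M:%S UTC", t))  # 2013-08-14 02:13:55 UTC
```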

Figure 4. Some of the data available in the file "consolidated.db" found on an iPhone 4s. 

Summary and Conclusions 

This article presents a brief overview of digital time and some of the complexities involved in sifting through the digital timestamps present in various digital systems. The wide variety of digital systems that interoperate in today’s complex world often leaves behind a trail of digital timestamps that can be of great value in reconstructing a sequence of events. However, an understanding of what digital timestamps mean and how they are generated is important to derive the benefit of timestamp information.

Stated simply, digital timestamps measure the number of units of time elapsed since an arbitrary point in time. Knowledge of which system created this timestamp, and in what format, is essential to understanding a sequence of events. Moreover, the computational systems that create digital timestamps can differ or drift from real-world time. Furthermore, human intervention can potentially manipulate these timestamps. Thus, a careful forensic evaluation typically involves validating the system time against known events in the real world.


* As one example, we could use the Python programming language command “time.gmtime()” to obtain a human-readable time corresponding to a number. 

** We know this to be true, because Dr. Sorini, who had the phone with the consolidated.db, was at the Hong Kong airport at this time.



1. desktop/ms724290(v=vs.85).aspx. 

2. See, e.g., 

3. If network time synchronization is enabled, the computer will eventually reset to the “correct” time in the real world. 

4. Carrier, Brian. File System Forensic Analysis. AddisonWesley Professional, 2005. 


6. desktop/ms724290(v=vs.85).aspx 


8. cc773263%28v=ws.10%29.aspx. 

9. Timestamp.html#Timestamp(long). 

10. Cocoa/Reference/Foundation/Classes/NSDate_Class/ index.html#//apple_ref/doc/constant_group/ NSTimeIntervalSince1970.