Thursday, December 6, 2018

What was my IP? Ask DoSvc on Windows 10

   I recently watched the recording of the interesting talk Windows Forensics: Event Trace Logs that Nicole Ibrahim gave at the SANS DFIR Summit 2018. I then used the tool ETLParser to dump the contents of all the ETL files stored on my own workstation. While glancing through the giant CSV output file for anything of interest, I happened to notice the string "ExternalIpAddress". As you can see, the string is followed by an IP address.

That IP address is my current public IP address. Why is it there?

The string "ExternalIpAddress" is located next to other interesting strings like "GEO: response", "CountryCode" and a precious timestamp. If this is a geolocation response, what triggered it? Since "ExternalIpAddress" appears several times in the log files, how many geolocation requests have been made so far, and why?

DoSvc - Delivery Optimization
   All the hits were found in logs whose names begin with "dosvc". I searched on Google and found out that "dosvc" stands for Delivery Optimization, which is the update delivery service for Windows 10 clients. In the online documentation Optimize Windows 10 update delivery, Microsoft explains what this service does:

Delivery Optimization is a new peer-to-peer distribution method in Windows 10. Windows 10 clients can source content from other devices on their local network that have already downloaded the updates or from peers over the internet.

Depending on the version of Windows 10, the various Event Trace Log (ETL) files created by the Delivery Optimization service (DoSvc) are stored here:

  • Win10 (1507)
    Default path: C:\Windows\Logs\dosvc
    Filename: dosvc.\d*.\d.etl (e.g. dosvc.1377765.1.etl)
  • Win10 (1709/1803)
    Default path: C:\Windows\ServiceProfiles\NetworkService\AppData\Local\Microsoft\Windows\DeliveryOptimization\Logs
    Filename: dosvc.yyyyMMdd_HHmmss_\d*.etl (e.g. dosvc.20181111_180339_399.etl)

On my computer running Win10 (1803), which is always on and connected to the internet 24/7, the default DoSvc log path gains about 140 log files each day. Based on what I've observed, Win10 (1803) daily removes the ETL files older than 57/58 days. In such a scenario, there is a good chance of extracting several public/external IP addresses from the logs.
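For reference, the two naming conventions listed in the table above can be told apart with a couple of regular expressions. This is just a hypothetical helper sketched in Python, not part of any tool mentioned here:

```python
import re

# Filename conventions from the table above:
#   Win10 1507:      dosvc.<digits>.<digit>.etl         e.g. dosvc.1377765.1.etl
#   Win10 1709/1803: dosvc.yyyyMMdd_HHmmss_<digits>.etl e.g. dosvc.20181111_180339_399.etl
PATTERNS = {
    "1507": re.compile(r"^dosvc\.\d+\.\d\.etl$"),
    "1709/1803": re.compile(r"^dosvc\.\d{8}_\d{6}_\d+\.etl$"),
}

def dosvc_log_version(filename):
    """Return the Win10 version family a DoSvc log filename matches, or None."""
    for version, pattern in PATTERNS.items():
        if pattern.match(filename):
            return version
    return None

print(dosvc_log_version("dosvc.20181111_180339_399.etl"))  # → 1709/1803
```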

Even though the log files point in the direction of "Delivery Optimization", I think something else might be responsible for the geolocation calls: on my computer, the "Delivery Optimization" service is off, and even the "Location" service is turned off.

(Delivery Optimization)

How to parse the logs
   After some trial and error trying to figure out the best way to extract the data I needed from the CSV output file created by ETLParser, I found out that Win10 has a built-in PowerShell cmdlet named "Get-DeliveryOptimizationLog". This cmdlet retrieves decoded logs for Delivery Optimization. If no parameter is given, the cmdlet parses the default DoSvc path; the parameter "-Path" is required to parse other locations:

Get-DeliveryOptimizationLog -Path C:\CustomPath\*

Here I used the cmdlet to search for the keyword "ExternalIpAddress":

PS> Get-DeliveryOptimizationLog | Where-Object Message -Like "*ExternalIpAddress*"

The output was:
(27/Nov/2018 11:05:47)

I also spotted the IP address I was assigned when using CyberGhost VPN.

(24/Nov/2018 23:26:39)

What triggered the geolocation requests? Using the two examples shown above, I searched for "ProcessId" 13104 and 36192 and noticed that some events contain the message: "Create job name = WU Client Download".

TimeCreated : 27/11/2018 11:05:47
ProcessId   : 13104
ThreadId    : 9952
Level       : 4
LevelName   : Info
Message     : Create job name = WU Client Download, jobId = 4d66d186-68e2-4bfc-8d74-f40de415fc20, type = 0. hr = 0
Function    : CDeliveryOptimizationManager::CreateJob
LineNumber  : 495

TimeCreated : 24/11/2018 23:26:43
ProcessId   : 36192
ThreadId    : 33560
Level       : 4
LevelName   : Info
Message     : Create job name = WU Client Download, jobId = bc698001-4916-4c93-b513-cdcfe325ae9d, type = 0. hr = 0
Function    : CDeliveryOptimizationManager::CreateJob
LineNumber  : 495
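With the records parsed into dictionaries (keys as in the events above), pairing the geolocation responses with the job-creation events of the same process takes only a few lines. This is a minimal sketch with made-up messages, not the actual parsing done by the script discussed later:

```python
from collections import defaultdict

def correlate_by_pid(records):
    """Group log records by ProcessId and keep only the processes that
    produced both a geolocation response and a 'Create job' event."""
    by_pid = defaultdict(list)
    for rec in records:
        by_pid[rec["ProcessId"]].append(rec)
    return {
        pid: recs for pid, recs in by_pid.items()
        if any("ExternalIpAddress" in r["Message"] for r in recs)
        and any("Create job" in r["Message"] for r in recs)
    }

# Minimal sample shaped like the events shown above (messages invented)
records = [
    {"ProcessId": 13104, "Message": "GEO: response ExternalIpAddress=..."},
    {"ProcessId": 13104, "Message": "Create job name = WU Client Download, ..."},
    {"ProcessId": 99999, "Message": "unrelated"},
]
print(sorted(correlate_by_pid(records)))  # → [13104]
```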

What was downloaded and probably installed by the "Windows Update" (WU) client?

PS> Get-WuaHistory | Format-Table

Get-WuaHistory is a third-party cmdlet.

This seems to be the answer:

Result    Date                Title                                                                                                       
------    ----                -----                                                                                                       
Succeeded 27/11/2018 11:04:59 Definition Update for Windows Defender Antivirus - KB2267602 (Definition 1.281.899.0)                       
Succeeded 27/11/2018 11:04:59 Definition Update for Windows Defender Antivirus - KB2267602 (Definition 1.281.899.0)                       

Succeeded 24/11/2018 23:25:54 Definition Update for Windows Defender Antivirus - KB2267602 (Definition 1.281.756.0)                       
Succeeded 24/11/2018 23:25:54 Definition Update for Windows Defender Antivirus - KB2267602 (Definition 1.281.756.0) 

I see from my Windows Update history that Windows Defender Antivirus is updated on a daily basis. Based on the log files, it seems that "WU Client" makes a geolocation call before downloading any available update. That could explain why I have at least one geolocation response per day in the logs. Additionally, the creation time (TimeCreated) of each "GEO: response" event message always matches or is very close to the installation date of each antivirus definition update. 

A hive file named "dosvcState.dat" is also involved in the process, but I haven't had the time yet to check what it contains. The hive can be found here:


   I wrote a PowerShell script that adapts the output of the cmdlet mentioned above to my needs. The script adds to the output the name of the log file from which each record was extracted and presents the contents of the "Message" object in a more readable way. The script generates two output files, in both CSV and JSON format:
  • <timestamp>_dosvc_ExtIpAddress: contains the extraction of each IP found in the ETL files;
  • <timestamp>_dosvc_ip2location: contains additional details about each unique IP found, like the Internet service provider, latitude and longitude. The script uses an external API.
Filenames are prepended with the timestamp of when the script was executed. To use the script, just provide the path containing the ETL files to parse:

PS> .\Get-DoSvcExternalIP.ps1 C:\LogPath

From the logs of the computer I mentioned above (Win10 - 1803), I managed to extract 124 IP addresses whose dates range from October 10th 2018 to yesterday.
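The JSON output lends itself to further post-processing. As a quick sketch, the snippet below deduplicates the extracted addresses while keeping the first time each one was seen; the field names match the script's output, but the sample values are made-up documentation addresses:

```python
import json

def unique_ips(json_text):
    """Map each unique ExternalIpAddress to the TimeCreated of its
    first occurrence, preserving first-seen order."""
    seen = {}
    for entry in json.loads(json_text):
        ip = entry.get("ExternalIpAddress")
        if ip and ip not in seen:
            seen[ip] = entry.get("TimeCreated")
    return seen

# Invented sample in the shape of *_dosvc_ExtIpAddress.json
sample = json.dumps([
    {"ExternalIpAddress": "203.0.113.7", "TimeCreated": "27/11/2018 11:05:47"},
    {"ExternalIpAddress": "203.0.113.7", "TimeCreated": "28/11/2018 09:12:01"},
    {"ExternalIpAddress": "198.51.100.4", "TimeCreated": "24/11/2018 23:26:39"},
])
print(len(unique_ips(sample)))  # → 2
```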

I also used the script against the DoSvc ETL files that I extracted from a laptop (Win10 - 1709) that I last used in May in Las Vegas during the Magnet User Summit 2018. I connected to the hotel's Wi-Fi network a couple of times for short moments, but that was long enough for Win10.
This is an example of what I could extract from the logs:

PS> (Get-Content 20181205_143620_dosvc_ExtIpAddress.json | ConvertFrom-Json) | Where-Object ExternalIpAddress -eq ""

LogName                  : C:\TEMP\ETL_Laptop2\dosvc.20180522_170803_773.etl
TimeCreated              : 22/05/2018 17:09:04
ExternalIpAddress        :
CountryCode              : US
ProcessId                : 7840
ThreadId                 : 776
Level                    : 4
LevelName                : Info
KeyValue_EndpointFullUri :
Version                  : <omissis>
Function                 : CGeoInfoProvider::RefreshConfigs
LineNumber               : 58

These are the additional details provided by the script by using an external API:

C:\> type 20181205_143620_dosvc_ip2location.json | jq

    "as": "ASxxxxx Cox Communications Inc.",
    "city": "Las Vegas",
    "country": "United States",
    "countryCode": "US",
    "isp": "Cox Communications Inc",
    "lat": xx.x892,
    "lon": -xxx.x63,
    "query": "",
    "region": "NV",
    "regionName": "Nevada",
    "status": "success",
    "timezone": "America/Los_Angeles",
    "zip": "89106"
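Since the ip2location records are plain JSON, summarizing them takes only a few lines. The sketch below assumes the field names shown in the excerpt above; the sample values are placeholders:

```python
import json

def summarize(records):
    """One line per successful lookup: ip -> city, region (ISP)."""
    lines = []
    for rec in records:
        if rec.get("status") == "success":
            lines.append(f'{rec["query"]} -> {rec["city"]}, {rec["regionName"]} ({rec["isp"]})')
    return lines

# Placeholder record shaped like the ip2location output above
sample = json.loads("""[
  {"query": "203.0.113.7", "status": "success", "city": "Las Vegas",
   "regionName": "Nevada", "isp": "ExampleISP"}
]""")
print(summarize(sample)[0])  # → 203.0.113.7 -> Las Vegas, Nevada (ExampleISP)
```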

I hope you find this long blog post useful!

You can download the script from my GitHub repository here.

[UPDATE March 31, 2019]:
the peer-reviewed version of this article can be read at DFIR Review.

[UPDATE April 5, 2019]:
I improved the script and created a "Get-DoSvcExternalIP" module for KAPE.

Wednesday, October 3, 2018

Calllog.db and SMS data on Android 7.0 Nougat

A few weeks ago, Jamie McQuaid at Magnet Forensics wrote an interesting article titled Android Messaging Forensics – SMS/MMS and Beyond. The article is a great overview of the different databases that Android uses to store SMS/MMS data.

Based on my recent findings, I think that another database should be mentioned: calllog.db. During the analysis of two Samsung smartphones (SM-G935F and SM-G930F) running Android 7.0, I've found out that calllog.db also contains SMS data.

I was able to find and analyze this database because I had created a physical dump of the two non-rooted smartphones using UFED 4PC. I haven't verified whether the file can be obtained with other methods like an ADB backup.

This sqlite database can be found here:


The table calls within the database has a column named m_content. If the length of this field is greater than 0, the record contains a text message (if zero, it's a phone call log).

At a quick glance, the most relevant fields in the table are:
  • number: sender's or recipient's phone number;
  • date: message date;
  • type: its value indicates if it's a sent or received message; 
  • name: contact name associated with the phone number;
  • last_modified: message date (it usually matches the date of the field date);
  • m_content: first 50 characters of SMS message body.

This is a quick query to view the contents of these fields:

SELECT number, datetime(date/1000,'unixepoch','utc') AS date, type, name, datetime(last_modified/1000,'unixepoch','utc') AS last_modified, m_content FROM calls WHERE m_content <> ''

If I run the query SELECT DISTINCT type FROM calls WHERE m_content <> '', I can see that the field type has value "1" or "2". After comparing with other messages, I deduced that "1" means "received" and "2" means "sent".
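The query and the deduced type mapping can be checked against a mock calls table using Python's sqlite3 module. The schema is reduced to the fields listed above, the sample rows are invented, and the 'utc' modifier is omitted here since the unixepoch conversion already yields UTC:

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE calls
    (number TEXT, date INTEGER, type INTEGER, name TEXT,
     last_modified INTEGER, m_content TEXT)""")

TYPE_LABELS = {1: "received", 2: "sent"}  # mapping deduced above

# Dates are stored in milliseconds, hence the /1000 in the query
ms = int(datetime(2018, 9, 1, 12, 0, 0, tzinfo=timezone.utc).timestamp() * 1000)
conn.executemany("INSERT INTO calls VALUES (?,?,?,?,?,?)", [
    ("+1555000001", ms, 1, "Alice", ms, "Hello, this is a text"),
    ("+1555000002", ms, 2, "Bob", ms, ""),  # empty m_content: a plain call log
])

rows = conn.execute("""
    SELECT number, datetime(date/1000,'unixepoch') AS date, type, name,
           datetime(last_modified/1000,'unixepoch') AS last_modified, m_content
    FROM calls WHERE m_content <> ''""").fetchall()

for number, date, type_, name, _, body in rows:
    print(number, date, TYPE_LABELS[type_], name, body)
```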

As of writing, by default the tools UFED Physical Analyzer and Magnet AXIOM only extract call logs from calllog.db.

This is how I adapted the tools to my needs.

UFED Physical Analyzer

By using the internal tool SQLite Wizard within the Physical Analyzer, I created a query to do the parsing:

Select calls.number, calls.date, calls.type, calls.name, calls.last_modified, calls.m_content
From calls
Where Length(calls.m_content) > 0

I then mapped the fields by drag&drop and customized the conditions of the field type.

I ran the query and successfully retrieved 500 additional SMS records that I added to my report.

The table shows the big difference in the number of SMS found before and after parsing the file calllog.db.



Some messages were duplicates, but many others were not. This is an example sorted by message body:

While creating the query, I tried with and without the "include deleted rows" option. In the end I decided to keep it unticked since I was getting too many false positives.

Anyway you can download both query versions from here.

How to import and run the query: Physical Analyzer | Tools | SQLite Wizard | Open SQLite query manager | Import | select the query file to use | Run.

Magnet Axiom

When creating a new case with Magnet AXIOM Process, I recommend using the Dynamic App Finder ("Find more artifacts" turned ON). It's a useful feature that discovers additional databases that may contain relevant data. Once the search is complete, a window pops up asking you to select any of the found databases and to map the needed fields.

When done, click on "Save selected artifacts". Magnet AXIOM Examine will show these custom artifacts under the category "Custom".

My custom artifact can be downloaded from here. Just import it from the menu in Magnet AXIOM Process (Tools | Manage custom artifacts | Add new custom artifact) or simply copy the file to the path "C:\Program Files\Magnet Forensics\Magnet AXIOM\AXIOM Process\plugins".

Thursday, September 27, 2018

Converting from .heic to .jpg

A file with a .heic extension is an image using the new High Efficiency Image File Format (HEIF).

If somehow you've never heard about it, here are some sample files:

At the time of this writing, not all forensic tools fully support it. For instance, among the tools I have:

  • X-Ways Forensics (v19.7 SR-2) does no picture preview or metadata extraction; it has, however, included a file carving algorithm for .heic images since version 19.5 Preview 2;
  • Magnet AXIOM (v2.5.1.11408) is able to preview the images but doesn't (for now) extract any EXIF data from the HEIF format;
  • Cellebrite Physical Analyzer (v7.9.0.223) stands out from the crowd. If you have a set of images, you can easily view them and parse their metadata by simply choosing: File | Open (advanced) | Blank project | Folder | select the folder containing the pictures to analyze | Finish | Start decoding

Depending on the situation, it could be useful to use external viewers like XnView / CopyTrans HEIC or to convert .heic images to .jpg to make them "compatible" with other tools.

What follows is my method for doing the conversion; it can also be used within a script.

Download and install ImageMagick (free - versions tested on Win10: v7.0.8-12 and v7.0.8-28-Q16-x64-dll). Then open the command prompt and type:

magick SrcFile.heic DstFile.jpg

This will create a JPEG version of the HEIC file, but for some reason the EXIF metadata of the newly created file will be ignored and not parsed by many tools. After some trial and error, I managed to fix this issue with ExifTool (v11.10):

exiftool -overwrite_original -all= -TagsFromFile SrcFile.heic DstFile.jpg

The meaning of each option is the following:

-overwrite_original Overwrite the destination file without creating any backup copy of the destination file
-all= Strip off all metadata from the destination file
-TagsFromFile Copy all metadata tags from the SourceFile into the DestinationFile
SrcFile.heic Source file
DstFile.jpg Destination file

If you need to set the filesystem timestamps "last modified date" and "creation date" equal to the ones of the source .heic file, run ExifTool with these options:

exiftool -overwrite_original -TagsFromFile SrcFile.heic -FileModifyDate -FileCreateDate DstFile.jpg
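If you have a whole folder of .heic files, the commands above lend themselves to a small batch wrapper. The sketch below (a hypothetical helper, not part of either tool) only builds the command lines, so they can be reviewed before being executed with e.g. subprocess.run:

```python
from pathlib import Path

def build_conversion_commands(src_dir):
    """For every .heic file in src_dir, build the three command lines
    described above: magick conversion, exiftool metadata copy, and
    exiftool filesystem-timestamp copy."""
    commands = []
    for heic in sorted(Path(src_dir).glob("*.heic")):
        jpg = heic.with_suffix(".jpg")
        commands.append(["magick", str(heic), str(jpg)])
        commands.append(["exiftool", "-overwrite_original", "-all=",
                         "-TagsFromFile", str(heic), str(jpg)])
        commands.append(["exiftool", "-overwrite_original",
                         "-TagsFromFile", str(heic),
                         "-FileModifyDate", "-FileCreateDate", str(jpg)])
    return commands

# To actually run them:
#   import subprocess
#   for cmd in build_conversion_commands("C:/pictures"):
#       subprocess.run(cmd, check=True)
```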

For more details, you may check the ExifTool Documentation.

Additional resources on the HEIF format:

[UPDATE February 20, 2019]: Thanks to Phill Moore for letting me know that the metadata step described in the article is no longer needed when using the latest version of ImageMagick. Metadata is now properly added to the converted files.

Sunday, June 10, 2018

UsrClass.dat stores more history than you think

This is a quick post about two new plugins I wrote for RegRipper that will pull the following artifacts from a Windows 10 UsrClass.dat hive:

  • Microsoft Edge web history (plugin msedge_win10);
  • Microsoft Photos recent file history (plugin photos_win10).

The plugins will parse the following keys:

Microsoft Edge
  • Local Settings \ Software \ Microsoft \ Windows \ CurrentVersion \ AppContainer \ Storage \ microsoft.microsoftedge_8wekyb3d8bbwe \ MicrosoftEdge \ TypedURLs
  • Local Settings \ Software \ Microsoft \ Windows \ CurrentVersion \ AppContainer \ Storage \ microsoft.microsoftedge_8wekyb3d8bbwe \ MicrosoftEdge \ TypedURLsTime
  • Local Settings \ Software \ Microsoft \ Windows \ CurrentVersion \ AppContainer \ Storage \ microsoft.microsoftedge_8wekyb3d8bbwe \ MicrosoftEdge \ TypedURLsVisitCount

Microsoft Photos
  • Local Settings \ Software \ Microsoft \ Windows \ CurrentVersion \ AppModel \ SystemAppData \ Microsoft.Windows.Photos_8wekyb3d8bbwe

Here are some output examples:

msedge_win10 v.20180610
(USRCLASS.DAT) Get values from the user's Microsoft Edge Windows App key

|-- \Local Settings\Software\Microsoft\Windows\CurrentVersion\AppContainer\Storage\microsoft.microsoftedge_8wekyb3d8bbwe
|----- \MicrosoftEdge\TypedURLs
|----- \MicrosoftEdge\TypedURLsTime
|----- \MicrosoftEdge\TypedURLsVisitCount

url1 (TypedURLs)           ->
url1 (TypedURLsTime)       -> Tue Jan  2 17:19:53 2018 (UTC)
url1 (TypedURLsVisitCount) -> 4
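The TypedURLsTime values decoded above are stored as 64-bit FILETIME timestamps (100-nanosecond ticks since 1601-01-01 UTC, the usual format for this kind of key). A minimal converter, with a made-up sample value:

```python
from datetime import datetime, timedelta, timezone

EPOCH_1601 = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(filetime):
    """Convert a 64-bit FILETIME (100-ns ticks since 1601-01-01 UTC)."""
    return EPOCH_1601 + timedelta(microseconds=filetime // 10)

print(filetime_to_datetime(131593871930000000))  # → 2018-01-02 17:19:53+00:00
```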

photos_win10 v.20180610
(USRCLASS.DAT) Get values from the user's Microsoft Photos Windows App key

Local Settings\Software\Microsoft\Windows\CurrentVersion\AppModel\SystemAppData\Microsoft.Windows.Photos_8wekyb3d8bbwe\Schemas
  PackageFullName => Microsoft.Windows.Photos_2017.37071.16410.0_x64__8wekyb3d8bbwe

Local Settings\Software\Microsoft\Windows\CurrentVersion\AppModel\SystemAppData\Microsoft.Windows.Photos_8wekyb3d8bbwe\PersistedStorageItemTable\ManagedByApp

   KeyLastWrite   : Sat Jun  9 14:50:21 2018 (UTC)
   LastUpdatedTime: Sat Jun  9 14:06:02 2018 (UTC)
   Metadata       : StartFileC:\Users\username\Desktop\3rd.jpg

## Microsoft Photos (Windows App): Recent Files ## (Tab-separated values)

StartFileC:\Users\username\Desktop\3rd.jpg       KeyLastWrite: Sat Jun  9 14:50:21 2018 (UTC)

The tests were done with registry hives exported from computers running Windows 10 version 1511 and 1709.

The scripts are available for download here on my GitHub page.

Let me know if you find any other interesting app storing history activity within this registry hive. There's more than just shellbags inside UsrClass.dat!