Symmetric IT's Tech Blog

Life's too short for complicated solutions.

Recover Missing or Lost iCloud Contacts

We had a case recently where iPhone contacts synced to an iCloud account suddenly disappeared. The cause could not be determined during troubleshooting - whether through a glitch or user interaction, some of the contacts were lost.

Logging into the web-based side of the iCloud account confirmed the same - the contacts were missing on the web interface as well. Unfortunately there was no local backup of the contacts.

Thankfully Apple provides a way to recover your missing contacts:

1. Log in to iCloud.com and go to Settings:

2. Look for "Restore Contacts" towards the bottom under Advanced:

3. Now pick a contact 'set' that you would like to restore. Apple will replace all current contacts, and they should then sync down to your phone after the restore.

In most cases you should have a history of contacts going back around a month.

Hopefully this will help if you have misplaced your contacts.

Notes on Secure Remote Access in an Age of Ransomware

Remote access to your servers and other PCs is essential for effective support. The problem is implementing it in a secure manner that gives you access while keeping everyone else out. Ransomware has abused remote desktop and internet-exposed ports as an attack vector, not just the traditional avenues such as phishing and malicious websites.

In 2017 WannaCry used publicly exposed SMB ports as its primary attack vector rather than a coordinated email campaign. Remote desktop is also often compromised via user accounts with weak passwords: the attacker guesses a common username and brute-forces password options against the exposed RDP connection.

Let's look at remote desktop and Teamviewer as remote access tools and how to use them securely:


Remote Desktop (RDP)

RDP is still a great remote access tool but needs a few layers of security to make it safe:

  • Ideally the RDP endpoint should not be directly exposed to the internet. RDP should be accessed via a VPN. This reduces the endpoint's attack surface considerably.
  • The RDP default port (3389) should be changed. This can (and should) be done in two ways: first, on the firewall - have it redirect a non-default external port to the endpoint; second, if the endpoint is a Windows machine, change the RDP listening port on the machine itself. Yes, security by obscurity is not security - it simply cuts out the noise and adds another layer of 'security'. Changing the internal listening port also adds obscurity on the LAN side.
  • Monitor the Security event log for brute force attempts on your RDP endpoint. Event ID 4625 records failed login attempts. If you have a publicly accessible RDP endpoint on the default port (3389) you will almost certainly see a significant number of failed login attempts in your logs. You'll also notice attackers probing different usernames such as 'guest', 'user' etc. Changing the listening port instantly cuts out almost all of this 'noise' from the internet.
  • Audit every single user that has RDP access. Quite often it is not the administrator account that is compromised - it's usually a secondary account with a weak password. This has been the case in every ransomware attack on servers that we have encountered.
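The event-log monitoring above can be scripted. Below is a minimal Python sketch that tallies account names from a text export of failed-logon (Event ID 4625) records; the `wevtutil` export command in the comment and the sample lines are illustrative, and real 4625 records contain more fields than shown here.

```python
from collections import Counter

# Tally account names from failed-logon (Event ID 4625) records.
# Assumes a text export of the Security log, e.g. produced with:
#   wevtutil qe Security /q:"*[System[(EventID=4625)]]" /f:text > failed.txt
def tally_failed_logons(log_text):
    counts = Counter()
    for line in log_text.splitlines():
        line = line.strip()
        if line.startswith("Account Name:"):
            name = line.split(":", 1)[1].strip()
            if name and name != "-":   # 4625 also lists a "-" subject account
                counts[name] += 1
    return counts

# Illustrative sample of an exported log:
sample = """
Account Name:\t-
Account Name:\tguest
Account Name:\tuser
Account Name:\tguest
"""
print(tally_failed_logons(sample).most_common())  # → [('guest', 2), ('user', 1)]
```

A sudden spike in counts for generic names like 'guest' or 'user' is the brute-force signature described above.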


Teamviewer

A great tool for remote access that can be used in a variety of ways. Teamviewer has sharpened up its security after a past breach, which is why we are including it as an option: it now offers two-factor authentication as well as more control over which devices are allowed to access your Teamviewer account (all new devices must first be verified).

  1. Teamviewer Host is a great tool for unattended access. First and foremost, set a strong password for access. Secondly, two-factor authentication should also be enabled.
  2. Teamviewer VPN combined with Remote Desktop is also a good combination. Use Teamviewer to establish a VPN to your endpoint. From there you can use traditional Remote Desktop to connect to the endpoint.

Teamviewer handles brute force attacks quite well, preventing repeated password attempts with timeouts, but there are more options to configure Teamviewer to be more secure:

  • Disable the random one-time-use password
  • Set up an access whitelist that only allows your own devices
  • Set the remote computer to lock when the session finishes (Options -> Advanced -> Lock Remote Computer = Always)

From a security perspective Teamviewer appears to be quite safe, but one always has to consider future security holes that could bypass all of these layers and give an attacker access to you or your endpoints.

In summary, the fewer ports you expose to the internet, the better. Also consider that no matter how many layers of security you implement, software exploits always remain a possibility and can potentially render all of them moot.

Hopefully this gives you some ideas to consider when implementing a remote access strategy for your servers or PCs.

Do you use a different strategy? Been bitten once and now have experience? Let us know in the comments.

Symmetric IT provides IT and Computer Support in Auckland.

Investigating Dropbox Delta Incremental Syncing

We will be investigating Dropbox's delta incremental syncing feature. This is the feature that uploads only the changed portions of a big file when it has been updated, which is very efficient when working with large files that need to be synced to your cloud provider.

Delta sync advantages:

  • Saves network bandwidth
  • Saves upload time
  • Saves storage space
  • Creates a file history for recovery

A possible disadvantage is the computational overhead of computing the deltas. The advantages mostly outweigh this, though - I/O is generally more expensive than CPU cycles.
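To illustrate why delta sync saves bandwidth, here is a toy fixed-block comparison in Python. This is not Dropbox's actual algorithm - just a sketch of the idea that only blocks whose hashes differ need re-uploading. The 4 MB block size and the zero-filled stand-in file are illustrative assumptions:

```python
import hashlib

BLOCK = 4 * 1024 * 1024  # illustrative block size (4 MB)

def block_hashes(data):
    # Hash each fixed-size block of the file.
    return [hashlib.sha256(data[i:i + BLOCK]).hexdigest()
            for i in range(0, len(data), BLOCK)]

def changed_blocks(old, new):
    # Only blocks whose hashes differ would need re-uploading.
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [i for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]]

old = bytes(50 * 1024 * 1024)           # 50 MB of zeros stands in for a big file
new = bytearray(old)
new[100] ^= 0xFF                        # flip one byte, as in a hex-editor edit
print(changed_blocks(old, bytes(new)))  # → [0]: only the first block differs
```

A one-byte edit dirties a single 4 MB block out of thirteen, so roughly 8% of the file would be re-uploaded instead of 100% - the CPU cost of hashing is the trade-off mentioned above.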

We will look for the threshold at which Dropbox initiates delta sync instead of uploading the entire file. For comparison we will also look at Google Drive and its mechanisms for handling and syncing small changes inside large files.


TEST 1 (50MB)

We generated a 50MB file filled with random data using an online random-file generator. We then dropped the file in our Dropbox folder and timed the upload until the tray icon showed all files up to date:

Upload speed is approximately 8MBit/s. The 50MB file uploaded in approximately 60 seconds.

Next we made a small internal change by altering 2 bytes inside the file using a hex editor. We then measured the time it took for Dropbox to equalize after saving. Time to equalize: 8 seconds.

It's clear that Dropbox already performs delta syncing at the 50MB level, otherwise it would have taken another 60 seconds to upload the change.
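If you want to repeat this test without a hex editor, the 2-byte change can be scripted. A small Python sketch - the path, file size and offset are illustrative; point it at a file inside your sync folder:

```python
import os

path = "testfile.bin"  # illustrative; use a file in your Dropbox folder

# Create a sample file for demonstration (skip this if testing a real synced file).
if not os.path.exists(path):
    with open(path, "wb") as f:
        f.write(os.urandom(4096))

# Overwrite 2 bytes in place; the file size does not change, so the
# sync client sees an internal modification rather than a new file.
with open(path, "r+b") as f:
    f.seek(1000)          # arbitrary offset inside the file
    f.write(b"\x00\x01")  # the 2-byte change

print(os.path.getsize(path))  # size is unchanged by the in-place write
```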

TEST 2 (25MB)

Next we halved the file size to 25MB. The upload took 31 seconds, as expected. Changing 2 bytes inside the file and saving took 6 seconds to upload, so once again we can safely say delta syncing is happening at the 25MB level.

TEST 3 (10MB)

Next we uploaded a 10MB file. The full upload took 17 seconds - still long enough to discern between full and delta uploads. The changed-file sync took 6.5 seconds, so delta syncing still happens at the 10MB level.

TEST 4 (5MB)

5MB took 10.5 seconds to upload in full. The changed upload took 6 seconds. Delta sync still active at 5MB level.

TEST 5 (2MB)

For the 2MB test we limited Dropbox to 100KB/s upload to get a more accurate distinction between a full upload and a changed upload. As expected the full upload took longer (25 seconds). The changed upload after altering 2 bytes took 5 seconds, so delta sync is still active.

TEST 6 (500KB)

500KB took 8.5 seconds to upload and the changed upload took 3.6 seconds.



Google Drive

Next we compare with Google Drive, whose desktop client is now called "Backup and Sync from Google". Dropping the 25MB file into the Google Drive folder took 38 seconds to sync. Changing 2 bytes and saving the file took another 39 seconds to finish syncing.


The 10MB file took 23 seconds to upload, and the changed version took 23 seconds as well.

The rest of the tests all exhibited the same pattern - full uploads and changed uploads taking equal time - so we exclude them for the sake of brevity.

To summarize:

  File size              Dropbox (full / changed)    Google Drive (full / changed)
  50MB                   60s / 8s                    -
  25MB                   31s / 6s                    38s / 39s
  10MB                   17s / 6.5s                  23s / 23s
  5MB                    10.5s / 6s                  equal (full re-upload)
  2MB (limit 100KB/s)    25s / 5s                    equal (full re-upload)
  500KB                  8.5s / 3.6s                 equal (full re-upload)

Compressed Files Insight and Disproving Filecloud Claims

A Google search for 'delta sync' surfaces a blog post claiming that delta sync is a myth and hyperbole. It also claims that file compression renders it useless and that the only useful scenario is uncompressed files such as logs.

Let's run a test on a 30MB zip file containing random, non-repeating data:

(Curiously, the non-repeating data means that the file is larger after compression).
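You can confirm this behaviour yourself: DEFLATE stores incompressible input verbatim plus per-block framing overhead, so "compressing" random data actually grows it slightly. A quick Python check (the 1MB size is illustrative):

```python
import os
import zlib

data = os.urandom(1024 * 1024)   # 1 MB of random, non-repeating bytes
packed = zlib.compress(data, 9)  # maximum compression effort

# DEFLATE falls back to stored (uncompressed) blocks for incompressible
# input, so the output carries the original bytes plus framing overhead.
print(len(packed) - len(data))   # small positive number: the file grew
```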


The full upload took 32 seconds.
Next we added a 1KB text file to the zip file and measured the time to sync: 4 seconds.
Then we changed some random internal bytes using a hex editor and measured the sync time: 7 seconds.

This proves that delta sync can be implemented independently of file type or contents. The delta comparison most likely occurs at the byte level, making it completely file-type agnostic - the contents of the file simply don't matter. The advantages remain clear, disproving the above-mentioned blog.


Conclusion

Dropbox applies a delta sync algorithm to all files. There doesn't seem to be a threshold below which it simply uploads the entire file instead. Dropbox may upload full files at extremely small sizes (i.e. 5KB or so), but at that scale delta sync is irrelevant anyway.

Google Drive uploads the full file every time, regardless of size. This is inefficient and bandwidth intensive.

In extreme cases this becomes almost unusable. Imagine storing files of 500MB up to a few gigabytes: any change to those files prompts a full upload in services such as Google Drive, OneDrive etc.

Delta sync opens up possibilities such as keeping a virtual machine image inside a cloud provider folder while keeping it in sync both locally and in the cloud.

Hopefully this will help you make an informed decision when choosing a cloud storage provider.


Symmetric IT is an IT support provider in Auckland.

Macrium Image Consolidation & Synthetic Backups

We will be looking at Macrium's Synthetic Backup and Consolidation functions to find out how the settings impact actual real world backups.

Synthetic backups refer to creating a full backup without physically re-reading the source - the software artificially combines and consolidates existing incremental backups into a full backup. This reduces the incremental chain length and gives you a new baseline full image. Synthetic backups and consolidation have some advantages:

  • Saves time - synthetic backups are generally quicker to perform than an actual full backup*
  • Saves storage space - creating synthetic fulls reduces storage requirements (merging an incremental into a full will not grow the full by the exact size of the incremental)

*In most cases a synthetic full takes less time than a manual full; the I/O requirements of a manual backup generally exceed those of a synthetic backup. That said, we have observed cases where the synthetic merge takes almost as long as a manual full, so it's worthwhile to monitor merge times vs manual fulls to optimize your backup strategy.


For our first scenario we will use Macrium to create a full backup image followed by 5 incremental backups, showing the implications and considerations of a fixed-length incremental chain. The base image is taken from a server with 80GB of used disk space; the resulting full backup at medium compression came to 51GB.

The settings we used for the backup are shown in the screenshot below:

Notice that the number of incrementals to keep is set to 5. When the 6th backup runs, the following happens: incremental 2 is merged into incremental 1, forming a new incremental 1 (named 02-02 in Explorer), and a new 5th incremental joins the end of the chain. Running backup 7 merges incremental 3 into the newly formed incremental 1 (02-02), and once again a new incremental is added to the end.
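This rotation can be sketched as a toy simulation in Python - this is not Macrium's engine, just an illustration of the merge order described above, with the file names simplified:

```python
# Toy model of a fixed-length incremental chain: one full image plus
# at most KEEP incrementals. When a new incremental arrives, the two
# oldest incrementals merge into one, keeping the chain length fixed.
KEEP = 5

def run_backup(chain, name):
    chain.append(name)                   # new incremental joins the end
    if len(chain) - 1 > KEEP:            # chain[0] is the full image
        merged = chain[1] + "+" + chain[2]
        chain[1:3] = [merged]            # oldest two incrementals merge
    return chain

chain = ["full"] + [f"inc{i}" for i in range(1, 6)]
run_backup(chain, "inc6")
print(chain)  # ['full', 'inc1+inc2', 'inc3', 'inc4', 'inc5', 'inc6']
run_backup(chain, "inc7")
print(chain)  # ['full', 'inc1+inc2+inc3', 'inc4', 'inc5', 'inc6', 'inc7']
```

The growing first element mirrors how the first incremental file in the backup folder swells with each consolidation while the base image stays untouched.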

Visually, this is what happens, illustrating the 6th and 7th incremental backup:

Here is what the backup folder would look like after the 6th backup and first consolidation:

Notice how the first incremental 01-01 is gone, replaced by 02-02. The base image remains untouched. The first incremental (into which all others will merge) grows linearly according to each individual incremental's size.

This is an effective backup plan when you need a reference base image with a set incremental chain length corresponding to your recovery targets (how far back you need to be able to recover - in this case 5 days from the last backup, or the first day of the original backup). Aspects to take note of:

  1. The base image remains anchored in time, its creation date does not change and remains a reference point if you need to restore data from the original image.
  2. The first incremental grows linearly in proportion to the individual incremental sizes as they merge.
  3. At some stage the first incremental will exceed the size of the base image.

This is what the folder looks like after the 7th backup:

Note that the first incremental (03-03) has now grown in size in proportion to the 2 incrementals that merged to form it.

A situation to consider is when the merged incremental nears the size of the base image. This is less than ideal because dealing with such a large incremental during a full restore becomes cumbersome and increases restore time.

At this stage you can consider doing a Synthetic Full backup. You can enable this by ticking the "Create a Synthetic Full If possible" option:

In this case the first incremental will merge into the base image. The synthetic full took only a few minutes, much faster than performing a normal full.

Here is a visual representation:

This is what the folder looks like afterwards:

The large (400MB) first incremental from before is merged into the base image, freeing up disk space. The base image only grows slightly (by less than 400MB) because data inside the image is overwritten rather than appended.

As a result of the synthetic backup the base image date has now moved forward in time. Subsequent backups will keep moving the base image forward in time.

Differential Backups

Next we'll look at how differential backups affect the incremental queue.

We will keep the backup definition as is but add a differential at the end.

Next we run 5 incrementals. This merges all of the initial 5 incrementals into the base image and creates 5 new ones after the differential. The differential now sits directly after the base image:

Notable points:

  • The 5 new incrementals are now dependent on the differential
  • You will no longer be able to do a Synthetic Full - the differential can't merge into the base image
  • All subsequent incrementals will be merged and consolidated into each other but not the differential
  • Differentials act as 'checkpoints' that can't be moved unless deleted (note that the incremental chain that's dependent on the differential will also be deleted)

In summary, Macrium's synthetic backups and consolidation give you immense flexibility when designing your backup regime. Backup plans are often a balance between storage space and the required restore history, so this may help in optimizing your strategy.

Extra Reading :

Macrium has a standalone consolidation application that can be used to merge backups independently. This is useful when archiving backups or when managing backups away from the backup source.

Outlook Search Broken After Windows 10 1709 Update

This is an odd one, but at least it's an easy fix. It presents as Outlook instant search no longer working (i.e. Outlook manually searches through the mailbox folder instead). You can try rebuilding the index, changing the index location, creating a new Outlook profile, repair-installing Office and even a different Windows profile. All will fail.

The case I had was with Outlook 2007 on Windows 10 1709. More specifically, when checking "Tools" then "Options" then "Search Options" you notice that the box under "Index messages in these data files" is empty/blank.

Outlook does not find any PSTs to index or search. Upgrading to Office 2010 did resolve the problem, but quite often that's not possible, and it's not a satisfying solution either.

Adding the following registry key will sort it:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\Windows\Windows Search\Preferences]

Copy and paste the above into a text file, change the file extension to .reg, then right-click it and select Merge.
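If you prefer to script it, the same .reg file can be generated programmatically. A minimal Python sketch - the output filename is illustrative:

```python
# Write the registry fix above to a .reg file programmatically,
# instead of creating it by hand in a text editor.
reg_text = (
    "Windows Registry Editor Version 5.00\r\n"
    "\r\n"
    "[HKEY_LOCAL_MACHINE\\SOFTWARE\\WOW6432Node\\Microsoft"
    "\\Windows\\Windows Search\\Preferences]\r\n"
)

# newline="" preserves the CRLF line endings that .reg files expect.
with open("outlook-search-fix.reg", "w", newline="") as f:
    f.write(reg_text)
```

Double-clicking the resulting file (or right-click, Merge) applies the key as before.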