Cloud and Software Architecture, Soft skills, IOT and embedded
Why are there 4 different windows backup tools?
I do not understand how any mere mortal is supposed to know which backup or data-protection tool to pick, or how to balance the risks. The Windows backup/data protection story is a sorry mess. Windows 11 and 10 have at least 4 different ways to protect or back up your files. Who thought this was a good idea? Why isn't there a Microsoft-provided capabilities matrix? I understand that these are legacy tools, but someone should have at least pulled this stuff together and put a pretty cover page on it.
Some of these are traditional backup tools. Others are network drives or network drive mirrors where files in the mirror are deleted when you delete the file locally. This can leave you one action away from losing data depending on how the trash folder operates.
A backup is an off-drive copy of your data that can be recovered when your hard drive goes bad, a machine is damaged/stolen or you accidentally delete a file. Backups are usually saved on network shares, cloud drives, or removable media.
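The mirror-versus-backup distinction above can be sketched in a few lines of Python. This is a toy illustration only; all of the names and data structures here are hypothetical and have nothing to do with any real sync client.

```python
# Toy illustration: mirror semantics vs. backup semantics.
# All names and structures are hypothetical, for illustration only.

def sync_mirror(local: dict, mirror: dict) -> None:
    """Mirror semantics: the remote copy tracks the local folder
    exactly, so local deletions propagate on the next sync."""
    mirror.clear()
    mirror.update(local)

def run_backup(local: dict, backups: list) -> None:
    """Backup semantics: each run appends a recoverable snapshot;
    deleting a local file does not destroy older copies."""
    backups.append(dict(local))

local = {"report.docx": "v1", "photo.jpg": "v1"}
mirror: dict = {}
backups: list = []

sync_mirror(local, mirror)
run_backup(local, backups)

del local["report.docx"]        # accidental deletion

sync_mirror(local, mirror)      # the mirror loses the file too
run_backup(local, backups)      # the backup keeps the old snapshot

print("report.docx" in mirror)      # the file is gone from the mirror
print("report.docx" in backups[0])  # but recoverable from the first snapshot
```

The point of the sketch: with a mirror you are one sync away from losing a deleted file (modulo the trash can); with a true backup the older snapshot still holds it.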
Microsoft
This is just a summary of the available tools. I have no real details about what their strengths and weaknesses are relative to each other.
Windows Backup
This backs up files to the cloud. Multiple machines can be backed up to a single cloud account. You can recover files from the cloud.
I have no idea what the backup strategy is: full, incremental, rolling, single-shot, etc.
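To illustrate what two of those terms mean, here is a toy sketch of full versus incremental strategies. This is purely illustrative and says nothing about how Windows Backup actually works internally.

```python
# Hypothetical sketch of full vs. incremental backup strategies.
# Illustrates the terminology only, not any real tool's internals.

def full_backup(files: dict) -> dict:
    """A full backup copies every file, every run."""
    return dict(files)

def incremental_backup(files: dict, last_state: dict) -> dict:
    """An incremental backup copies only files added or changed
    since the previous run."""
    return {name: data for name, data in files.items()
            if last_state.get(name) != data}

files = {"a.txt": "v1", "b.txt": "v1"}
full1 = full_backup(files)              # copies a.txt and b.txt

files["a.txt"] = "v2"                   # one file changes
incr = incremental_backup(files, full1)

print(sorted(full1))   # both files in the full backup
print(sorted(incr))    # only the changed file in the incremental one
```

The trade-off: full backups are simple to restore but expensive to store; incrementals are cheap per run but a restore must replay the chain of changes.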
OneDrive
OneDrive can save your local files in the cloud. It is a mirror rather than a backup: the cloud copy reflects the files as of the last sync and is overwritten as part of the sync process. If you delete a file on your machine, it is deleted from the cloud. There is a recycle bin, but the data is gone after some time, I believe 30 days.
OneDrive also hijacks your Explorer links. Your Documents folder is moved from being local to being a replicated OneDrive directory with the same name. It is really hard to undo this replication and get your files back in their original directory. I'm happy to have someone prove me wrong.
File History
This is a file system-based tool. It can be used to keep versions and file history on an on-prem storage device like a drive, SD card, or NAS device.
I have no idea when to use this. The control panel interface tells me this is deprecated and on life support.
In the past, I have used this to back up to my local NAS.
Other Packages
There are plenty of other options. I've used both of these with good success.
Duplicati
This tool creates AES-encrypted backups and stores them on a variety of cloud platforms. I've used this tool for years for file-based backups; see the Duplicati web site. Duplicati spins up a local web server and provides a web interface. Duplicati supports the following backends:
Local folder or drive
FTP
FTP (Alternative)
OpenStack Object Storage / Swift
S3 Compatible
SFTP (SSH)
WebDAV
Azure blob
B2 Cloud Storage
Box.com
Dropbox
Google Cloud Storage
Google Drive
Jottacloud
Mega.nz
Microsoft Office 365 Group (Microsoft Graph API)
Microsoft SharePoint v2 (Microsoft Graph API)
Microsoft SharePoint (Microsoft.SharePoint.Client API)
Microsoft OneDrive v2 (Microsoft Graph API)
Microsoft OneDrive (LiveConnect API)
Microsoft OneDrive for Business (Microsoft.SharePoint.Client API)
Rackspace Cloudfiles
Rclone
Sia Decentralized Cloud
Tahoe-LAFS
Storj (ex Tardigrade) Decentralized Cloud Storage
Tencent COS
Acronis True Image
I have legacy licenses of this floating around and still use it on critical machines to create disk images or to do C: drive backups. See the Acronis True Image website.