Posts

Use Face ID While Wearing a Mask in iOS 15.4

Shortly after the start of the COVID-19 pandemic, Apple made it so your Apple Watch could unlock your Face ID-enabled iPhone when you were wearing a mask. Starting in iOS 15.4, the company has taken the next step and enabled Face ID on the iPhone 12 and later to work even when you’re wearing a mask. If you didn’t already set up Face ID with a mask after updating to iOS 15.4, go to Settings > Face ID & Passcode and enable Face ID with a Mask. You’ll have to run through the Face ID training sequence again, and more than once if you sometimes wear glasses, but it’s quick and easy. Face ID may not work quite as well when you’re wearing a mask, and it doesn’t support sunglasses, but it’s way better than having to enter your passcode whenever you’re masked.

(Featured image by iStock.com/Prostock-Studio)

Export Passwords from Safari to Ease the Move to a Password Manager

Although Apple has improved the built-in password management features in macOS and iOS (you can now add notes to password entries!), third-party password managers like 1Password and LastPass are still more capable. For those just getting started with a password manager, another new capability will ease the transition: Safari password export. To export a CSV file of your Safari passwords, choose Safari > Preferences > Passwords, and enter your password when prompted. At the bottom of the left-hand sidebar, click the ••• button, choose Export All Passwords, and save the Passwords.csv file to the Desktop. After you import the file into 1Password (instructions), LastPass (instructions), or another password manager, be sure to delete the exported file and empty the Trash.
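While the exported file is still on your Desktop, it's also a good moment to audit it for reused passwords before you import and delete it. Here's a minimal Python sketch; the column names (Title, URL, Username, Password, Notes) are an assumption based on recent Safari versions, so check your file's header row, and the sample data below is obviously hypothetical:

```python
import csv
import io

# Stand-in for the contents of Safari's exported Passwords.csv; the
# header columns are an assumption based on recent Safari versions.
SAMPLE = """Title,URL,Username,Password,Notes
Example,https://example.com,pat,hunter2,
Shop,https://shop.example,pat,hunter2,
Bank,https://bank.example,pat,s3cret!x,
"""

def find_reused_passwords(csv_text):
    """Return a dict mapping each password shared by more than one
    entry to the list of entry titles that use it."""
    entries_by_password = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        entries_by_password.setdefault(row["Password"], []).append(row["Title"])
    return {pw: titles for pw, titles in entries_by_password.items()
            if len(titles) > 1}

print(find_reused_passwords(SAMPLE))  # → {'hunter2': ['Example', 'Shop']}
```

Any password that turns up under more than one entry is a good first candidate to change once it's in your new password manager.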

(Featured image by iStock.com/metamorworks)

Plan for the Future by Establishing a Legacy Contact

Have you heard the expression “hit by a bus”? It’s a somewhat macabre attempt to inject a little levity into planning for the unthinkable event of dying without warning. No one expects to be hit by a bus, but people do die unexpectedly in all sorts of ways. That’s terrible, of course, but it’s also incredibly hard on that person’s family, who suddenly must deal with an overwhelming number of details. Many of those details revolve around the deceased’s digital life—devices, accounts, passwords, subscriptions, and more.

We strongly encourage everyone, regardless of age or health, to think about what their family would want and need to do with their digital presence in the event of their death. The ultimate guide to this topic is Joe Kissell’s book Take Control of Your Digital Legacy, although the current version is a little out of date and is slated for an update in 2022.

The next edition of that book will undoubtedly discuss Apple’s new Legacy Contact feature, introduced in iOS 15.2, iPadOS 15.2, and macOS 12.1 Monterey. It enables you to specify one or more people as a Legacy Contact. Should you die unexpectedly, those people can use an access key along with your death certificate to access much of your Apple content and remove Activation Lock from your devices. (If you have time to prepare for your passing, it’s easier to share all your passwords and passcodes explicitly.) The person or people you set as Legacy Contacts don’t have to be running Apple’s latest operating systems or even be Apple users, though it’s easier if they are. (Like so many other things in life.)

Don’t put off specifying someone as a Legacy Contact, whether it’s a family member or close friend. The entire point of the “hit by a bus” scenario is that it’s both unexpected and could happen at any time. (It’s possible to get access without being a Legacy Contact, but it requires a court order and will undoubtedly be significantly more work.)

Apple provides good directions for the Legacy Contact feature on its support pages, and while we’ll summarize the steps below, read Apple’s documentation to get the word from the horse’s mouth.

What Data Can a Legacy Contact Access?

Apple has the full list at the link above, but in short, a Legacy Contact can access anything stored in iCloud, including photos, email, contacts, calendars, messages, files, and more, as well as the contents of iCloud Backup. Not included are licensed media (music, movies, and books), in-app purchases (upgrades, subscriptions, and game currency), payment information (Apple ID payment info or Apple Pay cards), and anything stored in the account holder’s keychain (usernames and passwords, credit card details, and more). A Legacy Contact cannot access the deceased’s devices—Apple is incapable of sharing passcodes. However, Apple can remove Activation Lock so those devices can be erased and reused.

How Do You Add a Legacy Contact?

Adding someone as a Legacy Contact is easy. You must be running iOS 15.2, iPadOS 15.2, or macOS 12.1 Monterey to initiate the process, and two-factor authentication must be turned on for your Apple ID (this is a very good idea anyway).

On an iPhone or iPad, go to Settings > Your Name > Password & Security > Legacy Contact > Add Legacy Contact. On a Mac, use System Preferences > Apple ID > Password & Security > Legacy Contact > Manage. You can choose a group member if you’re in a Family Sharing group or pick someone from your contacts list.

As part of the process of picking someone, Apple allows you to share the access key via Messages if they’re running iOS 15.2, iPadOS 15.2, or macOS 12.1 Monterey. If they accept, a copy of the access key will automatically be stored in their Apple ID settings. If they’re not running a necessary operating system or don’t use an Apple device, you can instead print out an access key QR code and give that to them. You might also want to print a copy to store with your will and other important documents.

It’s often appropriate to serve as a Legacy Contact for the people you’re asking to be yours, particularly with spouses or adult children.

How Does a Legacy Contact Request Account Access?

Let’s assume the worst and pretend that someone who has added you as a Legacy Contact has passed away. To request access to their Apple ID, you need the access key that the person shared with you and a copy of their death certificate. You can find the access key on an iPhone or iPad in Settings > Your Name > Password & Security > Legacy Contact > Contact’s Name, and on the Mac in System Preferences > Apple ID > Password & Security, where you click Manage next to Legacy Contact settings and then Details next to the person’s name. It’s also possible that the person stored a printed copy of the access key with their estate planning documents.

The screens that provide the access key also have a Request Access link. Tap or click that and follow the instructions to upload the death certificate. If you don’t have an appropriate Apple device, you can also do this on the Web at Apple’s Digital Legacy – Request Access page.

Apple evaluates all access requests to make sure they’re legitimate and, once a request is approved, sends you an email with more details and instructions. That email will also include a special Legacy Contact Apple ID that replaces the deceased’s previous Apple ID. You can use that Apple ID to log in to iCloud.com, download data at privacy.apple.com, sign in to an Apple device, or restore an iCloud backup to another Apple device. Having an access request approved also removes Activation Lock from the deceased’s Apple devices so you can restore them to factory settings and set them up again, either from scratch or with the data from the Legacy Contact Apple ID.

The main limitation is that the Legacy Contact Apple ID is good only for 3 years, after which the legacy account is permanently deleted. So be sure to download everything important fairly quickly—don’t just keep using the Legacy Contact Apple ID or assume that you’ll be able to go back to it at any time.

We sincerely hope that you never have to act as Legacy Contact for a loved one, but we can say from experience that this new feature can only help make an already stressful time more manageable.

(Featured image by iStock.com/Olga Serba)


Social Media: Apple’s new Legacy Contact feature makes it simpler for you to give a family member access to your iCloud data after your death. Read on to learn how to make someone a Legacy Contact or what to do if you are a Legacy Contact.

Avoid Unusual Top-Level Domains in Custom Domain Names

Remember the heady dotcom days, when businesses were desperate to get a short, memorable, easily typed .com domain? It quickly became difficult to get what you wanted—so much so that deep-pocketed companies paid exorbitant sums for just the right domain.

Before we go any further, let’s make sure we’re all on the same page. Domain names are necessary because computers on the Internet are all identified by inscrutable numeric IP addresses. You can remember and type apple.com easily; 184.31.17.21 not so much. Domain names have two or more parts: the top-level domain (read from the end, such as com) and the second-level domain (like apple), plus optional third-level domains (which could give you support.apple.com).
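To make those parts concrete, here's a tiny Python sketch (the function name is ours) that splits a hostname into its labels, reading from the end the way the domain system does:

```python
def domain_labels(hostname):
    """Split a hostname into its labels, ordered from the top-level
    domain down (the way DNS reads them, right to left)."""
    return list(reversed(hostname.lower().rstrip(".").split(".")))

print(domain_labels("support.apple.com"))  # → ['com', 'apple', 'support']
```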

Since the days of speculating in .com domains, however, hundreds of additional top-level domains have been opened up, including domains from .aaa to .zone. There are now top-level domains for .doctor, .florist, .lawyer, and many more, including the general-purpose .xyz. It might be tempting to switch from the awkward dewey-cheatham-howe.com to the shorter and more memorable dch.lawyer. And even if there isn’t a profession-specific top-level domain that works for you, you may think that if abc.xyz is good enough for Google’s parent company Alphabet, surely it’s good enough for you.

Alas, much as we appreciate the creativity and flexibility offered by these alternative top-level domains, we’d like to dissuade you from using one, if possible. Problems include:

  • Email deliverability: If you’re sending email using an alternative top-level domain or including links to that domain, it’s much more likely that your email will be considered spam by receiving systems.
  • SMS deliverability: Some SMS text message providers will automatically delete messages containing URLs with alternative top-level domains in an effort to protect their customers from phishing attacks.
  • Social media spam filtering: As with SMS text messages, social media posts that include URLs with alternative top-level domains may be categorized as spam or as linking to a malicious site.
  • Firewall blocking: Abuse of alternative top-level domains has become so commonplace by scammers that some companies prevent their employees from accessing websites using certain alternative top-level domains at the firewall level.
  • User perception: Although there’s no telling how anyone will react to a particular top-level domain, people won’t think twice about .com but might think .ooo seems sketchy. (We would.)

Obviously, it may not be possible to get the domain name you want in .com. What to do? There are a few strategies:

  • Expand or abbreviate: These days, people rarely see, remember, or type domain names except for businesses that advertise heavily in the real world. So if you need to add or subtract words (or letters) in your domain to find a unique one, that can work.
  • Use a country domain: Two-letter top-level domains are restricted for use by countries, so .us is for the United States, .ca for Canada, and .au for Australia. Every country has different rules for who can register them. For instance, it’s possible to get a domain ending in .it (Italy) as long as you work through a registrar that acts as your representative there. .io (British Indian Ocean Territory) and .ai (Anguilla) are popular top-level domains among tech companies.
  • Stick with better, pricier alternatives: Not all alternative top-level domains are equally problematic. The classic .net and .org are fine, and .biz isn’t bad. But how to determine that? When you’re checking to see if a domain name is available, compare prices. For instance, at one domain name registrar, iphonewhisperer.xyz costs only $1 per year, whereas the iphonewhisperer.biz version is $4.98 per year, iphonewhisperer.net is $9.18 per year, and iphonewhisperer.studio is $11.98 per year. The more you pay, the less likely that domain has been abused by spammers and marked for filtering.

In the end, when it comes to domain names, it’s best to be conservative and stick with a top-level domain that won’t cause people or filters to think twice. That’s probably .com, if you can make the rest of the name work for you.

(Featured image by iStock.com/BeeBright)


Social Media: Tempted to get a short, memorable domain name ending in .xyz or .shop? As we explain, that’s a bad idea if you care about user perception, email and text message deliverability, and not being blocked by social media and firewalls. Details at:

About That Worrying Message Saying Your Password Has Been Breached…

In iOS 14, Apple added a feature that warns you when one of your website passwords stored in iCloud Keychain has appeared in a data breach. We’ve fielded some questions of late from people wondering whether the message is legitimate and, if so, what they should do. What has happened is that online criminals have stolen username and password data from a company, and your credentials were included in that data breach. You should indeed change your password immediately, and it’s fine to let the iPhone suggest a strong password for you. Or, if it makes you feel more comfortable, you can usually change the password in Safari on your Mac instead. Either way, make sure it’s unique—never reuse passwords across multiple sites!

(Featured image by iStock.com/LumineImages)

Frequently Asked Questions Surrounding Apple’s Expanded Protections for Children

Apple’s recent announcement that it would soon be releasing two new technologies aimed at protecting children has generated a firestorm of media coverage and questions from customers. Unfortunately, much of the media coverage has been based on misconceptions about how the technology works, abetted by uncharacteristically bungled communications from Apple. It’s not inconceivable that Apple will modify or even drop these technologies in the official release of iOS 15, iPadOS 15, and macOS 12 Monterey, but in the meantime, we can provide answers to the common questions we’ve been hearing.

What exactly did Apple announce?

Two unrelated technologies:

  • Messages will gain features that warn children and their parents when sexually explicit photos are received or sent. Such content will be blurred, the child will be warned and given the option to avoid viewing the image, and parents may be alerted (depending on the age of the child and settings).
  • Photos uploaded by US users to iCloud Photos will be matched—using a complex, privacy-protecting method that Apple has developed—against known illegal photos considered Child Sexual Abuse Material, or CSAM. If a sufficient number of images match, they’re verified by a human reviewer at Apple to be CSAM and then reported to the National Center for Missing and Exploited Children (NCMEC), which works with law enforcement in the US.

Does this mean Apple is scanning all my iPhone photos?

Yes and no. Messages will use machine learning to identify sexually explicit content in received and sent images. That scanning takes place entirely on the iPhone—Apple knows nothing about it, and no data is ever transmitted to or from Apple as a result. It’s much like the kind of scanning that Photos does to identify images that contain cats so you can find them with a search. So scanning is taking place with this Messages feature, but Apple isn’t doing it.

The CSAM detection feature operates only on images uploaded to iCloud Photos. (People who don’t use iCloud Photos aren’t affected by the system at all.) On the device, an algorithm called NeuralHash creates a hash and matches it against an on-device database of hashes for known illegal CSAM. (A hash is a one-way numeric representation that identifies an image—it’s much like how a person’s fingerprint identifies them but can’t be used to re-create that person.) NeuralHash knows nothing about the content of any image—it’s just trying to match one hash against another. In this case, it’s matching against existing image hashes, not scanning for a type of content, and Apple is notified only after enough image hashes match.
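As a rough illustration of the difference between matching and scanning, here's a toy Python sketch. SHA-256 stands in for NeuralHash (a real perceptual hash tolerates small image changes, which a cryptographic hash does not), and the image bytes and threshold logic are simplified assumptions of ours:

```python
import hashlib

# Toy database of hashes of known images (hashes only, never the
# images themselves); SHA-256 is just a stand-in for a perceptual hash.
KNOWN_HASHES = {hashlib.sha256(b"known-illegal-image").hexdigest()}
THRESHOLD = 30  # nothing is reported until this many matches

def count_matches(image_blobs):
    """Count images whose hashes appear in the known database. The
    matcher only compares hashes; it never interprets image content."""
    return sum(hashlib.sha256(blob).hexdigest() in KNOWN_HASHES
               for blob in image_blobs)

library = [b"cat-photo", b"vacation-sunset", b"known-illegal-image"]
matches = count_matches(library)
print(matches, matches >= THRESHOLD)  # → 1 False
```

Note that a photo of your own cat can never match: only a hash already present in the database counts, which is why this is matching known images rather than scanning for a type of content.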

It’s also important to note that this is different from how companies like Facebook, Google, and Microsoft scan your photos now. They use machine learning to scan all uploaded photos for CSAM, and if they detect it, they’re legally required to report it to the NCMEC’s CyberTipline, which received 21.7 million CSAM reports from tech companies in 2020, over 20 million from Facebook alone. Because Apple does not scan iCloud Photos in the US like other companies scan their photo services, it made only 265 reports in 2020.

What happens if the CSAM detection feature makes a mistake?

This is called a false positive, and while vanishingly improbable, it’s not mathematically impossible. Apple tested 100,000,000 images against NeuralHash and its CSAM hash database and found 3 false positives. In another test using 500,000 adult pornography images, NeuralHash found no false positives.

Even if NeuralHash does match an image hash with one in the known CSAM hash database, nothing happens. And nothing continues to happen until NeuralHash has matched 30 images. Apple says that the chances of there being 30 false positives for the same account are 1 in 1 trillion.
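To get a feel for why the 30-match threshold makes false alarms so unlikely, here's a back-of-the-envelope Python sketch. It treats each photo as an independent coin flip using the false-positive rate implied by Apple's 3-in-100-million test and assumes a 50,000-photo library; the independence assumption and library size are ours, and Apple's own 1-in-1-trillion figure is deliberately far more conservative:

```python
import math

def prob_at_least(n_photos, per_image_rate, threshold):
    """Binomial tail: probability of at least `threshold` false
    positives among n_photos, each false-matching independently with
    probability per_image_rate. Terms far beyond the threshold are
    negligible at these rates, so the sum is truncated."""
    return sum(math.comb(n_photos, k)
               * per_image_rate**k
               * (1 - per_image_rate)**(n_photos - k)
               for k in range(threshold, min(n_photos, threshold + 40) + 1))

# Rate implied by Apple's test: 3 false positives in 100,000,000 images
p = prob_at_least(50_000, 3 / 100_000_000, 30)
print(p)  # a vanishingly small number, far below 1 in 1 trillion
```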

I have terrible luck. What if that happens with my account?

Once at least 30 images have matched, the system enables Apple to decrypt the low-resolution previews of those images so a human can review them to see if they are CSAM. Assuming they are all false positives—remember that possession of CSAM is illegal in the US—the reviewer sends them to Apple engineers to improve the NeuralHash algorithm.

Could non-CSAM images end up in Apple’s CSAM hash database?

It’s extremely unlikely. Apple is constructing its database with NCMEC and other child-safety organizations in other countries. Apple’s database contains image hashes (not the actual images; it’s illegal for Apple to possess them) for known illegal CSAM images that exist both in the NCMEC database and at least one other similar database. So multiple international organizations would have to be subverted for such image hashes to end up in Apple’s database. Each source database will have its own hash, and Apple said it would provide ways for users and independent auditors to verify that Apple’s database wasn’t tampered with after creation.

Plus, even if a non-CSAM image hash were somehow added to Apple’s database and matched by NeuralHash, nothing would happen until there were 30 such images from the same account. And if those images weren’t CSAM, Apple’s human reviewers would do nothing other than pass the images to engineering for evaluation, which would likely enable Apple to determine how the database was tampered with.

Couldn’t a government require Apple to modify the system to spy on users?

This is where much of the criticism of Apple’s CSAM detection system originates, even though Apple says the system will be active only in the US. On the one hand, Apple has said it would resist any such requests from governments, as it did when the FBI asked Apple to create a version of iOS that would enable it to break into the San Bernardino shooter’s iPhone. On the other hand, Apple has to obey local laws wherever it does business. In China, that already means that iCloud is run by a Chinese company that presumably has the right to scan iCloud Photos uploaded by Chinese users.

It’s conceivable that some country could legally require Apple to add non-CSAM images to a database, instruct its human reviewers to look for images the country finds objectionable, and report them to law enforcement in that country. But if a country could successfully require that of Apple, it could presumably force Apple to do much more, which hasn’t happened so far. Plus, the CSAM detection system identifies only known images—it’s not useful for identifying unknown images.

Is Apple heading down a slippery slope?

There’s no way to know. Apple believes this CSAM detection system protects the privacy of its users more than scanning iCloud Photos in the cloud would, as other companies do. But it’s highly unusual for a technology that runs on consumer-level devices to have the capacity to detect criminal activity.

(Featured image by iStock.com/metamorworks)


Social Media: Apple’s recently announced expanded protections for child safety have generated a firestorm of criticism and confusion. We attempt to answer some of the most common questions we’ve received.

Disable Sharing Options on Your Mac If You’re Not Using Them

Many security breaches—even high-profile ones—stem from simple oversight. There’s one spot in macOS that has long been particularly susceptible to such lapses: the Sharing pane of System Preferences. In it, you can enable a wide variety of sharing services, some of which could allow another user to access your Mac remotely. They all let you limit access to particular users, but passwords can be stolen, accounts can be compromised, and server software can have bugs. For safety’s sake, if you’re not actively using a sharing service, turn it off. The most important ones to disable when not in use are Screen Sharing, File Sharing, Remote Login, Remote Management, and Remote Apple Events. We also caution against leaving Printer Sharing and Internet Sharing on unnecessarily.

(Featured image by Morgane Perraud on Unsplash)

Intuit Has Stopped Updating the QuickBooks Online Mac App; Switch to a Web Browser

If you’re using QuickBooks Online with the service’s Mac app to manage your business’s accounting, you may have seen a message like the one below announcing that Intuit has stopped updating the QuickBooks Online app. This doesn’t affect your QuickBooks Online account, which you can and should use via a Web browser at qbo.intuit.com now. Even if the QuickBooks Online Mac app continues to work, which it likely will for some time, we recommend that you delete it and switch entirely to a Web browser. It’s not safe to use an unsupported app for financial records because Intuit won’t be fixing any security vulnerabilities going forward.

(Featured image based on an original by RODNAE Productions from Pexels)

Don’t Store Confidential Files in Online File Sharing Services

Given their integration into the Mac’s Finder, it can be easy to forget that online file sharing services like Dropbox, Google Drive, iCloud Drive, and Microsoft OneDrive can be accessed using a Web browser by anyone with your username and password. Obviously, you should always have strong, unique passwords, but to be safe, it’s best not to use services designed for public file sharing to store unencrypted files containing sensitive information like credit card numbers, Social Security numbers, passport scans, privileged legal documents, financial data, and so on. Keep such data secure on your Mac—outside of any synced folders—where accessing it requires physical access to the machine.

(Featured image based on an original by Kenaz Nepomuceno from Pexels)

When Asking about Phishing Email, Make Sure to Write Separately Too

Sadly, email is not an entirely reliable communications medium, thanks to spam filters, addressing errors, and server failures. With certain types of email, it’s worth double-checking that a message was seen. One example we encounter regularly involves reports of phishing email, which miscreants use to try to trick you into revealing passwords, credit card info, or other sensitive information. Phishing messages can be tricky to identify—that’s their goal. If you’re forwarding a possible phishing email to us or another trusted technical contact for evaluation, remember that spam filters often catch such messages, so they may go unseen. To work around this, send a separate message saying you’ve forwarded what you think might be a phishing message so the recipient knows to check their Junk mailbox if need be. It’s also helpful to include the Subject line of the suspect message.

(Featured image by Mikhail Nilov from Pexels)