The Plug Is Mightier Than the Puck: Wireless Charging Is Wildly Inefficient

In 2017, Apple added support for Qi wireless charging to the iPhone 8 and iPhone X, and with the iPhone 12 lineup, it introduced its own MagSafe wireless charging technology. There’s no denying the convenience of wireless charging, but keep in mind that it’s extremely inefficient compared to wired charging. Individually, that may not matter much when you’re charging overnight from a wall-connected charger. But across billions of phones, it’s more problematic. One estimate suggests that wireless charging requires nearly 50% more power than charging over a cable. And if you’re charging from a wireless battery pack, wasting that power means less of a top-up before the battery pack is exhausted. Charging speed suffers too. In short, to charge your iPhone quickly and efficiently, whether from a wall-connected charger or a battery pack, stick with the traditional Lightning cable.
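To see why that matters for battery packs, consider some back-of-the-envelope numbers. The efficiency figures below are illustrative assumptions for the sake of the arithmetic, not measurements:

```python
# Rough sketch of why wireless charging drains a battery pack faster.
# The efficiency figures are illustrative assumptions, not measurements.

def effective_charge(pack_mah, efficiency):
    """Charge actually delivered to the phone from a battery pack."""
    return pack_mah * efficiency

PACK_MAH = 10000             # a typical 10,000 mAh battery pack
WIRED_EFFICIENCY = 0.80      # assumed: cable charging loses roughly 20%
WIRELESS_EFFICIENCY = 0.55   # assumed: wireless charging loses far more

wired = effective_charge(PACK_MAH, WIRED_EFFICIENCY)        # 8000 mAh delivered
wireless = effective_charge(PACK_MAH, WIRELESS_EFFICIENCY)  # 5500 mAh delivered

# With these numbers, wireless draws about 45% more from the pack per
# delivered mAh, in line with the "nearly 50% more power" estimate.
print(f"Wired delivers {wired:.0f} mAh, wireless {wireless:.0f} mAh")
```

In other words, under these assumed efficiencies, the same pack tops up your iPhone by roughly a third less when you charge wirelessly.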

(Featured image by iStock.com/grinvalds)

Name That Tune with Siri or Control Center

Don’t you hate it when a familiar song is playing but you can’t think of what it’s called? Or worse, when you hear a new track you really like but have no one to ask what it is? Never worry about that again, thanks to your iPhone or iPad. Back in 2018, Apple bought the music identification app Shazam and has since integrated it into iOS. You can still use Shazam, but it’s easier to ask Siri, “What’s playing?” or tap the Music Recognition button in Control Center (add it in Settings > Control Center) and then let your iPhone listen to the music for a few seconds. Siri is easiest, but the Control Center button is perfect in situations where you’d prefer to keep your question quiet. Music recognition works only with recorded music—no high school glee club versions, sorry—and while not perfect, is often helpful. Tap the notification that appears to open the song in Apple Music.

(Featured image by Laura Balbarde from Pexels)

The Ten Upcoming Mac/iPhone/iPad Features We Think You’ll Most Like

At its Worldwide Developer Conference keynote on June 7th, Apple shared details about what we can expect to see later this year in macOS 12 Monterey, iOS 15, iPadOS 15, watchOS 8, tvOS 15, and HomePod Software 15. It was a firehose of announcements, but one thing became clear: Apple wants to spread its technologies across its entire ecosystem of devices. Although each platform—Mac, iPhone, iPad, Apple Watch, Apple TV, and HomePod—retains its unique qualities, nearly every feature that the company announced works across as many platforms as make sense.

Before we get into the ten features that we think you’ll most like when everything ships in September or October, we should note that Apple was surprisingly silent on one topic: future Apple silicon chips. Many observers had expected Apple to announce an M1X or M2 chip that would power professional laptop and desktop Macs. We’ll have to satisfy ourselves with the impressive performance of the M1-based Macs we have now and wait a little longer for whatever comes next.

On to the hot new features!

Account Recovery and Legacy Contacts Simplify Recovering Account Data

It’s all too common that people forget their Apple ID passwords and can’t access their accounts. Apple hopes to make that a little less stressful with Account Recovery Contacts. Specify someone as your Account Recovery Contact, and they’ll be able to help you reset your password and regain access to your account, with no need to call us or Apple for assistance.

Also welcome will be the addition of Legacy Contacts. Once this feature is available, everyone should make sure they have appropriate family members or friends set as Legacy Contacts. Then, in the event of your untimely death, your Legacy Contacts can access your account and personal information. Using Legacy Contacts will be far easier than having to provide the legal paperwork to Apple to request access to a deceased family member’s accounts.

FaceTime Gains Features That Make It Competitive with Zoom

During the last year, we’ve all spent vastly more time in videoconferencing apps for work, school, and socializing. Alas, Apple’s FaceTime has been a weak entry in that market. With the features Apple is now promising, however, it should compete well with the likes of Zoom, Skype, and Google Meet. FaceTime will finally get a standard grid view, blur your backgrounds with Portrait mode, and offer two microphone modes: Voice Isolation to cut down on background noise (for standard meetings) and Wide Spectrum to leave ambient sound unfiltered (for performances, say). FaceTime will even be able to alert you when you’re talking but muted.

More important yet is the fact that you’ll finally be able to invite Windows and Android users to FaceTime calls using standard Web links. Non-Apple users will have to use a Chromium-based browser like Google Chrome, Microsoft Edge, or Brave. Plus, when you create an event in Calendar, you’ll be able to make a Web link for the call that you can share. And when it’s time for the call, a Join button makes it easy to get in.

Universal Control Lets Macs and iPads Share a Keyboard and Pointing Device

With Sidecar in macOS 10.15 Catalina and iPadOS 13, Apple made it so you could use an iPad as a secondary screen for a Mac. In macOS 12 Monterey and iPadOS 15, Apple is taking that concept further. With Universal Control, if you merely set a Mac and an iPad next to each other, you’ll be able to use the Mac’s keyboard and mouse or trackpad to work between the two devices (in fact, Universal Control supports up to three). No setup is required—just move your pointer to the edge of the Mac screen and push it “through” the edge to move it to the iPad screen. You can even drag and drop content between devices.

Live Text Lets You Work with Text in Images

Have you ever taken a photo of something just to capture a phone number or address? We have, for sure. Apple’s new Live Text feature treats text in images just like text you type, so you can use functions like copy and paste, lookup, and translate. Live Text will work in Photos, of course, but also in Quick Look, Safari, and Screenshot, and in live Camera previews on the iPhone. It’s an impressive use of image recognition technologies.

Along the same lines, in Photos, you’ll also be able to use the information button on any photo to highlight recognized objects and scenes and get additional information about them. Apple says you’ll be able to learn more about popular art and landmarks, plants and flowers, books, and pet breeds.

Siri Gets Faster, More Reliable, More Private, and More Useful

Thanks to the ever-increasing power of the Neural Engine in Apple devices, Apple says it will bring all processing of Siri requests onto your device. That may not sound like a big deal, but it means that Siri should work faster, more reliably, and more privately. It will be faster because there’s no need to send speech to and from Apple’s servers for processing. It will make Siri work more reliably when your iPhone doesn’t have strong cell service and enable offline support for many types of requests. And Apple won’t know what you’re saying at all.

Other Siri improvements will include the capability to announce reminders when you’re wearing AirPods, improved conversation context so you can refer to what you just asked, and support for controlling HomeKit devices at specific times. HomeKit developers will even be able to add Siri support to their products through a HomePod.

Improved Multitasking Controls Come to the iPad

The big problem with Apple’s multitasking options on the iPad has been remembering how to use them. With iPadOS 15, Apple hopes to solve that with a new menu that will appear at the top of apps, with buttons for entering full screen, Split View, or Slide Over.

Apple also added a new multiwindow shelf that appears at the bottom of the screen at launch and provides a Dock-like view of all the open windows in that app. If you ignore it, it fades away quickly, but it should help you remember which windows you have open and access them quickly.

The iPad Finally Gets the App Library and Home Screen Widgets

Last year, in iOS 14, Apple introduced the App Library and Home Screen widgets. The App Library holds all your apps so you can declutter your life by removing them from the Home Screen. And Home Screen widgets let you add app-specific widgets that provide at-a-glance information. Sadly, iPadOS 14 didn’t include those features.

iPadOS 15 rectifies that oversight, adding both the App Library and Home Screen widgets, complete with some larger widget sizes for the larger iPad screen. They’ll work just like on the iPhone. It’s about time!

Locate Lost AirPods Pro and AirPods Max with Find My Network Support

As it stands now, you can theoretically find AirPods using the Find My app. However, it shows only the AirPods’ last known location, and only approximately, and you have to get within Bluetooth range of them to play a sound. In the future, the AirPods Pro and AirPods Max will support the Find My network, so other people’s devices can report their approximate location, and once you get within Bluetooth range, you can play a sound to locate them.

Hopefully, that will happen less often thanks to new separation alerts that, when enabled, will alert you when you leave an Apple device, AirTag, or Find My-compatible item behind.

Private Relay Protects Safari Traffic for iCloud+ Subscribers

Apple has been adding lots of privacy-protecting features over the past few years, but Private Relay goes even further to ensure that even your ISP can’t track where you go on the Web and sell that data to advertisers. Private Relay encrypts your Safari traffic and passes it through two Internet relays. No one—not even Apple—can then use your IP address, location, and browsing activity to create a detailed profile of you. Everyone who pays for extra iCloud storage will transition to the new iCloud+ for the same cost and will get Private Relay for no additional fee.
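To picture how splitting traffic between two relays keeps any single party from assembling that profile, here’s a deliberately simplified sketch. This is a toy model of the general two-hop idea, not Apple’s actual Private Relay protocol, and every name in it is invented:

```python
# Toy model of a two-hop relay: neither relay sees both who you are
# and where you're going. Illustrative only, not Apple's real design.

def first_relay(client_ip, sealed_destination):
    # The first relay sees your IP address, but the destination stays
    # sealed (encrypted), so it can't tell where you're going.
    return {"source_seen": client_ip, "forwarded": sealed_destination}

def second_relay(packet, unseal):
    # The second relay can unseal the destination, but traffic reaches it
    # from the first relay, so it never sees your real IP address.
    return {"source_seen": "first-relay",
            "destination": unseal(packet["forwarded"])}

# Stand-in for encryption: reverse the string so it isn't trivially readable.
seal = lambda s: s[::-1]
unseal = lambda s: s[::-1]

hop1 = first_relay("203.0.113.7", seal("example.com"))
hop2 = second_relay(hop1, unseal)

assert hop1["source_seen"] == "203.0.113.7"   # relay 1 knows your IP...
assert "example.com" not in hop1.values()      # ...but not your destination
assert hop2["destination"] == "example.com"    # relay 2 knows the destination...
assert hop2["source_seen"] == "first-relay"    # ...but not your IP
```

Because each relay holds only half the picture, no single company, not even Apple, can link your identity to your browsing.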

While we’re talking about iCloud, Apple also says that you’ll be able to get custom domain names for iCloud Mail addresses and invite family members to use the same domain with their iCloud Mail accounts.

Use AirPlay to Send Audio or Video to Your Mac

Many people have discovered how neat it is to use AirPlay to display photos or videos from an iPhone or iPad on a TV attached to an Apple TV. Macs could also broadcast their displays to an Apple TV. But what you couldn’t do is use AirPlay to send audio or video from another Apple device to a Mac. With macOS 12 Monterey, that will become possible, enabling you to use a Mac’s large screen to play a video, share a Keynote presentation, and more.

Apple’s upcoming operating system releases boast many other new features, and we plan to explore more of them once everything ships in a few months. We’ll let you know when it’s time to update!

(Featured image by Apple)


Social Media: At its Worldwide Developer Conference keynote, Apple announced a boatload of new features that we’ll see in macOS 12 Monterey, iOS 15, iPadOS 15, and watchOS 8 later this year. Here are the ten features we think you’ll most like:

Looking for More iOS 14 Widgets? Be Sure to Launch Seldom-Used Apps

Home screen widgets are one of the coolest features of iOS 14. They enable apps to offer quick access to features or at-a-glance previews of changing information, such as the Weather app’s widget providing a quick look at upcoming weather. What you may not realize, however, is that an app’s widgets become available for adding to your Home screen only if you have launched the app since upgrading to iOS 14. (To see the list, press and hold on an empty part of the Home screen and then tap the + button in a top corner.) For instance, if you haven’t traveled since the pandemic started, you might not realize that the Kayak app has a handy price alert widget. Just launch the app once, and you’ll see its widgets the next time you look through the complete widget list.

(Featured image by Omid Armin on Unsplash)

Use Messages to Share Your Current Location Quickly

We’ve all gotten that panicked “Where are you?!?” text message at some point. Sometimes it’s an easy question to answer, but at other times, the answer is “Well, right here, wherever that is.” That’s unsatisfying, of course, but using Messages on your iPhone, you can do better. Tap the person’s name at the top of the conversation, tap the Info button, and in the screen that appears, tap Send My Current Location. Messages immediately sends a little thumbnail map showing where you are, and if the recipient taps it, they can see a larger map, get directions, or open it in Maps. It’s a brilliant little feature!

(Featured image by Andrea Piacquadio from Pexels)

Four Ways to Reduce Zoom Fatigue

After a long day of video calls, you might feel like your brain has been wrung out like a wet washcloth—we certainly do. It’s exhausting to stare into a computer for hours every day while participating in meetings or classes. This condition is called Zoom fatigue, and it’s a recent affliction for most of us because the pandemic has dramatically increased the popularity of video calls. We don’t mean to beat on Zoom here—this condition plagues people who use Cisco WebEx, FaceTime, Google Hangouts, Google Meet, Microsoft Teams, Skype, and other videoconferencing software too.

But there are techniques you can employ to reduce Zoom fatigue. Researchers at Stanford University have identified four reasons why video calls are so tiring and offer suggestions on making them less so. They include:

  • Close-up eye contact is overwhelming. You usually sit about an arm’s length from your computer display, and if one person is on screen at a time, their head may be close to life-size. You’d never be that near someone’s face in real life unless they were a close family member, and even then, you wouldn’t hold that position for long. Shrink your window or switch to gallery view so you’re talking to postage stamps rather than feeling like someone is up in your face.
  • Looking at yourself is psychologically harmful. We all have mirrors, but can you imagine staring into one for hours every day? Only a pathological narcissist would do that. Worse, constantly seeing your own image can make you worry about your appearance and what others think of you. Once you’ve verified that you’re properly framed and don’t have salad in your teeth, hide your preview or switch to a view that doesn’t include you.
  • Sitting perfectly still is difficult. This is hardest on kids, but even adults have trouble staying sufficiently still to remain perfectly framed in a video window. When you’re on a standard phone call or in an in-person meeting, you might pace around the room or at least adjust your position in your chair. Try turning off your camera when possible—most calls work just as well without video—or position it so you can fidget or pace in person. Another solution is Apple’s Center Stage technology on the new M1-based iPad Pros, which automatically pans and zooms to keep you in the picture as you move around.
  • Video calls make you constantly think about call mechanics. There’s nothing natural about interacting with multiple people on a screen, so we’ve all come up with behaviors (some of which we just recommended!) to smooth over the cracks in the system. For instance, your brain has to expend extra effort to help you stay framed in the video window, worry about how you look, use exaggerated facial expressions so people know you’re paying attention, and use techniques like a thumbs-up to indicate approval without unmuting. The solution is to turn off your camera and hide the video window so your brain can take a break and focus on just the audio content of the call.

You’ll notice that most of the recommendations for reducing the mental strain of video calls come down to eliminating video. It shouldn’t be surprising because talking on the phone isn’t nearly as tiring, even when you’re on a conference call with a couple of people. There’s no question that video can help convey information that would be lost in a phone call, and it’s nice to see far-flung friends and family, but there’s no rule that video calls are the best form of communication for all situations.

We’ve started to put these recommendations into practice ourselves, and we encourage you to do so as well. And if you need support for why you’re turning off your camera or asking for audio-only calls, send people a link to this article.

(Featured image by Anna Shvets from Pexels)


Social Media: Why are video calls so exhausting when all you’re doing is sitting around and talking? Here’s the word from Stanford University researchers, along with advice on making those non-stop calls less tiring.

Don’t Store Confidential Files in Online File Sharing Services

Given their integration into the Mac’s Finder, it can be easy to forget that online file sharing services like Dropbox, Google Drive, iCloud Drive, and Microsoft OneDrive can be accessed using a Web browser by anyone with your username and password. Obviously, you should always have strong, unique passwords, but to be safe, it’s best not to use services designed for public file sharing to store unencrypted files containing sensitive information like credit card numbers, Social Security numbers, passport scans, privileged legal documents, financial data, and so on. Keep such data secure on your Mac—outside of any synced folders—where accessing it requires physical access to the machine.

(Featured image based on an original by Kenaz Nepomuceno from Pexels)

Keep iPhone 12 and MagSafe Accessories Away from Pacemakers

Remember when we had to keep magnets away from floppy disks to avoid scrambling them? Modern storage is no longer vulnerable, but magnets and electromagnetic fields from consumer electronics can interfere with medical devices, like implanted pacemakers and defibrillators. Although iPhone 12 models contain more magnets than prior models, Apple says they’re not expected to pose a greater risk of magnetic interference. However, after a study found that one pacemaker could be deactivated by holding an iPhone 12 near it, Apple issued a support document recommending that you keep your iPhone 12 and MagSafe accessories more than 6 inches (15 cm) away from your medical device or more than 12 inches (30 cm) away while wirelessly charging. Better safe than sorry—if you have a pacemaker, don’t put your iPhone or any other consumer electronics in a breast pocket.

(Featured image by Ulrike Leone from Pixabay)

What Are Those Orange and Green Dots in Your iPhone’s Status Bar?

In iOS 14 and iPadOS 14, Apple added two new status indicators to the right side of the status bar at the top of the screen. They’re designed to give you feedback about what an app is doing. An orange dot indicates that an app is using the microphone, and a green dot means that an app is using the camera (and possibly the microphone as well). They’re subtle and shouldn’t be distracting, but if you ever notice them when you don’t think the camera or microphone should be in use, look for apps that might be using them in the background.

(Featured image by Bruno Massao from Pexels)

8 Ways Apple Improved the Camera App in iOS 14

It’s difficult for most of us to imagine that a camera—something that still feels like it’s a standalone object—could be improved significantly with a software update. But now that cameras are part of our phones, code is king. With iOS 14, the camera in your iPhone becomes all the more capable. You’d be excused for not discovering the new features, though, so here’s a rundown.

Apple ProRAW

For professional and committed amateur photographers using an iPhone 12 Pro or Pro Max, perhaps the most important new feature of iOS 14 is the Apple ProRAW image format. Standard RAW images provide raw information from the camera sensor, which can be tweaked in editing to achieve results that the camera’s standard processing can’t. Alas, RAW images can’t take advantage of the iPhone’s computational photography capabilities, such as stitching together many images to produce a single image with good exposure even in low light conditions.

The Apple ProRAW format gives you the best of both worlds: the iPhone’s computational photography plus the added flexibility of working with raw data to adjust exposure, color, and white balance. It’s far too complex to get into here, so if you’re interested, check out these articles by photographers Ben Sandofsky, Austin Mann, Nick Heer, and Om Malik, all of which feature copious visual examples.

Faster Performance

We’ve all missed shots because we couldn’t get the Camera app open in time. That may still happen, but Apple is doing its best to help. The company says that the Camera app now opens faster, with the time to first shot up to 25% quicker. When taking a series of Portrait shots, the time between shots is 15% faster. Shot-to-shot performance, Apple says, is up to 90% faster, capturing up to 4 frames per second.

Prioritize Faster Shooting

Want still more shooting speed? If you take a lot of action shots, iOS 14 offers a new Prioritize Faster Shooting option that reduces the amount of processing (probably reducing image quality slightly) when you press the shutter button rapidly. Turn that on in Settings > Camera.

Use Volume Buttons for Burst Photos or QuickTake

Burst mode is the best way to make sure you get the photo when shooting fast-moving subjects. Historically, you invoked burst mode by pressing and holding the shutter button. Unfortunately, in iOS 13 on the iPhone 11 models, Apple assigned that action to the QuickTake feature, which automatically starts taking a 1080p video regardless of the current mode. Burst mode required pressing the shutter button and dragging to the left, which is tricky to perform correctly under pressure.

Happily, iOS 14 gives us additional options. When in the Camera app, press and hold the physical Volume Up button to invoke burst mode; release it to stop taking photos. Pressing and holding the Volume Down button invokes QuickTake and records video for as long as you hold the button.

QuickTake Comes to iPhone XR, XS, and XS Max

QuickTake was initially available only on the iPhone 11, 11 Pro, and 11 Pro Max from 2019. When Apple released the second-generation iPhone SE in 2020, it too featured QuickTake. With iOS 14, the QuickTake feature also comes to 2018’s iPhone XR, XS, and XS Max. So if you have one of those models, try pressing and holding the shutter button to take a video, or use the Volume Down button.

Change Video Mode in the Camera App

Most people will probably want to set the resolution and frames-per-second for videos once and then forget it. That’s what you do in Settings > Camera > Record Video and Record Slo-mo. But if you do want to change the settings, getting back to that screen quickly is difficult. In iOS 14, Apple added a pair of tiny indicators to the upper-right corner of the Camera app when you’re in Video or Slo-mo. They tell you what resolution and frames-per-second you’re using, and tapping either one cycles you through the other options.

Preserve Exposure Adjustment

Sometimes, when you’re taking photos in challenging lighting conditions, you want to override the automatic exposure settings and keep those settings across multiple shots. In Settings > Camera > Preserve Settings, you can now enable Exposure Adjustment ➊, which maintains your settings across shots and shows the exposure adjustment indicator ➋ near the upper left at all times. Tap that indicator to display the exposure adjustment slider ➌ below.

Mirror Front Camera

By default, when you’re taking a selfie with the iPhone’s front-facing camera, the preview shows you what you’d see in a mirror, but the eventual photo instead displays what someone looking at you would see. This is most noticeable when there’s text in the shot. Some people want the photo to look exactly like the mirrored version without having to edit the photo and flip it. iOS 14 now makes that possible with a Mirror Front Camera switch in Settings > Camera. It affects only the photo you take, not the preview, so you won’t see any change while composing the shot. In the examples below, the left-hand image shows the Camera app’s default behavior, and the right-hand image shows what you get if you enable Mirror Front Camera.

If any of these new features sound compelling, take a few minutes to see if you can work them into your regular shooting.

(Featured image based on an original by Element5 Digital from Pexels)


Social Media: Our phones may seem to be cameras, but they’re really computers, and software updates like iOS 14 can provide new camera capabilities, even with existing iPhone hardware. Here’s what to look for: