Our 9L0-314 dumps questions go through rigorous testing by our certified experts, who review all technical details of the 9L0-314 questions and answers before they are posted on our website for candidates to download. We keep our 9L0-314 dumps updated with real exam questions and answers, and our VCE exam simulator reflects the latest changes in the 9L0-314 braindumps. You can rely on our 9L0-314 dumps for your real 9L0-314 test.
Apple
9L0-314
Apple Hardware Recertification
https://killexams.com/pass4sure/exam-detail/9L0-314

Question: 113
You are attempting to power a Mac mini (original) with an 85 Watt power adapter. Will this
work?
A. Yes
B. No

Answer: A

Question: 114
Examine the exhibit.
Which of these is a valid memory configuration for a Mac Pro (Early 2008)?
A. Configuration A: Two DIMMs on top riser card, four DIMMs on bottom riser card.
B. Configuration B: Four DIMMs on top riser card, two DIMMs on bottom riser card.

Answer: B

Question: 115
In order to read ANY of the diagnostic LEDs in a Mac Pro (Early 2008), you must press the
DIAG_LED button on the logic board.
A. True
B. False

Answer: B

Question: 116
How did you prepare for this exam? (Choose all that apply.)
A. Apple Mac OS X Help Desk Essentials leader-led course
B. Apple leader-led technician training course
C. Self-study AppleCare Technician Training purchased from Apple
D. Non-Apple courses or books
E. On-the-job training / apprenticeship
F. Self-taught
G. None of the above

Answer: E

Question: 117
If you took an Apple leader-led course to prepare for this exam, when did you take it?
(Choose the closest answer.)
A. Last week
B. Two or three weeks ago
C. One month ago
D. Two months ago
E. Three months ago
F. More than three months ago
G. Did not take an Apple leader-led course

Answer: E

Question: 118
What additional Apple certifications do you have or plan to seek? (Choose all that apply.)
A. Apple Certified Technical Coordinator
B. Apple Certified System Administrator
C. One or more of the Apple Digital Media Pro certifications
D. None

Answer: B

Question: 119
Where is the BEST place to look for information regarding special take-apart tools for an
Apple product?
A. Discussions
B. Service News
C. User's manual
D. Service manual

Answer: D

Question: 120
You are running Apple Hardware Test on a customer's Mac. The test fails with an error code.
Which one of the following resources is the Apple-recommended choice for locating the
meaning of this code?
A. Apple Hardware Test Help
B. Service Source
C. Service manual
D. User's manual

Answer: B

Question: 121
A useful tool to have when servicing an Apple portable computer is _____ tape.
A. plastic
B. kapton
C. packing
D. electrical
Answer: B

Question: 122
A customer states that he cannot open an AppleWorks file. What is the most productive
question to ask him FIRST?
A. Can you open any files?
B. Is your Mac connected to a network?
C. Can you open other AppleWorks files?
D. What version of AppleWorks are you using?

Answer: C

Question: 123
What is the first step to take if you have a Mac that constantly ejects any CD / DVD that is
inserted?
A. Replace the logic board.
B. Replace the optical drive.
C. Replace the optical drive cable.
D. Disconnect all peripheral devices, especially the mouse.

Answer: D

Question: 124
When removing or replacing the heat sink or processor on the Mac Pro (8x), what is the
maximum acceptable amount of time for the heat sink to be separated from the processor?
A. Five (5) minutes
B. Fifteen (15) minutes
C. Thirty (30) minutes
D. Sixty (60) minutes

Answer: B
Everything Apple Announced at WWDC 2023
This year's WWDC is one for the history books. Apple made its usual software announcements, introducing iOS 17, iPadOS 17, macOS Sonoma, and watchOS 10, and even introduced the largest MacBook Air yet. However, the talk of the town won't be the next iPhone update or the 15-inch MacBook Air: It's definitely the new "Vision Pro" AR headset we've been hearing about for years. But let's not get ahead of ourselves.
New Macs announced at WWDC
Apple kicked off WWDC this year with new Macs, and kicking off that announcement was the reveal of the long-rumored 15-inch MacBook Air:
15-inch MacBook Air
It might sound like an oxymoron, but the 15-inch "Air" is real. It comes in at 11.5mm thin, weighs 3.3 pounds, has MagSafe charging and two USB-C ports, and comes in four colors, including recent additions Midnight and Starlight.
The new Air sports a 15.3-inch screen that can hit 500 nits of brightness, six speakers, a 1080p webcam, and Apple’s M2 chip. The company brags this iteration of the Air is 12 times faster than the “fastest” Intel-based MacBook Air, but those stats should always be taken with a grain of salt. Hopefully one stat that pans out is the Air’s 18-hour battery life, which would be more than enough to get you through an entire day of work and play.
The new 15-inch MacBook Air starts at $1,299 ($1,199 for education). It’s available to order today, and will be in stores next week.
Apple also cut the price of the 13-inch M2 MacBook Air by $100: It now starts at $1,099.
Pro Products
Apple then moved onto its pro Macs, starting with the Mac Studio. The new Studio now gets the M2 Max chip, which Apple says is 25% faster than M1 Max. But if that’s not enough power for you, you can buy a Mac Studio with the brand-new M2 Ultra: The chip connects two M2 Max dies together, with a 24-core CPU and up to 76-core GPU. Mac Studio has 8K display support, and can support six Pro Display XDRs.
But it's not all about the smaller Mac Studio. Apple finally made a Mac Pro with Apple silicon. Specifically, every configuration of the new Mac Pro comes with the M2 Ultra, with up to 192GB of unified memory. There are eight Thunderbolt ports in total, two HDMI ports, and six open expansion slots that you can customize with whatever supported cards you want. You can run 22 concurrent streams of 8K ProRes video, a task that typically consumes a huge amount of processing power. It's available in tower and rack-mount configurations, depending on how you need to store your Mac Pro.
iOS 17

Moving on to the iPhone, Apple introduced iOS 17. While it isn't the most revolutionary update in iPhone history, it does add some great new features to the mix.
Updates to communication
Apple started with major updates to the Phone, FaceTime, and Messages apps. With iOS 17, we now have personalized Contact Posters, which are sort of like your own customizable caller ID, built with the same editing engine as the Lock Screen: You pick an image of yourself and overlay your name, and this poster shows up whenever you call another iPhone.
Speaking of phone calls, Apple introduced Live Voicemail: This new feature displays a live transcription of a voicemail as it happens, providing easy call screening for all iPhones running iOS 17. You can take a peek and see whether that call is worth picking up, whether you'll get around to it later, or whether it's a spam call you can safely ignore forever.
On that note, Apple is adding video voicemail to FaceTime. Now, when you try to call someone on FaceTime and they don’t answer, you can leave them a video message filling them in on what they missed.
Messages gets a lot of new features
I'm really looking forward to the Messages updates myself: This year, Apple is adding search filters, which let you include additional terms to narrow down search results. Even better, there's now a catch-up arrow that jumps to the first unread message in a thread. That will prove useful when you've missed a lot of messages in the group chat. You can also swipe to reply on any chat bubble, a simple but efficient update. Also simple and efficient: audio messages are now transcribed, so you don't need to play a message to know what someone said.
Location sharing now appears in your conversations, but that isn’t the only location update. iOS 17 adds “Check In,” a new feature that helps your friends and family keep tabs on you when you’re on the move. When you start a Check In, your iPhone will automatically tell your chosen contacts when you get home. If it thinks you’re detouring from your route, it can share location and battery info with your friends to keep them in the loop. This feature, which is end-to-end encrypted, seems like a great way to stay safe.
Messages will look a lot cleaner now, too. A new Plus button hides away options like Camera, Photos, and iMessage apps, replacing the App Drawer. There's also a brand new Stickers experience: When you tap the Plus button, then "Stickers," you can see all your stickers at once. Also, all emoji are now stickers, and subjects you lift out of photos can now be stickers, too. iOS saves a history of subjects you've lifted from photos so you can easily stick them to messages. You can also make live stickers out of Live Photos with effects. All of these stickers are now system-wide, so you can add them in third-party apps as well.
AirDrop
AirDrop gets some new updates, too, for the first time in a while. Now, there’s NameDrop, which lets you share numbers or content with another iPhone just by bringing your devices close together. For numbers specifically, the transfer brings up your Contact Poster, and lets you choose phone numbers and email addresses you want to share. The feature works with Apple Watch, too. For the first time, you can leave AirDrop range before a transfer is done, and it’ll still work. Finally, you can bring phones together to enter SharePlay.
Apple improves autocorrect
Autocorrect now uses a new transformer language model for more accurate corrections. You can tap the underlined correction to revert to the original if autocorrect didn't get it right. Best of all, the keyboard will learn from your corrections, so you won't have to deal with "ducking" autocorrect all the time (Apple made that joke themselves, by the way).
Apple makes their own journaling app
"Journal" is an Apple-made journaling app that is deeply integrated with iOS. Journal uses on-device machine learning to make suggestions from your daily moments, like photos, location, music, and workouts, to generate entries for the day. You can customize which suggestions to include from this list. Journal includes writing prompts to help you get started with any given entry, too, and the entire experience is E2EE.
Turn your iPhone into a smart display
StandBy is a new feature that kicks in when you turn iPhone on its side while charging: When you do, a new full-screen display appears, displaying the time, weather, alarm, and more. StandBy lets you customize your displays with widgets or photos, and supports both Live Activities and Siri.
Siri gets some small but powerful changes
Speaking of Siri, you no longer have to say “Hey” to get her attention. Just say “Siri,” and the assistant will kick in. You can also run back-to-back commands now, so you can naturally ask a follow-up question after the first query to get a response.
Maps and Photos get two new features
As an aside, Apple also mentioned offline maps coming to Apple Maps, which will let you navigate areas when you have no service. Plus, the People album is improved in Photos, and can now recognize cats and dogs.
iPadOS 17
iPads this year gain many of the features Apple highlighted in iPadOS 17, but also have some updates of their own:
Widgets and Lock Screen
You can interact with widgets without opening the app itself, such as ticking off a to-do in a Reminders widget, controlling lights in a Home widget, or playing songs from a Music widget.
In addition, iOS Lock Screen customizability comes to iPad. Machine learning adds a slo-mo effect to Live Photo wallpapers when you wake your iPad. Rather than appearing underneath the time, widgets run all along the side of the display, which lets you add more of them than you can on iOS. The Lock Screen also supports Live Activities.
Health
The Health app is now on your iPad: The app syncs across iPhone and Apple Watch, as well as third-party apps.
PDF improvements
With iPadOS 17, the system now identifies PDF fields, and supports auto-fill, which should make working in PDFs much easier than before. The Notes app also supports interacting with and annotating PDFs.
Stage Manager updates
Stage Manager, which brings a macOS-style window manager to iPad, now adds more flexibility to window sizing and positioning. Hopefully this means the feature works a bit better than before, and more like macOS than the odd hybrid we got in iPadOS 16. You can also now use the iPad's built-in camera while the iPad is connected to an external display.
Freeform
Apple’s digital whiteboard, Freeform, adds new drawing tools in iPadOS 17, and adds Follow Along, which lets you track collaborator activities.
macOS Sonoma
Like iPadOS 17, macOS Sonoma includes many of the new features from iOS 17, along with some exclusive to the Mac (although, granted, fewer than on either iOS or iPadOS).
To start, there are new screensavers, which sport slow motion video of places around the world. We also get support for widgets on the desktop: They’re interactive, too, like they are on iPadOS.
Gaming updates
Mac might not be known as a gaming platform, but Apple wants it to be: They’re adding a new “Game Mode,” which prioritizes CPU and GPU during bouts of gaming. The feature also reduces lag with wireless devices like headphones and controllers. Apple even collaborated with Hideo Kojima, who announced Death Stranding Director’s Cut will be coming to Mac soon.
Mac adds new video conferencing features
Apple is adding a new "Presenter Overlay" feature to the Mac. This option puts your video feed on top of your content: You can either be a small bubble over your work, or have the shared screen in the background like a poster. There are also new reactions in video calls, similar to what you find in Zoom or Google Meet, and you can even use gestures to activate them. These work in third-party apps, too, so you don't need to settle for whatever effects your app offers.
Safari focuses on privacy
When you turn on private browsing in Safari in macOS Sonoma, it now locks your browser window behind authentication, so only you can return to a private session. The feature both blocks trackers and removes tracking from URLs.
You can also now securely share passkeys with a close group of contacts under a feature called Family Passwords. This is also E2EE. We have Profiles for the first time in Safari, so you can separate your work browsing from your personal browsing. Plus, you can add websites to the Dock to turn them into web apps, regardless of whether or not the developer made the site a web app in the first place.
Audio & Home
Apple announced Adaptive Audio for AirPods, which blends Transparency and Noise Cancellation together at the same time. This feature reduces loud noises while letting you hear what’s around you. When you start speaking, for example, your music lowers and the AirPods focus on who you’re talking to. When on a call, it reduces noise around you. And automatic switching is faster when moving between devices.
AirPlay now learns your preferences, so it queues up the right audio output for a new play request. Apple also announced "AirPlay in Hotels": Scan a QR code on your hotel TV, tap to confirm, and you'll be able to AirPlay to the hotel TV, no wires or payment required.
Finally, car passengers can AirPlay their music through CarPlay, so anyone can pass the metaphorical aux.
tvOS and Apple TV
Apple didn’t make many new announcements here, but they did mention the following: tvOS 17 has a new Control Center design, and you can now use your iPhone to find your lost Apple TV remote. Plus, you can see your photos as a TV screensaver.
The biggest announcement, however, is FaceTime on Apple TV: It connects to your iPhone or iPad, and uses Continuity Camera and Center Stage to let you have FaceTime calls on your TV.
watchOS 10
watchOS 10 is made up of a bunch of small features this year. To start, there's the new Smart Stack, a chronologically organized stack of widgets, so you can scroll ahead to see what's on your plate later in the day.
Small changes
World Clock now uses colors to reflect the time of day, so you can easily tell what time it is in a different city; the Activity app now has corner icons; and Snoopy and Woodstock now have a cute watch face. They're featured in different scenes, including some that react to the current weather. If it's raining in your location, it might just be raining on Snoopy as well.
Workouts
The Cycling workout now connects to Bluetooth bike accessories, and can estimate your FTP, the highest level of intensity you can maintain for an hour. The app estimates power zones to help with your training, and Cycling workouts show up as Live Activity on your iPhone.
A Hiking workout shows your last-known location, and shows where on your route you can make an emergency call. You can use search to look for new trails nearby, which shows details about each—helpful if you’re looking for a particular length of time or difficulty.
Health
A new mental health option helps you visually identify how you're feeling throughout the day. This feature works on iPhone as well, if you don't have an Apple Watch. Plus, you can take standardized screening assessments to see if you're suffering from something like depression or anxiety.
Apple is also hoping to improve vision health. Your Apple Watch can tell you if your child is spending enough time outside in daylight: The company talked about how a lack of sunlight is a leading cause of myopia, and this feature could help promote more sun exposure. In addition, there's a new Screen Distance alert that will warn you if you're using your iPhone or iPad too close to your face.
All of the new software versions have a developer beta out today. The public betas for each will arrive in July, and all release in the fall.
The tail end of the presentation focused on Apple's long-rumored new headset, Vision Pro, which brings a classic Apple experience into augmented reality (AR). While the headset completely covers your eyes, its cameras bring your room into full view, overlaying your content onto the real world. It's like something out of Iron Man or Terminator.
Vision Pro runs a new operating system called visionOS, which essentially combines iPadOS and macOS into AR. The windows and elements actually respond to light and cast shadows to enhance the illusion. You can scale any app as large or small as you want, wherever you want in the space.
You also don’t need to stay in your current space: Environments bring different scenes into your room, like a forest or beach background. These backgrounds aren’t like Zoom backgrounds: They’re incredibly high-quality, and can be a small part of your room, or completely take over the space.
Apple designed Vision Pro to be used with just your eyes, hands, and voice, with no need for controllers like you find on the Meta Quest. You look at icons to highlight them, tap your fingers to select, and flick to scroll. That said, you can use keyboards, trackpads, a Mac, and more to complement the experience. Connecting a Mac brings a 4K display into the experience.
While Vision Pro looks like a pair of ski goggles with a screen that blocks your face, Apple hopes other people in the room will be able to chat with you without issue. That's because the front screen shows a visual representation of your eyes in a feature called "EyeSight." EyeSight creates the illusion that people can look you in the eye while you're wearing the headset. Your eyes only appear when someone is in your space, too. Otherwise, it's just a blank screen.
If nothing else, Vision Pro seems perfect for entertainment. You can watch your shows and movies on a giant floating screen in your room, dim the environment, place yourself in a "movie theater," bring in other Environments, and even watch 3D movies. Disney's Bob Iger pitched the company's vision for Vision Pro: You can watch The Mandalorian on Tatooine, or watch a football game "at the game."
Of course, the same benefits apply to gaming, as well. Apple says there will be over 100 Arcade titles to play when Vision Pro launches, but hopefully we finally see some more major games appear on Apple’s platform now.
Vision Pro’s design
Vision Pro shares the same design language as AirPods Max, sporting a Digital Crown and a long power button. The headset has audio built in, but not as headphones: The audio outputs rest on the sides of your head, rather than over or in your ears. It does, however, support wireless earbuds like AirPods. Like an Apple Watch, you can swap out the band to find the right fit for your head, and for those of us who wear glasses, there are Zeiss inserts to accommodate our prescriptions.
Vision Pro performance
Vision Pro has some impressive hardware. The internal displays are micro OLED, and fit 64 pixels in the space of a single iPhone pixel. There are 23 million pixels in total across both screens. The built-in audio matches the sound to your room to create the illusion that the sound is happening in your space.
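Those display claims are easy to sanity-check with back-of-the-envelope arithmetic. Here's a quick sketch in Python; the ~460 ppi figure for a recent iPhone panel is my assumption, not something Apple cited:

```python
# Rough sanity check of the Vision Pro display claims above.
# Assumption: a recent standard iPhone panel is ~460 ppi (not stated by Apple).

iphone_ppi = 460

# "64 pixels in the space of a single iPhone pixel" implies an 8x8 grid,
# i.e. 8x the linear pixel density.
headset_ppi = iphone_ppi * 8
print(f"Implied headset density: ~{headset_ppi} ppi")  # ~3,680 ppi

# "23 million pixels in total across both screens" works out to ~11.5 million
# per eye, comfortably more than a 4K UHD frame (3840 x 2160 = ~8.3 million),
# which squares with the "4K per eye" framing elsewhere in this coverage.
per_eye = 23_000_000 / 2
print(f"Pixels per eye: ~{per_eye / 1e6:.1f} million")
print(f"4K UHD frame: {3840 * 2160 / 1e6:.1f} million pixels")
```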
The headset has an M2 chip built in, but also a new R1 chip, which handles the cameras, sensors, and microphones. There's an internal cooling system to keep things from getting too hot, but that won't help you when it comes to battery life: You get only two hours of use with the external battery attachment, though the headset can be plugged into external power for all-day use.
You can FaceTime with this thing, but obviously, there's no way to properly capture your face during the call. So there's a feature that lets you scan your face and creates a digital version of yourself from the scan. It's a step up from a Memoji or a character in Meta's metaverse, but it is a bit odd.
Don’t think people will be able to steal your headset and use it for themselves, either: Vision Pro authenticates with a new feature called Optic ID, which scans your irises rather than your face to unlock. Wild.
As you might guess, a headset like this costs a lot. Apple is pricing the headset at $3,499, and is making it available early next year.
It's official: watchOS 10 is coming to the best Apple Watch models this year. Announced at WWDC 2023, it's a big update, providing instant access to a lot of your most-used tools without having to trawl through the old watchOS's honeycomb of apps and complications, and it will certainly change the way we use the Watch as we know it. Helpfully, Apple didn't list which Apple Watch models, if any, would be excluded from the update, so we have to assume watchOS 10 will be available on the Apple Watch Series 4 and newer, just as watchOS 9 is.
1. Widgets

Apple has revamped and improved its old Glances system into all-new widgets, which sit at the forefront of the watchOS 10 design ethos. The aim is to allow you to open up your watch and use it instantly by scrolling through your favorite widgets with the Digital Crown rather than picking through the polka-dot Where's Waldo? cloud of apps.
Widgets give you access to some of your most-used functions, such as weather (which got a big redesign), workout metrics, stock tickers, timers, and more. The timer, for example, can be activated via the Complications widget with a simple press of the crown, and then you can check on it simply by scrolling up from your watch face.
2. Revamped home and lock screen
Of course, if you’re seeing all these personalized widgets as soon as you open the watch, it’s going to be a very different-looking home screen. You can have the Portraits watch face loaded up and simply turn the Digital Crown to flip straight into the widgets.
The home screen uses machine learning to show you relevant information at the right time of day. Early in the morning? You get meetings and weather information, while later on, you’ll be served other (unspecified) information, probably a traffic report or something similar. Something like a timer will jump to the top of the list when it's active.
3. Two new watch faces
A pair of new watch faces have been added to watchOS 10. Palette watch faces offer beautiful pastel tones that shift as time moves through the day. However, most of our excitement is reserved for the second new face: Snoopy and Woodstock!
The pair of famous cartoon characters are animated and play off what’s happening around you, gathering information from the weather and activity apps. For example, if you’re working out, Snoopy and Woodstock will be animated to be exercising with you, while if it’s raining, they may be sporting umbrellas.
4. New workout features
A redesigned activity app plays host to a bevy of new features, including a redesigned trophy case and full-screen views for different aspects of the movement rings.
Cycling is the individual activity that has arguably got the biggest overhaul, with Bluetooth input from cycling computers enabling you to add Cadence and Power to your metrics. If you do so, Apple Watch will estimate your Functional Threshold Power, or FTP, creating five Power Zones that operate like heart rate zones. The Apple Watch can also turn a paired iPhone into a live activity HUD, so you can attach the watch to your bike frame or handlebars and get an indication of how you’re doing at a glance.
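Apple didn't spell out how those five Power Zones are derived, but power zones are conventionally computed as percentage bands of FTP. Here's a minimal sketch using common coaching cutoffs as stand-ins; the exact percentages are my assumption, not Apple's published values:

```python
# Illustrative five-zone power model derived from FTP.
# The percentage cutoffs below are common coaching conventions,
# not Apple's published boundaries.

def power_zones(ftp_watts: int) -> list[tuple[str, int, int]]:
    """Return (zone name, lower watts, upper watts) bands from FTP."""
    cutoffs = [0.0, 0.55, 0.75, 0.90, 1.05, 1.20]  # fractions of FTP
    names = ["Z1 Recovery", "Z2 Endurance", "Z3 Tempo",
             "Z4 Threshold", "Z5 VO2 Max"]
    return [
        (name, round(ftp_watts * lo), round(ftp_watts * hi))
        for name, lo, hi in zip(names, cutoffs[:-1], cutoffs[1:])
    ]

for name, lo, hi in power_zones(250):  # e.g. a rider with a 250 W FTP
    print(f"{name}: {lo}-{hi} W")
```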
Hiking also got an overhaul, with the compass automatically generating emergency waypoints, including the last place you got reception – a Cellular Waypoint. A three-dimensional view of the compass also shows you elevations, and in the US, a new topographic map – seemingly without the need for GPX files – can also be used with the hiking app. Combined with the Apple Watch Ultra, this is going to be an outdoor beast.
5. Mental Health
New functionality in the mindfulness app is designed to help you identify your feelings and journal your thoughts, through a simple sliding scale of unpleasantness-to-pleasantness, adding additional notes and tags to dictate exactly what’s making you feel this way – you'll be able to compare it to other data in the health app, so you could see how not having enough sleep or not getting exercise changes your mental health.
Standardized assessments used in clinics can add more information and will let you know if you should talk to someone. Helpful articles and features also round out these “screening tools and resources”.
6. Daylight Sensitivity
Apple is concerned about children developing myopia, or near-sightedness, and says that it's less likely to develop if children spend at least 80 minutes in daylight. Apple Watch can now measure time spent outside using the ambient light sensor, and parents can monitor it in the Health app. It comes as Screen Distance is being used across other devices to help people avoid holding their iPhones and iPads too close, which can be another cause of issues while children's eyes are developing, apparently.
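Apple hasn't published how the watch decides you're in daylight, but an ambient-light approach plausibly reduces to thresholding lux readings over time. A purely hypothetical sketch of the idea; the 10,000 lux cutoff and one-minute sampling are assumptions:

```python
# Hypothetical daylight-time estimator from ambient light samples.
# Apple hasn't published its method. The 10,000 lux threshold is an
# assumption: indoor lighting is typically a few hundred lux, while
# outdoor daylight runs from ~1,000 lux (heavy overcast) to 100,000+.

DAYLIGHT_LUX = 10_000
SAMPLE_MINUTES = 1  # one lux reading per minute

def daylight_minutes(lux_samples: list[float]) -> int:
    """Count minutes where ambient light looks like outdoor daylight."""
    return sum(SAMPLE_MINUTES for lux in lux_samples if lux >= DAYLIGHT_LUX)

readings = [300, 450, 12_000, 25_000, 18_000, 800, 30_000]  # fake day
print(f"Estimated time in daylight: {daylight_minutes(readings)} min")
```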
Unless you've been living under a rock, you've probably heard the term "generative AI" at least a handful of times now, perhaps thanks to the wildly popular ChatGPT service. The AI-powered chatbot's success didn't just shine a spotlight on OpenAI, the creator behind it, but it also catalyzed an AI arms race in the tech industry – a race from which Apple has been noticeably absent.
In May, Google made a flurry of AI-related announcements at its annual developer conference, including a new AI-infused version of search and Bard, its AI-powered chatbot, which is being rolled out across the world. It's not just Google. Before that, Microsoft built generative AI into its suite of long-established productivity apps like Word, PowerPoint, and Outlook in a move that's changing how more than a billion people work. In February, Meta released its own sophisticated AI model, which has many of the same capabilities as ChatGPT and Bard, as open-source software for public use.
But what about Apple?
The short answer: Even though AI technology is hardly new to Apple, the iPhone maker still remains missing – at least publicly – from the current generative AI gold rush.
"We're in the heart of the generative AI hype cycle, and there are major new developments weekly, " Avi Greengart, analyst at Techsponential, told CNET. "Apple can afford to be deliberate in how it applies new technologies to fit its ecosystem."
Apple has typically adopted a wait-and-see approach around emerging technology, and that has often worked for the tech giant. For instance, the iPad wasn't the first-ever tablet, but for many, including CNET editors, it is the best tablet. A more recent example on the hardware side is foldable phones. Apple is the only major holdout, with Google beating it to the punch. The search giant launched its inaugural foldable phone, the Pixel Fold, at its developer conference in May – and it hasn't been making phones for as long as Apple. There are rumors, however, that a foldable iPhone, possibly known as the iPhone Flip, could go to market in 2025.
Based on remarks from CEO Tim Cook, it seems like Apple may be taking a similar approach with generative AI. "I do think it's very important to be deliberate and thoughtful in how you approach these things," Cook said in response to a question related to generative AI on Apple's earnings call in May. "And there's a number of issues that need to be sorted… AI is being talked about in a number of different places. But the potential is certainly very interesting."
However, with a fast-developing AI technology, Apple could risk falling far behind its rivals. For all Apple's business success, it has lagged in specific categories. For instance, its HomePod smart speaker didn't hit the market until years after the Amazon Echo and Google Home, which have a far higher market share than Apple in the smart speaker category.
When it comes to the subject of AI, Apple isn't alone in adopting a cautious approach. It's also coming from the technology's own backers – including the founder and CEO of OpenAI, Sam Altman, who has concerns ranging from election disinformation to mass jobs displacement.
Last Tuesday, speaking before a Senate subcommittee, Altman said he's "eager" for artificial intelligence to be regulated. He also spoke about the promise of artificial intelligence and discussed its potential harms. "If this technology goes wrong, it can go quite wrong," he said.
Altman's comments followed calls by a group of AI researchers and tech leaders, including Elon Musk and Steve Wozniak, to pause development of AI systems more powerful than GPT-4 over concerns about runaway risks without sufficient guardrails. Geoffrey Hinton, credited as the "godfather of AI," resigned from Google in May so that he could freely share his concerns about the technology he helped create, which he says could cause the world serious harm.
Does generative AI fit into Apple's business?
Although Apple hasn't publicly entered the generative AI fight, a recent 9to5Mac report said that the iPhone maker is working on an upgrade to Siri, one that could improve the virtual assistant's conversational abilities via ChatGPT-like AI concepts. Apple didn't reply to a request for comment.
While Apple hasn't publicly discussed any plans for generative AI-based products, Cook did discuss the company's focus on AI during its May earnings call. He cited AI-powered features like fall and crash detection, which are both available on the latest iPhones and Apple Watches.
"We view AI as huge," he said. "We'll continue weaving it into our products on a very thoughtful basis."
AI is far from a brand new concept to Apple. Siri, which was released 12 years ago, uses speech recognition and machine learning to understand a query and serve up an answer. In recent months, Apple debuted camera enhancements such as photographic styles and the ability to cut and paste a subject from an image, both of which depend on AI.
In addition, Apple's Macs and MacBooks, which now run on Apple-designed M1 and M2 chips, have dedicated neural engines with 16 cores, which are aimed at AI and machine learning tasks. Apple says AI performance is 40% faster than with its old Intel chips.
"You can expect that AI performance will become more and more important as more developers figure it out," wrote CNET's Stephen Shankland in a January article detailing Apple's M2 chipset.
But as Greengart highlights, it would make sense for Apple to bring the tech to certain products that extend beyond Siri as well as its current AI-powered offerings.
"Apple likes to position itself as being at the intersection of technology and liberal arts," Greengart told CNET in an email. "Generative AI would fit nicely into tools and software that Apple provides for artistic and personal expression; that could include anything from GarageBand to photo editing to email across iPhones, iPads, and Mac."
However, a chatbot in the vein of OpenAI's ChatGPT or Google's Bard is likely not in the cards for Apple. The underlying technology behind those chatbots, known as large language models, has high resource requirements for development. That means significant investment in computing resources, human talent, and power, making it feasible mainly for huge enterprises with vast resources. While Apple presumably has those resources, it'll have to be a worthwhile investment for the iPhone maker.
All eyes on WWDC
After Google devoted a considerable amount of air time to generative AI at its conference this month, all eyes are on Apple and what it might reveal at its Worldwide Developers Conference on June 5. Apple executives could offer more clues on how the iPhone maker views generative AI and how it fits into the broader business. At WWDC, Apple typically introduces new software for the iPhone, Apple Watch, iPad and so on, and it's possible that Apple could bake more AI into those updates.
Ahead of the conference, Apple previewed a slew of accessibility software features expected to make their way to its upcoming iOS 17 mobile operating system. One of the noteworthy drops is called Personal Voice. It uses on-device machine learning to allow users at risk of speech loss to replicate a voice after about 15 minutes of training. The phone can then speak aloud typed-out phrases, and it's compatible with FaceTime and phone calls in a feature that could be a form of generative AI for voice.
More likely to take center stage, however, is Apple's highly anticipated mixed reality headset, which would mark the company's first entry into a new hardware category since 2015. According to a January Bloomberg report, it'll cost around $3,000, run on Apple's latest M2 chipset, boast eye- and hand-tracking systems, and feature a digital crown that lets users switch between AR and VR modes. It's also probable that Apple will take advantage of fast-developing AI technology for its latest device as well, even if it doesn't receive explicit mention.
"We need to keep in mind that generative AI is not only about generating text but also other types of content like graphics," Will Wong, of market researcher International Data, told CNET. "Thus, it will be an area that is favorable for Apple to look into, especially if there is an AR/VR headset that comes into its product portfolio."
Your iPhone will soon be able to replicate your voice after 15 minutes of training
CNN —
Apple on Tuesday announced a series of new accessibility tools for the iPhone and iPad, including a feature that promises to replicate a user’s voice for phone calls after only 15 minutes of training.
With an upcoming tool called Personal Voice, users will be able to read text prompts to record audio and have the technology learn their voice. A related feature called Live Speech will then use the “synthesized voice” to read the user’s typed text aloud during phone calls, FaceTime conversations and in-person conversations. People will also be able to save commonly used phrases to use during live conversations.
The feature is one of several aimed at making Apple’s devices more inclusive for people with cognitive, vision, hearing and mobility disabilities. Apple said people who may have conditions where they lose their voice over time, such as ALS (amyotrophic lateral sclerosis) could benefit most from the tools.
“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, in a company blog post. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”
Apple said the features will roll out later this year.
While these tools have potential to meet a genuine need, they also come at a moment when advancements in artificial intelligence have raised alarms about bad actors using convincing fake audio and video – known as “deepfakes” – to scam or misinform the public.
In the blog post, Apple said the Personal Voice feature uses “on-device machine learning to keep users’ information private and secure.”
Other tech companies have experimented with using AI to replicate a voice. Last year, Amazon said it’s working on an update to its Alexa system that would allow the technology to mimic any voice, even a deceased family member. (The feature has not yet been released).
In addition to the voice features, Apple announced Assistive Access, which combines some of the most popular iOS apps, such as FaceTime, Messages, Camera, Photos, Music and Phone, into one Calls app. The interface includes high-contrast buttons, large text labels, an option for an emoji-only keyboard and the ability to record video messages for people who may prefer visual or audio communications.
Apple is also updating its Magnifier app for the visually impaired. It will now include a detection mode to help people better interact with physical objects. The update would allow someone, for example, to hold up an iPhone camera in front of a microwave and move their finger across the keypad as the app labels and announces the text on the microwave’s buttons.
Will Apple's Reality Pro signal the beginning of the immersive internet?
There are countless reasons to be skeptical about Apple launching its own virtual reality headset. But, there's one reason to believe Apple thinks it has cracked the code: the fact that it's actually going to unveil a product at WWDC 2023.
Apple only launches into a new category if the world's largest tech company believes it can transform the landscape and bring unique value to the world in the process.
It's a foregone conclusion that Apple will announce a headset at its annual Worldwide Developer Conference on June 5. What's less clear is why, and what Apple has discovered that Google, Samsung, Sony, Facebook/Meta, and countless others have missed over the past decade. None of them have been able to give the world a reason to love VR.
What Apple shows us on June 5 is likely to be fascinating and limited. In the same way that the first Mac, iPod, iPhone, and Apple Watch fascinated us with their possibilities but were extremely limited in their initial capabilities, the Reality Pro (if that's what it's truly named) will likely be less groundbreaking in its physical form than in the picture Apple paints of what it will ultimately make possible.
Despite VR's struggles in recent years, we have learned that there are things it does very well. For example, VR is rapidly making a massive impact on training. According to research from PwC, learners who use VR train four times faster than classroom learners, are four times more focused than e-learning students, are 275% more confident to apply skills after learning, and are 3.75 times more emotionally connected to what they learn than people in classrooms.
That kind of training effectiveness is likely to be in high demand in the years ahead as AI impacts the future of jobs and a ton of workforce retraining will inevitably be needed.
It makes sense that VR is so effective as a training tool, because it's so immersive. The difference between VR's fully engaged experience and reading a web page or watching a video -- while notifications are going off on your phone, or the person next to you asks you a question, or you reach for that snack on your desk -- is obviously a huge gap.
The question now is where can that level of immersiveness create other kinds of next-generation experiences?
Could Apple's long-awaited headset have a similar design to recent HTC Vive models?
That's likely the big picture that Apple is seeing and thinking about. And it's likely what has led the company to start a journey toward creating a headset, a future pair of augmented reality glasses, and an ecosystem of experiences that are far more immersive and engaging than what we have today on our computers, smartphones, and tablets.
To be clear, this headset won't replace those everyday devices, and it won't be something we use all the time. But it will be something we can use for some of our most important or favorite experiences, where we want to go deeper. Think about the most important experiences in your work or life, and what it might look like if you could step into them in a deeper and more engaged way.
For example, a realtor could now offer a client five virtual walkthroughs of their top housing choices in less than an hour, rather than having to block out a whole day and drive around the city. A family choosing a vacation spot could pass around a headset and each tour three different options, including quick looks at hotels, restaurants, and activities. A sports lover could use a headset to take them to a live event and put them in a front row seat that they wouldn't be able to afford in real life. A person wanting a job in a highly specialized field could put on a headset and apprentice with a world-renowned expert on the other side of the planet as part of a one-to-many program that feels like one-to-one.
The internet has already given us the first level of many of the experiences mentioned above. Virtual reality is at the beginning of unlocking the next level. And now Apple is going to bring its focus on usability, elegant hardware, and of course, powerful consumer storytelling.
So when Apple announces its long-anticipated headset at WWDC 2023, it's probably less useful to think of it as Apple entering the VR market. It's probably more useful to think of Apple's arrival in this space as a signal that the new generation of an immersive internet is about to take some of our most valuable digital experiences to the next level.
This could be Apple's biggest product launch since the Apple Watch
At its Worldwide Developers Conference, which kicks off Monday at its Cupertino, California, campus, Apple (AAPL) is widely expected to introduce a “mixed reality” headset that offers both virtual reality and augmented reality, a technology that overlays virtual images on live video of the real world.
The highly anticipated release of an AR/VR headset would be Apple’s biggest hardware product launch since the debut of the Apple Watch in 2015. It could signal a new era for the company and potentially revolutionize how millions interact with computers and the world around them.
But the headset is just one of many announcements expected at the developers event. Apple will also show off a long list of software updates that will shape how people use its most popular devices, including the iPhone and Apple Watch.
Apple may also tease how it plans to incorporate AI into more of its products and services, and keep pace with a renewed arms race over the technology in Silicon Valley.
The event will be livestreamed on Apple’s website and YouTube. It is set to start at 10:00 a.m. PT/1:00 p.m. ET.
Here’s a closer look at what to expect:
A ‘mixed reality’ headset
For years, Apple CEO Tim Cook has expressed interest in augmented reality. Now Apple finally appears ready to show off what it’s been working on.
According to Bloomberg, the new headset, which could be called Reality One or Reality Pro, will have an iOS-like interface, display immersive video and include cameras and sensors to allow users to control it via their hands, eye movements and with Siri. The device is also rumored to have an outward-facing display that will show eye movements and facial expressions, allowing onlookers to interact with the person wearing the headset without feeling as though they’re talking to a robot.
Apple’s new headset is expected to pack apps for gaming, fitness and meditation, and offer access to iOS apps such as Messages, FaceTime and Safari, according to Bloomberg. With the FaceTime option, for example, the headset will “render a user’s face and full body in virtual reality,” to create the feeling that both are “in the same room.”
The decision to unveil it at WWDC suggests Apple wants to encourage developers to build apps and experiences for the product in order to make it more compelling for customers and worth the hefty price tag.
The company is reportedly considering a $3,000 price tag for the device, far more than most of its products, testing potential buyers' appetite at a time of lingering uncertainty in the global economy. Other tech companies have struggled to find mainstream traction for headsets. And in the years that Apple has been rumored to be working on the product, the tech community has shifted its focus from VR to another buzzy technology: artificial intelligence.
But if any company can prove skeptics wrong, it’s Apple. The company’s entry into the market combined with its vast customer base has the potential to breathe new life into the world of headsets.
New MacBooks
A mixed reality headset may not be the only piece of hardware to get stage time this year.
Apple is expected to launch a new 15-inch MacBook Air packing the company’s M2 processor. The current size of the MacBook Air is 13 inches.
Previously, users who wanted a larger-sized Apple laptop would need to buy a higher-end MacBook Pro.
New features for iPhone, iPad and Apple Watch
Considering WWDC is traditionally a software event, Apple executives will likely spend much of the time highlighting the changes and upgrades coming to its next-generation mobile operating systems, iOS 17 and iPadOS 17.
While last year’s updates included a major design overhaul of the lock screen and iMessage, only minor changes are expected this year.
With iOS 17, Apple is expected to double down on its efforts around health tracking by adding the ability to monitor everything from a user's mood to how their vision may change over time. According to the Wall Street Journal, Apple will also launch a journaling app not only as a way for users to log their thoughts but also their activity levels, which can then be analyzed to reveal how much time someone spends at home or out of the house.
The new iOS 17 is also said to get a lock screen refresh: When positioned in horizontal mode, the display will highlight widgets tied to the calendar, weather and other apps, serving as a digital hub. (iPadOS 17 is also expected to get some of the same lock screen capabilities and health features.)
Other anticipated upgrades include a watchOS update that would focus on quick glances at widgets, and more details about Apple's next-generation CarPlay platform, which it initially teased last year.
Apple tries to flex its AI muscle
While much of the focus of the event may be on VR, Apple may also attempt to show how it’s keeping pace with Silicon Valley’s current obsession: artificial intelligence.
Apple reportedly plans to preview an AI-powered digital coaching service, which will encourage people to exercise and improve their sleeping and eating habits. It's unclear how it would work, but the effort comes at a time when Big Tech companies are racing to introduce AI-powered technologies in the wake of ChatGPT's viral success.
Apple may also demo and expand on some of its recently teased accessibility tools for the iPhone and iPad, including a feature that promises to replicate a user’s voice for phone calls after only 15 minutes of training.
Most of the other Big Tech companies have recently outlined their AI strategies. This event may be Apple’s chance to do the same.
Apple WWDC 2023 expectations: Reality Pro, iOS 17 to MacBooks, know what is on the cards
Apple's highly anticipated Worldwide Developer Conference (WWDC) is just around the corner, with the keynote scheduled for Monday, June 5. This year's WWDC is expected to be a significant event for Apple, unveiling new products and software updates. Rumours suggest that Apple might finally reveal its long-awaited mixed reality headset, codenamed "Reality Pro," which will provide virtual reality (VR) and augmented reality (AR) experiences. Alongside the headset, we can anticipate updates to operating systems, new apps and features, and the introduction of new hardware, including a larger MacBook Air. Let's dive into the details of what to expect from WWDC 2023.
Apple's Rumoured Mixed Reality Headset
All eyes are on Apple's mixed reality headset, generating a lot of excitement. Although Apple has not officially confirmed its existence, reports indicate that the headset will offer both VR and AR experiences. Referred to as "Reality Pro," this developer-focused device is rumoured to have a ski goggle-like design with a dial for transitioning between virtual reality and the real world. According to Bloomberg's Mark Gurman, the headset may feature external sensors for hand tracking and internal eye-tracking sensors. The standalone headset is likely to have a battery pack connected via a proprietary cable. Analyst Ross Young suggests that the device could sport two 1.41-inch Micro OLED screens with 4,000 pixels per inch (ppi), delivering high brightness of over 5,000 nits and 4K resolution for each eye.
A larger 15-inch MacBook Air

Another exciting expectation is the introduction of a larger 15-inch MacBook Air. Traditionally featuring a 13-inch display, the MacBook Air is rumoured to receive an upgrade with the inclusion of the in-house M2 chip, similar to last year's models. Reports from Bloomberg suggest that the screen resolution might match that of the 14-inch MacBook Pro, with a slight reduction in sharpness due to the increased screen size. Apple is also speculated to release several new Macs, potentially including a refreshed 13-inch MacBook Pro, 13-inch MacBook Air, and a 24-inch iMac, all powered by the new M3 chip. Gurman's sources hint at an expanded trade-in program, suggesting the arrival of new devices.
Expected updates to iOS, iPadOS, and macOS
WWDC primarily caters to developers, so updates to Apple's operating systems are a given. iOS 17 is expected to bring various quality-of-life improvements, including new accessibility features like the Personal Voice tool, which enables users to create synthetic voices with just 15 minutes of training. Bloomberg reports suggest that iOS 17 might introduce a feature that transforms the iPhone's lock screen into a smart home-style display when the phone is tilted horizontally. iPadOS 17 is likely to bring updates to Stage Manager, the multitasking interface released last October, aimed at enhancing its functionality. Details on macOS 14 and tvOS 17 are scarce, but watchOS 10 is anticipated to receive a substantial update with a new interface centred around widgets. There are also rumours of the Health app making its way to iPadOS, benefiting users of Apple Fitness+. watchOS 10 might bring improvements to the home screen, functional widgets, and new functionality for the Apple Watch's Digital Crown.
In short, Apple's WWDC 2023 is set to attract technology enthusiasts worldwide with the potential unveiling of the Reality Pro mixed reality headset, the anticipated 15-inch MacBook Air, and a range of software updates across iOS, iPadOS, macOS, and more. As Apple continues to push boundaries and introduce innovative products and features, the future of technology looks promising. Stay tuned for more updates.
Apple says your iPhone will soon be able to speak in your voice with 15 minutes of training
Apple announced a set of new accessibility features on Tuesday.
That includes "Personal Voice," which will replicate your voice through AI in 15 minutes.
It's designed to help people who struggle to speak with phone calls and in-person conversations.
If you have an iPhone or iPad, you'll soon be able to hear it speak in your own voice, Apple announced Tuesday.
The upcoming feature,"Personal Voice," will deliver users randomized text prompts to generate 15 minutes of audio.
There'll be another new tool called "Live Speech" which lets users type in a phrase, and save commonly-used ones, for the device to speak during phone and FaceTime calls or in-person conversations.
Apple says it'll use machine learning, a type of AI, to create the voice on the device itself rather than externally so the data can be more secure and private.
It might sound like a quirky feature at first, but it's actually part of the company's latest drive for accessibility. Apple pointed to conditions like ALS where people are at risk of losing their ability to speak.
"At Apple, we've always believed that the best technology is technology built for everyone," said Tim Cook, Apple's CEO.
The Personal Voice feature in use. Courtesy of Apple
And Philip Green, a board member at the Team Gleason nonprofit whose voice has changed significantly since being diagnosed with ALS, said in the press release: "At the end of the day, the most important thing is being able to communicate with friends and family."
"If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world," he added.
It's not the first time Apple has ventured into the AI voice market, as iPhone users will be familiar with Siri. It uses machine learning to understand what people are saying.
And back in 1984, Steve Jobs was passionate about getting the Apple Macintosh 128K to say "Hello" in a voice demo at its unveiling. That was pretty advanced tech for the time, and it was dramatized in a key plot point of the 2015 biopic about the late Apple cofounder.
It's not clear exactly when Personal Voice will be available, but Apple says it'll be before the end of the year.
What training with the Apple Watch Ultra taught me about multiband GPS and failure
It’s easy to blame trackers when fitness goals don’t go according to plan, but sometimes the problem lies closer to home.
By Victoria Song, a writer focusing on wearables, health tech, and more, with 11 years of experience. Before coming to The Verge, she worked for Gizmodo and PC Magazine.
Illustration by Hugo Herrera for The Verge
I almost quit this year’s New York City Half Marathon.
The moment is seared into my brain. I'd been running for nearly two hours in freezing temperatures, straight into the wind. The Apple Watch Ultra on my left wrist buzzed to tell me I'd just passed mile nine. On my right wrist, the Garmin Forerunner 265S said I'd only run 8.55 miles. A short-ish distance ahead, I could see the official mile nine marker. I had no idea which distance was "true." It didn't matter, though. All I wanted was to beat last year's time, even by just one millisecond. That had felt like an achievable goal. I'm no math whiz, but what I saw on the official clock meant I'd have to run the last four miles at Eliud Kipchoge-level speeds to match last year's time. That wasn't just unachievable. It was impossible.
I broke. If it weren’t for a well-timed cheer from a friend around mile 10, I probably would’ve called it a day. I’m really not sure how I powered through the rest of the race; something inside me died at the finish line. (The watches didn’t die, though; the race barely made a dent in the Ultra’s battery, and that was without any low-power settings.)
Whatever it was, it left a gaping hole that no finisher's medal could ease. I hadn't missed my goal by one or two minutes. I was a whole 13 minutes slower than last year. None of it made sense. Sixteen weeks of consistent training should've been more than enough for a race I was familiar with. So after 48 hours of moping, I set out to find out what had killed my half-marathon dreams.
Suspect one: GPS
You’re guaranteed to see GPS watches at any road race. In outdoor running, GPS watches help you calculate pace and distance, both of which are crucial when training. The more accurate your GPS watch, the easier it is to trust the results of your training. Maybe I whiffed it on race day because the stars and satellites were misaligned.
Its superior GPS is one reason I picked the Apple Watch Ultra as my primary training watch for the NYC Half. (That and I wanted to spend more time with the new running form metrics in watchOS 9.) Not only is the Ultra geared toward endurance athletes but it’s also one of a handful of smartwatches that have dual-frequency GPS.
The appeal of this new-ish technology is that it’s supposed to deliver next-level accuracy. While I’d been running with the Ultra since it launched, I’d yet to see how it stacked up against a Garmin on longer distances over an extended training period. On my spare wrist, I alternated between the Garmin Fenix 7S Sapphire Solar and Forerunner 265S, which also have dual-frequency GPS, for my long runs. (I’d have done it for the entire 16 weeks, but the plight of a smartwatch reviewer is you have to keep one wrist free at all times for new products.)
Let’s talk GPS terminology
GPS stands for Global Positioning System, a satellite-based navigation system operated by the US. Colloquially, we use the term GPS to refer to all global navigation satellite systems (GNSS), even though it properly refers to only one type. Some other well-known GNSS include Russia’s GLONASS, Europe’s Galileo, and China’s Beidou.
Most consumer tech devices with GPS operate on the L1 frequency band. For the past few years, however, an increasing number of gadgets have used the L5 band as well. Devices that use both the L1 and L5 bands are often described as having dual-frequency or multiband GPS / GNSS. On its own, the L1 band is often foiled by tall buildings, foliage, tunnels, and water. Using both bands improves the signal-to-noise ratio, so accuracy isn't as badly affected by these obstacles.
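The sidebar focuses on obstructions and signal-to-noise, but a classic textbook benefit of having two frequencies is that ionospheric delay on a GNSS signal scales with 1/f², so measurements on two bands can be combined to cancel the first-order ionospheric error entirely. Here is a small Swift sketch of that standard combination, with made-up input values and no claim about how any particular watch implements it:

```swift
import Foundation

// Standard dual-frequency (ionosphere-free) pseudorange combination.
// Illustrative only; the measurement values below are made up.
let f1 = 1575.42e6  // GPS L1 carrier frequency, Hz
let f5 = 1176.45e6  // GPS L5 carrier frequency, Hz

/// Combines pseudoranges (meters) measured on L1 and L5 so that the
/// first-order ionospheric delay cancels out.
func ionosphereFreeRange(p1: Double, p5: Double) -> Double {
    let g1 = f1 * f1
    let g5 = f5 * f5
    return (g1 * p1 - g5 * p5) / (g1 - g5)
}

// Hypothetical measurements: ionospheric delay scales with 1/f^2, so the
// lower-frequency L5 signal is delayed more than L1.
let trueRange = 20_200_000.0                   // meters (made up)
let delayL1 = 4.0                              // meters of iono delay on L1
let delayL5 = delayL1 * (f1 * f1) / (f5 * f5)  // ≈ 7.17 m on L5
let corrected = ionosphereFreeRange(p1: trueRange + delayL1,
                                    p5: trueRange + delayL5)
// corrected ≈ 20,200,000 m; the first-order ionospheric error drops out.
```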
Looking back at the data, the Ultra and both Garmins delivered similar results during training. At most, I’d see a difference of maybe a tenth of a mile. That wasn’t the case on race day. The Ultra recorded 13.42 miles, while the Forerunner reported 12.92 miles. Neither of those distances is 13.1 miles, which is the official length of the course. And yet, the maps generated by both were nearly identical. Clearly, something had happened that day. While I consider myself well versed in the practicalities of GPS watches, I wanted to talk to an expert about what happened. So I asked Apple to get into the nitty-gritty of how the Ultra’s multiband GPS works — and why my data was so different on the day it mattered most.
I expected Apple to launch into why the Ultra’s GPS was leagues ahead of the competition’s. To be fair, every smartwatch maker will tell you their GPS tech is the best. That said, I was surprised that the Ultra (plus the Series 8 and SE) doesn’t rely on GPS alone.
“A product like this, that also has network connectivity, enables us to use the entire system in ways that traditional GPS systems can’t,” says Rob Mayor, Apple’s director of motion and location technologies.
This thin gray metal band is what enables the Ultra to access the L5 frequency. Photo by Amelia Holowaty Krales / The Verge
Traditional GPS requires downloading a satellite’s estimated position in order to begin tracking. That can create challenges if you’re in an obstructed environment. Signals can be corrupted, take more time to download, or get blocked by objects like skyscrapers, canyons, or tree foliage. Mayor says the Ultra can cache orbit predictions for up to a week. That means you can go offline and still get an immediate location fix because you don’t need to wait for your watch to decode that information; it’s already there.
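To make the caching idea concrete, here is a minimal sketch of week-long orbit-prediction caching. Every name and structure here is hypothetical; Apple hasn't published its internals, and this only illustrates the behavior Mayor describes.

```swift
import Foundation

// Hypothetical sketch of week-long orbit-prediction caching, as described above.
// Types and fields are invented for illustration; they are not Apple's internals.
struct OrbitPrediction {
    let satelliteID: Int
    let fetchedAt: Date
    // ... predicted ephemeris parameters would live here
}

struct OrbitCache {
    private var predictions: [Int: OrbitPrediction] = [:]
    private let validity: TimeInterval = 7 * 24 * 3600  // entries last up to a week

    mutating func store(_ prediction: OrbitPrediction) {
        predictions[prediction.satelliteID] = prediction
    }

    /// Returns a still-fresh prediction, letting the receiver get a position fix
    /// immediately instead of waiting to download and decode the satellite signal.
    func prediction(for satelliteID: Int, now: Date = Date()) -> OrbitPrediction? {
        guard let cached = predictions[satelliteID],
              now.timeIntervalSince(cached.fetchedAt) < validity else { return nil }
        return cached
    }
}
```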
Similarly, if you fly to another state for a race, the Ultra doesn’t have to go fishing in the sky for the correct satellites. According to Mayor, the watch can acquire GPS more quickly by leveraging Wi-Fi access points and cell-tower locations to get a rough idea of your location and figure out which satellites to look for. Maps data also plays into the equation. While most people in the US think of Maps directions in the context of driving, this hybrid approach can help put pedestrians on the right cycling, hiking, or running route — especially in cities. Basically, that extra Maps data ensures your route summary isn’t going to say you’re running through a river or magically ghosting through buildings.
Altogether, the additional L5 signal is cross-referenced with data from Maps and Wi-Fi for what Mayor calls hyper-accurate GPS. It’s important to maintain a healthy skepticism, but it’s hard to argue that this method doesn’t deliver freakishly accurate location data. For instance, the Ultra (plus Series 8, SE, and any watch running watchOS 9) can automatically detect when you arrive at a running track. It also knows which lane you’re running in without calibration. If I hadn’t tried it out myself — multiple times, mind you — I’d be inclined to think it’s too good to be true.
But even if the Ultra uses a blend of tech, it doesn’t piggyback off your phone’s GPS as previous Apple Watches did. Mayor told me the Ultra has gotten to the point where your iPhone’s signal doesn’t add much.
I can definitely see the benefit of Maps data here from my race. Screenshot: Victoria Song / The Verge
My Garmin data is also still pretty good, but it’s possibly a bit tripped up by tall skyscrapers in Times Square. Screenshot: Victoria Song / The Verge
That still didn’t answer why such high-tech devices with fancy GPS gave me results that didn’t match up with the official course. It’s something that I’ve noticed at multiple races, and after my half-marathon debacle, I wondered if this “miscalculation” had in some way left me underprepared.
“A lot of people don’t understand how they map and measure race courses. They assume they’re going to cross the finish line at exactly 13.1 [miles] or 26.2 [miles],” Eric Jue, director of Apple Watch product marketing, told me after I relayed my NYC Half tale. “And they’re a little bit discombobulated when they see something different.”
As it turns out, you'll run at least 13.1 miles in a half-marathon. The official distance is based on the most optimal route through the course and doesn't account for zigzagging through other runners, drifting toward the sides of the road, or stopping at water stations. Most people don't run the optimal route and end up covering a bit more ground. By that reasoning, you could argue that the Ultra's 13.42 miles is closer to what I actually ran than the Forerunner 265S's 12.92.
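To see what that extra distance means for pace, consider a hypothetical finish time; everything below is made up except the two recorded distances:

```swift
// Back-of-the-envelope pace math with a made-up 2:20:00 finish time.
let finishSeconds = 2.0 * 3600 + 20.0 * 60    // 8,400 s

let officialPace = finishSeconds / 13.1       // ≈ 641 s/mi ≈ 10:41 per mile
let actualPace   = finishSeconds / 13.42      // ≈ 626 s/mi ≈ 10:26 per mile

// Same clock time, longer actual distance: the pace you actually ran is about
// 15 seconds per mile faster than what the official course length implies.
```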
“I think that users’ perceptions [are] like, here, I have this thing that’s very precise and I’m comparing it to things that are potentially less precise, as well as my perception — which is probably less precise as well,” agrees Mayor. “It’s a confluence of those things.”
Okay, okay. It wasn’t the GPS.
Suspect two: training data and features
A couple of weeks later, I found myself at McCarren Park Track in Brooklyn. It was a chilly, overcast day, and I tried to stay warm hopping from one foot to the next. Apple had invited a gaggle of journalists out to demo watchOS 9’s running features. I’d used them before while reviewing the Ultra, but I hoped a refresher would reveal something I’d missed during my own training.
We ran around 3.5 miles, broken up into several shorter runs. Apple had us try the custom workouts, which let you program running routines, including intervals and pace targets, on the watch, while Fitness Plus trainer Josh Crosby demonstrated how certain alerts for heart rate zones, running power, and pace worked. I ran multiple laps around the track, which again highlighted the Ultra’s eerily accurate GPS. But of all the running features, the one that “bothered” me most was the pace alerts.
The feature itself is self-explanatory. Set a target pace — or a pace range — and then run. Whenever I ran too fast or too slow, the Ultra would buzz on my wrist. You’d think, after all these years, I’d have a good sense of what my 10-minute mile feels like versus an 11- or 12-minute mile. You’d think I’d be good at maintaining a consistent pace — a vital skill for running a long-distance race.
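The underlying check is simple enough to sketch. The code below is a hypothetical illustration of pace-range alerting, not Apple's implementation; the types and thresholds are invented.

```swift
import Foundation

// Hypothetical pace-alert logic; not Apple's implementation.
struct PaceRange {
    let fastest: TimeInterval  // seconds per mile; e.g. 600 = 10:00/mi
    let slowest: TimeInterval  // e.g. 660 = 11:00/mi
}

enum PaceAlert { case tooFast, tooSlow, onPace }

/// Derives current pace from distance covered and elapsed time, then compares
/// it to the target range. Fewer seconds per mile means a faster pace.
func checkPace(distanceMiles: Double, elapsed: TimeInterval,
               target: PaceRange) -> PaceAlert {
    guard distanceMiles > 0 else { return .onPace }
    let pace = elapsed / distanceMiles
    if pace < target.fastest { return .tooFast }
    if pace > target.slowest { return .tooSlow }
    return .onPace
}

// Example: 2.0 miles in 19:00 against a 10:00-11:00/mi target is 9:30/mi -> tooFast.
let alert = checkPace(distanceMiles: 2.0, elapsed: 19 * 60,
                      target: PaceRange(fastest: 600, slowest: 660))
```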
Maps data can help zero in on the exact track lane you’re running in as well as make sure your map doesn’t have you running through buildings. Photo by Victoria Song / The Verge
Turns out, I wasn’t. At least, not that track day. I kept getting alerts that I was slipping in and out of my target ranges. After a few loops at what should have been an easy pace, I found myself breathing harder than normal, wondering why it felt like I was running through molasses even though 3.5 miles was well within my wheelhouse. I thoroughly enjoyed myself, but afterward, that heaviness and the alerts lingered in my mind. Is this what a 10:30 pace really felt like these days? Had I inadvertently underestimated my pace during those 16 weeks with sporadic, intermittent checks?
For my half-marathon training, I only occasionally dabbled with these features. I mostly stuck to a training plan I'd found in Runner's World. After the day at McCarren, I felt inspired to go all in on the watchOS 9 running features on the Ultra. Custom runs with pace alerts, racing my previous times on common routes, engaging more with the metrics mid-run — you name it, I tried it over the course of a month. I figured I should see some improvement. At the very least, I would have a trove of data to pore over to see if any patterns emerged.
I learned three things. First, I’m not a competitive person. Racing my time on past routes was a unique kind of torture. Second, I’m not as good as I thought at regulating pace for runs that are over an hour long. And third, I was getting slower. Slower! Before you ask if I added strength training and proper recovery during and after half-marathon training — yes, I did.
It’s tempting to lay the blame on the Ultra (and all my other wearables). But the Ultra’s GPS maps continued to match what I got on my iPhone and my Garmin watches. My heart rate matched my Polar H10 strap. My sleep data wasn’t as accurate as the Oura Ring, but it wasn’t too far off either. If I’m honest with myself, I knew none of these devices were at fault.
After a month, I was at a loss. So I gave up. I pared the Ultra's running features down to custom workouts, with pace alerts turned on for speed work only. (Mainly so I wouldn't go out too fast at the start.) I stopped scrutinizing my workout summaries. As it turns out, as soon as I put some boundaries in place, I finally, finally started to improve.
I’d used the features properly — and they’d worked as advertised. The data was reliable. I improved as soon as I stopped trying so hard. I didn’t like what that meant, but it’s like Sherlock Holmes says. Once you eliminate the impossible, whatever remains must be the truth.
Suspect three, and the culprit: me
All I wanted from this year's NYC Half was to run it faster than last year. I wanted that because, last year, I ran the race — my first half-marathon — as part of Team ALS. I joined on a whim mere weeks after my mom died from the incurable disease. It felt like rebellion after a year defined by ALS. I had about six weeks of real training, and going from zero to 100 like that was... an experience. Four of my toenails fell off, and I honestly don't remember much of the race itself. But just as I'll never look at 2021 without thinking of ALS, I'll never look back at 2022 without feeling my grief. I wanted to run the 2023 NYC Half Marathon for myself, and without realizing it, I planted the seed in my head that doing well would mean I was finally okay.
Unfortunately, it doesn’t work like that.
Loss is exhausting, and I feel most free from its grip while running. Viewed from that lens, I suppose it’s obvious that I’d turn to the Ultra and its training metrics so that I could run the best race of my life. I could list a dozen valid reasons why that didn’t pan out, but I’ve come to accept that, at the heart of it, I didn’t really want to do better this year. Part of me wanted to tie up this chapter of my life with a pretty, symbolic bow. To move on to what’s next. The rest of me isn’t ready.
The thing about wearables — and people — is that they assume health can be measured. If you can read the metrics the right way, if you can interpret the data, then you can compare who you were to who you are becoming. Among smartwatches, the Ultra is a magnificent beast. But for all the impressive tech it packs in its titanium case, it cannot measure grief. Mental health is also health, and as they say, the body keeps the score. It just so happens that the mental side of the equation is much harder for wearables to meaningfully quantify.
It seems silly to spill this much ink over a botched race, but technically, this is how wearables are supposed to work. You collect data over a period of time. Depending on your progress (or lack thereof), you learn something about yourself. We just anticipate that progress to be a linear, ever-rising line. Growth is annoyingly not linear, and facing failure isn’t pleasant. I can’t say these are the lessons I wanted to learn, but I’m grateful for them nonetheless.
All that’s left is to take these lessons and apply them going forward. I’ve already signed up for my next race. I will 1,000 percent be testing no less than four devices — including the Ultra — while training. But for once, I have no time goal. I’m trying not to make a narrative of the data. I think I’ll simply run and see what that feels like.
Correction, May 30th, 2023, 9:30AM ET: A previous version of this article incorrectly stated dual-band frequency GPS reduces signal-to-noise ratio. In fact, it improves it. We regret the error.
Correction, May 31st, 2023, 2:40PM ET: A previous version of this article misstated that the Ultra accesses a proprietary Apple database of Wi-Fi access points. In fact, it draws on existing Wi-Fi access points and cell towers, not a proprietary database. We regret the error.
New VR/AR headset and MacBooks: What to expect from Apple's developer event