Founded in 2000, the International Association of Privacy Professionals (IAPP) bills itself as “the largest and most comprehensive global information privacy community and resource.” It is more than just a certification body. It is a full-fledged not-for-profit membership association with a focus on information privacy concerns and topics. Its membership includes both individuals and organizations, in the tens of thousands for the former and the hundreds for the latter (including many Fortune 500 outfits).
Its mandate is to help privacy practitioners develop and advance in their careers, and help organizations manage and protect their data. To that end, the IAPP seeks to create a forum where privacy pros can track news and trends, share best practices and processes, and better articulate privacy management issues and concerns.
By 2012, the organization had 10,000 members. By the end of 2015, membership had more than doubled to 23,000. According to a Forbes story published that same year, approximately half of the IAPP’s membership is women (which makes it pretty special, based on our understanding of the gender composition of most IT associations and certification programs). Current membership is likely between 30,000 and 40,000, assuming growth rates from 2012 to 2015 have continued, if not accelerated, in the face of the EU’s General Data Protection Regulation (GDPR), which took full effect on May 25, 2018. The IAPP also claims to have certified “thousands of professionals around the world.”
The IAPP has developed a globally recognized certification program around information privacy. Its current offerings include the Certified Information Privacy Professional (CIPP), Certified Information Privacy Manager (CIPM), Certified Information Privacy Technologist (CIPT), Privacy Law Specialist and Fellow of Information Privacy (FIP) credentials, each discussed below.
All these certifications comply with the ANSI/ISO/IEC 17024 standard, which means they have been developed to meet stringent requirements for analyzing the subject matter and the fields of work to which they apply, along with formal psychometric analysis of test items to make sure that exams truly differentiate those who possess the required skills and knowledge to do the related jobs from those who do not.
All the IAPP exams follow the same cost structure, though charges vary by location. In the U.S., each first-time test costs $550, with a $375 charge for any subsequent retake of the same exam. Those who already hold any IAPP certification pay just $375 for each additional certification test they take. To keep their certifications current, IAPP certification holders must either pay an annual maintenance fee of $125 (and meet continuing education requirements of 20 CPE credits every two years) or join the IAPP.
If a person joins, they’ll pay an annual membership fee. Currently, that’s $250 for professional members, $50 for student members, and $100 for all other membership categories (government, higher education, retired and not-for-profit). Those who elect to pay the certification maintenance fee need pay only once a year, no matter how many IAPP certifications they earn.
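To see how those fees add up in practice, here’s a minimal Swift sketch using the U.S. prices quoted above. It assumes every exam is passed on the first attempt and that the candidate takes the maintenance-fee route rather than membership; the function name is ours, purely for illustration.

```swift
// A rough cost comparison based on the U.S. fees quoted above.
// Assumes first-attempt passes and the $125 annual maintenance fee
// rather than IAPP membership.
let firstExamFee = 550.0
let additionalExamFee = 375.0      // rate for existing IAPP certification holders
let annualMaintenanceFee = 125.0   // one fee per year, regardless of how many certs you hold

func firstYearCost(certificationCount: Int) -> Double {
    guard certificationCount > 0 else { return 0 }
    let examFees = firstExamFee + Double(certificationCount - 1) * additionalExamFee
    return examFees + annualMaintenanceFee
}

print(firstYearCost(certificationCount: 1))  // 675.0
print(firstYearCost(certificationCount: 2))  // 1050.0 (e.g., CIPP/US followed by CIPT)
```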
IAPP exams are available at Kryterion testing centers, which can be found using Kryterion’s test center locator. Exams consist of 90 questions, and candidates have up to 150 minutes (2.5 hours) to complete any IAPP exam. Payment is handled through the IAPP website, but Kryterion handles scheduling of dates and times for exams at its test centers.
This credential is the most likely place for a person working in IT to start their IAPP efforts. The CIPT validates skills and knowledge about the components and technical controls involved in establishing, ensuring and maintaining data privacy. To be more specific, the body of knowledge (BoK) for the CIPT stresses important privacy concepts and practices that impact IT, and makes sure that practitioners understand consumer privacy expectations and responsibilities.
It also addresses how to bake privacy into early stages of IT products or services to control costs and ensure data accuracy and integrity without impacting time to market. CIPTs understand how to establish privacy policies for data collection and transfer, and how to manage privacy on the internet of things. They also know how to factor privacy into data classification, and how it impacts emerging technologies such as biometrics, surveillance and cloud computing. Finally, CIPTs understand how to communicate on privacy issues with other parts of their organizations, including management, development staff, marketing and legal.
IAPP describes this certification as just right for “the go-to person for privacy laws, regulations and frameworks” in an organization. This audience may include more senior privacy or security professionals with IT backgrounds, but it may also involve people from management, legal or governance organizations whose responsibilities include data privacy and protection concerns. This goes double for those involved with legal and compliance requirements, information management, data governance, and even human resources (as privacy is a personal matter at its core, involving personal information).
Because managing privacy and protecting private information is often highly regulated and subject to legal systems and frameworks, the IAPP offers versions of the CIPP certification where such content and coverage has been “localized” for prevailing rules, regulations, laws and best practices.
There are five such versions available: Asia (CIPP/A), Canada (CIPP/C), Europe (CIPP/E), U.S. Government (CIPP/G) and U.S. Private Sector (CIPP/US). As of this writing, the CIPP/E naturally offers the most direct and focused coverage of GDPR topics. That said, given that GDPR applies to companies and online presences globally, such material will no doubt make its way into the other CIPP versions over the next 6 to 12 months. The U.S.-focused exams are already scheduled for a refresh in August 2018, per the IAPP website’s certification pages.
For example, the CIPP/US page offers an extensive collection of preparatory and reference materials.
Each of the other regional versions of the CIPP has a similarly large, detailed and helpful collection of resources available to interested readers and would-be certified professionals.
The CIPM is a more senior credential in the IAPP collection. It seeks to identify people who can manage an information privacy program. Thus, the CIPM BoK focuses on privacy law and regulations and how those things must guide the formulation of workable and defensible privacy policies, practices and procedures for organizational use.
In general, CIPMs play a lead role in defining and maintaining data privacy policies for their organizations. They will usually be responsible for operating the privacy apparatus necessary to demonstrate compliance with all applicable privacy rules, regulations and laws for the organization as well.
The IAPP also offers two other elements in its certification programs. One is the Privacy Law Specialist, which aims at attorneys or other licensed legal professionals who wish to focus on privacy subjects in a legal context. The other, called the Fellow of Information Privacy (FIP), aims at those at the top of the privacy profession and is available only to those who’ve completed two or more IAPP credentials, including either a CIPM or a CIPT, and one or more of the CIPP credentials. It requires three professional peer referrals and completion of a detailed application form. We won’t discuss these credentials much more in this article, except to note here that the Privacy Law Specialist garnered a surprising 200 hits in our job board search (see below for other details gleaned thereby).
Finally, the IAPP website recommends the combination of CIPP/E and CIPM as the credentials of choice for those wishing to focus on GDPR.
We visited four job posting sites to check on demand for specific credentials: Simply Hired, Indeed, LinkedIn and LinkUp. Here’s what we learned.
Certification | Search string | Simply Hired | Indeed | LinkedIn | LinkUp | Total
---|---|---|---|---|---|---
CIPP | CIPP | 668 | 745 | 1,064 | 401 | 2,878
CIPM | CIPM | 187 | 198 | 260 | 191 | 836
CIPT | CIPT | 146 | 155 | 276 | 210 | 787
The breakdown for CIPP fell out like this: CIPP/A 27, CIPP/C 287, CIPP/E 351, CIPP/G 154 and CIPP/US 401. As you’d expect, the two U.S. categories combine for the largest share, with Europe a surprising second ahead of third-place Canada.
Salary information appears in the next table. We collected low, median and high values for each credential, finding surprisingly little difference between the CIPM and the CIPP. Given that a CIPM is likely to hold a management position, this shows that the CIPP holds considerable value in employers’ estimations. It’s also interesting that the median values show the CIPT and the CIPP are close to one another too. This bodes well for IT professionals interested in pursuing the CIPT.
Certification | Low | Median | High
---|---|---|---
CIPP | $33,969 | $66,748 | $131,156
CIPM | $41,069 | $73,936 | $133,106
CIPT | $32,131 | $62,279 | $120,716
Privacy Law Attorney | $46,146 | $89,026 | $171,752
Typical positions for privacy professionals are very much one-offs. We found a risk management and compliance manager position at a South Carolina government agency charged with defining and implementing security and privacy policies for the department of corrections. That position paid $120,000 per year and involved security and audit compliance, business continuity and disaster recovery planning, and risk and incident management. By itself, the requested CIPM would not be enough to qualify for that job.
The next position was for a healthcare services director position in Albuquerque, New Mexico, which involved auditing, risk management, and contract and vendor negotiation. Its pay range was $140,000 to $190,000 per year, and it required serious management chops, along with IT governance and risk and compliance experience, with calls for knowledge of tools like Archer and Clearwell. The third position was for a senior data privacy associate at a Washington law firm, which sought a person with a CIPP/E, CIPP/US and CIPT, with pay in the $120K-$150K range.
Thus, it appears there are plenty of opportunities – some with high rates of pay – for those willing to climb the IAPP certification ladder. Both the job boards and the individual postings speak directly to strong and urgent need in the field for qualified privacy professionals at all levels.
IAPP courses are available through many channels, including classroom training through the IAPP and its partner network. Online training classes are also available at lower cost. The IAPP provides ample references and resources, with authoritative and supplemental texts, websites, legal references and statutes, and more for each of its credentials. There’s also plenty of self-study material for those who prefer that route.
The IAPP also offers practice exams (which it calls trial questions) to help candidates prepare for exams. Surprisingly, there is even something of an aftermarket for IAPP books and materials, as a quick trip to Amazon will attest.
In recent months, rumors about Apple working on a top-secret headset project have reached a fever pitch. But at the same time, the chatter has become increasingly convoluted: Apple is reportedly planning to use mixed reality (MR) rather than solely augmented reality (AR) or virtual reality (VR), but exactly how that will work is unknown. What will the device look like? And what features will it have?
That is where this roundup comes in. We combed through the rumors and reports to find all the latest key information, then combined it in one convenient location. Here is everything we know about Apple’s upcoming mixed-reality headset, including price, features, and more.
Apple analyst Ming-Chi Kuo originally suggested a January 2023 announcement date, with a shipping date coming in the second quarter of 2023. However, that January date has come and gone with no news. Elsewhere, Kuo has said that a development toolkit could make its way to interested parties two to four weeks after the reveal event.
In late 2022 and early 2023, a consensus began to form around the idea that Apple’s headset would launch at a special event in spring 2023. Only a handful of devices were expected at the show, but the mixed-reality headset was strongly rumored to be the headline attraction.
However, the idea was thrown into disarray when Bloomberg reporter Mark Gurman claimed his sources had told him that Apple had pushed back the release date from April to June due to various hardware and software issues. If true, it’s just the latest in a long line of delays to have apparently beset the device.
The June date makes sense, though, because it would allow Apple to launch the device at its Worldwide Developers Conference (WWDC). That would give the company the perfect stage to get developers up to speed on the new device, which is allegedly going to run on a new operating system that developers will need to get used to.
Whatever the real launch date, we’re talking about the first headset launched by Apple. The company is also rumored to be working on multiple devices, including a more affordable version that will follow the initial product. After that, Apple could launch a set of AR glasses that look more innocuous than a full-on headset. There’s no release date for those yet, though, and they could be a long way from launch.
As for the price, Kuo initially suggested a relatively modest $1,000. This puts the headset in consumer territory and in line with what we would expect from Apple: expensive, but still considered mainstream and consumer-focused. However, in August 2022, Kuo upped that prediction to $2,000, and Mark Gurman cited a similar figure.
That might have been an underestimate, though. The Information has offered details claiming the headset will cost $3,000. That would put it in the company of Microsoft’s $3,500 HoloLens 2, but with a price that high, it would likely be restricted to use by developers and professionals. That seems a little out of character for Apple. However, a January 2022 report from Display Supply Chain Consultants (DSCC) also claimed the price tag would be “several thousand dollars,” and later reporting from Gurman has backed up that $3,000 price. Leakers and reporters seem to have settled on that amount, making it an expensive first effort from Apple.
Interestingly, a report from Gurman in January 2023 stated that, despite the high price, Apple would actually lose money on each headset. That’s an indication of how much high-tech wizardry is packed into the device, and also suggests Apple is banking on its first-generation headset getting people in the door and paving the way for later products. With a $3,000 asking price, though, it could be a hard sell.
Ever since the first Apple headset rumors started to leak on the internet, people have been speculating about what the device might be called. Some early contenders have come and gone, but an answer appears to be getting closer.
The first widely promoted name was Apple Glass. This was mooted by leaker Jon Prosser in a YouTube video from mid-2020 after he claimed to have seen a prototype of the device. However, this name was shot down by Gurman, who expressed his doubt that Apple would name a product after the flop that was Google Glass. Fair point.
Gurman himself proposed a number of options Apple might go with, including Apple Vision, Apple Reality, Apple Sight, and Apple Lens.
In late August 2022, Apple filed trademarks for the names “Reality Pro,” “Reality One,” and “Reality Processor.” Gurman believes Reality Pro is the name Apple will use for its first headset, and that name indicates it will be a high-end device, perhaps one to rival Meta’s upcoming Quest Pro.
After that, Gurman believes a more affordable headset will launch without a few of the headline features of the Reality Pro. That pared-back device might take the name Reality One. Apple has previously used the “One” name, such as with the Apple One subscription service.
We don’t yet have a name for the AR glasses Apple is supposedly working on, but we’re sure a name will leak out soon enough. Perhaps it already has — Apple Lens seems like a strong contender to us.
What can you expect Apple’s mixed-reality headset to look like? Well, seeing as it combines AR and VR, chances are it will be a full wraparound set to keep you immersed while using its virtual reality features. Anything that lets you see your surroundings — like Microsoft’s HoloLens 2 or the Magic Leap 1 — would take you out of the virtual world you are experiencing. Rumors also suggest Apple’s device will be totally wireless to give you the freedom to move without being yanked back by cables — another immersion-breaker.
Then there is the augmented reality side. To make this happen, the headset is going to require cameras to capture the outside world and feed it back to you. According to a report from The Information in early 2021, there will be up to a dozen cameras and lidar sensors mounted on the device, the latter of which Apple has already incorporated into devices like the iPhone 12 Pro and iPad Pro to help with augmented reality processing. However, a newer report from The Information in May 2022 asserted that there would actually be 14 cameras on the device — something that the outlet claimed again in an October 2022 report, which added that the headset would have two downward-facing cameras to record a user’s legs.
The May report from The Information also contained an interesting tidbit on the headset’s body: it could use straps that look awfully like those on the Apple Watch Sport Band. It is not the first time we have seen one Apple device take design cues from another — the AirPods Max headphones borrow the HomePod Mini’s fabric mesh and the Apple Watch’s Digital Crown, for example. In October 2022, The Information claimed the headset would resemble “a pair of ski goggles” and be made primarily from “mesh fabrics, aluminum, and glass.” The report added that the headset conceals its cameras noticeably better than the Meta Quest Pro.
Kuo, however, contends there will be 15 cameras — eight for AR, one for environmental detection, and six for “innovative biometrics.” Kuo backed this up with a further report in April that reiterated the claim of 15 cameras. It is possible both versions exist as prototypes, with Apple to decide which to settle on in the future. Whichever claim ends up being correct, it is evident Apple is taking the camera situation on its headset seriously.
What about the real body of the device? This is an interesting one, as it could be a real differentiator — and advantage — for Apple. A report from Kuo in March 2021 claimed the entire headset could weigh as little as 150 grams (0.33 pounds), a fraction of the weight of many rival devices. The $1,000 Valve Index VR headset would weigh more than five times as much as Apple’s headset if Kuo is correct, and Apple’s device would also weigh significantly less than the 722-gram Meta Quest Pro. Aiding that low-bulk goal would be the use of lightweight fabric instead of heavy plastic in the frame.
The Information claimed in May 2022 that Apple’s former design guru Jony Ive has been brought in as a consultant for the headset. Given Ive’s penchant for incredibly thin and light devices, we wouldn’t be surprised if the rumors that the headset will be super-lightweight prove to be correct.
When Gurman published a detailed exposé of the Reality Pro in January 2023, he claimed the device would be made from aluminum, glass, and cushions, and will be “reminiscent of Apple’s $550 AirPods Max headphones.” Gurman also stated that the device’s battery would be separate from the headset itself, with users tethering it to the Reality Pro and stashing it in a pocket. That will apparently help bring down the weight of the headset, and could also offset any concerns of having a warm battery lodged onto your head.
It is not just the exterior of Apple’s headset that sounds promising, as the interior could come with some eye-opening features, too — quite literally in the case of the display resolution. Gurman has claimed the device’s resolution will be 4K per eye, giving a huge level of detail. For comparison, the HTC Vive Cosmos Elite comes with a 1440 x 1700 resolution per eye. What’s more, according to The Elec, Apple has increased its pixels-per-inch goal for each eyepiece, from 2,800 pixels per inch to 3,500 ppi. That could bring unrivaled clarity to the headset’s creations.
A January 2022 report from industry analysts at DSCC backs up that idea, claiming that the front-facing lenses could have a 4000 x 4000 resolution. The report adds other enticing details, including that the front panels will be micro-LED displays, while Apple will add a third panel for peripheral vision. This will be an AMOLED display and run at a lower resolution than the micro-LED screens, which could help create an immersive, all-encompassing experience that keeps your peripheral vision slightly blurred to help you focus on what’s ahead of you.
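To put those panel claims in perspective, here is a quick back-of-the-envelope comparison of per-eye pixel counts using the figures reported above. The arithmetic is ours, not drawn from any of the reports.

```swift
// Back-of-the-envelope per-eye pixel comparison using the figures above.
let rumoredApplePanel = (width: 4_000, height: 4_000)  // DSCC's claimed 4000 x 4000 micro-LED panel
let viveCosmosElite   = (width: 1_440, height: 1_700)  // HTC Vive Cosmos Elite, per eye

let applePixels = rumoredApplePanel.width * rumoredApplePanel.height  // 16,000,000
let vivePixels  = viveCosmosElite.width * viveCosmosElite.height      // 2,448,000

print(Double(applePixels) / Double(vivePixels))  // ≈ 6.5 times the pixels per eye
```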
Apple is said to be gunning for high-quality visuals in other ways, with Kuo alleging that the headset might come with iris recognition based on the tech his sources tell him is in the device (such as the cameras used for “innovative biometrics” mentioned earlier). Iris recognition could be used to authenticate you for Apple Pay, says Kuo, or to unlock your accounts, enabling you to perform these tasks without having to take off the headset to enter a password on your iPhone.
Kuo has also suggested in a tweet that the headset might smoothly switch between AR and VR modes, creating an “innovative experience” that could become “one of [the] key selling points of Apple’s headset.” That’s something Gurman agrees with, as he claims the Reality Pro will use a toggle like the Apple Watch’s Digital Crown to switch between AR and VR. That could give it a distinct advantage over rival headsets that are limited to either AR or VR or cannot switch as seamlessly between them.
We mentioned earlier how Apple has been trademarking various names relating to its headset. One more name claimed by the company is “Optica,” which Gurman speculates could refer to “some feature surrounding the device’s interchangeable prescription optics system.” A more recent report from Gurman claims Apple will offer custom prescription lenses that will fit into the device’s enclosure. Since we doubt a pair of glasses could fit under the headset, that’s great news for anyone who uses eyewear daily.
One thing we have not seen much news on is the refresh rate and field of view that will be used in the headset’s displays. The refresh rate will need to be high enough that lag and motion sickness are kept to an absolute minimum, and rival headsets typically aim for 90Hz or higher. We will have to wait and see what Apple opts for here.
While the visuals could be stunning, Gurman believes you’ll need to wear a pair of AirPods if you want to enjoy spatial audio. Reality Pro will have its own speakers, he says, but they will apparently be limited in their abilities.
Powering all this tech would be a pair of custom-designed Apple Silicon chips. The main CPU will be one of Apple’s “most advanced and powerful” processors, according to Gurman, who believes this could be an M2 chip with 16GB of RAM. Apple’s ARM-based chip architecture is incredibly efficient — so much so that the M1 MacBook Air does not even need a fan — which makes it ideal for a compact device like a mixed-reality headset, where keeping things cool is essential (for both you and the chip). However, while Gurman previously claimed the headset itself might not need a fan, he changed tack in January 2023, saying it would likely come equipped with one.
According to Kuo, the dual-chip setup would consist of one made with a 4-nanometer process and another on a 5-nanometer process. Kuo says the former would offer the main computing power, while the latter would manage the device’s sensors. The combined power output would require one of Apple’s 96-watt adapters, Kuo believes — the same adapter as the one that juices up the MacBook Pro, which could be a potential indicator of the headset’s power.
The idea of two chips powering the headset has been backed up by reporter Gurman on several occasions. Gurman has claimed one of the two chips would be “on par with the M1 Pro in the MacBook Pro.” The secondary chip — apparently called the Reality Processor — will handle the headset’s graphics. Gurman says the chip combination could make the headset rather warm, so the battery has been shifted to an external unit that fits in your pocket to help offload some of the heat.
Kuo has also claimed that the headset will boast Wi-Fi 6E rather than the Wi-Fi 6 found in the current iPhone 14 lineup. This opens up a new 6GHz band, granting you lower latency and faster data rates. Considering the demanding nature of mixed-reality content, we think this claim makes a lot of sense.
However, despite all the power the headset might be loaded up with, sources The Information has spoken to claim the device will not focus on gaming (something the sources have criticized). According to The Information’s report, gaming is “a category of software that appeals to early adopters, which was important to the success of the iPhone and has been a big priority for Meta’s VR group,” so making a headset that does not put much of a focus on gaming might seem odd. Then again, given how Apple has never really fully embraced gaming with its other devices, it’s perhaps not entirely surprising.
The headset’s vast camera array could allow for eye- and hand-tracking features. Apple has already patented ideas for these control methods in the past, both for the Mac and for a mixed-reality headset. Adding to that, The Information states Apple is aiming to use hand-tracking and a “clothespin-like finger clip” as input devices and is not looking to include gaming controllers with the headset. That idea has been backed up by a recent Apple patent that describes finger-mounted devices that could detect movement and provide haptic feedback. With that in mind, do not be surprised if Apple promotes the hand-tracking capabilities of its MR headset.
There’s another possibility. A recently uncovered patent filed by the Cupertino giant outlines how a pair of Apple Watches — one worn on each wrist — could be used to enable gesture controls on the headset. With two Apple Watches, a user could potentially use the palm of one hand as a trackpad of sorts and a finger on the other hand as a mouse, letting them interact with their virtual world. This being just a patent, though, there’s no guarantee Apple is doing anything other than exploring ideas here — but it’s still intriguing.
Ultimately, though, we think this system is unlikely to be the one Apple goes for, at least at first. A single Apple Watch is not a cheap purchase, and buying two could add several hundred dollars to an already-expensive headset. There’s been very little to indicate that Apple is working on controllers for its headset, so we’re expecting the first-generation model will use its onboard cameras for gesture controls with your hands.
A report from Gurman seems to contradict both ideas (the clothespin-like clip and the Apple Watches). Gurman believes the headset will not require any kind of physical device to be worn on a user’s hands. Instead, eye-tracking sensors and hand-tracking cameras will detect what you are looking at on-screen, then allow you to open an app by simply pinching your forefinger and thumb together.
What seems more certain is the presence of eye-tracking capabilities, and a report from The Information in October 2022 added further details on this. Citing two anonymous individuals who apparently used to work on the device, the report states that eye-tracking would allow users to log in to accounts and make payments, with the eye verification allowing multiple people to use the same headset. Tracking users’ vision would also let the headset reduce graphical fidelity in peripheral vision, thus saving battery life.
An outward-facing display would show the user’s facial expressions to other people, but this would also save battery life by having a low refresh rate, according to The Information.
With all these advanced features reportedly in the works, Apple’s headset is going to need a powerful operating system to bring everything together. Gurman has commented it could be called “rOS,” with the “r” being short for “reality.” Since then, a number of tweets have surfaced apparently revealing the name “realityOS” in Apple’s code. Given Apple’s propensity to use full words in its OS naming schemes — think watchOS, iPadOS, and macOS — realityOS could be a good bet for the final name of the system.
In a tweet from early February, iOS developer Matthew Davis revealed a seemingly official Apple GitHub page that apparently accidentally revealed the name realityOS. Some of the comments in the code seem to make reference to iOS executables using realityOS libraries, which could hint at some form of interactivity between the two operating systems.
However, it seems Apple might have changed its mind about that realityOS name. According to Gurman, Apple is now calling the system “xrOS” internally. Gurman also claims Apple has trademarked the name using a shell corporation, which might suggest that it will use that operating system name publicly as well as internally — and that it could be gearing up for an imminent product launch.
Details on how the operating system will actually work had been scarce, until Gurman’s report in January 2023. There, the reporter outlined that xrOS would be an “ambitious attempt to create a 3D version of the iPhone’s operating system.” That means apps like Safari and Mail will exist in the Reality Pro’s three-dimensional environment. There will also be an app store and compatibility with Apple’s services, such as Apple Music and Apple TV+.
Those apps will be contained on a home screen much like that seen in iOS and iPadOS, with a grid of apps and widgets that can be reordered by the user. To input text, you’ll be able to use Siri voice commands or type using an Apple device (like an iPhone) or an external keyboard. Gurman said Apple is working on allowing you to type in midair using a virtual keyboard, but that this probably won’t be ready by the time the Reality Pro launches.
In one-on-one FaceTime calls, each caller’s face and body will be realistically rendered, according to Gurman, providing they’re both using a Reality Pro headset. In calls attended by more people, though, Memoji will be used, as the processing power required to portray everyone realistically will be too great.
Finally, Gurman’s report claims you might even be able to use the headset as an external monitor for a Mac. That will let you sit at a desk and use your Mac’s keyboard and mouse, all while viewing content through the headset.
It already looks like Apple is outfitting its headset with a ton of great features, but there are still a few extras we would love to see. At the top of the list is great battery life — after all, what is the point of having an excellent device to play with if it dies after a few minutes? Fortunately, the processor choice spells good news in this department, as Apple’s custom chip has led to incredible battery life in its MacBooks. That might be countered by the super-high resolution the headset is apparently going to use, but we have our fingers crossed.
The word is that Apple is developing a special operating system dubbed xrOS that will drive the headset. Apps and games will need to run on this system, but we are hoping that, due to the common Apple Silicon architecture in both the headset and Apple’s other devices, some degree of cross-compatibility will be available.
For instance, it would be great if the headset could recognize that you are playing a game on your Apple TV or your Mac and then mirror the content onto the headset with added mixed-reality goodness (provided the game is VR-compatible, of course). It would be a shame if Apple limited the headset to working only with xrOS-compatible games and apps, as developers might be put off if they must build apps from scratch for the new operating system.
One final request concerns the headset’s control method. We do not know whether the device will come with handheld controllers or will rely entirely on gestures. If it is the former, one thing Apple really needs to incorporate is haptic feedback. This is already included to great effect in every MacBook and the Apple Watch, so Apple knows how to make the tech work. Gentle taps that are built into apps and games would be a great addition that does not break immersion.
We might get to see some of these features in action, if not in the first Apple VR headset, then in the second-generation model. Kuo claimed in late 2021 that Apple was working on a second-generation VR headset that would be lighter and faster than the original model, with better battery life too. The Elec followed that up in June 2022 by saying that LG Display was hoping to supply the main OLED panels for the second-generation device, adding further evidence that Apple is already thinking ahead to its follow-up headset.
New technologies are coming in virtual reality headsets, and terms are getting bandied about as if everybody already knows what they mean. Here are most of the terms you're hearing, and what they mean.
Rumors suggest that Apple VR will have a 4K display for each eye, a powerful M-series processor, and extensive user tracking features. It could cost up to $2,000 in its first iteration.
The following terms apply to the headset market in general and have been used, at least in part, to describe upcoming features for Apple's headset. These terms are lightly technical, but aren't self-explanatory at first glance.
All the terms are listed alphabetically. We'll be adding new terms as they are introduced.
Augmented reality, or AR, refers to a software overlay placed on top of the real world, largely independent of what's actually in it. Consumers got their first taste of AR products through things like the Nintendo 3DS, PS Vita, and Google Glass.
Augmented Reality places digital objects as an overlay of the real world
Apple has pushed hard for augmented reality to become an integral part of its operating systems. However, it hasn't moved beyond being an interesting party trick, for the most part.
Users can experience augmented reality on their iPhone or iPad today through various gaming experiences or art installations. Apple has even added an AR directions mode to Apple Maps, though it is limited and requires the user to hold their iPhone up in the air to view the information.
Augmented reality differs from mixed reality, which uses information from the real world to produce 3D overlays. AR is more passive, meaning it will look more like a heads-up display, not a fully-rendered video game.
For example, "Pokemon Go" will use a device's LiDAR to find a flat surface and place a Pokemon there to interact with. It doesn't change what kind of environment is viewed through the display — the software just adds the creature irrespective of what is surrounding it.
It is expected that Apple will use augmented reality in a limited fashion for its first virtual reality headset. It may allow for some real-world image passthrough to help users set up rather than offer full AR experiences.
Eye tracking uses the exact movement of the user's eyes to animate an avatar or control movements. This is different from foveated rendering or field of view.
For example, a VR headset with eye tracking can be used to animate an avatar's eyes within a virtual meeting room. It enables a more realistic depiction of the user's emotion or viewing direction.
This is also useful for judging when a user is trying to see outside their field of view without turning their head. The VR experience can adjust to show that region or navigate a menu with minimal user interaction.
Extended reality is a blanket term that refers to multiple technologies that involve augmented reality, virtual reality, or mixed reality. Companies generally refer to XR when discussing wearable displays, but it can also be used to discuss on-device systems like iPhone's AR functionality.
Generally, all of these terms are used interchangeably, though sometimes incorrectly. Extended reality is a technology set, not a technology itself. Someone wouldn't typically be "in XR" the way they would be "in AR" or "in VR," though it technically could be used that way.
Field of view is the area visible in front of the user's eyes due to the available screen real estate. VR headsets generally have a set field of view that is very wide, so users feel like they are "inside" an environment rather than viewing a screen.
Since VR screens are so close to a user's eye, they are able to take up the entire range of vision. However, they are still physical objects that don't move, so a user can still perceive the edge of a VR screen if they look for it.
Generally, users will only use a small area of the screen directly in front of their eyes, with less detailed rendering occurring at the edges.
Foveated rendering enables the VR headset to control what is being rendered based on where the user is looking. This is different from eye tracking because the information is being used to control environment renders, not avatar renders.
Foveated rendering makes objects more detailed based on where the user is looking
Rather than render a scene with exact pixel-perfect precision across the entire VR display, foveated rendering ensures only the region the user is looking at directly is fully rendered, to save computing power.
Advanced algorithms determine where a user might look next to prepare other regions for more advanced rendering. This should lead to a seamless experience, though it is highly reliant on the processing power of the headset.
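In code, the core idea reduces to mapping the angular distance between a screen region and the user's gaze point to a rendering budget. Here is a toy Swift sketch; the thresholds and scale factors are invented for illustration, not values from any shipping headset.

```swift
// A conceptual sketch of foveated rendering: spend full rendering effort
// where the eye is pointed and progressively less toward the periphery.
// Thresholds and scale factors are arbitrary examples.
func renderScale(forAngularDistance degrees: Double) -> Double {
    if degrees < 10 { return 1.0 }   // foveal region: full resolution
    if degrees < 30 { return 0.5 }   // mid-periphery: half resolution
    return 0.25                      // far periphery: quarter resolution
}

// A screen tile 25 degrees away from the gaze point gets half-resolution shading.
print(renderScale(forAngularDistance: 25))  // 0.5
```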
Hand and body tracking are self-explanatory, though how they are accomplished differs from headset to headset. Early VR headsets relied upon colorful LEDs or multiple cameras in a room to track a user's movement. However, this has advanced to placing sensors directly into the headset.
Hand tracking with HoloLens
Apple's VR headset is expected to operate independently from other products, so it will likely track the user's movements without an advanced external camera rig. Instead, the headset will be able to see the user via cameras or other sensors, like LiDAR.
Gyroscopes can also be used to track the user's head position, and outward-facing LiDAR can map the immediate area in a room to help avoid object collision.
Controllers are also useful for tracking a user's hands. Rumors aren't clear on whether Apple will develop a VR controller or not. The company may be confident in its headset's ability to track hands without them.
Haptic feedback refers to a device's ability to react to software-based interactions with a physical response. Basically, think of the vibrating motors in a controller that tell you when a character has been damaged in a game.
Apple uses haptic feedback in the iPhone to simulate key presses or other functions. When implemented correctly, the user will notice haptic feedback as part of the interaction but disassociate it from a vibrating motor.
A VR headset could take advantage of haptic feedback to alert the user of an impending object collision, or software could use it to simulate parts of a game. For example, haptic feedback might tell a user something is behind them.
Haptics are also useful in VR controllers. A controller might vibrate to indicate that the user's in-game sword has collided with an object, or that they've highlighted a menu item with a virtual pointer.
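On Apple's existing devices, haptics are exposed through UIKit's feedback generator classes. The following is a minimal sketch of that pattern; how a headset or its controllers would surface haptics is, of course, speculation.

```swift
import UIKit

// A minimal sketch of triggering haptics on iOS with UIKit's feedback generators.
let impact = UIImpactFeedbackGenerator(style: .medium)
impact.prepare()           // warms up the Taptic Engine to cut latency
impact.impactOccurred()    // e.g., fire when a virtual sword strikes an object

let selection = UISelectionFeedbackGenerator()
selection.selectionChanged()  // e.g., fire when a menu item is highlighted
```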
LiDAR, or light detection and ranging, is a sensor used to create 3D representations of a physical environment for software. On the iPhone or iPad Pro, LiDAR sensors have generally been used to quickly find flat surfaces for augmented reality experiences.
LiDAR can determine what's in a 3D environment and rebuild it in software
In a VR headset, LiDAR can be used to map a room, so the software knows where objects are. This information can be used to help the user avoid collisions while using the headset.
More advanced applications of LiDAR could provide real-time information so a mixed reality experience can be rendered. Or, the sensors could be used to track a user's hands or body as they move.
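On a LiDAR-equipped iPhone or iPad Pro, this kind of room mapping is already available through ARKit's scene reconstruction. Here is a short sketch of that existing pattern; whether Apple's headset exposes the same API is unknown.

```swift
import ARKit

// A sketch of how an ARKit app requests a 3D mesh of the room on LiDAR hardware.
func startRoomMapping(with session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh  // build a LiDAR-backed mesh of nearby surfaces
    }
    configuration.planeDetection = [.horizontal, .vertical]
    session.run(configuration)
    // ARKit then delivers ARMeshAnchor objects describing the reconstructed geometry,
    // which an app can use for collision warnings or mixed-reality effects.
}
```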
Mixed reality is a more advanced technology that combines the real world with software within the VR headset. Users would still be within a completely enclosed virtual reality experience, but real-world objects would be represented by virtual ones in real-time.
HoloLens showing torque values of screws on a motorcycle
If augmented reality is just software overlaid on the real world, mixed reality is software that is aware of the real world. For example, mixed reality would see a motorcycle you're working on and point out exactly what parts you're adjusting in real-time, versus augmented reality just overlaying a list of steps.
It is different from virtual reality, too, since VR experiences are completely unaware of the outside world. You're placed inside a pre-rendered world for VR and don't see the world around you within the experience.
Mixed reality requires a lot more computational power than augmented reality or virtual reality because it uses both technologies in real time. Imagine converting your home into a jungle in virtual reality, but with every piece of furniture represented by a tree or bush in the simulation, a combination of AR and VR tech.
Spatial Audio is Apple's branded implementation of directional audio that accounts for sound origin, distance, and direction within a 3D space. So, music or media that takes advantage of Spatial Audio will sound like it is coming from all around you, not just in front of you.
Spatial Audio with head tracking lets users move through 3D audio spaces
Apple uses Dolby's existing audio formats to recreate audio in a 3D space. Spatial Audio is different from standard Dolby Atmos tracks, for example, due to how Apple processes the files. It can use device gyroscopes to let the user "move" through the 3D audio space with head tracking.
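The head-tracking half of that equation is already available to developers through CoreMotion, which can stream orientation updates from supported AirPods models. Below is a minimal sketch of that existing pattern; feeding the attitude into an audio renderer is left out, and the print is just a stand-in.

```swift
import CoreMotion

// A sketch of reading head orientation from supported AirPods via CoreMotion.
let headTracker = CMHeadphoneMotionManager()

func startHeadTracking() {
    guard headTracker.isDeviceMotionAvailable else { return }
    headTracker.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Yaw, pitch and roll describe where the listener's head is pointing.
        print("yaw: \(attitude.yaw), pitch: \(attitude.pitch), roll: \(attitude.roll)")
    }
}
```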
Spatial Audio will likely play a pivotal role in virtual reality experiences. Modern Apple headphones like AirPods Pro 2 and AirPods Max can take advantage of the format, though it isn't clear how Apple will address audio in VR if a user doesn't have AirPods.
Virtual reality, or VR, is a completely encapsulating software experience that doesn't account for the real world, nor is the real world visible when in use. A headset obscures the user's view entirely as software is shown on displays that are inches away from the user's eyes.
The software is shown in a pre-rendered state that is only vaguely aware of the user's location. Sensors on the headset can help alert the user to a potential collision with real-world objects, but the VR experience is not affected by this.
Virtual reality differs from augmented reality because it completely takes over the user's vision rather than overlaying information on the real world. Virtual reality becomes mixed reality if the headset is able to create software renders within the real world while accounting for surrounding objects.
Apple is rumored to be working on an operating system that is built for augmented reality and virtual reality. It would be used in its first VR headset and has been referred to as RealityOS or xrOS.
Apple's VR headset is expected to run xrOS
The operating system will likely take on aspects of Apple's other software, so users are instantly familiar with how to interact with software and menus. Apple has been pushing developers to build AR experiences already, so the step up to VR will likely be a small one.
While there are still hints of RealityOS in Apple's iOS, the final name is expected to be xrOS, which stands for "extended reality operating system." A single OS for AR and VR could bridge the gap from the Apple VR headset to a future set of AR glasses that have been dubbed "Apple Glass."
Your mind is very powerful. Yet, if you're like most people, you probably spend very little time reflecting on the way you think. After all, who thinks about thinking?
But, the way you think about yourself turns into your reality.
The Link Between Thoughts, Feelings And Behavior
Your thoughts are a catalyst for self-perpetuating cycles. What you think directly influences how you feel and how you behave. So if you think you’re a failure, you’ll feel like a failure. Then, you’ll act like a failure, which reinforces your belief that you must be a failure.
I see this happen all the time in my therapy office. Someone will come in saying, “I’m just not good enough to advance in my career.” That assumption leads her to feel discouraged and causes her to put in less effort. That lack of effort prevents her from getting a promotion.
Or, someone will say, “I’m really socially awkward.” So when that individual goes to a social gathering, he stays in the corner by himself. When no one speaks to him, it reinforces his belief that he must be socially awkward.
Your Beliefs Get Reinforced
Once you draw a conclusion about yourself, you’re likely to do two things: look for evidence that reinforces your belief and discount anything that runs contrary to it.
Someone who develops the belief that he’s a failure, for example, will view each mistake as proof that he’s not good enough. When he does succeed at something, he’ll chalk it up to luck.
Consider for a minute that it might not be a lack of talent or skills that’s holding you back. Instead, it might be your beliefs that keep you from performing at your peak.
Creating a more positive outlook can lead to better outcomes. That’s not to say positive thoughts have magical powers. But optimistic thoughts lead to productive behavior, which increases your chances of a successful outcome.
Challenge Your Conclusions
Take a look at the labels you’ve placed on yourself. Maybe you’ve declared yourself incompetent. Or perhaps you’ve decided you’re a bad leader.
Remind yourself that you don’t have to allow those beliefs to restrict your potential. Just because you think something doesn’t make it true.
The good news is, you can change how you think. You can alter your perception and change your life. Here are two ways to challenge your beliefs:
• Look for evidence to the contrary. Take note of any times when your beliefs weren’t reinforced. Acknowledging exceptions to the rule will remind you that your belief isn’t always true.
• Challenge your beliefs. Perform behavioral experiments that test how true your beliefs really are. If you think you’re not good enough, do something that helps you to feel worthy. If you’ve labeled yourself too wimpy to step outside of your comfort zone, force yourself to do something that feels a little uncomfortable.
With practice, you can train your brain to think differently. When you give up those self-limiting beliefs, you’ll be better equipped to reach your greatest potential.
Amy Morin is a psychotherapist and the author of the bestselling book 13 Things Mentally Strong People Don't Do.
(CNN) -- Blair MacIntyre imagines a world where tiny clouds of information -- Facebook statuses, business cards, Twitter posts -- float above all of our heads.
In some ways, it's not that far from reality.
Advancements in mobile phone technology have cleared the way for a coming wave of "augmented reality" applications that merge the physical world with information compiled about people and places on the Internet.
"When the technology gets there, this stuff could be amazingly useful and mildly terrifying in some ways," said MacIntyre, an associate professor at the Georgia Institute of Technology who has taught classes in augmented reality for a decade.
The idea of pairing digital information with our real, 3-D environments is not especially new -- think robot-human vision in the "Terminator" movies. MacIntyre even plodded about college campuses in the 1990s wearing a 40-pound backpack and nerdy goggles, trying to make something similar happen.
But as mobile phones become better equipped with GPS systems, which use satellites to locate the phones; compasses, which tell the direction the phone faces; and accelerometers, which relay the device's tilt, the once-lofty idea of augmented reality is being put into the hands of consumers.
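For a sense of how an app reads those three sensor streams, here is a minimal sketch using today's iOS frameworks; the early AR browsers described in this story predate these exact APIs, so treat it purely as an illustration.

```swift
import CoreLocation
import CoreMotion

// A sketch of reading location (GPS), heading (compass) and tilt (accelerometer).
let locationManager = CLLocationManager()
locationManager.requestWhenInUseAuthorization()
locationManager.startUpdatingLocation()  // GPS: where the phone is
locationManager.startUpdatingHeading()   // compass: which way it is facing

let motionManager = CMMotionManager()
motionManager.startAccelerometerUpdates(to: .main) { data, _ in
    guard let acceleration = data?.acceleration else { return }
    // Accelerometer: how the device is tilted relative to gravity.
    print("tilt x: \(acceleration.x), y: \(acceleration.y), z: \(acceleration.z)")
}
```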
Last July in the Netherlands, a company called SPRXmobile released a mobile browser, Layar, that lets people see pieces of this new info-reality through their phone screens.
A Layar user sets his or her phone to video mode, aims it around and sees all kinds of information pop up on the screen: blinking dots on apartments that are for sale, the values of those units, pull-down reviews of the bar up on the corner or details about sales at a nearby retail store.
This makes information easier to find and helps people make better sense of the physical world around them, said Maarten Lens-FitzGerald, co-founder of Layar.
"I think it will actually get you out more than you would stay at home," he said. "You're not at your couch anymore, you're not at your desk" when you need to find information.
Layar, which bills itself as the first mobile browser that features augmented reality, is only available in the Netherlands and only on certain phones running Google's Android operating system, including T-Mobile's G1 and the HTC Magic. But Lens-FitzGerald said the company plans to announce a global expansion plan on August 17 and will develop an app for the iPhone if Apple changes policies that obstruct developers from creating such applications on that device.
A range of other "AR" apps are in development or are on the market. One, called Nearest Tube, highlights subway routes in New York and London. Wikitude is an app that aims to show people encyclopedic information about nearby landmarks. Like Wikipedia, users can add information to the service. The idea could usher in an era of cell-phone tour guides.
Total Immersion, a French company, developed an app that makes 3-D baseball players spring to life from baseball cards. Users can turn the card to see their favorite players, through a phone screen, from all angles. And at Georgia Tech, researchers are working on video games that may one day make it look like virtual zombies are chasing players down real-world streets.
Alex Michaelis, CEO of Tweetmondo, a site that pairs Twitter posts with geographical information, said he has developed an app that will let mobile phone users see their friends' tweets through the video camera on their phones. He expects it to be available within the month.
"It adds information to your world, and this is what it's all about," he said.
To picture how that service would work, think about walking into your living room in the evening. If a roommate had posted to Twitter from the couch, his or her Tweet would hover in that space when viewed through a mobile phone's video camera.
Michaelis admits the model is a bit clunky for now. But he sees a future when the app will let people stand on a street corner, hold their phone up to their face, and see the Twitter posts of crowd members as they mill about. Phones would have to be able to communicate with satellites and computer services constantly, instead of only when someone posts a message, to make that possible, he said.
"I see this being resolved in the near future," he said, "because, really, it's just a matter of really experimenting with this technology and pushing it to the limit."
But there are doubts about augmented reality technology on phones.
Lens-FitzGerald, of Layar, is concerned that augmented reality is being over-hyped and may create unrealistic expectations from consumers.
"It's a cool technology, but yeah, we need to see how much [funding and visibility] our companies will get," he said. "It's getting a lot of press now without being proven, but do we make money, are we going to make people happy with it? We don't know. We're just starting."
He added: "It's like the first TV. We need to build an audience."
MacIntyre, of Georgia Tech, said the technology behind today's augmented reality apps is crude. Mobile phone GPS isn't nearly accurate enough to make sure a Twitter post is tagged to a person, for instance, rather than the lamp post that's 50 feet away.
Furthermore, the idea behind the information-reality mesh on mobile phones is off-base, he said.
"I don't see them answering a problem that needs to be solved," added MacIntyre, who believes two-dimensional maps can be used to display information much more easily with current technology.
More functional problems exist as well. People don't necessarily want to walk around the world holding cell-phone screens in front of their faces. And the world's information has to be tagged geographically to make sense in an augmented-reality setting.
But MacIntyre does see a bright future for augmented reality.
Within a year, mobile phone applications will become much more functional, he said, and in the foreseeable future, augmented reality will move off of phone screens and onto futuristic sunglasses, whose wearers will see blips of information about everything around them, he said.
If that happens, the "Terminator" vision will have truly arrived.
The Apple VR/AR mixed reality headset is one of those products that's perpetually rumored but never seems to materialize — though that could soon change. Apple hasn't officially announced the device, but we know the company has big plans for augmented reality. The VR/AR headset is the next step on that journey.
It's worth noting that the VR/AR headset is totally different from the rumored Apple Glasses. Those are said to be purely AR-focused and aren't likely to arrive anytime soon. Meanwhile, the VR/AR headset could be here as soon as 2023, and it's likely to compete with the Meta Quest 2, PSVR 2 and all the other best VR headsets.
We've seen several reports coming out regarding Apple VR/AR, including next-generation display technology and its potential price and release window. We've even heard rumors about the display for the Apple VR/AR Headset 2, the theoretical successor to Apple's still unannounced mixed reality headset. Now that Samsung, Google and Qualcomm have teamed up to work on XR projects, it's clear the mixed reality headset business is going to heat up in 2023.
Here's everything you need to know about the Apple VR/AR mixed reality headset that's expected to arrive at some point this year.
Rumors around the Apple VR/AR headset had suggested a March 2023 reveal with orders not shipping until fall 2023. That comes from Bloomberg's Mark Gurman, who now thinks Apple might push back its headset preview to June, at the Worldwide Developers Conference, as the company races to address hardware and software issues.
That's consistent with what analyst Ming-Chi Kuo has claimed regarding a spring 2023 or summer 2023 reveal, with a fall launch after that. He suggests the continued delays appear to be due to production problems affecting both the announcement and the retail launch. But it now seems those issues have been resolved, as rumors coalesce around a springtime reveal.
In a Medium post exploring several aspects of the VR headset market, Kuo describes Apple as "a game-changer for the headset industry," and predicts that Cupertino's first headset will lead a new wave of products from others trying to emulate its ideas, along with a boost in demand for associated AR games and apps. He has also claimed a second generation (plus a cheaper edition) is supposedly coming in 2025.
A Bloomberg report unearthed possible trademarks linked to Apple's VR/AR headset, including Reality One, Reality Pro and Reality Processor. More recently, a report from Bloomberg suggested "Reality Pro" would be the name Apple opts for.
According to reports, the Apple VR/AR mixed reality headset is designed to be a precursor to Apple Glass. The AR lenses are supposed to offer an "optical see-through AR experience," according to Ming-Chi Kuo. However Apple Glasses may not arrive for a long time, with Apple reportedly delaying the project due to technical challenges.
In other words, based on everything we've heard, Apple Glass is designed to look and act like an ordinary lightweight pair of glasses: glasses that are able to project information, and presumably imagery, onto their lenses.
The Apple VR/AR mixed reality headset is expected to be like a typical VR headset, but one with a number of exterior cameras and sensors that unlock bonus functionality.
That way Apple's VR and mixed reality headset can offer body tracking, and incorporate real-world environments in a virtual space. Plus, the Apple VR headset could incorporate a see-through experience that can deliver a form of augmented reality. So, it's not quite like the Oculus Quest 2, which is VR-only.
However, Mark Gurman has claimed that the Apple headset will be designed for short trips into VR, rather than jumping on the 'metaverse' bandwagon like so many others. In fact, Apple is said to have declared the metaverse "off limits." Users will be able to use the mixed reality headset for communication, content viewing and gaming, but it won't be a device you wear all day, or a replacement for real life.
Still not sure what the difference between mixed reality, augmented reality and virtual reality actually is? We have an explainer that tells you exactly what mixed reality is and what Microsoft, Meta and Apple have planned for it.
Reports on the Apple VR/AR mixed reality headset price have been mixed. But rumors suggest a developer focus, so pricing may center around attracting programmers.
Tim Cook has spoken at length about how AR is Apple's end goal, and the headset is reported to be the first stage in the company's wearable AR ambitions. Its main purpose is reportedly to prepare developers for the launch of Apple Glass and ensure the specs have app support at launch. Reports claim Apple's main incentive here isn't to make money, and that the headset's price will reflect that.
That being said, Mark Gurman has claimed the headset will be heavy on gaming, media consumption and communication, suggesting Apple is designing something with consumers in mind. Maybe that could mean a cheaper second-gen headset later. However, it doesn't necessarily mean the first-generation headset won't be expensive, or primarily designed for developer use.
While Apple's VR and mixed reality headset is supposed to be expensive, reports are divided on how expensive it's set to be.
A report from The Information claims that Apple's VR headset could be priced as high as $3,000. Gaming VR headsets rarely cost more than $1,000, though the Microsoft HoloLens 2 does cost a whopping $3,500.
Mark Gurman claims that the headset could cost upwards of $2,000. That price tag is supposed to account for the headset's hardware, which could include the Apple M1 Pro chip, an extended development time and the usual increased markup applied to other Apple products.
However, Ming-Chi Kuo has now reported that the final price should be between $2,000 and $2,500. That has since been contradicted by a Bloomberg report claiming the headset would indeed be priced at $3,000.
In any case, the cost of entry is going to be high and certainly a lot higher than other stand-alone VR headsets. For example, the Oculus Quest 2 costs $300 by comparison.
A report from The Information claims that the VR/AR headset will feature 12 tracking cameras feeding information to two 8K displays in front of the user's eyes, plus LiDAR sensors. However, this report was contradicted by Display Supply Chain Consultants (DSCC), which claims that Sony is making 4K (4000 x 4000) displays for Apple's headset with a 1.4-inch diagonal. Notably, DSCC did say that LiDAR is still a possibility.
For those who don't know, LiDAR uses lasers to measure distance, letting a device map a space quickly and accurately. That information helps place objects more convincingly in AR, and LiDAR has already been used this way on high-end iPad Pro models as well as the iPhone 12 Pro and iPhone 12 Pro Max.
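To give a sense of how apps already tap the LiDAR sensor on those devices, here's a minimal ARKit sketch (for iPhone/iPad, not the headset) that enables scene-depth data. Any headset-specific depth APIs remain unannounced, so this is only illustrative of the kind of depth feed such a device could build on.

```swift
import ARKit

/// Minimal sketch: reading LiDAR-backed scene depth with ARKit on a
/// supported iPhone or iPad. Illustrative only; the headset's own APIs
/// are unannounced and may differ entirely.
final class DepthSessionDelegate: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR scene depth not supported on this device")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of per-pixel distances in meters,
        // which AR apps use to occlude and place virtual objects.
        if let depth = frame.sceneDepth {
            let width = CVPixelBufferGetWidth(depth.depthMap)
            let height = CVPixelBufferGetHeight(depth.depthMap)
            print("Depth frame: \(width)x\(height)")
        }
    }
}
```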
DSCC also suggests that Apple might cram in three displays total inside its headset. There could be the two Sony-made 4K displays mentioned above, as well as one larger lower resolution AMOLED display on the back. This, according to the report, would allow Apple to create a foveated display.
A foveated display takes its name from the fovea, the part of the retina at the back of the eye that handles sharp central vision. A foveated VR headset could use eye tracking to render at full resolution wherever the user is looking while lowering the resolution around the periphery. This video by YouTube channel SweViver does an excellent job explaining fixed foveated rendering (FFR).
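As a rough illustration of the idea (not Apple's implementation), the sketch below maps the angular distance between a screen region and the user's gaze point to a render-resolution scale: full resolution at the fovea, progressively lower toward the periphery. The thresholds are made up for illustration.

```swift
import Foundation

/// Toy model of foveated rendering: pick a resolution scale for a screen
/// region based on how far it is (in degrees) from the tracked gaze point.
/// Thresholds are illustrative; a real headset tunes these against
/// eye-tracking latency and panel layout.
func resolutionScale(forAngleFromGaze angle: Double) -> Double {
    switch angle {
    case ..<5.0:   return 1.0    // foveal region: render at full resolution
    case ..<15.0:  return 0.5    // near periphery: half resolution
    case ..<30.0:  return 0.25   // mid periphery: quarter resolution
    default:       return 0.125  // far periphery: minimal detail
    }
}

// Example: a region 20 degrees away from where the eye is looking
// would be rendered at quarter resolution.
print(resolutionScale(forAngleFromGaze: 20))  // prints 0.25
```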
There is some credence to the rumors of Sony developing the display panels for the mixed reality Apple headset. The latest reports suggest that the Apple headset will use OLEDoS panels, which use silicon instead of glass substrates to achieve extremely high pixel densities. Sony, Samsung and LG are all working on OLEDoS panels, with Sony and LG rumored to provide displays for the first generation of the Apple VR/AR headset and Samsung and LG for the second generation.
The main feature of the Apple headset is mixed reality. According to Mark Gurman, the headset will include external cameras which are currently being used to test features like hand-tracking and gesture control. Part of this includes the possibility of being able to type in the air with a virtual keyboard.
At least one report claims that Apple isn't focusing on games for its AR/VR headset. That seems like a curious decision, given how early adopters are often attracted to features like gaming.
Rumors suggest that the Apple VR/AR headset will not rely on sensors alone. Sources have told The Information that users will be able to wear a "thimble-like" device on their finger to help with hand tracking and other controls. The latest reports also claim that there will be a Digital Crown, similar to the one on AirPods Max, to switch between AR and VR modes.
A lot of power is going to be needed to keep all this going, and Gurman's report claims that the headset will feature Apple's "most advanced and powerful chips." Apparently, the chip inside Apple's VR headset will be more powerful than the newly-launched M1 Mac chip.
According to a later report from Gurman, the headset will come with the new M2 chip and 16GB of RAM. That may not be the most powerful chip in the Apple Silicon range, but it does offer a good balance of power and energy efficiency.
Speaking of specs, a newer report from Kuo has the headset tipped to get a brace of processors, suggesting Apple won't be scrimping on power.
"The higher-end processor will have similar computing power as the M1 for Mac, whereas the lower-end processor will be in charge of sensor-related computing," Kuo predicts.
Similarly, The Information reports that there will be two processors on board the headset, with the main processor being the equivalent of the M2 chip slated to come out later this year in devices like a new MacBook Air.
According to Kuo, all that hardware will also need a significant amount of power, to the point where the headset will apparently ship with a 96W MacBook charger to keep everything running. It's also been suggested that an external battery pack may be necessary, offering around two hours of battery life.
Not so long ago, Kuo also shared that the device could get hand gesture controls as well as object detection features, which could be enabled through "highly sensitive 3D sensing modules."
"The AR/MR headset can detect not only the position change of the user or other people’s hand and object in front of the user's eyes but also the dynamic detail change of the hand," he predicts. As an example, Kuo suggested when users change their hand from a clenched fist to an opened hand, the machine could track this movement and create an image of a balloon floating away as if released.
More details have emerged in another Information report about auto-adjusting lenses and a separate battery pack being part of the headset. Not having the battery built in sounds a little inconvenient for using the headset, but being able to rely on the internals to adjust to your vision does sound welcome.
Apple's latest patent wins could also shed light on some other expected features, including finger gestures. Recent patent filings from Apple indicate that it wants to integrate wearables with its VR/AR mixed reality headset — and possibly Apple Glasses. These inventions would allow users to use wearables ranging from two Apple Watches to a VR glove to execute finger gestures. These gestures could allow users to do a range of tasks, from scrolling through pages to hanging up a phone call.
A patent discovered by Apple Insider reveals that Apple has been working on some smart rings, which can be used to track finger and hand movements. This could be employed with the VR and mixed reality headset, to boost the capabilities of the external cameras.
The patent also mentions being able to detect objects the user is holding, including an Apple Pencil. That means the headset will be able to see what you want to do and alter its functionality accordingly. So if you hold an Apple Pencil, it will know you want to handwrite something, as opposed to typing. And so on.
Analyst Ming-Chi Kuo believes that Apple will use "3P pancake lenses" with a folded design that lets light reflect back and forth between the display and lenses. This could allow for a headset design that's compact and lightweight.
Ming-Chi Kuo also claims that the headset will come with Wi-Fi 6E support, which would allow it to connect to a separate device and transfer large amounts of data with low latency. This means the headset could allow a separate device, like an iPhone or Mac, to do all the hard work and beam it to the headset without the need for a physical cable.
Not doing all the processing in the headset itself would also let Apple keep the weight down and stretch battery life much further than it would otherwise go.
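For illustration only, here's what the headset side of such a split might look like using Apple's Network framework to open a low-latency UDP link to a host machine. The host name, port and framing are hypothetical; nothing here reflects Apple's actual streaming protocol.

```swift
import Dispatch
import Network

/// Hypothetical sketch of the headset side of a wireless streaming link:
/// open a low-latency UDP connection to a host (e.g. a Mac on the same
/// Wi-Fi 6E network) and receive rendered frame data. The host name and
/// port below are placeholders.
final class FrameStreamReceiver {
    private let connection = NWConnection(
        host: "render-host.local",  // hypothetical host name
        port: 5000,                 // hypothetical port
        using: .udp                 // UDP keeps per-packet latency low
    )

    func start() {
        connection.stateUpdateHandler = { state in
            if case .ready = state {
                print("Link to render host established")
            }
        }
        connection.start(queue: .main)
        receiveNextDatagram()
    }

    private func receiveNextDatagram() {
        // Each datagram would carry a slice of a compressed rendered frame.
        connection.receiveMessage { [weak self] data, _, _, error in
            if let data = data {
                print("Received \(data.count) bytes of frame data")
            }
            if error == nil { self?.receiveNextDatagram() }
        }
    }
}

let receiver = FrameStreamReceiver()
receiver.start()
dispatchMain()  // keep the process alive to service network callbacks
```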
On top of all these possibilities, a report from The Information claims that Apple will add iris scanning tech to the VR/AR headset. The idea is that the headset can authenticate the user as soon as they put it on, which will be beneficial for situations where multiple people share the same device. It may also be used to authenticate purchases, the same way Face ID and Touch ID are utilized on iOS devices.
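Apple hasn't said how iris unlock would be exposed to developers, but biometric checks on iPhone and Mac today go through the LocalAuthentication framework. A sketch of that existing pattern, which an iris-based system could plausibly mirror, looks like this:

```swift
import Foundation
import LocalAuthentication

/// Sketch of how biometric authentication is gated on Apple platforms
/// today via LocalAuthentication (Face ID / Touch ID). An iris-based
/// check on the headset could plausibly follow the same pattern, but
/// that is an assumption; Apple has not published a headset API.
func authenticateUser(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check that some form of biometry is available and enrolled.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown")")
        completion(false)
        return
    }

    // Prompt the user; the system handles the actual biometric match.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your headset profile") { success, _ in
        completion(success)
    }
}

// Example: gate a purchase or a profile switch behind the biometric check.
authenticateUser { ok in
    print(ok ? "Authenticated" : "Authentication failed")
}
```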
If that wasn't enough, a new report from The Information claims that the Apple VR/AR headset will let users create their own apps, regardless of whether they know how to code. This feature will apparently run through Siri, allowing people to use the headset to scan real-world objects and turn them into digital assets. User-created apps will also be allowed on the App Store, though they will no doubt need to pass Apple's strict approval process.
In early 2021, The Information reported that it had received design sketches of what Apple's mixed reality headset might look like. These are apparently based on early prototype work by Apple engineers and may not reflect the final product.
Concept artist Ian Zelbo also created some renders of a possible headset design, based on The Information's leak from earlier in the year.
Regardless, with this being Apple, we'd expect the mixed reality headset to sport a bit of slick industrial design with a lot of user ergonomics in mind.
For what it's worth, Ming-Chi Kuo has claimed the Apple headset will weigh between 300 and 400 grams (a little less than a pound) when it debuts. A lighter version is in the works for a subsequent release, Kuo adds.
A whole new device form factor requires a tweaked operating system, and it looks like that's what Apple will be providing: eagle-eyed developers have spotted references to "realityOS" in App Store upload logs. A second appearance of this name, along with "xrOS", was found in the code for the Windows 11 Apple Devices app.
There's not much information on this potential software, but it would make sense for Apple to come up with a custom OS for its VR and AR gadgets. We'd hazard a guess that such an operating system would have more in common with iOS than macOS.
Current reports and rumors suggest that Apple's AR/VR mixed reality headset will have a professional and developer focus. The idea is to ensure developers have a real device so they can get to grips with designing apps for augmented reality, ahead of the eventual launch of the Apple Glasses AR specs.
We've read reports suggesting that Apple's AR/VR headset could cost as much as $3,000, whereas other reports point to a device costing "several thousand dollars."
But with the Apple Glasses still reportedly several years away, time may change the appeal of the Apple headset. After all, the more time developers have with it, the more apps they can release, and the more interesting it will be to own. That's assuming the price tag doesn't continue to put people off.
Granted, analyst Ming-Chi Kuo has stated that Apple could be looking to launch a second-generation headset in 2024. He expects this headset to sell 10 million units, rivaling the Oculus Quest 2. Maybe this headset will have much larger mass-market appeal.
Long-term comfort: The problem with most VR headsets is that they're not ideal for long-term use. Discomfort generally increases after about 30 minutes. Of course, the more comfortable the headset is from the start, the longer you'll be able to keep going.
If Apple can design the AirPods Pro in such a way that you can forget they're there, it can certainly ensure its mixed reality headset is as pro-comfort as possible.
Solid battery life: Currently, the battery life on standalone headsets isn't great. The Oculus Quest 2 only lasts two to three hours, depending on what you're doing. We want Apple's VR and mixed reality headset to offer at least this much battery life, but ideally more.
A focus on fitness: With possible integration with Apple Fitness Plus and the Apple Watch, the Apple headset could be a game changer for fitness. You could use the device during workouts and see your progress as you follow along with personal trainers.
Proper AR: If Apple is going to kick start its wearable AR efforts with a mixed-reality headset, we want to see some proper AR features. Users will always be aware that the headset is in place, but Apple should, at the very least, do everything it can to make sure that any see-through AR functionality is as realistic as possible. That means good image quality, no noticeable lag, and a good field of view.
No gimmicks: If the mixed reality headset really is a developer device that's being released to the public, the least Apple can do is make sure there's a reason to have one. Don't release the headset for the sake of it, especially if it really is that expensive. Give people a real reason to pick one up for themselves, beyond the logo.
Apple Inc. has pushed back the unveiling of its highly anticipated mixed-reality headset from April to June, according to a new report.
Bloomberg News reported Wednesday that the headset is now planned to debut at Apple’s annual Worldwide Developers Conference, after product testing found improvements were needed on both the hardware and software sides.
Apple (AAPL) has been working on the mixed-reality headset for years, and its launch has been delayed a number of times. In January, Bloomberg reported that the follow-up to the mixed-reality headset, lightweight augmented-reality glasses, was being delayed until 2024 or 2025.
In that report, Bloomberg reported the mixed-reality headset due this year would cost around $3,000.
Read more: Here’s what Apple’s mixed-reality headset could feature
Mixed reality combines real-world and digital elements, allowing users to interact in a virtual environment, as opposed to virtual reality, which is completely immersive, and augmented reality, which overlays virtual information, such as images, text and animation, over the real world.
Apple has not had a major product launch since AirPods wireless headphones in 2016.
Apple shares are up about 20% year to date, but down 10% over the past 12 months, compared to the S&P 500's 8% gain in 2023 and 7% decline over the past year.